BackgroundCheck.run
Search For

David A Caballero (Deceased), Tucumcari, NM

David Caballero Phones & Addresses

Tucumcari, NM   

1309 Walnut Ave, Friona, TX 79035   806-250-5072

1212 Euclid Ave, Friona, TX 79035   806-250-5072

Compton, CA   

Long Beach, CA   

Amarillo, TX   

Work

Company: PGA West Greg Norman Country Club - La Quinta, CA (Dec 2012); Position: Customer Service

Education

High School: Temple City High School (1995)

Mentions for David A Caballero

Career records & work history

Lawyers & Attorneys

David Caballero - Lawyer

ISLN: 1000542896
Admitted: 2003

License Records

David B Caballero

Licenses:
License #: P17103 - Active
Category: Emergency medical services
Issued Date: Sep 11, 2000
Expiration Date: Nov 30, 2018

David B Caballero

Licenses:
License #: E017619 - Active
Category: Emergency medical services
Issued Date: Jan 7, 2009
Expiration Date: Jun 30, 2019
Type: Los Angeles County FD

David Caballero resumes & CV records

Resumes

Regulatory Compliance

Work: EY, Regulatory Compliance

David Caballero

David Caballero

David Caballero

Skills: Microsoft Office

Electrician

Work: Electrician

David Botero Caballero

David Caballero

David Caballero

Location: United States

Publications & IP owners

Us Patents

Methods And Apparatus For Planetary Navigation

US Patent:
8532328, Sep 10, 2013
Filed:
Aug 16, 2007
Appl. No.:
11/893619
Inventors:
David L. Caballero - Huntington Beach CA, US
Thomas Paul Weismuller - Orange CA, US
Assignee:
The Boeing Company - Chicago IL
International Classification:
G06K 9/00
US Classification:
382/100, 382/103, 382/104, 382/199, 382/209, 382/218, 382/279, 382/281, 382/283, 382/325, 701/13, 244/318
Abstract:
A method of navigating a space vehicle. An image of a planet surface is received. The received image is processed to identify edge pixels and angle data. The edge pixels and angle data are used to identify planetary features by shape, size, and spacing relative to other planetary features. At least some of the planetary features are compared with a predefined planet surface description including sizes and locations of planet landmarks. One or more matches are determined between the planetary feature(s) and the planet surface description. Based on the match(es), a location of the space vehicle relative to the planet is determined.
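
As a rough illustration of the edge-and-landmark matching the abstract describes, the sketch below extracts edge pixels and gradient angles, groups them into candidate planetary features, and matches them to a landmark catalog by size. All function names and the catalog format are assumptions for illustration, not details taken from the patent.

```python
# Hedged sketch of edge-based landmark matching; names and thresholds are illustrative.
import numpy as np
from scipy import ndimage

def edge_pixels_and_angles(image, threshold=0.2):
    """Return a boolean edge mask and per-pixel gradient angles (radians)."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    edges = magnitude > threshold * magnitude.max()
    angles = np.arctan2(gy, gx)
    return edges, angles

def extract_features(edges):
    """Group edge pixels into candidate planetary features (craters, ridges, ...).

    Each feature is summarized by its centroid and pixel count (a size proxy).
    """
    labels, n = ndimage.label(edges)
    centroids = ndimage.center_of_mass(edges, labels, range(1, n + 1))
    sizes = ndimage.sum(edges, labels, range(1, n + 1))
    return [{"centroid": c, "size": s} for c, s in zip(centroids, sizes)]

def match_to_landmarks(features, landmarks, size_tol=0.25):
    """Match detected features to cataloged landmarks by relative size.

    `landmarks` is assumed to be a list of dicts with 'size' and 'location' keys;
    a real system would also compare spacing between features, as the abstract notes.
    """
    matches = []
    for f in features:
        for lm in landmarks:
            if abs(f["size"] - lm["size"]) <= size_tol * lm["size"]:
                matches.append((f, lm))
    return matches
```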

Method And System For Applying Silhouette Tracking To Determine Attitude Of Partially Occluded Objects

US Patent:
2008028, Nov 20, 2008
Filed:
May 18, 2007
Appl. No.:
11/750735
Inventors:
Thomas P. Weismuller - Orange CA, US
David L. Caballero - Huntington Beach CA, US
International Classification:
G06K 9/62
US Classification:
382/215
Abstract:
A method and system for increasing the certainty of a silhouette matching process, where the process is being used for attitude determination of an object of interest, for example an aircraft. The method involves using one or more mask images that include structure or features that may or may not always be associated with the object of interest, and overlaying the mask image(s) onto a library image of the aircraft. Each pixel of the library image is compared against corresponding pixels of the mask image(s) to determine which pixels represent ambiguous areas of the library image. Those pixels are eliminated from consideration in determining a Fit score, where the Fit score represents a percentage value indicative of a certainty of the matching process in identifying the attitude of the aircraft. The method and system are applicable to a wide-ranging variety of object detection applications.
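
A minimal sketch of the masked Fit-score idea, assuming boolean silhouette and mask arrays; the scoring formula and names below are illustrative, not the patent's actual computation.

```python
# Illustrative sketch: exclude ambiguous pixels (union of mask images) from the score.
import numpy as np

def fit_score(observed, library, masks):
    """Compare an observed binary silhouette to a library image, ignoring pixels
    flagged ambiguous by any mask image (e.g., landing gear that may be retracted).

    observed, library: 2-D boolean arrays of the same shape.
    masks: iterable of 2-D boolean arrays; True marks an ambiguous pixel.
    Returns the percentage of unambiguous pixels that agree.
    """
    ambiguous = np.zeros_like(library, dtype=bool)
    for m in masks:
        ambiguous |= m                      # union of all mask images
    valid = ~ambiguous                      # pixels still considered in the score
    agree = (observed == library) & valid
    return 100.0 * agree.sum() / max(valid.sum(), 1)
```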

Object Detection System And Method Incorporating Background Clutter Removal

US Patent:
2008031, Dec 18, 2008
Filed:
Jun 18, 2007
Appl. No.:
11/764396
Inventors:
Thomas P. Weismuller - Orange CA, US
David L. Caballero - Huntington Beach CA, US
International Classification:
G06K 9/00
US Classification:
382/103
Abstract:
A method and system for optically detecting an object within a field of view where detection is difficult because of background clutter within the field of view that obscures the object. A camera is panned with movement of the object to motion stabilize the object against the background clutter while taking a plurality of image frames of the object. A frame-by-frame analysis is performed to determine variances in the intensity of each pixel, over time, from the collected frames. From this analysis a variance image is constructed that includes an intensity variance value for each pixel. Pixels representing background clutter will typically vary considerably in intensity from frame to frame, while pixels making up the object will vary little or not at all. A binary threshold test is then applied to each variance value and the results are used to construct a final image. The final image may be a black and white image that clearly shows the object as a silhouette.
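
The variance-image step lends itself to a short sketch: assuming a motion-stabilized frame stack as described, per-pixel temporal variance is thresholded so that low-variance (object) pixels survive. The threshold choice and output format here are assumptions.

```python
# Rough sketch of the variance-image idea: background pixels vary frame to frame
# while pixels on the motion-stabilized object stay nearly constant.
import numpy as np

def object_silhouette(frames, variance_threshold):
    """frames: array of shape (num_frames, height, width), motion-stabilized on the object."""
    stack = np.asarray(frames, dtype=float)
    variance_image = stack.var(axis=0)           # per-pixel intensity variance over time
    # Low variance -> likely object; high variance -> background clutter.
    silhouette = variance_image <= variance_threshold
    return silhouette.astype(np.uint8) * 255     # black-and-white final image
```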

Cluttered Background Removal From Imagery For Object Detection

US Patent:
2020009, Mar 19, 2020
Filed:
Sep 9, 2019
Appl. No.:
16/565008
Inventors:
- Chicago IL, US
David L. CABALLERO - Huntington Beach CA, US
Thomas P. WEISMULLER - Orange CA, US
International Classification:
G06T 7/73
G06T 7/215
G06T 7/246
G06K 9/38
G06K 9/20
G06T 7/194
G06T 7/136
G01C 21/36
G06K 9/32
Abstract:
Embodiments herein describe tracking the location and orientation of a target in a digital image. In an embodiment, this tracking can be used to control navigation for a vehicle. In an embodiment, a digital image captured by a visual sensor is received. A first array including a plurality of binary values related to the pixel velocity of a first plurality of pixels in the digital image as compared to corresponding pixels in a first one or more prior digital images can be generated. A second array including a plurality of values related to the standard deviation of pixel intensity of the first plurality of pixels in the digital image as compared to corresponding pixels in a second one or more prior digital images can be further generated. A plurality of thresholds relating to the values in the second array can be determined. A plurality of target pixels and a plurality of background pixels can be identified in the digital image, based on the first array, the second array, and the plurality of thresholds. A binary image related to the digital image can be generated based on the identified plurality of target pixels and the identified plurality of background pixels, and at least one of a location and an orientation of the target in the digital image can be identified based on the binary image. In an embodiment, a command can be transmitted to a navigation system for a vehicle, to assist in navigating the vehicle toward the target, based on the identified at least one of a location and an orientation of the target.
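
A simplified sketch of combining the two per-pixel arrays the abstract mentions; the frame-differencing stand-in for pixel velocity and the single global threshold are assumptions, not the application's actual method.

```python
# Simplified sketch: a per-pixel motion flag plus a per-pixel intensity standard
# deviation are combined to label target vs. background pixels.
import numpy as np

def segment_target(current, prior_frames, motion_thresh, k=1.0):
    """current: 2-D image; prior_frames: array of earlier frames, shape (n, H, W)."""
    current = np.asarray(current, dtype=float)
    prior = np.asarray(prior_frames, dtype=float)

    # First array: binary values related to pixel "velocity" (here, frame-to-frame change).
    moving = (np.abs(current - prior[-1]) > motion_thresh).astype(np.uint8)

    # Second array: standard deviation of each pixel's intensity over the frame history.
    std_dev = np.concatenate([prior, current[None]], axis=0).std(axis=0)

    # Threshold derived from the std-dev array (one global value stands in for the
    # plurality of thresholds described in the application).
    threshold = std_dev.mean() + k * std_dev.std()

    # Target pixels: little apparent motion and stable intensity; the rest is background.
    target = (moving == 0) & (std_dev <= threshold)
    return target.astype(np.uint8) * 255          # binary image of the target
```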

Cluttered Background Removal From Imagery For Object Detection

US Patent:
2019019, Jun 27, 2019
Filed:
Dec 21, 2017
Appl. No.:
15/850219
Inventors:
- Chicago IL, US
David L. CABALLERO - Huntington Beach CA, US
Thomas P. WEISMULLER - Orange CA, US
International Classification:
G06T 7/73
G06T 7/246
G06K 9/38
G06K 9/20
G06K 9/32
G06T 7/194
G06T 7/136
G01C 21/36
Abstract:
Embodiments herein describe tracking the location and orientation of a target in a digital image. In an embodiment, this tracking can be used to control navigation for a vehicle. In an embodiment, a digital image captured by a visual sensor is received. A first array including a plurality of binary values related to the pixel velocity of a first plurality of pixels in the digital image as compared to corresponding pixels in a first one or more prior digital images can be generated. A second array including a plurality of values related to the standard deviation of pixel intensity of the first plurality of pixels in the digital image as compared to corresponding pixels in a second one or more prior digital images can be further generated. A plurality of thresholds relating to the values in the second array can be determined. A plurality of target pixels and a plurality of background pixels can be identified in the digital image, based on the first array, the second array, and the plurality of thresholds. A binary image related to the digital image can be generated based on the identified plurality of target pixels and the identified plurality of background pixels, and at least one of a location and an orientation of the target in the digital image can be identified based on the binary image. In an embodiment, a command can be transmitted to a navigation system for a vehicle, to assist in navigating the vehicle toward the target, based on the identified at least one of a location and an orientation of the target.

Resolving Closely Spaced Objects

US Patent:
2017002, Jan 26, 2017
Filed:
Oct 2, 2014
Appl. No.:
14/504575
Inventors:
- Chicago IL, US
David L. Caballero - Huntington Beach CA, US
International Classification:
G06T 7/00
G06K 9/40
G06K 9/38
Abstract:
A method and apparatus for resolving a set of objects in an image of an area. A partition that captures a set of objects is identified using the image. The partition is comprised of a group of contiguous object pixels. A number of local max pixels are identified from the group of contiguous object pixels in the partition. A quantitative resolution of the set of objects captured in the partition is performed based on the number of local max pixels identified.
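
The local-max counting step could look roughly like the following, assuming an intensity image and a boolean mask for one partition of contiguous object pixels; the neighborhood size and plateau handling are assumptions.

```python
# Loose sketch: each local intensity maximum inside a partition is taken as
# evidence of a distinct object in that group of contiguous object pixels.
import numpy as np
from scipy import ndimage

def count_objects_in_partition(image, partition_mask, neighborhood=3):
    """image: 2-D intensity array; partition_mask: boolean mask of contiguous object pixels."""
    local_max = ndimage.maximum_filter(image, size=neighborhood) == image
    # Keep only maxima that fall inside the partition's object pixels.
    max_pixels = local_max & partition_mask
    # Merge maxima that touch (plateaus), then count the groups.
    _, num_maxima = ndimage.label(max_pixels)
    return num_maxima   # quantitative resolution: estimated number of objects
```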

Isotropic Feature Matching

US Patent:
2015007, Mar 12, 2015
Filed:
Sep 12, 2013
Appl. No.:
14/025266
Inventors:
- Chicago IL, US
David Leonard Caballero - Huntington Beach CA, US
Assignee:
THE BOEING COMPANY - Chicago IL
International Classification:
G06K 9/32
US Classification:
382/103, 382/199
Abstract:
A computer-implemented method and apparatus for detecting an object of interest. An edge image is generated from an image of a scene. A sectioned structure comprising a plurality of sections is generated for use in analyzing the edge image. The edge image is analyzed using the sectioned structure to detect a presence of the object of interest in the edge image.
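
One possible reading of the sectioned-structure analysis, sketched under heavy assumptions: a fixed grid of sections and a per-section edge-density comparison, neither of which the abstract actually specifies.

```python
# Illustrative sketch: divide the edge image into grid sections and compare each
# section's edge density against a template signature of the object of interest.
import numpy as np

def section_densities(edge_image, rows=4, cols=4):
    """Split a boolean edge image into a rows x cols grid and return edge density per section."""
    h, w = edge_image.shape
    densities = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = edge_image[r * h // rows:(r + 1) * h // rows,
                               c * w // cols:(c + 1) * w // cols]
            densities[r, c] = block.mean()
    return densities

def detect(edge_image, template_densities, tolerance=0.1):
    """Flag a detection when every section's edge density is close to the template's."""
    return np.all(np.abs(section_densities(edge_image) - template_densities) <= tolerance)
```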

NOTICE: You may not use BackgroundCheck or the information it provides to make decisions about employment, credit, housing or any other purpose that would require Fair Credit Reporting Act (FCRA) compliance. BackgroundCheck is not a Consumer Reporting Agency (CRA) as defined by the FCRA and does not provide consumer reports.