BackgroundCheck.run
Search For

Christopher J Peri, 4111861 E Yale Ct, Aurora, CO 80014

Christopher Peri Phones & Addresses

Aurora, CO   

Reno, NV   

Blythewood, SC   

Dublin, CA   

Sparks, NV   

Houston, TX   

Cornelius, NC   

Social networks

Christopher J Peri

LinkedIn

Work

Company: Cemex (Sep 2011)
Address: Columbia, South Carolina Area
Position: Plant Manager

Education

Degree: B.S.
School / High School: University of Nevada-Reno, 2002 to 2006
Specialties: Material Science and Engineering

Skills

Materials • Continuous Improvement • Cement • Process Improvement • Engineering • Operations Management • Process Engineering • Project Management • Manufacturing • Mineral Processing • Mining • Project Engineering • Contract Management • Training • Maintenance Management • Concrete • Materials Science • Mining Engineering • Chemistry • Construction • Plant Maintenance • Metallurgy • Aggregates • Building Materials • Steel • Nanotechnology • Solidworks • Supply Chain Optimization • Optimization

Ranks

Certificate: Six Sigma Green Belt

Industries

Construction

Mentions for Christopher J Peri

Christopher Peri resumes & CV records

Resumes

Christopher Peri

Supply Chain Manager

Location:
1253 south Alton Ct, Denver, CO 80247
Industry:
Construction
Work:
CEMEX - Columbia, South Carolina Area since Sep 2011
Plant Manager
CEMEX Jun 2008 - Jan 2012
Plant Superintendent
CEMEX Jul 2007 - Jun 2008
Professional in Development
Interlock USA Jan 2007 - Jun 2007
Application Engineer
Education:
University of Nevada-Reno 2002 - 2006
B.S., Material Science and Engineering
Skills:
Materials, Continuous Improvement, Cement, Process Improvement, Engineering, Operations Management, Process Engineering, Project Management, Manufacturing, Mineral Processing, Mining, Project Engineering, Contract Management, Training, Maintenance Management, Concrete, Materials Science, Mining Engineering, Chemistry, Construction, Plant Maintenance, Metallurgy, Aggregates, Building Materials, Steel, Nanotechnology, Solidworks, Supply Chain Optimization, Optimization
Certifications:
Six Sigma Green Belt
MSHA Hazard Training - Coal, Surface and Underground
Smith System Advanced Driving Traffic Safety Seminar
License King Mine II05-04864

Publications & IP owners

Us Patents

System And Method To Create A Collaborative Web-Based Multimedia Layered Platform

US Patent:
7933956, Apr 26, 2011
Filed:
Jan 24, 2007
Appl. No.:
11/657787
Inventors:
Henry Hon - Berkeley CA, US
Christopher Anthony Peri - Oakland CA, US
Timothy Hon - Berkeley CA, US
Frankie Waitim Wong - Pullman WA, US
Assignee:
Simulat, Inc. - Berkeley CA
International Classification:
G06F 15/16
US Classification:
709205, 709246
Abstract:
The present invention relates to a system and method that allow multiple users to collaborate on tasks and interact in a shared space session within a network in real time, using a media application to manage media-layers. Each media-layer serves as a container for multimedia programs or plug-ins. The invention lets users select which media-layer to display via organization metaphors and filtering criteria. When multiple users are logged into the same shared space, each user can invoke and observe modifications to media-layers with the browser-based or client-based application. All events are synchronized among all users in that shared space, with the system acting as a communication conduit. The media-layers in the shared space maintain spatial and temporal correlation through a media application stage-manager tool and are described by a collection file descriptor such as an XML file. The ability to invoke events that affect media-layers can be supported in a synched or non-synched mode on demand.

System And Method For Emergency Response

US Patent:
2004000, Jan 15, 2004
Filed:
Feb 11, 2003
Appl. No.:
10/365019
Inventors:
Michael Aratow - Mountain View CA, US
Craig Buxton - Point Richmond CA, US
Christopher Peri - Point Richmond CA, US
International Classification:
H04Q009/00
G08C019/22
G08B023/00
US Classification:
340/870070, 340/573100
Abstract:
This application describes an information and resource management system for collecting data from diverse sources and organizing multiple types of data and information to facilitate dynamic multi-dimensional displays that will enhance cognition and situational awareness for diverse user communities. This system may facilitate collaborative cross-agency research and response to public health and safety issues. The system will generate more rapid awareness of potentially critical situations and promote greater awareness of the costs and benefits of alternative courses of action across diverse agencies and organizations serving common populations and communities. The invention includes customized geographically enabled data collection tools and techniques, plus dedicated databases and parsing schemes that feed into customized data visualization and simulation engines that drive the display of context-sensitive interactive environments on a wide variety of computing platforms. The invention provides a novel approach to interdisciplinary information integration, processing, visualization, sharing, and decision-making in the domain of public health and safety, disaster management, and mitigation.

Three-Dimensional Scene Recreation Using Depth Fusion

US Patent:
2022040, Dec 22, 2022
Filed:
Oct 13, 2021
Appl. No.:
17/500817
Inventors:
- Gyeonggi-Do, KR
Christopher A. Peri - Mountain View CA, US
International Classification:
G06T 17/20
G06T 3/40
G06T 19/00
G06N 20/00
G06T 7/55
G06T 19/20
Abstract:
Generating a 3D scene reconstruction using depth fusion can include creating a high-resolution sparse depth map by mapping sensor depths from a low-resolution depth map to points corresponding to pixels of a high-resolution color image of a scene. The high-resolution sparse depth map can have the same resolution as the high-resolution color image. A fused sparse depth map can be produced by combining the high-resolution sparse depth map with sparse depths reconstructed from the high-resolution color image. A high-resolution dense depth map can then be generated based on fused sparse depths of the fused sparse depth map.

Image-Guided Depth Propagation For Space-Warping Images

US Patent:
2022029, Sep 15, 2022
Filed:
Aug 13, 2021
Appl. No.:
17/402005
Inventors:
- Gyeonggi-Do, KR
Christopher A. Peri - Mountain View CA, US
International Classification:
G06T 3/00
G06T 7/50
G06F 3/01
G06T 7/13
Abstract:
Updating an image during real-time rendering of images by a display device can include determining a depth for each pixel of a color frame received from a source device and corresponding to the image. Each pixel's depth is determined by image-guided propagation of depths of sparse points extracted from a depth map generated at the source device. With respect to pixels corresponding to an extracted sparse depth point, image-guided depth propagation can include retaining the depth of the corresponding sparse depth point unchanged from the source depth map. With respect to each pixel corresponding to a non-sparse depth point, image-guided depth propagation can include propagating to the corresponding non-sparse depth point a depth of a sparse depth point lying within a neighborhood of the non-sparse depth point. Pixel coordinates of the color frame can be transformed for generating a space-warped rendering of the image.

Systems And Methods For Reconstruction Of Dense Depth Maps

US Patent:
2022023, Jul 21, 2022
Filed:
Jun 3, 2021
Appl. No.:
17/338521
Inventors:
- Suwon-si, KR
Christopher A. Peri - Mountain View CA, US
International Classification:
G06T 19/00
G06T 7/55
Abstract:
A method by an extended reality (XR) display device includes accessing image data and sparse depth points corresponding to a plurality of image frames to be displayed on one or more displays of the XR display device. The method further includes determining a plurality of sets of feature points for a current image frame of the plurality of image frames, constructing a cost function configured to propagate the sparse depth points corresponding to the current image frame based on the plurality of sets of feature points, and generating a dense depth map corresponding to the current image frame based on an evaluation of the cost function. The method thus includes rendering the current image frame on the one or more displays of the XR display device based on the dense depth map.

System And Method For Scene Reconstruction With Plane And Surface Reconstruction

US Patent:
2023008, Mar 23, 2023
Filed:
Mar 16, 2022
Appl. No.:
17/696746
Inventors:
- Suwon-si, KR
Christopher A. Peri - Mountain View CA, US
International Classification:
G06T 17/20
G06T 7/12
G06T 19/00
Abstract:
A system and method for 3D reconstruction with plane and surface reconstruction, scene parsing, and depth reconstruction with depth fusion from different sources. The system includes a display and a processor to perform the method for 3D reconstruction with plane and surface reconstruction. The method includes dividing a scene of an image frame into one or more plane regions and one or more surface regions. The method also includes generating reconstructed planes by performing plane reconstruction based on the one or more plane regions. The method also includes generating reconstructed surfaces by performing surface reconstruction based on the one or more surface regions. The method further includes creating the 3D scene reconstruction by integrating the reconstructed planes and the reconstructed surfaces.

Method And Apparatus For Scene Segmentation For Three-Dimensional Scene Reconstruction

US Patent:
2023009, Mar 23, 2023
Filed:
Jun 7, 2022
Appl. No.:
17/805828
Inventors:
- Suwon-si, KR
Christopher A. Peri - Mountain View CA, US
International Classification:
G06T 7/11
G06V 10/25
G06V 20/20
G06T 7/50
G06V 10/82
G06V 10/764
G06V 10/32
Abstract:
A method includes obtaining, from an image sensor, image data of a real-world scene; obtaining, from a depth sensor, sparse depth data of the real-world scene; and passing the image data to a first neural network to obtain one or more object regions of interest (ROIs) and one or more feature map ROIs. Each object ROI includes at least one detected object. The method also includes passing the image data and sparse depth data to a second neural network to obtain one or more dense depth map ROIs; aligning the one or more object ROIs, one or more feature map ROIs, and one or more dense depth map ROIs; and passing the aligned ROIs to a fully convolutional network to obtain a segmentation of the real-world scene. The segmentation contains one or more pixelwise predictions of one or more objects in the real-world scene.

Depth Map Re-Projection Based On Image And Pose Changes

US Patent:
2020037, Dec 3, 2020
Filed:
Aug 18, 2020
Appl. No.:
16/996610
Inventors:
- Suwon Si, KR
Christopher A. Peri - Mountain View CA, US
International Classification:
G02B 27/01
G02B 27/00
G06T 7/73
G06T 15/20
G06T 19/00
Abstract:
A method implemented by an extended reality (XR) display device includes rendering a current image frame received from an external electronic device associated with the XR display device. The current image frame is associated with a current pose of the XR display device. The method further includes receiving an updated image frame from the external electronic device, calculating an updated pose based on one or more characteristics of the updated image frame, and determining whether the updated pose is within a pose range with respect to the current pose. The method thus further includes re-rendering, on one or more displays of the XR display device, a previous image frame based on whether the current pose is determined to be within the pose range.

NOTICE: You may not use BackgroundCheck or the information it provides to make decisions about employment, credit, housing or any other purpose that would require Fair Credit Reporting Act (FCRA) compliance. BackgroundCheck is not a Consumer Reporting Agency (CRA) as defined by the FCRA and does not provide consumer reports.