BackgroundCheck.run

Saurabh B Kulkarni, 44, Mountain View, CA

Saurabh Kulkarni Phones & Addresses

Mountain View, CA   

Oakland, CA   

San Francisco, CA   

New York, NY   

Gainesville, FL   

171 E 84th St APT 24E, New York, NY 10028

Mentions for Saurabh B Kulkarni

Saurabh Kulkarni Resumes & CV Records

Resumes


Associate Manager at GlobeOp Financial Services

Position:
Associate Manager at GlobeOp Financial Services
Location:
United States
Industry:
Financial Services
Work:
GlobeOp Financial Services since Aug 2006
Associate Manager
GE Money Aug 2005 - Jul 2006
Branch Manager - RLP
HDFC Bank May 2005 - Aug 2005
Sales Manager - NRI Services
Citifinancial Consumer Finance India Ltd Feb 2004 - Apr 2005
Officer - Personal Loans
The Coca-Cola Company Jan 2002 - Jul 2002
Management Trainee – CSS
Education:
IBS Hyderabad 2002 - 2004
MBA, Finance
Barkatullah Vishwavidyalaya 1999 - 2001
M.Com, Commerce

Senior System Software Engineer at Akamai Technologies

Position:
Senior System Software Engineer at Akamai Technologies
Location:
San Francisco Bay Area
Industry:
Computer Networking
Work:
Akamai Technologies - San Francisco Bay Area since Nov 2012
Senior System Software Engineer
Bloomberg - Greater New York City Area Jun 2010 - Oct 2012
Senior Software Engineer
Riverbed Technology - San Francisco Bay Area Aug 2007 - Jun 2010
Member of Technical Staff
Microsoft Corporation May 2006 - Aug 2006
Software Development Intern
Kasura Technologies Dec 2004 - Jul 2005
Senior Member Technical Staff
Conexant Systems Inc (formerly Paxonet Communications) Aug 2002 - Dec 2004
Senior Design Engineer
Education:
University of Florida 2005 - 2007
Master of Science (MS)
National Institute of Technology, Surat 1998 - 2002
Bachelor of Engineering, Computer Engineering
Languages:
English
Hindi

Senior Manager, Quality at Nimble Storage

Position:
Senior Manager, Quality at Nimble Storage
Location:
San Jose, California
Industry:
Computer Software
Work:
Nimble Storage - San Francisco Bay Area since Aug 2011
Senior Manager, Quality
Autonomy, An HP Company - Mountain View, CA Oct 2006 - Aug 2011
Manager, Quality Assurance
Symphony Services - Pune, India Jun 2004 - Sep 2006
Senior QA Engineer
Agribuys (now FoodLink Online) - Pune, India Jul 2001 - Jun 2004
Project Lead
IBN Infosys Ltd. - Pune, India Jul 2000 - Jul 2001
Sales Engineer/ Management Representative (ISO 9001:2000)
Skills:
Storage, Enterprise Software, Virtualization, Data Privacy, Agile Methodologies, Cloud Computing, Microsoft Exchange, Microsoft SQL Server, SDLC, Pre-sales, Content Management, MS Project, Web Commerce, Integration, Databases, QA Engineering, Disaster Recovery, Data Center, Solaris

Consultant, User Experience Practice at Prolifics

Position:
Consultant, User Experience Practice at Prolifics
Location:
New York, New York
Industry:
Information Technology and Services
Work:
Prolifics - Greater New York City Area since Aug 2012
Consultant, User Experience Practice
Infosys Technologies Ltd Feb 2009 - Aug 2012
Technology Lead
Infosys Technologies Ltd Jan 2008 - Jan 2009
Senior Software Engineer
Infosys Technologies Ltd Sep 2006 - Dec 2007
Software Engineer
Education:
PVPP COE 2002 - 2006
B.E., Electronics and Telecommunications
Skills:
Spring, Portals, Enterprise Content Management, Requirements Analysis, Integration, SharePoint, Java Enterprise Edition, SOA

Publications & IP Owners

US Patents

Token-Position Handling For Sequence Based Neural Networks

US Patent:
2021031, Oct 14, 2021
Filed:
Apr 14, 2020
Appl. No.:
16/848748
Inventors:
- Redmond WA, US
Tiyasa MITRA - San Jose CA, US
Sujeeth Subramanya BHARADWAJ - Milpitas CA, US
Marc TREMBLAY - Bellevue WA, US
Saurabh Mohan KULKARNI - Redmond WA, US
International Classification:
G06N 3/04
G06N 3/08
G06F 40/289
G06K 9/62
Abstract:
Embodiments of the present disclosure include a method for token-position handling comprising: processing a first sequence of tokens to produce a second sequence of tokens, wherein the second sequence of tokens has a smaller number of tokens than the first sequence of tokens; masking at least some tokens in the second sequence to produce masked tokens; moving the masked tokens to the beginning of the second sequence to produce a third sequence; encoding tokens in the third sequence into a set of numeric vectors in a first array; and processing the first array in a transformer neural network to determine correlations among the third sequence, the processing the first array producing a second array.
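The abstract describes a fairly concrete sequence of steps, so a minimal PyTorch sketch may help follow them. The helper names, mask probability, and model sizes below are illustrative assumptions, not taken from the patent itself:

```python
# Minimal sketch of the described flow: shorten a token sequence, mask some
# tokens, move masked tokens to the front, embed, and run a transformer.
import torch
import torch.nn as nn

VOCAB, MASK_ID, DIM = 1000, 0, 64

def shorten_sequence(tokens):
    # Second sequence: fewer tokens than the first (here, simply every other token).
    return tokens[::2]

def mask_and_front_load(tokens, mask_prob=0.15):
    # Mask some tokens, then move the masked tokens to the start -> third sequence.
    is_masked = torch.rand(tokens.shape) < mask_prob
    masked = tokens.clone()
    masked[is_masked] = MASK_ID
    return torch.cat([masked[is_masked], masked[~is_masked]])

embed = nn.Embedding(VOCAB, DIM)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=DIM, nhead=4, batch_first=True), num_layers=2)

first = torch.randint(1, VOCAB, (128,))   # first sequence of tokens
second = shorten_sequence(first)          # second sequence (fewer tokens)
third = mask_and_front_load(second)       # masked tokens moved to the front
first_array = embed(third).unsqueeze(0)   # tokens encoded as numeric vectors
second_array = encoder(first_array)       # transformer output capturing correlations
print(second_array.shape)                 # torch.Size([1, 64, 64])
```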

Spread Neural Networks

US Patent:
2021031, Oct 14, 2021
Filed:
Apr 14, 2020
Appl. No.:
16/848707
Inventors:
- Redmond WA, US
Tiyasa MITRA - San Jose CA, US
Sujeeth Subramanya BHARADWAJ - Milpitas CA, US
Saurabh Mohan KULKARNI - Redmond WA, US
Marc TREMBLAY - Bellevue WA, US
International Classification:
G06N 3/08
G06N 3/04
G06F 40/126
G06F 17/16
G06F 17/15
Abstract:
Techniques for training neural networks are provided. According to one set of embodiments, a first array is processed in a spreading component to produce a second array, where a first dimension of the first array corresponds to at least one sequence of approximately orthogonal numeric vectors representing tokens, and where the spreading component combines values along the first dimension. The second array is processed in a transformer neural network to determine correlations between the sequence, which produces a third array. One or more batches of the third array are processed in a de-spreading component to produce a fourth array.
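Again a minimal sketch, assuming a linear spreading/de-spreading projection along the sequence dimension and a standard transformer encoder; all names and shapes are illustrative, not the patent's actual design:

```python
# Sketch of the spread -> transformer -> de-spread flow described above.
import torch
import torch.nn as nn

SEQ, SPREAD, DIM = 32, 16, 64            # original length, spread length, embedding size

# First array: one sequence of (approximately orthogonal) numeric token vectors.
first_array = torch.randn(1, SEQ, DIM)

# Spreading component: combines values along the sequence (first) dimension.
spread = nn.Linear(SEQ, SPREAD, bias=False)
second_array = spread(first_array.transpose(1, 2)).transpose(1, 2)    # (1, SPREAD, DIM)

# Transformer determines correlations across the spread sequence -> third array.
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=DIM, nhead=4, batch_first=True), num_layers=2)
third_array = encoder(second_array)                                   # (1, SPREAD, DIM)

# De-spreading component maps back to the original sequence length -> fourth array.
despread = nn.Linear(SPREAD, SEQ, bias=False)
fourth_array = despread(third_array.transpose(1, 2)).transpose(1, 2)  # (1, SEQ, DIM)
print(fourth_array.shape)
```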

Pipelined Neural Network Processing With Continuous And Asynchronous Updates

US Patent:
2021009, Apr 1, 2021
Filed:
Sep 27, 2019
Appl. No.:
16/585105
Inventors:
- Redmond WA, US
Tiyasa Mitra - San Jose CA, US
Saurabh M. Kulkarni - Redmond WA, US
Marc Tremblay - Bellevue WA, US
Sujeeth S. Bharadwaj - Milpitas CA, US
International Classification:
G06N 3/04
G06F 17/27
G06N 3/08
Abstract:
Systems and methods for pipelined neural network processing with continuous and asynchronous updates are described. A method for processing a neural network comprising L layers, where L is an integer greater than two, includes partitioning the L layers among a set of computing resources configured to process forward passes and backward passes associated with each of the L layers. The method further includes initiating processing of the forward passes and the backward passes using the set of computing resources. The method further includes upon completion of a first set of forward passes and a first set of backward passes associated with a first layer of the L layers, initiating update of parameters associated with the first layer when gradients are available for updating the parameters associated with the first layer without waiting to calculate gradients associated with any of remaining L layers.
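A toy NumPy simulation of the per-layer update schedule the abstract describes: each layer's parameters are updated as soon as that layer's gradient is available, without waiting for gradients of the remaining layers. The tiny MLP, loss, and learning rate are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(0)
L, DIM, LR = 4, 8, 0.01
weights = [rng.standard_normal((DIM, DIM)) * 0.1 for _ in range(L)]   # L layers, one per stage

def forward(x):
    acts = [x]
    for W in weights:
        acts.append(np.tanh(acts[-1] @ W))
    return acts

x = rng.standard_normal((16, DIM))
target = rng.standard_normal((16, DIM))

acts = forward(x)                        # forward passes through all L stages
grad = 2 * (acts[-1] - target) / len(x)  # gradient of an MSE loss at the output

# Backward passes run from the last layer toward the first; as soon as layer i's
# gradient is computed, its parameters are updated immediately, before gradients
# for the remaining (earlier) layers have been calculated.
for i in reversed(range(L)):
    pre_act_grad = grad * (1 - acts[i + 1] ** 2)   # backprop through tanh
    weight_grad = acts[i].T @ pre_act_grad         # gradient for layer i
    grad = pre_act_grad @ weights[i].T             # gradient flowing to layer i-1
    weights[i] -= LR * weight_grad                 # update layer i right away
```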
