Job Details

Research Scientist Intern, CV and Machine Perception (PhD)

Company name
Facebook

Location
Lansing, MI, United States

Employment Type
Full-Time

Industry
Sciences, Scientist, Internship

Posted on
Oct 22, 2021

Profile

Intro:

Facebook's mission is to give people the power to build community and bring the world closer together. Through our family of apps and services, we're building a different kind of company that connects billions of people around the world, gives them ways to share what matters most to them, and helps bring people closer together. Whether we're creating new products or helping a small business expand its reach, people at Facebook are builders at heart. Our global teams are constantly iterating, solving problems, and working together to empower people around the world to build community and connect in meaningful ways. Together, we can help people build stronger communities - we're just getting started.

Summary:

The Facebook Reality Labs (FRL) Research Team brings together a world-class team of researchers, developers, and engineers to create the future of AR and VR, which together will become as universal and essential as smartphones and personal computers are today. The Surreal Vision team is looking for the next generation of scientists and engineers to tackle the most ambitious problems in machine perception.

In this internship, you will not only work on research topics at the forefront of academia and industry, but also work with a team invested in building the complete stack, from sensor design to state estimation and tracking, to 3D reconstruction, to novel scene representations.

Our internships are twelve (12) to twenty-four (24) weeks long, and you have the option to start on various dates throughout the year.

Required Skills:

Research, develop, and prototype advanced software technologies in the domains of SLAM, visual localization, tracking, 3D reconstruction, structure from motion (SfM), machine perception, state estimation, etc.

Analyze and improve the efficiency, accuracy, scalability, and stability of currently deployed systems

Design and execute algorithms

Prototype, build, and analyze experimental systems

Collaborate with and support other researchers across various disciplines

Communicate the research agenda, progress, and results

Minimum Qualifications:

Currently has, or is in the process of obtaining, a PhD in Computer Science, Computer Vision, Robotics, or a related field

Must obtain work authorization in the country of employment at the time of hire, and maintain ongoing work authorization during employment

Must be available for a 3- to 6-month internship between January 1, 2022 and December 31, 2022

Experience with statistical analysis of data and mathematical modeling

Experience with 3D geometry, optimization, or information theory

Experience in Python or C

Interpersonal skills: cross-group and cross-cultural collaboration

Preferred Qualifications:

Intent to return to the degree program after the completion of the internship

Experience with SLAM systems, e.g., monocular, stereo, visual-inertial, LIDAR, or RGB-D

Experience in geometric computer vision, including tracking, visual-inertial odometry, SLAM, relocalization, sensor/display calibration, large-scale structure from motion, probabilistic graphical models, or information sparsification techniques

Experience working with cameras and advanced imaging sensors, IMUs, magnetometers, and other sensors that can be used in the context of localization and mapping

Broad understanding of the full machine vision pipeline from sensors to high-level algorithms

Proven track record of achieving significant results, as demonstrated by grants, fellowships, and patents, as well as first-authored publications at leading workshops or conferences such as CVPR, ECCV/ICCV, BMVC, ICRA, IROS, or RSS

Demonstrated software engineering experience via an internship, work experience, coding competitions, or widely used contributions to open-source repositories (e.g., GitHub)

Industry: Internet

Equal Opportunity: Facebook is proud to be an Equal Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Facebook is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.

Company info

Facebook
Website : http://www.facebook.com
