At Toyota Research Institute (TRI), we're on a mission to improve the quality of human life. We're developing new tools and capabilities to amplify the human experience. To lead this transformative shift in mobility, we've built a world-class team in Energy & Materials, Human-Centered AI, Human Interactive Driving, and Robotics.
The Mission
Make general-purpose robots a reality.
The Challenge
We envision a future where robots assist with household chores and cooking, aid the elderly in maintaining their independence, and enable people to spend more time on the activities they enjoy most. To achieve this, robots need to operate reliably in messy, unstructured environments. Our mission is to answer the question: "What will it take to create truly general-purpose robots that can accomplish a wide variety of tasks in settings like human homes with minimal human supervision?" We believe the answer lies in cultivating large-scale datasets of physical interaction from a variety of sources and building on the latest advances in machine learning to learn general-purpose robot behaviors from this data.
The Team
Our goal is to revolutionize the field of robotic manipulation, enabling long-horizon dexterous behaviors to be efficiently taught, learned, and improved over time in diverse, real-world environments.
Our team has deep cross-functional expertise across simulation, perception, controls, and machine learning, and we measure our success in terms of fundamental capability development as well as research impact via open-source software and publications. Our north star is fundamental technological advancement in building robots that can flexibly perform a wide variety of tasks in diverse environments with minimal human supervision. Come join us and let's make general-purpose robots a reality.
We operate a fleet of robots, and robot-embodied teaching and deployment is a key part of our strategy.
The Opportunity
We're looking for a driven engineer with a "make it happen" mentality. The ideal candidate is able to operate independently when needed but works well as part of a larger integrated group working at the cutting edge of robotics and machine learning. If our mission of revolutionizing robotics through machine learning resonates with you, get in touch and let's talk about how we can create the next generation of capable, AI-powered robots together!
Responsibilities
- Research and contribute to the design of novel robotic systems through software development, including control, perception, planning, and their interactions with learned policies.
- Develop tooling, drivers, and controllers to enable stable and performant robotic platforms.
- Enable research into robot foundation models by working with mechanical/electrical engineers, technicians, and researchers to build and integrate new enabling robotics technologies.
- Help robotics research scientists apply and integrate their research into more robust, perceptive, and scalable systems.
- Design and integrate creative system solutions, combining actuation, structure, and sensing, as well as new mechanisms and sensors for human-scale manipulation.

Qualifications
- B.S. or higher in an engineering-related field and 4+ years of relevant industry experience.
- Strong software engineering skills; very comfortable working in a mixed C++ and Python codebase.
- Experience with a full-stack approach to robotics, including familiarity with electromechanical systems and actuation, as well as practical software design.
- The ability to design and deploy integrated systems that complement and bring to bear advanced software and learning algorithms.
- Deep cross-functional understanding of all levels of a robotic system, both hardware and software, with experience operating robots.
- Experience with classical motion planning and robotic control, and familiarity with machine-learned policies.
- Experience with inter-process and in-process communication, parallelism, logging, networking, and data systems.
- Background or familiarity with some of the following: motion control and actuation, whole-body control, robot teleoperation methods, common communication protocols, industrial/research robotic arms, visual perception and depth sensors, machine learning, robotic simulation, force and tactile sensing systems, haptic interfaces.

The pay range for this position at commencement of employment is expected to be between $201,600 and $289,800/year for California-based roles; however, base pay offered may vary depending on multiple individualized factors, including market location, job-related knowledge, skills, and experience. Note that TRI offers a generous benefits package (including 401(k) eligibility and various paid time off benefits, such as vacation, sick time, and parental leave) and an annual cash bonus structure. Details of participation in these benefit plans will be provided if an employee receives an offer of employment.

Please reference this Candidate Privacy Notice to inform you of the categories of personal information that we collect from individuals who inquire about and/or apply to work for Toyota Research Institute, Inc. or its subsidiaries, including Toyota A.I. Ventures GP, L.P., and the purposes for which we use such personal information.
TRI is fueled by a diverse and inclusive community of people with unique backgrounds, education and life experiences. We are dedicated to fostering an innovative and collaborative environment by living the values that are an essential part of our culture. We believe diversity makes us stronger and are proud to provide Equal Employment Opportunity for all, without regard to an applicant's race, color, creed, gender, gender identity or expression, sexual orientation, national origin, age, physical or mental disability, medical condition, religion, marital status, genetic information, veteran status, or any other status protected under federal, state or local laws.
Pursuant to the San Francisco Fair Chance Ordinance, we will consider qualified applicants with arrest and conviction records for employment.