AI & Data Science
October 27, 2022

Evolving from cars to robots – Tesla’s AI Day 2022

Autonomous robots – not so different from autonomous cars, according to Tesla. Introducing Optimus – Tesla’s AI bot.

At the latest Tesla AI Day, held on September 30th, Tesla showcased their latest development progress, with much of the focus on their recent investments in the humanoid robot “Optimus”. The Tesla bot, an odd product for a typical car company, was announced at last year’s AI Day. However, Tesla is not a typical car company, and according to them, the research and investments that go into solving autonomous driving can be leveraged in autonomous robot development.

Tesla’s bot timeline – from concept to current generation.


Whether the company will be successful with robots remains to be seen. They have set targets of building a bot for the ordinary consumer at a price of about $20,000 and at a production scale of millions of units.

The bot’s design takes inspiration from human anatomy: it will weigh 73 kg and consume about 500 W while walking and 100 W while sitting idle. The current design is equipped with a 2.3 kWh battery intended to last a whole day, which is why much of the effort goes into designing efficient actuators and batteries. The hands, modeled after human hands but simplified, contain 6 actuators and have 11 degrees of freedom; they are able to lift 4.5 kg and to manipulate small parts with precision.
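As a rough sanity check on the all-day claim, those figures can be combined in a simple duty-cycle calculation. The sketch below is back-of-the-envelope arithmetic; the walking fractions are our own assumptions, not Tesla numbers:

# Endurance implied by the stated figures: a 2.3 kWh battery drained at
# ~500 W walking and ~100 W idle. The duty cycles are assumptions.
BATTERY_WH = 2300
P_WALK_W = 500
P_IDLE_W = 100

def endurance_hours(walk_fraction: float) -> float:
    """Runtime in hours for a given fraction of time spent walking."""
    avg_power = walk_fraction * P_WALK_W + (1 - walk_fraction) * P_IDLE_W
    return BATTERY_WH / avg_power

for frac in (0.0, 0.25, 0.5, 1.0):
    print(f"{frac:.0%} walking -> {endurance_hours(frac):.1f} h")

At a 25% walking duty cycle the average draw is 200 W, giving 2300 Wh / 200 W = 11.5 hours – roughly the working day the design aims for.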


A leg actuator lifting a 0.5-ton concert grand piano, matching human quadriceps strength when directly driven.


Reuse of infrastructure

One challenge for Tesla was to protect the robot if it falls. Crash simulation software used to model the dynamics of car collisions was repurposed to understand where the robot is most sensitive. It turns out that falling on the chest, where there are fewer gearboxes and actuators, is less likely to cause damage than the robot using its arms to protect itself. The same simulation software was also used to understand which parts take the most stress during normal activities, providing input to the design teams.
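To make that trade-off concrete, here is a toy illustration. This is not Tesla’s crash pipeline, and every number below (energy shares, part counts) is hypothetical:

# Toy comparison of two fall strategies by how much impact energy
# lands on regions packed with fragile parts. All numbers hypothetical.
FALL_STRATEGIES = {  # fraction of impact energy absorbed per body region
    "land_on_chest":   {"chest": 0.7, "arms": 0.1, "hips": 0.2},
    "brace_with_arms": {"chest": 0.1, "arms": 0.7, "hips": 0.2},
}
FRAGILE_PARTS = {"chest": 1, "arms": 6, "hips": 4}  # gearboxes/actuators per region

def damage_score(strategy: str) -> float:
    """Absorbed energy weighted by how many fragile parts sit in each region."""
    return sum(share * FRAGILE_PARTS[region]
               for region, share in FALL_STRATEGIES[strategy].items())

for name in FALL_STRATEGIES:
    print(name, round(damage_score(name), 2))  # the chest landing scores lower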

Tools that Autopilot uses in the cars for 3D occupancy understanding and scene segmentation were ported to the robot. This, along with high-frequency visual keypoints, gives the bot a reference system and input for trajectory planning during indoor navigation.
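A minimal sketch of how such an occupancy representation can feed indoor navigation is shown below. The grid and the breadth-first planner are illustrative stand-ins, not Tesla’s Autopilot stack:

# Shortest collision-free path on a 2D occupancy grid via breadth-first
# search. Real systems plan in 3D with costs; this only shows the idea.
from collections import deque

GRID = [  # 0 = free space, 1 = occupied
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 1, 0],
]

def plan_path(start, goal):
    """Return the shortest free path from start to goal as a list of cells."""
    rows, cols = len(GRID), len(GRID[0])
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:  # walk parent links backwards to rebuild the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols
                    and GRID[nr][nc] == 0 and nxt not in came_from):
                came_from[nxt] = cell
                queue.append(nxt)
    return None  # goal unreachable

print(plan_path((0, 0), (4, 0)))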


Combining AI/ML with traditional control systems

Visualization of trajectory planning, where each individual footstep is planned ahead of time and corrected using current state estimates.


Understanding the world around the robot is, however, not enough; the robot must also be able to move and interact with its environment. To do so, trajectory planning algorithms provide high-level planning down to the robot’s individual foot placements, while model-based control loops handle the real-world challenges posed by unexpected disturbances and noise.
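That division of labour can be sketched in a few lines: the planner commits to footstep targets ahead of time, and a simple feedback loop trims each step using the latest state estimate. The proportional gain and the slip disturbance below are illustrative assumptions, not Tesla parameters:

# Pre-planned footsteps executed with a simple proportional feedback trim.
import random

K_P = 0.8  # proportional gain of the feedback trim

def planned_footsteps(n_steps: int, step_length: float = 0.4) -> list[float]:
    """High-level plan: target x-position of every footstep, fixed in advance."""
    return [step_length * (i + 1) for i in range(n_steps)]

def walk(plan: list[float], step_length: float = 0.4) -> None:
    x, correction = 0.0, 0.0  # estimated stance position, feedback trim
    for target in plan:
        command = step_length + correction          # nominal step + feedback
        x += command + random.uniform(-0.05, 0.05)  # execution with disturbance
        error = target - x                          # from the state estimator
        correction = K_P * error                    # trims the next footstep
        print(f"target {target:.2f} m, actual {x:.2f} m, error {error:+.3f} m")

walk(planned_footsteps(5))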


Digital twin

At the heart of fast development lies a detailed model of the robot. From autonomous car development, Tesla understands the value of creating a digital model of the product and of providing real-data feedback loops to understand failure modes and improve simulations. They now approach robot design in the same manner and hope to be able to deliver a product within 3-5 years.
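One way to picture such a feedback loop is to run the twin next to real telemetry and flag where prediction and measurement diverge, pointing at failure modes and needed model improvements. The thermal model, data, and threshold below are all invented for illustration:

# Run a (deliberately simplistic) digital twin against measurements and
# flag large residuals. Model, telemetry and threshold are made up.
def twin_model(command_w: float) -> float:
    """Predicted joint temperature rise (degC) for a commanded power."""
    return 0.02 * command_w  # toy linear thermal model

MISMATCH_LIMIT = 1.5  # allowed prediction error in degC

telemetry = [  # (commanded power in W, measured temperature rise in degC)
    (100, 2.1), (200, 4.2), (300, 8.9), (400, 8.3),
]

for watts, measured in telemetry:
    predicted = twin_model(watts)
    flag = " <- investigate / refine model" if abs(measured - predicted) > MISMATCH_LIMIT else ""
    print(f"{watts:4d} W: predicted {predicted:.1f}, measured {measured:.1f}{flag}")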


Read our latest case here, where we use a Digital Twin to generate synthetic data to train a neural network to estimate something that cannot be measured directly: the friction between a car’s tires and the road.
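The idea can be sketched in miniature: generate synthetic slip/force data from a simple physics model, then fit an estimator that recovers the quantity that cannot be measured directly. The toy tire model and the least-squares fit below are our own stand-ins; the actual case uses a Digital Twin and a neural network:

# Synthetic-data workflow in miniature: a toy tire model labels
# slip/force samples with the true friction coefficient, and a
# least-squares fit recovers it from noisy observations.
import math, random

def tire_force(slip: float, mu: float, load_n: float = 4000.0) -> float:
    """Toy longitudinal tire model: force saturates toward mu * load."""
    return mu * load_n * (1.0 - math.exp(-12.0 * slip))

TRUE_MU = 0.55  # ground truth used to generate the synthetic data
samples = [(s / 100.0, tire_force(s / 100.0, TRUE_MU) + random.gauss(0.0, 50.0))
           for s in range(1, 30)]

def estimate_mu(data, load_n: float = 4000.0) -> float:
    """Closed-form least-squares estimate of mu from (slip, force) pairs."""
    g = lambda s: 1.0 - math.exp(-12.0 * s)
    num = sum(f * g(s) for s, f in data)
    den = load_n * sum(g(s) ** 2 for s, _ in data)
    return num / den

print(f"true mu {TRUE_MU}, estimated {estimate_mu(samples):.3f}")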

Author: Adam Jalkemo
