Tue. Nov 5th, 2024

Universal “brain”: MIT introduced a new way to teach robots

Universal "brain": MIT presented a new method for teaching robots

MIT has presented a new method for teaching robots that uses large amounts of data, similar to how large language models are trained, TechCrunch reports. Here are the details.

What Happened

The Massachusetts Institute of Technology (MIT) has unveiled a new model that helps train robots on massive amounts of data, much as large language models (LLMs) are trained.


The researchers note that traditional training methods, particularly imitation learning, break down when unexpected conditions arise, such as changed lighting or new obstacles. In such cases, the robots do not have enough data to adapt.

That is why the MIT scientists decided to draw on large amounts of data, as is done when training language models. They developed a new architecture called Heterogeneous Pretrained Transformers (HPT), which combines information from different sensors and environments so that robots can adapt better to changing conditions.
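The article does not include any code, but the general idea of feeding heterogeneous inputs into one shared model can be illustrated. The sketch below is a minimal, hypothetical PyTorch example, not MIT's HPT implementation: it assumes two made-up modalities (a camera feature vector and a proprioception vector), projects each into a common token space, and runs them through a single shared transformer trunk. All class names, dimensions, and the 7-dimensional action output are illustrative assumptions.

# Illustrative sketch only, not MIT's HPT code. It shows the idea described
# above: per-modality projections map heterogeneous sensor inputs into a
# shared token space processed by one transformer trunk.
import torch
import torch.nn as nn

class HeterogeneousEncoder(nn.Module):
    def __init__(self, dims, d_model=128):
        super().__init__()
        # One small projection ("tokenizer") per input modality.
        self.proj = nn.ModuleDict(
            {name: nn.Linear(d, d_model) for name, d in dims.items()}
        )
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.trunk = nn.TransformerEncoder(layer, num_layers=2)  # shared trunk
        self.head = nn.Linear(d_model, 7)  # assumed 7-DoF action output

    def forward(self, inputs):
        # Turn each modality into one token, concatenate, run the shared trunk.
        tokens = torch.stack(
            [self.proj[name](x) for name, x in inputs.items()], dim=1
        )
        encoded = self.trunk(tokens)
        return self.head(encoded.mean(dim=1))

# Hypothetical usage: a batch of 8 samples with two modalities of different sizes.
model = HeterogeneousEncoder({"camera": 512, "proprio": 16})
batch = {"camera": torch.randn(8, 512), "proprio": torch.randn(8, 16)}
print(model(batch).shape)  # torch.Size([8, 7])

The point of the design, as described in the article, is that data from very different robots and sensors can be pooled into one pretraining corpus instead of training each robot from scratch.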

What's Next

Associate Professor David Held emphasized that the goal of the research is to create a universal "brain" for robots that could be downloaded and used without additional training.


By Natasha Kumar

