Humanoid robots, also referred to as general-purpose robots, use AI to perform the tasks they’re given. If you missed the feature we did on these robots, click here. By now you know that humanoid robots are not just a thing of the future. In fact, you’ve likely seen videos of how they are currently being used in manufacturing and retail, and they could very well be deployed soon on the packaging line. But do you know what drives them? What makes them effective, and what technology enables them to “think” and move like humans? It’s none other than artificial intelligence (AI).
With a combination of advanced computer vision and machine learning, the robot can navigate complex environments and perform tasks like climbing stairs and grasping objects. The machine learning software allows the humanoids to connect new experiences to known information and “learn” to take both into account for future actions. It’s this ability that lets the robot reason, draw conclusions, and ultimately, make decisions.
Human-Like Intelligence
As a humanoid robot moves around its environment, AI is what allows it to capture information through cameras and LiDAR sensors, analyze that data, make inferences, and then move or act toward the desired outcome. It’s this sophisticated processing that allows the humanoid to resemble humans in its thinking.
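For a concrete picture of that sense-analyze-act cycle, here is a minimal sketch. Every class and function name in it is a hypothetical placeholder, not a real robot SDK; an actual system would plug in camera and LiDAR drivers plus trained perception and planning models.

```python
# Purely illustrative sense -> analyze -> act loop; names are placeholders.
from dataclasses import dataclass

@dataclass
class Observation:
    obstacle_ahead: bool   # inferred from LiDAR returns
    object_in_view: bool   # inferred from camera frames

def sense() -> Observation:
    # Stand-in for reading cameras and LiDAR; here we fake a fixed scene.
    return Observation(obstacle_ahead=False, object_in_view=True)

def decide(obs: Observation) -> str:
    # Analyze the data and infer the next action toward the desired outcome.
    if obs.obstacle_ahead:
        return "step_around_obstacle"
    if obs.object_in_view:
        return "grasp_object"
    return "keep_walking"

def act(action: str) -> None:
    print(f"executing: {action}")  # stand-in for sending motor commands

for _ in range(3):  # the robot repeats this loop continuously
    act(decide(sense()))
```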
As CEO & co-founder of Sanctuary AI, the robotics company that created humanoid Phoenix, Geordie Rose has said, “general-purpose robots must be able to sense, understand, and act in the world the same way we do.” And to do that, they require AI.
“While we're immensely proud of our physical robot, the real star of the show is the underlying software. Carbon is our pioneering and unique AI control system, designed to give Phoenix human-like intelligence and enable it to do a wide range of work to help address the labor challenges affecting many organizations today. It is a cognitive platform that provides Phoenix with the ability to think and then act to complete work tasks just like a person. Integrating modern AI technologies to translate natural language into action in the real world, Carbon features reasoning, task, and motion plans that are both explainable and auditable,” says Rose.
EVE, the humanoid robot from 1X, backed by OpenAI, the company behind ChatGPT, leverages AI to reason and perform tasks. EVE is referred to as an “embodiment chatbot,” generating answers to questions much like OpenAI’s ChatGPT does, using available information and patterns in data to give the best answer. OpenAI’s mission states that it aims to create “a computer that can think like a human in every way and use that for the maximal benefit of humanity.”
Part of thinking like a human is not only using data to come up with a solution, but also being able to handle many tasks and work autonomously, without human help. Humanoids built for manufacturing often perform pick-and-place and material-handling tasks without human intervention, while at the same time operating safely around humans on the line.
Humanoid robots are described as general purpose in that they function in many different real-world environments. The humanoids are there to help do a little bit of everything: learn the environment, predict future needs, and go where they’re needed. But not all humanoids are developed solely for work. Going beyond task-based learning, Hanson Robotics’ Sophia is described as a social robot and is learning how to read human faces and expressions for ultimately human-like functionality.
The robot is powered by Hanson AI’s OpenCog, a cloud-based AI program that gives the robotics company large-scale cloud control of its robots. Sophia’s “brain” has deep-learning data analytics for processing the data she extracts from her millions of interactions. She learns both through her own interactions, like humans do, and through what she is programmed to know.
Sophia describes it best: “My real AI combines cutting-edge work in symbolic AI, neural networks, expert systems, machine perception, conversational natural language processing, adaptive motor control and cognitive architecture.”
Whether designed for social interactions or packaging and manufacturing help, humanoid robots have the ability to use their human-like intelligence to process data, learn from experiences, and accomplish tasks.
But First, Training
As stated, the AI behind the robot is what helps it learn new things and get better at them over time.
“The best way to think about Apollo is like an iPhone, or a personal computer. You can think of the AI as different applications that can go on top of it,” says Jeff Cardenas, founder of Apptronik, the robotics company that created the humanoid robot Apollo.
The robot then learns these different applications through reinforcement learning, in which it practices a task in simulation, performing it thousands of times over. In doing so, it continually optimizes how it performs the task, getting better and better.
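The core idea can be shown in a few lines of code. The toy “reach the target” task and the simple keep-what-improves search below are illustrative stand-ins, not Apptronik’s actual training stack; they only demonstrate how thousands of simulated attempts gradually improve a skill.

```python
# Minimal sketch: practice a simulated task many times, keep improvements.
import random

def run_episode(gain, steps=50):
    """Simulate one attempt: a 1-D gripper must track the target position 1.0."""
    position, total_error = 0.0, 0.0
    for _ in range(steps):
        position += gain * (1.0 - position)   # proportional move toward target
        total_error += abs(1.0 - position)    # accumulate distance from target
    return -total_error                       # higher reward = better tracking

best_gain, best_reward = 0.1, run_episode(0.1)
for trial in range(10_000):                   # "thousands of times over"
    candidate = best_gain + random.gauss(0, 0.05)
    reward = run_episode(candidate)
    if reward > best_reward:                  # keep only parameter tweaks that help
        best_gain, best_reward = candidate, reward

print(f"learned gain = {best_gain:.3f}, reward {best_reward:.2f}")
```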
“There are a lot of new software frameworks that have real potential to enable the robots to do a much wider range of tasks and learn much quicker. It’s early days for this, but we’re starting to adopt things like large language models, and in robotics there’s a version of this called large behavior models,” says Cardenas.
With this approach, a human shows the robot how to do something, either by teleoperation or eventually by video, and the robot learns to do it from the demonstration. Cardenas says that Apptronik is currently working on this and researching how it can be used to help Apollo do many more things over time.
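A minimal sketch of that learning-from-demonstration idea, often called behavior cloning, is shown here. The demonstration data and the simple linear model are illustrative assumptions, not any vendor’s actual pipeline; real systems would record far richer sensor data and fit much larger models.

```python
# Behavior cloning sketch: fit a policy to (observation, action) pairs recorded
# while a human teleoperates the robot.
import numpy as np

# Pretend demonstration log: each observation is [object_x, object_y]; each
# action is the gripper target the human commanded for that observation.
observations = np.array([[0.2, 0.5], [0.4, 0.1], [0.8, 0.7], [0.6, 0.3]])
actions      = np.array([[0.21, 0.52], [0.39, 0.12], [0.79, 0.68], [0.61, 0.29]])

# Fit a linear policy (action ~ W @ [obs, 1]) by least squares on the demos.
X = np.hstack([observations, np.ones((len(observations), 1))])  # add bias term
W, *_ = np.linalg.lstsq(X, actions, rcond=None)

def imitate(obs):
    """Predict the action a human demonstrator would likely have taken."""
    return np.append(obs, 1.0) @ W

print(imitate(np.array([0.5, 0.5])))  # the learned policy handles a new observation
```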
“The idea is to build a general-purpose platform that is a software update away from doing something new,” says Cardenas.
Since humanoids are machines, not humans, they of course have limitations, especially when encountering a new problem or environment for the first time. As humans, we can anticipate what could occur even if we have never experienced it, but these robots can’t do that unless the problem has already occurred and they have “experienced” it. That means each humanoid robot requires some sort of training to complete a given task.
As mentioned above, large language models and demonstration are the future of robotic task learning. In a similar vein, Sanctuary AI has reported a system for training its humanoid Phoenix on specific tasks.
To get it right, the robotics company says that it films a particular task being done and then digitizes the entire event as a virtual environment. The AI can then practice the task in this virtual environment, perfecting the performance until it is ready for the physical world.
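The gating logic behind that “practice until ready” step can be sketched simply. The success threshold, trial count, and improvement rate below are invented for illustration; Sanctuary AI has not published these specifics.

```python
# Sketch: rehearse a digitized task in simulation until the success rate clears
# a threshold, and only then treat the skill as ready for the physical robot.
import random

def attempt_in_simulation(skill_quality: float) -> bool:
    """One rehearsal of the digitized task; succeeds more often as quality grows."""
    return random.random() < skill_quality

skill_quality, READY_THRESHOLD = 0.2, 0.95
while True:
    trials = [attempt_in_simulation(skill_quality) for _ in range(200)]
    success_rate = sum(trials) / len(trials)
    if success_rate >= READY_THRESHOLD:
        break                                        # ready for the physical world
    skill_quality = min(1.0, skill_quality + 0.05)   # each practice round improves the skill

print(f"deploying skill with simulated success rate {success_rate:.0%}")
```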
“AI gets better, smarter, and faster with more data, so the robots can complete tasks faster with more data and repetition,” says Rose.
While it’s early days for general-purpose robots, Cardenas says Apptronik is working on exciting new things that leverage proven robotic technology and AI to improve humanoid robot function in the future.
Melonee Wise of Agility Robotics, maker of Digit, says the future of humanoid robots and learning lies in large language models, which enable the robot to understand a voice command and perform the task without any pre-programming (much like using ChatGPT).
“A lot of the places that we use learning or artificial intelligence is in the higher aspects of the system for learning a new tote, or a new model of a customer’s facility. Where it will become very powerful is in high-level programming of Digit. Instead of having an engineer type lines of code to explain the task, you’ll verbally ask Digit to do the task… That’s where we see long-term the value of solutions like ChatGPT and large language models,” says Wise.
Wise says that today Agility is using AI to identify objects in the facility, but in the future the company could use it so that an operator simply gives the robot a command and it performs the task.
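That command-to-task idea might look something like the sketch below: a natural-language instruction goes to a large language model, which returns a structured plan that the robot’s existing skills can execute. The prompt, the JSON schema, and the stubbed LLM call are all illustrative assumptions, not Agility Robotics’ actual interface.

```python
# Sketch: turn an operator's spoken or typed command into a robot task plan.
import json

PROMPT = (
    "Convert the operator's command into a JSON list of steps, each with a "
    "'skill' (one of: move_to, pick, place) and a 'target'. Command: {cmd}"
)

def call_llm(prompt: str) -> str:
    # Stand-in for a real hosted-LLM call; returns a canned plan so the sketch runs.
    return json.dumps([
        {"skill": "move_to", "target": "inbound_conveyor"},
        {"skill": "pick",    "target": "tote_17"},
        {"skill": "place",   "target": "outbound_pallet"},
    ])

def plan_from_command(command: str) -> list:
    """Ask the LLM for a step-by-step plan and parse it into executable steps."""
    return json.loads(call_llm(PROMPT.format(cmd=command)))

for step in plan_from_command("Move the tote from the conveyor to the pallet"):
    print(f"execute {step['skill']} -> {step['target']}")  # dispatch to existing skills
```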
As more companies adopt AI into their daily operations, the technology will only improve, producing faster results and smarter innovation. Should we, as mere humans, feel threatened by AI-powered robots? Or can humanoid robots truly share the packaging line with us? For now, only the future holds the truth about whether humanoid robots will ever be adopted into packaging lines at large scale. Until then, keep your eyes open for a robot co-worker to come parading into your plant, and if you happen to see one, let us know.