Researchers develop generative AI that can realistically replicate human motion

Generative AI struggles to create realistic human motion, especially in challenging environments. Living organisms come up with a wide variety of solutions for movement, but generative AI is often trained on a small set of correct solutions. An international team of researchers has overcome these challenges to create an AI that generates realistic movements for humans across a range of scenarios.

Robots have always struggled to move like humans. (Image Credit: Tohoku University).

New Delhi: Conventional generative AI models are neither efficient nor effective at inferring realistic human movements, especially in unknown or challenging environments. Living organisms exhibit a wide range of motion with no single correct pattern to follow, and it is not always clear which solution is the best or most efficient. An international team of researchers has combined central pattern generators (CPGs) and deep reinforcement learning (DRL) to develop a novel approach to imitating human motion.

The method can generate walking and running motions, and also generates frequencies for situations where motion data is absent, allowing smooth transitions from walking to running and adaptation to environments with unstable surfaces. The intricacy and sheer range of possible movements make it notoriously challenging to reproduce human-like movement in robots. For example, scientists struggle to make a bipedal robot break down a door with a hatchet.

Deep Reinforcement Learning

DRL extends traditional reinforcement learning by using deep neural networks, which handle more complex tasks and learn directly from raw sensory inputs. This gives models more flexible and powerful learning capabilities. The disadvantage is the tremendous computational cost of exploring a vast input space, especially for systems with many degrees of freedom.
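
To make the idea concrete, here is a minimal sketch of a DRL-style loop: a policy reads a raw sensor vector and outputs joint torques, and exploration noise drives a REINFORCE-style update. The environment, reward, and dimensions are invented placeholders for illustration, not the setup used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented dimensions: a raw sensor vector maps to joint torque commands.
OBS_DIM, ACT_DIM, LR, SIGMA = 12, 4, 1e-2, 0.1

# A deep policy would stack nonlinear layers; one linear layer keeps the
# sketch short while preserving the structure of the learning loop.
W = rng.normal(scale=0.1, size=(ACT_DIM, OBS_DIM))

def rollout(steps=50):
    """Run the noisy policy in a placeholder environment and accumulate
    a REINFORCE-style gradient estimate from the exploration noise."""
    total, grad = 0.0, np.zeros_like(W)
    obs = rng.normal(size=OBS_DIM)
    for _ in range(steps):
        noise = rng.normal(scale=SIGMA, size=ACT_DIM)
        action = W @ obs + noise               # noisy torque command
        reward = -np.sum(action ** 2)          # stand-in task reward
        grad += reward * np.outer(noise, obs) / SIGMA ** 2
        total += reward
        obs = rng.normal(size=OBS_DIM)         # stand-in dynamics
    return total, grad / steps

for episode in range(200):
    ret, grad = rollout()
    W += LR * grad                             # ascend the return estimate
    if episode % 50 == 0:
        print(f"episode {episode:3d}  return {ret:8.3f}")
```

Even in this toy setting, the cost issue the researchers point to is visible: the gradient estimate is built from random exploration of the input space, so the number of samples needed grows quickly with the dimensionality of the system.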

Imitation learning

Another approach is imitation learning, in which a robot imitates motion measured from a human performing the same task. Imitation learning works well in stable environments, but it struggles to produce workable solutions when faced with new situations or environments that were not encountered during training. The narrow scope of its learned behaviours limits its ability to adapt and navigate effectively in novel surroundings.
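
The simplest form of imitation learning, behaviour cloning, reduces the problem to supervised regression, which is also where its brittleness comes from. The sketch below uses invented demonstration data and a linear policy purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented demonstration data: sensor states paired with the joint
# commands a human demonstrator produced in those states.
true_map = rng.normal(size=(12, 4))
states = rng.normal(size=(500, 12))
actions = states @ true_map + 0.01 * rng.normal(size=(500, 4))

# Behaviour cloning: fit a policy that reproduces the demonstrated
# action in each state, i.e. ordinary least-squares regression.
W, *_ = np.linalg.lstsq(states, actions, rcond=None)

def cloned_policy(state):
    return state @ W

# States resembling the demonstrations are handled well; a state far
# outside them (say, unseen terrain) yields an unreliable command.
print(cloned_policy(states[0]))           # in-distribution
print(cloned_policy(10.0 * np.ones(12)))  # extrapolation, no coverage
```

The fitted policy is only constrained where demonstrations exist, which mirrors the article's point: outside the training distribution there is nothing anchoring its output to a sensible movement.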

Central pattern generators

CPGs are neural circuits located in the spinal cord that generate rhythmic patterns of muscle activity. In animals, a reflex circuit works in tandem with the CPGs, providing the feedback that lets organisms adjust their speed and movements to suit the terrain. The researchers adapted the CPGs and their reflexive counterpart into an AI-CPG, improving the stability and adaptability of motion generation when imitating humans.
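
A textbook mathematical abstraction of a CPG is a pair of coupled phase oscillators. The sketch below, with invented parameters rather than values from the paper, shows how antiphase coupling produces the alternating left/right rhythm of gait, and marks where reflex feedback would enter.

```python
import numpy as np

# Two phase oscillators coupled in antiphase: a common abstraction of a
# spinal CPG driving alternating left/right limbs. Parameters are
# illustrative, not taken from the paper.
DT, STEPS = 0.01, 1000
omega = 2 * np.pi * 1.5        # intrinsic frequency, ~1.5 Hz stride
k = 4.0                        # coupling strength
phase = np.array([0.0, 0.1])   # slightly desynchronised start

left, right = [], []
for _ in range(STEPS):
    # The coupling term pulls the two phases half a cycle apart, like
    # alternating legs; reflex feedback would perturb omega or phase here.
    dphase = omega + k * np.sin(phase[::-1] - phase - np.pi)
    phase = phase + DT * dphase
    left.append(np.sin(phase[0]))   # rhythmic activation, left limb
    right.append(np.sin(phase[1]))  # settles into antiphase with left

# The oscillators lock half a cycle apart (ratio close to 1.0):
print("phase difference / pi:", ((phase[1] - phase[0]) % (2 * np.pi)) / np.pi)
```

In the researchers' AI-CPG, a learned network takes over the role played here by the hand-tuned frequency and coupling, which is what lets the generated rhythm adapt rather than stay fixed.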

A paper describing the findings has been published in IEEE Robotics and Automation Letters. One of the study authors, Mitsuhiro Hayashibe, says: “This breakthrough sets a new benchmark in generating human-like movement in robotics, with unprecedented environmental adaptation capability. Our method represents a significant step forward in the development of generative AI technologies for robot control, with potential applications across various industries.”


Author: Rayne Chancer