    This Lip-Syncing Robot Face Could Help Future Bots Talk Like Us

    The slight unease that creeps up your spine when you see something that acts human but isn't human remains a big issue in robotics — especially for robots built to look and speak like us.

    That peculiar feeling is called the uncanny valley. One way roboticists work to bridge that valley is by matching a robot’s lip movements with its voice. On Wednesday, Columbia University announced research that delves into how a new wave of robot faces can speak more realistically.

    Hod Lipson, a Columbia engineering professor who worked on the research, told CNET that a main reason robots are “uncanny” is that they don’t move their lips the way we do when they talk. “We are aiming to solve this problem, which has been neglected in robotics,” Lipson said.


    This research comes as hype has been spiking around robots designed for use at home and work. At CES 2026 last week, for instance, CNET saw a range of robots designed to interact with people. Everything from the latest Boston Dynamics Atlas robot to household robots like those that fold laundry, and even a turtle-shaped bot designed for environmental research, made appearances at the world’s biggest tech show. If CES is any indication, 2026 could be a big year for consumer robotics.

    Chief among those are humanoid robots that come with bodies, faces and synthetic skin that mimics our own. The CES cohort included human-looking robots from Realbotix that could staff information booths or provide comfort to humans, as well as a robot from Lovense designed for relationships that’s outfitted with AI to “remember” intimate conversations.

    But a split-second mismatch between lip movement and speech can mean the difference between a machine that you can form an emotional attachment to and one that’s little more than an unsettling animatronic.

    So if people are going to accept humanoid robots «living» among us in everyday life, it’s probably better if they don’t make us mildly uncomfortable whenever they talk.

    Lip-syncing robots

    To make robots with human faces that speak like us, the robot’s lips must be carefully synced to the audio of its speech. The Columbia research team developed a technique that helps robot mouths move like ours do by focusing on how language sounds.

    First, the team built a humanoid robot face with a mouth that can talk — and sing — in a way that reduces the uncanny valley effect. The robot face, made with silicone skin, has magnet connectors for complex lip movements. This enables the face to form lip shapes that cover 24 consonants and 16 vowels.

    To match the lip movements with speech, they designed a “learning pipeline” to collect visual data from lip movements. An AI model uses this data for training, then generates reference points for motor commands. Next, a “facial action transformer” turns the motor commands into mouth motions that synchronize with audio.

    Using this framework, the robot face, called Emo, was able to “speak” in multiple languages, including languages that weren’t part of the training, such as French, Chinese and Arabic. The trick is that the framework analyzes the sounds of language, not the meaning behind the sound.

    “We avoided the language-specific problem by training a model that goes directly from audio to lip motion,” Lipson said. “There is no notion of language.”
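    To make that idea concrete, here is a minimal Python sketch of what an audio-to-lip-motion mapping can look like. It is not Columbia's published pipeline: the feature counts, the number of motors and the simple linear least-squares model are all assumptions chosen for brevity, standing in for the learned facial action transformer. What it illustrates is the point Lipson makes above: the model consumes acoustic features and emits motor commands, with no notion of words or language.

    import numpy as np

    rng = np.random.default_rng(0)

    N_FEATURES = 8   # acoustic features per audio frame (assumed, e.g. band energies)
    N_CONTEXT = 5    # audio frames of context per prediction (assumed)
    N_MOTORS = 4     # hypothetical lip/jaw actuators (jaw open, lip corners, pucker)

    # Toy "training data": windows of audio features paired with recorded motor
    # positions. In the real system these pairs would come from the learning
    # pipeline that films the face's own lip movements; here they are random
    # stand-ins so the sketch runs on its own.
    X = rng.normal(size=(1000, N_CONTEXT * N_FEATURES))
    Y = rng.normal(size=(1000, N_MOTORS))

    # Fit a simple linear map from audio context to motor commands (least squares),
    # a stand-in for the learned model the researchers describe.
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)

    def lip_commands(audio_window: np.ndarray) -> np.ndarray:
        """Map one window of acoustic features to motor commands squashed to [-1, 1]."""
        return np.tanh(audio_window.reshape(1, -1) @ W).ravel()

    # Streaming use: each incoming window of audio frames yields one set of commands.
    frames = rng.normal(size=(N_CONTEXT, N_FEATURES))
    print(lip_commands(frames))

    Because nothing in such a mapping depends on a vocabulary or a phoneme set, the same trained model can in principle be driven by speech in any language, which is why Emo could handle languages it was never trained on.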

    Why does a robot even need a face and lips?

    Humans have been working alongside robots for a long time, but those robots have always looked like machines, not people: the disembodied, very mechanical-looking arms on assembly lines, or the chunky disc of a robot vacuum scooting around our kitchen floors.

    However, as the AI language models behind chatbots have become more prevalent, tech companies are working hard to teach robots how to communicate with us using language in real time.

    There’s a whole field of study called human-robot interaction that examines how robots should coexist with humans, physically and socially. In 2024, a study out of Berlin that used 157 participants found that a robot’s ability to express empathy and emotion through verbal communication is critical for interacting effectively with humans. And another 2024 study from Italy found that active speech was important for collaboration between humans and robots when working on complex tasks like assembly.

    If we’re going to rely on robots at home and at work, we need to be able to converse with them like we do with each other. In the future, Lipson says, research with lip-syncing robots would be useful for any kind of humanoid robot that needs to interact with people.

    It’s also easy to imagine a future where humanoid robots are identical to us. Lipson says careful design could ensure that people understand they’re talking to a robot, not a person. One example would be requiring humanoid robots to have blue skin, Lipson says, “so that they cannot be mistaken for a human.”
