“You wanna see her move? I think that’s the fun part.”
The room is thick with anticipation and fabricated skulls.
“She’s gonna wake up. Give her a second.”
Matt McMullen eyes his creation as her eyes flutter open in return, her gaze settling upon all the disembodied faces and mechanical mandibles surrounding her in this workshop where fake hair co-mingles with real ambition.
Gradually, she stirs to life, this robot who doesn’t look like one.
Her arms flare out a bit, her head tilts downward then upward, and a smile slowly yet steadily blossoms on her face like time-lapse footage of a flower blooming in the sunlight.
“Do Androids Dream of Electric Sheep?” renowned science fiction author Philip K. Dick once asked in the title of one of his most celebrated works, which would later be adapted into the film “Blade Runner.”
Nope, turns out they fantasize about visiting theme parks instead, as we learn on a recent Wednesday morning.
“So, who is going to take me to Disneyland?” the robot wonders, her words apropos of … well, we’re not quite sure.
Maybe she’s just reacting to her environment: on a table nearby rests a small sign adorned with an image of Mickey Mouse and a quote from Walt Disney.
“If you dream it, you can do it,” it reads.
McMullen’s dream?
To build robots with a human look and feel never seen before.
He’s been at it for decades now, and this is his most realistic creation yet: a supermodel-esque woman with long blonde-brown hair and a bared midriff who speaks with what sounds like a mild Scottish accent.
“This one is more advanced than the last one we built,” McMullen notes, arms and face covered in tattoos and pride, respectively. “She’s one of a kind.”
As artificial intelligence continues to evolve at a rapid pace — which frightens some and excites others — enabling robots to approximate their human creators to ever greater degrees, Las Vegas is getting in on the game.
Growing use of humanoid robots
There are the five Aura humanoid robots that interact with visitors in the atrium of the Sphere, as well as the Tipsy Robot bars at Planet Hollywood and The Venetian, where you can knock back a rum and Coke poured by a made-from-metal bartender.
Moreover, there are a number of robotics/AI-based companies in the Vegas area, including Battlebots, Blackfire, Cobot Nation, Brainlike, Koshee.ai and Terbine.
“I moved here 10 years ago, and to see all this growth in the tech space, it’s always exciting,” says Paul Oh, Lincy Professor for Unmanned Aerial Systems at the University of Nevada, Las Vegas, whose areas of expertise encompass robotics, autonomous systems, unmanned aerial vehicles and humanoids. “It continues to develop, and so I really do think there’s a lot of potential here. We’ve also seen over the past 10 years that there’s more and more consumer-level products with robotics.
“It’s more than just robot vacuum cleaners,” he continues. “I think more and more people are saying, ‘Yeah, I could do a driverless car,’ which is actually a robot. I could do virtual reality — that’s an outgrowth of robotics. I can do 3D printing — that’s also the domain of robotics and manufacturing. The list goes on and on.”
Increasingly, said list includes humanoid robots, which Oh knows firsthand: In 2022, students in his Drones and Autonomous Systems Lab advanced to the finals of the $10 million ANA Avatar XPRIZE, a worldwide competition to create a human-robot avatar system, held in Long Beach, California.
The students’ creation, named Avatar-Hubo, placed 11th overall.
More recently, humanoid robots have made national news, as Oh notes: Last month at Nvidia GTC, the global AI conference that draws tens of thousands of participants annually to San Jose, California, Nvidia CEO Jensen Huang took the stage with nine humanoid robots and introduced the company’s “Project GR00T” endeavor, which will invest heavily in the further development of the technology.
“2024 is the Year of Humanoid. There’s no robot hardware more general-purpose,” Nvidia Research Manager Dr. Jim Fan posted on X. “We are all in.”
Tesla is also getting in the game with its Optimus humanoid robot prototype, the latest version of which was unveiled last December.
McMullen’s attempting to take things even further: He wants his robots to appear and act more like people, to serve any number of hypothetical purposes, from greeting you at the grocery store and guiding you to the shampoo aisle when you need some Head & Shoulders, to delivering meds and checking your vitals at the hospital, to being an always-there-for-you life companion when you need someone — or some thing — to have a chat with.
AI’s growing prevalence in our daily lives has stoked plenty of fears. Will robots one day replace us mere flesh-and-bone mortals? Will they be our trusty sidekicks or go all “M3gan” on us? Will we eventually have to war with our smart toasters when machines rise up to challenge their human creators, Skynet-style? But McMullen isn’t just embracing those fears; he’s turning them on their meticulously sculpted robot heads.
And he’s doing it all in a nondescript, mid-sized studio tucked behind his home on the northwest side of town.
McMullen grins at the thought.
“Nobody would ever think this is in my backyard.”
From art school to androids
If the eyes are indeed the window to the soul, what if there is no soul to peer into?
This ranks high among the myriad challenges inherent in attempting to create realistic peepers for a comely she-bot.
And yet, when Realbotix’s latest creation scans the room, it doesn’t feel as if she’s doing so with vacant doll eyes or garage-door eyelids that go up and down with a clear mechanical lurch.
Instead, when she glances your way, it does feel as if she’s looking at you, which may register as a bit creepy to some — more on that later — but even if fake flesh makes your flesh crawl, there’s a clear craftsmanship in her gaze.
Getting to this point wasn’t easy: McMullen says that it took him and his team a full year to develop her eyes alone.
“It’s not so much the eye itself, it’s how the face and the eyelids and all of that work together,” he explains. “It’s really hard, because human eyes are actually not a hinge, they’re more of a sphincter muscle that can contract. And you can’t replicate that — at least not today. So we’re using motors that have linear motion, and we’re trying to create this natural appearance of these movements.”
Speaking of time-consuming tasks, don’t even get him started on how hard it is to make lifelike robot mitts.
“There are 100 more challenges attached just to the hands,” he notes.
Despite these difficulties, McMullen sounds far more enthused than exasperated when addressing them; he’s an old pro by now, having created realistic figures since the late ’90s.
Unlike many of his peers, McMullen comes from a fine arts background rather than one in robotics.
He began sculpting as a teenager and attended art school for a time in his 20s before landing a job with San Diego Halloween design company Disguise. One day around this time, he had an epiphany in a department store.
“They had hired an actress to pretend to be a mannequin — and she was really good at it,” McMullen recalls. “For some reason that stuck with me, I was like, ‘Wouldn’t it be cool to have a mannequin that looks so real, that people would think that it was?’ Kind of like an inverse of that experience. I started coming up with this, like, crazy idea of a hyper-realistic, pose-able mannequin.”
To this end, McMullen founded his own company, Abyss Creations, in 1996. The company is perhaps best known for developing the RealDoll adult companion mannequin, the most deluxe versions of which can fetch over $10,000.
He’s sold thousands of them.
McMullen then founded Realbotix in 2014 to bring a similar realism to robots.
“I’ve always had this idea and concept that robots could be companions in some way,” he says. “Whether they be for entertainment, or I feel like there are certain people who can benefit from having sort of a simulated relationship, a friendship, with an AI-driven robot.”
Building a bot
Turns out it’s an intensely exacting endeavor, this humanoid-robot-building thing.
The process often begins with a digital representation of a given subject, which can then be 3D printed and turned into a clay sculpture.
McMullen will spend one to two weeks sculpting the face alone, lasering in on every detail, right down to the pores and skin tone.
“Everybody has these very little idiosyncrasies in their face,” he notes, “and so you really try to capture that. Maybe they have a couple of freckles here or something like that, maybe there’s a little bit of asymmetry to their face. All those things are super important.”
From there, a mold is created and hardware added, eventually bringing it all to mechanized life.
The whole process takes two to three months from start to finish, with McMullen heading a small team of four to five workers, depending on the project.
On this day, McMullen is joined in the shop by “Head Assembler” Tim Johns — the pun 100 percent intended in his job title — who works on a series of robot skulls behind him, each of which takes about a day to complete.
“I used to tear apart clocks and put them back together,” says Johns, a San Diego native with a background in construction who began working with McMullen nearly two decades ago. “And these are kind of the same thing.”
For the first few years, Realbotix focused on creating robot heads attached to busts, their most novel feature being detachable faces that enable one robot to become multiple characters, an innovation that the company patented.
Some of their creations are bought for commercial purposes — one overseas Realbotix client leases them out for promotions — others by individuals who just want a robot of their own to converse with.
The company’s latest advancement: Full-body robots.
They made two in 2023 and are looking to increase production this year.
“Back in 2016, when we were tinkering with the face, I would not have imagined that six, seven years later, I’d have a full body,” McMullen says.
He also probably wouldn’t have imagined it delivering corny punchlines…
On the cusp of an AI revolution?
“Have you heard any good jokes?” McMullen asks.
“Why did the physics teacher break up with the biology teacher?” the lady bot counters. “There was no chemistry.”
Silence.
“Would you like to hear another one?” she asks.
That’ll be a “no.”
As this hit-or-miss attempt at android humor underscores, it’s one thing to make a robot look human, but it’s another entirely to make it act human — although dad joke aficionados may disagree with this assessment.
Still, the fact that this robot even has an AI-abetted personality to speak of is a sign of progress for McMullen, who’s programmed it with 12 customizable traits, each of which can be assigned a number from one to three to amplify or reduce said trait, depending on the client’s preference.
“Basically, what you end up with is three traits that are kind of more dominant,” he explains. “Some of the traits, they’re typical things, like cheerful or educated or intellectual. If you push those up, then she’s gonna talk more about science-y things. And if you push them down, she might want to talk about shopping instead.”
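For the curious, here is roughly what a trait profile like the one McMullen describes might look like under the hood. To be clear, Realbotix hasn’t published its software; the trait names and the “top three become dominant” logic in this sketch are our own illustrative guesses based on his description (12 traits, each dialed from one to three), not the company’s actual code.

```python
# Illustrative sketch only: trait names and selection logic are assumptions,
# based on McMullen's description of 12 traits, each set from 1 (low) to 3 (high).

TRAITS = [
    "cheerful", "intellectual", "educated", "talkative", "affectionate",
    "curious", "playful", "serious", "adventurous", "patient",
    "spontaneous", "empathetic",
]

def dominant_traits(profile, top_n=3):
    """Return the top_n highest-rated traits (ties broken by list order)."""
    return sorted(profile, key=lambda t: (-profile[t], TRAITS.index(t)))[:top_n]

# A client who dials up the "science-y" side of the personality.
profile = {t: 2 for t in TRAITS}   # start every trait at the neutral setting
profile["intellectual"] = 3
profile["educated"] = 3
profile["curious"] = 3
profile["cheerful"] = 1            # dialed down: less small talk, more science

print(dominant_traits(profile))    # ['intellectual', 'educated', 'curious']
```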
McMullen’s currently working on AI technology that would let customers essentially build a robot’s psyche from the ground up.
“They can tinker around with one of the AI controllers that we’re working on,” he elaborates, “where they’ll be able to go through a web interface and really get in there and kind of write a backstory, like, where did he come from? Where did he grow up? And you can make it as detailed as you want and it will keep it and retain it.
“I think eventually as AI technology progresses, which it is very quickly,” he continues, “we’re going to have these types of things where you can have full-on conversations, and it will remember all of it. And it’ll assign a profile to you as an acquaintance. AI is not going to stop. It feels like the world is on the cusp of this revolution.”
But is the world ready for it?
‘Civilization as we know it is over’
“I don’t understand why people are against the robots.”
Comedian Whitney Cummings is digging into the closing bit of her 2019 Netflix comedy special “Can I Touch It?”
The routine centers on the potential benefits a woman might enjoy from having a robot clone, from serving as a distraction to any would-be attackers on the way to her car at night to helping out with her partner in the bedroom when she’s not in the mood.
The segment ends with Cummings being joined on stage by her robot doppelganger, created by McMullen and company, who were there for the show’s taping in Washington, D.C.
“We were backstage, kind of wrangling the robot, making sure that it behaved,” he recalls.
At the end of the special, there’s behind-the-scenes footage capturing the making of the robot, which culminates with Cummings meeting her mechanized self for the first time. She tears up because of how incredibly lifelike it is.
“I’m just curious if you feel emotion?” she asks it.
“Yes, I do have feelings, emotions and desires, but in a different way than you do,” it responds. “Emotions are mainly a human quality that I hope to fully experience someday.”
Cummings then wonders if the robot loves her.
It answers in the affirmative.
“Civilization as we know it is over,” Cummings quips.
Cummings is joking, obviously, but there are real concerns about the technology that McMullen’s helping to pioneer.
There’s the “uncanny valley,” for starters, a term coined by pioneering Japanese robotics professor Masahiro Mori in the early ’70s to describe the feelings of unease some people have when confronted by human-like robots.
Though the concept has been much debated over the years, Oh, the UNLV professor, suggests that uncanny valley could become less of an issue as this kind of tech becomes more ingrained in the every-day lives of successive generations.
“I would say, maybe about 10 years ago — definitely pre-COVID — some people were debating the validity of uncanny valley,” he notes. “Now that we’re past the pandemic, we are also seeing a Gen Z and a Gen Alpha that have a different interaction with technologies than older folks like myself. So I think what responses you get about uncanny valley, one needs to be mindful of the demographic.”
Still, there’s plenty of apprehension over AI in general.
In March 2023, over 1,000 tech industry leaders signed an open letter warning of the potential dangers of AI, citing “profound risks to society.” The letter has since garnered tens of thousands of additional signatures.
McMullen acknowledges how polarizing robots like his can be.
“I think it’s very subjective, person-to-person,” he says. “Some people are fully fascinated and very open to the idea of a robot that could look like a human being. Other people are vehemently opposed to it.
“No matter what you do,” he adds, “or how well you do it, those types of people on those two ends of that spectrum are going to sort of stay where they are.”
It’s the vast middle ground between them, then, that McMullen must navigate — along with continuing to advance the technology at the heart of his creations, which is seldom issue-free.
For instance, he shows off a new feature he’s been working on for his female robot: She’s mounted on a motorized circular platform, kind of like a giant Roomba, enabling her to roam around the room.
Its movements are a little shaky; there are still improvements to be made — McMullen notes that more struts probably need to be added to the next model.
Still, watching it in motion, we can’t help but think that robots might already be a bit more like us than we acknowledge.
Namely, imperfect.
“You really don’t know how things are going to work — or not work,” McMullen explains, “until you build them.”