From Roombas to Rosie – Engineering Domestic Robots
Proving that our distaste for household chores knows no bounds, people have been dreaming about robots that can do our housework for more than half a century.
From Rosie the Robot, who served the Jetsons in the 1960s, to Bender in Futurama, domestic robots have long been a mainstay of science fiction. Until recently, such autonomous and interactive robots were confined to the realm of imagination.
The majority of domestic robots that have made their way into our homes so far are relatively “mindless” creations designed to perform a single function. The most ubiquitous example is the Roomba robotic vacuum cleaner, which has been patrolling floors around the world since 2002. Joining Roomba has been an array of one-hit wonders designed to do things like mop floors, fold our clothes, clean the cat litter box, mow our lawns and clean our swimming pools.
More recently, advancements in artificial intelligence (AI) have led to the creation of more intelligent domestic robots that have the capacity to do multiple tasks and learn as they go. But there are still plenty of engineering hurdles to overcome on the way to creating a true Rosie: a robot that can do all your household chores, take care of the kids and even crack a joke to cheer you up when you’re having a bad day.
A lot of progress has been made, and some fairly clever robots are headed for our homes in the next few years, but the road to a true personal domestic robot is still a long one.
Human-Machine Interactions – Developing People Skills
It’s hard enough for us to understand each other, so making a machine that can navigate the complexities of human interaction is no easy task. However, if you want a robot that can not only take directions but anticipate your needs as well, enhancing human-machine interactions is a necessity.
The rise of brain-computer interfaces (BCIs) is granting us the ability to give instructions to machines in new ways, but the path forward for domestic robots calls for machines that can understand our needs and interact with us using natural language.
We can already control machines to some extent using voice commands, as evidenced by the speech recognition capacities of programs like Apple’s Siri and Amazon’s Alexa. But, as the IEEE points out, proper autonomous robots need to go one step further, to the point where they can understand the nuances of human behavior and establish meaningful connections the same way we do with each other.
In other words, domestic robots (at least the kind you’d be willing to trust with your kids) need to have empathy. Researchers are hard at work on natural language processing and human-machine interactions—with some interesting results already—but the technology still has a ways to go.
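To get a feel for the gap, consider a minimal keyword-to-intent sketch of the kind of matching today's voice interfaces roughly perform. The intents and phrases below are purely hypothetical and do not reflect any particular assistant's API: the sketch can map “clean the floor” to a vacuuming task, but a remark like “I've had a rough day” falls straight through.

```python
# A minimal sketch of keyword-based intent matching -- roughly the level at which
# today's voice assistants map a spoken phrase to a canned action. The intents
# and keywords are hypothetical, for illustration only.
INTENTS = {
    "vacuum": ["vacuum", "clean the floor", "sweep"],
    "mop": ["mop", "wash the floor"],
    "laundry": ["fold", "laundry", "clothes"],
}

def match_intent(utterance: str) -> str | None:
    """Return the first intent whose keywords appear in the utterance."""
    text = utterance.lower()
    for intent, keywords in INTENTS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return None  # no match -- and no notion of mood, context or empathy

print(match_intent("Could you clean the floor in the kitchen?"))  # -> "vacuum"
print(match_intent("I've had a rough day"))                       # -> None
```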
Navigating Human Environments
As researchers at Stanford University have pointed out, today’s robots perform best when doing repetitive jobs like grasping and moving objects. Moreover, controlled environments like factories are well-suited to robotic automation.
However, as anyone who has had young children can attest, a household environment tends to be about as far from controlled as you can get. Our homes involve far too many variables to preprogram a robot that can deal with them all. These include people and possibly pets moving around in spaces that are optimized for humans, not robots.
Add to that the fact that the environment can change without notice—for example, when remodelling—and it’s clear that successful domestic robots will need to be highly adaptive. The Stanford paper breaks the challenge of navigating a human environment down into five categories: perception, learning, working with people, platform design and control.
Researchers and engineers around the world are currently working on projects designed to overcome each of these individual challenges, but the ultimate challenge lies in finding a way to integrate the approaches into functional systems that will work for robots operating in the real world.
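As a rough illustration of what that integration means in practice, here is a skeletal sense-plan-act loop in Python. Every class, method and value in it is a placeholder assumption standing in for a real perception stack, planner and motor controller, not code from any actual robotics framework.

```python
# A skeletal sense-plan-act loop, sketching how separate research threads --
# perception, learning, planning and low-level control -- might be wired into
# one running system. All names here are illustrative placeholders.
import time


class Perception:
    def sense(self) -> dict:
        # In a real robot: fuse camera, LiDAR and inertial data into a world model.
        return {"obstacles": [], "people": [], "goal": (2.0, 3.0)}


class Planner:
    def plan(self, world: dict) -> list[tuple[float, float]]:
        # In a real robot: replan a path that respects moving people and pets.
        return [world["goal"]]


class Controller:
    def act(self, waypoint: tuple[float, float]) -> None:
        # In a real robot: convert the next waypoint into wheel or joint commands.
        print(f"driving toward {waypoint}")


def control_loop(hz: float = 10.0) -> None:
    perception, planner, controller = Perception(), Planner(), Controller()
    period = 1.0 / hz
    for _ in range(3):  # a real robot would loop until shut down
        world = perception.sense()   # perceive the constantly changing home
        path = planner.plan(world)   # adapt the plan to what was just seen
        controller.act(path[0])      # execute only the next small step
        time.sleep(period)


control_loop()
```

The point of the skeleton is the shape of the loop: each cycle re-perceives the environment, so a person or pet wandering into the room changes the plan on the very next iteration.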
Reducing Sensor Costs
Just as buying a home computer in the ‘80s wasn’t financially practical for most people, a key challenge to any up-and-coming modern technology is cost. Although the costs may have come down in the last few years, many of today’s robots still aren’t cheap.
One of the reasons for this is that in order to successfully navigate its environment, a robot needs a whole array of sensors that are currently expensive to manufacture. Microelectromechanical systems (MEMS) technology has brought down the cost of inertial sensors in recent years, but other sensing technologies like LiDAR are still fairly expensive.
Bringing domestic robots into the average home means bringing those costs down. Fortunately, recent interest in autonomous vehicle technology has spurred electronics manufacturers to find ways to produce LiDAR systems at lower costs. LeddarTech, for example, has developed a proprietary solid-state LiDAR technology to cope with navigating densely populated urban areas. This could help reduce the cost of LiDAR, but that alone may not be sufficient to make domestic robots a more affordable proposition.
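To make the role of these sensors concrete, here is a toy Python sketch that reads a single synthetic 2-D LiDAR sweep, finds the nearest return in front of the robot and decides whether to keep driving. Real sensor drivers, coordinate conventions and safety logic are far more involved; everything here is an illustrative assumption.

```python
# A toy example of what a range sensor buys you: given one 2-D LiDAR sweep
# (angle in degrees, range in metres), find the nearest return in front of the
# robot and decide whether it is safe to keep moving. The scan is synthetic.
SAFE_DISTANCE_M = 0.5  # stop if anything is closer than this

def nearest_obstacle(scan: list[tuple[float, float]]) -> tuple[float, float]:
    """Return (angle_deg, range_m) of the closest return within +/-60 degrees."""
    forward = [(a, r) for a, r in scan if -60.0 <= a <= 60.0 and r > 0.0]
    return min(forward, key=lambda ar: ar[1])

def motion_command(scan: list[tuple[float, float]]) -> str:
    angle, rng = nearest_obstacle(scan)
    if rng < SAFE_DISTANCE_M:
        return "stop"
    # steer gently away from the nearest obstacle, based on its bearing
    return f"forward, steer {'right' if angle > 0 else 'left'} {abs(angle):.0f} deg"

# One fake sweep: mostly clear, with a chair leg 0.8 m away at 20 degrees.
fake_scan = [(float(a), 3.0) for a in range(-90, 91, 10)]
fake_scan[11] = (20.0, 0.8)
print(motion_command(fake_scan))  # -> "forward, steer right 20 deg"
```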
Domestic Robots Today
While our very own Rosie the Robot might still be out of reach from both engineering and financial perspectives, autonomous helper robots are starting to become a reality.
Take Zenbo by ASUS, for example. Announced in spring of 2016 with a price tag of $599 USD, Zenbo is branded as “your smart little companion” by its creators.
The company provided little in the way of technical details when the robot was announced, but judging from the demonstration, it packs at least one camera, speakers, a microphone and some form of wireless connectivity.