In The Air: Uncanny Valley

6th May 2014

There is much discussion about the integration of robots into our daily lives, especially with futurists such as Kevin Kelly predicting that robots will take over 70% of today's occupations in the coming decades. This is something we are already witnessing, as robots are notably favored for their accuracy and cost-effectiveness. More interesting still is the current shift in robots from functional to emotional beings. What started as a master-servant relationship has developed into an equal partnership, and in some cases into dependency. Employed as carers for the elderly or eyes for the disabled, the role of robots in our lives is changing so fast that we are left wondering what, in the future, will set us apart from augmented beings specifically programmed to replace or even better us. Currently robots function, but they do not appear able to experience or share empathy, which leads us to question what defines empathy in a digital age. Regardless of the answer, and regardless of whether we are optimistic or pessimistic about the digital age, the reality is that we are becoming ever more dependent on technology. Our interactions with each other and with technology are changing: human interaction is shifting to a screen-to-screen interface, and we live in a world of fluctuating ethics and social structures. With more and more investment fuelling the world of robotics, we need to begin asking what we will do with ourselves when robots get really good at what we do.

There is a new breed of robot with a sense of awareness. Perhaps not a sense of self, but this new breed is changing the way we connect with bots. Meet Baxter, for example, a robot that exhibits common sense and is capable of adapting to his task and his environment. Using force-feedback technology, Baxter is testament to the robotic evolution, displaying human characteristics such as courtesy and friendliness. Another fast-evolving bot is the Google car: proving it possible to automate behavior, the self-driving car is treading into territory originally reserved for humans alone. Driving requires a sophisticated set of skills - reading the road, thinking ahead and reacting in real time - and with Google’s car being deemed a better driver than humans, it opens up a new discussion about the definition of a machine or bot. ASIMO from Honda is also capable of decision-making, and, like Baxter, can navigate a room without colliding with anyone or anything. Pushing the boundaries further, the holy grail of the robotics world is to instill a sense of taste and smell in bots, which researchers believe to be a mere five years away. More than smell-o-vision, these advances will enable bots to sniff out possible health issues. IBM, meanwhile, has developed a cyberchef, a collaborative bot that helps to explore new flavors and design new dishes to help us eat more healthily. Alexander du Preez’s machine craft explores how man and machine will coexist in the future, creating an intimate relationship between the two. Robots are also beginning to create abstract and emotional art: BNJMN is a paintbrush-wielding bot designed by two students at the Basel Academy of Art and Design, who are questioning the difference between robotic and human art.

Robotic solutions are increasingly being applied to real-life problems, such as our aging population, in order to meet human needs. One such robot is Paro, a baby harp seal designed by Takanori Shibata of Japan’s Intelligent System Research Institute. Used in nursing homes as a companion and comforter for the elderly or infirm, Paro is hailed as the world’s most therapeutic robot, able to respond empathetically by purring when stroked or squealing when handled roughly. With robots that dry hair and administer drugs, machines are increasingly taking on tasks and replacing manual jobs. The annual Human-Robot Interaction Conference devoted its latest event to holistic human-robot development, exploring new ways to foster robotic interaction in everyday settings. MIT is also leading the way in artificial intelligence and social robots, as Cynthia Breazeal teaches the next generation of robots to be more humanlike. Her robots can sense human emotions, such as anger or surprise, respond to human touch and even make friends. She goes so far as to teach her robots to coexist with humans, designing gaming platforms where the two can play together. With the ability to shrug its shoulders and wink, her robot Tofu elicits an emotional response that opens up a new direction for artificial intelligence and bots.

In response to this ever-changing digital world, a slew of artists and designers are trying to replicate the hand of the machine, almost mirroring what bots are attempting to do by acquiring human traits. STSQ has developed a project called The Human Printer, which breaks down CMYK separation in order to create CMYK dots by hand. Also using the hand in an almost automated way is James Gulliver Hancock, who plans to draw all the buildings of New York by hand, while Richard Wright, with a little human help, hand-painted 47,000 stars on the ceiling of the Rijksmuseum. Blending technology and the manual hand, xy patterning addresses digital textures with a hands-on approach, as each graphic pattern is meticulously completed using vector-based software without the use of automation. Shapes and forms are put into repeat, but each work has subtle variations that can only come about through hand manipulation.

Robots are not only being designed to replace humans in the workplace, but also seem to be infiltrating our leisure time - doing things we enjoy but just don’t have the time to do. The Pareidolic Robot, for example, looks for faces in the clouds: described by its designer Neil Usher as a robot that aims to "improve the efficiency of our leisure time", it automatically scans clouds for faces. The Jazz Connect is a meter-tall robot designed to go places we cannot; advertised as being able to “attend the big board meeting you can’t make”, it allows us to be in two places at once. Similarly, Double Robotics, which taxis the iPad about the place, bills itself as “the most elegant way to be somewhere else in the world without flying there”. In its promotional video, it even suggests sending your iPad to visit galleries on your behalf, which you can then witness in real time from the comfort of your sofa. The MH-2, on the other hand, not only sees on your behalf but also acts as your friend: with two arms, a head and a body – as well as a mechanism for realistic breathing – it sits perched on the user’s shoulder, like a parrot, and can be remotely inhabited by your real-life friends from anywhere in the world, allowing others who cannot be with you to share a personal journey or experience.

Last year Amazon paid $775 million for Kiva Systems, an automated shelf-moving system designed to work alongside humans, while NET-A-PORTER recently introduced robots in all its UK warehouses. With a second wave of intelligence and automation, coupled with cheap sensors, machine learning and artificial cognition, automation will, for the first time, affect all jobs, from manual labor to knowledge work. The robotic bartender from MIT’s Senseable City Lab is a recent example of an automated service in a social setting, while US startup Momentum Machines has invented a robotic hamburger chef, designed not as a complement to the workforce but as a replacement, which the company believes will save the industry $9 billion per year. Price will be a key driver in the rise of the machines. Baxter, for example, who epitomizes this new breed of intelligent robots, is priced at $22,000, which, when set against an annual salary that incorporates holiday and sick pay, becomes a very viable option. And this is where it becomes interesting, as Baxter represents not just a new breed of robot but a new way of working. In Who Owns the Future?, Jaron Lanier pertinently poses the question of the role of humans in the workplace, stating: “At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the first digital camera. But today Kodak is bankrupt, and the new face of digital photography has become Instagram. When Instagram was sold to Facebook for a billion dollars in 2012, it employed only 13 people. Where did all those jobs disappear?”

Shifting away from the typical humanoids and pet-like devices, a number of researchers are exploring an animal kingdom of robots whose behavior is more akin to nature. Salamandra Robotica II, from the Biorobotics Laboratory at the École Polytechnique Fédérale de Lausanne in Switzerland, moves remarkably naturally and can swim, crawl and walk. Deemed unusual in the robotics world and designed for amphibious service, it and other animal robots may offer a glimpse into a dystopian animal world. Markus Fischer’s bird is not the first robot to fly, but it is the first to fly by flapping its wings, mimicking a real bird. Festo, the company behind the bird, has also developed a bionic dragonfly and an aquatic jellyfish that mimic real, natural traits.

Programmed and trained robots are getting smarter, but it is their future ability to think and make conscious decisions that will be the game changer. Thought Experiment has built supercomputer replicas of the human brain, while scientists are hinting at the possibility of implanting a SIM card into a robot, which in turn could make it conscious. In his TED talk last year, Miguel Nicolelis described an experiment in which a monkey in the US learned to control a monkey avatar and a robotic arm in Japan – purely with its thoughts. In a similar vein, a student in Israel has successfully controlled the movement of a robot in France just by thinking about it. Although these are all research programs at an early stage, they hint at a future in which man and machine merge and the robots of the future have a truly sentient existence.

There are clearly defined traits that set us apart from animals, such as our understanding of death and our sense of self. But what sets us apart from robots and artificial intelligence? Humanness is currently the divider, as it spans the whole spectrum of human experience, from love and hate to death, empathy, jealousy and the imagination; but as robots are programmed to feel, will they also develop the intangible senses that humans take for granted? Robots are not human, and there are subtleties they currently lack that they will need to acquire if they are to live alongside us seamlessly. As Kevin Kelly says in his article on the future of robotics, we need to shed our view of robots as humanlike and instead concentrate on the new wave of robots that are aware, courteous and have a newfound sense of self. And, admittedly, although robots are replacing us in the workplace, they are also liberating our busy schedules, allowing us to multitask like we’ve never done before.

Published July 2013 © Reproduced with kind permission of