When one thinks about human encounters with artificial intelligence (AI), the humanoid robot comes to mind, and with it a host of hopes and anxieties: can AI ever embody, or even override, the human mind and intuition? Can we teach AI to help us for good? Can AI machines establish real human connections? In August 2018, I collaborated with contemporary dance company RAW Moves to launch a "Movement Research Living Laboratory" under the title Alice, Bob & Eve. Its aim was to investigate the relationship between movement and new technologies: how the digital world choreographs our movements in daily life, whether or not we are aware of it, and to find ways to reflect artistically and critically on these issues.
We went out in search of applications to contextualise our exploration in the real world. It was not difficult to find cases where personality-driven humanoids and AI have already been developed to cater to human needs and wants. In particular, we noticed that most of the examples that reached mainstream attention centred on the provision of emotional care and labour by an AI robot or humanoid. Was this a sign that our communities and societies are so lacking in emotional care and support that we have become unavailable to each other, or have failed to communicate our emotional needs? Are we turning away from the seemingly disappointing and inconsistent fellow human, and instead reaching for AI as a more consistent and reliable substitute?
One of the applications we were drawn to was Akihiko Kondo’s public marriage to a hologram of Hatsune Miku, which stirred international interest and debate. Parallels were drawn to Spike Jonze’s 2013 film Her, in which the protagonist similarly falls in love with a personal operating system. We also discovered the Youper app, a “first-responder for emotional health problems for people all over the world and to ensure that no one has to wait years to address their issues”. Its intuitive and immediate responses were developed in collaboration between a psychiatrist, a software engineer, and a designer. As our team explored its UX (user experience), some of us remained disengaged in spite of the “live chat” feel, finding that the automatic responses could not make up for the need for human connection and care, whereas others felt the methodical process was well informed and that this single technology could benefit many. In Singapore, we also learned of a Robocoach developed by the Electrical Engineering department of Ngee Ann Polytechnic, designed to lead seniors in exercise classes and to provide encouragement.
In the months leading up to our final public presentation of the Movement Laboratory’s findings, SO-FAR supplemented our research with an opportunity to meet Dr. Marcelo Ang from the Advanced Robotics Centre at the National University of Singapore. I was joined by artist and frequent collaborator Urich Lau to ask Dr. Ang about roboticists’ developments and aspirations along this line of AI and emotional labour.
Teow Yue Han: I’m interested in understanding the movement of the robotic arms, and the process of calibrating them.
Marcelo Ang: This robotic arm is perhaps the most advanced in the world, because there is a force sensor in each of its joints, allowing the robot to have a sense of feeling. Most robots are position-control devices and don’t have that. Imagine moving your arm without feeling your arm — this is how most robots are designed to move. But this robot can feel its own joints, and it is the first truly collaborative robot meant for human environments. Let’s say it’s moving and it hits something. It won’t hurt you, because it feels that it has touched you. You can programme it to be soft or stiff, or to behave like it’s underwater, like when you’re swimming and cannot move very fast.
This is very important in the field of robotics. Robots now work well in factory automation. Factories are what we call “structured environments”, where the workpieces are exactly where they should be. The robots are supposed to pick something up and put something down, and they know where everything is, so they don’t need to look — they can just operate. That’s why they call it a “lights-out factory”. They can do things very fast because everything is precise. The robot knows all this because somebody has taught it by moving it one joint at a time, and it remembers the positions accurately.
TYH: Is that the most efficient way?
MA: Well, it works well so far.
The bigger impact is taking this robot out of the factory and into everyday life. That means into my home, into the shopping mall, the airport, or outdoor parks — anywhere except the factory. The difference is that these environments are “unstructured”, which means that the positions of things are not known. So that is the challenge: how to make the robot work in unstructured environments designed for the movement of human beings. Who is the shopping mall or your home designed for? People. Factories are not designed for people — they are designed for machines, and people are kept separate. We want the robot to adapt to an environment designed for human beings, rather than modifying the environment so that the machine can work.
A good example is a mobile robot, like a vehicle that moves around. In the early days, for these robots to move, you needed to have tape on the floor to guide them. But today, you don’t need that.
Another good example is a self-driving vehicle. As the car moves into live traffic, we want it to drive better. Most of the research is to make robots first sense and understand the environment, then know what to do from that understanding, plan, make decisions, execute them, and finally learn and improve. Robots pick up experience just as humans learn from experience. But these four things — sense, plan, act, learn — how do we do that? That is the challenge of artificial intelligence.
What’s also very important to achieve is hardware that is, if you will, soft and human-friendly. A typical industrial robot looks intimidating, right? And it’s very unsafe, because it doesn’t care. It doesn’t have force sensors and it doesn’t need to. So we need a new generation of robots called “soft robotics”, where the materials are soft, like sponges. Remember the children’s movie Big Hero 6? Baymax is a balloon creature that looks like a robot. Imagine a robot with limbs that are all made of soft balloons. It can move and help you, and it can feel things. That’s the hardware — the body. Then you have the software and AI, which are the control centre and communicative capabilities for the body. The body needs to be soft because you cannot sense, plan or execute with 100% accuracy, so you must be able to tolerate mistakes, and the softness should be enough that you don’t hurt people.
Moving on to the process of developing the machines: we try different things, like this “macro-mini framework”. A big robot is very rough and cannot exert small forces, so it is almost impossible for it to do minute tasks that require a lot of precision and gentleness. So what we do is buy a big robot, build a small, sensitive robotic hand, and combine the two.
I also know a German startup called Franka Emika that sells a robot called the Panda. It’s very advanced and has force sensing.
TYH: Is this considered haptics?
MA: Not exactly. Haptics is more about a sense of feeling. This is about force control, and feeling the pressure.
But in terms of haptic devices, we created a robotic joystick with force feedback. When you’re moving it, you can feel the cushion. If it touches a brick wall, it feels like a brick wall. And the feedback is programmable, so it’s very good for training. Let’s say you want to lift a bottle. Instead of just looking at how it moves, you can feel how to do the task and connect this to the robot. When I lift it, I feel the weight of the bottle: 1 kilogram. Or you can scale it to feel like 100 grams, because it’s all computed. It’s almost the same concept as tele-robotics. There is a surgical robot called da Vinci. It’s minimally invasive, and the surgeon operates two joysticks much like this one, just looking at the monitor.
Urich Lau: I’ve read that a surgeon can even be overseas.
MA: They are testing this because of the delay in response, and with 5G coming, that won’t be a problem. But even if you lose communications for a split second, the robot is still stable, so it just waits. It doesn’t do anything wrong or go crazy. That’s why any time you have a human-robot system, the robot must have some intelligence, so that the human doesn’t always need to talk to it.
The da Vinci system is already quite old, and if I tell you the truth, you will be really worried. Imagine a doctor operating just by looking at the screen, cutting the heart or some vital organ, but not feeling anything. There is no force feedback — he can only see the screen. But to take robots from the factory to everyday life, you need force feeling. Humans are not 100% accurate, and yet we can do many things quite accurately because we have feeling. So they are working on developing that.
In modern robotics labs like this one, we are working on how to make the robot useful for everyday life, so that it can be your domestic worker, to cook and clean up for you. It can be the ultimate slave, for lack of a better term. It also becomes your social companion, or even like your wife if you were to live alone.
There are many applications in the field of healthcare. We’re also collaborating with a Japanese start-up called WHILL to make an intelligent wheelchair with a camera and range sensors in it. Going back to the idea of the “ultimate slave”, we need mobility. You have to be able to tell it, “Go to the bathroom,” and it’s safe because it’s the computer driving it — not a human. But I also wanted to design it in a “cool” way, like a Personal Mobility Device (PMD), so that when people see an elder using it, they won’t pity them.
UL: There are now robots meant for old folks’ homes for —
MA: — Companionship, yes. They’re not only there to help you physically, but also just to talk. We already use the same technology with Siri or Alexa, asking them, “What’s showing on TV now?” But those devices are all just data. The ultimate achievement would be a robot that’s capable of moving, with an arm to do physical tasks, and with a level of intelligence to be your interactive companion. Do you remember the cartoon Teletubbies? Imagine if that were a model companion for your elderly mother or father. Instead of using a phone to call people, you use a Teletubby. You could tell it, “Hey, call my son Mark,” and Mark’s picture appears through a video call. And because it’s a robot with haptics, your son not only talks to you, he can gesture to you through the robot. You could even receive a remote kiss. And it’s not only for older people. Say you want a nanny for your kids? This robot can be your babysitter. It’s a big goal, but all our projects are leading to this ultimate thing.
UL: Like The Jetsons! Or like the Sony AIBO dog.
MA: There’s also a robot from Japan called PARO. It’s a robotic seal, like a cute stuffed toy for —
UL: For old folks’ homes.
MA: Yes, and what I really like about it is not only its intelligence — it’s also cute! It reacts to how you pinch it and handle it, and it can be random too. You charge it with something like a baby’s bottle. Takanori Shibata of the Japanese research agency AIST (National Institute of Advanced Industrial Science and Technology) built it. What’s interesting is how he conducted studies in Japanese hospitals, amongst people in the ICU who are not allowed visitors and become lonely. In the study, one group had PARO and a control group did not. After a few months, the recovery of those with PARO was statistically significantly higher. It makes sense, right? We underestimate the power of the mind to heal the body.
And of course, the highest aspiration is to reach cognitive capabilities. The problem is that the things humans do easily, robots are not yet ready to do. For example, picking up this object — you just do it, right? But if you want to teach a robot how to do that, how would you do it? You cannot just break it into steps. That’s my definition of intelligence: the ability to do something intuitively, without needing to break it down into step one, step two. The latter is no longer intelligence; it’s just following instructions.
My interest in AI is to teach robots to do the things that humans do so easily, because, ironically, those things are very hard for robots. And humans do them through learning — so I ask, how can robots learn? I’m also very interested in building an artificial world through simulation. If I can create a simulated environment that stands in for the world, then that world can be used to train the robot. How long does it take to learn how to drive a car? Maybe one or two years. To be a professional driver, maybe five to ten years. But we cannot wait that long, right? So if I have a simulated environment of the world and place the driverless car there, I can run the simulator fast-forwarded: ten times faster, so ten years becomes one year; or faster still, so a year becomes a minute. The key is how to build a world that is as realistic as ours.
The introduction to this dialogue was adapted from the “Research Guide for Laboratory Participants” that accompanied RAW Moves and Teow Yue Han’s performance Alice, Bob & Eve. The research guide was written and edited by Nah Dominic, with key concepts from Teow Yue Han. The final work was presented on 22-23 February 2019 at the School of the Arts (SOTA) Gallery. Photographs by Urich Lau.
Read more about this curious phenomenon here: https://www.straitstimes.com/asia/east-asia/ive-been-thinking-about-her-every-day-japanese-man-marries-hologram
The app can be accessed here: https://www.youper.ai/
Read more about this elderly programme here: https://www.ihis.com.sg/Latest_News/News_Article/Pages/Singapore_turns_to_robots_to_get_seniors_moving.aspx
Franka Emika has developed a soft-robot for complex automation: https://www.franka.de/technology
The da Vinci Surgical System was developed by a company called Intuitive Surgical, and approved by the Food and Drug Administration in 2000. It has since completed hundreds of thousands of surgeries worldwide, especially for hysterectomies, prostate removals and cardiac valve repair.
Have a look at Whill's Intelligent Personal Electronic Vehicles here: https://whill.us
Sony produced the AIBO series of robotic pets from 1999 to 2006, then discontinued them. A new generation of AIBO was reintroduced in 2018.
Designed by scientist and roboticist Dr. Takanori Shibata in 1993, the PARO is a robotic baby harp seal meant to provide calming therapy for hospital patients, especially those with dementia and autism.