This is a post in our series on speakers at Reaktor Breakpoint. Dr Jessica Cauchard, a professor at the Efi Arazi School of Computer Science at IDC Herzliya, gave a presentation titled “Game of Drones” that you can watch here. I sat down with Jessica after the conference to talk more about how we can design for the unplanned encounters we are starting to have with autonomous technology.
“Blah, drones”, a friend sums up how he feels about the devices. I could relate – a month ago, drones really didn’t elicit any emotional response in me, apart from that one time when my neighbor flew his toy drone a little too close to my balcony. (That time, I felt indignation and anger, and made sure he knew.)
But when I talked to Dr Jessica Cauchard at the Speaker’s Dinner before the Reaktor Breakpoint conference, my indifference melted away in an instant. “I could go on about drones for days”, she laughs. I’m sure she could, and for a good reason: Her research into human-drone interaction is just so interesting, and it touches on many pressing questions about how we relate to technology.
But let’s back up a bit. How did Jessica, one of the leading researchers in human-drone interaction, get so deep into this multidisciplinary topic?
“When we started out, we wanted to implement a gesture recognition system”, Jessica explains. But as the team started looking, they realized there was no existing framework for interaction that would fit in with drone technology. “We had no idea how to do it properly. So we had two ways we could go: Design a set of gestures and make it happen. The problem with that, of course, is that we would have spent months designing and building it – drones hover and move, adding an extra layer of difficulty to gesture recognition. And what if the gestures we designed wouldn’t be what people wanted to use? So we went the second route and started drawing understanding and inspiration from other fields. In the end, I adapted a research methodology for new technologies to start understanding how people would naturally interact with a drone.”
Interacting with autonomous devices
The relationship between technology and ourselves is rapidly changing. We’re all used to thinking and talking about our devices as personal possessions – my phone, my laptop, my iPod – yet they’re fast becoming autonomous. This means that we now need mediation between people and technology; whereas before, it was the owner who mediated the relationship between the technology and the world, the technology is now venturing out on its own. “We’re just at this transition time where tech becomes more autonomous, and we don’t know how it will work. And in this transitional period, we run into many issues to be tackled. What happens when your car is autonomous and mine isn’t, and they can’t communicate?”, Jessica explains.
Jessica sees a danger that the technology industry simply ignores these questions and goes on with business as usual. That’s why it’s so important to make sure design and product teams are equipped to start thinking about autonomy and everything it entails. Jessica points out that the user is often not a tech specialist – and this means that the design of these systems should be much more nuanced, too. “Right now, all too often instead of supporting us, these systems and programs can make our lives more difficult”, she says. And because the companies designing the products are used to thinking about tech in proprietary terms, it can be difficult for them to transition into thinking about how any member of the public might interact with an autonomous device.
So when talking about autonomous devices, why are drones so interesting?
“When it comes to drones, we know a lot, and not a lot at the same time”, Jessica explains, continuing: “A lot of the existing research has focused on how to monitor flight. With monitoring, you just need someone to supervise that everything is ok. And we’re getting very good at that”. Right now, people are mostly using drones for photography, where interaction is often minimal. But there are advances in using them for example in emergency recovery, deliveries, and search and rescue missions. As these kinds of use cases become more common, the number of people they need to interact with will explode.
Jessica points out an interesting and likely scenario related to unplanned encounters with drones: “What if a drone is making a delivery, and it lands at my door – but the package isn’t for me, it’s for a neighbor? How do I interact with it and tell it to go next door? Right now, companies aren’t designing for these unplanned encounters.” And as drones and other devices become autonomous in larger numbers, we need to make sure people stay in the loop.
Drones also feed into many of our cultural anxieties about surveillance. If a drone gets close, many people will wonder: Is it filming me? What is it doing? Who controls it? Who owns it, and who sent it here? “If we don’t design for these encounters, we’ll end up in a very sad world where we have technology that we don’t want around”, Jessica points out.
Can drones convey feelings?
I asked Jessica what she found the most surprising about her research. Her answer? “The fact that people interact with drones like they do with animals or humans. This surprised me because unlike robots that can often be designed to look human-like, drones really don’t have any facial features.” Yet the participants in her research were treating the drones like a person or a family pet, saying things like “please” or “here boy”. “If you think about the technology of the past couple of decades, we never used to communicate with technology this way!”, Jessica exclaims.
As someone who finds it impossible to ask anything of my Amazon Alexa without adding in a “please” at the end, this makes sense. In fact, as a child of the eighties, the one thing that instantly pops to my mind is R2D2 from Star Wars – a beeping, whirring rubbish bin of a robot that somehow manages to project a forceful personality. Haven’t we always been good at anthropomorphizing the weirdest things?
Turns out that my R2D2 example isn’t all that far-fetched.
“When we started thinking about how we could make the drones feel like animated objects, and convey for example if their battery is running low, we started thinking about animation in general. And we thought about the seven dwarfs in Disney’s Snow White: how did the animators provide a range of very distinct personalities for characters who look very similar? What we found was that they all move very differently, and this conveys the personality. So we started thinking that we could do something similar with drones.”
Jessica and her team worked with a researcher specializing in animation theory. To simplify, they were looking to match up emotions with movements. Why emotions, if what they were trying to do was infuse the drones with a personality? “Personality is easier to define but harder to perceive. The two are also very much linked. Humans perceive emotion easily, so that’s what we decided to communicate”, Jessica explains.
One of the drones was designed to appear happy, another was exhausted, and one was anti-social. Jessica’s team was surprised to find out that people were good at picking up how the drones were “feeling”, and then approaching them accordingly.
This raises the question: could drones also mirror or read how we feel? Would a drone approach someone angry in a different manner than someone who looks happy? “This is a new area of research”, Jessica smiles. “Right now we’re exploring questions like how fast a drone should approach you, how far it should stop from you, and so on. But it’s still in the very early stages.”
Shared ownership models over personal gadgets
So if we start having these drones that become extensions of how we feel, wouldn’t that mean that we’d all have a drone following us around, leading to some seriously congested airspace?
“This makes me so sad about new tech. So often, it’s about the gadgets: People buy them, use them for a while, and then they just sit in the garage or the drawer and go unused. From an environmental standpoint, I don’t think everyone owning a drone is sustainable.”
Instead, Jessica hopes that models of shared ownership will become more prevalent. Shared drones can become specialized in certain kinds of tasks, like a drone that accompanies people crossing a darkened park at night, lighting the way and acting as a deterrent to any attackers. Once you’ve made your way across, you can continue on your way while the drone waits for the next person who needs it.
“Autonomy can really enable these kinds of shared models, not just in drones, but other devices as well”, Jessica points out. Cars are probably the best (and of course, most used) example of this: Instead of having to be parked outside your office for the whole day, an autonomous car can drop you off, and then go pick up someone else. Once you’re ready for a ride, another one can come and pick you up. “It also has the added benefit of requiring fewer cars and parking spaces. Similar things can also happen with devices like drones”, Jessica says.
Legislation, transparency and making drones socially acceptable
Another big open question is the legislation surrounding drones. “So far, legislators have been looking at drones as devices with very limited interaction. Yet during our research, we noticed that people were getting extremely close to the drones, grabbing them and not really realizing that the propellers can be quite dangerous”, Jessica explains. “It’s almost as if they assumed that the products would be safe because they were available on the market. But the authorities probably didn’t even think that people would interact with the drones like this.”
Legislation is one half of the equation, Jessica points out; the other is whether drones become socially acceptable or not. “Google Glass, the infamous pair of smart glasses, is a great example of a product that the users loved – but it wasn’t socially acceptable, and the product didn’t make it”, Jessica explains. Indeed, people’s fear of being filmed or followed without their permission is one of the big hurdles that drone design has to overcome. It’s why we have to find new ways for drones to interact with the people around them – not just their owners.
Jessica sums up: “In the end, it comes down to transparency. The question is how can we start integrating the technology safely both in our societies and our environment”.
You can watch Jessica’s presentation from Breakpoint here.