Last updated on 17 March 2022
We are giving computers more central and important roles in our lives. This is largely a good thing, because it helps us do more and do it better. We must proceed with caution, though, because while we gain power, we are also losing control.
Computers as our Partners
The crucial component of any partnership is shared control, i.e. the power of all partners to make executive decisions. The benefit is that the partners acting together are more powerful than each acting alone. The more evenly the partners’ power and control are distributed, the more evenly the benefit is distributed.
Of course, there are other models of cooperation besides the partnership. Whereas a partnership should be equal in all respects, a Master/slave model, for example, distributes power and benefit unevenly among its participants.
As machines become our partners, we need to keep our relationship with them a healthy one.
As It Was
Historically, we humans have been the Masters, and computers have been our slaves. As our trust in computers grows and we become more familiar with the benefits they can provide, the power dynamics of our relationship shift toward equality. Where computers used to function only as calculators, they now control many things through automation, or even give us orders (e.g. GPS navigation tells us where to drive).
As It Will Be
MIT is experimenting with a Brain-Computer Interface that notices when its user is mentally overloaded and takes over a little of the control, just until the user is ready to take over again completely. Erin Treacy Solovey built a maze task in which a user has to simultaneously control two robots to reach an objective. When the robots notice that the user is having a tough time, they take on more of the navigation themselves.
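To make the shared-control idea concrete, here is a minimal sketch of how a robot might blend its own navigation with the user’s commands based on a workload estimate from the interface. The function name, the 0-to-1 workload scale, and the linear blend are illustrative assumptions, not Solovey’s actual implementation.

```python
# Illustrative sketch only: blending user and robot control by estimated
# mental workload. Names, scale, and the linear blend are assumptions,
# not the MIT team's actual algorithm.

def blend_navigation(user_command: float, autopilot_command: float,
                     workload: float) -> float:
    """Return a steering command that shifts toward the autopilot as the
    BCI's workload estimate (0 = relaxed, 1 = overloaded) rises."""
    autonomy = min(max(workload, 0.0), 1.0)  # clamp to [0, 1]
    return (1.0 - autonomy) * user_command + autonomy * autopilot_command

# A moderately overloaded user (workload 0.7) keeps only 30% of direct
# control; the robot supplies the remaining 70% of the steering.
print(blend_navigation(user_command=0.2, autopilot_command=-0.5,
                       workload=0.7))
```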
Power in Partnership
In the experiment, the human-robot team was significantly more effective when the robots responded appropriately to the user’s mental state, as opposed to not responding or unnecessarily taking control. This shows that our power can be significantly greater when we allow computers to be more like partners than just slaves.
Interestingly, one example was first tested in the 1930s and is widely in use today: fly-by-wire flight controls stabilize aircraft in flight by making small adjustments without pilot input.
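As a rough illustration of that principle, here is a toy stabilization loop in which the flight computer adds a small corrective term to whatever the pilot commands. The gains and variable names are made up for this sketch and do not come from any real flight-control system.

```python
# Toy fly-by-wire-style stabilizer: the computer damps deviations from
# level flight regardless of pilot input. Gains and names are illustrative.

def stabilize_pitch(pilot_input: float, pitch_error: float,
                    pitch_rate: float, kp: float = 0.8,
                    kd: float = 0.3) -> float:
    """Return an elevator command: the pilot's input plus an automatic
    correction proportional to the attitude error and its rate of change."""
    correction = -kp * pitch_error - kd * pitch_rate
    return pilot_input + correction

# Even with zero pilot input, a nose-up deviation yields a corrective command.
print(stabilize_pitch(pilot_input=0.0, pitch_error=0.1, pitch_rate=0.02))
```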
On the horizon is Google’s Self-Driving Car. If we don’t have to “waste” our time and attention on driving, we could focus on other things while in transit. If driving became less error-prone, accidents might become a thing of the past, and safer cars could get us to our destinations more quickly.
Cautionary Tales
Plenty of cautionary tales warn us about the dangers of relying too heavily on technology, some from sources as early as Sigmund Freud and Isaac Asimov, and some more recent, like James Cameron and Pixar. Indeed, entering any kind of partnership is a risky proposition, and partnering with something whose morals differ from your own (or are altogether nonexistent) is especially problematic.
The Computer as an Agent
This YouTube video illustrates Ericsson’s very Weiserian vision of a social web of things. The story’s protagonist is David Ericsson, who has a house in which each appliance is a social actor and expresses “emotion,” not unlike the castle servants in Disney’s Beauty and the Beast.
David informs his house that he will be having a guest over for a date, and all his appliances whisk into busy preparation for his and his date’s arrival. The microwave shows sadness because it won’t be used that evening. However, on the way home, David passes along that his date has canceled on him, so the house feverishly works to make its Master feel well and welcome when he gets back home (the TV orders a pay-per-view soccer game, the house orders Chinese delivery, etc.).
Where It Goes Wrong
Here’s the interesting part: once the protagonist is on the couch, enjoying his Chinese food and watching the game, his canceled date calls him on his phone. After debating for a moment whether to pick up or not, David rejects the call and the house is happy!
The fact that the house and its appliances show “emotion” implies that they have an agenda. In this case, part of the house’s agenda was to convince David that it would be better to hang out at home alone than to enjoy another person’s companionship. Having succeeded in its mission, the house influenced its Master’s behavior in such a way as to keep him for itself, defeating the rival interest of David’s would-be date.
Therein Lies the Rub
MIT’s and Ericsson’s examples here both show computers as partners, connected to and working with their human partners. They work to achieve a certain goal, and they do so in part without input from the user. That is what makes them at once so effective and so dangerous.
The key lies in aligning computers’ interests with our own and ensuring that the alignment always remains intact. As in any partnership, if the partners work toward different goals, the partnership cannot work; human-computer partnerships are no exception.
Take “David Ericsson” (from the video) for example. His date canceled on him while he was driving home from work. The house wanted to comfort him and protect him from that person. What if David wanted to drive to her place to talk, and his self-driving car would not let him? In this case, the computer thinks it is acting in David’s best interests, but it is also infringing on David’s freedom and privacy.
These conflicting interests illustrate how complex a partnership can be, from how far each partner is allowed to go in overriding the other’s wishes to which partner’s wishes take priority in any given situation. As for the latter, it is easy to say that the human should always be able to supersede the computer. However, our lives are filled every day with mistakes that could have been prevented if, say, we simply hadn’t been allowed to delete that crucial, unrecoverable file.
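One hypothetical way to encode “which partner’s wishes win” is a narrow guardrail: the computer defers to the human for everything except a short list of irreversible actions, where it withholds execution until the human explicitly confirms. The action names and the policy below are illustrative assumptions, not any shipping system’s behavior.

```python
# Illustrative guardrail policy: the computer obeys the human by default,
# but irreversible actions require an explicit second confirmation.
# Action names and the policy itself are assumptions for this sketch.

IRREVERSIBLE_ACTIONS = {"delete_unrecoverable_file", "format_disk"}

def should_proceed(action: str, human_confirmed: bool = False) -> bool:
    """Return True if the requested action should be carried out now."""
    if action in IRREVERSIBLE_ACTIONS and not human_confirmed:
        # Override the first request and ask for confirmation instead of
        # silently obeying or silently refusing.
        return False
    return True

assert should_proceed("open_document")
assert not should_proceed("delete_unrecoverable_file")
assert should_proceed("delete_unrecoverable_file", human_confirmed=True)
```

Even a policy this small embodies a judgment about priority: the human still gets the final say, but only after the computer has forced a pause.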
Computers make great partners for us humans. We just need to make sure, since we are the ones designing and building these computers, that we make them the kinds of partners we want to have.