The trolley problem

The Trolley Problem, in one form or another, is more than a hundred years old. Some sadistic professor of ethics dreamed it up for a class: You are standing by a trolley line, next to a large lever that controls a switch toward which a runaway trolley is rolling. The switch controls a spur. On the main track are five people, who will be killed if the switch isn’t thrown. On the spur there is just one person standing on the track. What is the right thing for you to do? If you throw the switch, only the single person dies. If you don’t — if you do nothing — five people will die, but you have had no part in it.

When you first start to think about it, it sounds pretty easy. One dead is better than five dead, no? So pull the lever.

But this ignores the back story, as they say on the TV soaps. How did the situation arise in the first place? Pure accident? Who forgot to set the brake? Is there such a thing as an accident, or is an omnipotent God always in charge, and you would be trying to overrule Him if you interfered? What do you know about the people involved? Are the five people a band of muggers planning to rob the one, or are they on their way to choir practice? Is the single person a living saint, on whose continuing ministrations a hundred unfortunates might be depending for their support? Is he, or she, a genius from whose brain there might someday emerge a solution to some of mankind’s most pressing problems? You obviously have no time to determine any of these things. Then how can you think of playing God? If you elect to just walk away, no one can fault you, because you didn’t do anything and no one was watching. You can’t even fault yourself, because you have no idea how the event, when it finally happened, would have played out if you had interfered. Do you have any “right” to interfere? Do you have a “duty” to interfere? Can a simple refusal to act be called interference?

Looking at the problem from another viewpoint, ask yourself how much the mere numbers count. How many strangers’ lives would you be willing to sacrifice if the one were a member of your own family? One or two? Five? A thousand?

Whatever solution the ethics class finally arrived at, it is doubtful that there was consensus. The teacher’s point was exactly that. There isn’t necessarily a right answer or a wrong answer. Life is complicated.

So now here we are, in the middle of one of the complications. As if there weren’t sufficiently many interlocking considerations in the original trolley problem, it has now become — for those of us who rely on our cars to get through life — the self-driving car problem.

Let us say that I am a programmer and I have been hired by a software start-up to help write the programs that will control the behavior of the self-driving cars we all say we would welcome and the automobile manufacturers say they will be producing in the millions in just a few years. These cars will be guardians of our safety as well as our chauffeurs. They will observe all the traffic laws, never miss a sign, know all the rules, watch the yellow lines and the white lines and the speedometer, and be our nannies besides. They will be controlled by software (always characterized as “sophisticated,” although in reality it is just a chain of determinedly simple yes-or-no gates) that will allow us, as riders, to sit back with our café lattes and our cell phones and glance occasionally at the scenery before we arrive safely at Aunt Mabel’s house.

Until we come to the trolley problem.

It’s an intersection. The light is green. Traffic is moving briskly. Suddenly there appears directly ahead a mother with a baby carriage, crossing against the light. We, coffee cup in one hand, cell phone in the other, eyes on a storefront with an interesting window display, have no time to take over and act, or not.

Now to a computer programmer a second or two is an enormous succession of if-then decision opportunities — enough nanoseconds to go through the entire State Motor Vehicle Statutes backwards and forwards. In the real world there is room for only two or maybe three choices, and there is little in the statutes that would be of help.

To our left, on the other side of the yellow line, is oncoming traffic, in particular one large truck which our program has been watching as it swung a bit over the yellow line to get around a halted bus. On the right is an SUV that has overshot the stay-behind white line on the cross street and intrudes far enough into the intersection that we cannot possibly avoid hitting it if we swerve to miss the mother and child. Any of our options, according to the readings of speed, direction, and inertia instantly available to our decision-making CPU, will almost surely result in severe injuries, or even fatalities, to the mother and her baby or to the occupants of the truck, the bus, or the SUV, as well as to the driver and passengers in the car our program is driving.
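
Just to make the point concrete: here, in Python, is a toy sketch of the kind of yes-or-no chain my employer might ask me to write. Every name and number in it is invented for the occasion; no real driving stack is remotely this simple, which is precisely the problem.

    # A toy sketch, not anyone's real autonomous-driving code. All
    # names, fields, and numbers below are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Path:
        direction: str       # "straight", "left", or "right"
        people_at_risk: int  # occupants of whatever we would hit
        impact_speed: float  # predicted speed at impact, in m/s

    def choose(paths: list[Path]) -> str:
        # The whole "sophisticated" decision: prefer the path that puts
        # the fewest people at risk; break ties by gentler impact.
        best = min(paths, key=lambda p: (p.people_at_risk, p.impact_speed))
        return best.direction

    # The intersection as described above: mother and baby ahead,
    # truck (and our own passengers) to the left, SUV to the right.
    print(choose([
        Path("straight", people_at_risk=2, impact_speed=12.0),
        Path("left",     people_at_risk=4, impact_speed=18.0),
        Path("right",    people_at_risk=5, impact_speed=9.0),
    ]))  # prints "straight": easy arithmetic, hard ethics

Counting heads is the only thing such gates can do; everything the ethics class argued about has been flattened into two numbers per branch.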

The mother and baby should have obeyed the light. The truck should have stayed inside the yellow line. The bus should have stopped closer to the curb. The SUV should have stopped short of the white line. Too late for any of that. Spin the arrow to your choice. Where does it land?

How much extra did we pay the dealer for our car’s computer safety system, which the salesman assured us would protect us and our loved ones from our own inattention or that of other drivers? To whom, therefore, is owed our system’s primary allegiance? To our own protection? To the numbers? To the truck driver (who has a family)? To the SUV driver (ditto)? To the mother? To the baby? Throw the switch — or do nothing?

Think of the rejoicing in the offices of AmbulanceChasers LLP when the gory photographs surface. The maker of the automobile probably has the deepest pockets and can therefore expect to find the most hands clutching at them. “Safety? Was my client’s safety assured, as the salesman asserted?” But the software start-up, my employer, probably has at least one billionaire venture capitalist on the board who would be deeply embarrassed by the idea of a mother and baby carriage sacrificed to his search for profit. Whichever victim my software chooses has, of course, only himself to blame, but that will not deter the lawyers — nor should it. The legal profession decrees that someone has to be blamed. Contingency fees beckon. How high up the ladder can the claims go? Perhaps even as far as the legislators who let the car companies’ lobbyists write the rules that shield them from responsibility? Maybe even as far as the techie cheerleaders for Artificial Intelligence who have encouraged the public to believe that moral decisions can ever be made by machines?

I wonder how long it would take me to ditch my computer science degree and try for one in law instead.
