Pedestrian of the future - trusting software
If you wrote firmware for a self-driving car, would you walk in front of the car?
Let’s look at three aspects of this. We don’t have answers, but we do have some provocative thoughts.
Security - E.g., “99% of cars tested by the German General Automobile Club (ADAC) were found to be vulnerable to the flaw, enabling criminals to unlock cars and drive them away.”
Updates - The need for continuous updates, just like your phone and computer need.
Bugs - Caused by errors in programming, software complexity, miscommunication, etc.
Things we don’t know we don’t know
- Denver Airport
- Boeing 737max MCAS system
- Government of Canada Payroll System
Originally billed as the most advanced system in the world, the baggage handling system at the new Denver International Airport became one of the most notorious examples of project failure. Planned to automate baggage handling through the entire airport, the system proved to be far more complex than some had originally believed. The problems building it left the newly completed airport sitting idle for 16 months while engineers worked on getting the baggage system to work. The delay added approximately $560M USD to the cost of the airport.
"Oh, we will just fix that problem by changing the software." Here we are, more than a year later, with Boeing's losses at about $11 billion. Because complex software is complex, and the interactions between modules are very challenging.
How hard can payroll be? Easier than driving a car, I would think. Canada’s federal government is spending $400 million this year (2019) to fix the payroll system. The system, whose original 2009 budget was $309 million, had already cost taxpayers $954 million, and unplanned costs could rise to $2.2 billion by 2023.
Can software in complex systems ever really be safe?
Watch Bill Gates, Windows 98, Blue Screen of Death HERE
When you get run down by a self-driving car who will be legally responsible?
To read more on Software's Chronic Crisis, CLICK HERE
Test Your Own Opinion
Even if you do not write code for a living: Assume software you wrote was part of the firmware of a self-driving car - specifically - in some way your code was part of the self-driving system - maybe as a sensor module, a protocol interface, an AI, a decision hierarchy, etc.
1) Would you feel safe crossing a road that was used by self-driving cars?
2) Would you trust that each company delivering self-driving cars will spend the same effort on safety and quality assurance?
3) Do you think pedestrians will be safer or less safe when most cars are self-driving?
4) Do you think pedestrians will have to wear an active beacon type of device for safety's sake?
5) Would you feel comfortable writing code that has to make a decision that will result in the loss of life? - for example, the car senses that action 'A' will kill one person and action 'B' will kill 2 people. i.e., there is no non-lethal option.
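To make question 5 concrete, here is a minimal sketch of what such a decision might look like in code. Everything here is invented for illustration - the function name, the casualty-estimate inputs, all of it - and no real self-driving stack is remotely this simple:

```python
def choose_action(options):
    """Hypothetical least-harm chooser for question 5.

    options: list of (action_name, predicted_casualties) tuples.
    Returns the action with the fewest predicted casualties.
    """
    if not options:
        raise ValueError("no actions available")
    # min() breaks ties by list order -- itself an ethical choice
    # hidden inside an implementation detail.
    action, casualties = min(options, key=lambda o: o[1])
    return action

# The scenario from question 5: there is no non-lethal option.
print(choose_action([("swerve_left", 2), ("brake_straight", 1)]))
# prints "brake_straight"
```

The unsettling part is how ordinary it looks: a tie-break rule or a mis-estimated casualty count silently decides who lives, and none of that weight shows up in the code.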
I hope you feel slightly unsettled by these questions. I am. I think my basic premise is a bit unfair, because if I knew I was working on a system that had such awesome responsibility, then I would work in different ways and use different methodologies.
My answers are:
1) No. I know people who code for a living. I see error messages all the time. I can still crash a computer with bad enough code. What will be worse - driving the commute or applying patches during the commute?
2) Hell no. As a driver, I know I will choose based on reputation. As a pedestrian, I am vulnerable to the lowest-quality product. And, dare I say it, "Diesel Engine Scandal". Trust is easily lost.
3) Less safe. The laws will change to give cars the right of way. If not, lawsuits will destroy the industry. There are already countries where cars have the right of way.
4) I think that's the best they can hope for. The government will require all manufacturers to sense pedestrian beacons and to actively try to protect their lives.
5) Only in this sense: I would rather I was writing the code than Adolf Hitler. Seriously - I might be able to if I worked as part of a program that emphasized human and animal safety, but I am not sure.