Just came across an article on Ars Technica (yes, I'm behind): The intelligent intersection could banish traffic lights forever. It's neat stuff: basically, a researcher has designed a traffic-control system for autonomous vehicles, and demonstrated that such technology could enormously reduce how often you have to stop at intersections -- not only speeding up travel times, but also improving fuel efficiency quite a bit.
All of which is great, but my Security Architect senses are pinging here. This is postulating an external server that talks to the cars on the road and tells them what to do. That is absolutely terrifying if you understand the typical state of Internet-of-Things security.
But let's put a positive spin on this. This system is at least 1-2 decades from deployment as described (since it assumes only autonomous vehicles on the road). We might be able to head off disaster by figuring out the obvious hacks in advance, so they can be designed around.
So here's the challenge: name ways that a hacker could abuse this system, and/or ways to ameliorate those weaknesses.
I'll start with a few obvious ones:
- Base story: I (the hacker) send out signals spoofing the controller for traffic intersection T, allowing me to cause nightmarish havoc. Possible solution: traffic controllers are listed in some well-known registry, signed with public keys, so that their signals can be authenticated to prevent spoofing.
- Assuming the above hack isn't prevented: I time the signals sent to the cars so that they all hit the intersection at the same moment. Crash! Solution: as a belt-and-suspenders measure, cars must not completely trust the signal controllers. Their own autonomous collision checks have to override those signals, to prevent crashes.
- Reverse of the previous: I send out signals telling all the cars, in all directions, that the intersection is currently blocked by opposing traffic. The entire city quickly devolves into gridlock. Solution: good question. I got nothing.
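To make the belt-and-suspenders idea concrete, here's a minimal sketch of how a car might treat the controller's "go" signal as advisory rather than authoritative. The `Reservation` message format and all the field names are my own invention for illustration -- the actual system would presumably work with sensed trajectories, not neatly packaged time windows.

```python
# Hypothetical sketch: before entering, a car cross-checks the controller's
# instruction against its own sensor picture and vetoes anything that
# predicts a conflict.
from dataclasses import dataclass

@dataclass
class Reservation:
    car_id: str
    enter_at: float   # seconds from now, per the controller
    clear_by: float   # when the car is claimed to be clear of the intersection

def conflicts(mine: Reservation, others: list[Reservation]) -> bool:
    """True if any other car's occupancy window overlaps mine in time."""
    return any(o.enter_at < mine.clear_by and mine.enter_at < o.clear_by
               for o in others if o.car_id != mine.car_id)

def should_enter(mine: Reservation, sensed: list[Reservation]) -> bool:
    # Even with a valid, signed "go" from the controller, the car stays out
    # if its own sensors predict overlapping occupancy.
    return not conflicts(mine, sensed)

# A spoofed controller tells two cars to hit the intersection simultaneously:
a = Reservation("car-a", enter_at=0.0, clear_by=3.0)
b = Reservation("car-b", enter_at=1.0, clear_by=4.0)
print(should_enter(a, [b]))  # False: the local check vetoes the controller
```

The point of the sketch is just the architecture: the controller optimizes throughput, but the final go/no-go decision never leaves the car.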
What else? I'm sure we can come up with more nightmarish scenarios, and possible solutions.
Yes, this may seem like overkill to think about now, but history says that if you don't design the system around abuses, you will hurt forevermore. Security isn't something you add later: it should be baked into the design from the get-go. (Which is why it accounts for a large fraction of Querki's architecture, even though we only have a couple hundred users so far...)
Date: 2017-06-08 03:17 pm (UTC)
But of course, the security hackles should be raised in this scenario too. ;)
Date: 2017-06-08 03:40 pm (UTC)
Here's a fun thing not to try at home: a beacon on your own car that claims to be a portable traffic signal. "Lane closing; keep right."
"I'm a police car, let me pass."
Waze likes to route vehicles through quiet neighborhoods as shortcuts. Drop a spoofer on the corner and the road is "closed" or "5 MPH" and stops being used as a shortcut.
"Invisible construction area, wait for [nonexistent] flagman."
Date: 2017-06-09 07:02 pm (UTC)
There has already been a proof of concept of an algorithm that generates an overlay for an image -- invisible to the human eye -- that prevents facial recognition systems from working. Now put that technology into the hands of someone malicious. Say I can put an overlay on your protest sign that your human eyes can't see, but when you wave that sign, the vision algorithm sees a child running into the street.
If you think that's impossible, then imagine the following scenario: for every deep learning algorithm that gets rewarded for doing what you want, imagine creating an opposition algorithm that gets rewarded for generating situations or images or whatever that cause the first algorithm to mess up.
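The core trick the commenter is describing can be sketched in a few lines. This is a toy illustration only: a fixed linear classifier, made-up numbers, and the simplest possible attack (nudge the input in the direction that most increases the model's score for the wrong class, the "fast gradient sign" idea). Real vision models are far more complex, but the same mechanism applies.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=50)            # a fixed, "already-trained" linear model
x = -w / np.linalg.norm(w) * 0.5   # an input the model confidently labels 0

def predict(x):
    # Probability of class 1 under a logistic model.
    return 1 / (1 + np.exp(-w @ x))

# For a linear model, the gradient of the class-1 score w.r.t. the input is
# just w, so the worst-case small perturbation is epsilon * sign(w): each
# coordinate moves only a tiny amount, but all the tiny pushes line up.
epsilon = 0.1
x_adv = x + epsilon * np.sign(w)

print(round(float(predict(x)), 3), round(float(predict(x_adv)), 3))
```

The perturbation is small per pixel, yet the class-1 score jumps sharply, because every coordinate contributes in the same direction. That asymmetry -- easy to attack, hard to defend -- is exactly why putting such models in charge of traffic decisions is worrying.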
Date: 2017-06-11 10:20 pm (UTC)
More broadly: not every thing on the streets will be part of the IoT, but they're still actors and obstacles.