Hacker News

Can someone help me here? I'm scanning through this and... not really finding much meat. They used a deliberate radio attack to spoof GPS signals and change the car's idea of where it was in a fundamentally unrecoverable way, and it seems the only thing they got the car to do incorrectly was... take a wrong turn?

I mean, yeah. If you take away someone's maps and compass they might get lost. What am I missing?



Not just lost; you can make them drive exactly where you want them to go. As long as a human is behind the wheel this won't be terribly effective except as a disruptive tactic, but once there's nobody behind the wheel you could use this to hijack any autonomous car and steer it wherever you want. This could be especially effective against autonomous vehicles used for long-range shipping, where you could force a truck to drive to a warehouse under your control and then steal its cargo.


To be fair, that's not at all what was demonstrated.

But... OK. If you can put a transmitter on top of a vehicle in motion you can take control and make it drive to your destination? How is that significantly more damaging or dangerous or "bad" than just grabbing it with a tow truck? Or hijacking it? Or just stealing the vehicle itself?

I mean... this just doesn't really seem like an indictment of autonomous driving to me. People were successfully stealing stuff out of horse-drawn carriages (or hell, just stealing the horses) and we all seemed to survive just fine.

Seriously, this just doesn't seem like a doomsday kind of thing. Needs more spin.


Because this doesn't require putting a transmitter on top of the vehicle. As the article itself mentions, this sort of GPS spoofing has already been demonstrated to be effective at range. For the purposes of Regulus's tests they didn't need to do it at range; mounting a transmitter on top of the car was sufficient to demonstrate that an external spoof is possible.

> The spoofer can easily use an off the shelf high-gain directional antenna to get a range of up to a mile. If they add an amplifier, a range of a few miles is very much possible. It has already been proven that spoofing can even occur across dozens of miles, for example in the Black Sea spoofing attack in June 2017.
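The quoted ranges pass a basic sanity check with a free-space link budget: genuine GPS signals arrive from ~20,000 km away, so even a tiny ground transmitter overwhelms them. The sketch below is illustrative only; the spoofer's transmit power (10 mW) and antenna gain (15 dBi), and the roughly -128.5 dBm figure for a genuine GPS L1 C/A signal at the surface, are assumptions, not numbers taken from the article.

```python
import math

C = 299_792_458.0        # speed of light, m/s
GPS_L1_HZ = 1575.42e6    # GPS L1 carrier frequency, Hz

def fspl_db(distance_m: float, freq_hz: float = GPS_L1_HZ) -> float:
    """Free-space path loss in dB at a given distance and frequency."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

def received_dbm(tx_dbm: float, tx_gain_dbi: float, distance_m: float) -> float:
    """Friis equation in log form; receive antenna gain assumed 0 dBi."""
    return tx_dbm + tx_gain_dbi - fspl_db(distance_m)

if __name__ == "__main__":
    # Assumed spoofer: 10 mW (10 dBm) into a 15 dBi directional antenna,
    # one mile (~1609 m) from the target receiver.
    spoofer = received_dbm(tx_dbm=10, tx_gain_dbi=15, distance_m=1609)
    # Genuine GPS L1 C/A power at the surface is roughly -128.5 dBm, so a
    # milliwatt-class spoofer is ~50 dB louder than the real constellation.
    print(f"spoofer at 1 mile: {spoofer:.1f} dBm vs genuine GPS ~ -128.5 dBm")
```

With these assumed numbers the spoofer arrives at about -75 dBm, tens of dB above the authentic signal, which is why "up to a mile" with an off-the-shelf antenna is entirely plausible.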

And literally nobody is saying this is a doomsday. What they are saying is that autonomous cars need to recognize this attack vector and take steps to combat it. The fact that stealing or hijacking vehicles has always been possible doesn't mean we need to deliberately turn a blind eye toward a new and potentially very effective attack against autonomous vehicles.
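One concrete shape those "steps to combat it" could take is cross-checking GPS fixes against dead reckoning from the car's own sensors (wheel odometry, IMU). This is a minimal sketch of an assumed design, not any vendor's actual implementation; the 2 m/s drift bound is an invented tuning constant.

```python
from dataclasses import dataclass
import math

@dataclass
class Fix:
    x: float  # metres east of a local origin
    y: float  # metres north of a local origin

def plausible(gps: Fix, dead_reckoned: Fix,
              seconds_since_last_good_fix: float,
              max_drift_m_per_s: float = 2.0) -> bool:
    """Accept a GPS fix only if it lies within the error the dead-reckoned
    estimate could plausibly have accumulated since the last trusted fix.
    max_drift_m_per_s is an assumed tuning constant, not a real spec."""
    error = math.hypot(gps.x - dead_reckoned.x, gps.y - dead_reckoned.y)
    allowed = max_drift_m_per_s * seconds_since_last_good_fix
    return error <= allowed

# A spoofer that yanks the reported position 500 m sideways in one
# second is rejected; a fix 2 m from the estimate after 2 s is accepted.
print(plausible(Fix(500.0, 0.0), Fix(1.0, 0.0), 1.0))  # False
print(plausible(Fix(3.0, 0.0), Fix(1.0, 0.0), 2.0))    # True
```

A slow, patient spoofer that walks the position off gradually can still defeat a check like this, which is why real mitigations layer in signal-level defenses (antenna arrays, signal-power monitoring) as well.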


You can't use a blanket attack to steer ONE specific vehicle somewhere to steal it; that's ridiculous. You'd have to know exactly where that vehicle is, with orientation and velocity, down to cm-scale precision. And while you could then steal that vehicle, you'd disrupt all the activity around it, so I don't see it.

Like I said, needs more spin. This scenario doesn't really fly.


It sounds like you're saying the technology as it currently exists is as far as it will ever go and nobody will figure out how to refine this attack?



