Tesla Self-Driving Cars: Fears Mount Over Possibility of Car Hacking

By Abdul Muqeet
(Photo: Bloomberg via Getty Images) Researchers have shown that the sensors of Tesla's self-driving car can be hacked to confuse the vehicle in real-life situations.

With manufacturers making heavy strides in the field of self-driving cars and taxis, car hacking is evolving from a vague idea into a real concern.

Speculation started around May, when a Tesla Model S was on Autopilot and the driver had reportedly fallen asleep. The car could not judge the oncoming obstacle and rammed into a tractor-trailer at a speed of 74 miles per hour.

The accident resulted in the unfortunate death of the driver, which raised questions about the reliability of autonomous vehicles. The incident also raised a further question: what if someone were to tamper with the car's sensors and make the Autopilot fail?

"Car companies are finally realizing that what they sell is just a big computer you sit in," Kevin Tighe, a senior systems engineer at the security testing firm Bugcrowd, told The Guardian.

This statement came as a reassurance that car companies were, at the very least, acknowledging the issue and would be working to rectify it.

To test whether hacking a car was feasible, a group of researchers at the University of South Carolina, China's Zhejiang University and the Chinese security firm Qihoo took up the task. In a sequence of tests, they found that they could use off-the-shelf radio, sound and light-emitting tools to deceive Tesla's Autopilot sensors. In some cases, this caused the car's computers to perceive an object where none existed, and in other cases to miss a real object in its path.

Although the study showed that interfering with the car's navigation and sensors was possible, it also showed that only highly motivated people with a specialized set of skills could carry out such attacks.