How Tesla Autopilot cars were tested and what the results were


Israeli security researchers have figured out how to fool self-driving cars by flashing a stop sign on a digitally controlled billboard – a hack, they told Wired, that could cause traffic jams or even accidents. They demonstrated the attack on a Tesla car.

“It only takes an image of something on the road, or a few frames on a digital billboard, and the car will apply the brakes or possibly swerve, and that is dangerous,” said Yisroel Mirsky, a researcher at Ben-Gurion University. “The driver won’t even notice. So someone’s car will just react, and they won’t understand why.”

The team’s research initially focused on flashing bright images onto the road itself to trigger autonomous vehicle systems into doing something they shouldn’t.

But they then realized it would be more practical to hijack Internet-connected digital billboards, and they found in the course of their research that the stop-sign image needed to appear for only a split second.
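As a rough illustration of the idea (this is not the researchers’ actual tooling, and the 60 fps refresh rate and 133 ms display duration are assumed figures for the sketch, not values reported in the study), splicing a brief “phantom” frame into a billboard’s ad loop might look like this:

```python
# Illustrative sketch: insert a phantom stop-sign frame into a billboard's
# frame stream for a split second. At an assumed 60 fps, ~133 ms of display
# time is only a handful of frames -- brief enough that a passer-by may not
# consciously register it, while a camera-based driver-assist system might.

FPS = 60  # assumed billboard refresh rate (frames per second)

def inject_phantom(frames, phantom, start_s, duration_s, fps=FPS):
    """Return a copy of `frames` with `phantom` replacing the original
    content for `duration_s` seconds, starting `start_s` seconds in."""
    start = int(start_s * fps)
    count = max(1, int(duration_s * fps))
    out = list(frames)
    for i in range(start, min(start + count, len(out))):
        out[i] = phantom
    return out

# Usage: a 2-second ad loop with a ~133 ms phantom at the 1.0 s mark.
ad = ["ad_frame"] * (2 * FPS)
spliced = inject_phantom(ad, "stop_sign", start_s=1.0, duration_s=0.133)
print(spliced.count("stop_sign"))        # 7 frames ≈ 0.117 s on screen
```

The point the sketch makes is proportional: the shorter the injected span relative to the refresh rate, the less likely a human observer is to notice, while each individual frame is still a complete image to a camera.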

The team was able to test this on a Tesla running the company’s latest Autopilot version, according to Wired. Perhaps most worrying, the hack requires little hardware and leaves no evidence behind.

“Previous methods leave forensic evidence and require complicated preparation,” said Ben-Gurion researcher Ben Nassi. “Phantom attacks, on the other hand, can simply be carried out remotely and do not require any special expertise. That’s why they could be so dangerous,” he added.

How the Autopilot on Tesla cars became a danger on the roads

Critics of Tesla’s “Autopilot” mode have long argued that the name is misleading and suggests to drivers that their car can handle more of the driving than it actually can.

But CEO Elon Musk – who has repeatedly claimed that Tesla is at the forefront of fully autonomous vehicles – has no problem with that. In fact, he thinks criticizing the name “Autopilot” is “stupid,” he told Automotive News, even as critics counter that the name suggests to drivers that the cars are, in fact, autonomous. When asked whether Tesla would consider changing the name, Musk replied, “Absolutely not. It’s ridiculous.”

But there is more at stake in the “Autopilot” debate than Musk’s pride: the mode has been linked to a series of fatal car accidents and legal disputes.
