Tesla is making its Full Self-Driving (FSD) technology available in beta for all of its vehicles in the United States and Canada. Access no longer depends on meeting minimum safety requirements: customers just have to buy it.
Tesla’s Full Self-Driving system, which is actually Level 2 automation, is under investigation by the US National Highway Traffic Safety Administration (NHTSA) over hundreds of accidents. There are cases of cars hitting parked emergency vehicles, and one driver has been charged with manslaughter, as reported by the Associated Press.
The company’s CEO announced the news on Twitter
Elon Musk, CEO of Tesla, announced the beta update on Twitter, his other company. In the post, Musk explains that access to the Full Self-Driving beta will be available to everyone who purchases it, with the purchase made directly through the vehicle’s dashboard.
FSD costs US$15,000 and is an “expansion” of Autopilot, an assisted driving system with features such as cruise control, lane changing and lane keeping, automatic parking, and vehicle detection. Autopilot is recommended for use on highways, not in cities. By purchasing Full Self-Driving, the customer gains access to traffic light and stop sign control.
Before the beta was released to all customers who purchase FSD, drivers had to meet two requirements: a safety score of 80 in Tesla’s system and 160 km (100 miles) driven on Autopilot. Those requirements have now been eliminated, and Tesla customers are reporting that they received access to the beta without meeting them.
Autopilot: not so “auto”, not so safe
The SAE (Society of Automotive Engineers) classifies driving automation into levels from 0 to 5, with Level 5 being full vehicle automation and Musk’s dream.
Tesla’s Autopilot and FSD sit at SAE Level 2, where the driver must still monitor the vehicle. However, there are numerous reports of drivers falling asleep at the wheel while on Autopilot (via NBC) and, of course, of vehicle accidents.
NHTSA disclosed that of the 392 accidents involving driver assistance systems between June 2021 and June 2022, 273 involved Tesla vehicles. When the report came out, Tesla touted the safety of its system and defended itself by comparing its number of fatal accidents with that of non-autonomous cars.
And as small as the fatality numbers are (six deaths were recorded in the report), safety is not just about staying alive. NHTSA has also opened an investigation into “phantom braking,” in which Autopilot brakes out of the blue, without any imminent risk. These incidents point to a software failure and can be dangerous when another vehicle, especially a truck, is tailgating the Tesla. An attentive Autopilot user, however, will have time to react.
Under pressure from NHTSA, Tesla also disabled the “rolling stop” function in its cars. A rolling stop is when the driver enters an intersection without bringing the vehicle to a complete halt, keeping the car moving slowly and preparing to accelerate if the path is clear. This functionality was available in Full Self-Driving’s “Assertive” mode, in which the AI performs riskier maneuvers, an option for those in a hurry.
Driver assistance tools are important for road safety, but reckless drivers should not be part of the beta. By granting access to people with low Tesla safety scores and fewer than 100 miles of Autopilot use, the tests could end up creating more obstacles to Tesla’s goal of true autonomous driving: Level 5.
With information: The Verge and TechCrunch