Tesla is facing pressure from all sides over its Autopilot and Full Self-Driving features. With those pressures in mind, it appears to be positioning itself to legally leverage in-car footage of drivers using the latter.
Quite simply, Tesla has updated the Full Self-Driving software package (now version 10.5) and added new language to the user agreement. That language is as follows...
“By enabling FSD Beta, I consent to Tesla’s collection of VIN-associated image data from the vehicle’s external cameras and Cabin Camera in the occurrence of a serious safety risk or a safety event like a collision.”
This is a big change from how Tesla has previously used footage from customer-owned vehicles. In the past, Tesla has been open about using in-car and external footage, saying the captured video helps in its efforts to train its machine learning system.
The difference is that, according to Tesla, all of that footage was anonymous and never tied to a specific vehicle or its owner. Now that's changing: footage will be "VIN-associated," and Tesla is requiring any user who wants to continue using Full Self-Driving to accept the new condition.
Of course, it can't be overlooked that just a few weeks ago, the NHTSA received its first official complaint of an accident allegedly caused by the Full Self Driving feature. Since this new policy has just been spelled out, there's no way that Tesla could use footage from that crash to help determine fault.
For owners who accept the new terms of service, Tesla could do just that. We wouldn't be shocked to see Tesla ultimately try to roll this out to all of its cars in the future.
As the number of accidents involving these technologies rises, Tesla sounds confident that video evidence will clear it of fault. Privacy concerns aside, perhaps that confidence should give the technology's harsher critics pause.