Your Tesla is watching you—whether or not you’re watching the road

In the wake of Tesla’s first recorded Autopilot crash, automakers are reassessing the risks of rushing semi-autonomous driving technology into the hands of distractible drivers. But another aspect of Autopilot, its ability to hoover up huge amounts of mapping and “fleet learning” data, is also accelerating the auto industry’s rush to add new sensors to showroom-bound vehicles.

This may surprise some owners: Tesla’s Terms of Use (TOU) does not explicitly state that the company may release information captured by its vehicles when there is no clear legal need to do so. That does not mean it cannot; it all comes down to how the wording is interpreted. Whether Tesla’s public disclosures of vehicle data fall under a reasonable reading of its TOU is ultimately a legal question. As a practical matter, however, the company’s failure to clearly disclose that it could publicly release characterizations of owner driving data could provoke a backlash from privacy-minded consumers, especially as Tesla attempts to bring its technology to a mass market.

What is Privacy International calling for?

People must know

People must be able to know what data is being generated by the devices, networks, and platforms we use, and by the infrastructure in which those devices become embedded. People should be able to know, and ultimately determine, the manner in which that data is processed.

Data should be protected

Data should be protected from access by anyone other than the user.

We should know all our data and profiles

Individuals need to have full insight into their profiles, including full access to data that has been derived, inferred, or predicted about them.