When I got my last car, it had a sticker instructing me to press the SOS button to opt out of data collection. The gentleman that answered acted confused when I made the request, taking the stance that he wasn't aware of data collection from the car and maybe I should contact the manufacturer. It was only after I read him the sticker text verbatim that he went into a scripted response and confirmed the opt out. Shady.
I had a friend who worked with call centers, taking orders over the phone. They had a script for everything, and after they quickly took your order, they started with a scripted upsell. If the person balked, they had responses for everything so they could continue. The only way out of the script was if the customer said "I will cancel my order". The call center person would be fired if they did not follow the script.
Xfinity similarly does this if they overcharge you on a contract. They try to upsell on a new (less value) contract for an hour before finally just changing tone and saying "you're right we messed up" and refunding you.
I never saw one of those when buying my most recent car (Toyota). But, after creating an account on their website I was able to find this section:
Consent Centre
E-privacy: Your car data
Help us improve our products by sharing car data and diagnostic info.
We don’t currently have your consent to use this data. This may be for the following reasons:
Your car doesn’t have Connected Services. Contact your dealer to check if they can be added.
No (connected) car has been added to your account. Go to My Vehicle, add a car if needed, then connect your car via the Connected Services card.
You have a connected car but have yet to consent. Go to My Vehicle, and open the chosen car. You will be requested to give your consent.
Toyota thanks you for helping us to improve our products.
FWIW, their smartphone app contains a "consent" section with various toggles for the usual reasons (e.g. "sharing data to improve products") and these are all turned off.
I love the attention to detail here, as if any customer has ever navigated to this page trying to find out how to give consent to having their data collected.
One of the automakers in the article claims that disabling it voids your warranty. It may or may not, but enjoy the legal battle should you ever need to make a claim.
I still had the sticker - I peeled it off and put it in the manual. The exact text is:
VEHICLE DATA TRANSMISSION IS ON! Your vehicle wirelessly transmits location, driving and vehicle health data to deliver your services and for internal research and data analysis. See www.toyota[.]com/privacyvts. To disable, press vehicle's SOS button.
This kinda confirms, unfortunately and sadly, that ChatGPT answers are probably just as good as human answers. And the data collection for your phone call went to training, not into a database where you officially opted out.
> This kinda confirms, unfortunately and sadly, that ChatGPT answers are probably just as good as human answers.
These people are paid to follow scripts and strict protocols. At best, this may suggest that ChatGPT answers are as good as a call center representative's answers.
I think there are a few reasons to be mad about this, but some are better than others:
1) People don't understand they're being monitored. I think this is a good reason to be mad. People should have some understanding of the agreements they make. It's part of being a functional adult in the world. It's also annoying that the companies keep spinning this as a tool to improve your driving, when it's clearly an attempt to price insurance against a person's actual driving habits.
2) The system's assessments are opaque. I don't have a good sense of how accurate any of these measurements are, nor what system is in place to ensure that. If the information collected is consequential enough to double a person's insurance costs, there should be some effort expended to be confident that the collected metrics actually reflect reality. I didn't see anything like that in the article, maybe I missed it, but it shouldn't just be some random team in a private company doing their best.
3) People's driving habits shouldn't be shared with insurance companies. This one ... this one I think is not great. It looks like the shared data at least tries to be anonymous -- they share driving behavior and times, but not actual location data. Heck, I'd be fine with scrubbing the times and just sharing the hard start and stop and speeding numbers (assuming point 2 above is addressed). I get that a knee-jerk defensiveness about privacy would make Thomas Jefferson proud or whatever, but we strike balances on public welfare and private freedom all the time. If you're itching to manufacture 1 gram of ricin to put in a sealed glass vial above your mantle, too bad, you can't. Cars aren't ricin, but they are the shortest path between most humans and homicide. If this kind of intervention induces people to be more careful to keep their current insurance rates, I think that's reasonable. Driving like a maniac is not a human right or a protected characteristic.
I've avoided accidents by hard braking twice in the last two years, once from deer bounding into the road, and once from a deaf old cat walking into the street.
I haven't been cited for anything in decades, and have never been in an at-fault accident. I drive the speed limit and have a dashcam. With the deer, I was actually 10 MPH under the limit.
So should my rates go up for these incidents where I successfully avoided hitting something? Insurers are unscrupulous and would use any excuse.
The insurance company would argue that you drive in an area with wildlife crossings. That makes you a higher risk even if you managed to avoid this deer. You are more likely than average to encounter another one in the future and may not be as fortunate.
I did pest control for a while, and my truck was equipped with a monitoring device that would beep if it detected unsafe driving. The thing was inconsistent enough to be nearly indistinguishable from random. It sometimes nagged me while driving straight at normal speeds, or going over a pot hole, or just stopping like normal at a stop light. At other times, it wouldn't go off for what should have been obvious "offenses"—hard stops, last-second swerves to avoid road debris, etc.
All in all, I think it was useless for actually policing driving behavior, but I did get identified (read: randomly selected) as the safest driver in the branch one month and got a bonus, so I guess that was nice?
Location can be relevant. There are both a quarter-mile drag strip and a circuit track near me that allow you to drive your own car on them.
Both styles of driving would be... Alarming from a telemetry perspective.
Afaik neither is covered by regular auto insurance anyways so it really shouldn't factor into rates. There's specific racing insurance, but it's quite pricey.
Not that I want them sharing location data, but pure acceleration/velocity data won't show areas like that.
I'm also not sure how well regionalized the data is. Though neither is good, there's a very big difference between going 15 over on the highway and going 15 over on back country roads with blind turns. Or between going 15 over on the highway vs in a shopping center parking lot.
It’s also difficult to determine if someone is speeding from data.
For example, according to the car, the speed limit on the road I live off of goes from 40 to 65 to 25 to 65 to 40 in about a four-mile span. Spoiler: it does not. It is 40 the whole way. So according to the car I am either going 25 under, 15 over, or exactly the right speed.
(And the 65 section in the middle? Blind corner. Idk where it’s getting its data but it is very very wrong)
Just like any other data collection and the "tailored" services built on it, the only purpose is to justify a charge, not to actually work. Do targeted ads work better than traditional ones? Who knows, but how can you say your service is better if it has nothing innovative? Just like "feature-rich" devices, it's just a sales pitch.
>People should have some understanding of the agreements they make.
People do not engage in meetings of the minds on these types of things. Manufacturers/insurance companies enter into agreements (and leave stickers that are unlikely to be read) which is a clear violation (imo) of contract law.
It's one thing to be aware of agreements you make; it is another to navigate a corporate surveillance hellscape of on-by-default, consentless surveillance that a bunch of psychopathic corporate types greenlit.
> It looks like the shared data at least tries to be anonymous
One of the main points of the article is that insurance companies are using the data to raise drivers' rates. How can they do that if the data is anonymous?
The car company can share the details based on the VIN (chassis number) rather than driver details.
Then the insurance company grabs the vehicle registration number when you ask for a quote and looks up the VIN on their side based on a security database to prevent resale of stolen cars or similar.
Ah, anonymous was the wrong word. I meant instead that the shared data tries to restrict itself to information that doesn't obviously fall under a right to privacy. For example, trip times are shared, but locations are not.
The flow of traffic on the highways where I live is consistently 15-20 mph above the posted limit. I wish everyone would slow down, but that doesn't change the fact that the safest way to merge is to accelerate hard and match their speed. The last thing I need is a financial incentive to be oblivious to my surroundings.
The only speeding ticket I’ve gotten in the past 20 years was for speeding on the on-ramp to get up to the speed of the highway traffic. Holiday weekend, so it was stop, ticket, and release. Repeat. No warnings given.
When I worked in car insurance, besides our own telemetry, we got at least
- Willis Towers Watson (WTW) aggregated driving data
- Verisk (afaik this was mostly around vehicles, not people)
- Various reports directly from state governments
- LexisNexis (multiple different report types)
Really any mobile app that has accelerometer or gyroscope access (even without GPS) can estimate driving safety. Using phone movement and angle, you can estimate driver vs passenger.
Cambridge Mobile sells equipment a lot of insurers use and, afaik, data as well.
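To make that concrete, here is a minimal sketch of how an app might flag hard-braking events from raw accelerometer samples. The threshold, sample rate, and event-duration logic are illustrative assumptions, not any vendor's actual algorithm, and it skips the harder driver-vs-passenger estimation entirely.

```python
# Minimal sketch (Python 3.9+): flag "hard braking" from accelerometer samples.
# Threshold and durations are assumed values, purely for illustration.
from dataclasses import dataclass

@dataclass
class Sample:
    t: float    # seconds since start of trip
    ax: float   # longitudinal acceleration, m/s^2 (negative = braking)

def hard_brake_events(samples: list[Sample],
                      threshold_ms2: float = -3.5,   # ~0.36 g, assumed
                      min_duration_s: float = 0.5) -> list[tuple[float, float]]:
    """Return (start, end) times where braking exceeds the threshold long enough."""
    events, start = [], None
    for s in samples:
        if s.ax <= threshold_ms2:
            start = s.t if start is None else start
        else:
            if start is not None and s.t - start >= min_duration_s:
                events.append((start, s.t))
            start = None
    if start is not None and samples and samples[-1].t - start >= min_duration_s:
        events.append((start, samples[-1].t))
    return events

# Example: a 10 Hz trace with one second of -4 m/s^2 braking
trace = [Sample(i / 10, -4.0 if 20 <= i < 30 else 0.0) for i in range(50)]
print(hard_brake_events(trace))   # [(2.0, 3.0)]
```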
Your responsibilities include: (1) informing passengers and drivers of your vehicle that data is collected and used by us, and (2) notifying us of a sale or transfer of your vehicle. If you do not notify us of a sale or transfer, we may continue to send data about the vehicle to the subscriber's Account Information currently on file, and we are not responsible for any privacy related damages you suffer.
This is why I like paper contracts. I would simply cross out a stupid clause like this before signing.
99% of the time the rep doesn't care, and if the company can't be bothered to put someone on the other side of the table who is actually paying attention or has bargaining power then they deserve it.
The US doesn't just need laws about disclosure of these practices. It needs to mandate that this kind of corporate surveillance must be a clearly labeled opt-in and cannot be mandated by any contract.
Yes, but to me there's another important issue, with all the tracking tech in our vehicles, who actually owns them? We need to treat this similar to the "right to repair" issue! I paid my money, so I own the product. [IF] I own the vehicle it should be my right to say what software runs in the background.
Of course the company can say, "If you don't like our product, don't buy it." If I want to keep up with the latest safety upgrades to my vehicle to protect myself and all car companies have the same tracking software, my only option is to look for a "dumb" vehicle. This is blatantly unsafe and irresponsible. So, they're saying that my safety comes with a price other than the $40K I shelled out?
> Yes, but to me there's another important issue, with all the tracking tech in our vehicles, who actually owns them?
It's even worse. Your car acts as a blackbox against you. A 1990's car? I can do whatever the fuck I want to, I can drive it offroad, I can speed, I can even be near a bank robbery or whatever.
A modern car? Someone robs a bank a few hundred meters from where I am, and now the police will come knock on my door because the IMEI of my car was near the bank when the robbery happened. I speed a little bit to overtake some dumbass driving 20 km/h below the limit, the police makes a dragnet subpoena against my insurance / the data processor from the manufacturer, and issues me a ticket.
> Of course the company can say, "If you don't like our product, don't buy it."
Honestly I'm pretty tired of that "our way or the highway" nonsense. Society needs to make it so they actually can't say that. Make respecting us a precondition for their continued existence. As in they literally get liquidated if they say that even once.
That's how we deal with sociopaths leveraging these non-negotiable "terms" against us. They have zero empathy, they view us like cattle to be marked and monitored and turned into cash flow. So there is no reason to empathize with their nonsense viewpoints either. Just make whatever they're doing illegal. Doesn't matter how much money they lose.
> who actually owns them? We need to treat this similar to the "right to repair" issue
Ownership is a bad framework for this issue—it’s too ambiguous. You can “own” a vehicle all you want, that doesn’t give you the right to fuck with its odometer or catalytic converter.
> In recent years, automakers [...] have started offering optional features in their connected-car apps that rate people’s driving.
At least the programs are (currently) opt-in.
This amusing anecdote is buried:
> One driver lamented having data collected during a “track day,” while testing out the Corvette’s limits on a professional racetrack.
> [...] he was denied auto insurance by seven companies [...]
There is another commenter further up that says they had to opt out on a Toyota and the rep acted like he didn't know until the opt out text was read verbatim.
I just purchased a Camry Hybrid from a Toyota dealership. The operator tried to tell me that "because I financed it they cannot turn off analytics." I had no financing; I paid cash.
Pressing the SOS button to cancel [as the sticker suggested] was met with so much difficulty that (while the operator was still on the line) I found the fuse panel and pulled out `DCS` to disconnect the call/tracking. This ended our transmission.
But are they really optional? I can’t imagine that the telematics link is going unused for the value it provides (i.e. crowd-sourcing for speed and road map data).
The worst part is the assumptions about who's driving the vehicle.
I would be willing to bet even if you told the insurance companies it was totally legal on a professional race track -- they'd say "Nope, we still don't want to insure someone that takes his car on professional race tracks like that."
The article makes clear that most people don't know what's happening with their data. They opt into something else and this data collection is included - that doesn't sound like much of an 'option'.
My biggest concern is that rather than comparing difficult to identify behavior against claim rates, they will penalize behavior that is easy to identify. For example, yesterday I was traveling 15 MPH over the speed limit on a multi-lane highway where traffic is often traveling 10+ MPH over the limit (the limit is objectively wrong for a divided, grade separated, access controlled highway). I typically drive as far right as I can to make room for faster vehicles, but eventually got stuck behind someone camping in the left lane. When opportunity presented itself I went around them in the center lane. They expressed anger at this by encroaching into my lane to squeeze me against traffic in the right lane. There were four inches between their vehicle and my side mirror. Who is driving dangerously, and more likely to cause an accident? I would argue it's the driver who is obstructing traffic and behaving aggressively toward others on the road. But if GPS isn't accurate enough to show their lane deviation, it's a lot easier to ding me for my speed.
Making it easier to determine who's at fault in cases like you mentioned would involve more sensors, radars, cameras etc. So we either 1984-ify everyones car or we just don't do any monitoring at all (since half-assing it can lead to false positives). I have a feeling insurance companies (and therefore governments) will slide more towards the 1984 side to save a couple dollars.
Insurance companies don’t have to, people are voluntarily installing dash cams to show who was at fault (or at least show they were not). Chances are, someone is recording your collision, and you might as well have your evidence to fight against someone else’s.
I'm a huge fan of telemetry insurance. I have it personally and it saves me around 300€/year on my car's insurance because I am a very defensive driver.
However, this being integrated into the vehicle in an absolutely opaque way is a huge step up and a really unsettling privacy violation.
For this to be ethically viable imho, there need to be a few prerequisites
- it’s transparent what has been transmitted
- you can always easily opt out, but you may lose the discount you earned
- your driving can’t make your premium go up beyond the base premium without the discount (sensors will never paint an entirely accurate picture)
>> I'm a huge fan of telemetry insurance. I have it personally and it saves me around 300€/year on my car's insurance because I am a very defensive driver
My sister had it, and it was the biggest piece of crap imaginable. The system would send her emails warning her about "lack of smoothness" in her driving, because... the system would rate her down every time she went over a speed bump.
The biggest problem was that she would get emails saying "we've detected you were going 70mph in a 20mph zone, if this continues we will cancel your insurance", so we would call them and ask them to provide GPS logs, which they always would - and the logs would always show that she was going legal 70mph on the motorway, which at one point goes above a smaller 20mph road - and of course the system was stupid enough to just query the speed limit for every point, not realizing that this wasn't the road she was actually on. We would email them back explaining, and the warnings would go away until she went on that road again.
Absolute waste of time and money, I think the insurance company would need to pay me to have this fitted, the nerves it cost my sister to have that piece of crap in her car weren't worth whatever discount she got for it.
From the other side, it’s essentially a fine for people who respect their privacy. Insurance prices will adjust to the adoption of this discount, will rise to the current normal and only people who don’t opt in will be hit with the extortion fee forcing them to opt in.
That last point is merely a way for you to get used to this system. Once enough people allow the spying, they'll increase the price if you don't allow the spying.
And after that they'll mandate it for everybody.
I already pay a premium for having more horses under the hood. I don't want to get dinged when I use my car's power.
I'd be a fan, too, if they couldn't use the information to raise rates. But even the best drivers brake hard to avoid accidents from time to time, and in the US, insurers are dirty.
I don't know about OP but in Poland https://yanosik.pl/ offered such deals ( https://payhowyudrive.pl/ ). It is probably a bit self defeating - the app's main function is warning about speed traps, that means unsafe drivers as significant part of its users.
The easiest way to disable this in a Chevrolet with OnStar is to pull the fuse (Fuse 38 under the dash for the Chevrolet Malibu 2024). Other options disconnecting the antenna (can still connect if strong signal), or pulling out the box/microphone (disassembly required). At least for the 2024 model CarPlay features seems to keep working, but I haven't tested Bluetooth yet.
Many GM models used to have a bridge/jumper between the network daughter board and the rest of the car. Pretty easy and didn't affect anything else (sometimes the fuse for OnStar also covered your Bluetooth or voice commands).
The problem with such issues of data misuse is that people only provide 2 solutions.
a) Go off grid. Don't use the tech that these cars come with.
The problem with this is that it is impractical for people who see a lot of value in using this tech.
b) Pass more regulation.
I am a Hayekian and I believe that regulation will not help against people who know the ins & outs of the regulation; it doesn't stop them. It just means corporations are willing to misbehave as long as they can play legal gymnastics and pay rudimentary fines.
Now,
The third option, which I see as the best but which isn't talked about much, is the promotion and adoption of homomorphic computing or homomorphic encryption.
I am not a cryptographer so I don't fully understand its limitations. But adopting this would simply make all these data abuse issues vanish.
Cryptographers, why hasn't homomorphic computing or homomorphic encryption been massively adopted?
>Sure, the car company will homomorphically encrypt your driving data when it sends it to its own servers.
You can encrypt the data such that the insurance companies cannot target any particular individual (which is my problem here) but they can still use the data to improve their insurance pricing models.
I have no problem with a health insurance company using population data to find out how many are susceptible to say cancer.
But I have a problem when they use this data to overprice a particular individual's insurance because their genes say they are susceptible to cancer.
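For anyone wondering what "insurers can use aggregates but can't target individuals" could look like mechanically, here is a toy sketch of additively homomorphic (Paillier-style) aggregation: individual values stay encrypted, yet their sum can be computed and decrypted. It is purely illustrative, with small hard-coded primes and no security hardening; it is not a description of anything automakers or insurers actually deploy.

```python
# Toy additively homomorphic (Paillier-style) aggregation. Illustration only:
# tiny primes, no hardening, not secure. Requires Python 3.8+ for pow(x, -1, n).
import math, random

def keygen(p=2_147_483_647, q=2_147_483_629):   # small primes, demo only
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)            # valid because we use g = n + 1
    return (n,), (n, lam, mu)       # public key, private key

def encrypt(pub, m):
    (n,) = pub
    n2 = n * n
    r = random.randrange(2, n)      # should be coprime with n; fine for a demo
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    n, lam, mu = priv
    n2 = n * n
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

def add_encrypted(pub, c1, c2):
    (n,) = pub
    return (c1 * c2) % (n * n)      # multiplying ciphertexts adds plaintexts

if __name__ == "__main__":
    pub, priv = keygen()
    # e.g. per-driver counts of hard-braking events, encrypted before upload
    counts = [3, 0, 7, 1]
    cts = [encrypt(pub, c) for c in counts]
    total = cts[0]
    for ct in cts[1:]:
        total = add_encrypted(pub, total, ct)
    print(decrypt(priv, total))     # 11: the aggregate, without seeing individual values
```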
a) Impractical because cars are needed for daily life and there’s no incentive for automakers to not sell your data.. so all cars will unless this becomes a compelling enough product difference to move the needle on profits,
b) Legislation/regulation that creates the right incentives isn’t easy, but certainly doable.
c) Impractical because homomorphic encryption is absurdly computationally expensive, is still not a fully solved problem, and.. in what universe do automotive companies implement this far-fetched and expensive means of privacy without some.. err.. regulation?
Which specific regulation do you think has a history of not being impactful? I find that the devil is in the details in this argument, because most regulation is massively impactful and helpful, and I find that the talking point that we need to get rid of it is generally loudest from those who would profit the most from not following those rules anymore.
GDPR for example has done nothing to protect people from this particular case of data misuse.
The problem with English law is that you have to explicitly declare what is wrong ahead of time. So we just end up with an endless need for regulations.
If we had legal systems like the Code of Hammurabi, they would work way better.
Since nobody answered the question, the reason is that it's terribly, absolutely, insanely slow. It's possible, just requiring hundreds of thousands or millions of times as much work as, say, a normal lookup in a database.
> I am a Hayekian and I believe that regulation will not help against people who know the ins & outs of the regulation; it doesn't stop them.
That is such a funny thing to say. The car industry is heavily regulated and car companies do work with the regulation. They are already regulated on safety, fuel standards, dimensions... Adding data protection into the mix makes sense.
The auto industry has fought tooth and nail against safety requirements[1] and still fights today against more stringent fuel standards[2][3].
Not only would they fight regulations like data safety that would open them to potential litigation when they lose the data or sell it to the wrong player, but they would win. Privacy isn't the political football that the environment is, and you can't point to death statistics like you can with safety issues.
If I am a corporation and I am willing to break regulations, how will you force me to use homomorphic encryption? Why should I pass on gathering data that I can resell?
The average buyer won't understand or care about it, so there is no direct pressure from consumers. I think regulation is not optional (and homomorphic encryption may be mandated if viable?). Breaching regulations is often a "cost of doing business", but some recent regulations (such as GDPR) can actually create very large fines in many countries. So it seems that what may be needed is good enforcement and measured penalties. Another deterrent would be having penalties that are not money.
> Breaching regulations is often a "cost of doing business", but some recent regulations (such as GDPR) can actually create very large fines in many countries.
This is the issue with so many laws. Stricter fines basically never deter would-be offenders from committing the crime. What deters people is a high chance of getting caught.
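A quick expected-value sketch of that point, with entirely made-up numbers: when the chance of being caught is low, even a very large fine barely dents the expected payoff of the violation.

```python
# Made-up numbers, purely illustrative of the deterrence argument above.
profit_from_violation = 5_000_000    # assumed gain from breaking the rule
fine = 50_000_000                    # assumed penalty if caught
p_caught = 0.02                      # assumed probability of enforcement

expected_penalty = p_caught * fine                 # 1,000,000
print(profit_from_violation - expected_penalty)    # 4,000,000: still profitable in expectation
```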
Do companies ignore regulations? Sure, some do. But saying 'they will just pay the fines' ignores the fact that we could make the fines existential, or punish board members by kicking them out of the industry. The answer to 'the regulation we haven't even tried won't work if we do it improperly' is 'let's do it, and do it properly'. I have no idea what homomorphic encryption is, but rarely do 'let's add more tech to magic bullet a human problem of incentives' solutions work.
I think a problem in this area is that if one avenue of data collection is denied, another one will be implemented and it becomes a game of whack-a-mole.
For example the USG is forbidden from collecting communications from US citizens, but that does not keep it from buying this information from private domestic sources or from other governments.
> I am a Hayekian and I believe that regulation will not help against people who know the ins & outs of the regulation; it doesn't stop them.
I work in the automotive industry. It is very heavily regulated. The majority of people have never heard of ISO 26262 but it's keeping billions of people safe every day. Data privacy can work in the same way.
> I am a Hayekian and I believe that regulation will not help against people who know the ins & outs of the regulation; it doesn't stop them. It just means corporations are willing to misbehave as long as they can play legal gymnastics and pay rudimentary fines.
So you try nothing and are out of ideas. Amazing.
> homomorphic encryption
Let me get this straight, you think regulation is too hard because corporations don't want it, but you don't see any problem with homomorphic encryption, which is difficult to implement, poorly understood by consumers, AND provides privacy guarantees that corporations don't want?
It's pretty clear we've reached the point where technology has shifted to working against us, and not for us anymore.
I work in tech but as far as I am concerned, you can keep all your smart homes, cars and other gadgets and soul sucking (anti) "social" apps.
Somewhere along the way technology was hijacked to control us rather than empower us. And if you don't like it: shut up because "progress" is inevitable
> we've reached the point where technology has shifted to working against us
Everyone has always said this since the dawn of farming. It's not a particularly useful insight: the question is how, and whether it should be banned or balanced.
Technology is amoral. The power shift happened because we are no longer in control. It's these corporations who are the masters of the computers now. They're just allowing us to use their computers. Of course those computers work against us, they are treacherous by definition.
All newly manufactured cars sold in the US already have "black box" data recorders that can be dumped in the event of an accident. In many cases this can even be done without a warrant as of a decade ago [1] - not sure whether that's changed. In any event it seems as though this is a natural evolution in concert with those voluntary OBD-II devices that insurers started using to record driving habits.
> "More specifically, automakers are selling access to the data to Lexis Nexis, which is then crafting “risk scores” insurance companies then use to adjust rates. Usually upward"
In an ideal world, such data-harvesting might lead to cheaper prices / a more efficient insurance market - which would make the privacy loss worth considering from a trade-off standpoint, at least in theory.
Unfortunately it's instead likely to just lead to higher margins for insurance companies. And the only way to compete would be to harvest more data for better predictions.
> In an ideal world, such data-harvesting might lead to cheaper prices / a more efficient insurance market - which would make the privacy loss worth considering from a trade-off standpoint, at least in theory.
In an ideal world (read: perfect information knowledge), this would lead to insurance being a bad deal for every consumer of it. In the theoretical position where insurance companies can accurately price each individual customer based on their habits, they will charge them exactly what they cost _plus_ a margin.
This is only useful for a consumer if they cannot access cash or a credit line to pay for a sudden large expense. Instead, insurance effectively becomes paying the credit line ahead of time.
> This is only useful for a consumer if they cannot access cash or a credit line to pay for a sudden large expense.
Isn't that the main point of insurance?
Insurance can also socially redistribute bad things. Which fair enough it is in practice a result of insurance but I don't think that's what it was invented for. And indeed the better the insurer's crystal ball the smaller this effect is.
Although in practice I don't think there ever will be a crystal ball good enough to make insurance a bad deal for everyone like that. You always have to insure against another driver being bad or just plain bad luck.
No, the point of buying insurance is to reduce your individual variance even though your average cost goes up. It's not an individual savings plan, but rather shared pooling of risk.
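A toy numerical illustration of that variance point, with assumed figures (a 1% chance of a $30,000 loss against a $400 premium): the insured driver pays more on average but eliminates the catastrophic tail.

```python
# Assumed, illustrative numbers: insurance raises the *average* cost
# (the premium includes a margin) but collapses the variance of a bad year.
p_loss, loss, premium = 0.01, 30_000, 400

mean_uninsured = p_loss * loss                                   # 300.0
var_uninsured = (p_loss * (loss - mean_uninsured) ** 2
                 + (1 - p_loss) * (0 - mean_uninsured) ** 2)
print(mean_uninsured, round(var_uninsured ** 0.5))   # 300.0 2985  (mean, std dev)
print(premium, 0.0)                                  # 400   0.0   (fixed cost, no variance)
```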
In an ideal world, such data harvesting would be illegal, with liability adhering to the executives pushing for and approving the initiative as well as any legal counsel involved. Acquiring the data should require explicit, truly informed, and revocable consent not buried in a bunch of BS and not required for the purchase of a vehicle or insurance.
I wholeheartedly agree that the dark patterns around consent are atrocious. But I also think hn is probably biased in its valuation of an individual's data.
If companies offered say a $50/month discount on car insurance premiums in exchange for gathering data, I imagine a large proportion of people would indeed opt in to that (setting aside issues of selection bias or trust in this ideal world)
After seeing this article, I did a bit of searching and you can get your LexisNexis report and also opt out of data sharing, along with deleting associated data.
I did it and recommend everyone else does as well.
Is there any hope for something like a Privacy Bill of Rights to ever be passed? I feel like privacy is an inalienable right for all humans and the passage of something like this would be a light speed jump ahead for personal freedom in the new era we find ourselves in. Just because tech enables it doesn’t make this any creepier than someone following behind you in the woods stalking you on your horse 200 years ago.
Like it or not most of the US is oriented around driving and it's basically unavoidable for most adults. Using that as justification to erode everyone's rights feels deeply wrong to me.
Nobody? There are countries other than the USA. I've never heard of signing away rights in respect of blood as a condition of getting a licence. Is this a real thing in the USA?
I have been convinced for several years now that insurance companies are likely buying up personal data from many different sources. They seem to be ideal consumers because it'll lead to better outcomes when they can increase rates on those that identify as risky.
I knew a guy who worked in Finance. Whenever he would buy alcohol, or cannabis (legal where I lived) he would only pay cash. His concern was that, if his credit card usage data were sold, it could increase his premiums.
It’s still unknown if someone engaging in risk will end up in costly collisions, or other events. Just because you engage in risk doesn’t mean it will bite you, only that it is more likely to bite you.
Besides why should less risky drivers subsidize riskier drivers?
This has been true for several years. An insurance agent once told me that there are life insurance companies dropping the requirement for blood draws / medical exams and are just buying prescription records to correlate with financial, educational, and other behavioral data.
As a safe driver, I like the idea of dangerous drivers paying more. There's no good reason the participants should not be aware they are under surveillance though.
Sidenote: I wonder if they've considered close follow distance or frequent lane changes as a risk factor.
Spot enforcement with appropriate training and vehicle improvements is more than appropriate for numerous reasons:
1) Regression to the mean will happen with 100% enforcement/over-enforcement. The new standard for 'safe' will collapse to an unobtainable level which will not benefit society in the long run.
2) Safety is not my #1 concern. The number one cause of death on roads is being born. I value getting to my destination without being tracked more than the potential safety gains of strict monitoring. I believe in rational safety measures but "It's safer so we must do it" is an argument I no longer accept. I want to live a good life, not just a safe one.
3) We have seen time and time again that personal information collected by companies rarely benefits consumers and instead is always used to benefit companies. This is no different. I have negative trust in industry handling my data for my benefit.
The idea of dangerous drivers paying more for insurance is fine. It's probably better than the idea of drivers with bad credit paying more for insurance.
The problem is in how is dangerous driving assessed. Simple to apply rules lack the understanding of conditions. Telematics are going to be low bandwidth data, almost certainly without enough data to form an understanding of conditions.
The thing that’s somewhat ironic here is that the car companies could make cars safe by default. For example, they could make it not possible to accelerate faster than one needs to. They could put in speed limiters that are triggered by the speed limit on the road. They could stop marketing and selling over powered cars.
Instead they market cars as exciting race track like vehicles, things that let you do what you want, when you want. And now they will collect data on the people who actually do that.
Personally I would prefer a car that helps me be a safer driver by following the law. Ensuring there are no pedestrians or cyclists in front of me, etc. But at the end of the day, automated enforcement is a good thing, so maybe this will help some people become safer drivers, though the reality that’s probably more likely is that fewer and fewer people will be able to afford/get insurance, and because our country is so car dependent, they will just drive without.
> For example, they could make it not possible to accelerate faster than one needs to.
I was in a rental car that had this once. Was on the highway, needed to get around another driver who was being unsafe. Was unable to do so because of the limiter. It was easily the most unsafe vehicle I've ever driven as a result. These mechanisms lack situational awareness and nuance, and thus are a direct threat to my personal safety. They very much need to be banned as a matter of course until such a time as humans aren't allowed to drive at all.
The problem though is that inevitably they will eventually automatically label anyone who does not "consent" to total surveillance as risky or dangerous.
The easiest way to disable this is by physically removing the cell modem from your vehicle, which is very straightforward. Without egress, the only way for data harvesting to occur is by physical access, typically at a dealership. However, virtually all automotive cell modems are either packaged on the same chip as the GNSS receiver, or colocated on the same daughter board. As such, choosing to retain control over your data typically comes at the cost of foregoing the built in navigation system and other features such as emergency calling.
There are insurance companies that allow you to voluntarily submit to tracking in exchange for reduced premiums. What is happening here is that those savings are being passed on to auto makers as an extra revenue stream.
US lawmakers could put a stop to this and every other privacy scandal over the years at any time, you know, by passing a strong privacy law, but nahhhhhh we can't do that!
It's yet another reason why people should buy older cars (preferably 2012 or older), since the automotive, insurance, and data broker industries don't give a jack about your privacy, and sadly the US isn't going to do jack about this either until we can elect more people to office who do care and pass a strong privacy law in the process.
The worst thing about this is that all of their conclusions about what data constitutes "bad driving" or "risky driving" is dead wrong.
The signs they consider to be "bad driving" are high-g braking and turning.
Yet these are EXACTLY the same signs created by a highly skilled driver or racer operating at the limit, as they would to avoid an accident (thus costing the insurer $0), where the same situation would catch 90% of the low-g drivers in a wreck that totals the vehicle and causes injuries. A core element of high-performance driving for accident avoidance and racing is to understand the limits of tyre traction, and how to operate the car up to those limits — but not over them — i.e., just under the limit of sliding (sliding friction is always less than static or rolling friction), and to choose lines that maximize available traction.
Distinguishing the signs to tell a high-skilled driver from a bad driver requires more than just "is that number high?". You must look at the circumstances, the frequency, the conditions, the rate of increase and decrease of pressure, the slip angle, the grip state of all 4 tires, and more. But of course, no one bothers to do this.
It is the same kind of institutional stupidity that causes a world-class weightlifter with 4% body fat to be classed as "obese" because s/he scores high on the stupidly simplistic BMI scale (a ratio of weight to height).
Except that with BMI, insurance companies are not allowed to re-rate people, and doctors can instantly adjust treatment when they see the person is obviously not obese but highly trained.
With auto insurance, they can secretly re-rate us on bogus numbers that actually down-rate the highly skilled.
Seems more attractive with every passing year to rebuild older nice cars than get into the new rolling spyware contraptions.
Well, if one is stupid enough to get a race car with telemetry then the spying is deserved. The skill level is irrelevant insurance-wise, as it doesn't last, varies within the day, and is of no use on open, shared streets.
Now the dream car will soon be an electrified lada niva, no electronics, speeding impossible.
You do realize that wheel speed sensors and g-force sensors are already standard equipment in most cars, and that this is part of the data they are selling, right?
Electrified Lada Niva, eh? Depending on how it's electrified, it might go waaayy faster than would be sane... ;-)
My example is NOT about "self identified" "experts", but REAL experts who ACTUALLY have the skills. They also are typically very safe on the roads and know that race-like on-the-limit driving on the streets is idiocy.
The point is that people who ACTUALLY have these skills have a far wider margin of safety than the ordinary driver, and far better capability to avoid accidents. But, they will also — with that far wider margin of safety — often turn or brake with higher than ordinary G-forces.
For example, ordinary street tires and suspensions on modern cars can handle 0.9G lateral or braking acceleration. Ordinary people get uncomfortable at 0.2G lateral acceleration.
An unskilled driver approaching 0.25G lateral acceleration does risk exceeding adhesion limits and losing control because they are insensitive to inputs and feedback. In contrast, a skilled driver can turn at 0.25G all day with virtually no risk, as they are accustomed to driving at 3-4 times those Gs, and are situationally aware, sensitive to inputs and feedback, and choose lines and inputs that avoid the limit.
They are far less of a risk than an unskilled driver at 0.1G. Yet, the skilled driver will get flagged as "bad".
With deeper understanding and analysis, they could make the distinction between actual expert drivers vs overconfident idiots. But I see no indication that this will happen.
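For readers who want to sanity-check the g-figures being thrown around: lateral acceleration in a steady-state turn is roughly v^2 / r. The speeds and radii below are my own illustrative picks, not numbers from the thread.

```python
# Steady-state cornering: lateral acceleration a = v^2 / r.
# Speeds and radii are illustrative assumptions.
G = 9.81  # m/s^2 per g

def lateral_g(speed_kmh: float, radius_m: float) -> float:
    v = speed_kmh / 3.6               # km/h -> m/s
    return (v * v / radius_m) / G

print(round(lateral_g(50, 60), 2))    # ~0.33 g: a brisk city corner
print(round(lateral_g(100, 90), 2))   # ~0.87 g: near the grip limit of street tires
```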
I think it varies manufacturer to manufacturer, but even those that make it possible seemingly make you jump through hoops. I was researching potential replacements for my 16 year old car and found a lot of discussion about this re Mazda models:
> Mazda CEC makes it quite difficult to actually request/disable your TCU. It can take many phone calls and escalations to get someone to understand the request and actually "push the button" to send the disable event to your car.
Honestly I’m just totally disinterested in just about every current new car model.
Yes, many models have guides out there for disabling wireless connections. On a previous vehicle of mine, it was as simple as disconnecting the bridge/jumper between the main board and the wireless board.
Being mad about this is like being mad at the thief who stole your belongings and then pawned them. The crime was spying on you in the first place. Automakers should not have any data to share or sell or give to law enforcement with a subpoena.
> An employee familiar with G.M.’s Smart Driver said the company’s annual revenue from the program is in the low millions of dollars.
Is that a lot of money for GM? I would have guessed no, but it doesn't seem like very much for selling out their customers like this. Either it's more to GM's profits than I'd expect, or they really don't expect much PR blowback risk at all?
I don't know if they are right or wrong, but...
> Drivers who have realized what is happening are not happy. The Palm Beach Cadillac owner said he would never buy another car from G.M. He is planning to sell his Cadillac.
Is there a good source for which makes, models, and model years “phone home”? I would absolutely take it into account when shopping for a new or used car, but I’ve had no luck with Googling.
I called to turn off the data in a Toyota, and the guy wanted my name, phone number, email address, physical address and even more I can't remember right now. I was like "why do you need this info?" He said, "We need a record of who made this request for our records." I told him "do you understand that I am calling your company specifically because I don't want you to have records?" This went round and round about three times before I just gave him fake info.
Were you able to obtain any records of your own where they agree to cease collection that you can hold against them if they continue? Do you have any means of verifying that the collection has ceased? I don't believe that their word means much without these.
Ladies and gentlemen, if we want a fair society we MUST:
- mandate FLOSS by law, starting from the first SLoC, meaning no company can suddenly publish software to sell something with it; the software must be published from day zero of its development or the hw/sw/service can't be on sale;
- mandate local-first for anything, so connected cars are OK, but they just offer a simple DynDNS mechanism the owner can add to their own domain name as a subdomain like car.mydomain.tld and reach a relevant set of APIs the car offers. All data collected by the OEM must pass through the car owner's systems, in an open, readable, and documented form.
If this is not mandated, by popular acclaim, surveillance capitalism will stay, since it's the new tool to know and conform the masses. Surveilled people are known, and knowing they are surveilled they try to behave in a "social norm" way, fearing judgment/social scores; as a result people evolve toward slaves who obey those who establish and update current social norms. We all know cooperation is needed to do anything; those who compete then need many who cooperate, obeying their orders, to craft anything. In the past it was religion, then money, and now social scoring is the way to keep the masses in line. Such a powerful tool is not something anyone will accept losing without a desperate and limitless fight. Only a large public reaction can force a change.
There's a lot of jerk drivers who go way too fast and drive very dangerously. They should have to pay significantly more for it. For people that drive correctly, they should be charged less as well. I don't see why this is an issue.
Sure they are. Speed is easy to detect, for instance. Someone driving 50mph in a 25mph school zone should have massive increases to their insurance, as they present huge risk.
This happens with healthcare data too. Every prescription you fill is tracked and used as input data for many insurance models that make health insurance pricing decisions.
They share your data in order to help lower your insurance rates.
Imagine what your premium might be without this service.
For example, I drive less than 900 miles a year, have had no accidents, citations or thefts and keep my 10 year old car in a garage. Yet my payments are $1500 per year. And after getting estimates from several companies, this was the lowest we could find.
Even with this service, the inflation rate for auto insurance is higher than anything else in our family budget.
Could've just been "inflation" (read: Opportunity to jack up prices) too. Although if one car company is going to be on the bleeding edge of data collection & sharing it'd probably be Tesla. They're the most Silicon Valley of all.
Can someone educate me why an insurer should not know one's driving habits? I'd imagine that the risks calculated from one's driving habits will be more accurate than those derived from only past accidents, car color, user profile, etc.
For me it’s because of this dirty concept called "privacy" and it’s the reason why insurers don’t have access to the list of items that I buy at the grocery store (also health records, name of sex partners, what I do all day long, whether I walk enough every day, etc.)
What sort of "reputational risk" do you think they are taking on?
Data sharing with third parties is ubiquitous in almost all industries. Every single company that deals with financial products reports account information to third parties (Experian, Equifax, TransUnion, Early Warning Services, ChexSystems). If you return an item at a retail store it gets reported to fraud alert databases. Most medium to large employers report the contents of the paychecks of their employees to The Work Number. Insurance claims are reported to LexisNexis. Oil change companies report mileage to CarFax, which insurance companies use to look up if you're reporting accurate mileage.
Data reporting and sharing is ubiquitous; it's standard operating procedure. Having a few "privacy nerds" complain about it on the Internet is not risking their reputation.
> What sort of "reputational risk" do you think they are taking on?
> a few "privacy nerds" complain about it on the Internet is not risking their reputation.
The news about GM's OnStar tattling (their words) on drivers is front page on several big news sites like CNN. This is not just some privacy nerds, this is a whole bunch of mainstream media outlets calling out GM by name.
I'm confident the PR team at GM is working overtime right now to try and find a mitigating spin.
> It doesn't seem in the car company's interests to take on the reputational risk for this kind of financial reward.
Tell that to Boeing; they're on course to tank the entire company because of the financial shenanigans they pulled after 1997.
As soon as a company goes publicly traded, the incentives change - there is no more priority on long term, the only thing that matters is INVESTORS INVESTORS INVESTORS (read that one in your finest Steve Ballmer voice).
Short-term profit, long-term losses - that kind of thing is done a lot.
Also, companies seem to work against their own interests quite often. The spyware is probably on some separate budget with separate bonuses attached. So "locally" in the department it might make financial sense to spy on the users.
Hear hear! And why not? Fuck the consumer I say! The one thing we can all agree on is that human dignity must be paid for in cash. If normal people wanted to be treated with respect then they would be high earners like us.
It gives off creepy vibes, but if you stop to think about it - this can stop good drivers from subsidizing the bad drivers. Not that the insurance companies are going to lower the premium on good drivers - if you have a problem with that, talk to capitalism. But bad drivers getting higher premiums is good for everyone.
> But bad drivers getting higher premiums is good for everyone.
Not necessarily. In many parts of the United States, a car is the only viable mode of transport. If you price the bad drivers out of the insurance market, they will forgo insurance all together. Then, if they cause a loss, they will be uninsured and the other driver's insurance will have to pay for the loss (or spend resources in costly suits) anyhow. So, then good drivers premiums will need to go up to compensate for the extra "bad drivers can't afford insurance" risk that good driver's carry. We end up in a similar situation in a roundabout manner but with the added element that now all our data is stored on everyone's servers.
I mean, Tesla has their own insurance product that they claim is better and cheaper than alternatives because of the data they track. People cheered for this.
Personally, I'm not opposed to dangerous drivers paying higher rates, but the devil is in the details.
Imagine a scenario where a bunch of those animals brake-check you, and then your insurance company calls you up and says "Hey jgalt212, we're seeing that your forward collision avoidance system got activated too many times this year. We're going to flag you as a tailgater and up your premium by 80%. Have a lovely day."
Playing devil's advocate, the answer is that brake checking only works if you are tailgating. If you increase following distance such that you cannot be brake checked, the insurance company has succeeded in making your driving habits safer.
As opposed to the jackass who's pulling high lateral G's and going 90 on the freeway. I think I can explain my way out of the insurance hike more easily than the aforementioned jackass--who actually should be deemed uninsurable.
https://web.archive.org/web/20240313200717/https://www.nytim...
> It was only after I read him the sticker text verbatim that he went into a scripted response and confirmed the opt out.
The confusion was likely a scripted response too.
I was looking at the Prius Prime for a next car--thanks for the heads-up!
Prius Prime has a single fuse [labeled `DCS`] which, once removed, actually disables the computer module which connects to the two cell antennas.
If I get a new car I'm just taking the modem out.
If an insurance company can’t find data on you when they expect it, won’t they just charge you the high risk premium?
No, thanks. I'll share nothing.
Amen. It's their job to calculate risk. Not my job to be "transparent". The ratchet only goes in one direction.
I won't be an Amazon driver in my own car.
1 reply →
The insurance company would argue that you drive in an area with wildlife crossings. That makes you a higher risk even if you managed to avoid this deer. You are more likely than average to encounter another one in the future and may not be as fortunate.
I did pest control for a while, and my truck was equipped with a monitoring device that would beep if it detected unsafe driving. The thing was inconsistent enough to be nearly indistinguishable from random. It sometimes nagged me while driving straight at normal speeds, or going over a pothole, or just stopping normally at a stop light. At other times, it wouldn't go off for what should have been obvious "offenses": hard stops, last-second swerves to avoid road debris, etc.
All in all, I think it was useless for actually policing driving behavior, but I did get identified (read: randomly selected) as the safest driver in the branch one month and got a bonus, so I guess that was nice?
2 replies →
Did your rates go up? Or is this a straw man argument?
1 reply →
Location can be relevant. There are both a quarter-mile drag strip and a lapping circuit near me, and both allow you to drive your own car on them.
Both styles of driving would be... Alarming from a telemetry perspective.
Afaik neither is covered by regular auto insurance anyways so it really shouldn't factor into rates. There's specific racing insurance, but it's quite pricey.
Not that I want them sharing location data, but pure acceleration/velocity data won't show areas like that.
I'm also not sure how well regionalized the data is. Though neither is good, there's a very big difference between going 15 over on the highway and going 15 over on back country roads with blind turns. Or between going 15 over on the highway vs in a shopping center parking lot.
Speeding is contextual.
It’s also difficult to determine if someone is speeding from data.
For example, according to the car's speed-limit data, the road I live off of goes from 40 to 65 to 25 to 65 to 40 in about a four-mile span. Spoiler: it does not. It is 40 the whole way. But according to the car, I am either going 25 under, 15 over, or exactly the right speed.
(And the 65 section in the middle? Blind corner. Idk where it’s getting its data but it is very very wrong)
4 replies →
Just like any other data collection and the service “tailored” to it, the only purpose is to justify a charge, not to actually work. Do targeted ads work better than traditional ones? Who knows, but how can you claim your service is better if there's nothing innovative about it? Just like “feature-rich” devices, it's just a sales pitch.
>People should have some understanding of the agreements they make.
People do not engage in meetings of the minds on these types of things. Manufacturers/insurance companies enter into agreements (and leave stickers that are unlikely to be read), which is a clear violation (imo) of contract law.
It's one thing to be aware of agreements you make; it is another to navigate a corporate surveillance hellscape of on-by-default, consentless surveillance that a bunch of psychopathic corporate types greenlit.
> it's clearly an attempt to price insurance against a person's actual driving habits.
I think it's a combination of two strategies.
1) searching for a reason to not pay a claim.
2) searching for a reason to increase your pricing, while hiding average driver behavior from you to increase their bargaining power
> It looks like the shared data at least tries to be anonymous
One of the main points of the article is that insurance companies are using the data to raise drivers' rates. How can they do that if the data is anonymous?
The car company can share the details keyed to the VIN rather than to driver details.
Then the insurance company grabs the vehicle registration number when you ask for a quote and looks up the VIN on their side via a database meant to prevent resale of stolen cars or similar.
Anonymous data becomes identifiable data...
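A hypothetical sketch of how that "anonymization" evaporates: two VIN-keyed tables and a one-line join (the VIN, names, and numbers here are made up).

```python
# Hypothetical illustration: "anonymous" telematics keyed by VIN becomes
# personal data the moment any party also holds a VIN-to-person mapping.

# What the automaker shares (no names, "just" vehicle-keyed behavior)
telematics = {
    "1HGCM82633A004352": {"hard_brakes_per_100mi": 4.2, "late_night_trips": 11},
}

# What the insurer already has from quotes and registrations
policyholders = {
    "1HGCM82633A004352": {"name": "J. Doe", "policy": "AUTO-0042"},
}

# The "anonymous" dataset re-identifies itself with a trivial join
for vin, habits in telematics.items():
    person = policyholders.get(vin)
    if person:
        print(person["name"], person["policy"], habits)
```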
1 reply →
Ah, anonymous was the wrong word. I meant instead that the shared data tries to restrict itself to information that doesn't obviously fall under a right to privacy. For example, trip times are shared, but locations are not.
The flow of traffic on the highways where I live is consistently 15-20 mph above the posted limit. I wish everyone would slow down, but that doesn't change the fact that the safest way to merge is to accelerate hard and match their speed. The last thing I need is a financial incentive to be oblivious to my surroundings.
The only speeding ticket I’ve gotten in the past 20 years was for speeding on the on-ramp to get up to the speed of the highway traffic. Holiday weekend, so it was stop, ticket, and release. Repeat. No warnings given.
When I worked in car insurance, besides our own telemetry, we got at least
- Willis Towers Watson (WTW) aggregated driving data
- Verisk (afaik this was mostly around vehicles, not people)
- Various reports directly from state governments
- LexisNexis (multiple different report types)
Really any mobile app that has accelerometer or gyroscope access (even without GPS) can estimate driving safety, and using phone movement and angle you can estimate driver vs passenger (a rough sketch of the idea is below).
Cambridge Mobile sells equipment a lot of insurers use, and afaik also sells the data.
The magic keyword to look for is "telematics"
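A minimal sketch of the accelerometer idea, with made-up thresholds and samples (not any vendor's actual algorithm): flag "harsh braking" spells from longitudinal readings alone.

```python
# Minimal sketch: count "harsh braking" events from accelerometer samples alone.
# The threshold and the trace below are illustrative, not any insurer's criteria.

HARSH_BRAKE_G = 0.40   # assumed longitudinal deceleration threshold, in g

def harsh_brake_events(longitudinal_g):
    """Count distinct spells where deceleration exceeds the threshold."""
    events, in_event = 0, False
    for a in longitudinal_g:
        if a <= -HARSH_BRAKE_G and not in_event:
            events, in_event = events + 1, True
        elif a > -HARSH_BRAKE_G:
            in_event = False
    return events

# One second of fake samples: gentle cruising, then a hard stop
trace = [0.02, 0.01, -0.05, -0.30, -0.55, -0.60, -0.45, -0.10, 0.0, 0.0]
print(harsh_brake_events(trace))   # -> 1
```

Real telematics SDKs layer gyroscope, GPS, and driver-vs-passenger classification on top of something like this.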
(From the Toyota Connected Services Disclaimer)
Your Responsibilities
This needs to be illegal.
Senator Wyden has been a champion of data privacy. If you are an Oregon resident, please reach out to his office.
https://www.wyden.senate.gov/
This is why I like paper contracts. I would simply cross out a stupid clause like this before signing.
99% of the time the rep doesn't care, and if the company can't be bothered to put someone on the other side of the table who is actually paying attention or has bargaining power then they deserve it.
I remember driving a Nissan Leaf. You had an opt-out prompt every time you drove the car.
For data or was it the standard “use responsibly” message?
1 reply →
The US doesn't just need laws about disclosure of these practices. It needs to mandate that this kind of corporate surveillance must be a clearly labeled opt-in and cannot be mandated by any contract.
Yes, but to me there's another important issue, with all the tracking tech in our vehicles, who actually owns them? We need to treat this similarly to the "right to repair" issue! I paid my money, so I own the product. If I own the vehicle, it should be my right to say what software runs in the background.
Of course the company can say, "If you don't like our product, don't buy it." But if I want to keep up with the latest safety upgrades to protect myself, and all car companies ship the same tracking software, my only option is to look for a "dumb" vehicle. This is blatantly unsafe and irresponsible. So they're saying that my safety comes with a price other than the $40K I shelled out?
> Yes, but to me there's another important issue, with all the tracking tech in our vehicles, who actually owns them?
It's even worse. Your car acts as a blackbox against you. A 1990's car? I can do whatever the fuck I want to, I can drive it offroad, I can speed, I can even be near a bank robbery or whatever.
A modern car? Someone robs a bank a few hundred meters from where I am, and now the police will come knock on my door because the IMEI of my car was near the bank when the robbery happened. I speed a little bit to overtake some dumbass driving 20 km/h below the limit, the police make a dragnet subpoena against my insurance / the data processor from the manufacturer, and issue me a ticket.
> Of course the company can say, "If you don't like our product, don't buy it."
Honestly I'm pretty tired of that "our way or the high way" nonsense. Society needs to make it so they actually can't say that. Make respecting us a precondition for their continued existence. As in they literally get liquidated if they say that even once.
That's how we deal with sociopaths leveraging these non-negotiable "terms" against us. They have zero empathy, they view us like cattle to be marked and monitored and turned into cash flow. So there is no reason to empathize with their nonsense viewpoints either. Just make whatever they're doing illegal. Doesn't matter how much money they lose.
> who actually owns them? We need to treat this similar to the "right to repair" issue
Ownership is a bad framework for this issue—it’s too ambiguous. You can “own” a vehicle all you want, that doesn’t give you the right to fuck with its odometer or catalytic converter.
24 replies →
[dead]
> In recent years, automakers [...] have started offering optional features in their connected-car apps that rate people’s driving.
At least the programs are (currently) opt-in.
This amusing anecdote is buried:
> One driver lamented having data collected during a “track day,” while testing out the Corvette’s limits on a professional racetrack.
> [...] he was denied auto insurance by seven companies [...]
There is another commenter further up that says they had to opt out on a Toyota and the rep acted like he didn't know until the opt out text was read verbatim.
https://news.ycombinator.com/item?id=39667268
I just purchased a Camry Hybrid from a Toyota dealership. The operator tried to tell me that "because I financed it they cannot turn off analytics." I had no financing; I paid cash.
Pressing the SOS button to cancel [as the sticker suggested] was met with so much difficulty that (while the operator was still on the line) I found the fuse panel and pulled out `DCS` to disconnect the call/tracking. This ended our transmission.
But are they really optional? I can’t imagine that the telematics link is going unused for the value it provides (i.e. crowd-sourcing for speed and road map data).
The worst part is the assumptions about who’s driving the vehicle.
I would be willing to bet even if you told the insurance companies it was totally legal on a professional race track -- they'd say "Nope, we still don't want to insure someone that takes his car on professional race tracks like that."
> At least the programs are (currently) opt-in.
The article makes clear that most people don't know what's happening with their data. They opt into something else and this data collection is included - that doesn't sound like much of an 'option'.
Your quote is misleading. The "he" is in the next paragraph and refers to someone else who owns a Cadillac, not a Corvette.
The track day thing probably was the funniest thing in the article, though.
My biggest concern is that rather than comparing difficult-to-identify behavior against claim rates, they will penalize behavior that is easy to identify. For example, yesterday I was traveling 15 MPH over the speed limit on a multi-lane highway where traffic is often traveling 10+ MPH over the limit (the limit is objectively wrong for a divided, grade separated, access controlled highway). I typically drive as far right as I can to make room for faster vehicles, but eventually got stuck behind someone camping in the left lane. When an opportunity presented itself I went around them in the center lane. They expressed anger at this by encroaching into my lane to squeeze me against traffic in the right lane. There were four inches between their vehicle and my side mirror. Who is driving dangerously, and more likely to cause an accident? I would argue it's the driver who is obstructing traffic and behaving aggressively toward others on the road. But if GPS isn't accurate enough to show their lane deviation, it's a lot easier to ding me for my speed.
Making it easier to determine who's at fault in cases like you mentioned would involve more sensors, radars, cameras etc. So we either 1984-ify everyones car or we just don't do any monitoring at all (since half-assing it can lead to false positives). I have a feeling insurance companies (and therefore governments) will slide more towards the 1984 side to save a couple dollars.
Insurance companies don’t have to, people are voluntarily installing dash cams to show who was at fault (or at least show they were not). Chances are, someone is recording your collision, and you might as well have your evidence to fight against someone else’s.
I'm a huge fan of telemetry insurance. I have it personally and it saves me around 300€/year on my car's insurance because I am a very defensive driver.
However, this being integrated into the vehicle in an absolutely opaque way is a big step further and a really unsettling privacy violation.
For this to be ethically viable imho, there need to be a few prerequisites
- it’s transparent what has been transmitted
- you can always easily opt out, but you may lose the discount you earned
- your driving can’t make your premium go up beyond the base premium without the discount (sensors will never paint an entirely accurate picture)
>> I'm a huge fan of telemetry insurance. I have it personally and it saves me around 300€/year on my car's insurance because I am a very defensive driver
My sister had it, and it was the biggest piece of crap imaginable. The system would send her emails warning her about "lack of smoothness" in her driving, because... the system would rate her down every time she went over a speed bump.
The biggest problem was that she would get emails saying "we've detected you were going 70mph in a 20mph zone, if this continues we will cancel your insurance", so we would call them and ask them to provide GPS logs, which they always would - and the logs would always show that she was going a legal 70mph on the motorway, which at one point passes above a smaller 20mph road - and of course the system was stupid enough to just query the speed limit for every point, not realizing that this wasn't the road she was actually on. We would email them back explaining, and the warnings would go away until she went on that road again.
Absolute waste of time and money, I think the insurance company would need to pay me to have this fitted, the nerves it cost my sister to have that piece of crap in her car weren't worth whatever discount she got for it.
From the other side, it’s essentially a fine for people who respect their privacy. Insurance prices will adjust to the adoption of this discount, rising to the current normal, and only people who don’t opt in will be hit with the extortion fee, forcing them to opt in.
That last point is merely a way for you to get used to this system. Once enough people allow the spying, they'll increase the price if you don't allow the spying.
And after that they'll mandate it for everybody.
I already pay a premium for having more horses under the hood. I don't want to get dinged when I use my car's power.
>you can always easily opt out
No, that should definitely be opt-in, with explicit consent to data collection and process purposes.
I'd be a fan, too, if they couldn't use the information to raise rates. But even the best drivers brake hard to avoid accidents from time to time, and in the US, insurers are dirty.
In which country is that?
I don't know about OP but in Poland https://yanosik.pl/ offered such deals ( https://payhowyudrive.pl/ ). It is probably a bit self defeating - the app's main function is warning about speed traps, that means unsafe drivers as significant part of its users.
Sorry for the late answer. But this is for my insurance in Germany, which is extremely expensive because I’m a young driver
They're describing what should be, not what is.
2 replies →
The easiest way to disable this in a Chevrolet with OnStar is to pull the fuse (Fuse 38 under the dash for the 2024 Chevrolet Malibu). Other options include disconnecting the antenna (it can still connect if the signal is strong) or pulling out the box/microphone (disassembly required). At least on the 2024 model, CarPlay features seem to keep working, but I haven't tested Bluetooth yet.
There's somebody on YouTube describing the parts of the OnStar feature: https://youtu.be/TZILodhvjdw?feature=shared
> (can still connect if strong signal)
Wonder if that would still work if you additionally shunted the antenna with some kind of impedance matched load.
All the Hams scramble to grab a spare dummy load.
Many GM models used to have a bridge/jumper between the network daughter board and the rest of the car. Pretty easy and didn't affect anything else (sometimes the fuse for OnStar also covered your Bluetooth or voice commands).
I read that the Bluetooth microphone stops working (on the Chevy Bolt)
The problem with such issues of data misuse is that people only provide 2 solutions.
a) Go off grid. Don't use the tech that these cars provide.
The problem with this is that it is impractical for people who see a lot of value in using this tech.
b) Pass more regulation.
I am a Hayekian, and I believe that regulation will not help with people who know the ins and outs of the regulation; it doesn't stop them. It just means corporations are willing to misbehave as long as they can play legal gymnastics and pay rudimentary fines.
Now, the third option, which I see as the best but which isn't talked about much, is the promotion and adoption of homomorphic computing or homomorphic encryption.
I am not a cryptographer, so I don't fully understand its limitations. But adopting this would simply make all these data abuse issues vanish.
Cryptographers, why hasn't homomorphic computing or homomorphic encryption been massively adopted?
> isn't talked much about is the promotion, and installation of homomorphic computing or homomorphic encryption
Sure, the car company will homomorphically encrypt your driving data when it sends it to its own servers.
You’re trying to solve a social problem with technology. That doesn’t work.
>Sure, the car company will homomorphically encrypt your driving data when it sends it to its own servers.
You can encrypt the data such that the insurance companies cannot target any particular individual (which is my problem here) but can still use the data to improve their insurance pricing models.
I have no problem with a health insurance company using population data to find out how many people are susceptible to, say, cancer.
But I have a problem when they use this data to overprice a particular individual's insurance because their genes say they are susceptible to cancer.
1 reply →
Of the solutions:
a) Impractical, because cars are needed for daily life and there's no incentive for automakers not to sell your data... so all cars will, unless this becomes a compelling enough product difference to move the needle on profits,
b) Legislation/regulation that creates the right incentives isn’t easy, but certainly doable.
c) Impractical, because homomorphic encryption is absurdly computationally expensive, is still not a fully solved problem, and... in what universe do automotive companies implement this far-fetched and expensive means of privacy without some... err... regulation?
It doesn’t seem to be superior to option b)
Which specific regulation do you think has a history of not being impactful? I find that the devil is in the details in this argument, because most regulation is massively impactful and helpful, and the talking point that we need to get rid of it is generally loudest from those who would profit the most from not following those rules anymore.
GDPR for example has done nothing to protect people from this particular case of data misuse.
The problem with English law is that you have to explicitly declare what is wrong ahead of time. So we just end up with an endless need for more regulations.
If we had legal systems like the Code of Hammurabi, they would work way better.
12 replies →
Since nobody answered the question, the reason is that it's terribly, absolutely, insanely slow. It's possible, it just requires hundreds of thousands or millions of times as much work as, say, a normal lookup in a database.
> I am a Hayekian, and I believe that regulation will not help with people who know the ins and outs of the regulation; it doesn't stop them.
That is such a funny thing to say. The car industry is heavily regulated, and car companies do work within the regulations. They are already regulated on safety, fuel standards, dimensions... Adding data protection into the mix makes sense.
The auto industry has fought tooth and nail against safety requirements[1] and still fights today against more stringent fuel standards[2][3].
Not only would they fight regulations like data safety that would open them to potential litigation when they lose the data or sell it to the wrong player, but they would win. Privacy isn't the political football that the environment is, and you can't point to death statistics like you can with safety issues.
[1] https://www.the-rheumatologist.org/article/revisionist-histo... [2] https://texasclimatenews.org/2022/03/19/decades-of-lobbying-... [3] https://www.cbtnews.com/auto-lobby-group-warns-fuel-efficien...
3 replies →
If I am a corporation and I am willing to break regulations, how will you force me to use homomorphic encryption? Why should I pass on gathering data that I can resell?
The average buyer won't understand or care about it, so there is no direct pressure from consumers. I think regulation is not optional (and homomorphic encryption may be mandated if viable?). Breaching regulations is often a "cost of doing business", but some recent regulations (such as GDPR) can actually create very large fines in many countries. So it seems that what may be needed is good enforcement and measured penalties. Another deterrent would be having penalties that are not monetary.
> Breaching regulations is often a "cost of doing business", but some recent regulations (such as GDPR) can actually create very large fines in many countries.
This is the issue with so many laws. Stricter fines basically never deter would-be offenders from committing the crime. What deters people is a high chance of getting caught.
Do companies ignore regulations? Sure, some do. But saying 'they will just pay the fines' ignores the fact that we could make the fines existential, or punish board members by kicking them out of the industry. The answer to 'the regulation we haven't even tried won't work if we do it improperly' is 'let's do it, and do it properly'. I have no idea what homomorphic encryption is, but rarely do 'let's add more tech to magic bullet a human problem of incentives' solutions work.
Homomorphic encryption simply means that the data is encrypted in a way that the party working with it cannot use it arbitrarily.
Here is an example: I could use Google Maps for navigation, but Google or any other third party would have no idea where I am going.
I used it in the first company I worked for and it works beautifully.
A) and B) work, but they are not as effective as homomorphic encryption.
4 replies →
I think a problem in this area is that if one avenue of data collection is denied, another one will be implemented and it becomes a game of whack-a-mole.
For example the USG is forbidden from collecting communications from US citizens, but that does not keep it from buying this information from private domestic sources or from other governments.
2 replies →
Strangely enough, I know the answer to that, if memory is serving.
Homomorphic encryption is where you can compute on the encrypted data without ever decrypting it.
Logically, it sounds like a pipe dream to me, but apparently it's a thing.
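If it helps make that concrete, here is a toy sketch of "computing on encrypted data": unpadded textbook RSA with tiny, insecure numbers is multiplicatively homomorphic. This is only an illustration of the idea, not a real scheme you would deploy (those would be things like Paillier or CKKS).

```python
# Toy sketch only (NOT secure, tiny numbers): unpadded "textbook" RSA is
# multiplicatively homomorphic, so multiplying ciphertexts multiplies plaintexts.
p, q = 61, 53
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent (2753), needs Python 3.8+

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

a, b = 6, 7
c = (enc(a) * enc(b)) % n      # a server could compute this on ciphertexts only
assert dec(c) == (a * b) % n   # decrypts to 42; the server never saw 6 or 7
print(dec(c))
```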
3 replies →
> I am a Hayekian, and I believe that regulation will not help with people who know the ins and outs of the regulation; it doesn't stop them.
I work in the automotive industry. It is very heavily regulated. The majority of people have never heard of ISO 26262 but it's keeping billions of people safe every day. Data privacy can work in the same way.
> The problem with this is that it is impractical for people who see a lot of value in using this tech.
I would be happy to turn down the tech, but I wonder how long until I can't feasibly buy a car (or a car I want) without it...
> I am a Hayekian, and I believe that regulation will not help with people who know the ins and outs of the regulation; it doesn't stop them. It just means corporations are willing to misbehave as long as they can play legal gymnastics and pay rudimentary fines.
So you try nothing and are out of ideas. Amazing.
> homomorphic encryption
Let me get this straight, you think regulation is too hard because corporations don't want it, but you don't see any problem with homomorphic encryption, which is difficult to implement, poorly understood by consumers, AND provides privacy guarantees that corporations don't want?
Really?
It's pretty clear we've reached the point where technology has shifted to working against us, and not for us anymore.
I work in tech but as far as I am concerned, you can keep all your smart homes, cars and other gadgets and soul sucking (anti) "social" apps.
Somewhere along the way technology was hijacked to control us rather than empower us. And if you don't like it: shut up because "progress" is inevitable
> we've reached the point where technology has shifted to working against us
Everyone has always said this, since the dawn of farming. It’s not a particularly useful insight: the question is how it is to be banned or balanced.
So you think today's technology is comparable to farming?
1 reply →
Technology is amoral. The power shift happened because we are no longer in control. It's these corporations who are the masters of the computers now. They're just allowing us to use their computers. Of course those computers work against us, they are treacherous by definition.
All newly manufactured cars sold in the US already have "black box" data recorders that can be dumped in the event of an accident. In many cases this can even be done without a warrant, as of a decade ago [1] - not sure whether that's changed. In any event, it seems as though this is a natural evolution in concert with those voluntary OBD-II devices that insurers started using to record driving habits.
[1] https://www.edmunds.com/car-technology/car-black-box-recorde...
In an ideal world, such data-harvesting might lead to cheaper prices / a more efficient insurance market - which would make the privacy loss worth considering from a trade-off standpoint, at least in theory.
Unfortunately it's instead likely to just lead to higher margins for insurance companies. And the only way to compete would be to harvest more data for better predictions.
> In an ideal world, such data-harvesting might lead to cheaper prices / a more efficient insurance market - which would make the privacy loss worth considering from a trade-off standpoint, at least in theory.
In an ideal world (read: perfect information), this would lead to insurance being a bad deal for every consumer of it. In the theoretical position where insurance companies can accurately price each individual customer based on their habits, they will charge them exactly what they cost _plus_ a margin.
This is only useful for a consumer if they cannot access cash or a credit line to pay for a sudden large expense. Instead, insurance effectively becomes paying the credit line ahead of time.
> This is only useful for a consumer if they cannot access cash or a credit line to pay for a sudden large expense.
Isn't that the main point of insurance?
Insurance can also socially redistribute bad things. Which fair enough it is in practice a result of insurance but I don't think that's what it was invented for. And indeed the better the insurer's crystal ball the smaller this effect is.
Although in practice I don't think there ever will be a crystal ball good enough to make insurance a bad deal for everyone like that. You always have to insure against another driver being bad or just plain bad luck.
2 replies →
No, the point of buying insurance is to reduce your individual variance even though your average cost goes up. It's not an individual savings plan, but rather shared pooling of risk.
2 replies →
Insurance companies don't have to make money from underwriting or insurance.
In an ideal world, such data harvesting would be illegal, with liability adhering to the executives pushing for and approving the initiative as well as any legal counsel involved. Acquiring the data should require explicit, truly informed, and revocable consent not buried in a bunch of BS and not required for the purchase of a vehicle or insurance.
I wholeheartedly agree that the dark patterns around consent are atrocious. But I also think hn is probably biased in its valuation of an individual's data.
If companies offered say a $50/month discount on car insurance premiums in exchange for gathering data, I imagine a large proportion of people would indeed opt in to that (setting aside issues of selection bias or trust in this ideal world)
1 reply →
Basically they keep the profits and socialize the risks.
> Unfortunately it's instead likely to just lead to higher margins for insurance companies.
Why? Insurance pricing is heavily regulated, and profit margins for insurers have always been very low.
After seeing this article, I did a bit of searching, and you can get your LexisNexis report and also opt out of data sharing along with deleting associated data.
I did it and recommend everyone else does as well.
https://consumer.risk.lexisnexis.com/consumer
That link only allows you to request the LexisNexis Risk report on you.
Do you have a link where one can opt-out of data sharing or deleting associated data? I cannot find anything on the LexisNexis site allowing for that.
It turns out only certain states allow for deleting, like California
For anyone looking
https://www.lexisnexis.com/global/privacy/en/privacy-center-...
This looks like the Canadian page, made a request and let’s see where this leads to.
Hello there - This form is from a different business of LexisNexis. Not pertaining to their Risk business. This form will not get what you want.
Is there any hope for something like a Privacy Bill of Rights to ever be passed? I feel like privacy is an inalienable right for all humans and the passage of something like this would be a light speed jump ahead for personal freedom in the new era we find ourselves in. Just because tech enables it doesn’t make this any creepier than someone following behind you in the woods stalking you on your horse 200 years ago.
[flagged]
Many things people do are extremely dangerous and detrimental to society. Not sure that's a great rationale for stripping someone of their privacy.
> You even sign away your rights to the privacy of your own blood when you get a license to drive.
I'm not sure what this is referring to. Is any random government agent allowed to take a DNA sample if you're behind the wheel of a car?
2 replies →
Like it or not most of the US is oriented around driving and it's basically unavoidable for most adults. Using that as justification to erode everyone's rights feels deeply wrong to me.
If you actually believe this, then please reply here with the start and end GPS coordinates of your last driven commute to/from where you live.
1 reply →
Nobody? There are countries other than the USA. I've never heard of signing away rights in respect of blood as a condition of getting a licence. Is this a real thing in the USA?
5 replies →
I have been convinced for several years now that insurance companies are likely buying up personal data from many different sources. They seem to be ideal consumers of it, because it leads to better outcomes for them when they can increase rates on those they identify as risky.
This isn't a secret. Go read one of the world's largest data broker's annual report to investors, ctrl-f for "insurance": https://www.experianplc.com/content/dam/marketing/global/plc...
Absolutely. Annual financial reports by public companies are a gold mine for this stuff, as they are literally required to talk about it.
You can also get a sense of the scale of the problem by the reported revenue and growth rates (which they're always eager to highlight).
I knew a guy who worked in Finance. Whenever he would buy alcohol, or cannabis (legal where I lived) he would only pay cash. His concern was that, if his credit card usage data were sold, it could increase his premiums.
That's why I buy my liquor at the gas station, on the same tx as the gas.
4 replies →
The whole point of an insurance business is to insure against unknown and unlikely risks.
If it is insuring known or likely risks, then it becomes a subsidy or wealth transfer (which should be the domain of governments).
> The whole point of an insurance business is to insure against unknown and unlikely risks.
Unknown to whom? To you, the insured? Or to them? Business thrives on customers with incomplete information.
It’s still unknown if someone engaging in risk will end up in costly collisions, or other events. Just because you engage in risk doesn’t mean it will bite you, only that it is more likely to bite you.
Besides why should less risky drivers subsidize riskier drivers?
3 replies →
This has been true for several years. An insurance agent once told me that there are life insurance companies dropping the requirement for blood draws / medical exams and are just buying prescription records to correlate with financial, educational, and other behavioral data.
Edit: changed prescription “data” to “records”
Wouldn’t this violate HIPAA?
6 replies →
Not only that, don't insurers offer 'discounts' for installing tracking apps on your phones and devices?
As a safe driver, I like the idea of dangerous drivers paying more. There's no good reason the participants should not be aware they are under surveillance though.
Sidenote: I wonder if they've considered close follow distance or frequent lane changes as a risk factor.
Spot enforcement, with appropriate training and vehicle improvements, is more than sufficient, for numerous reasons:
1) Regression to the mean will happen with 100% enforcement/over-enforcement. The new standard for 'safe' will collapse to an unobtainable level, which will not benefit society in the long run.
2) Safety is not my #1 concern. The number one cause of death on roads is being born. I value getting to my destination without being tracked more than the potential safety gains of strict monitoring. I believe in rational safety measures, but "It's safer so we must do it" is an argument I no longer accept. I want to live a good life, not just a safe one.
3) We have seen time and time again that personal information collected by companies rarely benefits consumers and instead is always used to benefit companies. This is no different. I have negative trust in industry handling my data for my benefit.
The idea of dangerous drivers paying more for insurance is fine. It's probably better than the idea of drivers with bad credit paying more for insurance.
The problem is in how dangerous driving is assessed. Simple-to-apply rules lack an understanding of conditions. Telematics are going to be low-bandwidth data, almost certainly without enough detail to form an understanding of conditions.
> It's probably better than the idea of drivers with bad credit paying more for insurance.
There must be some correlation between bad credit and likelihood to be in a collision.
2 replies →
The thing that’s somewhat ironic here is that the car companies could make cars safe by default. For example, they could make it not possible to accelerate faster than one needs to. They could put in speed limiters that are triggered by the speed limit on the road. They could stop marketing and selling over powered cars.
Instead they market cars as exciting race track like vehicles, things that let you do what you want, when you want. And now they will collect data on the people who actually do that.
Personally I would prefer a car that helps me be a safer driver by following the law. Ensuring there are no pedestrians or cyclists in front of me, etc. But at the end of the day, automated enforcement is a good thing, so maybe this will help some people become safer drivers, though the reality that’s probably more likely is that fewer and fewer people will be able to afford/get insurance, and because our country is so car dependent, they will just drive without.
> For example, they could make it not possible to accelerate faster than one needs to.
I was in a rental car that had this once. Was on the highway, needed to get around another driver who was being unsafe. Was unable to do so because of the limiter. It was easily the most unsafe vehicle I've ever driven as a result. These mechanisms lack situational awareness and nuance, and thus are a direct threat to my personal safety. They very much need to be banned as a matter of course until such a time as humans aren't allowed to drive at all.
The problem though is that inevitably they will eventually automatically label anyone who does not "consent" to total surveillance as risky or dangerous.
It's telling how you phrase this.
If it's so telling then tell us, enough with the dark innuendos!
2 replies →
The easiest way to disable this is by physically removing the cell modem from your vehicle, which is very straightforward. Without egress, the only way for data harvesting to occur is by physical access, typically at a dealership. However, virtually all automotive cell modems are either packaged on the same chip as the GNSS receiver, or colocated on the same daughter board. As such, choosing to retain control over your data typically comes at the cost of foregoing the built in navigation system and other features such as emergency calling.
Unless most people do it, their answer will be to put you in the high-risk bucket by default.
There are insurance companies that allow you to voluntarily submit to tracking in exchange for reduced premiums. What is happening here is that those savings are being passed on to auto makers as an extra revenue stream.
There needs to be a Pi-hole for cars.
GrapheneOS for Automotive
All these companies have been hoovering up our data; where did you think this was going to end?
Add in the fact that if you are not getting "growth" on the stock market, then you must be doing something wrong.
The Stasi didn't add anything to the GDP of the GDR, either. That wasn't the point then, and it isn't now.
My point is that companies start to collect data; either for debug purposes or to try and better understand the customer.
Then there is a lot of pressure to monetize the data. So from a consumer POV, it is better to not have anything collected.
1 reply →
Gift Link: https://www.nytimes.com/2024/03/11/technology/carmakers-driv...
US lawmakers could put a stop to this and every other privacy scandal over the years at any time, you know, by passing a strong privacy law, but nahhhhhh, we can't do that!
It's yet another reason why people should buy older cars (preferably 2012 or older), since the automotive, insurance, and data broker industries don't give a jack about your privacy, and sadly the US isn't going to do jack about this either until we can elect more people to office who do care and pass a strong privacy law in the process.
More concerning is how we are not able to view and challenge this data. It's a one-way street.
The worst thing about this is that all of their conclusions about what data constitutes "bad driving" or "risky driving" are dead wrong.
The signs they consider to be "bad driving" are high-g braking and turning.
Yet these are EXACTLY the same signs created by a highly skilled driver or racer operating at the limit, as they would to avoid an accident (thus costing the insurer $0), where the same situation would catch 90% of the low-g drivers in a wreck that totals the vehicle and causes injuries. A core element of high-performance driving for accident avoidance and racing is to understand the limits of tyre traction, and how to operate the car up to those limits — but not over them — i.e., just under the limit of sliding (sliding friction is always less than static or rolling friction), and to choose lines that maximize available traction.
Distinguishing the signs to tell a high-skilled driver from a bad driver requires more than just "is that number high?". You must look at the circumstances, the frequency, the conditions, the rate of increase and decrease of pressure, the slip angle, the grip state of all 4 tires, and more. But of course, no one bothers to do this.
It is the same kind of institutional stupidity that causes a world-class weightlifter with 4% body fat to be classed as "obese" because s/he scores high on the stupidly simplistic BMI scale (a ratio of weight to height squared).
Except with BMI insurance companies are not allowed to re-rate people and doctors can instantly adjust treatment when they see the person is obviously not obese but highly trained.
With auto insurance, they can secretly re-rate us on bogus numbers that actually down-rate the highly skilled.
Seems more attractive with every passing year to rebuild older nice cars than get into the new rolling spyware contraptions.
Well, if one is stupid enough to get a race car with telemetry then the spying is deserved. The skill level is irrelevant insurance-wise, as it doesn't last, varies within the day, and is of no use on open, shared streets.
Now the dream car will soon be an electrified lada niva, no electronics, speeding impossible.
Who said anything about racecar telemetry?
You do realize that wheel speed sensors and g-force sensors are already standard equipment in most cars, and that this is part of the data they are selling, right?
Electrified Lada Niva, eh? Depending on how it's electrified, it might go waaayy faster than would be sane... ;-)
I'm skeptical of this argument. I don't want to be on the same road with people who self identity as expert drivers going at the limit.
I completely agree.
My example is NOT about "self identified" "experts", but REAL experts who ACTUALLY have the skills. They also are typically very safe on the roads and know that race-like on-the-limit driving on the streets is idiocy.
The point is that people who ACTUALLY have these skills have a far wider margin of safety than the ordinary driver, and far better capability to avoid accidents. But, they will also — with that far wider margin of safety — often turn or brake with higher than ordinary G-forces.
For example, ordinary street tires and suspensions on modern cars can handle 0.9G of lateral or braking acceleration, while ordinary people get uncomfortable at around 0.2G of lateral acceleration (the sketch below converts these figures into cornering speeds).
An unskilled driver approaching 0.25G lateral acceleration does risk exceeding adhesion limits and losing control because they are insensitive to inputs and feedback. In contrast, a skilled driver can turn at 0.25G all day with virtually no risk, as they are accustomed to driving at 3-4 times those Gs, and are situationally aware, sensitive to inputs and feedback, and choose lines and inputs that avoid the limit.
They are far less of a risk than an unskilled driver at 0.1G. Yet, the skilled driver will get flagged as "bad".
With deeper understanding and analysis, they could make the distinction between actual expert drivers vs overconfident idiots. But I see no indication that this will happen.
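A rough, back-of-the-envelope conversion of those G figures into cornering speeds; the 12 m turn radius is an assumed value for a tight city corner, not from any real scoring system.

```python
# Rough conversion from lateral g to cornering speed, to put the figures above
# in context. The 12 m turn radius is an assumed figure for a tight city corner.
import math

G = 9.81            # m/s^2
RADIUS_M = 12.0     # assumed radius of a tight 90-degree urban turn

def corner_speed_mph(lateral_g, radius_m=RADIUS_M):
    """Speed that produces the given lateral acceleration: a = v^2 / r."""
    v_ms = math.sqrt(lateral_g * G * radius_m)
    return v_ms * 2.237          # m/s -> mph

print(round(corner_speed_mph(0.25), 1))  # ~12 mph: enough to get flagged by naive scoring
print(round(corner_speed_mph(0.9), 1))   # ~23 mph: near the grip limit of street tires
```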
Can this be opted out of at the dealer? Black box collection OR wireless connectivity?
If not, are there guides on disabling the modem without damaging diagnostics or infotainment?
I want a car that does not transmit data. Which means I may need to get my 2010s crossover rebuilt and reupholstered instead of getting a new car.
I think it varies manufacturer to manufacturer, but even those that make it possible seemingly make you jump through hoops. I was researching potential replacements for my 16 year old car and found a lot of discussion about this re Mazda models:
https://www.cx30talk.com/threads/thoughts-on-tcu-disable.374...
> Mazda CEC makes it quite difficult to actually request/disable your TCU. It can take many phone calls and escalations to get someone to understand the request and actually "push the button" to send the disable event to your car.
Honestly I’m just totally disinterested in just about every current new car model.
Yes, many models have guides out there for disabling wireless connections. On a previous vehicle of mine, it was as simple as disconnecting the bridge/jumper between the main board and the wireless board.
Being mad about this is like being mad at the thief who stole your belongings for then pawning them. The crime was spying on you in the first place. Automakers should not have any data to share or sell or give to law enforcement with a subpoena.
Subaru opt-out
https://subarucustomersupport.powerappsportals.com/Consumer-...
> An employee familiar with G.M.’s Smart Driver said the company’s annual revenue from the program is in the low millions of dollars.
Is that a lot of money for GM? I would have guessed no, but it doesn't seem like very much for selling out their customers like this. Either it's more to GM's profits than I'd expect, or they really don't expect much PR blowback risk at all?
I don't know if they are right or wrong, but...
> Drivers who have realized what is happening are not happy. The Palm Beach Cadillac owner said he would never buy another car from G.M. He is planning to sell his Cadillac.
Is there a good source for which makes, models, and model years “phone home”? I would absolutely take it into account when shopping for a new or used car, but I’ve had no luck with Googling.
I’d imagine any car since 2019 can likely share such data.
[dupe]
Some more discussion: https://news.ycombinator.com/item?id=39666976
I called to turn off the data in a Toyota, and the guy wanted my name, phone number, email address, physical address and even more I can't remember right now. I was like "why do you need this info?" He said, "We need a record of who made this request for our records." I told him "do you understand that I am calling your company specifically because I don't want you to have records?" This went round and round about three times before I just gave him fake info.
Were you able to obtain any records of your own where they agree to cease collection that you can hold against them if they continue? Do you have any means of verifying that the collection has ceased? I don't believe that their word means much without these.
I just wrapped my truck with several layers of copper mesh, so it should be fine.
In all seriousness though, no. I have no way of confirming the data transmission has stopped.
Surely these cars have an "offline-mode". Anyone know how to force it? (I almost said "airplane-mode".)
> almost said "airplane-mode"
My old Jetta’s door once fell off.
Disconnect the antenna or the whole modem itself.
Ladies and gentlemen, if we want a fair society we MUST:
- mandate FLOSS by law, starting from the first SLoC, meaning no company can suddenly publish software to sell something with it; the software must be published from day zero of its development, or the hw/sw/service can't be on sale;
- mandate local-first for anything, so connected cars are OK, but they must offer a simple DynDNS mechanism the owner can add to their own domain name as a subdomain, like car.mydomain.tld, and expose a relevant set of APIs on the car. All data collected by the OEM must pass through the car owner's systems, in an open, readable, and documented form.
If this is not mandated, by popular acclaim, surveillance capitalism will stay, since it is the new tool for knowing and conforming the masses. Surveilled people are known, and knowing they are surveilled they try to behave in a "social norm" way, fearing judgment or a social score; as a result, people evolve toward slaves who obey those who establish and update the current social norms. We all know cooperation is needed to do anything; those who compete need many who cooperate, obeying their orders, to craft anything. In the past it was religion, then money; now social scoring is the way to keep the masses in line. Such a powerful tool is not something anyone will accept losing without a desperate and limitless fight. Only a large public reaction can force a change.
https://archive.is/lmMp9
You can get your data from LexisNexis, or opt out and delete data if you're in a state that mandates the option (such as CA), here:
https://consumer.risk.lexisnexis.com/consumer
I hate how many companies don't give a flying shit about privacy and completely ignore user selections under the guise of incompetence.
I would like to see a judge with sympathy for this sentiment fine them out of existence.
There's a lot of jerk drivers who go way too fast and drive very dangerously. They should have to pay significantly more for it. For people that drive correctly, they should be charged less as well. I don't see why this is an issue.
Fun new line of business idea: manufacturers could claim the Texas abortion bounties by reporting any motorist who travels to an out-of-state clinic.
The problem with allowing this kind of data usage is you will also have other moral authoritarians that wish to use the data as well.
There are plenty of data brokers who will sell your personal location data, independent of your vehicle, obtained from the apps on your phone.
The cars are not capable of measuring how dangerous the driving is.
Sure they are. Speed is easy to detect, for instance. Someone driving 50mph in a 25mph school zone should see massive increases to their insurance, as they present a huge risk.
8 replies →
> An employee familiar with G.M.’s Smart Driver said the company’s annual revenue from the program is in the low millions of dollars.
I would have guessed that this is making more money. Does anyone know why this is the case?
Topic is previously discussed (163 comments, 2 days ago) off NYT article at: https://news.ycombinator.com/item?id=39666976
This happens with healthcare data too. Every prescription you fill is tracked and used as input data for many insurance models that make health insurance pricing decisions.
They share your data in order to help lower your insurance rates.
Imagine what your premium might be without this service.
For example, I drive less than 900 miles a year, have had no accidents, citations or thefts and keep my 10 year old car in a garage. Yet my payments are $1500 per year. And after getting estimates from several companies, this was the lowest we could find.
Even with this service, the inflation rate for auto insurance is higher than anything else in our family budget.
Thank the lord for data sharing.
Some insurance companies make you use an app if you want lower payments so this battle is mostly lost.
My insurance went up 10 percent out of the blue. Wonder if Tesla shares?
10% is below average, you should feel lucky. Car insurance premiums have been rising dramatically, especially for EVs.
Could've just been "inflation" (read: Opportunity to jack up prices) too. Although if one car company is going to be on the bleeding edge of data collection & sharing it'd probably be Tesla. They're the most Silicon Valley of all.
How do you know labor prices to fix cars and more complicated/costly car parts did not cause the increase?
Can someone educate me on why insurers should not know one's driving habits? I'd imagine that the risk calculated from one's driving habits will be more accurate than one derived only from past accidents, car color, user profile, etc.
For me it’s because of this dirty concept called "privacy" and it’s the reason why insurers don’t have access to the list of items that I buy at the grocery store (also health records, name of sex partners, what I do all day long, whether I walk enough every day, etc.)
Does anyone have a list of companies that do not do this?
> An employee familiar with G.M.’s Smart Driver said the company’s annual revenue from the program is in the low millions of dollars.
It doesn't seem in the car company's interests to take on the reputational risk for this kind of financial reward.
What sort of "reputational risk" do you think they are taking on?
Data sharing with third parties is ubiquitous in almost all industries. Every single company that deals with financial products reports account information to third parties (Experian, Equifax, TransUnion, Early Warning Services, ChexSystems). If you return an item at a retail store it gets reported to fraud alert databases. Most medium to large employers report the contents of the paychecks of their employees to The Work Number. Insurance claims are reported to LexisNexis. Oil change companies report mileage to CarFax, which insurance companies use to look up if you're reporting accurate mileage.
Data reporting and sharing is ubiquitous; it's standard operating procedure. Having a few "privacy nerds" complain about it on the Internet is not risking their reputation.
> What sort of "reputational risk" do you think they are taking on?
> a few "privacy nerds" complain about it on the Internet is not risking their reputation.
The news about GM's OnStar tattling (their words) on drivers is front page on several big news sites like CNN. This is not just some privacy nerds, this is a whole bunch of mainstream media outlets calling out GM by name.
I'm confident the PR team at GM is working overtime right now to try and find a mitigating spin.
> It doesn't seem in the car company's interests to take on the reputational risk for this kind of financial reward.
Tell that to Boeing; they're on course to tank the entire company because of the financial shenanigans they pulled after 1997.
As soon as a company goes publicly traded, the incentives change - there is no more priority on the long term, the only thing that matters is INVESTORS INVESTORS INVESTORS (read that one in your finest Steve Ballmer voice).
Short-term-profit, long-term-loss moves happen a lot.
Also, companies seem to work against their own interests quite often. The spyware is probably on some separate budget with separate bonuses attached. So "locally" in the department it might make financial sense to spy on the users.
Hear hear! And why not? Fuck the consumer I say! The one thing we can all agree on is that human dignity must be paid for in cash. If normal people wanted to be treated with respect then they would be high earners like us.
Good luck collecting that from a 1992 Mitsubishi L200
Got any apps?
Well, yeah, other than that :-)
It gives off creepy vibes, but if you stop to think about it, this can stop good drivers from subsidizing bad drivers. Not that the insurance companies are going to lower the premium for good drivers; if you have a problem with that, talk to capitalism. But bad drivers getting higher premiums is good for everyone.
> But bad drivers getting higher premiums is good for everyone.
Not necessarily. In many parts of the United States, a car is the only viable mode of transport. If you price the bad drivers out of the insurance market, they will forgo insurance altogether. Then, if they cause a loss, they will be uninsured and the other driver's insurance will have to pay for the loss (or spend resources in costly suits) anyhow. So then good drivers' premiums will need to go up to compensate for the extra "bad drivers can't afford insurance" risk that good drivers carry. We end up in a similar situation in a roundabout manner, but with the added element that now all our data is stored on everyone's servers.
I mean, Tesla has their own insurance product that they claim is better and cheaper than alternatives because of the data they track. People cheered for this.
Personally, I"m not opposed to dangerous drivers paying higher rates, but the devil is in the details.
I mean, I knew…but… I didn’t know...
oh f**, we invented this shit in software, now it's coming back to bite us
Anyone know where I can find /etc/hosts on my Ford? /s
Behind the firewall.
I hope so. Get the animals off the road.
Imagine a scenario where a bunch of those animals brake check you, and then your insurance company calls you up and says "Hey jgalt212, we're seeing that your forward collision avoidance system got activated too many times this year. We're going to flag you as a tailgater and up your premium by 80%. Have a lovely day."
Playing devil's advocate, the answer is that brake checking only works if you are tailgating. If you increase following distance such that you cannot be brake checked, the insurance company has succeeded in making your driving habits safer.
As opposed to the jackass who's pulling high lateral G's and going 90 on the freeway. I think I can explain my way out of the insurance hike more easily than the aforementioned jackass--who actually should be deemed uninsurable.