Our privacy matters — it’s time to start acting like it
No dystopian novel has predicted our current situation better than M. T. Anderson’s 2002 book, Feed.
In Anderson’s not-so-distant future, most Americans live with a device called the Feed implanted in their brains. The Feed, which is effectively a smartphone glued to your skull, allows for telepathic communication, access to a vast database of information, and the ability to seamlessly buy products from brands that plug into it.
The most striking part of Feed’s world of manufactured clouds and hourly changes in fashion trends is the pervasiveness of ads throughout the story. The narrator’s thoughts are constantly interrupted by ads urging him to buy the latest and greatest products. As the story progresses, these ads become more aggressive, more persistent, and more personalized as the Feed learns what has and hasn’t worked on the narrator. This is a reality the narrator and his friends have come to accept and often embrace.
We don’t have computer chips implanted in our brains at birth, but in the years since Feed’s publication the relationship between consumers and brands has moved steadily closer to the one Anderson laid out in his dystopia. Our data is becoming increasingly valuable, and companies are doing everything they can to collect it and capitalize on it.
Uber is the latest in a long list of companies that have turned to questionable methods of data collection for their own gain. Mike Isaac, writing for The New York Times, recently reported that Uber used an exploit in iOS to tag individual iPhones and identify them even after users deleted Uber from their phones or wiped them entirely, a direct violation of Apple’s privacy guidelines.
Isaac’s reporting also revealed that Unroll.Me, a service that scans users’ email accounts to make unsubscribing from unwanted newsletters and promos easy, was selling anonymized information from users’ Lyft receipts to Uber. Uber then used this data to monitor Lyft’s success and develop strategies to stay ahead of its competitor.
The problem with terms of service
After The Times’ story broke, Unroll.Me CEO Jojo Hedaya said in a statement, “it was heartbreaking to see that some of our users were upset to learn about how we monetize our free service.” There’s some truth to what Hedaya said. Free services like Unroll.Me, or more prominent ones like Gmail and Facebook, have to make money somehow. If it’s not through a one-time fee or a subscription, ads are the most viable option.
While the old adage “if you’re not the customer, you’re the product” should be assumed at this point, Hedaya’s remark is hilariously tone-deaf. According to him, the issue isn’t how his company treats its users’ data; it’s that users didn’t like it once they found out.
One could argue that it’s the responsibility of users to look into a company’s privacy policy and terms of service before agreeing to them. In theory, this would prevent shocking revelations of dodgy practices from companies we’re supposed to trust. Any time we’re informed that a product will read over our messages to provide a better ad experience, or share what it collects with undisclosed third parties, and we don’t like it, we can opt out of that service and go with a competitor whose privacy policy better matches what’s acceptable to us.
The problem with this suggestion is that it requires companies to be entirely transparent about the data they collect, what’s done with it, whether it’s anonymized, and how it’s anonymized.
Looking at Unroll.Me’s Privacy Policy, it’s no surprise users had no idea companies like Uber could buy the information from their receipts. Aside from the assertion that linking to any page on Unroll.Me’s website other than the homepage violates its terms of service, it’s pretty similar to most terms-of-service pages you’ll find. That makes it a great example of just how confusing these documents are, and it explains why Hedaya’s professed surprise is entirely unjustified.
The policy’s section on personal information seems like a good enough start. It’s written in clear language and immediately lays out what data the company will collect. If you willingly provide information, like your contact details or demographic information in a survey, Unroll.Me has permission to hold onto it and use it in the ways the policy lists.
The policy takes a turn when the focus shifts to “non-personal information.” Where the personal information section is concise and reasonable, the section on non-personal information is written in vague, complex language that is far harder to follow. If Hedaya wants to avoid another round of user outrage, rewriting this section would be the first step.
I doubt most Americans are familiar with the CAN-SPAM Act (15 U.S.C. 7702 et seq.), and it shouldn’t take a law degree or extensive research to understand the implications of a privacy policy for a basic service.
Unroll.Me does assure users that any personal information is removed before collected data is distributed. That’s reassuring, but it leaves me with some questions: Are there any types of disclosures excluded from this clause? What happens to this data if we leave the service entirely? Who at the company has access to this information?
The answers to these questions are probably simple and reasonable enough to put users at ease. The trouble comes from the ambiguity of the term “trusted business partners.” It’s never explained who identifies these partners or what qualifications they have to meet to be trusted. Users can’t see who these partners are or what they do with the data once they have it, and they can’t opt out of supporting companies they have no intention of supporting. Regardless of your stance on user privacy, this should be addressed if we expect consumers to stay informed about the products they choose to use and support.
The shift in language between Unroll.Me’s definitions of personal and non-personal information isn’t coincidental. If you’re worried that your practices may scare away customers, it’s in your best interest to mask those practices as much as possible. Given users’ reaction to The Times’ revelation, it’s clear why the company presents the information so ambiguously. If it directly stated that it stores every email you send and receive while using the service (an unverified claim made by a Hacker News user), it would be out of business.
As John Gruber put it: “they’re selling your personal information to companies like Uber. Supposedly that information is anonymized, but wiped iPhones are supposed to be anonymized too, and Uber found at least one route around that.”
Finding someone to blame
Entering into an agreement with a company implies that you’re okay with its practices and that you trust it with whatever data it asks for. Privacy policies like Unroll.Me’s imply that you’re extending that trust to any company that falls under Unroll.Me’s umbrella of “trusted business partners,” without knowing who that trust is being given to. When that means handing your private data to companies like Uber, which is known for its troubling practices, we should expect more than a vague term that ultimately means nothing without additional context.
Some users might find nothing wrong with Unroll.Me trusting a company like Uber, which has made it a mission to circumvent policies that protect users’ privacy in pursuit of its own goals. Even so, Unroll.Me’s customers should be made aware of that relationship before deciding to trust the company with what’s often one of our most personal forms of communication.
This isn’t unique to Unroll.Me. Evernote faced backlash last year after releasing a grotesque privacy policy that would have allowed engineers and other Evernote employees to read their users’ notes. After some damage control the policy became opt-in, but the fact that Evernote wanted to be able to read all of the notes its users create says plenty about its stance on user privacy.
As long as these companies continue to write hazy privacy policies and stay vague about what’s done with their customers’ data, it’s impossible for users to make informed, responsible choices about the companies they hand their data to.
We can’t expect companies to act in ways that hurt their bottom line, but we can make poor privacy policies hurt that bottom line by making our privacy a priority.
There’s no shortage of ways to go about this, but what you choose to do will vary based on your comfort level:
- When a company’s privacy policy is vague, contact the company for clarification before joining its service
- If a privacy policy overreaches or grants the company access to information you don’t want to share, tell the company that its policy is why you cancelled your account or chose not to sign up
- Let the companies who do value your privacy know that it makes a difference to you as a customer
- When possible, encrypt your data (see the sketch after this list)
- Use a password manager and two-factor authentication for all of your accounts. Even the companies that do value your privacy can’t have perfect security measures, so do everything you can to keep your data safe
- Review the permissions you’ve given third parties over the years
- Support the Electronic Frontier Foundation
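On the encryption point: you don’t need to be a security expert to do this. The snippet below is a minimal sketch, assuming Python and the third-party cryptography package (any comparable library, or a dedicated tool like GPG, works just as well), that encrypts a file locally before it ever reaches a cloud service. The filenames are placeholders, not a recommendation of any particular setup.

```python
# A minimal sketch of encrypting a file locally with the
# "cryptography" package (pip install cryptography).
# Filenames are placeholders; adapt them to your own setup.
from cryptography.fernet import Fernet

# Generate a key once and store it somewhere safe, not alongside the data.
key = Fernet.generate_key()
with open("backup.key", "wb") as key_file:
    key_file.write(key)

fernet = Fernet(key)

# Encrypt the file before handing it to a cloud service.
with open("tax_records.pdf", "rb") as plaintext_file:
    token = fernet.encrypt(plaintext_file.read())
with open("tax_records.pdf.enc", "wb") as encrypted_file:
    encrypted_file.write(token)

# Later, the same key decrypts it.
with open("tax_records.pdf.enc", "rb") as encrypted_file:
    original = fernet.decrypt(encrypted_file.read())
```

The specific tool matters far less than the habit: data you encrypt yourself stays private even if the service storing it turns out not to deserve your trust.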
This won’t solve the problem overnight, and companies will always try to find ways around transparency, but doing nothing tells them it’s okay to keep users in the dark about what data they’re collecting and what happens to that data.
If we don’t tell companies that our privacy matters, we’ll continue to see our trust taken advantage of and our data recklessly handed to the highest bidder. It’s time to stop putting a price tag on our personal information.