I was out for dinner with my friends and I was humming “Take Five” with my jazz-fanatic friend. Another friend asked us the name of the song and I said it’s Take Five.
The next day, the same friend received an ad from Take Five, the oil-changing company.
This was terrifying. My fears about devices listening to me 24×7 resurfaced. I tried to take control by not installing apps unless absolutely necessary and, of course, by switching to iPhone. But I wasn’t surveillance-proof when I was around other people. When apps listen, use that data to feed AI models, and then serve targeted recommendations, the conversation extends beyond storing data: it’s also about using that data to treat users as a commodity.
In this article, I am going to talk about how AI is compromising consumer privacy, what companies can do, and what we can do as users.
What is Consumer Privacy?
Consumer privacy is the right of users to shield themselves, as consumers of products and services, from privacy loss at the hands of corporations. The privacy loss may happen due to the nonchalance of corporations, or because they find grey areas that let them compromise consumer privacy for commercial gain.
Consumer privacy laws date back to the era of telephones, but corporations to date find ways to escape legal liability. The most frequent is the claim that the data they store and use is completely anonymized and doesn’t hold “enough sensitive data” for a leak to count as a consumer privacy breach.
There are certain problems with this. First, data is not completely anonymized. Even if companies find ways to remove identifiable markers like name and address, they still end up saving other information which, combined, can often point to a single person! For example, let’s say hundreds of women gave birth in a hospital. How many of them would hold a Ph.D., have a B+ blood group, be 28 years old, and weigh 65 kg? Very few, right? Probably one in that area. Even if I remove the names, the data is still not anonymized.
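The hospital example can be sketched in a few lines of Python. The records below are entirely made up, and the attribute names are hypothetical, but they show how combining a handful of “harmless” attributes (often called quasi-identifiers) can single out one person from a nameless dataset:

```python
# Hypothetical "anonymized" hospital records: names removed,
# but other attributes (quasi-identifiers) kept.
records = [
    {"education": "PhD", "blood_group": "B+", "age": 28, "weight_kg": 65},
    {"education": "MSc", "blood_group": "O+", "age": 31, "weight_kg": 58},
    {"education": "PhD", "blood_group": "A-", "age": 28, "weight_kg": 70},
    {"education": "BSc", "blood_group": "B+", "age": 45, "weight_kg": 65},
]

# Filter on the same attributes mentioned in the text above.
matches = [
    r for r in records
    if r["education"] == "PhD"
    and r["blood_group"] == "B+"
    and r["age"] == 28
    and r["weight_kg"] == 65
]

# Even with names stripped, only one record survives the filter,
# so anyone who knows these facts about a person can re-identify her.
print(len(matches))  # 1
```

The more attributes an “anonymized” dataset keeps, the smaller these matching groups get, which is exactly why removing names alone is not anonymization.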
Second, there is the proposition that what companies are storing doesn’t contain enough ‘sensitive data’. A Bloomberg report described Amazon employees listening to Alexa conversations as part of manually annotating and classifying data to make the product better. Where it crosses the line is employees finding their own private conversations, and, in some cases, recordings of acts of violence. You guessed it: no action was taken.
This all makes a case for diving deeper into the intersection of AI and Consumer Privacy.
AI + Consumer Privacy
As AI becomes more integrated into our personal and professional lives, it’s important to reflect on the impact it has on our privacy as consumers. There are three major problems when it comes to Consumer Privacy and AI [Source]:
- Data repurposing: Data originally collected on a person for one purpose can be stored and later used for another purpose, without the person’s knowledge. For example, a conversation between my bank and me could be recorded by a smart speaker and used to train a financial chatbot without my knowledge.
- Data persistence: Data collected on a person may be stored far longer than that person’s lifespan, since data storage costs are quite low. HIPAA laws protect people’s health data and govern how their records are used for academic purposes. However, the guidelines for how data is used by apps, smart home devices, and social media platforms are very unclear.
- Data spillover: Data originally collected on a single person who gave consent also captures those in the nearby vicinity, or those who come into regular contact with that person, without their consent. This is exactly what happened to me during the Take Five incident: my voice was recorded by an app on my friend’s phone.
Why are users okay with giving away their information?
The concerns I have been describing above have been recognized by many other individuals, and the distrust can be seen in social media posts, personal conversations, and the hallways of academia. If people are concerned about their privacy, why do they give away their data to corporations?
In a survey on data breaches conducted by Ablon et al., only 11 percent of participants stopped dealing with the affected company, and 77 percent were highly satisfied with the company’s post-breach response. Their impressions could have been shaped by corporate PR teams, but similar studies indicate a good amount of indifference toward data breaches.
Thinking as a user: when I am asked to provide personal information, I care more about accessing the product or service at that moment than about my data privacy in the long term. Besides, a lot of things already compete for our attention every day, and app permissions are the last thing on our minds – there is a lot of decision fatigue. Lastly, users have somewhat internalized that their personal information is already out in the open, so withholding it from one more corporation will not do much to protect their privacy. A study conducted by Pew Research in 2016 found that 91% of adults agreed that they have lost control over the collection and usage of their personal data.
This makes a case for the Privacy Paradox, which highlights that people care about their privacy but their behavior doesn’t necessarily reflect it.
What can we do as Users?
There exist a number of guidelines for keeping yourself safe as a user, including reading the terms before agreeing, being cautious with permissions, and being mindful about what data you share.
I completely recognize the importance of such guidelines, but in my view the problem extends beyond users, leaving more onus on corporations to protect consumer privacy. The issues of data repurposing, data persistence, and data spillover cannot be resolved at the user level; we need corporations to step up with clear rules and guidelines for data usage and storage.
As users, while taking care of ourselves, we need to ask corporations to do better at keeping our data safe, and not to compromise our privacy for commercial gains.