What you should know about data privacy | Davenport Website Designs



Your Privacy. Your Data. You Are Being Tracked!

Think you have nothing to worry about?

Saturday, October 15, 2023

Working in the marketing, internet, and tech worlds, I often hear people describe getting an ad on Facebook for something out of the blue. When I ask whether they had mentioned the product or searched for it on another website, they say "No." But then they add that a friend recently talked to them about the product, even that specific model.

I then launch into an informative talk on internet privacy and security, and before I finish the first sentence they interrupt with, "That is not true. But it is not important to me. I do not have anything to hide. Why should I care about data privacy?"

A massive amount of granular personal data is collected about you every day through your phones, computers, cars, homes, televisions, and smart speakers: any newer electronic device that connects to the internet, yes, even your vacuum cleaner. Add to that things like credit card purchases and even the information on your driver's license. You have little control over the collection of this data, and very few people understand when or how it is used, including how it may be used to influence you: an ad for a product you discussed with someone, a recommendation to watch a YouTube video.

Internet platforms like YouTube use AI to deliver personalized recommendations based on thousands of data points they collect about you. Among those data points is your behavior when navigating YouTube in its parent company's Chrome browser, along with any other site that has social media buttons, a shopping cart, or convenient connections to social services. When you navigate a website such as YouTube, they track where you scrolled to on a page, which videos you clicked on, what the subject of those videos is, how much time you spent watching, and much more. It is all logged and used to inform increasingly personalized recommendations (that is how they sell it to you, so it has a soothing ring: convenient and personalized, to save you time searching and show you what really matters to you), which are served up through autoplay (activated by default) before you can click away.
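To make this concrete, here is a purely illustrative sketch of the kind of interaction event a platform might log and fold into an interest profile. Every field name and value below is invented; real tracking payloads are proprietary and far more detailed than this.

```python
# Hypothetical example of a single tracked interaction. All field
# names are made up for illustration; no real platform's schema is
# being reproduced here.
watch_event = {
    "user_id": "u-102938",            # pseudonymous but stable identifier
    "page": "/watch?v=example123",
    "scroll_depth_pct": 85,           # how far down the page you scrolled
    "video_topic": "home-fitness",
    "watch_seconds": 412,
    "clicked_from": "autoplay",       # autoplay, search, or recommendation
    "timestamp": "2023-10-15T14:03:22Z",
}

def profile_update(profile: dict, event: dict) -> dict:
    """Fold one event into a running interest profile (toy model)."""
    topic = event["video_topic"]
    profile[topic] = profile.get(topic, 0) + event["watch_seconds"]
    return profile

profile = profile_update({}, watch_event)
print(profile)  # {'home-fitness': 412}
```

Thousands of events like this, accumulated over months, are what turn a vague demographic guess into the eerily specific profile that makes recommendations feel like mind reading.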

The reality is that the AI is optimized to keep you on the platform so that you keep watching ads and YouTube keeps making money. It is not designed to optimize for your well-being or 'satisfaction,' despite what YouTube claims. As a result, research has demonstrated how this system can give people their own private, addictive experience that can easily become filled with conspiracy theories, health misinformation, and political disinformation.
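The incentive described above can be sketched in a few lines. This is emphatically not YouTube's actual algorithm; it is a toy model showing what happens when a ranker scores candidates only by predicted engagement, with no term for accuracy or viewer well-being. The titles and numbers are invented.

```python
# Toy illustration of engagement-optimized ranking. Scoring purely by
# predicted watch time (a proxy for ad revenue) means the most gripping
# content wins, regardless of whether it is true or good for the viewer.
videos = [
    {"title": "Measured news recap",     "predicted_watch_min": 4.0},
    {"title": "Outrage-bait conspiracy", "predicted_watch_min": 22.0},
    {"title": "Calm how-to tutorial",    "predicted_watch_min": 7.5},
]

def rank_by_engagement(candidates):
    """Sort candidates by predicted watch time, highest first."""
    return sorted(candidates,
                  key=lambda v: v["predicted_watch_min"],
                  reverse=True)

for v in rank_by_engagement(videos):
    print(v["title"])
```

Nothing in that objective function knows or cares what the video claims; the conspiracy video tops the list simply because people cannot stop watching it.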

The real-world harm this can cause became pretty clear on January 6, when hundreds of people stormed the Capitol building to try to overturn the certification of an election they were convinced, baselessly, that Trump won. This mass delusion was fed by websites that, research has shown, promoted and amplified conspiracy theories and election misinformation.

The algorithmic amplification and recommendation systems that platforms employ spread content that is evocative over content that is true. The horrific damage to our democracy wrought on January 6th demonstrated how these social media platforms played a role in radicalizing and emboldening terrorists to attack the Capitol. These American companies must fundamentally rethink algorithmic systems that are at odds with democracy.

For years, Facebook, X, YouTube, and other platforms have pushed content on their users that their algorithms tell them those users will want to see, based on the data they have about them. The videos you watch, the Facebook posts and people you interact with, the tweets you respond to, your location: these build a profile of you, which these platforms' algorithms then use to serve up even more videos, posts, and tweets to interact with, channels to subscribe to, groups to join, and topics to follow. You are not looking for that content; it is looking for you. And it becomes an addiction people cannot look away from, because the feed keeps serving them what feels good.

This is good for users when it helps them find harmless content they are already interested in, and for platforms because those users then spend more time on them. It is not good for users who get radicalized by harmful content, but that is still good for platforms because those users spend more time on them. It is their business model, it’s been a very profitable one, and they have no desire to change it — nor are they required to.

Studies have shown how social media algorithms push users toward polarized content, allowing companies to capitalize on divisiveness. If personal data is being used to promote division, consumers have a right to know. But that right is not a legal one. There is no federal data privacy law, and platforms are notoriously opaque about how their recommendation algorithms work, even as they have become increasingly transparent about what user data they collect and have given users some control over it. But these companies have also fought attempts to stop tracking when it is not on their own terms, or have not acted on their own policies forbidding it.

It is impossible to stop companies from collecting data about you; even if you do not use their services, your friends and family do, and your connections to them connect your data to the company.

You are the only one who can prevent the collection of your private data. And just because a conspiracy theory or misinformation makes its way into your timeline or suggested videos does not mean you have to read or watch it. The conspiracies might be much easier to find (even when you were not looking for them); you still choose whether or not to go down the path they show you.

You might think QAnon is stupid, but you will share #SaveTheChildren content. You might not believe in QAnon, but you might vote for a member of Congress who does. You should be smart enough not to fall for anything, but your friends and family might not be.
