Flying out to California and taking a break from in-office work has given me a lot of time to reflect on how technology can track and predict our behavior. I'm fairly familiar with machine learning and predictive modeling, so I'll do my best not to mansplain anything in this blog post, but I believe protecting ourselves online is going to be a key trend in the coming year.
As many people are aware by this point, Facebook and other social media platforms have been tracking our information. That is fairly unsurprising: a social media platform has to predict how to keep us coming back. It has to know who we want to talk with, what we want to view, and in what order we want to see items of interest. If it fails at that, the platform dies out. But if it can keep serving content that triggers the right stimulus, we get hooked and need to check it every time we pull out our phones. It takes more and more time to feel up to date on other people's lives, and according to Statista, the average social media user now spends over two hours on these sites A DAY. We are addicted, and the more we use it, the more a company learns about us.
The way a machine learning model works is exactly how it sounds. An algorithm starts off making bad predictions about something because it has little to no data on the subject. Over time, though, it collects more information and makes better predictions. In social media, we are the subject, and the prediction is whether something will elicit a response from us. That response can be a like, a comment, or even just more time spent on a post rather than scrolling past it. The more we give it, the better it gets. This is all fine if it stays within a social media platform, but when it leaks out past that, we have a problem.
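To make the "bad predictions at first, better with data" idea concrete, here is a minimal sketch of an online learner on made-up engagement data. Everything here is hypothetical: the two features, the hidden "true" engagement rule, and the perceptron-style learner are all stand-ins for whatever a real platform actually uses.

```python
import random

random.seed(0)

# Hypothetical synthetic data: each "post" has two features a platform
# might track -- how well the topic matches your interests, and whether
# friends are involved -- plus a label: did you engage, or scroll past?
def make_post():
    topic_match = random.random()
    friend_involved = random.random()
    # Hidden "true" rule the learner has to discover from examples
    engaged = 1 if (0.7 * topic_match + 0.3 * friend_involved) > 0.5 else 0
    return (topic_match, friend_involved), engaged

# A tiny online learner (perceptron-style updates): it starts out
# guessing blindly and adjusts its weights after every interaction.
weights = [0.0, 0.0]
bias = 0.0
learning_rate = 0.1

def predict(features):
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return 1 if score > 0 else 0

def accuracy(n=500):
    samples = [make_post() for _ in range(n)]
    return sum(predict(f) == y for f, y in samples) / n

early = accuracy()      # accuracy before seeing any interactions
for _ in range(2000):   # "observe" 2000 user interactions
    features, engaged = make_post()
    error = engaged - predict(features)
    for i in range(2):
        weights[i] += learning_rate * error * features[i]
    bias += learning_rate * error
late = accuracy()       # accuracy after learning from those interactions

print(f"accuracy before: {early:.2f}, after: {late:.2f}")
```

With no data, the model is roughly a coin flip; after a couple thousand observed interactions, it predicts engagement far more reliably. That is the whole dynamic in miniature: every like and lingering scroll is another training example.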
You might have heard about Cambridge Analytica in the news lately. If not, Wired has a good catch-up link if you want to go more in-depth. The tl;dr version: reporting alleged that the Trump campaign hired Cambridge Analytica, which harvested Facebook data on over 50 million people and used it for psychological targeting to influence the 2016 presidential election. Okay, cool. They have a bunch of our likes and pictures. A little creepy, but nothing too bad, right? Well, it has never been super clear what Facebook, or Google as a larger entity, has been storing on its users. This story definitely brought that to light, and from a personal privacy standpoint it is very gloomy.
You should really sit down and read either Dylan Curran's piece in the Guardian or his thread on Twitter, because both are striking. The line has definitely moved away from the social responsibility of the company toward blaming the customer, with companies essentially saying, "Well, you shouldn't have agreed to it. You knew what you were doing." This brings me to one of my favorite Parks and Recreation motivational moments:
We as consumers have to do a better job of monitoring our data footprint. But that doesn't mean a company can simply plead ignorance and say it's our fault. That is not how you treat a consumer base, especially for a free product.
Think carefully before you next spend your time scrolling. In my opinion, deleting Facebook isn't the answer, since it offers so many benefits of connection. How am I supposed to keep track of everyone's birthdays? A push for the government to step in and help solve this privacy issue would do more for the overall conversation than deleting an app (and even if you delete it, they still have your data, so be wary of that). Government regulation on what Facebook can capture, if done correctly, could set a precedent for other social media companies doing the same kind of data harvesting and selling access for psychological targeting.
If you want to see how to download your own social media information, check out the Guardian link from Dylan Curran above, or check this link out for another way.