With the use of algorithms, sites like Google, Twitter, and Facebook can use your search history to personalise the information and topics of interest that you read and see daily. Sounds cool, right? This means that whatever you search on Google, the results and information you see are completely different from any other user's, because they are unique to your interests. You may be wondering: how does Google know all this information about me? Well, today algorithms allow sites like Google, Facebook, and Twitter to track your behaviour online and generate data such as your geographic location, search history, and interests. This allows the sites to make sophisticated guesses about who you are and what information is relevant to you as an individual. To summarise, algorithms allow social networks and search engines to gather information on you, so they more or less know who you are as a person, what you like to do, who your friends are, and your exact location. Sounding scary yet?
Even though it can be a benefit to automatically read about topics that interest you, personalisation from algorithms can create a filter bubble. This term comes from this week's reading, The Internet 2.0 – Personalization and Intro to Algorithms. Because algorithms are constantly present in our day-to-day lives, the information we see can become homogeneous, since it is selected according to our established interests. Thus, we may become confined to a bubble that prevents us from seeing other views and how the world looks outside our own beliefs and personal interests. This increases the likelihood of being misinformed by the materials we read, being trapped in conspiracy theories, and believing "fake news." The filter bubble controls what we see and what we do not see.
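To make the mechanism concrete, here is a toy sketch (entirely hypothetical, not any real platform's algorithm) of how ranking content by a user's recorded interests narrows the feed: anything outside the profile simply never surfaces.

```python
# Toy illustration of interest-based personalisation (a hypothetical
# sketch, not any real platform's algorithm): items are scored by how
# many tags they share with the user's recorded interest profile, so
# the feed keeps resurfacing familiar topics -- the "filter bubble".

def personalise(items, interests):
    """Rank items by overlap with the user's interest tags."""
    def score(item):
        return len(set(item["tags"]) & set(interests))
    # Items sharing no tags with the profile are dropped entirely.
    ranked = [item for item in items if score(item) > 0]
    return sorted(ranked, key=score, reverse=True)

items = [
    {"title": "Local sports results", "tags": ["sports"]},
    {"title": "Climate policy debate", "tags": ["politics", "climate"]},
    {"title": "New football signing", "tags": ["sports", "football"]},
]

feed = personalise(items, interests=["sports", "football"])
# The climate story never appears: it falls outside the bubble.
print([item["title"] for item in feed])
```

Even this trivial ranker shows the effect the reading describes: nothing in the code is malicious, yet the user is never shown the political story at all.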
What shocked me most in the Algorithms of Oppression reading was the connection between Google's algorithmic conceptualisations and the misrepresentation and stereotyping of women of colour. An example in the reading described how searching "black girls" brought up results for "porn sites, dehumanizing them as commodities, as products and as objects of sexual gratification" (Noble, 2018, p. 392).
Using technology and search engines to find information has become part of our everyday lives, and we may assume that what we read online is credible and accurate. The Google search for "black girls" shows how algorithmic conceptualisations of women of colour are embedded in the search engine. This highlights how the Internet encourages stereotyping and misrepresentation, leading people to believe that what they are seeing and reading is normal and correct.
I believe one of the most important takeaways from this is that what we read online and on social media needs to be analysed carefully before we draw conclusions. It is important to be aware of how easily the information we read on the internet and on social media can be influenced and manipulated.
Noble, S. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press.
Stjernfelt, F., & Lauritzen, A. (2019). "The Internet 3.0". In Your Post Has Been Removed. Springer.
4 thoughts on “The Hidden Agenda of Algorithms”
Hey Alice! I love how you stressed the importance of approaching "information" from the internet carefully. Just because something appears in the search results for a given topic or question does not mean it is 100% factual. That makes me especially nervous for the new generations being raised with constant access to the internet. I also like how you talked about personalised search results. It can be cool and convenient to have an advertisement appear for clothes you like or even a trending restaurant nearby… but it is indeed creepy, as you hinted at. In order for these personalised results to come up, so much information is stored away about all of us! Do you think it is worth giving up personal information in order to combat the stereotypical and sometimes prejudiced search results that would otherwise appear? (151 words)
Hi Haley! I definitely think it is important to combat the discrimination and prejudiced views caused by algorithms. It is so wrong that certain words are linked to certain groups of people, which creates harmful stereotypes. If giving up more of my personal information would resolve this issue and reduce biases, then I would consider it.
Hi Alice, one of the most shocking things you brought up was the stereotypes perpetuated by these algorithms. Personally, reading about algorithms this week has been a bit of a "red-pill" moment for me. It's truly scary and disturbing how algorithms inform our worldview, our perspectives, and our beliefs about the world and other people, as the "black girls" example shows. I have a question for you: prior to learning this week about how algorithms seem to shape the world, were you aware of the extent to which they perpetuate ideas? Now that you know, how do you feel about them? #104
Hi Ian! Before this week I wasn't aware of the extent to which algorithms influence so much of our lives and daily activities. It is truly scary how easily people can learn so much about us through what we search online and how that data is then stored. I am definitely going to be more mindful of what I search online and more aware of the biases portrayed there. I think it is important to keep an open mind about what you read and see on social media, as it can easily be influenced and manipulated.