With the use of algorithms, sites like Google, Twitter, and Facebook can use your search history to personalize the information and topics of interest that you read and see daily. Sounds cool, right? This means that whatever you search on Google, the results and information you see are completely different from what any other user sees, because they are unique to your interests. You may be wondering: how does Google know all this information about me? Well, today algorithms allow sites like Google, Facebook, and Twitter to track your behaviour online and generate data such as your geographic location, search history, and interests. This allows the sites to make sophisticated guesstimates about who you are and what information is relevant to you as an individual. To summarise, algorithms allow social networks and search engines to gather information on you, so they know more or less who you are as a person, what you like to do, who your friends are, and your exact location. Sounding scary yet?
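The actual ranking systems at Google or Facebook are vastly more complex and proprietary, but the basic idea described above, building a profile from your past searches and using it to reorder what you see, can be sketched in a few lines of Python. All of the data and function names here are invented for illustration only:

```python
from collections import Counter

def build_profile(search_history):
    """Count how often each word appears in a user's past searches."""
    words = [w.lower() for query in search_history for w in query.split()]
    return Counter(words)

def rank_results(results, profile):
    """Order candidate results by how many profile words they contain."""
    def score(result):
        return sum(profile[w.lower()] for w in result.split())
    return sorted(results, key=score, reverse=True)

# A hypothetical user's history shapes what surfaces first.
history = ["vegan recipes", "easy vegan dinner", "yoga classes near me"]
results = ["Vegan dinner ideas", "Steakhouse reviews", "Beginner yoga poses"]
print(rank_results(results, build_profile(history)))
```

Even in this toy version, results that match established interests rise to the top and everything else sinks, which is exactly the mechanism behind the filter bubble discussed next.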
Even though it can be a benefit to automatically read about topics that interest you, personalization from algorithms can create a filter bubble. This term comes from this week's reading, The Internet 2.0 – Personalization and Intro to Algorithms. Because algorithms are constantly present in our day-to-day lives, the information that we see can become repetitive, because it is generated from our established interests. Thus, we may become confined in a bubble that prevents us from seeing other views and how the world looks outside of our own beliefs and personal interests. This can increase the likelihood of being misinformed by the material we read, getting trapped in conspiracy theories, and believing "fake news." The filter bubble controls what we see and what we do not see.
What shocked me most in the Algorithms of Oppression reading was the connection between Google's algorithmic conceptualizations and the misrepresentation and stereotyping of women of color. An example given in the reading described how searching "black girls" brought up results for "porn sites, dehumanizing them as commodities, as products and as objects of sexual gratification" (page 392).
Using technology and search engines to find information has become part of our normal lives, and we may assume that what we read online is credible and accurate. The Google search for "black girls" shows us how algorithmic conceptualizations of women of color are embedded in the search engine. This highlights how the Internet encourages the portrayal of stereotypes and misrepresentation, which leads people to believe that what they are seeing and reading is normal and correct.
I believe one of the most important things to take away from this is that what we read online and on social media sites needs to be analysed carefully before we make assumptions. It is important to be aware of how easily the information we read on the internet and on social media can be influenced and manipulated.
Noble, S. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press.
Stjernfelt, F., & Lauritzen, A. (2019). "The Internet 3.0". Your Post Has Been Removed. Springer.