Algorithms, especially computer algorithms, are playing a larger role in everyday life. Algorithms work well when they serve as filters that limit data overload and surface relevant results. Facebook’s algorithms, for example, use a ranking system that takes an inventory of all possible stories (posts by the user’s friends, posts by companies the user follows), weighs signals given by the user (the kinds of stories the user likes, shares, or blocks), predicts which stories the user is likely to enjoy (share or like), and assigns each story a relevancy score (Mosseri 2018). Stories are then ordered in the user’s newsfeed according to that score. This process can be useful, since it spares the user from having to sift through many unrelated and unwanted posts.
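To make that pipeline concrete, the sketch below ranks a handful of stories by a toy relevancy score. It is purely illustrative: the signal names, weights, and scoring rule are my own assumptions, not Facebook’s actual system, which relies on machine-learned predictions over a far larger set of signals.

```python
# Illustrative sketch only: a toy "inventory -> signals -> prediction -> score"
# ranker in the spirit of the process Mosseri (2018) describes.
# All fields, weights, and signal types here are hypothetical.
from dataclasses import dataclass

@dataclass
class Story:
    author: str
    topic: str
    text: str

# Hypothetical user signals: topics the user has liked/shared vs. blocked.
user_signals = {
    "liked_topics": {"gardening", "local news"},
    "blocked_topics": {"celebrity gossip"},
}

def relevancy_score(story: Story, signals: dict) -> float:
    """Predict how likely the user is to enjoy (like or share) a story."""
    score = 0.0
    if story.topic in signals["liked_topics"]:
        score += 1.0   # the user has affirmed this kind of content before
    if story.topic in signals["blocked_topics"]:
        score -= 1.0   # the user has rejected this kind of content before
    return score

def rank_feed(inventory: list[Story], signals: dict) -> list[Story]:
    """Order the inventory of possible stories by descending relevancy score."""
    return sorted(inventory, key=lambda s: relevancy_score(s, signals), reverse=True)

inventory = [
    Story("a friend", "gardening", "Tomatoes are in!"),
    Story("a company", "celebrity gossip", "You won't believe..."),
    Story("a friend", "local news", "Road closure downtown"),
]

for story in rank_feed(inventory, user_signals):
    print(story.author, "-", story.text)
```

The detail worth noticing, even in so small a sketch, is that every input to the score is something the user has already affirmed or rejected; nothing in the ranking introduces material from outside that history.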
I contend, however, that there is a fundamental problem with some computer algorithms: the effect they can have on the development of the individual’s I-for-myself. The I-for-myself is an internal self-definition; it is how our lives feel to us, day to day, on the inside. Our conception of the I-for-myself, according to Mikhail Bakhtin, begins with the words of the parents that the child internalizes as self-definition (1984, 1986, 1990, 1993, 2017). As the individual encounters new people and ideas, this inner definition is used to judge the new definitions of the individual offered by external others, and it is modified as the individual openly interacts with a world of other selves. When those others lose their personhood, when the individual no longer sees others as individuals but as stereotypes, categories, or images, the individual becomes less trusting of the other and less open to change. Algorithms accelerate this shift by inverting the usual dynamic: the individual interacts with programs that create a snapshot of the individual at a given moment. The (artificial) algorithmic image stands in for the authentic other, “helping” the individual develop her “self” by presenting her with articles, stories, or search results that “match” her. Because the development of self requires voices that bring novelty, the algorithmic voice, which brings the individual only conformity (material she has already affirmed as part of her “self”), shuts the person off from herself and from her own development.
Algorithmic closure is most clearly demonstrated by Facebook’s algorithms. The more the user posts, likes, shares, or otherwise interacts with items on Facebook, the more the algorithm “learns” about the user. The algorithms try to predict what the user will want to interact with based upon who the user was in the past, a process that finalizes and objectifies the user. This finalization can have several negative effects. First, the user can become trapped within a filter bubble, with only the news, viewpoints, and opinions with which she already agrees being let in. Eli Pariser (2011) views the bubble as problematic, for the user does not know she is in a bubble, does not know how she got into it, and has no clear path out (p. 10). A Pew Research Center report (Hitlin and Rainie 2019) illustrates the scope of this problem, noting that 74% of respondents were unaware that Facebook maintains a list of their interests and traits, and that 27% of respondents, after viewing their lists, felt that the list developed by the algorithm did not accurately represent them (p. 2). Second, algorithms can be tailored to manipulate users’ emotions. A study by Kramer, Guillory, and Hancock (2014) demonstrated that manipulating the number of positive and negative words in a user’s newsfeed causes a very small, but statistically significant, change in the number of positive or negative words she uses in her subsequent posts. If we combine filter bubbles with this kind of manipulation, it is easy to see that the user’s sense of self is not given the room it needs to develop, potentially closing the individual off from change and growth. Shires and Orgel (2017) believe that a user who is trapped in a newsfeed that portrays the world in a constant manner, and that fails to show that the world can be any other way, could develop a sense of resentment towards others that blooms into ressentiment. Ressentiment develops when the individual is denied both choice and agency.
The development of an authentic I that moves towards self-actualization should be valued; algorithms like Facebook’s, however, greatly impede that movement. It remains an open question whether an algorithm could be written that would act as an authentic other. Once the I objectifies itself, can it ever be reawakened? No I is ever fully finalized, even if the individual fully buys into the algorithm’s image, and Bakhtin clearly believes that such a reawakening is possible if the I rejects the false other of the image (the algorithm). When the individual returns to the world of other consciousnesses, the authentic internalization/externalization dynamic restarts. There is always another other that can bring the subiectum back into dialogue and reawaken the I to his or her unfinalizability.
Associate Professor of Communication
Director of Graduate Studies in the Department of Communication & Creative Arts
Purdue University Northwest
References:
Bakhtin, Mikhail. 1984. Problems of Dostoevsky’s Poetics. Edited and translated by Caryl Emerson. Minneapolis: University of Minnesota Press.
—. 1986. Speech Genres and Other Late Essays. Edited by Michael Holquist and Caryl Emerson. Translated by Vern W. McGee. Austin: University of Texas Press.
—. 1990. Art and Answerability. Edited by Michael Holquist and Vadim Liapunov. Translated by Vadim Liapunov. Austin: University of Texas Press.
—. 1993. Toward a Philosophy of the Act. Edited by Michael Holquist and Vadim Liapunov. Translated by Vadim Liapunov and Kenneth Bostrom. Austin: University of Texas Press.
—. 2017. “Selections from the Wartime Notebooks.” Edited by Irina Denischenko and Alexander Spektor. Slavic and East European Journal 61 (2): 201-232.
Hitlin, Paul, and Lee Rainie. 2019. Facebook Algorithms and Personal Data. Washington: Pew Research Center. Accessed January 20, 2019.
Kramer, Adam D. I., Jaime E. Guillory, and Jeffrey T. Hancock. 2014. “Experimental Evidence of Massive-scale Emotional Contagion through Social Networks.” Proceedings of the National Academy of Sciences 111 (29). Accessed May 15, 2017.
Mosseri, Adam. 2018. “News Feed Rankings in Three Minutes Flat.” Facebook Newsroom. May 22. Accessed June 24, 2018.
Pariser, Eli. 2011. The Filter Bubble. New York: Penguin Books.
Shires, Jeff, and Nel Orgel. 2017. “The Bully Chamber: Creation of Funhouse Selves from Distorted Media.” International Journal of Digital Television 8 (3): 309-320.