When taking research literacy lessons, I have the students conduct a basic Google search using their inquiry question. Then, in a new tab, they search again using only the keywords and watch what happens to the number of results. Next, they combine those keywords with a range of Boolean operators. We compare and discuss the change in the number of results (usually significantly fewer) and the power of effective search skills. I always point out that even though everyone in the room uses exactly the same words and operators, we each get a different number of results. We talk about Google's algorithms, its filtering of their results, and the value of going beyond page 1.
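By way of illustration, a typical progression might look like the one below. The inquiry question and queries are my own invented examples rather than ones from a specific lesson; note that Google treats AND as implicit, expects OR in capitals, uses quotation marks for exact phrases, and uses a minus sign to exclude a term.

Inquiry question as typed: How does social media affect teenagers' mental health?
Keywords only: social media teenagers mental health
With operators: "social media" AND ("mental health" OR wellbeing) AND (teenagers OR adolescents) -forum

Each refinement usually shrinks the result count noticeably, which is exactly the change in numbers the class compares and discusses.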
This has always been a valuable discussion point, as some believe the filter bubble can dramatically increase confirmation bias. In a climate of divisive viewpoints, this matters not only in the personal and social world but also in academia. Students must have the opportunity to challenge their thinking in order to build deeper understandings and strengthen their capacity for critical thinking.
In his 2011 TED Talk, Pariser highlighted the need for algorithms to be transparent and customisable so that companies act ethically and exercise "civic responsibility" in terms of how people connect and what they are exposed to (TED2011, 2011). It seems Google responded. Recently, when searching in Google, I wanted to see whether I could turn off certain algorithms or data collection: could I go back to square one and have a truly pure, uncorrupted search experience? It turns out that in 2018 Google released Your Data in Search, which makes deleting your search history and controlling the ads you see much easier. You can also turn off Google's personalisation. While some studies suggest Google's attempts at reducing the filter bubble (searching in private mode and while signed out) do not greatly reduce the disparity between users' search results, it is perhaps a step in the right direction. It is worth noting that Google disputes the claim that personalisation greatly affects search results.
The jury may still be out, as the key players are unsurprisingly at odds; however, I have seen the difference in results first-hand when working with classes of students. In the realm of their academic research, this may not be as big an issue as, say, the perpetuation of political beliefs or other ideologies, but these algorithms are still deciding what they deem most useful or important for these students. This can limit students' searching rather than assist it, and popularised clickbait can hinder their academic as well as their social searching. DuckDuckGo, a search engine many librarians and educators have been promoting for some time, offers an alternative that does not track or store your personal information. The next time I take a research literacy lesson, I will put it to the test and see how it stacks up against Google.
DuckDuckGo's Business Model. Cuofano, G. (n.d.). DuckDuckGo's Business Model [Image]. https://fourweekmba.com/duckduckgo-business-model/
In terms of curation, Valenza (2012) suggests we be mindful of the filter bubble when evaluating the curations of others. Are viewpoints missing? Whose perspective is the curation from? On the other hand, effective human curation can alleviate the filter bubble. Human curators, particularly those participating in collective curation, are able to provide multiple perspectives within a single curation. This gives users a more comprehensive pool of sources to select from and can expose them to a breadth of viewpoints. Even Apple is using human curation to counter the limitations of algorithmic curation, intending to present quality-controlled news by leveraging the collective skills and expertise of a curation team. This is also a powerful exercise for students.
Collective curation of resources for research tasks can reduce students' workload, provide multiple and alternative perspectives, and encourage collaborative processes and communication. Shirky (O'Reilly, 2008) highlights an instance in 2008 in which a student at Toronto's Ryerson University created a Facebook study group to mimic an in-person study group. Membership of the group was open and vast, and he was quickly charged with cheating by the university. I personally don't believe the creation of this group violated academic integrity. Even though students may be curating collaboratively (something I think should be encouraged), they must still, as individuals, be discerning in their selection of sources and evidence, and must still demonstrate their ability to evaluate, analyse and synthesise. Collective curation provides opportunities for students to debate, widen the available perspectives, and support one another in their academic endeavours.
So, the next topic to explore is appropriate collective curation tools that support students inside and outside the school environment.
References
TED2011. (2011, March). Eli Pariser: Beware online “filter bubbles” [Video file]. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?language=en#t-476026
O’Reilly. (2008, September 19). Web 2.0 Expo NY: Clay Shirky (shirky.com) It’s Not Information Overload. It’s Filter Failure [Video file]. https://www.youtube.com/watch?v=LabqeJEOQyI