What a crazy week this has been. While contemplating the questions for this week’s blog, thinking about both digital literacies and exploring the notion of filter bubbles, I have also been trying to desensitize myself to the pandemonium that has ensued in my small community. Who knew Spring Break would be spent in our own little social isolation bubbles? As an educator, I have spent this last week contemplating what the future looks like for online learners.

Our district has a meeting scheduled this Monday to discuss what a fully online system could look like in our small community and the surrounding villages. While I want to remain optimistic that we will be able to continue to provide meaningful learning opportunities for our students, I do question what this is going to look like. There are a few obvious considerations, and I will be interested to hear what the plan will be moving forward.

We do not have any schools currently with a 1:1 ratio of students to devices, so I am not sure how accessible online courses will really be.

I also question how families with multiple children will manage when everyone needs technology to keep as normal a schooling schedule as possible, especially if parents are also working from home. We are assuming that most families have 1:1 access to technology, and I am not sure this is the case.

Furthermore, I am curious to see how we will move forward when many school staff are not familiar with the platforms being suggested. Given educators’ lack of experience, exposure, and confidence with digital literacy skills, what will this look like?

Only the future will tell…

This week I listened to Rheingold’s podcast of Chapter 2, “Crap Detection 101: How to Find What You Need to Know, and Decide if It’s True,” as well as Eli Pariser’s TED talk, “Beware Online ‘Filter Bubbles.’”

Rheingold’s podcast talked about creating virtual communities to combat the spread of misinformation, and about how we can learn to detect “crap information” and avoid it.

He focused his initial conversation on the issue of attention. Research has indicated that we are not helpless in the face of all these attractions; we have to assert control over our attention and be more mindful of the lures that draw us into buying into garbage news and misinformation. This is fundamental to our thinking and communication.

He further explained that “crap detection” can help us evaluate resources for authenticity by separating truth from fiction. While there are safeguards, or “gateways,” in place to ensure that books found in the library are authentic and believable resources, how do we do this online?

While there are many sites, as we explored through this week’s module activity, that can support our quest to sift out fake news, it is important to consider that the future of the commons depends on our online participation being meaningful.

One thing that struck me as interesting was Rheingold’s take on gamers and YouTubers having strong crap detectors, something I had not really given much thought to, not being a gamer or a YouTube fan myself. He pointed out that gamers and YouTubers have:

  • Developed communities in which they teach each other
  • Motivated themselves and those in their communities to do better
  • Developed social and computational filters
  • Created communities and social networks in which they know whom they can turn to for discussing issues and for current, credible sources

The question now is: regardless of what your interest and involvement with the internet entail, how can you create communities and social networks that are credible and trustworthy?

The other video this week, Eli Pariser’s “Beware Online ‘Filter Bubbles,’” was an interesting watch when considering news and trying to stay informed. As Pariser points out, it is important that we are able to identify our own biases and beliefs, and that we recognize the power we have inadvertently given our search engines.

This “trust in Google” notion is quite daunting as we look for valid, relevant information. We must remind ourselves to seek out alternative views and varying information so as not to get caught in a filter bubble of one-sided information. We see this often with “liking” and “joining” groups on social media, more specifically on Facebook and Twitter. While it is important to follow people and causes that are relevant to us, it is critical that we recognize that these platforms create their own bubbles, where we see similar news, views, and opinions.

As Rheingold points out, there isn’t really anywhere that provides you with an unbiased perspective; it is important to find a mixture of biases for your “information diet.”

I think what stood out the most for me in Pariser’s video was the idea that behind every algorithm is a person: social media platforms, Facebook in particular, are able to see which links you click on and then use that information to edit the results you are shown, without your consent.

When we consider that we use the internet as a tool to learn, communicate, and inform ourselves, we should be shown information that is relevant and important and that challenges our views, thoughts, and opinions, not just information the internet believes we hold near and dear.

Furthermore, where is the civic responsibility for users to take back control and access information that they deem important, credible and valid?

Why do the internet and its algorithms get to dictate, or predict, the information that we are shown?

This is a huge challenge to overcome, and one that currently leaves us isolated in our own filter bubbles.

Rheingold, H. (2012). Crap detection 101: How to find what you need to know, and decide if it’s true. In Net smart: How to thrive online (pp. 77–111). Cambridge, MA: MIT Press.

Pariser, E. (2011). Beware online “filter bubbles” [Video]. TED.