Filter Bubble & Echo Chamber

You've probably heard the terms Filter Bubble and Echo Chamber at least once or twice in the past few months. They have been circulating in the media for some time in connection with Facebook and the 2016 U.S. Presidential Election. But what do they mean exactly? How do they relate to the internet, or more specifically to Facebook and Google? How do they affect you? Watch the video to find out!

Script & Storyboard
“Filter Bubble” is the theory that the algorithms of companies [1] like Facebook [1.3] and Google [1.4] base the information shown to you on data gathered from things [2.2] like your search history, [2.3] your past click behavior, [2.4] the type of computer you use [2.5] and your location. [2.6] This limits the topics that reach you [3] to a bubble of only [4.1] your own established interests and personalized search subjects. [4.2]
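The mechanism described above can be illustrated with a toy sketch. This is not any company's actual algorithm — the function, the topic tags, and the scoring rule are all invented for illustration — but it shows how ranking a feed purely by overlap with a user's past clicks squeezes unfamiliar topics out of view:

```python
from collections import Counter

def rank_feed(articles, click_history, k=3):
    """Toy personalization: score each candidate article by how often
    its topics appear in the user's click history, then keep only the
    top-k. A stand-in for the signals described above (search history,
    click behavior, location)."""
    interests = Counter(
        topic for art in click_history for topic in art["topics"]
    )
    def score(article):
        return sum(interests[t] for t in article["topics"])
    return sorted(articles, key=score, reverse=True)[:k]

# A user who has only ever clicked political stories...
history = [
    {"title": "Tax cut debate", "topics": ["politics", "economy"]},
    {"title": "Election recap", "topics": ["politics"]},
]
candidates = [
    {"title": "New polling data", "topics": ["politics"]},
    {"title": "Coral reef study", "topics": ["science"]},
    {"title": "Budget analysis", "topics": ["economy", "politics"]},
    {"title": "Art fair opens",  "topics": ["culture"]},
]

# ...gets a feed where science and culture never surface at all.
feed = rank_feed(candidates, history, k=2)
```

Nothing here is malicious: each individual ranking decision is a reasonable relevance guess, yet the aggregate effect is the bubble the script describes.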
The term was coined by Eli Pariser, who wrote a book on the subject [1] explaining that these algorithms are “closing us off to new ideas [2.3], new subjects [3.4] and important information [3.5]”. What he means is that you are [3.1] not shown information outside [3.2] your own political views, [3.3] religious views [3.4] or even other topics, such as updates [3.6] on women's rights [3.7] and animal rights. [3.8]
Another way of saying this is “Echo Chamber”. [1.2] An echo chamber is when information, [1.3] ideas, [1.4] or beliefs [1.5] are repeatedly reinforced in an enclosed system [1.6] like your mind, [2.1] your newsfeed, [2.2] or your social circle, [2.3] while other views are shut out. [3]
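The “enclosed system” is really a feedback loop: what you see shapes what you engage with, which shapes what you see next. A toy simulation (the profile, the rounds, and the one-point reinforcement rule are all invented for illustration) makes the runaway effect concrete:

```python
def simulate_feedback(interests, rounds=5):
    """Toy echo-chamber loop: each round the feed surfaces the user's
    single strongest topic, the user engages, and that topic's weight
    grows -- so the dominant topic pulls further and further ahead."""
    for _ in range(rounds):
        top = max(interests, key=interests.get)
        interests[top] += 1  # engagement reinforces the dominant topic
    return interests

# A modest initial lead for politics...
profile = {"politics": 3, "science": 2, "sports": 2}
after = simulate_feedback(dict(profile), rounds=5)
# ...compounds: politics absorbs every round of reinforcement while
# science and sports never get shown, and never get a chance to grow.
```

The starting gap was a single click; after a handful of rounds the other topics are effectively frozen out, which is the echo the name refers to.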
It’s not so different from the old days, when our great-grandfathers read only one kind of newspaper. [1] However, as time went on, the curators [2] and editors [3] of old media realized the powerful effect they had on the world and spent decades developing their ethical foundations. [4] Are algorithms created by corporations equipped with the same ethical foundation? [5.1] Are you limiting your own views? [5.2] Are you learning anything new? [5.3]
Filter bubbles may bear some responsibility for the 2016 election. [1] During the Obama administration, the concerns of the [2.1] American working class in the Midwest [2.2] were ignored and dismissed. [2.3] This led to a hardening of their political stance. [3.3] On the opposing side, liberals believed they held an extraordinary lead [4.2] until the election ballots were counted. [4.4] And, as always in an aggressive campaign, many people felt that [5.2] attacks on their candidate [5.3] were attacks on themselves, [5.4] which led them to retreat into a bubble where only agreeable information could reach them. [5.5]
Personalized information engines are not necessarily a bad thing. [1] They do help filter the massive amount of information online down to what’s important to us. [2] After all, we can’t process all the news affecting people in Syria [3.1], China [3.2], North Korea [3.3], Africa [3.8], Germany [3.9], Kim Kardashian [3.10] and then finally the news of our own country. [4]
It really comes down to being actively aware of the content you see, [1] actively expanding your news sources [2] and balancing your knowledge of the topics that interest you. [3] We are a generation given an easy opportunity to open our minds to other viewpoints. [4] It would be a shame to close yourself off in a little filtered bubble. [5]