Plenty of users take what they read online at face value, as social experiments have shown. The average user rarely checks facts or considers whether a source is credible. “You look at a Wikipedia article and assume that it all must be true,” said Christo Wilson, a computer science professor at Northeastern University who has researched algorithms and personalization extensively. “Or you search for something on Google and think the results are objective and correct off the bat.” On top of that, every social network and search engine runs algorithms that serve users personalized, and ultimately skewed, results. Even to researchers, these algorithms remain a mystery.
Considered trade secrets, algorithms are closely guarded. Researchers have a general understanding of how they work, but companies change them constantly to maintain a competitive advantage, so there is no way of knowing for sure exactly how Google, Facebook, Twitter, Instagram or any other online platform’s algorithms work.
“News-filtering algorithms narrow what we know, surrounding us in information that tends to support what we already believe,” Eli Pariser, CEO of Upworthy, wrote on Medium, referencing his experience clicking links and then seeing related content in his feed.
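The feedback loop Pariser describes can be sketched as a toy ranker. This is purely illustrative: no platform publishes its actual algorithm, and real systems use far richer signals, but the basic dynamic is the same, in that clicks on a topic boost that topic's future ranking.

```python
from collections import Counter

def rank_feed(candidate_posts, click_history):
    """Toy personalization: rank posts by how often the user
    has clicked posts on the same topic. Each click makes
    similar content rank higher next time -- the narrowing
    loop Pariser describes."""
    topic_counts = Counter(post["topic"] for post in click_history)
    return sorted(candidate_posts,
                  key=lambda post: topic_counts[post["topic"]],
                  reverse=True)

# A user who mostly clicks politics sees politics pushed to the top.
clicks = [{"topic": "politics"}, {"topic": "politics"}, {"topic": "sports"}]
feed = [{"id": 1, "topic": "science"},
        {"id": 2, "topic": "politics"},
        {"id": 3, "topic": "sports"}]
print([p["id"] for p in rank_feed(feed, clicks)])  # → [2, 3, 1]
```

Because the ranking feeds on past clicks, the user's own behavior steadily filters out unfamiliar topics, even with no editorial intent on the platform's part.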
Now that the 2016 presidential election is a regular part of the news cycle and users are sharing their opinions daily, let’s dive into all the ways, subtle and overt, that the most-used social network, Facebook, and the most-used search engine, Google, affect our political beliefs.
Full Article: How Facebook and Google’s Algorithms Are Affecting Our Political Viewpoints | Megan Anderle.