From Above the Noise.
Why do you see what you see online and on social media, and how can that influence your identity and behavior? Co-produced with @Common Sense Education.
*Why do you see what you see online and on social media?*
The content you see on a given social app or search engine isn't random: it's highly curated, and it's curated by recommendation algorithms. Recommendation algorithms are essentially the computer instructions a social app or search engine follows to decide what to show you.
*How do recommendation algorithms work?*
Basically, these algorithms are designed to keep you on a given social app for as long as possible. They do that by learning what you like and showing you more of that kind of content. They consider a bunch of signals when deciding what to show you. For instance, they collect data on what you're watching, clicking, liking, commenting on, sharing, and buying, and even where you live. They also consider what everyone else is liking and watching. But exactly how all of these signals are weighted and ranked to produce the content in your feed is top secret, and companies are constantly tweaking and changing their algorithms.
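For curious students, the basic idea of "learn what you like, then rank content by it" can be sketched in a few lines of code. This is a toy illustration only: the signal names, weights, and formula below are invented for teaching, not any platform's real (and secret) algorithm.

```python
# Toy recommendation sketch: rank posts by a mix of this user's own
# interests and overall popularity. All names and weights are made up.

def score(post, user):
    """Predict how engaging a post will be for one user."""
    # How much this user has engaged with this topic before (0 to 1)
    personal = user["topic_affinity"].get(post["topic"], 0.0)
    # How much everyone else is engaging with the post
    popularity = post["likes"] + 2 * post["shares"] + post["comments"]
    # Real platforms combine many more signals with secret weights.
    return 0.7 * personal * 100 + 0.3 * popularity

def build_feed(posts, user, n=3):
    """Show the n posts predicted to keep this user on the app longest."""
    return sorted(posts, key=lambda p: score(p, user), reverse=True)[:n]

user = {"topic_affinity": {"cats": 0.9, "news": 0.1}}
posts = [
    {"topic": "cats", "likes": 10, "shares": 1, "comments": 2},
    {"topic": "news", "likes": 500, "shares": 50, "comments": 80},
    {"topic": "cats", "likes": 3, "shares": 0, "comments": 1},
]
feed = build_feed(posts, user)
```

Even in this tiny sketch you can see the trade-off the episode describes: a post that is popular with everyone can outrank a post matched to your personal interests, and tweaking the two weights changes what "wins" your feed.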
*Why do social apps use recommendation algorithms?*
There's a ton of content out there, so recommendation algorithms sift through it and show us what they predict is most relevant to us. In the end, social apps and YouTube want to keep you on the platform as long as possible so they can show you more ads and make more money, and they do that by serving up whatever they think will keep you on the app the longest.
*What’s dangerous about recommendation algorithms?*
Recommendation algorithms can trap users in echo chambers or filter bubbles, where you're served content that just reinforces what you already believe. This is particularly true when it comes to news and politics, and it has been cited as one reason for increased political polarization in America. These algorithms can also spread misinformation, disinformation, and propaganda: emotionally charged content tends to go viral because lots of users engage with it, regardless of whether it's true. There are also reports of users getting sucked into radicalization rabbit holes as algorithms serve up more and more extreme content.
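The filter-bubble effect described above is a feedback loop, and a loop is easy to simulate. The sketch below is an illustrative assumption, not real platform code: each round, the algorithm shows the viewpoint it estimates the user likes most, the user engages, and that estimate grows, so the feed narrows even when the user started out nearly balanced.

```python
# Toy filter-bubble loop (all numbers are illustrative assumptions):
# engagement with the shown topic raises its estimated interest,
# which makes it even more likely to be shown next time.

interest = {"viewpoint_a": 0.55, "viewpoint_b": 0.45}  # nearly balanced

for _ in range(10):
    shown = max(interest, key=interest.get)  # algorithm picks the current favorite
    interest[shown] += 0.1                   # engagement reinforces the estimate

# After ten rounds, viewpoint_a dominates and viewpoint_b never got shown.
```

The point for students: the loop never checks whether the content is true or balanced, only whether it gets engagement, which is how a small initial lean can snowball into an echo chamber.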
*Sources:*
Facebook's Algorithms Fueled…. (The Conversation)
The Social Media Echo Chamber Is Real (Ars Technica)
Algorithms in Social Media Platforms (Internet Justice Society)
How TikTok Reads Your Mind (NY Times)
For You Page: TikTok and Identity (Debating Networks and Communities Conference IX)
Get your students in the discussion on KQED Learn, a safe place for middle and high school students to investigate controversial topics and share their voices: https://learn.kqed.org/
Check out Common Sense Education’s Digital Citizenship Curriculum: https://www.commonsense.org/education/
KQED serves the people of Northern California with a public-supported alternative to commercial media. An NPR and PBS member station based in San Francisco, KQED is home to one of the most listened-to public radio stations in the nation, one of the highest-rated public television services, and an award-winning education program helping students and educators thrive in 21st-century classrooms. A trusted news source, leader, and innovator in interactive technology, KQED takes people of all ages on journeys of exploration — exposing them to new people, places, and ideas.
Funding for KQED Education is provided by the Corporation for Public Broadcasting, the Koret Foundation, the William and Flora Hewlett Foundation, the AT&T Foundation, the Crescent Porter Hale Foundation, the Silver Giving Foundation, Campaign 21 donors, and members of KQED.
0:55 What are recommendation algorithms?
2:00 How social media algorithms work
4:21 Pros of recommendation algorithms
4:51 Dangers of recommendation algorithms
7:10 Tips to make recommendation algorithms work for you