Three years ago, Meta, the parent company of Facebook and Instagram, began collaborating with outside researchers to study its platforms' impact on the 2020 U.S. election. The first batch of results, published in Science and Nature, examines how the platforms' algorithms shaped what users saw in their feeds during the election.
These findings matter for Meta, which has previously been criticized for limiting independent researchers' access to its data. Nick Clegg, Meta's policy chief, suggested the studies show the platforms may have less influence on political beliefs than is commonly assumed.
The initial findings, however, are more nuanced. One study examined "echo chambers," in which exposure to like-minded content reinforces existing beliefs. Reducing the amount of like-minded content in users' feeds lowered engagement with it but did not measurably change political attitudes, suggesting the effects are more complex than the echo-chamber narrative implies.
Another study compared the default algorithmic feed with a reverse-chronological one. The algorithmic feed substantially shaped users' experience, affecting how much they engaged and what content they encountered, including their exposure to politically moderate and untrustworthy sources. Together, the studies indicate that Facebook's and Instagram's ranking algorithms play a significant role in shaping users' interactions with content. While Clegg argues the platforms are not solely responsible for polarization or shifts in political attitudes, the research points to the intricate ways these algorithms affect what users see and how diverse that content is.
In short, the partnership between Meta and independent researchers has produced thought-provoking insights into Facebook's and Instagram's role in the 2020 election. The research underscores the complexity of feed algorithms, echo chambers, and content exposure, and the need for continued study of how social media platforms shape society and political discourse. As debate over digital platforms' influence continues, these findings mark an important step toward greater transparency and accountability in the tech industry.