After watching the trailer for a new series with Bill Gates, I felt compelled to revisit his predictions from 1995 through a new lens. We will review how social media platforms and their algorithms can reinforce our existing beliefs, create filter bubbles, and influence our decision-making. These insights take us on a journey from mass-media advertising to hyper-personalised content; along the way, we'll meet some fascinating people and explore how group dynamics and influencers shape our opinions and behaviours. By taking a closer look at how these algorithms work, the essay shows how such systems tend to reinforce certain ideas while making it harder for diverse viewpoints to be heard.
In the current digital age, the rapid spread of misinformation is a cause for concern, but it also offers intriguing insight into the way we consume and share information. What started as a personal observation soon led me on a journey of discovery, aimed at gaining a deeper understanding of the forces driving this phenomenon. In particular, I wanted to investigate whether users on these platforms inadvertently attribute credibility to certain posts or sources simply because they align with their existing beliefs. As I explored the literature on this topic, I came across the concept of 'illusory correlation', which seemed to offer a potential explanation. Through a careful review of scientific studies and articles, I found evidence supporting the claim that cognitive biases, amplified by algorithms, may play a role in the spread of misinformation.
We then examine how social media platforms use decision-making models and psychological biases to influence what users do. The essay traces the evolution of decision theory, from expected utility theory to prospect theory, alongside the evolution of manipulation techniques, from TV advertising to the sophisticated algorithms used by social networks. Concepts such as 'illusory correlation' and 'delay of gratification' help explain how users are nudged towards specific ideologies without realising it. Economic models like the Investment-Payoff Model show why companies prioritise engagement over diverse viewpoints, ultimately creating echo chambers. The discussion then turns to the ethical issues surrounding these manipulations and questions the responsibility of social media corporations in shaping public discourse. Finally, we consider how users can counteract algorithmic influence and reclaim critical thinking in the digital age.
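To make the contrast between the two decision theories concrete, here is a minimal sketch, not taken from the essay itself: it compares classical expected utility with prospect theory's value function for a simple 50/50 gamble. The parameters `alpha`, `beta`, and `lam` are the estimates Tversky and Kahneman reported in 1992, used here purely for illustration, and the sketch omits prospect theory's probability-weighting component for simplicity.

```python
def expected_utility(outcomes, probs, utility=lambda x: x):
    """Classical expected utility: the probability-weighted utility of outcomes.
    With a linear utility function, the agent is risk-neutral."""
    return sum(p * utility(x) for x, p in zip(outcomes, probs))

def prospect_value(outcomes, probs, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect theory's value function: gains and losses are valued
    asymmetrically relative to a reference point, with losses looming
    larger than equivalent gains (loss aversion, lam > 1).
    Parameter values are Tversky & Kahneman's (1992) estimates."""
    def v(x):
        return x ** alpha if x >= 0 else -lam * ((-x) ** beta)
    return sum(p * v(x) for x, p in zip(outcomes, probs))

# A fair coin flip: win 100 or lose 100, each with probability 0.5.
gamble = ([100, -100], [0.5, 0.5])

print(expected_utility(*gamble))  # 0.0 -- a risk-neutral agent is indifferent
print(prospect_value(*gamble))    # negative -- loss aversion makes the gamble unattractive
```

The point of the sketch is the sign of the second result: although the gamble is fair in expectation, loss aversion makes it feel like a bad deal, which is one lever engagement-driven systems can pull when framing choices for users.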