Why social media platforms thrive (despite making users worse off)
New research shows social media platforms do not need to addict users to keep them – they just need to make leaving worse than staying
In September 2021, the Wall Street Journal published a series of stories based on internal Meta documents that came to be known as the “Facebook Files.” Among the findings was research Meta had conducted on its own users: the company’s analyses acknowledged that Instagram worsened body image issues for one in three teenage girls and that users themselves blamed the platform for increases in anxiety and depression. The revelations prompted congressional hearings, regulatory scrutiny and public outrage. But they also raised a question that cuts deeper than corporate accountability: if the platform’s own data shows it causes harm, why do hundreds of millions of people keep using it?
New research from UNSW Business School offers an answer, and it has less to do with addiction or willpower than most people assume. The problem, the researchers found, is that users are trapped. They dislike the platform, but the cost of leaving is even worse, because everyone else is still on it. Since 2010, for example, rates of major depression among US teenagers have risen by more than 150%, and the share of 8th, 10th and 12th graders who reported being satisfied with themselves dropped by roughly 10 percentage points. The harm is well documented. The puzzle is why participation persists.
The party nobody wants to attend, but everyone shows up to
The researchers call platforms that harm their users “bad networks.” These networks do not need to trick or addict people into joining. Users feel compelled to participate because the cost of being left out is even worse than the cost of being on the platform. Bad networks are “like parties that people do not wish to attend but feel obligated to go to when others are going.”
The study, Bad Networks, was published in the Journal of Public Economics and co-authored by UNSW Business School’s Professor Robert Akerlof, Scientia Professor Richard Holden and Dr DJ Thornton, a Postdoctoral Fellow at the Manos Institute for Cognitive Economics at UNSW Sydney. Using game-theoretic modelling, the researchers solved for the ‘Nash equilibrium’ – the point at which no individual has an incentive to change their decision, given what everyone else is doing – to show how networks that harm users form, grow and persist.

“The upsides of network effects have been widely studied and are well understood,” said Prof. Holden. “We wanted to better understand the dark side of networks.”
The model identifies “instigators” – people who join early because they gain from visibility and status. Once enough instigators are on a platform, they create pressure for others to follow, “creating a snowball effect.” Even “resistors” who dislike the platform eventually sign up because the social cost of staying away becomes too high.
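The snowball dynamic can be illustrated with a simple threshold cascade. This is a hypothetical sketch, not the paper’s actual model: each person has an assumed threshold for the fraction of peers who must already be on the platform before joining, with instigators at zero and resistors near the top.

```python
# Illustrative sketch (hypothetical thresholds, not the paper's model):
# a few "instigators" with threshold 0 can tip an entire population
# onto a platform, eventually pulling in even reluctant "resistors".

def cascade(thresholds):
    """Let anyone join whose threshold is met; repeat until stable.
    Returns the number of people on the platform at the end."""
    n = len(thresholds)
    joined = [t <= 0.0 for t in thresholds]   # instigators join unconditionally
    changed = True
    while changed:
        changed = False
        share = sum(joined) / n               # current fraction on the platform
        for i, t in enumerate(thresholds):
            if not joined[i] and share >= t:
                joined[i] = True
                changed = True
    return sum(joined)

# Two instigators (0.0), a mixed middle, and two resistors (0.7):
population = [0.0, 0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.7]
print(cascade(population))            # 10 -> everyone joins, resistors included

# Remove the instigators and nobody moves first:
print(cascade([0.1] * 10))            # 0 -> the network never gets off the ground
```

The contrast between the two runs captures the paper’s point: the outcome hinges on whether instigators exist to start the snowball, not on whether most people actually want the platform.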
“High-profile influencers on social media (people with lots of Instagram followers, for instance) are potential instigators – but so are influencers in more localised networks, such as on university campuses,” Prof. Holden explained.
The question of what platforms knew about the harm they caused (and when) has since moved into the courts. A New Mexico jury ordered Meta to pay US$375 million after finding the company violated state consumer protection law by misleading users about the safety of its platforms and failing to protect children from exploitation.
Shortly after, a Los Angeles jury found Meta and Google liable in a separate case, ruling that Instagram and YouTube were designed to addict young users, and awarding US$6 million in compensatory and punitive damages to a woman who said she became hooked on the platforms as a child. Both companies said they disagreed with the verdicts and would appeal.
How the social media rat race drives platform growth
The paper identifies what makes social media prone to becoming bad networks: rat races. On platforms such as Instagram and TikTok, users compete for likes, followers and public signals of status, escalating self-promotion in a contest that delivers no collective benefit. Networks tend to be both harmful and easy to establish when they generate these competitive dynamics.
The model captures this through the “salience of esteem”, or how much a platform makes users care about how they compare to others. When salience is low, as on coordination platforms such as Dropbox or Slack, the network benefits users. When salience is high, as on platforms that display likes, views, and follower counts, competition tips the network into harmful territory.
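The role of salience can be seen in a stylised payoff comparison. The numbers and the linear form below are illustrative assumptions, not figures from the paper: a user’s value from joining is a coordination benefit minus a comparison cost that scales with how salient esteem is on the platform.

```python
# Hypothetical payoff sketch (numbers are illustrative, not from the paper):
# the same network flips from beneficial to harmful as "salience of esteem"
# rises, because the social-comparison cost is scaled up.

def joining_payoff(coordination_benefit, comparison_cost, salience):
    """Net value of being on the network for one user."""
    return coordination_benefit - salience * comparison_cost

# Low-salience coordination tool (e.g. file sharing): user is better off.
print(joining_payoff(5.0, 2.0, salience=0.5))   # 4.0

# High-salience platform with public likes/followers: same benefit, net harm.
print(joining_payoff(5.0, 2.0, salience=3.0))   # -1.0
```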

“When people are judged by their number of followers or likes, there is extreme pressure to post and to participate in the network to keep up with their peers,” said Prof. Holden. “This is the very essence of a ‘rat race’.”
Why platforms profit from making the problem worse
The researchers showed that intensifying the rat race tends to grow the network. As the paper states: “Amplifying the rat race boosts network size, which, while harmful to consumers, may benefit the platform.” Between 2019 and 2021, Instagram ran an experiment hiding public “like” counts, with the stated aim “to make it less of a competition.” Other research has found that removing visible likes reduced negative affect and loneliness. Yet Instagram made the change optional rather than the default.
Another study found that reducing toxic content on social media led to a drop in time spent on platforms and in advertising impressions. While this research was not about social comparison directly, toxic environments may amplify it through shaming, hostile commentary and norm-enforcing harassment. Taken together, the evidence suggests that the features that cause harm also drive the engagement that makes platforms profitable.
Can regulation fix harmful social media networks?
The paper explores whether Pigouvian taxes (charges designed to make users pay for the harm they impose on others) could work. Such taxes can help in some situations, but once a bad network is established, a tax calibrated to the harm each user causes may not dislodge it. Each user has almost no effect on the overall network size, so a small charge does not change their calculation about whether to stay or leave.
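Why a small tax fails can be sketched with a stay-or-leave comparison. The payoff functions below are hypothetical assumptions chosen to illustrate the logic, not the paper’s calibration: once nearly everyone is on the network, the cost of being left out dominates, so a tax sized to the marginal externality barely moves the decision.

```python
# Illustrative sketch (hypothetical payoffs, not the paper's calibration):
# a small Pigouvian tax does not dislodge an established bad network,
# because the cost of being left out dominates the user's calculation.

def prefers_to_stay(share_on_network, tax):
    """Does one user prefer staying on the platform, given peers' adoption?"""
    payoff_on = -1.0 + 4.0 * share_on_network - tax   # disutility of use, offset by inclusion
    payoff_off = -3.0 * share_on_network              # social cost of being left out
    return payoff_on > payoff_off

# Everyone else is already on (share = 1.0). A small tax changes nothing:
print(prefers_to_stay(1.0, tax=0.5))   # True  -> the bad network persists

# Only a tax large enough to wipe out the inclusion benefit tips users off:
print(prefers_to_stay(1.0, tax=6.5))   # False -> the bad equilibrium collapses
```

Under these assumed payoffs, any tax below 6 leaves staying optimal at full adoption, which mirrors the paper’s conclusion that dislodging an established bad network can require a far more extreme intervention than correcting the marginal externality.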

As the paper notes: “Achieving the socially preferred outcome may require a more extreme policy – a tax high enough to destroy the bad equilibrium altogether, or an outright ban on the network – rather than merely correcting the marginal externality at the existing network size.” Australia’s legislation banning social media for users under 16 is an example of a targeted policy that removes the instigators needed to tip a peer network into harmful territory.
“Australia has been a world leader in age verification for social networks – though how effective that policy is remains to be seen,” Prof. Holden observed. “The broader challenge is limiting bad networks through taxes on usage or directly on algorithms that cause harm to users.”
What business leaders and policymakers should take from this
The research reframes the debate around social media harm. The problem is not that people lack willpower. The core issue is a coordination failure: users are trapped on platforms they would collectively prefer to leave, because no single person can change the outcome on their own.
Platform design choices matter. Features that amplify social comparison, such as public like counts, follower metrics, and algorithmic promotion of content based on engagement, intensify the rat race that makes networks harmful and harder to leave. Organisations developing community platforms should consider whether their design choices create competitive dynamics that work against user interests.
For governments, incremental interventions may prove insufficient once a harmful network has reached a certain size. Measures such as age-based restrictions, mandated design changes or revenue-based taxes may be needed.