Short-Form Video Most Detrimental To Young Women’s Mental Health

Short-form videos on tech platforms that reflect unrealistic depictions of physical appearance are harmful to young women’s mental health, and the impacts are worse if the content is perceived as unedited and natural, according to new research published in the scientific journal Body Image.

The research suggests that just a little exposure to short-form social media videos reflecting unattainable appearance standards – known as appearance-ideal content – is enough to have a negative impact on body image. Appearance ideals are what our culture tells us about how we should aspire to look.

The study investigated the impact of viewing appearance-ideal short-form social media video content on young women’s appearance satisfaction, negative mood and self-objectification.

While these standards are not necessarily new, social media is an environment where users tend to view and show the most attractive versions of themselves and others. TikTok, which exclusively utilises short-form video content, can be an especially appearance-focused environment, with many viral trends like dance challenges reflecting widely held societal conceptions of beauty.

“Appearance-ideal content can pressure women to look a certain way that is unrealistic or completely unattainable,” said Dr Jasmine Fardouly, senior author of the study from the School of Psychology at UNSW Science.

“We know this starts early, with girls as young as six reporting unhappiness with their bodies, desires to look thinner, and even dieting to lose weight.”

Appearance dissatisfaction, frequently reported among young women, is associated with adverse mental health outcomes, including depression, and is a risk factor for some clinical eating disorders.

“Social media isn’t the only place where these appearance ideals are promoted, but there is a lot more opportunity to internalise them through the platforms,” Dr Fardouly said.

Negative impacts on body image

For the study, the researchers showed 211 women aged 17 to 28 a set of ten images or videos selected from the Instagram and TikTok accounts of young female social media influencers that reflected societal appearance ideals. They then surveyed the participants on several body image measures, using appearance-neutral content – content without people – for comparison.

“We found that appearance-ideal short-form video content on social media, regardless of the medium, can have adverse effects on appearance satisfaction, negative mood, and self-objectification among participants,” said Jade Gurtala, lead author of the study.

The research also found participants made the same number of upward appearance comparisons whether they viewed ideal content in images or video. In other words, they compared their own bodies to the women in the appearance-ideal content, judging themselves as less attractive, which negatively impacted their mood and increased body dissatisfaction.

“The total exposure time was only like a minute and a half, and we found that was enough to have harmful impacts,” said Dr Fardouly.

“That was just in a lab-based setting, so it’ll be interesting to measure the impact of exposure over the long-term and whether that has some cumulative effect.”

Emergence of editing and enhancements

Appearance ideals promoted through social media are often enhanced and edited using manipulation techniques like hyper-realistic face and body filters that are becoming harder to detect, particularly with video.

While the exact nature of any enhancements applied to the content in the study was unknown, participants perceived the video content they viewed to be less edited and enhanced than the images, and reported lower satisfaction with their own appearance after viewing it.

“If appearance-ideal video content is perceived as unedited and unenhanced when in reality it is, then users may be more likely to engage in negative social comparisons and internalise the appearance ideals,” Gurtala said.

“So, viewing ideal video content may be more harmful than viewing ideal image content for some users.”

On average, study participants reported spending two to three hours on social media each day. The researchers say reducing daily screen time or diversifying the type of content we consume may help minimise overall exposure to appearance-ideal content.

“There’s also a role for the platforms, which can have very pervasive algorithms that promote appearance ideals and keep users engaged, to help expand the range of content shown to users in their social media feeds,” Gurtala said.

Further research is also needed to determine if there are potential positive effects of viewing short-form video content featuring diverse and unedited bodies.

“Some evidence suggests image-based content that challenges these beauty ideals and promotes body positivity, body function, and body acceptance helps to make social media a less harmful environment for body image,” Dr Fardouly said.

In the meantime, the findings can inform media literacy guides that play a significant role in educating young women about the impacts of social media use on body image and countering unrealistic representations of appearance.

“It’s important to update these educational body intervention programmes given the emerging evidence around the negative impacts of appearance-ideal video content, especially as it evolves and becomes a more dominant medium on social media,” Dr Fardouly said.

Jed Horner, TikTok country policy manager, trust & safety, said the platform was “an inclusive and body-positive environment and we do not allow content that depicts, promotes, normalises or glorifies eating disorders.”

He also said that the wellbeing and safety of the app’s community were its “main priorities” and pointed to the safeguards it has in place.

“This includes banning ads for fasting apps and weight loss supplements and putting restrictions on ads that promote harmful or negative body image. We have a longstanding relationship with the Butterfly Foundation and anyone searching for terms directly relating to eating disorders is shown information from the foundation including helpful and appropriate advice.

“We will continue to remove content and accounts that are in violation of our Community Guidelines and make it easy for users both on and off platform to report content that they believe may be in violation of these rules.”

When asked to comment, YouTube directed us to its safeguarding policies. 

“We worked with experts such as the National Eating Disorder Association (NEDA) and Asociación de Lucha contra la Bulimia y la Anorexia (ALUBA) to develop a comprehensive framework that involves expanding the scope of our Community Guidelines, age-restricting certain videos, and surfacing crisis resource panels under videos discussing eating disorders,” according to the platform’s community guidelines.

“We’ve long had policies to remove content that glorifies or promotes eating disorders. Moving forward, we’ll be updating our Community Guidelines to also prohibit content about eating disorders that features imitable behaviour, or behaviour that we worked with experts to determine can lead at-risk viewers to imitate. This could include videos that show or describe:

  • Disordered eating behaviours, such as purging after eating or severely restricting calories
  • Weight-based bullying in the context of eating disorders

“In developing the new policies, we worked closely with NEDA and other groups to enhance understanding of what constitutes imitable behaviour, how it can show up in content, and how it can impact vulnerable viewers.

“Context will be key when it comes to this often nuanced content, and that’s why our approach relies on product features in addition to policies. Videos that are centred on eating disorder recovery or include sufficient educational, documentary, scientific or artistic context (EDSA) may receive an age restriction and/or a crisis resource panel.” 

WLT contacted Meta for comment but received no reply by the time of publication.
