Teenager’s suicide drags Meta and Snap to court

Teenage girls accuse Instagram of raising anxiety and depression levels (Getty)

American families have turned to the judiciary in an attempt to hold large social media companies accountable, claiming that their platforms played a key role in the suicides of their children. The suits target in particular Meta, the parent company of Facebook and Instagram, and Snap, the parent company of Snapchat.

The family of Christopher James Dowley recently joined this battle against the social media giants. Dowley was 14 when he opened his Facebook, Instagram and Snapchat accounts. Like other teenagers, he began documenting his daily life on these platforms, but during high school his parents noticed that he had become addicted to them. His mother told CNN on Tuesday that in his final year he “could not take his eyes off the phone.” He stayed awake until dawn exchanging messages with others on Instagram, conversations that sometimes included nude photos, and he began to suffer from sleep disturbances and an obsession with body image.

On January 4, 2015, while the Dowley family was taking down its Christmas tree and New Year decorations, he went up to his room. He sent a text message to his friend and wrote on his Facebook account: “Who turned off the light?” Holding a .22-caliber rifle in one hand and a smartphone in the other, he shot himself. He was 17 years old. Police found a suicide note he had written on the envelope of his university admission letter. His mother told CNN: “When we found him, his phone was still in his hand, stained with blood. He was so addicted that even his last moments of life were spent posting on social media.”

The lawsuit, filed by the Dowley family last week, targets Snap, the owner of Snapchat, and Meta, the owner of Facebook and Instagram, accusing them of building their platforms on algorithms designed to keep people using them longer and thus generate more profit. It argues that the companies exploit the way minors make decisions and process emotions owing to their “incomplete brain development.” Dowley’s mother and father believe the addictive nature of social media platforms directly affected their son’s mental health. They say they were encouraged to take legal action after former Facebook employee Frances Haugen leaked internal data revealing that the company was aware of the damage its Instagram platform was causing to teenagers’ mental health and body image. “This lawsuit is not about winning or losing,” Dowley’s mother told CNN. “We are all losing now. But if we can force them to change the algorithm for one child – if we can save one – it’s worth trying.”

Haugen made public statements and testified before Congress last fall about how Facebook’s algorithms push new users toward harmful content, such as posts about eating disorders and self-harm, and about how the platforms are designed to keep users hooked. At the time, Facebook CEO Mark Zuckerberg wrote a 1,300-word post claiming that Haugen had taken the internal documents out of context and painted a “misrepresentation of the company.”

Haugen’s leaks have prompted U.S. lawmakers on both sides of the aisle to scrutinize the tech giants. A bipartisan bill introduced in the Senate in February proposes giving technology platforms new and explicit responsibilities for protecting children from digital harm. President Joe Biden has also urged lawmakers to “hold social media platforms accountable for the experiments they are conducting on our children across the country in search of profit.”

Dissatisfied with these measures, some families have turned to the courts in an effort to force technology companies to change the way they operate. The Dowley family’s attorney, Matthew Bergman, founded the Social Media Victims Law Center last fall after the Facebook leaks were published. He now represents 20 families suing social media companies for wrongful death, according to CNN. Among them are the family of a boy who took his own life in 2019 at the age of 16 while using Snapchat, and the family of a girl who took her own life last year at the age of 11 after two years of addiction to social networks, specifically Instagram and Snapchat. Although both platforms set a minimum age of 13 for users, this does not prevent younger children from creating accounts.

Snap said it could not comment on active litigation, but told CNN that it takes the mental health of its users seriously and that it is constantly exploring additional ways to provide support. Meta also declined to comment on the case, but noted that it is currently rolling out a range of suicide-prevention tools, such as automatically surfacing support resources for a user when a friend or its artificial-intelligence systems detect a suicide-related post.

In the months since the internal documents came to light, Instagram has introduced a series of safeguards aimed at protecting younger users, including a tool called Take a Break that encourages people to take time away from the platform. It also introduced a tool that allows parents to see how much time their children spend on Instagram and to set time limits. Last month, dozens of state attorneys general wrote a letter to TikTok and Snap urging them to upgrade existing tools and to work more closely with third-party monitoring apps that can alert parents if children use language suggesting self-harm or suicide.

Several U.S. states launched a joint investigation last November to determine whether Instagram’s parent company deliberately allowed children and teens to use its social network even though it knew doing so might harm their mental and physical health. In May, prosecutors in 44 states sent a letter to Zuckerberg urging him to abandon a project to launch a version of Instagram intended for children under the age of 13. In the wake of Haugen’s revelations and the ensuing wave of widespread condemnation, Facebook announced last September that it was suspending the project.

In the same month, Instagram’s managers disclosed their efforts to curb teenage girls’ obsession with an ideal body image, after the Wall Street Journal reported that the company knew from its own research that the app poses risks to the mental health of the tens of millions of young people who open it every day, but played down those risks. According to the report, a slide shown at an internal company meeting in 2019 stated that the app “makes body image issues worse for one in three teenage girls,” while another slide, summarizing a study of girls facing this type of problem, said that “teenage girls blame Instagram for increasing levels of anxiety and depression.” The platform has since strengthened protections for young users: since July, accounts created by people under 16 (or under 18 in some countries) have been private by default, with existing users encouraged, but not required, to make the same choice.