Google refuses to remove website responsible for dozens of suicides

A list of resources is available from the Suicide Prevention Resource Center.

There is an extremely disturbing site on the internet where users encourage each other to commit suicide – and, according to Google, there is nothing it can do to remove the site from its search results.

A gut-wrenching deep dive by The New York Times explores the site, which we will not name here. The story raises tough questions about both ethics and censorship – and particularly about Google, which has chosen to passively condemn the site while allowing it to remain a top hit in its search results.

Equal parts gruesome bulletin board and instruction manual, the site considers itself “pro-choice” – that is, it holds that people have the right to die by suicide and to access information about ways to do it, alongside a community that will help them do it without judging them or trying to keep them alive.

In the roughly two years the site has been online, by the Times’ count, at least 45 users have died by suicide – and probably many more. Many of them learned how on the site, got “support” from other users when their resolve to end their lives wavered, and even live-blogged their deaths.

Run by two men in their twenties who live thousands of miles apart, in Alabama and Uruguay, the site was born after Reddit shut down a forum with the same mission. The two operators were previously known only by pseudonyms, but were unmasked by the Times.

It should be noted that there is a raging debate over assisted suicide, in which terminally ill patients can access treatments to end their lives. That conversation is as lively as it is tense, stretching from the Kevorkian machines of the 1990s to the states and countries that have moved to legalize physician-assisted death.

The site at the center of the Times’ investigation, however, should not be part of this debate. Its users are mostly otherwise healthy people for whom the decision to end their lives is almost certainly a grave miscalculation – a distinction the Times makes clear by excluding physician-assisted deaths from its charts mapping the steep rise in suicides over the past two decades.

Regardless of one’s personal beliefs on euthanasia and suicide – hell, regardless of one’s beliefs on censorship – the concept of arming a group of sick people with specific information and support to end their own lives is disturbing.

In many ways, this cursed suicide site represents the latest in a long line of problematic online material that gives people information about all sorts of horrible things, from pro-anorexia blogs and ineffective COVID-19 treatments to forums for white nationalists and “involuntary celibates,” known as incels.

All of these topics have prompted calls for tech companies like Google and Facebook to censor particularly horrific online content.

Often, they comply. Facebook, for example, is usually quick to try (and repeatedly fail) to filter out harmful material.

The situation at Google, and its parent company Alphabet, is more complex. Although it has removed medical misinformation and white supremacist content hosted on YouTube, it takes a more hands-off approach to the content it indexes on the open web, as evidenced by this week’s controversy.

“This is a deeply painful and difficult issue, and we continue to focus on how our products can help those in vulnerable situations,” a Google spokesperson told Futurism. “If people come to Google to search for information about suicide, they see features promoting prevention hotlines that can provide essential help and support.”

“We have specialized ranking systems designed to prioritize the highest quality results available for self-injury queries, and we also block autocomplete predictions for these searches,” the spokesperson continued. “We balance these safeguards with our commitment to giving people open access to information. We are guided by local law when it comes to important and complex questions about what information people should be able to find online.”

This is a fairly absolutist position. The First Amendment may protect free speech for neo-Nazis and suicide advocates, but tech companies are not governments. They can, in principle, remove whatever they want.

Google’s slogan used to be “Don’t be evil.” It removed that phrase from the company’s code of conduct in 2018 – and indeed, it seems there’s a little room for evil in the company’s search results.

Updated to clarify that although Google has removed content hosted on YouTube, it does not have a history of de-indexing controversial content on its search engine.

READ MORE: Where the Despairing Log On, and Learn Ways to Die [The New York Times]

Learn more about assisted suicide, which this site decidedly is not: Assisted Suicide Pod Approved by Authorities in Switzerland
Sherry J. Basler