Why Are Gruesome Photos of a Murdered 17-Year-Old Girl Still All Over Instagram?

The brutal death of Bianca Devins, a 17-year-old from Utica, New York, and aspiring influencer who was allegedly murdered by a jealous male friend on her way home from a concert, shattered many on the internet, who mourned Devins’ passing with the hashtag #ripbianca on Monday. The fact that her suspected killer, 21-year-old Brandon Clark — who was charged with second-degree murder — took photos of Devins’ dead body and posted them on Instagram and Discord made her death all the more horrific, as did the fact that both Clark’s account and the images of her body remained viewable for hours after they were initially posted.

To make matters worse for Devins’ family, many on Instagram are using her death as an opportunity to troll for new followers, promising to send the photos via DM to anyone who follows their accounts. This is in spite of the fact that law enforcement officials have explicitly asked people to stop sharing the images, and that Devins’ stepmother has posted a plea to Facebook begging people to stop. “I will FOREVER have those images in my mind when I think of her. When I close my eyes, those images haunt me,” she wrote. “How about we have some fucking consideration for her Mother, Sister, Step sisters and brother, Step Mother and Step Father, her Grandparents, Aunts, Uncles, Nieces and Nephews and her friends. How about we have some fucking consideration FOR HER!!!”

Such cynical death profiteers notwithstanding, many have criticized Instagram and other social media platforms for being slow to take down the photos, with some saying that they reported the images only for Instagram to tell them that they did not violate its community guidelines. It’s an endless game of Whack-a-Mole that has spawned a small cottage industry of de facto social media censors grabbing mallets, trying to report each new account as it pops up, only to see five more come up in its place.

“I thought if I reported it it would just be taken down, hence why I was so shocked that it was still up in the morning,” says Lauren MacMillan, a 19-year-old who received a message from Instagram this morning saying the photos of Devins that she flagged did not violate Instagram’s community guidelines. She says she’s shocked by how easy it has been to find the images. “There are some [young] kids on Insta and they really do not need to be seeing it, it could damage them,” she says. “No one deserves to be blatantly disrespected like that.”

“I keep checking her tagged photos on Instagram and there’s a new one every five minutes,” says Anna Russett, an influencer and creative consultant who estimates she has flagged between 10 and 15 accounts over the past 24 hours. “It seems like all of the reported posts are eventually taken down. But they just keep popping up under new accounts.” She says that while Instagram removed some of the images she flagged immediately after she complained about them, she has also received messages saying the platform will leave the content up with a sensitive-content warning, and others saying the photo does not violate community guidelines at all. She has also seen cropped and adjusted versions of the explicit photos, as well as the images sandwiched between other, more innocuous images, as a way of evading detection by Instagram’s algorithm. There doesn’t appear to be any “exact logic” guiding the decision-making process, she says.

To a degree, it’s not shocking that it would be challenging for Instagram — which says it has a billion users per month — to keep up with the proliferation of the images. Giant social media platforms have long struggled to scrub hateful or violent content. This past March, for instance, Instagram’s parent company Facebook, along with YouTube, was criticized for failing to quickly delete uploaded and livestreamed footage of the New Zealand mosque shootings, which killed 51 people. Instagram in particular has garnered criticism for rigorously applying some of its community guidelines, such as those prohibiting nudity or sexualized content, while letting far more objectionable content, such as hate speech or violent images, remain on the platform.

In this sense, Instagram’s handling of the Devins photos is emblematic of a wider problem, but Russett says this is a particularly egregious example considering the elements of the case: Devins was an active 4chan user and a so-called “e-girl,” or an aspiring influencer who co-opted aspects of gaming culture in her Instagram persona, which has prompted many 4chan trolls to viciously mock her and share memes about her death. “There’s so much more violence and abuse around gamer girls than others it seems,” she says. The fact that Instagram’s community guidelines specifically make room for violent content when it’s explicitly intended to “condemn, raise awareness or educate” others about violence has also seemingly created a loophole for trolls to share the images: many captions accompanying the posts make reference to wanting to raise awareness of Devins’ death in the same breath as joking about it or requesting more followers.

In a statement to Rolling Stone, Instagram initially said it has “taken steps to prevent others from re-uploading the content posted to that account to Instagram” by using technology to flag the images before others can share them. Regarding complaints from users that they had flagged the photos only to receive messages from Instagram saying that they did not violate community guidelines or that they had merely been given a sensitive-content warning, the spokesperson said: “The image violated our policies and we have removed it.” When asked why users received messages saying the photo did not violate community guidelines when it did, the spokesperson declined to comment further, saying only that the images had been removed.
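Instagram has not explained how that flagging technology works. The general approach platforms describe for recognizing re-uploads of a known image is fingerprinting, or perceptual hashing: reduce each image to a compact signature and compare new uploads against the signatures of banned content. The sketch below is a minimal, generic illustration of that idea — not Instagram’s actual system, and every file name and threshold in it is hypothetical — but it shows why an exact copy is easy to match while the cropped and recomposed versions users describe can slip through.

    from PIL import Image  # Pillow; used here only to shrink and grayscale images


    def average_hash(path, hash_size=8):
        """Shrink the image, grayscale it, and record which pixels sit above
        the mean brightness. The result is a compact 64-bit fingerprint."""
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for i, value in enumerate(pixels):
            if value > mean:
                bits |= 1 << i
        return bits


    def hamming_distance(a, b):
        """Count how many bits differ between two fingerprints."""
        return bin(a ^ b).count("1")


    # Hypothetical usage: an exact re-upload of a banned image matches its stored
    # fingerprint almost perfectly, but cropping the photo, recomposing it, or
    # sandwiching it inside a collage shifts enough bits that the distance can
    # exceed whatever threshold the matcher uses, and the upload goes undetected.
    # banned = average_hash("flagged_image.jpg")
    # candidate = average_hash("new_upload.jpg")
    # if hamming_distance(banned, candidate) <= 5:
    #     print("likely a re-upload of known violating content")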

Instagram is far from the only platform that has failed to adequately address the dissemination of the images. On Twitter, for instance, the photos are also being widely shared, with some users actively spamming the #ripbianca hashtag with them. One tweet that Russett says she flagged at 8:00 p.m. on Monday was still publicly available as of press time. (Twitter did not immediately respond to a request for comment from Rolling Stone.)

But given how shrouded in secrecy Instagram’s algorithm is, it’s difficult to know exactly how the platform is monitoring such content, though it’s clear that whatever it’s doing, users are finding ways around it to post the gruesome images over and over again. Russett says there are steps Instagram could take to at least make this more difficult, such as preventing Devins’ Instagram page from being tagged in other photos and videos. The spokesperson for Instagram said that the photo was not immediately visible in Devins’ tagged feed and that her account had been memorialized, which prevents people from making changes to existing information or posts on a deceased person’s profile but does not stop others from tagging their own photos and videos to it. When Rolling Stone noted that a photo of Devins’ body in which she was tagged had been posted seven minutes earlier and was clearly visible in her tagged feed, the spokesperson said, “We’re looking into the tagged photos now and will keep you posted.”

For now, it seems, Instagram is largely relying on good Samaritans such as Russett and MacMillan reporting the images, as well as on people trying to flood Devins’ profile and the hashtags memorializing her with images of puppies, kittens, and pastel-pink clouds to drown out the gruesome photos. Some are even begging whoever is controlling Devins’ account to disable it, or urging her followers to report the account and have it removed. “It’s just so wrong and I don’t understand why people are reposting it,” says MacMillan. “I have a brother the same age as Bianca and I know how devastated I would feel if this was happening to me. It’s just making everything harder for her family.”
