2024-12-07 01:27:00
This is Behind the Blog, where we share our behind-the-scenes thoughts about how a few of our top stories of the week came together. This week, we talk about health insurance.
EMANUEL: Publicly traded companies have to disclose who their CEO is and what they are getting paid to the SEC because as publicly traded companies they owe shareholders and potential shareholders a degree of transparency about the company they are investing in and doing business with.
UnitedHealth Group, whose CEO was gunned down in the street this week, is a publicly traded company, as is the parent company of health insurer Anthem Blue Cross Blue Shield, which, as Sam reported last night, is one of a number of health insurance companies that took down the “leadership” pages on their sites that name and show their CEOs and other top executives.
I’m not going to jump into the fray here about the morality of murdering the CEO of a company that greedily makes life-and-death decisions that haunt countless people and families for the rest of their lives, other than to note that clearly a large segment of the public has responded to it with a certain sense of righteous glee. What I think is interesting is the decision of these companies to now try to hide their leadership teams. Obviously, this is a pragmatic choice by whatever person or team is now responsible for their safety, but it also highlights one of the many hypocrisies that I believe makes people feel okay celebrating someone’s murder.
2024-12-07 00:18:06
It seems like the entire internet is celebrating the assassination of UnitedHealthcare CEO Brian Thompson. But social media managers and moderators seem to be struggling to tamp down the revelry to stay within platforms’ terms of use.
Thompson, who took a reported $10.2 million annual pay package to head the country’s leading insurer in denied claims, was killed outside of his hotel by a gunman just before 7 a.m. in Midtown Manhattan, an hour before his company’s investor conference started. Business went on, but the internet is still losing its mind.
On Reddit, a subreddit called r/undelete automatically tracks posts that reach the top 100 of r/all and are then deleted, either by volunteer community moderators or by Reddit’s staff of administrators. In the last 48 hours, dozens of the posts caught by r/undelete have been about Thompson, meaning the assassination is the most popular subject among recently deleted content. Many of these posts had thousands of upvotes at the time they were deleted. On r/longtail, which tracks deletions outside the top 100 posts, there are many more about Thompson and UnitedHealthcare.
You can get a sense for the vibe of Reddit in communities like r/nursing, where nurses are posting horror stories about their patients dealing with insurance denials and memes about Thompson roasting in hell. “Please don't let this assassination go to waste,” one nurse posted. “This is the best time for nurses to speak up and contact their elected representatives and ask for action and legislation requiring accountability from health insurance companies and private equity companies that extract as much profit as possible.”
But on the 500,000-member subreddit for medical professionals, r/medicine, moderators deleted a thread about the news of Thompson's death after it had gained hundreds of comments, mostly from doctors and nurses applauding the news or memeing about it: “If you would like to appeal the fatal gunshot, please call 1-800-555-1234 with case # 123456789P to initiate a peer to peer within 48 hours of the fatal gun shot,” one said. It’s a much different scene in r/medicine than it is in r/nursing: there’s only one thread about the shooting in r/medicine right now, while the nurses are having a field day.
Replying to that single new thread on r/medicine, a moderator (who also says they are a nurse) wrote, “People - Please don't make the life of your mods a living hell. Anything that is celebrating violence is going to get taken down - if not from us, then from reddit. I think all the mods understand that there is a high level of frustration and antipathy towards insurance and insurance execs, but we also understand that murdering people in the streets is not good. We are a public group of medical professionals, we still need to act like that.” Comments to that thread are a little more subdued, but plenty of people are still commenting with their experiences with the insurer: “I once had to do a prior auth for United for a glass bottle,” a medical resident wrote. “The compounded intranasal midazolam was covered, but the glass bottle it came in was not.”
Multiple other big subreddits deleted popular threads about Thompson’s death. “United Healthcare CEO Brian Thompson’s final KD ratio (7,652,103:1) lands him among the all time greats,” a thread deleted by r/InterestingAsFuck moderators said. One of r/InterestingAsFuck’s rules is “No politics,” which a moderator for the subreddit told us is the reason it was removed. “This assassination, given its direct connection to systemic failures within the healthcare industry, is inherently political. Consequently, the post was removed in accordance with our rules,” they said. “Furthermore, the medical insurance industry's appalling lack of compassion and accountability has understandably led to widespread outrage. Unfortunately, some individuals have expressed that anger through comments appearing to justify or support this violent act. We cannot allow our platform to become a space for such rhetoric. As a result, we made the decision to remove the post entirely.”
Mods for r/memes and r/facepalm also deleted big threads. The moderators at r/memes said they’re removing memes about the murder because they’re prohibited under rule 2 of the subreddit, which bars mention of murder and death. “The moderators try to remove all posts that break this rule and it isn't targeted at any specific incident, individual, or company,” they said.
“Imagine this is your Payback for your own policies.. Wow” a now-deleted r/facepalm thread said. Moderators for r/ABoringDystopia deleted a thread with more than 2,600 upvotes titled “Nah man, I don't know him,” seemingly referring to the manhunt for the shooter, who is still at large. A thread in r/LeopardsAteMyFace titled “They won't hurt all the billionaires right???” was deleted by moderators for not strictly adhering to the “leopards eating faces” meme format: “As a reminder, people bitching about what is to come does not constitute a face being eaten. Unless and until there are actual consequences it is not LAMF” the removal notice said. That thread had around 8,400 upvotes and almost 400 comments when it was deleted.
On r/NoStupidQuestions, moderators deleted a thread titled “Why is the death of the United Healthcare CEO a big deal? Is it a bad company or something?” at 6,000 upvotes and 1,200 comments for breaking the “no loaded questions” rule. But considering how convoluted, confusing, and frequently arbitrarily cruel American healthcare looks to anyone outside the country (as well as to most of us inside it), that doesn’t seem like such a stupid question to me.
Deleted threads show up on r/undelete, which archives deleted posts automatically. We asked the moderators of that subreddit for their thoughts on so many threads about Thompson being deleted across the platform. “Personally, I don't have a strong opinion about post removals on this topic, and I can not speak for moderators of other communities. I understand it's attracting a lot of attention and reactions that bring out admin intervention and likely brigades, so I expect that some communities' teams may be overwhelmed and need to remove posts as they prioritize their time as volunteers,” they said. “One note I want to share is that Reddit does a poor job of explaining to an outside observer the difference between content that has been removed by admins (site employees), versus removed by moderators (community volunteers), versus voluntarily deleted by the author. There are differences, but it is often not clear to moderators, let alone the layperson. There are certainly a lot of removals being done by both admins and moderators in these posts, and the admins and moderators do not always agree.”
In fact, a moderator from a subreddit we hadn’t reached out to for comment contacted us because they heard we were asking moderators about this. They said that every removal of the surveillance video of Thompson’s death they were aware of had come not from moderators—who, as a reminder, are unpaid volunteers—but from Reddit’s paid administrator team, which enforces the sitewide terms of use.
The policy for violent content on Reddit states:
“Do not post content that encourages, glorifies, incites, or calls for violence or physical harm against an individual (including oneself) or a group of people; likewise, do not post content that glorifies or encourages the abuse of animals. We understand there are sometimes reasons to post violent content (e.g., educational, newsworthy, artistic, satire, documentary, etc.) so if you’re going to post something violent in nature that does not violate these terms, ensure you provide context to the viewer so the reason for posting is clear.”
“Reddit admin have been removing the video,” the mod who reached out said. “It does not to the best of my knowledge, and I have pretty good knowledge about this as it is kind of my thing to be an expert in this particular portion of the content policy, violate the content policy. In case that is a bit convoluted, to the best of my knowledge, this was a newsworthy event covered in numerous publications of good repute. The content policy does not allow for death videos except in an educational, newsworthy etc context.”
“This has led to some moderators removing content related to the event that is not the video of the murder, as they do not want their subreddits to have AEO (admin removals) which can reflect badly on the subreddits and result in discipline from Reddits modcoc team (moderator code of conduct.)”
Reddit did not immediately respond to a request for comment.
Over on X, as Miles Klee pointed out in Rolling Stone, it was open season for comedians, activists, and anyone trying to get off a hit tweet. “Claim denied,” “pre-existing condition,” etcetera—the jokes write themselves and are repeated on the platform formerly known as Twitter a billion times over. TikTokers, similarly, got in that pit with reaction videos about the unaliving of a CEO.
These are all expected shenanigans from X and TikTok, where the platforms pay people to go viral. They’re also more suited to shitposting and comedy, so it makes sense people are getting their riffs off on the platforms made for viral short-form riffing. But on the sites where more normies and corpo thinkfluencers abide—Facebook and LinkedIn—there are signs that UnitedHealthcare, which is the health benefits division of UnitedHealth Group, is fighting for its life.
On Facebook, UnitedHealth Group locked comments on its post mourning the death of its “dear friend and colleague,” but it couldn’t block people from reacting with emojis: more than 73,000 people have reacted with the crying-laughing face so far, compared to around 2,400 with the sad face. Laugh-reacting became a meme of its own on Facebook, with many of the post’s more than 6,800 shares telling friends to go hit the laugh emoji.
UnitedHealth Group locked comments on its LinkedIn post about the incident, too, but more than 6,000 people so far have “liked” the post (instead of using the “support” or “heart” reactions) and more than 200 hit the laughing emoji. It’s really special to see people wearing suits in their profile pictures, with titles like “Senior Director in Marketing & Data Analytics” and “Business Development Manager,” hitting the laugh button on a health insurance company's announcement of the death of its CEO.
Social media is not real life, but when something is happening across every major platform, it’s fair to call it at least reflective of real life. And there’s rarely such a quick turnaround from outrage to real-world events as we saw this week: In an announcement that couldn't have had worse timing if they tried, Anthem Blue Cross Blue Shield posted a notice for New York plans on December 1—three days before the shooting—saying that beginning with claims processed on or after February 1, the insurer would only cover anesthesia for surgeries up to the amount of time the physician estimated the procedure would require. Connecticut and Missouri plans would see similar changes. In other words, if a procedure ran long, the meter would be running for every extra moment the patient was under anesthesia—which is almost always extremely expensive.
Multiple Connecticut lawmakers posted this week that they disagreed with the change. Connecticut senator Chris Murphy wrote: "This is appalling. Saddling patients with thousands of dollars in surprise additional medical debt. And for what? Just to boost corporate profits?"
And New York state senator Mike Gianaris wrote: “Ridiculous. Does Anthem expect a patient to get up in the middle of a surgery and walk away?” He also vowed to introduce legislation to prevent such practices in the future.
On Thursday, Anthem walked the change back entirely. "There has been significant widespread misinformation about an update to our anesthesia policy. As a result, we have decided to not proceed with this policy change,” a spokesperson for Anthem told me in an email. “To be clear, it never was and never will be the policy of Anthem Blue Cross Blue Shield to not pay for medically necessary anesthesia services. The proposed update to the policy was only designed to clarify the appropriateness of anesthesia consistent with well-established clinical guidelines.” When I asked what the “misinformation” was, they didn’t answer.
Jason Koebler contributed reporting to this story.
2024-12-06 22:03:26
Last month, Matt Lyzell, the creator of the Netflix interactive series Battle Kitty, announced on his personal Instagram account that Netflix was going to remove his show from the streaming service just two years after its debut. By the end of the day, Netflix confirmed that not only was Battle Kitty being removed, but that all 24 of its interactive series were to be removed on December 1, with the exception of Black Mirror: Bandersnatch, Unbreakable Kimmy Schmidt: Kimmy vs. the Reverend, Ranveer vs. Wild with Bear Grylls, and You vs. Wild.
“The technology served its purpose, but is now limiting as we focus on technological efforts in other areas,” a Netflix spokesperson told 404 Media at the time.
It is normal for Netflix and other streaming services to rotate titles in and out of their catalogues depending on what they cost to license and host and how many subscriptions they drive to the platform. But because the interactive series are original Netflix creations, once they are removed from Netflix they will not be available anywhere else, and they are a new and unique format that dozens of producers, animators, voice actors, and other creatives finished work on very recently.
Unwilling to accept Netflix’s decision to make all these interactive shows totally inaccessible, a group of fans—and, in a few cases, people who worked on the interactive shows—are finding ways to archive and make them available for free.
“I couldn’t let this work go to waste. We’re talking about over 100 hours of video and ~ one thousand hours of dubbing,” Pixel, one of the archivists in a Discord channel archiving Netflix interactive shows, told me.
On Discord, dozens of users have collaborated on capturing all the videos from Netflix before they were removed, as well as reverse engineering how the platform handled their interactive elements. Some shows are already fully emulated and can be streamed in bespoke, alternative players; others have been uploaded to YouTube as a series of daisy-chained, interlinked videos that recreate a very similar interactive experience; and some have been uploaded as non-interactive videos.
404 Media agreed not to name the Discord channel and some of the places where the Netflix interactive archives are being hosted so Pixel could talk about the archiving effort. While Netflix has made it so there is no way to view Netflix interactive shows without basically pirating them, the archivists worry that the company will still try to take down any alternative method for viewing them.
“While I can’t disclose fully how we are archiving these, I can say that they pull directly from Netflix’s servers, so no re-encoding or loss of quality,” Pixel said. “I would love to talk more about how it works, but it risks Netflix patching out the tool entirely.”
Netflix interactives, in case you are unfamiliar, are choose-your-own-adventure videos where the viewer can make choices at the end of a scene that determine how the story unfolds. This Netflix initiative was launched with great fanfare in 2018 when Netflix released Bandersnatch, an interactive entry in the science-fiction anthology series Black Mirror. The shows are interactive in the sense that viewers can nudge the story in different directions, but all they are doing is essentially deciding which pre-recorded video file will play next. Netflix actually made some of its interactive series available on YouTube by using YouTube’s built-in feature that allows users to choose what video to play next once a video ends, daisy chaining YouTube videos together to create the choose-your-own-adventure format natively on that platform. I’ve seen at least two other Netflix interactive shows fully recreated by archivists on YouTube with this method, sometimes in multiple languages.
Pixel explained that Netflix interactives rely on an “internal video” that contains all the interactive elements, including the different paths, variations, and endings. Decisions viewers make are defined by two JSON files: one determines when a viewer is presented with a decision and where in the internal video file to skip based on that decision, and the other pulls the assets for the decision buttons from Netflix’s servers.
“We currently have a proof-of-concept emulator running off a python script that uses the jsons to make functioning decisions, although it needs ironing out and button images are broken as of now,” Pixel said. “We have a member of the team in Turkey that is hosting the files for once we get the emulator working on a webpage.”
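To make that description a little more concrete, here is a minimal, hypothetical sketch of how an emulator like the one Pixel describes could work, assuming one large internal video file plus a decision-map JSON. The file names, the field names (segments, start, end, choices, label, next), and the use of mpv as a stand-in player are all illustrative assumptions, not details of the archivists’ actual tool, and it prints choices to the terminal instead of rendering the button assets from the second JSON file.

```python
# Hypothetical sketch: drive a choose-your-own-adventure playback loop from one
# "internal" video file and a JSON decision map. All field names and file names
# here are assumptions for illustration, not Netflix's or the archivists' format.
import json
import subprocess


def play_segment(video_path, start, end):
    # Play one slice of the internal video with mpv, seeking in at `start`
    # and stopping at `end` (both in seconds).
    subprocess.run(["mpv", f"--start={start}", f"--end={end}", video_path], check=True)


def run_interactive(video_path, decision_map_path, start_segment="intro"):
    with open(decision_map_path) as f:
        segments = json.load(f)["segments"]

    current = start_segment
    while current is not None:
        seg = segments[current]
        play_segment(video_path, seg["start"], seg["end"])

        choices = seg.get("choices")
        if not choices:
            # Linear segment: jump straight to the next one, or stop at an ending.
            current = seg.get("next")
            continue

        # Present the decision in the terminal rather than drawing button images.
        for i, choice in enumerate(choices, 1):
            print(f"{i}. {choice['label']}")
        picked = choices[int(input("Choose: ")) - 1]
        current = picked["next"]


if __name__ == "__main__":
    run_interactive("internal_video.mkv", "decision_map.json")
```

A web-based emulator like the one the team is working toward would presumably do the same seeking in the browser and overlay the button images referenced by the second JSON file while waiting for the viewer’s choice.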
While the archivists in the Discord were able to rip much of the content directly from Netflix before it was removed, each title is available in many languages, and as Pixel explained, they had trouble grabbing some of the interactive elements, so they weren’t able to grab everything.
In at least one case I’ve seen, the archivists shared video of one of the interactive shows pulled from Netflix that was uploaded to the personal account of someone who worked on the show, though Pixel said they've already ripped that show directly from Netflix.
“Since it's no longer on the Netflix app I figured why not upload it here as a lot of crazy talented people poured their hearts into it for a year,” the person who worked on the show said in a post sharing the video.
When I asked what the biggest challenge facing the archiving effort is at the moment, Pixel said, “Right now it’s probably keeping track of everything hah. Me and Scramble [another person involved in the archiving effort] had to contact a bunch of people who were willing to help rip stuff/give us already discontinued shows. Right now it’s the emulator. We have a lot of people counting on us and I get a LOT of dms from people asking how to play them haha.”
Correction: This story previously said the archivists used video that was uploaded by one of the show's creators. The archivists say they had already ripped that show directly from Netflix. 404 Media regrets the error.
2024-12-06 09:50:00
Following the murder of its CEO on Wednesday morning, UnitedHealthcare removed a page from its website listing the rest of its executive leadership, and several other health insurance companies have done the same, hiding the names and photos of their executives from easy public access.
As of Thursday, UnitedHealthcare’s “about us” page that listed leadership, including slain CEO Brian Thompson, redirects to the company’s homepage. An archive of the page shows that it was still up as of Wednesday morning, but it is redirecting at the time of writing and isn’t directly accessible from Google search or the site’s navigation buttons.
Anthem Blue Cross Blue Shield, which on Thursday said it would walk back changes announced this week that would have charged patients for anesthesia during procedures that ran longer than estimated, now redirects its own leadership page to its “about us” page. That page originally showed leadership, including President and CEO Kim Keck, Executive Vice President and CFO Christina Fisher, and 23 more executives as of earlier this year, according to archives of the page, but it is now inaccessible.
2024-12-05 23:25:04
More than 150 employees at the cloud services giant DigitalOcean protested last year after its CEO explained in an all-hands meeting that his former mentor was a member of the Ku Klux Klan, a relationship he said shows how employees can work together despite holding different beliefs. The CEO’s comments led to widespread outrage among employees on Slack, in a formal open letter, and in an employee walkout that has not been previously reported.
The all-hands meeting was intended to address the fallout of an employee posting an anti-LGBT meme on LinkedIn after the company changed its logo to be rainbow colored during Pride Month.
404 Media has obtained video of a July 2023 meeting in which the then-CEO of DigitalOcean, Yancey Spruill, tells employees that a company's "values" are not the same as an individual employee’s personally held beliefs. DigitalOcean is a huge, publicly traded cloud services and data center provider that has become particularly important with the rise of AI. Spruill has since left the company.
"Every time we leave our home we have to bend our belief system because we engage with human beings who are different than us in any number of dimensions. And this is really critical that beliefs are not our values, our behaviors. However, we all have to sign up for the [company's] values," Spruill said. "All the companies I’ve ever been in, I don’t remember the numbers, the EBITDA, the projects I worked on. What I do remember is—did that company live and honor its values? Did the employees?"
2024-12-04 22:00:31
YouTube is running hundreds of ads featuring deepfaked celebrities like Arnold Schwarzenegger and Sylvester Stallone hawking supplements that promise to help men with erectile dysfunction.
The ads, which were discovered by Alexios Mantzarlis in the Faked Up newsletter, have been running since at least November 12 and have around 300 variations, according to Google’s Ad Transparency Center. All the ads use existing videos that have been modified with an AI-generated voice and lip-synced to match what that voice is saying. Many of the ads feature non-celebrity women who talk about how their “husbands went wild” after “trying a secret simple mix” to treat their erectile dysfunction, but some feature deepfakes of celebrities including Arnold Schwarzenegger, Sylvester Stallone, Mike Tyson, and Terry Crews.
“Have you heard about the salt trick that is making me stay hard for hours in bed?” an AI-generated Schwarzenegger asks in his instantly recognizable Austrian accent. “Top adult actors have been using this for the last five years to stay rock hard. I mean, you didn’t think they last that long without a little hack, right?”
Video ads of Stallone, Tyson, and Crews repeating the exact same script indicate that whoever made the ads copied and pasted one script into an AI voice generator.
The ads lead users to a page on “thrivewithcuriosity.com,” where after confirming they are “40+” years old, they are shown a meandering and very explicit 40-minute-long presentation about the miracle drug that is getting men including celebrities, strippers, and adult performers “rock hard.”
That video opens with a real Today Show interview Stallone did with his wife and three daughters to promote their reality show “The Family Stallone,” but it’s been very uncomfortably edited with AI-generated audio and lip sync to make it seem as if he’s talking about how hard he can get now to satisfy his wife thanks to the miracle drug.
The video takes viewers on a bizarre journey from a strip club in Texas to a fake Harvard urologist’s office to an abandoned church in Thailand where scientists discovered a species of bat with abnormally large and long-lasting erections. Along the way, deepfake videos of everyone from Tom Hanks, Denzel Washington, and adult entertainment star Johnny Sins are made to say they have been quietly using this secret formula to last longer in bed. The video eventually concludes by offering viewers the opportunity to buy six bottles, or 180 days’ worth, of Prolong Power at $49 per bottle.
That link sends users to a page on digistore24.com where they can enter their credit card information to purchase Prolong Power, but I was able to find the supplement for sale in many other places online. Many sellers offer Prolong Power on Amazon, where it has mixed reviews from users, with some saying “This product is a scam,” “don’t bother,” and “fake.” According to its label, Prolong Power is made up of a “proprietary blend” of oat bran powder, fennel seed, cascara sagrada bark powder, and other common ingredients that, according to the National Library of Medicine, are mostly helpful with constipation. Notably, the ingredients do not include “midnight beetle powder,” which the long video pitching Prolong Power explains is the secret ingredient that gave the church bats their magnificent erections.
Prolongpowers.com, which calls it the “#1 Natural Male Enhancement Supplement,” claims it now offers a “new version” it calls Primor Dial Vigor X, and features testimonials from three customers who made “verified purchases.” However, a spokesperson for the deepfake detection company Reality Defender said that, according to their platform, the headshots attached to those testimonials were 99 percent likely to be AI-generated.
Back in January, YouTube deleted around 1,000 similar ads in which deepfaked celebrities unknowingly pitch scams.
“We are constantly working to enhance our enforcement systems in order to stay ahead of the latest trends and scam tactics, and ensure that we can respond to emerging threats quickly,” Google said at the time after deleting the ads. But obviously it still doesn’t have this problem fully under control.
"We prohibit ads that falsely claim a celebrity endorsement in order to scam people," a Google spokesperson told 404 Media in response to this story. "When we identify an advertiser engaging in this deceptive practice, we permanently suspend their account and remove all their ads from our platforms, as we have done in this case. We continue to heavily invest in our ability to detect and remove these kinds of scam ads and the bad actors behind them.”
Google said it removed the deepfake supplement ads and permanently suspended the account that paid for them after I reached out for comment.