Instagram is bringing back one of its more chaotic features, now reworked for the short-form video era.
A new tab in the Reels feed will serve up videos that a user’s friends have liked or added commentary to, Instagram head Adam Mosseri announced in a video message today. Users will be able to see which friends have liked a video — a callback to the old Instagram “activity” feed that was killed in 2019.
“We want Instagram to not only be a place where you consume entertaining content, but one where you connect over that content with friends,” Mosseri says. In the new feed, you’ll be able to see which friends have liked a post and which have left a temporary “note” on a Reel.
That sounds nice in an ideal world, but given the way that the previous “activity” feed was scrutinized, I’m willing to bet a lot of users actually don’t want their friends to see all the Reels they’ve liked. (I’m not sure what benefits or insights my friends would get from seeing that I liked every single Shohei Ohtani post that crossed my feed, but OK.) It also might discourage people from engaging publicly with content in this way to avoid it being shown to all of their friends. It’s also not a given that you share interests or hobbies just because you’re friends with someone — for many people, it’s the hyper-personalized nature of TikTok that makes the experience interesting in the first place.
Other platforms like X have gone the opposite route by hiding users’ liked posts, in part because people kept getting caught liking embarrassing things (if someone catches Ted Cruz liking thirst trap Reels, please email me immediately). Meta didn’t immediately respond to questions about whether users can opt out of having their activity shown in the new Reels feed.
Instagram stands to benefit if its biggest rival, TikTok, is forced to pull out of the US this weekend. Reels is Instagram’s answer to TikTok, but many creators and users say the atmosphere on Reels doesn’t live up to the environment TikTok has cultivated. While the new feature might stoke drama and pull some users into the Reels feed, it could also have the opposite effect for those who don’t want all their interests broadcast.
Cash App is closing out the week on the hook for $255 million across multiple settlements over its consumer protection practices.
Block, the company that owns Cash App, agreed Wednesday to pay $80 million to 48 states that fined the company for violating laws intended to keep illicit activity off the platform.
“State regulators found Block was not in compliance with certain requirements, creating the potential that its services could be used to support money laundering, terrorism financing, or other illegal activities,” a press release from the Conference of State Bank Supervisors says.
Separately, the federal Consumer Financial Protection Bureau reached a settlement with Block on Thursday, in which the company agreed to pay $120 million to Cash App customers and another $55 million to the CFPB. According to the bureau, Cash App’s weak security measures put consumers at risk and made it difficult for users to get help after experiencing fraud on the platform. The bureau also accuses Cash App of tricking consumers into thinking that their bank, not Cash App, was responsible for handling disputes, and says Cash App didn’t offer “meaningful and effective” customer service, which “left the network vulnerable to criminals defrauding users.”
How to regulate peer-to-peer money-transferring apps like Cash App is an ongoing fight. This week, NetChoice and TechNet sued to challenge the CFPB’s move to regulate such platforms like banks, calling it an “unlawful power grab.” Google filed a similar suit in December.
It’s been more than four years since Donald Trump first moved to expel TikTok from the US — and now, just days before a second Trump presidency begins, it just might happen.
President Joe Biden signed legislation last April that officially began the countdown that would force TikTok’s parent company, ByteDance, to divest from the US business. But even afterward, the atmosphere on the video powerhouse was mostly nonchalant, with a handful of stray jokes about “this app disappearing” slotted between the usual fare.
In the last week, though, the vibe has shifted — my favorite creators are posting links to their other social accounts, audiences are making highlight reels of the app’s most viral moments, and users are saying goodbye to their “Chinese spies” and jokingly threatening to hand their data over to the Chinese government directly. Xiaohongshu, a Chinese-owned app known in English as RedNote, topped the App Store this week, driven by a wave of “TikTok refugees” trying to recreate the experience of the platform. It’s feeling a bit like a fever-dream last day of school.
For many creatives online, this wouldn’t be the first time they’ve had to migrate to new spaces: reach, engagement, and visibility are constantly shifting even on the largest and most stable platforms. But the possibility that a social media site of this size would disappear — or slowly break down until it’s nonfunctional — is a new threat. For small creators especially, TikTok is like playing the lottery: you don’t need thousands of followers for your video to get big, and this unpredictability incentivizes the average person to upload content.
It’s still unclear what will happen to TikTok after January 19th. I asked content creators what their game plan is. (Responses have been edited and condensed for clarity.)
“At the peak, I was making approximately 70 percent of my sales through TikTok from December 2020 to January 2022. Now, it drives, at most, 10 percent of my sales,” says Noelle Johansen, who sells slogan sweatshirts, accessories, stickers, and other products.
“At my peak with TikTok, I was able to reach so many customers with ease. Instagram and Twitter have always been a shot in the dark as to whether the content will be seen, but TikTok was very consistent in showing my followers and potential new customers my videos,” Johansen told The Verge in an email. “I’ve also made great friends from the artist community on TikTok, and it’s difficult to translate that community to other social media. Most apps function a lot differently than TikTok, and many people don’t have the bandwidth to keep up with all of the new socials and building platforms there.”
Going forward, Johansen says they’ll focus on X and Instagram for sales while working to grow an audience on Bluesky and Threads.
“I think the ease of use on TikTok opened an avenue for a lot of would-be creators,” Kay Poyer, a popular creator making humor and commentary content, says. “Right now we’re seeing a cleaving point, where many will choose to stop or be forced to adapt back to older platforms (which tend to be more difficult to build followings on and monetize).”
As for her own plans, Poyer says she’ll stay where the engagement is if TikTok becomes unavailable — smaller platforms like Bluesky or Neptune aren’t yet impactful enough.
“I’m seeing a big spike in subscribers to my Substack, The Quiet Part, as well as followers flooding to my Instagram and Twitter,” Poyer told The Verge. “Personally I have chosen to make my podcast, Meat Bus, the flagship of my content. We’re launching our video episodes sometime next month on YouTube.”
Bethany Brookshire, a science journalist and author, has been sharing videos about human anatomy on TikTok, Bluesky, Instagram, and YouTube. Across platforms, Brookshire has observed differences in audiences — YouTube, for example, “is not a place [to] build an audience,” she says, citing negative comments on her work.
“I find people on TikTok comment and engage a lot more, and most importantly, their comments are often touching or funny,” she says. “When I was doing pelvic anatomy, a lot of people with uteruses wrote in to tell me they felt seen, that they had a specific condition, and they even bonded with each other in the comments.”
Brookshire told The Verge in an email that sharing content anywhere can at times feel fraught. Between Nazi content on Substack, right-wing ass-kissing at Meta, and the national security concerns of TikTok, it doesn’t feel like any platform is perfectly ideal.
“Sometimes I feel like the only ethical way to produce any content is to write it out in artisanal chalk on an organically sourced vegan stone, which I then try to show to a single person with their consent before gently tossing it into the ocean to complete its circle of life,” Brookshire says. “But if I want to inform, and I want to educate, I need to be in the places people go.”
The Woodstock Farm Sanctuary in upstate New York uses TikTok to share information with new audiences — the group’s Instagram following is mostly people who are already animal rights activists, vegans, or sanctuary supporters.
“TikTok has allowed us to reach people who don’t even know what animal sanctuaries are,” social media coordinator Riki Higgins told The Verge in an email. “While we still primarily fundraise via Meta platforms, we seem to make the biggest education and advocacy impact when we post on TikTok.”
Walt and Waldo escaped separate slaughter operations in different towns over the summer. We were able to rescue both, and they became each other’s comfort as they adjusted in quarantine. Usually, the quarantine period is only a few weeks and then new residents move in with existing groups, but Walt experienced some serious medical emergencies that took him a long time to heal from, and Waldo stayed by his side during those months. Finally, we were able to move this pair into the main sheep barn and watch them integrate into their new family, which was so special to watch. #whywoodstock
With a small social media and marketing team of two, Woodstock Farm Sanctuary (like other small businesses and organizations) must be strategic about where it spends its effort. YouTube content can be more labor-intensive, Higgins says, and Instagram Reels is missing key features like 2x video speed and the ability to pause videos.
“TikTok users really, really don’t like Reels. They view it as the platform where jokes, trends, etc., go to die, where outdated content gets recycled, and especially younger users see it as an app only older audiences use,” Higgins says.
The sanctuary says it will meet audiences wherever they migrate in the case that TikTok becomes inaccessible.
Anna Rangos, who works in social media and makes tech and cultural commentary videos, is no stranger to having to pick up and leave a social media platform for somewhere else. As a retired sex worker, she saw firsthand how fragile a social media following could be.
“You could wake up one day to find your accounts deactivated, and restoring them? Forget it. Good luck getting any kind of service from Meta,” Rangos said in an email. Having an account deleted means lost income and hours of trying to rebuild a following. “Over my time in the industry, I went through three or four Instagram accounts, constantly trying to recapture my following.”
Sex workers and sex education creators regularly deal with their content being removed, censored, or entire accounts deleted. Rangos says that though the community on TikTok is more welcoming, she’s working to stake out her own space through a website and a newsletter. She also plans to stay active on YouTube, Pinterest, and Bluesky.
“I don’t plan on using Meta products much, given [Mark] Zuckerberg’s recent announcements regarding fact-checking,” she wrote in an email.
“I have found so much joy and community on TikTok mostly through Native TikTok,” says Amanda Chavira, an Indigenous beader who built an audience through tutorials and cultural content. “It’s sad to see TikTok go.”
Chavira says she plans to reupload some of her content to YouTube Shorts to see how her videos perform there but otherwise will be waiting to see if another viable video platform comes along. Chavira won’t be pivoting to Meta: she says she plans to delete her accounts on Threads, Instagram, and Facebook.
“I’d been considering leaving my Meta accounts for a long time,” she said in an email. “Facebook felt like a terrible place through election cycles, and then the pandemic, [and] then every other post I was seeing was a suggested ad or clickbait article. For Instagram, I’ve really been struggling to reach my target audience and didn’t have the time available to post all the time to try to increase engagement.” Her final straw was Meta’s decision to end the fact-checking program and Zuckerberg’s “pandering to the Trump administration,” she says.
Drake’s ongoing legal battle with his label, Universal Music Group, has escalated. The artist filed a lawsuit in federal court today, accusing UMG of harming his reputation and endangering him for profit. The suit stems from the diss track “Not Like Us” by Kendrick Lamar, another UMG artist. Drake’s legal complaint also again accuses UMG of using bots on Spotify and other streaming platforms, and of payola, to make the song more popular.
“On May 4, 2024, UMG approved, published, and launched a campaign to create a viral hit out of a rap track that falsely accuses Drake of being a pedophile and calls for violent retribution against him,” the complaint reads. “Even though UMG enriched itself and its shareholders by exploiting Drake’s music for years, and knew that the salacious allegations against Drake were false, UMG chose corporate greed over the safety and well-being of its artists.”
The lawsuit details a shooting at Drake’s (real name: Aubrey Graham) home just a few days after the song was released, during which a security guard was injured. Multiple break-ins occurred in the following days, which the lawsuit says were caused by UMG’s actions.
Why would UMG pit two of its own artists against each other? Drake’s team has a theory:
UMG’s actions are motivated, at least in part, by UMG’s desire to best position itself in negotiations with Kendrick Lamar in 2024 and Drake in 2025. With respect to Lamar, on information and belief, UMG was incentivized to prove that it could maximize Lamar’s sales—by any means necessary—after only being able to get him to sign a short-term exclusive contract. UMG wanted Lamar to see its value on an expedited timeframe in order to convince Lamar to resign exclusively and for a longer period of time. As to Drake, in 2024, his contract was nearing fulfillment. On information and belief, UMG anticipated that extending Drake’s contract would come at a high cost to UMG; as such, it was incentivized to devalue Drake’s music and brand in order to gain leverage in negotiations for an extension
Lamar is not named as a defendant in the suit; instead, Drake’s legal team pins the blame on UMG for releasing the song despite knowing its “allegations are unequivocally false.”
“Drake is not a pedophile. Drake has never engaged in any acts that would require he be ‘placed on neighborhood watch.’ Drake has never engaged in sexual relations with a minor. Drake has never been charged with, or convicted of, any criminal acts whatsoever,” the suit reads.
The suit follows a petition filed in November in which Drake accuses UMG and Spotify of artificially inflating the success of “Not Like Us” using payola and streaming bots. The petition — which itself isn’t a lawsuit but a precursor — was withdrawn this week. But the suit filed today includes similar allegations of “pay-for-play” schemes to get “Not Like Us” played on radio stations and promoted on streaming platforms. The suit also again accuses UMG of using bots to “artificially inflate the spread” of the song. It cites a “whistleblower” who claimed he was paid $2,500 over Zelle “via third parties to use ‘bots’ to achieve 30,000,000 streams on Spotify in the initial days following the Recording’s release.”
As The New York Times notes, Drake has enlisted Michael J. Gottlieb, the lawyer who represented the owner of the restaurant embroiled in the “Pizzagate” conspiracy theory. Drake’s complaint draws parallels between the shooting at the artist’s home and the shooting at the restaurant, calling it “the 2024 equivalent of ‘Pizzagate.’”
“The online response was similarly violent and hateful. An avalanche of online hate speech has branded Drake as a sex offender and pedophile, among other epithets,” the complaint reads.
UMG did not immediately respond to a request for comment.
Like the wildfire conditions in Los Angeles County, my For You page on TikTok turned overnight.
I woke up last week to a phone screen filled with ravenous flames and video after video of razed homes, businesses, and other structures. Influencers broke from their regular cadence of content to film themselves packing up a suitcase for evacuation; nameless accounts shared footage from streets I didn’t recognize, showcasing the devastation; freshly created profiles asked for help locating their lost pets. Scrolling on TikTok feels like trying to keep track of 1,000 live feeds at once, each urgent and horrifying in its own way.
What all of this amounts to is a different question entirely. Even as there’s no escaping disaster content, the clips, comments, check-ins, and footage are not actually very helpful. Our feeds are awash with both too much and not enough information. Though it’s not yet clear how these fires started, scientists say that climate change will only continue to exacerbate wildfires going forward. Current weather conditions — including a severe lack of rainfall this year in Los Angeles — have created a tinderbox in the region.
Questions like “Where are the shelters?” “Should I evacuate?” and “Where can I get a mask and other supplies?” are left unanswered in favor of frightening first-person reports. And who can blame Los Angeles-area residents? That’s what you’re supposed to do on TikTok. What they can’t do is share a link to mutual aid resources or to a news story about vital, up-to-date evacuation information. They can scroll endlessly on the algorithmic For You page, but they can’t sort content to display the most recent updates first. TikTok is simply not built to disseminate potentially lifesaving breaking news alerts. Instead, it’s filled with endless clips of news crews interviewing people who have lost everything.
The wildfire content machine echoes a similar phenomenon from just a few months ago, when October’s Hurricane Milton tore through Florida, killing dozens and causing billions of dollars in damages. Some of the most visible and viral content from the storm came from influencers and other content creators who stayed behind to vlog their way through the event, racking up millions of views. So far, there’s not the same risk-taking-for-viral-content dynamic at play with the fires in Southern California, but the overall experience is not that different: a random infotainment feed where a video of a person losing nearly every earthly possession is followed directly by someone testing a new makeup product. Media critic Matt Pearce put it best: “TikTok was largely indifferent to whether I live or die.”
Instagram seemed slightly more useful, but only, I suspect, if you follow people who post relevant content. In times of crisis — during the Black Lives Matter uprisings of 2020 or the ongoing bombardment of Gaza — Instagram Stories has become something of a bulletin board for resharing infographics and resources. Linking to relevant announcements and news stories is really only possible through Stories, but at least you can. Instagram search, on the other hand, is a chaotic mixture of user-generated infographics, grainy pictures of the fires that have been screenshotted and reuploaded multiple times, and distasteful selfies from bodybuilders wishing LA well.
It should go without saying that depraved conspiracy theories once again spread on X, including from billionaire owner Elon Musk and other right-wing influencers who falsely claimed DEI initiatives were responsible for the fires. Twitter, once functioning like a breaking news feed, is now overrun with crypto spam and Nazi sympathizers. Meanwhile, smaller, more specialized apps like Watch Duty, a nonprofit wildfire monitoring platform, have filled gaps. On Bluesky, an X competitor, users have the option to pin feeds based on trending topics, creating a custom landing page for LA fire content.
We are in for more, not fewer, extreme weather events like storms and heatwaves, and it’s worth asking ourselves whether we are prepared to do this all over again. Platform decay is all the more apparent in times of emergency, when users are forced to wade through astronomical amounts of garbage: video content that scares but doesn’t help us, news websites with so many pop-up ads it feels illegal, or ramblings from tech elites who are looking for someone to blame rather than a way to help. By my estimation, our feeds will return to regularly scheduled programming in five or so business days, and the devastation from these fires will get lost in a sea of comedy skits and PR unboxings. Until, of course, the next one.
In September 2023, Meta made a big deal of its new AI chatbots that used celebrities’ likenesses: everyone from Kendall Jenner to MrBeast leased themselves out to embody AI characters on Instagram and Facebook. The celebrity-based bots were killed off last summer after less than a year, but users have recently been finding a handful of other, entirely fake bot profiles still floating around — and the reaction is not good.
There’s “Jane Austen,” a “cynical novelist and storyteller”; “Liv,” whose bio claims she is a “proud Black queer momma of 2 & truth-teller”; and “Carter,” who promises to give users relationship advice. All are labeled as “AI managed by Meta” and the profiles date back to when the initial announcement was made. But the more than a dozen AI characters have apparently not been very popular: each has just a few thousand followers, with their posts getting just a few likes and comments.
That is, until the last week or so. After a wave of coverage in outlets like Rolling Stone and posts circulating on social media, the bot accounts are just now being noticed, and the reaction is confusion, frustration, and anger.
“What the fuck does an AI know about dating?????”...
Browsing Amazon Haul, the online shopping giant’s new $20-and-under bargain bin section, I immediately recognize not just a product, but a specific image. The photo is on a listing for “Timeless Black Dress for Both Casual and Formal Gatherings,” and it is stolen.
The image actually belongs to a New York-based independent brand called Mirror Palais, which sold the “Daisy Dress” for $545 a few years ago. Elevated by social media algorithms and its celebrity fans, Mirror Palais’ images have traveled from the brand’s website, to tweets, to Pinterest mood boards, and finally, to the discount section of the world’s largest online retailer, where it is — obviously — not for sale. On Amazon, it’s listed for $7.49. When I add it to my cart, I realize it’s even cheaper: inexplicably and improbably, it is an additional 65 percent off.
This one image of a black mini dress does not just appear on random listings on Amazon Haul — you can find it on Walmart and AliExpress as well as smaller sites with names like Mermaid Way and VMzona — all selling dupes (short for duplicates) of the original Mirror Palais dress. I even find a separate listing on Amazon, in the typically but not unbelievably...
Forbes will stop using freelancers for some types of stories indefinitely — and has blamed the change on a recent update to Google Search policies.
In recent days, Forbes has said it will stop hiring freelancers to produce content for its product review section Forbes Vetted, according to a journalist who has written for the site. In a note shared with The Verge, an editor at Forbes cited Google’s “site reputation abuse” policy for the change.
Site reputation abuse — also called parasite SEO — refers to a website publishing a deluge of off-brand or irrelevant content in order to take advantage of the main site’s ranking power and reputation in Google Search. Often, this piggybacking is concealed from users browsing the website. (For instance: those weird coupon code sections on newspaper sites that pop up via search engines but aren’t prominently displayed on the homepage.) Sometimes this spammy content is produced by third-party marketing firms that are contracted to produce a mountain of search-friendly content.
Forbes did not respond to multiple requests for comment. It’s not clear what other sections of Forbes the pause extends to. Writer Cassandra Brooklyn described receiving similar news last week.
Many news outlets (including The Verge) hire freelancers to write and report stories. But Forbes has an especially wide pool of outside contributors publishing to its site. Many of these writers are legitimate journalists who do fair, in-depth reporting. But there’s also the Forbes contributor network, a group of thousands of marketers, CEOs, and other outside experts who get to publish questionable content under the trusted Forbes name.
Some editorial content on the site may have drawn the ire of Google, which has been targeting the firehose of search engine-first content on the web. In November, Google further tightened its rules around parasite SEO, specifically taking aim at the “third party” nature of this type of content.
“Our evaluation of numerous cases has shown that no amount of first-party involvement alters the fundamental third-party nature of the content or the unfair, exploitative nature of attempting to take advantage of the host’s sites ranking signals,” the company wrote in a blog post.
Like other testing and review sites, Forbes Vetted makes money every time a reader makes a purchase using links in the outlet’s articles. A writer who got word of the pause in freelance work says the editorial process on their past stories was rigorous — they would test products, go through multiple rounds of edits, and interview sources. In addition to the pause in work, the writer was told that some of their stories may need to be completely re-reported and re-published by an in-house staff member.
“They clearly put a ton of resources into Forbes Vetted,” the writer says. “The big product reviews I was doing were $3,000 a piece, which is a huge amount of money to then be like, ‘Oh, we have to rewrite all this with staff in-house.’”
Google’s spam policies state that the existence of freelancer content does not in and of itself run afoul of the site reputation abuse policy — it’s only a violation if that content is also designed to take advantage of the site’s ranking signals. Google spokesperson Davis Thompson directed The Verge to an FAQ section describing the freelancer policy.
Snapchat is tweaking how people earn money on the platform by introducing a new, unified monetization program. The new program will cover content posted to Stories as well as Spotlight, the platform’s TikTok-like discovery feed filled with recommended video content. Under the program, influencers earn revenue for ads placed within eligible Stories and Spotlight posts.
Previously, monetization of these formats was splintered off from one another: Stories earnings were in one bucket, and Spotlight earnings were handled through a different program.
The new program is currently in testing with a small group of users, and will roll out widely on February 1st, 2025. To participate, users need to hit a set of benchmarks to be invited: 50,000 followers and either 10 million Snap views, 1 million Spotlight views, or 12,000 hours of watch time in the last 28 days.
They also need to post consistently: at least 25 posts per month to saved Stories or Spotlight, and posts to Spotlight or public Stories on at least 10 of the last 28 days. Some of those eligibility requirements are significantly higher than they were under the old structure. To be eligible to earn money through Spotlight, for example, creators previously needed things like a public profile, 1,000 followers, and 10,000 video views.
Other video platforms have also streamlined or changed their original creator funds. TikTok, for example, now has one creator program that requires videos longer than one minute. On YouTube, Shorts creators earn money via ad revenue — a move by the company to meaningfully compete with TikTok.
When the first grainy images of the UnitedHealthcare shooting suspect emerged, some viewers noticed a seemingly small detail: he looked like he was wearing a Peak Design Everyday V1 backpack. Now, on platforms like Threads and TikTok, a recurring accusation has circulated: Peak Design “traced” the bag owner using the backpack’s serial number.
But the company says that’s just not true. “Peak Design has not provided customer information to the police and would only do so under the order of a subpoena,” reads a statement signed by CEO Peter Dering and shared with The Verge Friday afternoon.
“We cannot associate a product serial number with a customer unless that customer has voluntarily registered their product on our site.” The statement goes on to say that the serial numbers on the V1 of the Everyday backpack “were not unique or identifying ... We did not implement unique serial numbers until V2 iterations of our Everyday Backpack.”
In footage of the killing of UnitedHealthcare CEO Brian Thompson, the shooter has a gray backpack with a top flap, which the NYPD believes is the same one it recovered in Central Park a few days later. The bag they eventually located is gray with black piping and what looks to be a tan-colored contrasting tab on the corner of the flap — just like Peak Design’s crowdfunded “Everyday” V1 model.
Dering told The New York Times last week that the item was likely bought between 2016 and 2019. He also told the Times that he called the NYPD tip line to share what he knew and vowed to do “whatever is possible” to identify the shooter, including consulting Peak Design’s legal team to see what he could share with police.
The Times story is just 300 words long, but it appears to have sparked a wave of anger among those sympathetic to the suspect, Luigi Mangione. Though the story makes no mention of a serial number, the rumor spread like wildfire before the company’s denial today.
On the Peak Design subreddit, which is moderated by the brand, posts have popped up discussing the company’s ability to track customers using the serial number on a bag and tips on how to delete customer information. The complaints largely center on the fact that Dering volunteered any information at all to police — a sign of a significant shift in public attitudes around the killing.
In a follow-up email to The Verge, Dering added: “If you do choose to register a Peak Design product, and it is lost or stolen, you can reach out to our customer service team and have your registration erased, so the bag is not traceable back to you.”
Tumblr is introducing a new Communities feature — in-app groups organized by topic or interest.
Communities are similar to subreddits or Facebook groups and had previously been in beta. Topics include things like film photography, marine biology, LGBTQ, and video games, and each topic has its own landing page where posts shared with the community populate. Many of the features mirror Reddit, like a count of how many members are online, moderators, and community guidelines. Posts shared to communities also get a new comments section that’s only visible within the group. Communities have the option to be public or private.
Tumblr pulling a page from Reddit’s playbook shouldn’t be surprising. As other communities and forums on the web have died off or been eaten by Google, Reddit has been on the upswing, growing its user base and turning a profit for the first time. But subreddits managed by users are both Reddit’s crown jewels and a thorn in the side of corporate interests, as demonstrated by the coordinated action taken last year in protest of changes to the platform’s API pricing structure. As Google Search degrades in usefulness — or is replaced by AI summaries — platforms like Reddit have become a central part of finding helpful information online.
Reddit is also adding search engine-like features, including an AI-powered summary tool called Answers announced earlier this week. Though Tumblr’s communities feature — and Tumblr in general — isn’t the search destination Reddit is, the new grouping feature does streamline how users can find and engage with topic-based content and peers with similar interests. Communities is available on the web, iOS, and Android.
Beginning in February, health insurer Anthem Blue Cross Blue Shield was planning to set a time limit for anesthesia coverage during surgeries and procedures. Now, following days of widespread outrage at the health insurance industry generally, Anthem is walking that policy back, the insurer announced on Thursday.
In mid-November, the American Society of Anesthesiologists issued a press release about the policy, which was set to take effect in February in states like Connecticut, New York, and Missouri.
“If an anesthesiologist submits a bill where the actual time of care is longer than Anthem’s limit, Anthem will deny payment for the anesthesiologist’s care,” the group writes. “With this new policy, Anthem will not pay anesthesiologists for delivering safe and effective anesthesia care to patients who may need extra attention because their surgery is difficult, unusual or because a complication arises.”
The letter appears to have garnered little public attention until this week, when several posts on social media about the policy change began circulating. The posts gained traction after the CEO of UnitedHealthcare, Brian Thompson, was shot and killed in New York on Wednesday in what police say was a targeted attack.
A spokesperson for Anthem’s parent company, Elevance Health, told The New York Times that “misinformation” about the plan contributed to Anthem’s reversal.
“We realized, based on all the feedback we’ve been receiving the last 24 hours, that our communication about the policy was unclear, which is why we’re pulling back,” Janey Kiryluik, staff vice president for corporate communications, is quoted as saying.
Thompson’s shooting shocked the public, but it also ignited discussions about the havoc wreaked by the US healthcare system and insurers like UnitedHealthcare. United specifically has been the subject of investigations by outlets like Stat, which found the company uses algorithms to cut off payments and deny rehabilitation care for patients. The rate at which insurers deny patient claims is a closely guarded secret, but ProPublica last year followed one chronically ill patient’s fight to get coverage from United. In some online forums, there was little sympathy for the company or for Thompson after his death: Americans carry at least $220 billion in medical debt, which upends lives as insurance companies profit.
For years, a fan-run account called Muppet History has been central to the Muppets fandom. It shared little-known facts, memes, and wholesome messages, amassing half a million followers on Instagram and more than 280,000 on X. Publicly, it was a wholesome and sweet platform, a passion project that took off. It became an unofficial ambassador of Jim Henson’s iconic cast of characters — inside and outside the world of diehard fans.
But on Monday night, a post on the account’s Instagram page had an ominous tone. “Good Evening,” the message started. “We wanted to take a moment to address some concerns that have arisen as of late.” The vague post — on which comments had been disabled — mentioned “overstepped” boundaries, the “harm” caused, and that people were made “uncomfortable.” It did not specify exactly what had happened.
Since that post, however, a rough sketch has come into focus. Fans claim that Muppet History’s co-runner Joshua Gillespie, who operates the account with his wife, Holly, was sending unwanted sexual messages to other people. Now, it’s gone from a bright spot on the internet to another soured piece of online culture, leaving a small community navigating the...
One Amazon influencer makes a living posting content from her beige home. But after she noticed another account hawking the same minimal aesthetic, a rivalry spiraled into a first-of-its-kind lawsuit. Can the legal system protect the vibe of a creator? And what if that vibe is basic?
Musically speaking, I’d say Drake lost this year’s tit-for-tat feud with Kendrick Lamar, which culminated in Lamar’s hit song “Not Like Us.” But Drake (real name: Aubrey Graham) doesn’t seem to want to accept defeat that easily. In a petition filed in New York state Supreme Court on Monday, Drake is accusing Universal Music Group — the label that’s represented him for his entire career and also represents Lamar — and Spotify of shady business practices aimed at making “Not Like Us” more of a hit than it already was:
In 2024, UMG did not rely on chance, or even ordinary business practices, to “break through the noise” on Spotify, and likely other music platforms. It instead launched a campaign to manipulate and...
Meta is promising “long-overdue improvements” to its X competitor, Threads, including more precise search features and expanded trending topics.
First, users will be able to search for posts within a specific date range or from a single account — similar to what X’s search allows. Threads is also testing a new trending page in the US that includes additional topics to follow as well as AI-generated summaries of what other users are talking about.
Instagram head Adam Mosseri, who led the launch of Threads, wrote in a post that the tests begin today.
This week has been full of updates on Threads, which is facing increased competition from Bluesky, the decentralized text-based platform that people have been flocking to in recent weeks....
Amazon announced today that it’s pumping another $4 billion into Claude AI maker Anthropic, bringing its total funding amount to $8 billion. This latest round follows $1.25 billion last September and another $2.75 billion in March.
Along with the money, Amazon wrote in its blog post that Amazon Web Services (AWS) would be named Anthropic’s “primary training partner” and that the OpenAI rival would use Trainium and Inferentia chips for future models.
The investment and deeper partnership align with previous reporting by Reuters that Claude will power Amazon’s new Alexa voice assistant. The improved Alexa — the release of which has been delayed — reportedly performed better with Claude than it did when using Amazon’s in-house model. The...
Google is tightening its rules against “parasite SEO” content: articles and pages that often have little to do with a site’s focus but exploit that site’s Google ranking.
An example of parasite SEO content is a news blog that publishes online shopping coupon codes in a hidden part of its website or an educational site publishing unrelated affiliate marketing content. In March, Google announced it would crack down on this kind of “site reputation abuse,” and now it’s making it clear that it doesn’t matter if the publisher created the content themselves or outsourced it — it’s a search policy violation regardless.
“Since launching the policy, we’ve reviewed situations where there might be varying degrees of first-party...