Electronics, avocados, vegetables, cars, tractors, crude oil — these are some of the things that could soon get more expensive for US consumers. Under President Donald Trump’s proposed plan, goods coming in from Mexico and Canada will be subject to a 25 percent tariff beginning on February 1st. White House press secretary Karoline Leavitt has also said Trump was “very much still considering” tariffs on China on the same day. As of late Thursday, the specifics of these plans were still up in the air.
Sweeping tariffs were one of Trump’s marquee campaign promises leading up to the election in November. He’s previously threatened up to a 60 percent tariff on goods from China, a 100 percent tariff on goods from Mexico, and even a 200 percent tariff on John Deere products imported into the US. Despite this, Trump failed to levy any tariffs on day one of his presidency, with Bloomberg reporting on Thursday that his administration lacked even concepts of a plan. His first round is now supposed to hit goods from Mexico and Canada, the two largest trade partners for the US.
In 2022, around $1.8 trillion of goods and services moved between countries under the current United St …
Resale platform Poshmark is launching Smart List AI, an automated product listing tool for sellers, the company announced today.
The tool has been in beta testing since last fall and uses sellers’ photos to generate details like the type of product, the size and brand, the style and color, and other details, as well as a title and description for the listing. Like other platforms that have introduced AI listing tools, the promise is that Smart List AI will streamline the process for sellers and cut down on the amount of time it takes to list items. Sellers will still input the price of their item, Manish Chandra, founder and CEO of Poshmark, told The Verge via email.
Other platforms like eBay have also introduced AI-generated listings that fill in details like product titles and descriptions. And like eBay’s feature, Poshmark’s tool will depend on sellers actually confirming that the information is correct — in the secondhand market, accurate details for things like the brand, size, and condition of an item are essential.
In my experience on eBay, at least, AI-generated descriptions are not very helpful; they mostly read like marketing fluff. What’s actually helpful is a detailed list of any flaws, wear and tear, or precise measurements — things that an AI listing tool wouldn’t be able to generate.
Retailers, including Shopify and Amazon, have pushed generative AI shopping products on both the merchant and consumer ends. And whether it’s AI-generated product listings or entire garments “created” using AI, the aim is often the same: to increase the scale of production, but not necessarily improve the quality.
Smart List AI will be available to Poshmark users on iOS in February.
The American Federation of Government Employees (AFGE), the union representing 750,000 federal employees, told members on Wednesday not to resign in response to an email sent to them by the US Office of Personnel Management. The email, sent on Tuesday, prompted employees to resign from service by accepting a “deferred resignation” offer, under which they would be exempted from return-to-office requirements instituted under President Donald Trump’s new administration.
“There is not yet any evidence the administration can or will uphold its end of the bargain, that Congress will go along with this unilateral massive restructuring, or that appropriated funds can be used this way, among other issues that have been raised,” the union says in its notice to members, a copy of which was obtained by The Verge. “We are encouraging AFGE members NOT to resign or respond to this email until you have received further information and clarification.”
The union’s email also includes frequently asked questions about the offer, which has caused panic and confusion among the federal workforce. For one, there have been inconsistencies between the email and other circulating memos, like whether employees would be required to work between the time they resign and September 30th. The FAQ section also warns union members that the deferred resignation benefits are vague and questions whether OPM even has the legal authority to offer the purported benefits.
“Employees should not take the Program at face value,” a section of the FAQ reads.
Separately, AFGE and the American Federation of State, County and Municipal Employees (AFSCME) announced they were suing the Trump administration over its efforts to make it easier to fire civil servants.
The OPM email, which was later published on the office’s website, is similar in some ways to communication Twitter employees received from Elon Musk after he bought the platform in 2022. The OPM notice includes the subject line “Fork in the Road” — the same subject line Musk used in an email to Twitter staff pressuring them to commit to his vision for the company or quit. This week, Wired reported that OPM, which is essentially the federal government’s HR division, had been taken over by individuals linked to Musk’s various businesses. In another move straight out of the Twitter takeover playbook, Musk is reportedly sleeping at the offices of the Department of Government Efficiency (DOGE), the renamed agency that the Trump ally is now in charge of.
Earlier this week, several posts on the Instagram page of Aid Access, an abortion pill provider, were inaccessible to the public. Some images were blurred out, with no option to click through and view the post. Others appeared simply as a gray square with nondescript alt text, as if the image didn’t load.
Aid Access connects patients with doctors who provide abortion pills via telehealth appointments, and the posts that were blocked from being viewed included instructions for performing at-home abortions using pills. The issues on Instagram — first reported by Jessica Valenti — also reportedly made it difficult to find the Aid Access account using the app’s search function.
By Thursday evening, Meta had restored a handful of Aid Access posts, though some appear to still be missing. This latest incident is just another example of how Meta has restricted abortion information online for years. It also comes in the middle of Meta’s right-wing pivot, as the company has begun allowing more transphobic, racist, and otherwise hateful content on its platforms as it courts Donald Trump.
Medication abortions in the first trimester of pregnancy can safely be done at home, according to the World Health Organization. Licensed providers like those working with Aid Access have prescribed abortion pills to hundreds of thousands of patients. Thanks to shield laws, which protect healthcare workers who provide the procedure, patients in states where abortion is banned or restricted can also order abortion medication.
Social media is an important place for patients to seek information about abortions, says Rebecca Davis of Hey Jane, which offers virtual reproductive care like abortions and birth control.
“[Patients] will often turn to social media to just make sure we’re legit,” says Davis, who leads marketing at the company. “We’ve spent a lot of time and energy to really build up our social presence, so we’re verified on Instagram, we’re verified on TikTok.”
Hey Jane encountered restrictions on Instagram in recent days as well. Davis says the group has gotten messages that its Instagram profile was not easily accessible through the app’s search features. The Verge was able to replicate the issue: typing in “heyjane” or “hey jane” did not surface the account as a suggestion. Users would need to know the account’s full handle, @heyjanehealth, in order for it to appear.
“We know that by not showing up in these searches we’re directly impacting people who are actively seeking this very timely, essential healthcare from getting the information that they need to make decisions,” Davis told The Verge.
Meta spokesperson Erin Logan told The Verge in an email that abortion rights groups are experiencing “a variety of issues — some due to correct enforcement, as well as over enforcement.” Logan said the company prohibits the sale of pharmaceutical drugs on Meta platforms without a LegitScript certification. (Hey Jane is among the providers certified with LegitScript.) Logan said these incidents were not the result of recent Meta policy updates.
“We’ve been quite clear in recent weeks that we want to allow more speech and reduce enforcement mistakes — and we’re committed to doing that,” Logan added, though no specifics were mentioned. Regarding issues encountered by Hey Jane, Logan directed The Verge to Meta’s policies, which state that content promoting the use of pharmaceutical drugs is allowed on the platform but may not be eligible for recommendation.
In recent weeks, Meta has rolled back its policies for what users can and can’t say on its platforms, opening the floodgates for more hate speech and offensive content under the guise of “free expression.” But providers have long had information about abortion restricted or removed, according to groups like Amnesty International and Repro Uncensored. Davis from Hey Jane says this isn’t the first time the group’s Instagram profile has been invisible in search: something similar happened in 2023, when fake Hey Jane accounts were appearing in search instead.
Even though abortion access groups have encountered issues on Meta platforms, Davis says it’s not as simple as moving to another social media site. Many patients use Instagram and other platforms like TikTok to find urgent information.
“While this moment certainly points to the value in diversifying the platforms that we’re on, that doesn’t necessarily mean that people who are seeking abortion care are going to be off of these platforms,” Davis says.
Meta and YouTube aren’t the only platforms looking to benefit from TikTok potentially disappearing — Substack wants in on the action, too.
The company announced Thursday it’s launching a $20 million “creator accelerator fund,” promising content creators they won’t lose revenue by jumping ship to Substack. Creators in the program also get “strategic and business support” from Substack, and early access to new features.
“We established this fund because we’ve seen creators who specialize in video, audio, and text expand their audience, revenue, and influence on Substack, where the platform’s network effects amplify the quality and impact of the work they’re doing,” the company said in a blog post.
This pivot on Substack’s part has been in the works for a while — for months, the company has been marketing itself not as a newsletter delivery service but as a creator platform similar to Patreon.
“On Substack, [creators] can build their own home on the internet: one where creators, not platform executives or advertisers, own their work and their audience,” the blog post reads. The post also cites “bans, backlash, and policies that change with the political winds” as a reason creators can’t depend on traditional social media services.
That’s all fine (we at The Verge have been saying this for a while). But creators focusing on Substack are also subject to ebbs and flows depending on what the company is prioritizing: first it was newsletters, then tweet-like microblogs, followed by full-on websites and livestreaming. For some, Substack’s initial stated mission of giving more freedom to independent writers is fading. And TikTok creators looking to move to Substack will need to rebuild their following all over again — you obviously can’t export your TikTok followers.
The $20 million fund isn’t the first time Substack has offered a pool of money meant to entice creators. Under a program called Substack Pro, the company poached top media talent from traditional newsrooms with higher pay, health insurance, and other perks. That program ended in 2022, with Substack cofounder Hamish McKenzie saying the deals weren’t employment arrangements but “seed funding deals to remove the financial risk for a writer in starting their own business.” In other words, welcome to Substack. Now that you’re here, you’re on your own — which is more or less the deal other platforms offer.
Dozens of popular subreddits are banning links to X after Elon Musk made a gesture that historians and human rights groups have described as a Nazi salute. Communities that have instituted a ban on links to X include r/formula1, r/military, r/nursing, r/TwoXChromosomes, and r/nintendo.
The shift is spreading across Reddit after neo-Nazis celebrated Musk’s speech at a rally on Monday for Donald Trump’s inauguration. During the speech, Musk twice raised his arm in a gesture that historians, elected officials, and organizations that support Holocaust survivors have identified as a Nazi salute: he placed his hand on his chest, then threw his arm forward at an angle, holding it in midair for a few moments. “My heart goes out to you,” he said to supporters. Some supporters of Musk have defended him, saying the gesture went along with his words.
Musk has not disavowed the neo-Nazis reading his gesture as a Sieg Heil, and in fact has minimized criticism, writing on X that “The ‘everyone is Hitler’ attack is sooo tired.” Musk has previously amplified racist, antisemitic conspiracy theories like the Great Replacement Theory and is constantly posting anti-immigrant claims not based in reality. After pouring millions of dollars into US politics to elect Trump, Musk has expanded to German politics as well, endorsing the far right Alternative for Germany (AfD) party.
Regardless of Musk’s true intentions, extremists are thrilled: as Rolling Stone reported, white supremacists are calling it a “Donald Trump White Power moment” and thanking Musk for “hearing” them.
The subreddits that announced the new rule cover millions of users across geographies and interests. Some subreddits have announced they will allow screenshots of content from X but not hyperlinks, and many other large communities like r/nba and r/nfl are discussing following suit. Even setting Musk’s right-wing politics aside, viewing X links on Reddit isn’t a great experience: links often don’t unfurl, and users need an X account to view the conversation on the platform.
“Reddit has a longstanding commitment to freedom of speech and freedom of association,” said a Reddit spokesperson, who asked to remain anonymous, citing the sensitivity of the subject. While individual subreddits are able to institute community rules, “Reddit Inc. has no ban on X links — there are still plenty of X links on Reddit,” the spokesperson wrote in an email.
X did not immediately respond to a request for comment.
Instagram is bringing back one of its more chaotic features, now reworked for the short-form video era.
A new tab in the Reels feed will serve up videos that a user’s friends have liked or added commentary to, Instagram head Adam Mosseri announced in a video message today. Users will be able to see which friends have liked a video — a callback to the old Instagram “activity” feed that was killed in 2019.
“We want Instagram to not only be a place where you consume entertaining content, but one where you connect over that content with friends,” Mosseri says. In the new feed, you’ll be able to see which friends have liked a post and which have left a temporary “note” on a Reel.
That sounds nice in an ideal world, but given the way the previous “activity” feed was scrutinized, I’m willing to bet a lot of users actually don’t want their friends to see all the Reels they’ve liked. (I’m not sure what benefits or insights my friends would get from seeing that I liked every single Shohei Ohtani post that crossed my feed, but OK.) It also might discourage people from engaging publicly with content in this way to avoid it being shown to all of their friends. It’s also not a given that you share interests or hobbies just because you’re friends with someone — for many people, it’s the hyper-personalized nature of TikTok that makes the experience interesting in the first place.
Other platforms like X have gone the opposite route by hiding users’ liked posts, in part because people kept getting caught liking embarrassing things (if someone catches Ted Cruz liking thirst trap Reels, please email me immediately). Meta didn’t immediately respond to questions about whether users can opt out of having their activity shown in the new Reels feed.
Instagram stands to benefit if its biggest rival, TikTok, is forced to pull out of the US this weekend. Reels is Instagram’s answer to TikTok, but many creators and users say the atmosphere on Reels doesn’t live up to the environment TikTok has cultivated. While the new feature might stoke drama and pull some users into the Reels feed, it could also have the opposite effect for those who don’t want all their interests broadcast.
Cash App is closing out the week on the hook for $255 million across multiple settlements over its consumer protection practices.
Block, the company that owns Cash App, agreed Wednesday to pay $80 million to 48 states that fined the company for violating laws intended to keep illicit activity off the platform.
“State regulators found Block was not in compliance with certain requirements, creating the potential that its services could be used to support money laundering, terrorism financing, or other illegal activities,” a press release from the Conference of State Bank Supervisors says.
Separately, the federal Consumer Financial Protection Bureau reached a settlement with Block on Thursday, in which the company agreed to pay $120 million to Cash App customers and another $55 million to the CFPB. According to the bureau, Cash App’s weak security measures put consumers at risk and made it difficult for users to get help after experiencing fraud on the platform. The bureau also accused Cash App of tricking consumers into thinking that their bank, not Cash App, was responsible for handling disputes, and said Cash App didn’t offer “meaningful and effective” customer service, which “left the network vulnerable to criminals defrauding users.”
How to regulate peer-to-peer money-transferring apps like Cash App is an ongoing fight. This week, NetChoice and TechNet sued to challenge the CFPB’s move to regulate such platforms like banks, calling it an “unlawful power grab.” Google filed a similar suit in December.
It’s been more than four years since Donald Trump first moved to expel TikTok from the US — and now, just days before a second Trump presidency begins, it just might happen.
President Joe Biden signed legislation last April that officially began the countdown that would force TikTok’s parent company, ByteDance, to divest from the US business. But even afterward, the atmosphere on the video powerhouse was mostly nonchalant, with a handful of stray jokes about “this app disappearing” slotted between the usual fare.
In the last week, though, the vibe has shifted — my favorite creators are posting links to their other social accounts, audiences are making highlight reels of the most viral moments on the app, and users are saying goodbye to their “Chinese spy” and threatening to hand over their data to the Chinese government. Xiaohongshu, a Chinese-owned app known as RedNote, topped the App Store this week, driven by a wave of “TikTok refugees” trying to recreate the experience of the platform. It feels a bit like a fever-dream version of the last day of school.
For many creatives online, this wouldn’t be the first time they’ve had to migrate to new spaces: reach, engagement, and visibility are constantly shifting even on the largest and most stable platforms. But the possibility that a social media site of this size would disappear — or slowly break down until it’s nonfunctional — is a new threat. For small creators especially, TikTok is like playing the lottery: you don’t need thousands of followers for your video to get big, and this unpredictability incentivized the average person to upload content.
It’s still unclear what will happen to TikTok after January 19th. I asked content creators what their game plan is. (Responses have been edited and condensed for clarity.)
“At the peak, I was making approximately 70 percent of my sales through TikTok from December 2020 to January 2022. Now, it drives, at most, 10 percent of my sales,” says Noelle Johansen, who sells slogan sweatshirts, accessories, stickers, and other products.
“At my peak with TikTok, I was able to reach so many customers with ease. Instagram and Twitter have always been a shot in the dark as to whether the content will be seen, but TikTok was very consistent in showing my followers and potential new customers my videos,” Johansen told The Verge in an email. “I’ve also made great friends from the artist community on TikTok, and it’s difficult to translate that community to other social media. Most apps function a lot differently than TikTok, and many people don’t have the bandwidth to keep up with all of the new socials and building platforms there.”
Going forward, Johansen says they’ll focus on X and Instagram for sales while working to grow an audience on Bluesky and Threads.
“I think the ease of use on TikTok opened an avenue for a lot of would-be creators,” Kay Poyer, a popular creator making humor and commentary content, says. “Right now we’re seeing a cleaving point, where many will choose to stop or be forced to adapt back to older platforms (which tend to be more difficult to build followings on and monetize).”
As for her own plans, Poyer says she’ll stay where the engagement is if TikTok becomes unavailable — smaller platforms like Bluesky or Neptune aren’t yet impactful enough.
“I’m seeing a big spike in subscribers to my Substack, The Quiet Part, as well as followers flooding to my Instagram and Twitter,” Poyer told The Verge. “Personally I have chosen to make my podcast, Meat Bus, the flagship of my content. We’re launching our video episodes sometime next month on YouTube.”
Bethany Brookshire, a science journalist and author, has been sharing videos about human anatomy on TikTok, Bluesky, Instagram, and YouTube. Across platforms, Brookshire has observed differences in audiences — YouTube, for example, “is not a place [to] build an audience,” she says, citing negative comments on her work.
“I find people on TikTok comment and engage a lot more, and most importantly, their comments are often touching or funny,” she says. “When I was doing pelvic anatomy, a lot of people with uteruses wrote in to tell me they felt seen, that they had a specific condition, and they even bonded with each other in the comments.”
Brookshire told The Verge in an email that sharing content anywhere can at times feel fraught. Between Nazi content on Substack, right-wing ass-kissing at Meta, and the national security concerns of TikTok, it doesn’t feel like any platform is perfectly ideal.
“Sometimes I feel like the only ethical way to produce any content is to write it out in artisanal chalk on an organically sourced vegan stone, which I then try to show to a single person with their consent before gently tossing it into the ocean to complete its circle of life,” Brookshire says. “But if I want to inform, and I want to educate, I need to be in the places people go.”
The Woodstock Farm Sanctuary in upstate New York uses TikTok to share information with new audiences — the group’s Instagram following is mostly people who are already animal rights activists, vegans, or sanctuary supporters.
“TikTok has allowed us to reach people who don’t even know what animal sanctuaries are,” social media coordinator Riki Higgins told The Verge in an email. “While we still primarily fundraise via Meta platforms, we seem to make the biggest education and advocacy impact when we post on TikTok.”
Walt and Waldo escaped separate slaughter operations in different towns over the summer. We were able to rescue both, and they became each other’s comfort as they adjusted in quarantine. Usually, the quarantine period is only a few weeks and then new residents move in with existing groups, but Walt experienced some serious medical emergencies that took him a long time to heal from, and Waldo stayed by his side during those months. Finally, we were able to move this pair into the main sheep barn and watch them integrate into their new family, which was so special to watch. #whywoodstock
With a small social media and marketing team of two, Woodstock Farm Sanctuary (like other small businesses and organizations) must be strategic in how it uses its efforts. YouTube content can be more labor-intensive, Higgins says, and Instagram Reels is missing key features like 2x video speed and the ability to pause videos.
“TikTok users really, really don’t like Reels. They view it as the platform where jokes, trends, etc., go to die, where outdated content gets recycled, and especially younger users see it as an app only older audiences use,” Higgins says.
The sanctuary says it will meet audiences wherever they migrate in the case that TikTok becomes inaccessible.
Anna Rangos, who works in social media and makes tech and cultural commentary videos, is no stranger to having to pick up and leave a social media platform for somewhere else. As a retired sex worker, she saw firsthand how fragile a social media following could be.
“You could wake up one day to find your accounts deactivated, and restoring them? Forget it. Good luck getting any kind of service from Meta,” Rangos said in an email. Having an account deleted means lost income and hours of trying to rebuild a following. “Over my time in the industry, I went through three or four Instagram accounts, constantly trying to recapture my following.”
Sex workers and sex education creators regularly deal with their content being removed, censored, or entire accounts deleted. Rangos says that though the community on TikTok is more welcoming, she’s working to stake out her own space through a website and a newsletter. She also plans to stay active on YouTube, Pinterest, and Bluesky.
“I don’t plan on using Meta products much, given [Mark] Zuckerberg’s recent announcements regarding fact-checking,” she wrote in an email.
“I have found so much joy and community on TikTok mostly through Native TikTok,” says Amanda Chavira, an Indigenous beader who built an audience through tutorials and cultural content. “It’s sad to see TikTok go.”
Chavira says she plans to reupload some of her content to YouTube Shorts to see how her videos perform there but otherwise will be waiting to see if another viable video platform comes along. Chavira won’t be pivoting to Meta: she says she plans to delete her accounts on Threads, Instagram, and Facebook.
“I’d been considering leaving my Meta accounts for a long time,” she said in an email. “Facebook felt like a terrible place through election cycles, and then the pandemic, [and] then every other post I was seeing was a suggested ad or clickbait article. For Instagram, I’ve really been struggling to reach my target audience and didn’t have the time available to post all the time to try to increase engagement.” Her final straw was Meta’s decision to end the fact-checking program and Zuckerberg’s “pandering to the Trump administration,” she says.
Drake’s ongoing legal battle with his label, Universal Music Group, has escalated. The artist filed a lawsuit in federal court today, accusing UMG of harming his reputation and endangering him for profit. The suit stems from the diss track “Not Like Us” by Kendrick Lamar, another UMG artist. Drake’s legal complaint also again accuses UMG of using bots on Spotify and other streaming platforms, as well as payola, to make the song more popular.
“On May 4, 2024, UMG approved, published, and launched a campaign to create a viral hit out of a rap track that falsely accuses Drake of being a pedophile and calls for violent retribution against him,” the complaint reads. “Even though UMG enriched itself and its shareholders by exploiting Drake’s music for years, and knew that the salacious allegations against Drake were false, UMG chose corporate greed over the safety and well-being of its artists.”
The lawsuit details a shooting at the home of Drake (real name: Aubrey Graham) just a few days after the song was released, during which a security guard was injured. Multiple break-ins occurred in the following days, which the lawsuit says were caused by UMG’s actions.
Why would UMG pit two of its own artists against each other? Drake’s team has a theory:
UMG’s actions are motivated, at least in part, by UMG’s desire to best position itself in negotiations with Kendrick Lamar in 2024 and Drake in 2025. With respect to Lamar, on information and belief, UMG was incentivized to prove that it could maximize Lamar’s sales—by any means necessary—after only being able to get him to sign a short-term exclusive contract. UMG wanted Lamar to see its value on an expedited timeframe in order to convince Lamar to re-sign exclusively and for a longer period of time. As to Drake, in 2024, his contract was nearing fulfillment. On information and belief, UMG anticipated that extending Drake’s contract would come at a high cost to UMG; as such, it was incentivized to devalue Drake’s music and brand in order to gain leverage in negotiations for an extension.
Lamar is not named as a defendant in the suit; instead, Drake’s legal team pins the blame on UMG for releasing the song despite knowing the song’s “allegations are unequivocally false.”
“Drake is not a pedophile. Drake has never engaged in any acts that would require he be ‘placed on neighborhood watch.’ Drake has never engaged in sexual relations with a minor. Drake has never been charged with, or convicted of, any criminal acts whatsoever,” the suit reads.
The suit follows a petition filed in November in which Drake accuses UMG and Spotify of artificially inflating the success of “Not Like Us” using payola and streaming bots. The petition — which itself isn’t a lawsuit but a precursor — was withdrawn this week. But the suit filed today includes similar allegations of “pay-for-play” schemes to get “Not Like Us” played on radio stations and promoted on streaming platforms. The suit also again accuses UMG of using bots to “artificially inflate the spread” of the song. It cites a “whistleblower” who claimed he was paid $2,500 over Zelle “via third parties to use ‘bots’ to achieve 30,000,000 streams on Spotify in the initial days following the Recording’s release.”
As The New York Times notes, Drake has enlisted Michael J. Gottlieb, the lawyer who represented the owner of the restaurant embroiled in the “Pizzagate” conspiracy theory. Drake’s complaint draws parallels between the shooting at the artist’s home and the shooting at the restaurant, calling it “the 2024 equivalent of ‘Pizzagate.’”
“The online response was similarly violent and hateful. An avalanche of online hate speech has branded Drake as a sex offender and pedophile, among other epithets,” the complaint reads.
UMG did not immediately respond to a request for comment.
Like the wildfire conditions in Los Angeles County, my For You page on TikTok turned overnight.
I woke up last week to a phone screen filled with ravenous flames and video after video of razed homes, businesses, and other structures. Influencers broke from their regular cadence of content to film themselves packing up a suitcase for evacuation; nameless accounts shared footage from streets I didn’t recognize, showcasing the devastation; freshly created profiles asked for help locating their lost pets. Scrolling on TikTok feels like trying to keep track of 1,000 live feeds at once, each urgent and horrifying in its own way.
What all of this amounts to is a different question entirely. Even as there’s no escaping disaster content, the clips, comments, check-ins, and footage are not actually very helpful. Our feeds are awash with both too much and not enough information. Though it’s not yet clear how these fires started, scientists say that climate change will only continue to exacerbate wildfires going forward. Current weather conditions — including a severe lack of rainfall this year in Los Angeles — have created a tinderbox in the region.
Questions like “Where are the shelters?” “Should I evacuate?” and “Where can I get a mask and other supplies?” are left unanswered in favor of frightening first-person reports. And who can blame Los Angeles-area residents? That’s what you’re supposed to do on TikTok. What they can’t do is share a link to mutual aid resources or to a news story about vital, up-to-date evacuation information. They can scroll endlessly on the algorithmic For You page, but they can’t sort content to display the most recent updates first. TikTok is simply not built to disseminate potentially lifesaving breaking news alerts. Instead, it’s filled with endless clips of news crews interviewing people who have lost everything.
The wildfire content machine echoes a similar phenomenon from just a few months ago, when October’s Hurricane Milton tore through Florida, killing dozens and causing billions of dollars in damages. Some of the most visible and viral content from the storm came from influencers and other content creators who stayed behind to vlog their way through the event, racking up millions of views. So far, there’s not the same risk-taking-for-viral-content dynamic at play with the fires in Southern California, but the overall experience is not that different: a random infotainment feed where a video of a person losing nearly every earthly possession is followed directly by someone testing a new makeup product. Media critic Matt Pearce put it best: “TikTok was largely indifferent to whether I live or die.”
Instagram seemed slightly more useful, but only, I suspect, if you follow people who post relevant content. In times of crisis — during the Black Lives Matter uprisings of 2020 or the ongoing bombardment of Gaza — Instagram Stories has become something of a bulletin board for resharing infographics and resources. Linking to relevant announcements and news stories is really only possible through Stories, but at least you can. Instagram search, on the other hand, is a chaotic mixture of user-generated infographics, grainy pictures of the fires that have been screenshotted and reuploaded multiple times, and distasteful selfies from bodybuilders wishing LA well.
It should go without saying that depraved conspiracy theories once again spread on X, including from billionaire owner Elon Musk and other right-wing influencers who falsely claimed DEI initiatives were responsible for the fires. Twitter, once functioning like a breaking news feed, is now overrun with crypto spam and Nazi sympathizers. Meanwhile, smaller, more specialized apps like Watch Duty, a nonprofit wildfire monitoring platform, have filled gaps. On Bluesky, an X competitor, users have the option to pin feeds based on trending topics, creating a custom landing page for LA fire content.
We are in for more, not fewer, extreme weather events like storms and heatwaves, and it’s worth asking ourselves whether we are prepared to do this all over again. Platform decay is all the more apparent in times of emergency, when users are forced to wade through astronomical amounts of garbage: video content that scares but doesn’t help us, news websites with so many pop-up ads it feels illegal, or ramblings from tech elites who are looking for someone to blame rather than a way to help. By my estimations, our feeds will return to regularly scheduled programming in five or so business days, and the devastation from these fires will get lost in a sea of comedy skits and PR unboxings. Until, of course, the next one.
In September 2023, Meta made a big deal of its new AI chatbots that used celebrities’ likenesses: everyone from Kendall Jenner to MrBeast leased themselves out to embody AI characters on Instagram and Facebook. The celebrity-based bots were killed off last summer after less than a year, but users have recently been finding a handful of other, entirely fake bot profiles still floating around — and the reaction is not good.
There’s “Jane Austen,” a “cynical novelist and storyteller”; “Liv,” whose bio claims she is a “proud Black queer momma of 2 & truth-teller”; and “Carter,” who promises to give users relationship advice. All are labeled as “AI managed by Meta” and the profiles date back to when the initial announcement was made. But the more than a dozen AI characters have apparently not been very popular: each has just a few thousand followers, with their posts getting just a few likes and comments.
That is, until the last week or so. After a wave of coverage in outlets like Rolling Stone and posts circulating on social media, the bot accounts are just now being noticed, and the reaction is confusion, frustration, and anger.
“What the fuck does an AI know about dating?????”...
Browsing Amazon Haul, the online shopping giant’s new $20-and-under bargain bin section, I immediately recognize not just a product, but a specific image. The photo is on a listing for “Timeless Black Dress for Both Casual and Formal Gatherings,” and it is stolen.
The image actually belongs to a New York-based independent brand called Mirror Palais, which sold the “Daisy Dress” for $545 a few years ago. Elevated by social media algorithms and its celebrity fans, Mirror Palais’ images have traveled from the brand’s website, to tweets, to Pinterest mood boards, and finally, to the discount section of the world’s largest online retailer, where it is — obviously — not for sale. On Amazon, it’s listed for $7.49. When I add it to my cart, I realize it’s even cheaper: inexplicably and improbably, it is an additional 65 percent off.
This one image of a black mini dress does not just appear on random listings on Amazon Haul — you can find it on Walmart and AliExpress as well as smaller sites with names like Mermaid Way and VMzona — all selling dupes (short for duplicates) of the original Mirror Palais dress. I even find a separate listing on Amazon, in the typically but not unbelievably...
Forbes will stop using freelancers for some types of stories indefinitely — and has blamed the change on a recent update to Google Search policies.
In recent days, Forbes has said it will stop hiring freelancers to produce content for its product review section, Forbes Vetted, according to a journalist who has written for the site. In a note shared with The Verge, an editor at Forbes cited Google’s “site reputation abuse” policy as the reason for the change.
Site reputation abuse — also called parasite SEO — refers to a website publishing a deluge of off-brand or irrelevant content in order to take advantage of the main site’s ranking power and reputation in Google Search. Often, this piggybacking is concealed from users browsing the website. (For instance: those weird coupon code sections on newspaper sites that pop up via search engines but aren’t prominently displayed on the homepage.) Sometimes this spammy content is produced by third-party marketing firms that are contracted to produce a mountain of search-friendly content.
Forbes did not respond to multiple requests for comment. It’s not clear what other sections of Forbes the pause extends to. Writer Cassandra Brooklyn described receiving similar news last week.
Many news outlets (including The Verge) hire freelancers to write and report stories. But Forbes has an especially wide pool of outside contributors publishing to its site. Many of these writers are legitimate journalists who do fair, in-depth reporting. But there’s also the Forbes contributor network, a group of thousands of marketers, CEOs, and other outside experts who get to publish questionable content under the trusted Forbes name.
Some editorial content on the site may have drawn the ire of Google, which has been targeting the firehose of search engine-first content on the web. In November, Google further tightened its rules around parasite SEO, specifically taking aim at the “third party” nature of this type of content.
“Our evaluation of numerous cases has shown that no amount of first-party involvement alters the fundamental third-party nature of the content or the unfair, exploitative nature of attempting to take advantage of the host site’s ranking signals,” the company wrote in a blog post.
Like other testing and review sites, Forbes Vetted makes money every time a reader makes a purchase using links in the outlet’s articles. A writer who got word of the pause in freelance work says the editorial process on their past stories was rigorous — they would test products, go through multiple rounds of edits, and interview sources. In addition to the pause in work, the writer was told that some of their stories may need to be completely re-reported and republished by an in-house staff member.
“They clearly put a ton of resources into Forbes Vetted,” the writer says. “The big product reviews I was doing were $3,000 a piece, which is a huge amount of money to then be like, ‘Oh, we have to rewrite all this with staff in-house.’”
Google’s spam policies state that the existence of freelancer content in and of itself does not run afoul of the site reputation abuse policy — it’s only a violation if that content is also designed to take advantage of the site’s ranking signals. Google spokesperson Davis Thompson directed The Verge to an FAQ section describing the freelancer policy.
Snapchat is tweaking how people earn money on the platform by introducing a new, unified monetization program. The new program will cover content posted to Stories as well as Spotlight, the platform’s TikTok-like discovery feed filled with recommended video content. Under the program, influencers earn revenue for ads placed within eligible Stories and Spotlight posts.
Previously, monetization of these formats was splintered off from one another: Stories earnings were in one bucket, and Spotlight earnings were handled through a different program.
The new program is currently in testing with a small group of users, and will roll out widely on February 1st, 2025. To participate, users need to hit a set of benchmarks to be invited: 50,000 followers and either 10 million Snap views, 1 million Spotlight views, or 12,000 hours of watch time in the last 28 days.
They also need to post consistently: at least 25 posts per month to saved Stories or Spotlight, and posts to Spotlight or public Stories on at least 10 of the last 28 days. Some of those eligibility requirements are significantly higher than they were under the old structure. To be eligible to earn money through Spotlight, for example, creators previously needed things like a public profile, 1,000 followers, and 10,000 video views.
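To make the compound requirement concrete, here is a minimal Python sketch of how the reported thresholds combine a reach test with a consistency test, assuming only the figures cited above; the function and parameter names are my own illustration, not anything Snap publishes.

# Minimal, illustrative sketch of the reported eligibility logic (last 28 days).
# The figures come from the thresholds described above; names are hypothetical.
def is_eligible(followers: int, snap_views: int, spotlight_views: int,
                watch_hours: float, monthly_posts: int, active_days: int) -> bool:
    # Reach: 50k followers AND one of the three view/watch-time benchmarks.
    reach_ok = followers >= 50_000 and (
        snap_views >= 10_000_000
        or spotlight_views >= 1_000_000
        or watch_hours >= 12_000
    )
    # Consistency: 25+ monthly posts and posting on 10+ of the last 28 days.
    consistency_ok = monthly_posts >= 25 and active_days >= 10
    return reach_ok and consistency_ok

# Example: 60k followers, 1.2M Spotlight views, 30 posts, 12 active days -> True
print(is_eligible(60_000, 0, 1_200_000, 0, 30, 12))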
Other video platforms have also streamlined or changed their original creator funds. TikTok, for example, now has one creator program that requires videos longer than one minute. On YouTube, Shorts creators earn money via ad revenue — a move by the company to meaningfully compete with TikTok.
When the first grainy images of the UnitedHealthcare shooting suspect emerged, some viewers noticed a seemingly small detail: he looked like he was wearing a Peak Design Everyday V1 backpack. Now, on platforms like Threads and TikTok, a recurring accusation has circulated: Peak Design “traced” the bag owner using the backpack’s serial number.
However, the company says that’s just not true in a statement shared with The Verge on Friday afternoon. “Peak Design has not provided customer information to the police and would only do so under the order of a subpoena,” reads the statement, which is signed by CEO Peter Dering.
“We cannot associate a product serial number with a customer unless that customer has voluntarily registered their product on our site.” The statement goes on to say that the serial numbers on the V1 of the Everyday backpack “were not unique or identifying ... We did not implement unique serial numbers until V2 iterations of our Everyday Backpack.”
In footage of the killing of UnitedHealthcare CEO Brian Thompson, the shooter has a gray backpack with a top flap, which the NYPD believes is the same one it recovered in Central Park a few days later. The recovered bag is gray with black piping and what looks to be a tan contrasting tab on the corner of the flap — just like Peak Design’s crowdfunded “Everyday” V1 model.
Dering told The New York Times last week that the item was likely bought between 2016 and 2019. He also told the Times that he called the NYPD tip line to share what he knew and vowed to do “whatever is possible” to identify the shooter, including consulting Peak Design’s legal team to see what he could share with police.
The Times story is just 300 words long, but it appears to have sparked a wave of anger among those sympathetic to the suspect, Luigi Mangione. Although the Times story makes no mention of a serial number, the rumor spread like wildfire before the company’s denial today.
On the Peak Design subreddit, which is moderated by the brand, posts have popped up discussing the company’s ability to track customers using the serial number on a bag and offering tips on how to delete customer information. The complaints largely center on the fact that Dering volunteered any information at all to police — a sign of how much public attitudes around the killing have shifted.
In a follow-up email to The Verge, Dering added: “If you do choose to register a Peak Design product, and it is lost or stolen, you can reach out to our customer service team and have your registration erased, so the bag is not traceable back to you.”
Tumblr is introducing a new Community feature — in-app groups organized by topic or interest.
Communities are similar to subreddits or Facebook groups and had previously been in beta. Topics include things like film photography, marine biology, LGBTQ, and video games, and each topic has its own landing page where posts shared with the community populate. Many of the features mirror Reddit, like a count of how many members are online, moderators, and community guidelines. Posts shared to communities also get a new comments section that’s only visible within the group. Communities have the option to be public or private.
Tumblr pulling a page from Reddit’s playbook shouldn’t be surprising. As other communities and forums on the web have died off or been eaten by Google, Reddit has been on the up-and-up, growing its user base and turning a profit for the first time. But subreddits managed by users are both Reddit’s crown jewels and a thorn in the side of corporate interests, as demonstrated by the coordinated action taken last year in protest of changes to the platform’s API pricing structure. As Google Search degrades in usefulness — or is replaced by AI summaries — platforms like Reddit have become a central part of finding helpful information online.
Reddit is also adding search engine-like features, including an AI-powered summary tool called Answers announced earlier this week. Though Tumblr’s communities feature — and Tumblr in general — isn’t the search destination Reddit is, the new grouping feature does streamline how users can find and engage with topic-based content and peers with similar interests. Communities is available on the web, iOS, and Android.
Beginning in February, health insurer Anthem Blue Cross Blue Shield was planning to set a time limit for anesthesia coverage during surgeries and procedures. Now, following days of widespread outrage at the health insurance industry generally, Anthem is walking that policy back, the insurer announced on Thursday.
In mid-November, the American Society of Anesthesiologists issued a press release about the policy, which was set to take effect in February in states like Connecticut, New York, and Missouri.
“If an anesthesiologist submits a bill where the actual time of care is longer than Anthem’s limit, Anthem will deny payment for the anesthesiologist’s care,” the group writes. “With this new policy, Anthem will not pay anesthesiologists for delivering safe and effective anesthesia care to patients who may need extra attention because their surgery is difficult, unusual or because a complication arises.”
The letter appears to have garnered little public attention until this week, when several posts on social media about the policy change began circulating. The posts gained traction after the CEO of UnitedHealthcare, Brian Thompson, was shot and killed in New York on Wednesday in what police say was a targeted attack.
A spokesperson for Anthem’s parent company, Elevance Health, told The New York Times that “misinformation” about the plan contributed to Anthem’s reversal.
“We realized, based on all the feedback we’ve been receiving the last 24 hours, that our communication about the policy was unclear, which is why we’re pulling back,” Janey Kiryluik, staff vice president for corporate communications, is quoted as saying.
Thompson’s shooting shocked the public, but it also ignited discussions about the havoc wreaked by the US healthcare system and insurers like UnitedHealthcare. United specifically has been the subject of investigations by outlets like Stat, which found the company uses algorithms to cut off payments and deny rehabilitation care for patients. The rate at which insurers deny patient claims is a closely guarded secret, but ProPublica last year followed one chronically ill patient’s fight to get coverage from United. In some online forums, there was little sympathy for the company and Thompson’s death: Americans carry at least $220 billion in medical debt, which upends lives as insurance companies profit.
For years, a fan-run account called Muppet History has been central to the Muppets fandom. It shared little-known facts, memes, and wholesome messages, amassing half a million followers on Instagram and more than 280,000 on X. Publicly, it was a wholesome and sweet platform, a passion project that took off. It became an unofficial ambassador of Jim Henson’s iconic cast of characters — inside and outside the world of diehard fans.
But on Monday night, a post on the account’s Instagram page had an ominous tone. “Good Evening,” the message started. “We wanted to take a moment to address some concerns that have arisen as of late.” The vague post — on which comments had been disabled — mentioned “overstepped” boundaries, the “harm” caused, and that people were made “uncomfortable.” It did not specify exactly what had happened.
Since that post, however, a rough sketch has come into focus. Fans claim that Muppet History’s co-runner Joshua Gillespie, who operates the account with his wife, Holly, was sending unwanted sexual messages to other people. Now, it’s gone from a bright spot on the internet to another soured piece of online culture, leaving a small community navigating the...
One Amazon influencer makes a living posting content from her beige home. But after she noticed another account hawking the same minimal aesthetic, a rivalry spiraled into a first-of-its-kind lawsuit. Can the legal system protect the vibe of a creator? And what if that vibe is basic?