Google Pixel owners continue to report battery drain issues with May update

If your Pixel’s battery doesn’t seem to last as long following the May 2025 update from Google, you’re not alone.
It’s finally possible to purchase an audiobook from Spotify’s iPhone app with just a few taps. On Monday, Spotify announced that Apple approved an update that allows users in the US to see audiobook pricing within the app and buy individual audiobooks outside the App Store.
The update also lets Spotify Premium subscribers purchase additional audiobook listening hours. This change follows last month’s Epic Games vs. Apple ruling, which upended the iPhone maker’s control over the App Store. Under the ruling, Apple can’t collect fees on purchases made outside the App Store, nor can it govern how developers point to external purchases.
Spotify submitted the update last week, but now it’s official. The music streaming service pulled audiobook purchases from its iOS app in 2022 after accusing Apple of “choking competition” with App Store rules that made it more difficult to purchase audiobooks. Spotify also started letting iPhone users purchase subscriptions outside the App Store earlier this month.
The iOS apps for Kindle, Patreon, and Delta’s emulator have also taken advantage of the court ruling, but Epic Games is still fighting to bring Fortnite back to the App Store. “This change lowers the barriers for more users to embrace their first — or tenth — audiobook, while allowing publishers and authors to reach fans and access new audiences seamlessly,” Spotify said in its announcement.
WizKids did Baldur’s Gate 3 dirty with its new D&D Icons of the Realms collection of miniature figures. The anticipated collection based on the beloved RPG (my personal 2023 GOTY) launched in April, and some buyers noticed that the characters look nothing — and I mean nothing — like how they were advertised. These cursed recreations of Shadowheart, Astarion, Karlach, Gale, Wyll, and Lae’zel look like unlicensed knockoffs when they are, in fact, officially licensed by Wizards of the Coast. And the set costs $50!
Anyone who has played the game, or at least seen its cover art numerous times, knows what these characters should look like. These figures look like 3D-printed discards predestined for the garbage bin, with little to no effort made to look authentic, unless perhaps you’re squinting from 10 feet away.
In response to buyers posting pictures of the botched figures on social media and retail sites, and likely to GamesRadar and other sites covering the fiasco last week, WizKids posted an apology to buyers — and to Wizards of the Coast. In it, the company details how anyone who bought the figures online or at a local games store can get a refund or a replacement. Keep them, replace them, destroy them, pray to them. It’s up to you.
As previewed earlier this month, Google today released the NotebookLM app for Android and iOS ahead of I/O 2025.
Earlier in the year, Beats launched the Powerbeats Pro 2, the first Apple earbuds to feature in-ear heart rate monitoring. The debut was accompanied by the “Listen to Your Heart” campaign, voiced by RZA and starring LeBron James, Lionel Messi, and Shohei Ohtani.
Now, Beats is back with a new ad featuring Chinese tennis star and Olympic gold medalist Zheng Qinwen, affectionately known as “Queen Wen.” The ad marks her first major appearance since becoming a global ambassador for the brand earlier this year.
SAG-AFTRA, the organization that represents voice, motion, and screen performers, has filed an unfair labor complaint against Epic Games. The complaint stems from the company’s recent introduction of an AI programmed to sound like James Earl Jones’ Darth Vader that can respond to a player’s actions and questions.
SAG-AFTRA wrote in a statement that it understands its members and members’ estates wish to use AI technology in any way they choose. “However,” SAG-AFTRA’s statement continued, “we must protect our right to bargain terms and conditions around uses of voice that replace the work of our members, including those who previously did the work of matching Darth Vader’s iconic rhythm and tone in video games.”
While the AI “revolution” slowly replaces human workers with oftentimes inferior products, and despite some members’ distaste for the practice, SAG-AFTRA has embraced the idea of using AI trained to replicate an actor’s performance. It has established contracts and partnerships with several AI companies on the premise that members can use this technology with specific contract-guaranteed protections. So the act of using an AI to replace Darth Vader’s voice performers (both the late James Earl Jones and those brought in after his death to match his performance) isn’t what SAG-AFTRA is objecting to. Rather, it’s the fact that Epic Games did this without sitting down with SAG-AFTRA at the bargaining table to hash out the specifics.
“Fortnite’s signatory company, Llama Productions, chose to replace the work of human performers with AI technology,” SAG-AFTRA wrote. “Unfortunately, they did so without providing any notice of their intent to do this and without bargaining with us over appropriate terms.” The Verge has reached out to Epic Games for comment.
AI and its use in video game voice and motion performance is the main stumbling block in the ongoing video game voice actor strike. Negotiations between SAG-AFTRA and the signatory companies of its interactive media agreement broke down last year, and performers have been on strike since July — a length of time that eclipses both the actors’ and writers’ strikes of 2023.
President Donald Trump signed the Take It Down Act into law, enacting a bill that will criminalize the distribution of nonconsensual intimate images (NCII) — including AI deepfakes — and require social media platforms to promptly remove them when notified.
The bill sailed through both chambers of Congress with several tech companies, parent and youth advocates, and first lady Melania Trump championing the issue. But critics — including a group that’s made it its mission to combat the distribution of such images — warn that its approach could backfire and harm the very survivors it seeks to protect.
The law makes publishing NCII, whether real or AI-generated, criminally punishable by up to three years in prison, plus fines. It also requires social media platforms to have processes to remove NCII within 48 hours of being notified and “make reasonable efforts” to remove any copies. The Federal Trade Commission is tasked with enforcing the law, and companies have a year to comply.
“I’m going to use that bill for myself, too”
Under any other administration, the Take It Down Act would likely still face much of the pushback it does today from groups like the Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT), which warn the takedown provision could be used to remove or chill a wider array of content than intended, as well as threaten privacy-protecting technologies like encryption, since services that use it have no way of seeing (or removing) the messages between users. But the Trump administration’s actions in its first 100 days in office — including breaching Supreme Court precedent by firing the two Democratic minority commissioners at the FTC — have added another layer of fear for some of the law’s critics, who worry it could be used to threaten or stifle political opponents. Trump, after all, said during an address to Congress this year that once he signed the bill, “I’m going to use that bill for myself, too, if you don’t mind, because nobody gets treated worse than I do online. Nobody.”
The Cyber Civil Rights Initiative (CCRI), which advocates for legislation combating image-based abuse, has long pushed for the criminalization of nonconsensual distribution of intimate images (NDII). But the CCRI said it could not support the Take It Down Act because it may ultimately provide survivors with “false hope.” On Bluesky, CCRI President Mary Anne Franks called the takedown provision a “poison pill … that will likely end up hurting victims more than it helps.”
“Platforms that feel confident that they are unlikely to be targeted by the FTC (for example, platforms that are closely aligned with the current administration) may feel emboldened to simply ignore reports of NDII,” they wrote. “Platforms attempting to identify authentic complaints may encounter a sea of false reports that could overwhelm their efforts and jeopardize their ability to operate at all.”
In an interview with The Verge, Franks expressed concern that it could be “hard for people to parse” the takedown provision. “This is going to be a year-long process,” she said. “I think that as soon as that process has happened, you’ll then be seeing the FTC being very selective in how they treat supposed non-compliance with the statute. It’s not going to be about putting the power in the hands of depicted individuals to actually get their content removed.”
Trump, during his signing ceremony, dismissively referenced criticism of the bill. “People talked about all sorts of First Amendment, Second Amendment… they talked about any amendment they could make up, and we got it through,” he said.
Legal challenges to the most problematic parts may not come immediately, however, according to Becca Branum, deputy director of CDT’s Free Expression Project. “It’s so ambiguously drafted that I think it’ll be hard for a court to parse when it will be enforced unconstitutionally” before platforms have to implement it, Branum said. Eventually, users could sue if they have lawful content removed from platforms, and companies could ask a court to overturn the law if the FTC investigates or penalizes them for breaking it — it just depends on how quickly enforcement ramps up.