
Consultant found guilty in murder of Cash App founder Bob Lee

Nima Momeni has been found guilty of second-degree murder in the death of notable fintech figure Bob Lee. Momeni, an IT consultant and entrepreneur, faces 16 years to life in prison. Best known as the creator and founder of Cash App, Lee was fatally stabbed in April 2023 in San Francisco. Momeni was arrested as a suspect nine days later.

The autopsy report revealed that Lee was under the influence of cocaine, alcohol and ketamine at the time of his death. He had been in the company of Momeni and his sister, Khazar Momeni, that night. Prosecutors argued that Mr. Momeni attacked Lee on the street after discovering that an associate of Lee's had drugged and assaulted Ms. Momeni. Momeni's legal team said he acted in self-defense and was unaware that Lee was injured in their altercation.

Most recently, Lee had been the chief product officer for cryptocurrency operation MobileCoin. He had previously held an executive position at Square and played a role in developing the Android mobile operating system at Google.

This article originally appeared on Engadget at https://www.engadget.com/consultant-found-guilty-in-murder-of-cash-app-founder-bob-lee-192430902.html?src=rss

© MobileCoin

Headshot of Bob Lee, founder and creator of Cash App

ACLU highlights the rise of AI-generated police reports – what could go wrong?

The American Civil Liberties Union (ACLU) is sounding a warning about the use of AI to create police reports, saying the tech could produce errors that affect evidence and court cases. The nonprofit highlighted the dangers of the tech in a white paper, following news that police departments in California are using a program called Draft One from Axon to transcribe body camera recordings and create first drafts of police reports.

One police department in Fresno said that it's using Draft One under a pilot program, but only for misdemeanor reports. "It's nothing more than a template," deputy chief Rob Beckwith told Industry Insider. "It's not designed to have an officer push a button and generate a report." He said that the department hasn't seen any errors with transcriptions and that it consulted with the Fresno County DA's office in training the force.

However, the ACLU noted four issues with the use of AI. First off, it said that AI is "quirky and unreliable and prone to making up facts... [and] is also biased." Secondly, it said that an officer's memories of an incident should be memorialized "before they are contaminated by an AI's body camera based storytelling." It added that if a police report is just an AI rehash of body camera video, certain facts might be omitted, and it may even allow officers to lie if they did something illegal that wasn't captured on camera.

The third point concerned transparency: the public needs to understand exactly how these systems work, based on analysis by independent experts, according to the ACLU. Defendants in criminal cases also need to be able to interrogate the evidence, "yet much of the operation of these systems remains mysterious." Finally, the group noted that the use of AI transcriptions might remove accountability around the use of discretionary power. "For these reasons, the ACLU does not believe police departments should allow officers to use AI to generate draft police reports," it said.

This article originally appeared on Engadget at https://www.engadget.com/ai/aclu-highlights-the-rise-of-ai-generated-police-reports--what-could-go-wrong-133030452.html?src=rss

© ACLU

Apple sued for failing to implement tools that would detect CSAM in iCloud

Apple is being sued by victims of child sexual abuse over its failure to follow through with plans to scan iCloud for child sexual abuse materials (CSAM), The New York Times reports. In 2021, Apple announced it was working on a tool to detect CSAM that would flag images showing such abuse and notify the National Center for Missing and Exploited Children. But the company was hit with immediate backlash over the privacy implications of the technology, and ultimately abandoned the plan.

The lawsuit, which was filed on Saturday in Northern California, is seeking damages upwards of $1.2 billion for a potential group of 2,680 victims, according to NYT. It claims that, after Apple showed off its planned child safety tools, the company "failed to implement those designs or take any measures to detect and limit" CSAM on its devices, leading to the victims' harm as the images continued to circulate.

In a statement shared with Engadget, Apple spokesperson Fred Sainz said, "Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts."

The lawsuit comes just a few months after Apple was accused of underreporting CSAM by the UK's National Society for the Prevention of Cruelty to Children (NSPCC).

Update, December 8, 2024, 6:55PM ET: This story has been updated to include Apple's statement to Engadget.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/apple-sued-for-failing-to-implement-tools-that-would-detect-csam-in-icloud-202940984.html?src=rss

© Reuters

FILE PHOTO: The Apple Inc logo is seen at the entrance to the Apple store in Brussels, Belgium November 28, 2022. REUTERS/Yves Herman/File Photo

Former Celsius CEO pleads guilty to two fraud charges

Former Celsius CEO Alex Mashinsky has pleaded guilty to two fraud charges. The founder and former chief executive of the cryptocurrency lender was indicted on seven criminal counts in 2023, including charges of fraud, conspiracy and market manipulation. He entered a not guilty plea at the time, but in a hearing today, Mashinsky pleaded guilty to two of those original counts: commodities fraud and a fraudulent scheme to manipulate the price of CEL, Celsius's in-house crypto token. Reuters reported that as part of a plea deal, Mashinsky has agreed not to appeal any sentence of 30 years or less.

Mashinsky's case is one of several fraud cases being pursued against leaders of cryptocurrency operations. The most well-publicized charges are those brought against FTX founder Sam Bankman-Fried, who was found guilty on seven counts of fraud in 2023.

National agencies began a push into fraud charges for cryptocurrency schemes in 2022, when several notable companies filed for bankruptcy as token prices plummeted in response to rising interest rates and high inflation. That year, the Federal Trade Commission said that victims of crypto schemes had lost more than $1 billion since 2021.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/former-celsius-ceo-pleads-guilty-to-two-fraud-charges-224046043.html?src=rss

© Reuters

Alex Mashinsky, founder and former CEO of bankrupt cryptocurrency lender Celsius Network, exits the Manhattan federal court in New York City, U.S., July 25, 2023. REUTERS/Brendan McDermid

Russia arrests ransomware attacker Wazawaka

One of the world's most notorious hackers could finally be in custody. Bleeping Computer reports that ransomware affiliate Mikhail Pavlovich Matveev, also known as Wazawaka, Uhodiransomwar, m1x and Boriselcin, has been arrested.

Prosecutors have not confirmed that Matveev is under arrest, but reports indicate he may be the hacker in Russian custody. The Russian state news agency РИА Новости (RIA Novosti), in a report translated on Bluesky by the Center for Strategic Research's Oleg Shakirov, said that the Kaliningrad Interior Ministry and Russian prosecutors have sent the case of "a programmer accused of creating a malicious program" to court. An anonymous source with knowledge of the matter confirmed that Matveev is the programmer.

Matveev is also wanted on charges in the US for launching attacks on US law enforcement agencies and healthcare organizations going back as far as 2020. The US State Department offered a $10 million reward for information leading to his capture in May of last year, when the Department of Justice filed criminal charges against him. If he's in Russian custody, the US may not get a chance to prosecute him.

Matveev, a Russian national, has links to ransomware hacking groups such as Hive, LockBit and Babuk. He's linked to a number of attacks, including an April 2021 lockout attack on the systems of the Washington D.C. Metropolitan Police Department. More than a year later, he allegedly helped launch a Hive ransomware attack on a healthcare NGO in New Jersey.

Attacks from LockBit are particularly destructive and egregious. In late 2022, the group infected the computer systems of 1,400 victims, including a Holiday Inn hotel in Turkey. The Treasury Department's Office of Foreign Assets Control also placed sanctions against Matveev for his role in several ransomware attacks on US services and critical infrastructure targets. The Justice Department believes Matveev has extracted more than $75 million from his victims in ransom payments.

This article originally appeared on Engadget at https://www.engadget.com/cybersecurity/russia-arrests-ransomware-attacker-wazawaka-202134431.html?src=rss

© FBI

The US Department of Justice is offering a $10 million reward for Mikhail Pavlovich Matveev.

Snap calls New Mexico's child safety complaint a 'sensationalist lawsuit'

Snap has accused New Mexico's attorney general of intentionally seeking out adult users looking for sexually explicit content in order to make its app seem unsafe, in a filing asking the court to dismiss the state's lawsuit. In the document shared by The Verge, the company questioned the veracity of the state's allegations. The attorney general's office said that while it was using a decoy account posing as a 14-year-old girl, the account was added by a user named Enzo (Nud15Ans). From that connection, the app allegedly suggested over 91 users, including adults looking for sexual content. Snap said in its motion to dismiss, however, that those "allegations are patently false."

It was the decoy account that searched for and added Enzo, the company wrote. The attorney general's operatives were also the ones who looked for and added accounts with questionable usernames, such as "nudenude_22" and "xxx_tradehot." In addition, Snap accused the office of "repeatedly [mischaracterizing]" its internal documents. The office apparently cited a document when it claimed in its lawsuit that the company "consciously decided not to store child sex abuse images" and when it suggested that Snap doesn't report and provide those images to law enforcement. Snap denied that this was the case, clarifying that it isn't permitted to store child sexual abuse materials (CSAM) on its servers and that it turns such materials over to the National Center for Missing and Exploited Children.

The New Mexico Department of Justice's director of communications was not impressed with the company's arguments. In a statement sent to The Verge, Lauren Rodriguez accused Snap of focusing on the minor details of the investigation in an "attempt to distract from the serious issues raised in the State's case." Rodriguez also said that "Snap continues to put profits over protecting children" instead of "addressing... critical issues with real change to their algorithms and design features."

New Mexico came to the conclusion that Snapchat's features "foster the sharing of child sexual abuse material (CSAM) and facilitate child sexual exploitation" after a months-long investigation. It reported that it found a "vast network of dark web sites dedicated to sharing stolen, non-consensual sexual images from Snap" and that Snapchat was "by far" the biggest source of images and videos on the dark web sites that it had seen. The attorney general's office called Snapchat "a breeding ground for predators to collect sexually explicit images of children and to find, groom and extort them." Snap employees encounter 10,000 sextortion cases each month, the office's lawsuit said, but the company allegedly doesn't warn users so as not to "strike fear" among them. The complaint accused Snap's upper management of ignoring former trust and safety employees who'd pushed for additional safety mechanisms, as well.

This article originally appeared on Engadget at https://www.engadget.com/apps/snap-calls-new-mexicos-child-safety-complaint-a-sensationalist-lawsuit-140034898.html?src=rss

© Cheng Xin via Getty Images

CHONGQING, CHINA - OCTOBER 27: The Snapchat app page is displayed on a smartphone in the Apple App Store in front of the Snap Inc. logo on October 27, 2024 in Chongqing, China. (Photo by Cheng Xin/Getty Images)