
Researchers figure out how to get fresh lithium into batteries

As the owner of a 3-year-old laptop, I feel the finite lifespan of lithium batteries acutely. It's still a great machine, but the cost of a battery replacement would take me a significant way down the path of upgrading to a newer, even greater machine. If only there were some way to just plug it in overnight and come back to a rejuvenated battery.

While that sounds like science fiction, a team of Chinese researchers has identified a chemical that can deliver fresh lithium to well-used batteries, extending their life. Unfortunately, the approach only works if the battery was constructed with this refresh in mind, and it hasn't yet been tested with the lithium chemistries commonly used in consumer electronics.

Finding the right chemistry

The degradation of battery performance is largely a matter of key components gradually dropping out of use within the battery. Over repeated charge/discharge cycles, bits of the electrodes fragment and lose contact with the conductors that collect current, while lithium can end up trapped in electrically isolated complexes. There's no obvious way to re-mobilize these lost materials, so the battery's capacity drops. Eventually, the only way to get more capacity is to recycle the internals into a completely new battery.

© Kinga Krzeminska

Microsoft demonstrates working qubits based on exotic physics

On Wednesday, Microsoft released an update on its efforts to build quantum computing hardware based on the physics of quasiparticles that have largely been the domain of theorists. The information coincides with the release of a paper in Nature that provides evidence that Microsoft's hardware can actually measure the behavior of a specific hypothesized quasiparticle.

Separately, the company announced that it has built hardware that uses these quasiparticles as the foundation for a new type of qubit, one that Microsoft is betting will let it overcome the head start of companies that have been producing qubits for years.

The zero mode

Quasiparticles are collections of particles (and, in some cases, field lines) that can be described mathematically as if they were a single particle with properties that are distinct from their constituents. The best-known of these are probably the Cooper pairs that electrons form in superconducting materials.

© John Brecher for Microsoft

Turning the Moon into a fuel depot will take a lot of power

If humanity is ever to spread out into the Solar System, we're going to need to find a way to put fuel into rockets somewhere other than the cozy confines of a launchpad on Earth. One option for that is in low-Earth orbit, which has the advantage of being located very close to said launch pads. But it has the considerable disadvantage of requiring a lot of energy to escape Earth's gravity: it takes a lot of fuel to put substantially less fuel into orbit.
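That last point follows from the rocket equation, which makes propellant requirements grow exponentially with the velocity change needed. A minimal sketch (the delta-v and specific-impulse figures below are illustrative assumptions, not values from the article):

```python
import math

def propellant_fraction(delta_v_m_s: float, isp_s: float) -> float:
    """Fraction of initial mass that must be propellant (Tsiolkovsky rocket equation)."""
    g0 = 9.81  # m/s^2, standard gravity
    mass_ratio = math.exp(delta_v_m_s / (isp_s * g0))  # m_initial / m_final
    return 1.0 - 1.0 / mass_ratio

# Roughly 9,400 m/s of delta-v to reach low-Earth orbit, with an engine at
# ~350 s specific impulse (both figures are illustrative assumptions):
frac = propellant_fraction(9400, 350)
print(f"~{frac:.0%} of liftoff mass must be propellant")
```

Under those assumptions, over 90 percent of the rocket's liftoff mass is propellant, which is why launching fuel up from Earth is such an inefficient way to stock an orbital depot.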

One alternative is to produce fuel on the Moon. We know there is hydrogen and oxygen present, and the Moon's gravity is far easier to overcome, meaning more of what we produce there can be used to send things deeper into the Solar System. But there is a tradeoff: Any fuel-production infrastructure will likely need to be built on Earth and sent to the Moon.

How much infrastructure is that going to involve? A study released today in PNAS evaluates the energy costs of producing oxygen on the Moon and finds that they're substantial: about 24 kWh per kilogram. That doesn't sound bad until you start considering how many kilograms we're eventually going to need.
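To get a feel for the scale, here's a back-of-the-envelope calculation using the study's 24 kWh/kg figure (the 100-ton oxygen requirement is a made-up illustration, not a number from the study):

```python
ENERGY_PER_KG_KWH = 24  # energy to produce 1 kg of oxygen on the Moon (per the study)

# Suppose a refueling depot needs 100 metric tons of liquid oxygen (illustrative):
oxygen_kg = 100_000
total_kwh = oxygen_kg * ENERGY_PER_KG_KWH
print(f"{total_kwh / 1e6:.1f} GWh of electricity")  # 2.4 GWh

# For scale, that's a dedicated 1 MW power plant running around the clock:
hours = total_kwh / 1000  # 1 MW delivers 1,000 kWh per hour
print(f"{hours / 24:.0f} days at a constant 1 MW")  # 100 days
```

A megawatt of continuous power is far beyond anything yet landed on the Moon, and that's before accounting for hydrogen production or cryogenic storage.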

© NASA

AI used to design a multi-step enzyme that can digest some plastics

Enzymes are amazing catalysts. These proteins are made of nothing more than a handful of Earth-abundant elements, and they promote a vast array of reactions, convert chemical energy to physical motion, and act with remarkable specificity. In many cases, we have struggled to find non-enzymatic catalysts that can drive some of the same chemical reactions.

Unfortunately, there isn't an enzyme for many reactions we would sorely like to catalyze: things like digesting plastics or incorporating carbon dioxide into more complex molecules. We've had a few successes using directed evolution to create useful variations of existing enzymes, but efforts to broaden the scope of what enzymes can do have been limited.

With the advent of AI-driven protein design, however, we can now potentially design things that are unlike anything found in nature. A new paper today describes a success in making a brand-new enzyme with the potential to digest plastics. But it also shows how even a simple enzyme may have an extremely complex mechanism, one that's hard to tackle even with the latest AI tools.

© LAGUNA DESIGN

Seafloor detector picks up record neutrino while under construction

On Wednesday, a team of researchers announced that they got extremely lucky. The team is building a detector on the floor of the Mediterranean Sea that can identify those rare occasions when a neutrino happens to interact with the seawater nearby. And while the detector was only 10 percent of the size it will be on completion, it managed to pick up the most energetic neutrino ever detected.

For context, the most powerful particle accelerator on Earth, the Large Hadron Collider, accelerates protons to an energy of 7 teraelectronvolts (TeV). The neutrino that was detected had an energy of at least 60 petaelectronvolts (PeV), possibly hitting 230 PeV. That also blew away the previous records, which were in the neighborhood of 10 PeV.
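Converting those figures to a common unit makes the gap concrete; a quick sketch:

```python
TeV = 1e12  # electronvolts
PeV = 1e15

lhc_proton = 7 * TeV                      # LHC beam energy per proton
neutrino_low, neutrino_high = 60 * PeV, 230 * PeV
previous_record = 10 * PeV

# Even the lower bound dwarfs anything we can produce in an accelerator:
print(f"{neutrino_low / lhc_proton:,.0f}x the LHC beam energy, at minimum")  # 8,571x
print(f"{neutrino_high / previous_record:.0f}x the previous record")         # 23x
```

Only astrophysical accelerators can reach these energies, which is why a single event is enough to be scientifically interesting.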

Attempts to trace back the neutrino to a source make it clear that it originated outside our galaxy, although there are a number of candidate sources in the more distant Universe.

© Patrick Dumas/CNRS

22 states sue to block new NIH funding policy; court puts it on hold

On Friday, the National Institutes of Health (NIH) announced a sudden change to how it handles the indirect costs of research: the money that pays for things like support services and facilities maintenance. These costs help pay universities and research centers to provide the environment and resources all their researchers need to get research done. Previously, the rates had been set through negotiations with each university and audits of its spending; they averaged roughly 30 percent of the value of the grant itself and could exceed 50 percent at some institutions.

The NIH announcement set the rate at 15 percent for every campus. The new rate would start today and apply retroactively to existing grants, meaning most research universities are currently finding themselves facing catastrophic budget shortfalls.

Today, a coalition of 22 states filed a suit that seeks to block the new policy, alleging that it violates both a long-standing law and a budget rider that Congress passed in response to a 2017 attempt by Trump to drastically cut indirect costs. The suit seeks to prevent the new policy or its equivalent from being applied, something that Judge Angel Kelley of the District of Massachusetts granted later in the day. While that injunction only applies to research centers located in the states that have joined the suit, a separate suit was filed in the same district by a group of medical organizations, some of which (such as the Association of American Medical Colleges) have members throughout the country. As a result, Judge Kelley issued a separate ruling that extended the injunction to the remaining states.

© Nicolas_

National Institutes of Health radically cuts support to universities

Grants paid by the federal government have two components. One covers the direct costs of performing the research, paying for salaries, equipment, and consumables like chemicals or enzymes. But the government also pays what are called indirect costs. These go to the universities and research institutes, covering the costs of providing and maintaining the lab space, heat and electricity, administrative and HR functions, and more.

These indirect costs are negotiated with each research institution and average close to 30 percent of the amount awarded for the research. Some institutions see indirect rates as high as half the value of the grant.

On Friday, the National Institutes of Health (NIH) announced that negotiated rates were ending. Every existing grant, and all those funded in the future, will see the indirect cost rate set to just 15 percent. With no warning and no time to adjust to the change in policy, this will prove catastrophic for the budget of nearly every biomedical research institution.
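The budget math behind the word "catastrophic" is straightforward. A sketch using a hypothetical $1 million grant (the dollar figure is illustrative; the 30 and 15 percent rates are from the article):

```python
def total_award(direct_usd: float, indirect_rate: float) -> float:
    """Indirect costs are paid as a percentage on top of direct research costs."""
    return direct_usd * (1 + indirect_rate)

direct = 1_000_000  # hypothetical grant with $1M in direct costs
old = total_award(direct, 0.30)  # roughly the average negotiated rate
new = total_award(direct, 0.15)  # the flat rate under the new policy

print(f"Institution's shortfall: ${old - new:,.0f} per $1M of direct costs")  # $150,000
```

At institutions that had negotiated rates near 50 percent, the per-grant shortfall would be more than twice as large, and it applies across every active NIH grant at once.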

© Steve McConnell/UC Berkeley

Quantum teleportation used to distribute a calculation

Performing complex algorithms on quantum computers will eventually require access to tens of thousands of hardware qubits. For most of the technologies being developed, this creates a problem: It's difficult to create hardware that can hold that many qubits. As a result, people are looking at various ideas of how we might link processors together in order to have them function as a single computational unit (a challenge that has obviously been solved for classical computers).

In today's issue of Nature, a team at Oxford University describes using quantum teleportation to link two pieces of quantum hardware that were located about 2 meters apart, meaning they could easily have been in different rooms entirely. Once linked, the two pieces of hardware could be treated as a single quantum computer, allowing simple algorithms to be performed that involved operations on both sides of the 2-meter gap.

Quantum teleportation is... different

Our idea of teleportation has been heavily shaped by Star Trek, where people disappear from one location while simultaneously appearing elsewhere. Quantum teleportation doesn't work like that. Instead, you need to pre-position quantum objects at both the source and receiving ends of the teleport and entangle them. Once that's done, it's possible to perform a series of actions that force the recipient to adopt the quantum state of the source. The process of performing this teleportation involves a measurement of the source object, which destroys its quantum state even as it appears at the distant site, so it does share that feature with the popular conception of teleportation.

© D. Slichter/NIST

Bonobos recognize when humans are ignorant, try to help

A lot of human society requires what's called a "theory of mind": the ability to infer the mental state of another person and adjust our actions based on what we expect they know and are thinking. We don't always get this right (it's easy to get confused about what someone else might be thinking), but we still rely on it for everything from navigating complicated social situations to avoiding bumping into people on the street.

There's some mixed evidence that other animals have a limited theory of mind, but there are alternate interpretations for most of it. So two researchers at Johns Hopkins, Luke Townrow and Christopher Krupenye, came up with a way of testing whether some of our closest living relatives, the bonobos, could infer the state of mind of a human they were cooperating with. The work clearly showed that the bonobos could tell when their human partner was ignorant.

Now you see it...

The experimental approach is quite simple and involves a setup familiar to street hustlers: a set of three cups, with a treat placed under one of them. Except in this case, there's no sleight of hand: the bonobo can watch as one experimenter places the treat under a cup, and all of the cups remain stationary throughout the experiment.

© Anup Shah

Stem cells used to partially repair damaged hearts

When we developed the ability to convert various cells into a stem cell, it held the promise of an entirely new type of therapy. Rather than getting the body to try to fix itself with its cells or deal with the complications of organ transplants, we could convert a few adult cells to stem cells and induce them to form any tissue in the body. We could potentially repair or replace tissues with an effectively infinite supply of a patient's own cells.

Although the Nobel Prize for induced stem cells was handed out over a decade ago, the therapies have been slow to follow. In a new paper published in the journal Nature, however, a group of German researchers is now describing tests in primates of a method of repairing the heart using new muscle generated from stem cells. Although they're not yet providing everything that we might hope for, the results are promising. And they've been enough to start clinical trials, with similar results being seen in humans.

Heart problems

The heart contains a lot of specialized tissues, including those that form blood vessels or specialize in conducting electrical signals. But the key to the heart is a form of specialized muscle cell, called a cardiomyocyte. Once the heart matures, the cardiomyocytes stop dividing, meaning that you end up with a fixed population. Any damage to the heart due to injury or infection does not get repaired, meaning damage will be cumulative.

© Douglas B. Cowan and James D. McCully

Science at risk: The funding pause is more damaging than you might think

UPDATE: Numerous sources are reporting that the Office of Management and Budget has rescinded the memo that suspended federal grant funding.

Starting a few days after the Trump inauguration, word spread within the research community that some grant spending might be on hold. On Monday, confirmation came in the form of a memo sent by the Office of Management and Budget (OMB): All grant money from every single agency would be on hold indefinitely. Each agency was given roughly two weeks to evaluate the grants they fund based on a list of ideological concerns; no new grants would be evaluated during this period.

While the freeze itself has been placed on hold, the research community has reacted with a mixture of shock, anger, and horror that might seem excessive to people who have never relied on grant money. To better understand the problems that this policy could create, we talked to a number of people who have had research supported by federal grants, providing them with anonymity to allow them to speak freely. The picture of this policy that they painted was one in which US research leadership could be irreparably harmed, with severe knock-on effects on industry.

© izusek

US's wind and solar will generate more power than coal in 2024

The Energy Information Administration has now released data on the performance of the US's electric grid over the first 11 months of 2024 and will be adding the final month soon (a single month is rarely enough for anything in the data to change significantly). The biggest story in the data is the dramatic growth of solar energy, with a 30 percent increase in generation in a single year, which will allow solar and wind combined to overtake coal in 2024.

But US electricity demand grew by nearly 3 percent, roughly double the amount of added solar generation. Should electric use continue to grow at a similar pace, renewable production will have to keep growing dramatically for a few years just to cover the added demand.

Going for the Sun

In the first 11 months of 2024, the US saw its electrical use grow by 2.8 percent, or roughly 100 terawatt-hours (TWh). While there's typically year-to-year variation in use due to weather-driven demand, the US's consumption had largely been flat since the early 2000s. There are plenty of reasons to expect increased demand, including the growth of data centers and the electrification of heating and transit, but until now, there had been little clear sign of it in the data.
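Those two figures imply a baseline you can sanity-check; a quick sketch:

```python
growth_twh = 100      # added consumption over the first 11 months of 2024
growth_rate = 0.028   # the 2.8 percent growth figure

# If 100 TWh is 2.8 percent of the total, the implied baseline is:
baseline_twh = growth_twh / growth_rate
print(f"~{baseline_twh:,.0f} TWh")  # ~3,571 TWh
```

That's roughly consistent with annual US electricity consumption of about 4,000 TWh once the missing month is included.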

© zhongguo

Researchers optimize simulations of molecules on quantum computers

One of the most frequently asked questions about quantum computers is a simple one: When will they be useful?

If you talk to people in the field, you'll generally get a response in the form of another question: useful for what? Quantum computing can be applied to a large range of problems, some of them considerably more complex than others. Utility will come for some of the simpler problems first, but further hardware progress is needed before we can begin tackling some of the more complex ones.

One that should be easiest to solve involves modeling the behavior of some simple catalysts. The electrons of these catalysts, which are critical for their chemical activity, obey the rules of quantum mechanics, which makes it relatively easy to explore them with a quantum computer.

© Douglas Sacha

Trump issues flurry of orders on TikTok, DOGE, social media, AI, and energy

President Donald Trump's flurry of day-one actions included a reprieve for TikTok, the creation of a Department of Government Efficiency (DOGE), an order on social media "censorship," a declaration of an energy emergency, and reversal of a Biden order on artificial intelligence.

The TikTok executive order attempts to delay enforcement of a US law that requires TikTok to be banned unless its Chinese owner ByteDance sells the platform. "I am instructing the Attorney General not to take any action to enforce the Act for a period of 75 days from today to allow my Administration an opportunity to determine the appropriate course forward in an orderly way that protects national security while avoiding an abrupt shutdown of a communications platform used by millions of Americans," Trump's order said.

TikTok shut down in the US for part of the weekend but re-emerged after Trump said on Sunday that he would issue an order to "extend the period of time before the law's prohibitions take effect, so that we can make a deal to protect our national security." Trump also suggested that the US should own half of TikTok.

© Getty Images

Edge of Mars’ great dichotomy eroded back by hundreds of kilometers

For decades, we have been imaging the surface of Mars with ever-finer resolution, cataloging a huge range of features on its surface, studying their composition, and, in a few cases, dispatching rovers to make on-the-ground readings. But a catalog of what's present on Mars doesn't answer what's often the key question: how did a given feature get there? In fact, even with all the data we have available, several major features of Martian geography have sparked academic arguments that have yet to be resolved.

In Monday's issue of Nature Geoscience, a team of UK-based researchers tackles a big one: Mars' dichotomy, the somewhat nebulous boundary between the planet's relatively elevated southern half and the low basin that occupies its northern hemisphere, a feature that some have proposed also served as an ancient shoreline. The new work suggests that the edge of the dichotomy was eroded back by hundreds of kilometers during the time when an ocean might have occupied Mars' northern hemisphere.

Close to the edge

To view the Martian dichotomy, all you need to do is color-code a relief map of the Martian surface, something that NASA has conveniently done for us. Barring a couple of enormous basins, the entire southern hemisphere of the red planet is elevated by a kilometer or more and sits atop a far thicker crust. With the exception of the volcanic Tharsis region, the boundary between these two areas runs roughly along the equator.

© NASA/JPL-Caltech/Univ. of Arizona

A solid electrolyte gives lithium-sulfur batteries ludicrous endurance

Lithium may be the key component in most modern batteries, but it doesn't make up the bulk of the material used in them. Instead, much of the material is in the electrodes, where the lithium gets stored when the battery isn't charging or discharging. So one way to make lighter and more compact lithium-ion batteries is to find electrode materials that can store more lithium. That's one of the reasons that recent generations of batteries are starting to incorporate silicon into the electrode materials.

There are materials that can store even more lithium than silicon; a notable example is sulfur. But sulfur has a tendency to react with itself, producing ions that can float off into the electrolyte. Plus, like any electrode material, it tends to expand in proportion to the amount of lithium that gets stored, which can create physical strains on the battery's structure. So while it has been easy to make lithium-sulfur batteries, their performance has tended to degrade rapidly.

But this week, researchers described a lithium-sulfur battery that still has over 80 percent of its original capacity after 25,000 charge/discharge cycles. All it took was a solid electrolyte that was more reactive than the sulfur itself.
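For a sense of how unusual 25,000 cycles is, you can back out the implied per-cycle capacity fade, assuming (as a simplification) that the fade is uniform and exponential:

```python
cycles = 25_000
retained = 0.80  # fraction of original capacity remaining

# Per-cycle retention under a uniform exponential-fade assumption:
per_cycle = retained ** (1 / cycles)
loss_per_cycle = 1 - per_cycle
print(f"~{loss_per_cycle * 100:.5f}% capacity lost per cycle")

# Compare with a more typical cell that fades to 80% over 1,000 cycles:
typical_loss = 1 - 0.80 ** (1 / 1000)
print(f"vs ~{typical_loss * 100:.3f}% per cycle for a 1,000-cycle cell")
```

Under that assumption, the new chemistry degrades roughly 25 times more slowly per cycle than a typical 1,000-cycle cell.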

© P_Wei

Researchers use AI to design proteins that block snake venom toxins

It has been a few years since AI began successfully tackling the challenge of predicting the three-dimensional structure of proteins, complex molecules that are essential for all life. Next-generation tools are now available, and the Nobel Prizes have been handed out. But people not involved in biology can be forgiven for asking whether any of it can actually make a difference.

A nice example of how the tools can be put to use is being released in Nature on Wednesday. A team that includes the University of Washington's David Baker, who picked up his Nobel in Stockholm last month, used software tools to design completely new proteins that are able to inhibit some of the toxins in snake venom. While not entirely successful, the work shows how the new software tools can let researchers tackle challenges that would otherwise be difficult or impossible.

Blocking venom

Snake venom includes a complicated mix of toxins, most of them proteins, that engage in a multi-front assault on anything unfortunate enough to get bitten. Right now, the primary treatment is a mix of antibodies that bind to these toxins, produced by injecting sub-lethal amounts of venom proteins into animals. But antivenom treatments tend to require refrigeration, and even then, they have a short shelf life. Ensuring a steady supply also means regularly injecting new animals and purifying more antibodies from them.

© Paul Starosta

Everyone agrees: 2024 the hottest year since the thermometer was invented

Over the last 24 hours or so, the major organizations that keep track of global temperatures have released figures for 2024, and all of them agree: 2024 was the warmest year yet recorded, joining 2023 as an unusual outlier in terms of how rapidly things heated up. At least two of the organizations, the European Union's Copernicus and Berkeley Earth, place the year at about 1.6° C above pre-industrial temperatures, marking the first time that the Paris Agreement goal of limiting warming to 1.5° has been exceeded.

NASA and the National Oceanic and Atmospheric Administration both place the mark at slightly below 1.5° C over pre-industrial temperatures (as defined by the 1850–1900 average). However, that difference largely reflects the uncertainties in measuring temperatures during that period rather than disagreement over 2024.

It’s hot everywhere

2023 had set a temperature record largely due to a switch to El Niño conditions midway through the year, which made the second half of the year exceptionally hot. It takes some time for that heat to make its way from the ocean into the atmosphere, so the streak of warm months continued into 2024, even as the Pacific switched into its cooler La Niña mode.

© Copernicus

Coal likely to go away even without EPA’s power plant regulations

In April last year, the Environmental Protection Agency released its latest attempt to regulate the carbon emissions of power plants under the Clean Air Act. It's something the EPA has been required to do since a 2007 Supreme Court decision that settled a case that started during the Clinton administration. The latest effort seemed like the most aggressive yet, forcing coal plants to retire or install carbon capture equipment and making it difficult for some natural gas plants to operate without capturing carbon or burning green hydrogen.

Yet, according to a new analysis published in Thursday's edition of Science, the rules likely wouldn't have a dramatic effect on the US's future emissions even if they were to survive a court challenge. Instead, the analysis suggests the rules serve more as a backstop, preventing other policy changes and increased demand from countering the progress that would otherwise be made. That may be just as well, given that the rules are inevitably going to be eliminated by the incoming Trump administration.

A long time coming

The net result of a series of Supreme Court decisions is that greenhouse gases are pollutants under the Clean Air Act, and the EPA needed to determine whether they posed a threat to people. George W. Bush's EPA dutifully performed that analysis but sat on the results until his second term ended, leaving it to the Obama administration to reach the same conclusion. The EPA went on to formulate rules for limiting carbon emissions on a state-by-state basis, but these were quickly made irrelevant, as renewable power and natural gas began displacing coal even without the EPA's encouragement.

© Ron and Patty Thomas

It’s remarkably easy to inject new medical misinformation into LLMs

It's pretty easy to see the problem here: The Internet is brimming with misinformation, and most large language models are trained on a massive body of text obtained from the Internet.

Ideally, having substantially higher volumes of accurate information might overwhelm the lies. But is that really the case? A new study by researchers at New York University examines how much medical misinformation can be included in a large language model (LLM) training set before the model starts spitting out inaccurate answers. While the study doesn't identify a lower bound, it does show that by the time misinformation accounts for just 0.001 percent of the training data, the resulting LLM is compromised.
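A fraction that small still translates into a surprisingly modest amount of text in absolute terms. A quick sketch (the corpus size is a hypothetical, not a figure from the study):

```python
corpus_tokens = 1_000_000_000_000  # hypothetical 1-trillion-token training set
poison_rate = 0.001 / 100          # 0.001 percent, the threshold from the study

poison_tokens = corpus_tokens * poison_rate
print(f"{poison_tokens:,.0f} tokens of misinformation")  # 10,000,000
```

Ten million tokens is on the order of a few thousand articles, a volume a motivated actor could plausibly produce and seed online.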

While the paper is focused on the intentional "poisoning" of an LLM during training, it also has implications for the body of misinformation that's already online and part of the training set for existing LLMs, as well as the persistence of out-of-date information in validated medical databases.

© Just_Super
