No Headphones, No Problem: This Acoustic Trick Bends Sound Through Space to Find You

What if you could listen to music or a podcast without headphones or earbuds and without disturbing anyone around you? Or have a private conversation in public without other people hearing you?

Our newly published research introduces a way to create audible enclaves – localized pockets of sound that are isolated from their surroundings. In other words, we’ve developed a technology that could create sound exactly where it needs to be.

The ability to send sound that becomes audible only at a specific location could transform entertainment, communication and spatial audio experiences.

[…]

The science of audible enclaves

We found a new way to send sound to one specific listener: through self-bending ultrasound beams and a concept called nonlinear acoustics.

Ultrasound refers to sound waves with frequencies above the human hearing range, or above 20 kHz. These waves travel through the air like normal sound waves but are inaudible to people. Because ultrasound can penetrate through many materials and interact with objects in unique ways, it’s widely used for medical imaging and many industrial applications.

[…]

Normally, sound waves combine linearly, meaning they just proportionally add up into a bigger wave. However, when sound waves are intense enough, they can interact nonlinearly, generating new frequencies that were not present before.

This is the key to our technique: We use two ultrasound beams at different frequencies that are completely silent on their own. But when they intersect in space, nonlinear effects cause them to generate a new sound wave at an audible frequency that would be heard only in that specific region.

Audible enclaves are created at the intersection of two ultrasound beams that bend around the listener's head.
Jiaxin Zhong et al./PNAS, CC BY-NC-ND

Crucially, we designed ultrasonic beams that can bend on their own. Normally, sound waves travel in straight lines unless something blocks or reflects them. However, by using acoustic metasurfaces – specialized materials that manipulate sound waves – we can shape ultrasound beams to bend as they travel. Similar to how an optical lens bends light, acoustic metasurfaces change the shape of the path of sound waves. By precisely controlling the phase of the ultrasound waves, we create curved sound paths that can navigate around obstacles and meet at a specific target location.

The key phenomenon at play is what’s called difference frequency generation. When two ultrasonic beams of slightly different frequencies, such as 40 kHz and 39.5 kHz, overlap, they create a new sound wave at the difference between their frequencies – in this case 0.5 kHz, or 500 Hz, which is well within the human hearing range. Sound can be heard only where the beams cross. Outside of that intersection, the ultrasound waves remain silent.
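
As a rough numerical illustration of difference frequency generation (a toy sketch, not the model or parameters from the paper), the snippet below mixes two inaudible ultrasound tones through a quadratic nonlinearity. Only the nonlinear output contains energy in the audible band, at the 500 Hz difference frequency; the linear sum does not.

```python
# Toy sketch of difference frequency generation: two ultrasound tones are
# combined linearly and through a quadratic nonlinearity, and we check which
# result contains an audible (< 20 kHz) component.
import numpy as np

fs = 400_000                       # sample rate in Hz, well above the 40 kHz tones
t = np.arange(0, 0.05, 1 / fs)     # 50 ms of signal
f1, f2 = 40_000.0, 39_500.0        # the two ultrasound beam frequencies in Hz

p1 = np.sin(2 * np.pi * f1 * t)
p2 = np.sin(2 * np.pi * f2 * t)

linear = p1 + p2                   # linear superposition: only 39.5 and 40 kHz
nonlinear = (p1 + p2) ** 2         # quadratic term of a nonlinear response

def audible_peak(signal):
    """Return (frequency, relative magnitude) of the strongest component below 20 kHz."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    band = (freqs > 100) & (freqs < 20_000)        # audible band, excluding DC
    i = int(np.argmax(np.where(band, spectrum, 0.0)))
    return freqs[i], spectrum[i] / spectrum.max()

print(audible_peak(linear))        # negligible magnitude: nothing audible
print(audible_peak(nonlinear))     # (500.0, ...): a clear 500 Hz difference tone
```

In the real system the audible sound appears only where the two beams physically overlap; this sketch shows only the frequency-mixing step, not the spatial localization.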

This means you can deliver audio to a specific location or person without disturbing other people as the sound travels.

[…]

This isn’t something that will be on store shelves in the immediate future; several challenges remain for our technology. Nonlinear distortion can affect sound quality. And power efficiency is another issue: converting ultrasound to audible sound requires high-intensity fields that can be energy intensive to generate.

Despite these hurdles, audible enclaves represent a fundamental shift in sound control. By redefining how sound interacts with space, we open up new possibilities for immersive, efficient and personalized audio experiences.

Jiaxin Zhong, Postdoctoral Researcher in Acoustics, Penn State and Yun Jing, Professor of Acoustics, Penn State. This article is republished from The Conversation under a Creative Commons license. Read the original article.

Source: No Headphones, No Problem: This Acoustic Trick Bends Sound Through Space to Find You

Turning car exhausts into power: New method transforms carbon nanoparticles from emissions into renewable energy catalysts

We have developed a breakthrough method to convert carbon nanoparticles (CNPs) from vehicular emissions into high-performance electrocatalysts. This innovation provides a sustainable approach to pollution management and energy production by repurposing harmful particulate matter into valuable materials for renewable energy applications.

Our work, published in Carbon Neutralization, addresses both environmental challenges and the growing demand for efficient, cost-effective clean energy solutions.

Advancing electrocatalysis with multiheteroatom-doped CNPs

By doping CNPs with boron, nitrogen, oxygen and sulfur, we have significantly enhanced their catalytic performance. These multiheteroatom-doped nanoparticles exhibit remarkable efficiency in key electrochemical reactions. Our catalysts demonstrate high activity in the oxygen reduction reaction (ORR), which is essential for fuel cells and energy storage systems, as well as in the hydrogen evolution reaction (HER), a crucial process for hydrogen fuel production.

Additionally, they show superior performance in the oxygen evolution reaction (OER), advancing water splitting for green hydrogen generation. By optimizing the composition of these materials, we have created an effective alternative to conventional precious metal-based catalysts, improving both cost-efficiency and sustainability.

[..]

Our research has far-reaching implications for clean energy and sustainable transportation industries. These catalysts can be integrated into fuel cells, enabling more efficient power generation for electric vehicles and energy storage systems. They also play a vital role in hydrogen production, supporting the transition to a hydrogen-based economy. Additionally, their use in renewable energy storage systems enhances the stability of wind and solar power generation.

While our findings demonstrate significant promise, further research is needed to scale up production, optimize material stability, and integrate these catalysts into commercial applications.

[…]

Source: Turning pollution into power: New method transforms carbon nanoparticles from emissions into renewable energy catalysts

“Cool” years are now hotter than the “warm” years of the past: tracking global temperatures through El Niño and La Niña

“Climate” is defined by temperatures over longer periods of time, typically 20-to-30-year averages, rather than single-year data points. But even when based on these longer-term averages, the world has still warmed by around 1.3°C.

But you’ll also notice, in the chart, that temperatures haven’t increased linearly. There are spikes and dips along the long-run trend.

Many of these short-term fluctuations are caused by “ENSO” — the El Niño-Southern Oscillation — a natural climate cycle caused by changes in wind patterns and sea surface temperatures in the Pacific Ocean.

While it’s caused by patterns in the Pacific Ocean and most strongly affects countries in the tropics, it also impacts global temperatures and climate.

There are two key phases of this cycle: the La Niña phase, which tends to cause cooler global temperatures, and the El Niño phase, which brings hotter conditions. The world cycles between El Niño and La Niña phases every two to seven years. There are also “neutral” periods between these phases where the world is not in either extreme.

The zig-zag trend of global temperatures becomes easier to understand once you take the phases of the ENSO cycle into account. In the chart below, we see the data on global temperatures, but the line is now colored by the ENSO phase at that time.

The El Niño (warm phase) is shown in orange and red, and the La Niña (cold phase) is shown in blue.

You can see that temperatures often reach a short-term peak during warm El Niño years before falling back slightly as the world moves into La Niña years, shown in blue.
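
For readers who want to build this kind of chart themselves, here is a minimal sketch. The anomaly values and phase labels below are made up for the illustration; real series are published by sources such as Our World in Data.

```python
# Illustrative only: colour an annual temperature-anomaly series by ENSO phase.
import matplotlib.pyplot as plt

years     = [2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023]
anomaly_c = [0.74, 0.90, 1.01, 0.92, 0.85, 0.98, 1.01, 0.86, 0.89, 1.17]  # made-up values
phase     = ["N", "E", "E", "N", "L", "E", "L", "L", "L", "E"]  # E = El Niño, L = La Niña, N = neutral

colors = {"E": "tab:red", "L": "tab:blue", "N": "tab:gray"}

# Draw each year-to-year segment in the colour of the phase it ends in, so warm
# spikes and cool dips line up visually with El Niño and La Niña years.
for i in range(1, len(years)):
    plt.plot(years[i - 1:i + 1], anomaly_c[i - 1:i + 1], color=colors[phase[i]])
plt.scatter(years, anomaly_c, c=[colors[p] for p in phase], zorder=3)
plt.ylabel("Global temperature anomaly (°C)")
plt.title("Annual anomaly coloured by ENSO phase (illustrative data)")
plt.show()
```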

What’s striking is that global temperatures during recent La Niña years were warmer than El Niño years just a few decades before. “Cold years” today are hotter than “hot years” not too long ago.

Source: “Cool” years are now hotter than the “warm” years of the past: tracking global temperatures through El Niño and La Niña – Our World in Data

A Nasal Spray for Concussions Shows Early Promise

The best treatment for a hard knock on the head might someday involve a quick sniff of a nasal spray. Researchers have found early evidence in mice that an antibody-based treatment delivered up the nose can reduce the brain damage caused by concussions and more serious traumatic injuries.

Scientists at Mass General Brigham conducted the study, published Thursday in Nature Neuroscience. In brain-injured mice, the experimental spray appeared to improve the brain’s natural acute healing process while also reducing damaging inflammation later on. The findings could lead to a genuine prophylactic against the long-term impacts of traumatic brain injuries and other conditions like stroke, the researchers say.

[…]

Foralumab, developed by the company Tiziana Life Sciences, targets a specific group of proteins that interact with the brain’s immune cells, called CD3. This suppression of CD3, the team’s earlier work has suggested, increases the activity of certain immune cells known as regulatory T cells (Treg). As the name implies, these cells help regulate the brain’s immune response to make sure it doesn’t go haywire.

[…]

In their latest mouse study, the researchers found that foralumab—via the increased activity of Treg cells—improved aspects of the brain’s immediate healing from a traumatic injury. The dosed mice’s microglia (the brain’s unique first line of immune defense) became better at eating and cleaning up after damaged cells, for instance. Afterward, the drug also appeared to prevent microglia from becoming chronically inflamed. As a result, relative to mice in a control group, mice treated with foralumab up to three days post-injury experienced greater improvements in their motor function and coordination.

[…]

Source: A Nasal Spray for Concussions Shows Early Promise

Ultrathin films are revolutionizing electrical conductivity

What if your electronic devices could adapt on the fly to temperature, pressure, or impact? Thanks to a new breakthrough in downsizing quantum materials, that idea is becoming a reality.

In an article published this month in Applied Physics Express, a multi-institutional research team led by Osaka University announced that they have successfully synthesized an ultrathin vanadium dioxide film on a flexible substrate, in a way that preserves the film’s electrical properties.

Vanadium dioxide is well known in the scientific community for its ability to transition between conductor and insulator phases at nearly room temperature. This phase transition underpins smart and adaptable electronics that can adjust to their environment in real time. But there is a limit to how thin vanadium dioxide films can be, because making a material too small affects its ability to conduct or insulate electricity.

“Ordinarily, when a film is placed on a hard substrate, strong surface forces interfere with the atomic structure of the film and degrade its conductive properties,” explains Boyuan Yu, lead author of the study.

To overcome this limitation, the team prepared their films on two-dimensional hexagonal boron nitride (hBN) crystals; hBN is a highly stable soft material that does not have strong bonds with oxides and thus does not excessively strain the film or spoil its delicate structure.

“The results are truly surprising,” says Hidekazu Tanaka, senior author. “We find that by using this soft substrate, the material structure is very nearly unaffected.”

By performing precise spectroscopy measurements, the team was able to confirm that the phase transition temperature of their vanadium dioxide layers remained essentially unchanged, even at thicknesses as thin as 12 nm.

“This discovery significantly improves our ability to manipulate quantum materials in practical ways,” says Yu. “We have gained a new level of control over the transition process, which means we can now tailor these materials to specific applications like sensors and flexible electronics.”

Given that quantum materials like vanadium dioxide play a crucial role in the design of microsensors and devices, this discovery could pave the way for functional and adaptable electronics that can be attached anywhere. The research team is currently working on such devices, as well as exploring ways to incorporate even thinner films and substrates.

Source: Powering the future — ultrathin films are revolutionizing electrical conductivity | ScienceDaily

Robotic exoskeleton can train expert pianists to play faster

A robotic hand exoskeleton can help expert pianists learn to play even faster by moving their fingers for them.

Robotic exoskeletons have long been used to rehabilitate people who can no longer use their hands due to an injury or medical condition, but using them to improve the abilities of able-bodied people has been less well explored.

Now, Shinichi Furuya at Sony Computer Science Laboratories in Tokyo and his colleagues have found that a robotic exoskeleton can improve the finger speed of trained pianists after a single 30-minute training session.

[…]

The robotic exoskeleton can raise and lower each finger individually, up to four times a second, using a separate motor attached to the base of each finger.

To test the device, the researchers recruited 118 expert pianists who had all played since before they had turned 8 years old and for at least 10,000 hours, and asked them to practise a piece for two weeks until they couldn’t improve.

Then, the pianists received a 30-minute training session with the exoskeleton, which moved the fingers of their right hand in different combinations of simple and complex patterns, either slowly or quickly, so that Furuya and his colleagues could pinpoint what movement type caused improvement.

The pianists who experienced the fast and complex training could better coordinate their right hand movements and move the fingers of either hand faster, both immediately after training and a day later. This, together with evidence from brain scans, indicates that the training changed the pianists’ sensory cortices to better control finger movements in general, says Furuya.

“This is the first time I’ve seen somebody use [robotic exoskeletons] to go beyond normal capabilities of dexterity, to push your learning past what you could do naturally,” says Nathan Lepora at the University of Bristol, UK. “It’s a bit counterintuitive why it worked, because you would have thought that actually performing the movements yourself voluntarily would be the way to learn, but it seems passive movements do work.”

 

Journal reference:

Science Robotics DOI: 10.1126/scirobotics.adn3802

Source: Robotic exoskeleton can train expert pianists to play faster | New Scientist

Robot arm developed that allows sense of touch

You can probably complete an amazing number of tasks with your hands without looking at them. But if you put on gloves that muffle your sense of touch, many of those simple tasks become frustrating. Take away proprioception — your ability to sense your body’s relative position and movement — and you might even end up breaking an object or injuring yourself.

[…]

Greenspon and his research collaborators recently published papers in Nature Biomedical Engineering and Science documenting major progress on a technology designed to address precisely this problem: direct, carefully timed electrical stimulation of the brain that can recreate tactile feedback to give nuanced “feeling” to prosthetic hands.

[…]

The researchers’ approach to prosthetic sensation involves placing tiny electrode arrays in the parts of the brain responsible for moving and feeling the hand. On one side, a participant can move a robotic arm by simply thinking about movement, and on the other side, sensors on that robotic limb can trigger pulses of electrical activity called intracortical microstimulation (ICMS) in the part of the brain dedicated to touch.

For about a decade, Greenspon explained, this stimulation of the touch center could only provide a simple sense of contact in different places on the hand.

“We could evoke the feeling that you were touching something, but it was mostly just an on/off signal, and often it was pretty weak and difficult to tell where on the hand contact occurred,” he said.

[…]

By delivering short pulses to individual electrodes in participants’ touch centers and having them report where and how strongly they felt each sensation, the researchers created detailed “maps” of brain areas that corresponded to specific parts of the hand. The testing revealed that when two closely spaced electrodes are stimulated together, participants feel a stronger, clearer touch, which can improve their ability to locate and gauge pressure on the correct part of the hand.

The researchers also conducted exhaustive tests to confirm that the same electrode consistently creates a sensation corresponding to a specific location.

“If I stimulate an electrode on day one and a participant feels it on their thumb, we can test that same electrode on day 100, day 1,000, even many years later, and they still feel it in roughly the same spot,” said Greenspon, who was the lead author on this paper.

[…]

The complementary Science paper went a step further to make artificial touch even more immersive and intuitive. The project was led by first author Giacomo Valle, PhD, a former postdoctoral fellow at UChicago who is now continuing his bionics research at Chalmers University of Technology in Sweden.

“Two electrodes next to each other in the brain don’t create sensations that ’tile’ the hand in neat little patches with one-to-one correspondence; instead, the sensory locations overlap,” explained Greenspon, who shared senior authorship of this paper with Bensmaia.

The researchers decided to test whether they could use this overlapping nature to create sensations that could let users feel the boundaries of an object or the motion of something sliding along their skin. After identifying pairs or clusters of electrodes whose “touch zones” overlapped, the scientists activated them in carefully orchestrated patterns to generate sensations that progressed across the sensory map.

Participants described feeling a gentle gliding touch passing smoothly over their fingers, despite the stimulus being delivered in small, discrete steps. The scientists attribute this result to the brain’s remarkable ability to stitch together sensory inputs and interpret them as coherent, moving experiences by “filling in” gaps in perception.
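
As a purely illustrative sketch of what sequencing overlapping stimulation might look like (hypothetical electrode names and timing, not the study's actual stimulation parameters), the schedule below starts each electrode's pulse train before the previous one ends, so neighbouring overlapping "touch zones" are briefly co-active.

```python
# Hypothetical sketch: schedule overlapping pulse trains along a chain of
# electrodes whose projected fields overlap, so the evoked touch appears to glide.
from dataclasses import dataclass

@dataclass
class StimEvent:
    electrode: str      # electrode along the path across the sensory map
    onset_ms: float     # when its pulse train starts
    duration_ms: float  # how long it runs

def gliding_sequence(electrodes, step_ms=100.0, duration_ms=250.0):
    """Start each electrode before the previous one stops, so the percept moves
    smoothly instead of jumping in discrete steps."""
    return [StimEvent(e, i * step_ms, duration_ms) for i, e in enumerate(electrodes)]

# A hypothetical chain of electrodes whose touch zones sweep across one finger.
path = ["E12", "E07", "E23", "E31"]
for ev in gliding_sequence(path):
    print(f"{ev.electrode}: {ev.onset_ms:6.1f} ms -> {ev.onset_ms + ev.duration_ms:6.1f} ms")
```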

The approach of sequentially activating electrodes also significantly improved participants’ ability to distinguish complex tactile shapes and respond to changes in the objects they touched. They could sometimes identify letters of the alphabet electrically “traced” on their fingertips, and they could use a bionic arm to steady a steering wheel when it began to slip through the hand.

These advancements help move bionic feedback closer to the precise, complex, adaptive abilities of natural touch, paving the way for prosthetics that enable confident handling of everyday objects and responses to shifting stimuli.

[…]

“We hope to integrate the results of these two studies into our robotics systems, where we have already shown that even simple stimulation strategies can improve people’s abilities to control robotic arms with their brains,” said co-author Robert Gaunt, PhD, associate professor of physical medicine and rehabilitation and lead of the stimulation work at the University of Pittsburgh.

Greenspon emphasized that the motivation behind this work is to enhance independence and quality of life for people living with limb loss or paralysis.

[…]

Source: Fine-tuned brain-computer interface makes prosthetic limbs feel more real | ScienceDaily

Scientists find ‘spooky’ quantum entanglement within individual protons

Scientists have used high-energy particle collisions to peer inside protons, the particles that sit inside the nuclei of all atoms. This has revealed for the first time that quarks and gluons, the building blocks of protons, experience the phenomenon of quantum entanglement.

[…]

Despite Einstein’s skepticism about entanglement, this “spooky” phenomenon has been verified over and over again. Many of those verifications have involved testing ever greater distances over which entanglement can be demonstrated. This new test took the opposite approach, investigating entanglement over a distance of just one quadrillionth of a meter and finding that it actually occurs within individual protons.

The team found that the sharing of information that defines entanglement occurs across whole groups of fundamental particles called quarks and gluons within a proton.

[…]

To probe the inner structure of protons, scientists looked at high-energy particle collisions that have occurred in facilities like the Large Hadron Collider (LHC). When particles collide at extremely high speeds, other particles stream away from the collision like wreckage flung away from a crash between two vehicles.

This team used a technique developed in 2017 that applies quantum information science to electron-proton collisions to determine how entanglement influences the paths of particles streaming away. If the quarks and gluons within a proton are entangled, the technique predicts that the entanglement should be revealed by the disorder, or “entropy,” seen in the sprays of daughter particles.

“Think of a kid’s messy bedroom, with laundry and other things all over the place,” Tu said. “In that disorganized room, the entropy is very high.”

The contrast to this is a low-entropy situation which is akin to a neatly tidied and sorted bedroom in which everything is organized in its proper place. A messy room indicates entanglement, if you will.

“For a maximally entangled state of quarks and gluons, there is a simple relation that allows us to predict the entropy of particles produced in a high-energy collision,” Brookhaven Lab theorist Dmitri Kharzeev said in the statement. “We tested this relation using experimental data.”

The interior of the Large Hadron Collider, within which protons and other particles are collided at high speeds. (Image credit: Robert Lea)

To investigate how “messy” particles get after a collision, the team first turned to data generated by proton-proton collisions conducted at the LHC. Then, in search of “cleaner” data, the researchers looked to electron-proton collisions carried out at the Hadron-Electron Ring Accelerator (HERA) particle collider from 1992 to 2007.

This data was delivered by the H1 team and its spokesperson as well as Deutsches Elektronen-Synchrotron (DESY) researcher Stefan Schmitt after a three-year search through HERA results.

Comparing HERA data with the entropy calculations, the team’s results matched their predictions perfectly, providing strong evidence that quarks and gluons inside protons are maximally entangled.

“Entanglement doesn’t only happen between two particles but among all the particles,” Kharzeev said. “Maximal entanglement inside the proton emerges as a consequence of strong interactions that produce a large number of quark-antiquark pairs and gluons.”

The revelation of maximal entanglement of quarks and gluons within protons could help reveal what keeps these fundamental particles bound together with the building blocks of atomic nuclei.

[…]

Source: Scientists find ‘spooky’ quantum entanglement on incredibly tiny scales — within individual protons | Space

The Real Reason People Don’t Trust in Science: They buy propaganda lies

[…]

contemplating November’s annual Pew Research Center survey of public confidence in science.

The Pew survey found 76 percent of respondents voicing “a great deal or fair amount of confidence in scientists to act in the public’s best interests.” That’s up a bit from last year, but still down from prepandemic measures, suggesting that an additional one in 10 Americans has lost confidence in scientists since 2019.

[…]

Why? Pew’s statement and many news stories about the findings somehow missed the obvious culprit: the four years and counting of a propaganda campaign by Donald Trump’s allies to shift blame to scientists for his first administration’s disastrous, botched handling of the COVID pandemic that has so far killed at least 1.2 million Americans.

Even the hot dog guy would blanch at the transparency of the scapegoating. It was obviously undertaken to inoculate Trump from voter blame for the pandemic. The propaganda kicked off four years ago with a brazen USA TODAY screed from his administration’s economic advisor Peter Navarro (later sent to federal prison on unrelated charges). Navarro wrongly blamed then–National Institute of Allergy and Infectious Diseases chief Anthony Fauci for the administration’s myriad pandemic response screwups. Similar inanities followed from Trump’s White House, leading to years of right-wing nonsense and surreal hearings that ended last June with Republican pandemic committee members doing everything but wearing hot dog costumes while questioning Fauci. Browbeating a scientific leader behind COVID vaccines that saved millions of lives at a combative hearing proved as mendacious as it was shameful.

The Pew survey’s results, however, show this propaganda worked on some Republican voters. The drop in public confidence in science the survey reports is almost entirely contained to that circle, plunging from 85 percent approval among Republican voters in April of 2020 to 66 percent now. It hardly budged for those not treated to nightly doses of revisionist history in an echo chamber—where outlets pretended that masking, school and business restrictions, and vaccines, weren’t necessities in staving off a deadly new disease. Small wonder that Republican voters’ excess death rates were 1.5 times those among Democrats after COVID vaccines appeared.

Stacked bar charts show percent breakdowns of how various groups of Americans characterized the amount of confidence they had in scientists to act in the best interests of the public, over seven iterations of a survey from January 2019 to October 2024. The proportion of respondents who say “a fair amount” or “a great deal” falls over time, but this change is much more dramatic among Republicans and those who lean Republican, compared with Democrats and those who lean Democratic.

Amanda Montañez; Source: Pew Research Center

Instead of noting the role of this propaganda in their numbers, Pew’s statement about the survey pointed only to perceptions that scientists aren’t “good communicators,” held by 52 percent of respondents, and the 47 percent who said, “research scientists feel superior to others” in the survey.

[…]

it matches the advice in a December NASEM report on scientific misinformation: “Scientists, medical professionals, and health professionals who choose to take on high profile roles as public communicators of science should understand how their communications may be misinterpreted in the absence of context or in the wrong context.” This completely ignores the deliberate misinterpretation of science to advance political aims, the chief kind of science misinformation dominating the modern public sphere.

It isn’t a secret what is going on: Oil industry–funded lawmakers and other mouthpieces have similarly vilified climate scientists for decades to stave off paying the price for global warming. A study published in 2016 in the American Sociological Review concluded that the U.S. public’s slow erosion of trust in science from 1974 to 2010 was almost entirely among conservatives. Such conservatives had adopted “limited government” politics, which clashes with science’s “fifth branch” advisory role in setting regulations—seen most clearly in the FDA resisting Trump’s calls for wholesale approval of dangerous drugs to treat COVID. That flavor of politics made distrust for scientists the collateral damage of the half-century-long attack on regulation. The utter inadequacy of an unscientific, limited-government response to the 2020 pandemic only primed this resentment—fanned by hate aimed at Fauci—to deliver the dent in trust for science we see today.

[…]

With Trump headed back to the White House, his profoundly unqualified pick for Department of Health and Human Services chief is Robert F. Kennedy, Jr., whose antivaccine advocacy contributed to 83 measles deaths in Samoa in 2019. For the National Institutes of Health he has picked Stanford University’s Jay Bhattacharya, one of three authors of a lethally misguided 2020 plan, pushed then on the Trump White House, to spur coronavirus infections that would have caused “the severe illness and preventable deaths of hundreds of thousands of people,” according to the Infectious Diseases Society of America. Neither of these hot-dog-guy picks should be allowed anywhere near our vital health agencies.

[…]

Source: The Real Reason People Don’t Trust in Science Has Nothing to Do with Scientists | Scientific American

A new way to entangle Particles from a distance

[…] Traditionally, entanglement is achieved through local interactions or via entanglement swapping, where entanglement at a distance is generated through previously established entanglement and Bell-state measurements. However, the precise requirements enabling the generation of quantum entanglement without traditional local interactions remain less explored. Here, we demonstrate that independent particles can be entangled without the need for direct interaction, prior established entanglement, or Bell-state measurements, by exploiting the indistinguishability of the origins of photon pairs. Our demonstrations challenge the long-standing belief that the prior generation and measurement of entanglement are necessary prerequisites for generating entanglement between independent particles that do not share a common past. In addition to its foundational interest, we show that this technique might lower the resource requirements in quantum networks, by reducing the complexity of photon sources and the overhead photon numbers.

Source: Phys. Rev. Lett. 133, 233601 (2024) – Entangling Independent Particles by Path Identity

the PDF

Krenn Research Group

Scientists Built a Tiny DNA ‘Hand’ That Grabs Viruses to Stop Infections

Imagine if scientists could grab virus particles the same way we pick up a tennis ball or a clementine, and prevent them from infecting cells. Well, scientists in Illinois have built a microscopic four-fingered hand to do just that.

A team of scientists, led by Xing Wang of the University of Illinois Urbana-Champaign, has created a tiny hand, dubbed the NanoGripper, from a single piece of folded DNA that can grab covid-19 particles. Their findings, detailed in a November 27 study published in the journal Science Robotics, demonstrate that the hand can conduct a rapid test to identify the virus as well as prevent the particles from infecting healthy cells. Although the study focused specifically on the covid-19 virus, the results have important implications for numerous medical conditions.

“We wanted to make a soft material, nanoscale robot with grabbing functions that never have been seen before, to interact with cells, viruses and other molecules for biomedical applications,” Wang said in a university statement. “We are using DNA for its structural properties. It is strong, flexible and programmable. Yet even in the DNA origami field, this is novel in terms of the design principle. We fold one long strand of DNA back and forth to make all of the elements, both the static and moving pieces, in one step.”

A NanoGripper hand and its components. © Xing Wang

The NanoGripper has four jointed fingers and a palm. The fingers are programmed to attach to specific targets—in the case of covid-19, the virus’ infamous spike protein—and close their grip around them. According to the study, when the researchers exposed cells with NanoGrippers to covid-19, the hands’ gripping mechanisms prevented the viral spike proteins from infecting the cells.

“It would be very difficult to apply it after a person is infected, but there’s a way we could use it as a preventive therapeutic,” Wang explained. “We could make an anti-viral nasal spray compound. The nose is the hot spot for respiratory viruses, like covid or influenza. A nasal spray with the NanoGripper could prevent inhaled viruses from interacting with the cells in the nose.”

The hand is also decked with a unique sensor that detects covid-19 in 30 minutes with the accuracy of the now-familiar qPCR molecular tests used in hospitals.

“When the virus is held in the NanoGripper’s hand, a fluorescent molecule is triggered to release light when illuminated by an LED or laser,” said Brian Cunningham, one of Wang’s colleagues on the study, also from the University of Illinois Urbana-Champaign. “When a large number of fluorescent molecules are concentrated upon a single virus, it becomes bright enough in our detection system to count each virus individually.”

Like a true Swiss army knife, scientists could modify the NanoGripper to potentially detect and grab other viruses, including HIV, influenza, or hepatitis B, as detailed in the study. The NanoGripper’s “wrist side” could also attach to another biomedical tool for additional functions, such as targeted drug delivery.

Wang, however, is thinking even bigger than viruses: cancer. The fingers could be programmed to target cancer cells the same way they currently identify covid-19’s spike proteins, and then deliver focused cancer-fighting treatments.

An artistic rendering of the NanoGripper’s potential applications. © Xing Wang

“Of course it would require a lot of testing, but the potential applications for cancer treatment and the sensitivity achieved for diagnostic applications showcase the power of soft nanorobotics,” Wang concluded.

Here’s to hoping NanoGrippers might give scientists the ability to grab the next pandemic by the nanoballs.

Source: Scientists Built a Tiny DNA ‘Hand’ That Grabs Viruses to Stop Infections

I am very curious how the detection and delivery are programmed.

Snowfall in the Alps is a third less than a hundred years ago, meteorologists find

From 23% less in the northern Alps to a decrease of almost 50% on the southwestern slopes: Between 1920 and 2020, snowfall across the entirety of the Alps decreased on average by a significant 34%. The results come from a study coordinated by Eurac Research and published in the International Journal of Climatology. The study also examines how much altitude and climatological parameters such as temperature and total precipitation impact snowfall.

The data on seasonal snowfall and rainfall was collected from 46 sites throughout the Alps: the most recent data came from modern weather stations, while the historical data was gathered from handwritten records in which specially appointed observers noted how much snow was deposited at a given location.

[…]

“The most negative trends concern locations below an altitude of 2,000 meters and are in the southern regions such as Italy, Slovenia and part of the Austrian Alps.”

In the Alpine areas to the north, such as Switzerland and northern Tyrol, the research team observed the extent to which altitude also plays a central role. Although there has been an increase in precipitation during the winter seasons, snowfall at lower altitudes has increasingly turned to rain as temperatures have risen. At higher elevations, however, thanks to sufficiently cold temperatures, snowfall is being maintained. In the southwestern and southeastern areas, temperatures have risen so much that even at higher elevations, rain is frequently taking over.

[…]

Source: Snowfall in the Alps is a third less than a hundred years ago, meteorologists find

Is ‘bypassing’ a better way to battle misinformation? Researchers say new approach has advantages over the standard

Misinformation can lead to socially detrimental behavior, which makes finding ways to combat its effects a matter of crucial public concern. A new paper by researchers at the Annenberg Public Policy Center (APPC) in the Journal of Experimental Psychology: General explores an innovative approach to countering the impact of factually incorrect information called “bypassing,” and finds that it may have advantages over the standard approach of correcting inaccurate statements.

“The gold standard for tackling misinformation is a correction that factually contradicts the misinformation” by directly refuting the claim […]

in the study “Bypassing versus correcting misinformation: Efficacy and fundamental processes.” Corrections can work, but countering misinformation this way is an uphill battle: people don’t like to be contradicted, and a belief, once accepted, can be difficult to dislodge.

Bypassing works differently. Rather than directly addressing the misinformation, this strategy involves offering information that has an implication opposite to that of the misinformation. For example, faced with the factually incorrect statement “genetically modified foods have health risks,” a bypassing approach might highlight the fact that genetically modified foods help the bee population. This counters the negative implication of the misinformation with positive implications, without taking the difficult path of confrontation.

[…]

“bypassing can generally be superior to correction, specifically in situations when people are focused on forming beliefs, but not attitudes, about the information they encounter.” This is because “when an attitude is formed, it serves as an anchor for a person’s judgment of future claims. When a belief is formed, there is more room for influence, and a bypassing message generally exerts more.”

[…]


Source: Is ‘bypassing’ a better way to battle misinformation? Researchers say new approach has advantages over the standard

Plastic pollution is changing entire Earth system, scientists find

[…]

In 2022 at least 506m tonnes of plastics were produced worldwide, but only 9% was recycled globally. The rest is burned, landfilled or dumped where it can leach into the environment. Microplastics are now everywhere, from the top of Mount Everest to the Mariana Trench, the deepest point on Earth.

The new study of plastic pollution examined the mounting evidence of the effects of plastics on the environment, health and human wellbeing. The authors are urging delegates at the UN talks to stop viewing plastic pollution as merely a waste problem, and instead to tackle material flows through the whole life pathway of plastic, from raw material extraction, production and use, to its environmental release and its fate, and the Earth system effects.

“It’s necessary to consider the full life cycle of plastics, starting from the extraction of fossil fuel and the primary plastic polymer production,” said the article’s lead author, Patricia Villarrubia-Gómez, at the Stockholm Resilience Centre.

The research team showed that plastic pollution is changing processes across the entire Earth system, affecting all pressing global environmental problems, including climate change, biodiversity loss, ocean acidification, and the use of freshwater and land.

“Plastics are seen as those inert products that protect our favourite products, or that make our lives easier, that can be ‘easily cleaned-up’ once they become waste,” Villarrubia-Gómez said. “But this is far from reality. Plastics are made out of the combination of thousands of chemicals. Many of them, such as endocrine disruptors and forever chemicals, pose toxicity and harm to ecosystems and human health. We should see plastics as the combination of these chemicals with which we interact on a daily basis.”

[…]

“We now find plastics in the most remote regions of the planet and in the most intimate, within human bodies. And we know that plastics are complex materials, released to the environment throughout the plastics life cycle, resulting in harm in many systems.

“The solutions we strive to develop must be considered with this complexity in mind, addressing the full spectra of safety and sustainability to protect people and the planet.”

Source: Plastic pollution is changing entire Earth system, scientists find | Plastics | The Guardian

Using mathematics to better understand cause and effect

Consider an example from climate science. Experts studying large atmospheric circulation patterns and their impacts on global weather would like to know how these systems might change with warming climates. Here, many variables come into play: ocean and air temperatures and pressures, ocean currents and depths, and even details of the earth’s rotation over time. But which variables cause which measured effects?

That is where information theory comes in as the framework to formulate causality. Adrián Lozano-Durán, an associate professor of aerospace at Caltech, and members of his group both at Caltech and MIT have developed a method that can be used to determine causality even in such complex systems.

The new mathematical tool can tease out the contributions that each variable in a system makes to a measured effect — both separately and, more importantly, in combination. The team describes its new method, called synergistic-unique-redundant decomposition of causality (SURD), in a paper published today, November 1, in the journal Nature Communications.

The new model can be used in any situation in which scientists are trying to determine the true cause or causes of a measured effect. That could be anything from what triggered the downturn of the stock market in 2008, to the contribution of various risk factors in heart failure, to which oceanic variables affect the population of certain fish species, to what mechanical properties are responsible for the failure of a material.

“Causal inference is very multidisciplinary and has the potential to drive progress across many fields,” says Álvaro Martínez-Sánchez, a graduate student at MIT in Lozano-Durán’s group, who is lead author of the new paper.

For Lozano-Durán’s group, SURD will be most useful in designing aerospace systems. For instance, by identifying which variable is increasing an aircraft’s drag, the method could help engineers optimize the vehicle’s design.

“Previous methods will only tell you how much causality comes from one variable or another,” explains Lozano-Durán. “What is unique about our method is its ability to capture the full picture of everything that is causing an effect.”

The new method also avoids the incorrect identification of causalities. This is largely because it goes beyond merely quantifying the effect produced by each variable independently. In addition to what the authors refer to as “unique causality,” the method incorporates two new categories of causality, namely redundant and synergistic causality.

Redundant causality occurs when more than one variable produces a measured effect, but not all the variables are needed to arrive at the same outcome. For example, a student can get a good grade in class because she is very smart or because she is a hard worker. Both could result in the good grade, but only one is necessary. The two variables are redundant.

Synergistic causality, on the other hand, involves multiple variables that must work together to produce an effect. Each variable on its own will not yield the same outcome. For instance, a patient takes medication A, but he does not recuperate from his illness. Similarly, when he takes medication B, he sees no improvement. But when he takes both medications, he fully recovers. Medications A and B are synergistic.
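
To make the distinction concrete, here is a small toy illustration in the language of mutual information (a standard textbook-style example, not the authors' SURD decomposition): with redundant causes, either variable alone already carries the full information about the effect; with synergistic causes, neither variable alone carries any, but the pair together does.

```python
# Toy illustration of redundant vs. synergistic causes via mutual information.
import numpy as np

def mutual_information(x, y):
    """Mutual information (in bits) between two discrete arrays, estimated from counts."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (x_idx, y_idx), 1)
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (px * py))
    return float(np.nansum(terms))

rng = np.random.default_rng(0)
n = 50_000
x1 = rng.integers(0, 2, n)
x2 = rng.integers(0, 2, n)

# Redundant causes: the effect copies x1, and x2 happens to be an identical copy.
# Each cause alone "explains" the effect; knowing both adds nothing new.
y = x1
x2_copy = x1
print(mutual_information(x1, y), mutual_information(x2_copy, y),
      mutual_information(2 * x1 + x2_copy, y))   # ~1, ~1, ~1 bit

# Synergistic causes: the effect is x1 XOR x2. Each cause alone looks useless,
# but the pair together determines the effect completely.
y = x1 ^ x2
print(mutual_information(x1, y), mutual_information(x2, y),
      mutual_information(2 * x1 + x2, y))        # ~0, ~0, ~1 bit
```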

SURD mathematically breaks down the contributions of each variable in a system to its unique, redundant, and synergistic components of causality. The sum of all these contributions must satisfy a conservation-of-information equation that can then be used to figure out the existence of hidden causality, i.e., variables that could not be measured or that were thought not to be important. (If the hidden causality turns out to be too large, the researchers know they need to reconsider the variables they included in their analysis.)

To test the new method, Lozano-Durán’s team used SURD to analyze 16 validation cases — scenarios with known solutions that would normally pose significant challenges for researchers trying to determine causality.

“Our method will consistently give you a meaningful answer across all these cases,” says Gonzalo Arranz, a postdoctoral researcher in the Graduate Aerospace Laboratories at Caltech, who is also an author of the paper. “Other methods mix causalities that should not be mixed, and sometimes they get confused. They get a false positive identifying a causality that doesn’t exist, for example.”

In the paper, the team used SURD to study the creation of turbulence as air flows around a wall. In this case, air flows more slowly at lower altitudes, close to the wall, and more quickly at higher altitudes. Previously, some theories of what is happening in this scenario have suggested that the higher-altitude flow influences what is happening close to the wall and not the other way around. Other theories have suggested just the opposite — that the air flow near the wall affects what is happening at higher altitudes.

“We analyzed the two signals with SURD to understand in which way the interactions were happening,” says Lozano-Durán. “As it turns out, causality comes from the velocity that is far away. In addition, there is some synergy where the signals interact to create another type of causality. This decomposition, or breaking into pieces of causality, is what is unique for our method.”


Story Source:

Materials provided by California Institute of Technology. Note: Content may be edited for style and length.


Journal Reference:

  1. Álvaro Martínez-Sánchez, Gonzalo Arranz, Adrián Lozano-Durán. Decomposing causality into its synergistic, unique, and redundant components. Nature Communications, 2024; 15 (1) DOI: 10.1038/s41467-024-53373-4

Source: Using mathematics to better understand cause and effect | ScienceDaily

Researchers unlock a new way to grow quantum dots

The type of semiconductive nanocrystals known as quantum dots are both expanding the forefront of pure science and hard at work in practical applications including lasers, QLED televisions and displays, solar cells, medical devices, and other electronics.

A new technique for growing these microscopic crystals, published this week in Science, has not only found a new, more efficient way to build a useful type of quantum dot, but also opened up a whole group of novel chemical materials for future researchers’ exploration.

[…]

by replacing the organic solvents typically used to create nanocrystals with molten salt — literally superheated sodium chloride of the type sprinkled on baked potatoes.

“Sodium chloride is not a liquid in your mind, but assume you heat it to such a crazy temperature that it becomes a liquid. It looks like liquid. It has similar viscosity as water. It’s colorless. The only problem was that nobody ever considered these liquids as media for colloidal synthesis,”

[…]

much of the previous research on quantum dots, including the Nobel work, was around dots grown using combinations of elements from the second and sixth groups on the periodic table, Rabani said. These are called “II-VI” (two-six) materials.

More promising materials for quantum dots can be found elsewhere on the periodic table.

Materials found in the third and fifth groups of the periodic table (III-V materials) are used in the most efficient solar cells, brightest LEDs, most powerful semiconductor lasers, and fastest electronic devices. They would potentially make great quantum dots, but, with few exceptions, it was impossible to use them to grow nanocrystals in solution. The temperatures required to make these materials were too high for any known organic solvent.

Molten salt can handle the heat, making these previously inaccessible materials accessible.

[…]

One of the reasons researchers synthesizing nanocrystals overlooked molten salt was its strong polarity, said UChicago graduate student Zirui Zhou, second author of the new paper.

Salt’s positively charged ions and negatively charged ions have a strong pull toward each other. Small things like nanocrystals have small surface charges, so researchers assumed the charge would be too weak to push back as salt’s ions pull in. Any growing crystals would be crushed before they could form a stable material.

Or so previous researchers thought.

“It’s a surprising observation,” Zhou said. “This is very contradictory to what scientists traditionally think about these systems.”

The new technique can mean new building blocks for better, faster quantum and classical computers, but for many on the research team, the truly exciting part is opening up new materials for study.

[…]

Source: Researchers unlock a ‘new synthetic frontier’ for quantum dots | ScienceDaily

New three-point graph mining algorithm finds patterns in complex networks

University of Virginia School of Engineering and Applied Science professor Nikolaos Sidiropoulos has introduced a breakthrough in graph mining with the development of a new computational algorithm.

Graph mining, a method of analyzing networks like social media connections or biological systems, helps researchers discover meaningful patterns in how different elements interact. The new algorithm addresses the long-standing challenge of finding tightly connected clusters, known as triangle-dense subgraphs, within large networks — a problem that is critical in fields such as fraud detection, computational biology and data analysis.

The research, published in IEEE Transactions on Knowledge and Data Engineering, was a collaboration led by Aritra Konar, an assistant professor of electrical engineering at KU Leuven in Belgium who was previously a research scientist at UVA.

Graph mining algorithms typically focus on finding dense connections between individual pairs of points, such as two people who frequently communicate on social media. However, the researchers’ new method, known as the Triangle-Densest-k-Subgraph problem, goes a step further by looking at triangles of connections — groups of three points where each pair is linked. This approach captures more tightly knit relationships, like small groups of friends who all interact with each other, or clusters of genes that work together in biological processes.

“Our method doesn’t just look at single connections but considers how groups of three elements interact, which is crucial for understanding more complex networks,” explained Sidiropoulos, a professor in the Department of Electrical and Computer Engineering. “This allows us to find more meaningful patterns, even in massive datasets.”

Finding triangle-dense subgraphs is especially challenging because it’s difficult to solve efficiently with traditional methods. But the new algorithm uses what’s called submodular relaxation, a clever shortcut that simplifies the problem just enough to make it quicker to solve without losing important details.
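
To give a feel for the objective, here is a minimal sketch of a simple greedy "peeling" baseline for finding a triangle-rich set of k nodes. This is only an illustration of what triangle density means; it is not the submodular-relaxation algorithm from the paper, and the networkx library and the toy graph are assumptions for the example.

```python
# Greedy peeling baseline for a triangle-dense subgraph of k nodes (illustrative only).
import networkx as nx

def greedy_triangle_dense_subgraph(G: nx.Graph, k: int) -> set:
    """Repeatedly remove the node participating in the fewest triangles
    until only k nodes remain."""
    H = G.copy()
    while H.number_of_nodes() > k:
        tri = nx.triangles(H)             # triangles per node in the current subgraph
        worst = min(tri, key=tri.get)     # node contributing least triangle density
        H.remove_node(worst)
    return set(H.nodes)

# Small demo: a 5-clique (lots of triangles) loosely attached to a sparse path.
G = nx.complete_graph(5)
G.add_edges_from([(4, 5), (5, 6), (6, 7)])
print(greedy_triangle_dense_subgraph(G, 5))   # expected: the clique {0, 1, 2, 3, 4}
```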

This breakthrough opens new possibilities for understanding complex systems that rely on these deeper, multi-connection relationships. Locating subgroups and patterns could help uncover suspicious activity in fraud, identify community dynamics on social media, or help researchers analyze protein interactions or genetic relationships with greater precision.


Story Source:

Materials provided by University of Virginia School of Engineering and Applied Science. Note: Content may be edited for style and length.


Journal Reference:

  1. Aritra Konar, Nicholas D. Sidiropoulos. Mining Triangle-Dense Subgraphs of a Fixed Size: Hardness, Lovász Extension and Applications. IEEE Transactions on Knowledge and Data Engineering, 2024; 1 DOI: 10.1109/TKDE.2024.3444608

Source: Professor tackles graph mining challenges with new algorithm | ScienceDaily

A simple experiment revealed the complex ‘thoughts’ of fungi – yes vegans and vegetarians: plants also really live and think.

Fungi are fascinating lifeforms that defy conventional notions of animal intelligence. They don’t have brains, yet display clear signs of decision making and communication. But just how complex are these organisms and what can they tell us about other forms of awareness? To begin investigating these mysteries, researchers at Japan’s Tohoku University and Nagaoka College conducted a straightforward test to observe the decision-making prowess of a cord-forming fungus known as Phanerochaete velutina. According to the team’s study published in Fungal Ecology, their findings indicate fungi can “recognize” different spatial arrangements of wood and adapt accordingly to make the most of their world.

Although many people only recognize fungi by their aboveground mushrooms, those formations are just the outermost display of an often vast network of underground threads called mycelium. These interconnected webs are capable of relaying environmental information throughout an entire system that can stretch for miles. But mycelium’s growth doesn’t necessarily extend in every direction at random—it appears to be a calculated effort.

Fungal mycelial networks connecting wood blocks arranged in circle (left) and cross (right) shapes. Credit: Yu Fukasawa et al.

To demonstrate this ability, researchers set up two 24-cm-wide (9.44-in-wide) square dirt environments and soaked decaying wood blocks for 42 days in a solution containing P. velutina spores. They then placed the blocks in either a circular or cross-shaped arrangement inside the box and let the fungus go about its business for 116 days. If the P. velutina grew at random, it would indicate a lack of basal cognition and decision-making—but that’s not what happened at all.

At first, the mycelium grew outward around each block for 13 days without the blocks connecting to each other. About a month later, however, both arrangements displayed extremely tangled fungal webs stretching between every wood sample. But then something striking occurred: by day 116, each fungal network had organized itself along much more deliberate, clearly defined pathways. In the circle arrangement, P. velutina displayed uniform connectivity growing outward but barely grew into the ring’s interior. Meanwhile, the cross-shaped network extended much farther from its four outermost blocks.

Researchers theorized that, in the circular environment, the mycelial network determined there was little benefit to expending excess energy in a region it already occupied. In the cross scenario, the team thinks the growth areas around the four outermost blocks served as “outposts” for foraging missions. Taken together, the two tests strongly suggest that these brainless organisms communicated through their mycelial networks and adapted their growth to the environmental situation.

“You’d be surprised at just how much fungi are capable of. They have memories, they learn, and they can make decisions,” Yu Fukasawa, a study co-author at Tohoku University, said in the paper’s announcement on October 8th. “Quite frankly, the differences in how they solve problems compared to humans is mind-blowing.”

While much remains to be understood about these often overlooked organisms, researchers believe continued experimentation and analysis may lead to a better understanding of the broader evolutionary history of consciousness, and even chart a path towards advanced bio-based computers.

Source: A simple experiment revealed the complex ‘thoughts’ of fungi | Popular Science

See also: Plants can be larks or night owls just like us

Are Plants Conscious? Researchers Argue, but agree they are intelligent.

Once considered outlandish, the idea that plants help their relatives is taking root

Plants communicate distress using their own kind of nervous system

Breakthrough study shows how plants sense the world

Biophotons: Are lentils communicating using quantum light messages?

It could take over 40 years for PFAS to leave groundwater

Per- and polyfluoroalkyl chemicals, known commonly as PFAS, could take over 40 years to flush out of contaminated groundwater in North Carolina’s Cumberland and Bladen counties, according to a new study from North Carolina State University. The study used a novel combination of data on PFAS, groundwater age-dating tracers, and groundwater flux to forecast PFAS concentrations in groundwater discharging to tributaries of the Cape Fear River in North Carolina.

The researchers sampled groundwater in two different watersheds adjacent to the Fayetteville Works fluorochemical plant in Bladen County.

“There’s a huge area of PFAS contaminated groundwater — including residential and agricultural land — which impacts the population in two ways,” says David Genereux, professor of marine, earth and atmospheric sciences at NC State and leader of the study.

“First, there are over 7,000 private wells whose users are directly affected by the contamination. Second, groundwater carrying PFAS discharges into tributaries of the Cape Fear River, which affects downstream users of river water in and near Wilmington.”

The researchers tested the samples they took to determine PFAS types and levels, then used groundwater age-dating tracers, coupled with atmospheric contamination data from the N.C. Department of Environmental Quality and the rate of groundwater flow, to create a model that estimated both past and future PFAS concentrations in the groundwater discharging to tributary streams.

They detected PFAS in groundwater up to 43 years old, and concentrations of the two most commonly found PFAS — hexafluoropropylene oxide-dimer acid (HFPO−DA) and perfluoro-2-methoxypropanoic acid (PMPA) — averaged 229 and 498 nanograms per liter (ng/L), respectively. For comparison, the maximum contaminant level (MCL) issued by the U.S. Environmental Protection Agency for HFPO-DA in public drinking water is 10 ng/L. MCLs are enforceable drinking water standards.

“These results suggest it could take decades for natural groundwater flow to flush out groundwater PFAS still present from the ‘high emission years,’ roughly the period between 1980 and 2019,” Genereux says. “And this could be an underestimate; the time scale could be longer if PFAS is diffusing into and out of low-permeability zones (clay layers and lenses) below the water table.”
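To see why decades-old groundwater implies decades of future contamination, here is a minimal back-of-envelope sketch (not the authors’ model, which combined age-dating tracers, PFAS measurements, and groundwater flux): if the oldest PFAS-bearing water reaching streams today took about 43 years to get there, then water recharged near the end of the high-emission period (2019) keeps discharging until roughly the early 2060s. The sampling year below is an assumption for illustration.

```python
# Back-of-envelope piston-flow illustration of the ~40-year flushing estimate.
# This is NOT the authors' model; it only shows why decades-old travel times
# imply decades of future PFAS discharge. Ages and years come from the article;
# the sampling year is an assumption.

max_groundwater_age_years = 43   # oldest PFAS-bearing groundwater detected
last_high_emission_year = 2019   # end of the "high emission years" (1980-2019)
sampling_year = 2022             # assumed approximate study period

# Water recharged at the end of the high-emission period can take up to
# max_groundwater_age_years to reach the streams.
last_discharge_year = last_high_emission_year + max_groundwater_age_years
years_remaining = last_discharge_year - sampling_year

print(f"High-emission-era water keeps discharging until ~{last_discharge_year}")
print(f"That is roughly {years_remaining} more years of flushing")
```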

The researchers point out that although air emissions of PFAS are substantially lower now than they were prior to 2019, they are not zero, so some atmospheric deposition of PFAS seems likely to continue to feed into the groundwater.

“Even a best-case scenario — without further atmospheric deposition — would mean that PFAS emitted in past decades will slowly flush from groundwater to surface water for about 40 more years,” Genereux says. “We expect groundwater PFAS contamination to be a multi-decade problem, and our work puts some specific numbers behind that. We plan to build on this work by modeling future PFAS at individual drinking water wells and working with toxicologists to relate past PFAS levels at wells to observable health outcomes.”


Story Source:

Materials provided by North Carolina State University. Original written by Tracey Peake. Note: Content may be edited for style and length.


Journal Reference:

  1. Craig R. Jensen, David P. Genereux, D. Kip Solomon, Detlef R. U. Knappe, Troy E. Gilmore. Forecasting and Hindcasting PFAS Concentrations in Groundwater Discharging to Streams near a PFAS Production Facility. Environmental Science & Technology, 2024; 58 (40): 17926 DOI: 10.1021/acs.est.4c06697

Source: It could take over 40 years for PFAS to leave groundwater | ScienceDaily

How personal care products affect indoor air quality

The personal care products we use on a daily basis significantly affect indoor air quality, according to new research by a team at EPFL. When used indoors, these products release a cocktail of more than 200 volatile organic compounds (VOCs) into the air, and when those VOCs come into contact with ozone, the chemical reactions that follow can produce new compounds and particles that may penetrate deep into our lungs. Scientists don’t yet know how inhaling these particles on a daily basis affects our respiratory health.

The EPFL team’s findings have been published in Environmental Science & Technology Letters.

[…]

In one test, the researchers applied the products under typical conditions, while the air quality was carefully monitored. In another test, they did the same thing but also injected ozone, a reactive outdoor gas that occurs at European latitudes during the summer months.

[…]

However, when ozone was introduced into the chamber, not only new VOCs but also new particles were generated, particularly from perfume and sprays, exceeding concentrations found in heavily polluted areas such as downtown Zurich.

“Some molecules ‘nucleate’—in other words, they form new particles that can coagulate into larger ultrafine particles that can effectively deposit into our lungs,” explains Licina. “In my opinion, we still don’t fully understand the health effects of these pollutants, but they may be more harmful than we think, especially because they are applied close to our breathing zone. This is an area where new toxicological studies are needed.”

Preventive measures

To limit the effect of personal care products on indoor air quality, we could consider several alternatives for how buildings are engineered: introducing more ventilation—especially during the products’ use—incorporating air-cleaning devices (e.g., activated carbon-based filters combined with media filters), and limiting the concentration of indoor ozone.

Another preventive measure is also recommended, according to Licina: “I know this is difficult to hear, but we’re going to have to reduce our reliance on these products, or if possible, replace them with more natural alternatives that contain fragrant compounds with low chemical reactivity. Another helpful measure would be to raise awareness of these issues among staff working with vulnerable groups, such as children and the elderly.”

More information: Tianren Wu et al, Indoor Emission, Oxidation, and New Particle Formation of Personal Care Product Related Volatile Organic Compounds, Environmental Science & Technology Letters (2024). DOI: 10.1021/acs.estlett.4c00353

Source: How personal care products affect indoor air quality

‘Writing’ with atoms could transform materials fabrication for quantum devices

[…] A research team at the Department of Energy’s Oak Ridge National Laboratory has created a novel advanced microscopy tool to “write” with atoms, placing those atoms exactly where they are needed to give a material new properties.

“By working at the atomic scale, we also work at the scale where quantum properties naturally emerge and persist,” said Stephen Jesse, a materials scientist who leads this research and heads the Nanomaterials Characterizations section at ORNL’s Center for Nanophase Materials Sciences, or CNMS.

[…]

To accomplish improved control over atoms, the research team created a tool they call a synthescope for combining synthesis with advanced microscopy. The researchers use a scanning transmission electron microscope, or STEM, transformed into an atomic-scale material manipulation platform.

The synthescope will advance the state of the art in fabrication down to the level of the individual building blocks of materials. This new approach allows researchers to place different atoms into a material at specific locations; the new atoms and their locations can be selected to give the material new properties.

[…]

https://www.youtube.com/watch?v=I5FSc-lqI6s

“We realized that if we have a microscope that can resolve atoms, we may be able to use the same microscope to move atoms or alter materials with atomic precision. We also want to be able to add atoms to the structures we create, so we need a supply of atoms. The idea morphed into an atomic-scale synthesis platform—the synthescope.”

That is important because the ability to tailor materials atom by atom can be applied to many future technologies in quantum information science and, more broadly, in microelectronics and catalysis, as well as to gaining a deeper understanding of materials synthesis processes. This work could facilitate atomic-scale manufacturing, which is notoriously challenging.

“Simply by the fact that we can now start putting atoms where we want, we can think about creating arrays of atoms that are precisely positioned close enough together that they can entangle, and therefore share their quantum states, which is key to making quantum devices more powerful than conventional ones,” Dyck said.

Such devices might include quantum computers—a proposed next generation of computers that may vastly outpace today’s fastest supercomputers; quantum sensors; and quantum communication devices that require a source of a single photon to create a secure quantum communications system.

“We are not just moving atoms around,” Jesse said. “We show that we can add a variety of atoms to a material that were not previously there and put them where we want them. Currently there is no technology that allows you to place different elements exactly where you want to place them and have the right bonding and structure. With this technology, we could build structures from the atom up, designed for their electronic, optical, chemical or structural properties.”

The scientists, who are part of the CNMS, a nanoscience research center and DOE Office of Science user facility, detailed their research and their vision in a series of four papers in scientific journals over the course of a year, starting with proof of principle that the synthescope could be realized. They have applied for a patent on the technology.

“With these papers, we are redirecting what atomic-scale fabrication will look like using electron beams,” Dyck said. “Together these manuscripts outline what we believe will be the direction atomic fabrication technology will take in the near future and the change in conceptualization that is needed to advance the field.”

By using an electron beam, or e-beam, to remove and deposit the atoms, the ORNL scientists could accomplish a direct writing procedure at the atomic level.

“The process is remarkably intuitive,” said ORNL’s Andrew Lupini, STEM group leader and a member of the research team. “STEMs work by transmitting a high-energy e-beam through a material. The e-beam is focused to a point smaller than the distance between atoms and scans across the material to create an image with atomic resolution. However, STEMs are notorious for damaging the very materials they are imaging.”

The scientists realized they could exploit this destructive “bug” and instead use it as a constructive feature and create holes on purpose. Then, they can put whatever atom they want in that hole, exactly where they made the defect. By purposely damaging the material, they create a new material with different and useful properties.

[…]

To demonstrate the method, the researchers moved an e-beam back and forth over a graphene lattice, creating minuscule holes. They inserted tin atoms into those holes and achieved a continuous, atom-by-atom, direct writing process, populating the exact sites where carbon atoms had been with tin atoms.
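To make that workflow concrete, here is a toy sketch of the image, sputter, refill, verify loop. Everything in it is a hypothetical stand-in (the lattice is a dictionary; the beam and tin source exist only as comments); it is not ORNL’s synthescope software or any real microscope API.

```python
# Toy simulation of the direct-write loop described above: image a site,
# sputter the carbon atom with the focused e-beam, let a tin atom fill the
# vacancy, then re-image to verify. All names and behavior are hypothetical.

lattice = {(x, y): "C" for x in range(10) for y in range(10)}  # pristine graphene patch
target_sites = [(2, 3), (5, 5), (7, 1)]                        # desired tin dopant positions

for site in target_sites:
    print(f"site {site}: before = {lattice[site]}")  # 1. atomic-resolution "image" of the site
    lattice[site] = "vacancy"                        # 2. e-beam dwell ejects the carbon atom
    lattice[site] = "Sn"                             # 3. tin source supplies an atom to fill the hole
    print(f"site {site}: after  = {lattice[site]}")  # 4. re-image to confirm the substitution
```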

[…]

Source: ‘Writing’ with atoms could transform materials fabrication for quantum devices

Scientists Detect Invisible Electric Field Around Earth For First Time

An invisible, weak energy field wrapped around our planet Earth has finally been detected and measured.

It’s called the ambipolar field, an electric field first hypothesized more than 60 years ago

[…]

“Any planet with an atmosphere should have an ambipolar field,” says astronomer Glyn Collinson of NASA’s Goddard Space Flight Center.

“Now that we’ve finally measured it, we can begin learning how it’s shaped our planet as well as others over time.”

Earth isn’t just a blob of dirt sitting inert in space. It’s surrounded by all sorts of fields. There’s the gravity field.

[…]

There’s also the magnetic field, which is generated by the rotating, conducting material in Earth’s interior, converting kinetic energy into the magnetic field that spins out into space.

[…]

In 1968, scientists described a phenomenon that we couldn’t have noticed until the space age. Spacecraft flying over Earth’s poles detected a supersonic wind of particles escaping from Earth’s atmosphere. The best explanation for this was a third, electric energy field.

“It’s called the ambipolar field and it’s an agent of chaos. It counters gravity, and it strips particles off into space,” Collinson explains in a video.

“But we’ve never been able to measure this before because we haven’t had the technology. So, we built the Endurance rocket ship to go looking for this great invisible force.”

[…]

Here’s how the ambipolar field was expected to work. Starting at an altitude of around 250 kilometers (155 miles), in a layer of the atmosphere called the ionosphere, extreme ultraviolet and solar radiation ionizes atmospheric atoms, breaking off negatively charged electrons and turning the atom into a positively charged ion.

The lighter electrons will try to fly off into space, while the heavier ions will try to sink towards the ground. But the plasma environment will try to maintain charge neutrality, which results in the emergence of an electric field between the electrons and the ions to tether them together.

This is called the ambipolar field because it works in both directions, with the ions supplying a downward pull and the electrons an upward one.

The result is that the atmosphere is puffed up; the increased altitude allows some ions to escape into space, which is what we see in the polar wind.

This ambipolar field was expected to be incredibly weak, which is why it had never been measured; Collinson and his team had to design instrumentation sensitive enough to detect it. The Endurance mission, carrying this experiment, was launched in May 2022, reaching an altitude of 768.03 kilometers (477.23 miles) before falling back to Earth with its precious, hard-won data.

And it succeeded. It measured a change in electric potential of just 0.55 volts – but that was all that was needed.

“A half a volt is almost nothing – it’s only about as strong as a watch battery,” Collinson says. “But that’s just the right amount to explain the polar wind.”

That amount of charge is enough to tug on hydrogen ions with 10.6 times the strength of gravity, launching them into space at the supersonic speeds measured over Earth’s poles.
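As a rough sanity check on that figure (not a calculation from the paper), one can compare the energy a proton gains crossing a 0.55-volt potential with the gravitational potential energy it would need to climb the same span of altitude, roughly the 250 to 768 km range probed by Endurance. Under the crude assumptions of a uniform field and near-constant gravity, the ratio comes out on the order of ten:

```python
# Rough check of the "about ten times gravity" figure for hydrogen ions.
# Crude assumptions: the 0.55 V drop spans the ~250-768 km altitude range
# probed by Endurance, and gravity is treated as roughly constant.
e = 1.602e-19        # elementary charge, C
m_p = 1.673e-27      # proton (hydrogen ion) mass, kg
g = 9.0              # rough mean gravitational acceleration over that range, m/s^2
dV = 0.55            # measured potential change, V
dh = 768e3 - 250e3   # altitude span, m

electric_energy = e * dV        # energy gained from the ambipolar field, J
gravity_energy = m_p * g * dh   # gravitational energy needed for the same climb, J

# Prints a ratio of roughly 11, i.e. on the order of the reported 10.6.
print(f"electric-to-gravity ratio: {electric_energy / gravity_energy:.1f}")
```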

Oxygen ions, which are heavier than hydrogen ions, are also lofted higher, increasing the density of the ionosphere at high altitudes by 271 percent, compared to what its density would be without the ambipolar field.

[…]

The research has been published in Nature.

Source: Scientists Detect Invisible Electric Field Around Earth For First Time : ScienceAlert

Doughnut-shaped region found inside Earth’s core deepens understanding of planet’s magnetic field

A doughnut-shaped region thousands of kilometers beneath our feet within Earth’s liquid core has been discovered by scientists from The Australian National University (ANU), providing new clues about the dynamics of our planet’s magnetic field.

The structure within Earth’s liquid core is found only at low latitudes and sits parallel to the equator. According to ANU seismologists, it has remained undetected until now.

The Earth has two core layers: the inner core, a solid layer, and the outer core, a liquid layer. Surrounding the Earth’s core is the mantle. The newly discovered doughnut-shaped region is at the top of Earth’s outer core, where the liquid core meets the mantle.

Study co-author and ANU geophysicist, Professor Hrvoje Tkalčić, said the seismic waves detected are slower in the newly discovered region than in the rest of the liquid outer core.

[…]

“We don’t know the exact thickness of the doughnut, but we inferred that it reaches a few hundred kilometers beneath the core-mantle boundary.”

Rather than using traditional seismic wave observation techniques and observing signals generated by earthquakes within the first hour, the ANU scientists analyzed the similarities between waveforms many hours after the earthquake origin times, leading them to make the unique discovery.

“By understanding the geometry of the paths of the waves and how they traverse the outer core’s volume, we reconstructed their travel times through the Earth, demonstrating that the newly discovered region has low seismic speeds,” Professor Tkalčić said.

“The peculiar structure remained hidden until now as previous studies collected data with less volumetric coverage of the outer core by observing waves that were typically confined within one hour after the origin times of large earthquakes.

[…]

“Our findings are interesting because this low velocity within the liquid core implies that we have a high concentration of light chemical elements in these regions that would cause the seismic waves to slow down. These light elements, alongside temperature differences, help stir liquid in the outer core,” Professor Tkalčić said.

[…]

The research is published in Science Advances.

More information: Xiaolong Ma et al, Seismic low-velocity equatorial torus in the Earth’s outer core: Evidence from the late-coda correlation wavefield, Science Advances (2024). DOI: 10.1126/sciadv.adn5562

Source: Doughnut-shaped region found inside Earth’s core deepens understanding of planet’s magnetic field

String Theorists Accidentally Find a New Formula for Pi

[…] most recently in January 2024, when physicists Arnab Priya Saha and Aninda Sinha of the Indian Institute of Science presented a completely new formula for calculating it, which they later published in Physical Review Letters.

Saha and Sinha are not mathematicians. They were not even looking for a novel pi equation. Rather, these two string theorists were working on a unifying theory of fundamental forces, one that could reconcile electromagnetism, gravity and the strong and weak nuclear forces

[…]

For millennia, mankind has been trying to determine the exact value of pi. […]

One famous example is Archimedes, who estimated pi with the help of polygons: by drawing an n-sided polygon inside and one outside a circle and calculating the perimeter of each, he was able to narrow down the value of pi.

A common method for determining pi geometrically involves drawing bounding polygons, with an increasing number of sides, inside and outside a circle and then comparing the two perimeters. Credit: Fredrik/Leszek Krupinski/Wikimedia Commons

Teachers often present this method in school

[…]

In the 15th century experts found infinite series as a new way to express pi. […]

For example, the Indian scholar Madhava, who lived from 1350 to 1425, found that pi equals 4 multiplied by a series that begins with 1 and then alternately subtracts or adds fractions in which 1 is placed over successively higher odd numbers (so 1/3, 1/5, and so on). One way to express this would be:

\[ \pi = 4\left(1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \cdots\right) \]

This formula makes it possible to determine pi as precisely as you like in a very simple way.

[…]

As Saha and Sinha discovered more than 600 years later, Madhava’s formula is only a special case of a much more general equation for calculating pi. In their work, the string theorists discovered the following formula:

\[ \pi = 4 + \sum_{n=1}^{\infty} \frac{1}{n!}\left(\frac{1}{n+\lambda} - \frac{4}{2n+1}\right)\left(\frac{(2n+1)^2}{4(n+\lambda)} - n\right)_{n-1} \]

Here \((x)_{n-1}\) denotes the Pochhammer symbol, the rising factorial \(x(x+1)\cdots(x+n-2)\).

This formula produces an infinitely long sum. What is striking is that it depends on the factor λ, a freely selectable parameter. No matter what value λ has, the formula will always result in pi. And because there are infinitely many numbers that can correspond to λ, Saha and Sinha have found an infinite number of pi formulas.

If λ is infinitely large, the equation corresponds to Madhava’s formula. That is, because λ only ever appears in the denominator of fractions, the corresponding fractions for λ = ∞ become zero (because fractions with large denominators are very small). For λ = ∞, the equation of Saha and Sinha therefore takes the following form:

\[ \pi = 4 - \sum_{n=1}^{\infty} \frac{1}{n!}\,\frac{4}{2n+1}\,(-n)_{n-1} \]

The first part of the equation is already similar to Madhava’s formula: you sum fractions with odd denominators.

[…]

As the two string theorists report, however, pi can be calculated much faster for smaller values of λ. While Madhava’s result requires 100 terms to get within 0.01 of pi, Saha and Sinha’s formula for λ = 3 only requires the first four summands. “While [Madhava’s] series takes 5 billion terms to converge to 10 decimal places, the new representation with λ between 10 [and] 100 takes 30 terms,” the authors write in their paper. Saha and Sinha did not find the most efficient method for calculating pi, though. Other series have been known for several decades that provide an astonishingly accurate value much more quickly. What is truly surprising in this case is that the physicists came up with a new pi formula when their paper aimed to describe the interaction of strings.
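A quick numerical experiment makes this convergence difference tangible. The sketch below assumes the representation shown above, with the Pochhammer rising factorial, and compares a 4-term partial sum at λ = 3 with a 100-term Madhava sum:

```python
# Compare how quickly Madhava's series and the Saha-Sinha representation
# (with the Pochhammer rising factorial) approach pi.
import math

def madhava(n_terms):
    """pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...), summed over n_terms fractions."""
    return 4 * sum((-1) ** k / (2 * k + 1) for k in range(n_terms))

def pochhammer(x, k):
    """Rising factorial x (x+1) ... (x+k-1); the empty product (k = 0) is 1."""
    result = 1.0
    for i in range(k):
        result *= x + i
    return result

def saha_sinha(n_terms, lam):
    """pi = 4 + sum_{n>=1} (1/n!) (1/(n+lam) - 4/(2n+1)) ((2n+1)^2/(4(n+lam)) - n)_{n-1}"""
    total = 4.0
    for n in range(1, n_terms + 1):
        coeff = 1 / (n + lam) - 4 / (2 * n + 1)
        poch = pochhammer((2 * n + 1) ** 2 / (4 * (n + lam)) - n, n - 1)
        total += coeff * poch / math.factorial(n)
    return total

print(f"Madhava, 100 terms:           {madhava(100):.6f}")      # ~3.1316, about 0.01 away
print(f"Saha-Sinha, 4 terms, lam = 3: {saha_sinha(4, 3):.6f}")  # ~3.1430, within ~0.002
```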

[…]

Source: String Theorists Accidentally Find a New Formula for Pi | Scientific American

Researchers figure out how to keep clocks on the Earth, Moon in sync

[…] Our communications and GPS networks all depend on keeping careful track of the precise timing of signals—including accounting for the effects of relativity. The deeper into a gravitational well you go, the slower time moves, and we’ve reached the point where we can detect differences in altitude of a single millimeter. Time literally flows faster at the altitude where GPS satellites are than it does for clocks situated on Earth’s surface. Complicating matters further, those satellites are moving at high velocities, an effect that slows things down.
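For a sense of scale, here is a rough, standard-textbook estimate (not from the new paper) of those two competing effects for a GPS satellite: the gravitational term speeds the satellite clock up, the orbital velocity slows it down, and the familiar net result is on the order of +38 microseconds per day relative to a clock on the ground. Earth's rotation and the orbit's eccentricity are ignored.

```python
# Rough textbook estimate of relativistic clock-rate offsets for a GPS satellite
# relative to Earth's surface (standard constants; Earth's rotation and orbital
# eccentricity ignored). Not taken from the Ashby & Patla paper.
GM = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8       # speed of light, m/s
R_E = 6.371e6          # mean Earth radius, m
r_gps = 2.656e7        # GPS orbital radius (~20,200 km altitude), m
day = 86400            # seconds per day

# Gravitational term: higher potential at orbit makes the satellite clock run fast.
grav = GM * (1 / R_E - 1 / r_gps) / c**2

# Velocity term: orbital speed makes the satellite clock run slow.
v = (GM / r_gps) ** 0.5
vel = -v**2 / (2 * c**2)

print(f"gravitational: {grav * day * 1e6:+.1f} us/day")          # about +45.7
print(f"velocity:      {vel * day * 1e6:+.1f} us/day")           # about -7.2
print(f"net:           {(grav + vel) * day * 1e6:+.1f} us/day")  # about +38.5
```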

[…]

It would be easy to set up an equivalent system to track time on the Moon, but that would inevitably see the clocks run out of sync with those on Earth—a serious problem for things like scientific observations

[…]

Ashby and Patla worked on developing a system where anything can be calculated in reference to the center of mass of the Earth/Moon system. Or, as they put it in the paper, their mathematical system “enables us to compare clock rates on the Moon and cislunar Lagrange points with respect to clocks on Earth by using a metric appropriate for a locally freely falling frame such as the center of mass of the Earth–Moon system in the Sun’s gravitational field.”

[…]

The paper’s body has 55 equations, and there are another 67 in the appendices.

[…]

Things get complicated because there are so many factors to consider. There are tidal effects from the Sun and other planets. Anything on the surface of the Earth or Moon is moving due to rotation; other objects are moving while in orbit. The gravitational influence on time will depend on where an object is located.

[…]

The researchers say that their approach, while focused on the Earth/Moon system, is still generalizable. Which means that it should be possible to modify it and create a frame of reference that would work on both Earth and anywhere else in the Solar System. Which, given the pace at which we’ve sent things beyond low-Earth orbit, is probably a healthy amount of future-proofing.

The Astronomical Journal, 2024. DOI: 10.3847/1538-3881/ad643a  (About DOIs).

Source: Researchers figure out how to keep clocks on the Earth, Moon in sync | Ars Technica