More from nanoscale views
An early physics demonstration that many of us see in elementary school is static electricity: an electrical insulator like a wool cloth or animal fur is rubbed on a glass or plastic rod, and suddenly the rod can pick up pieces of styrofoam or little bits of paper. Alternatively, a rubber balloon is rubbed against a kid's hair, and afterward the balloon can stick to a wall with enough force that static friction keeps it from sliding down the surface. The physics here is that when materials are rubbed together, there can be a net transfer of electrical charge from one to the other, a phenomenon called triboelectricity. The electrostatic attraction between the net charge on the balloon and the polarizable surface of the wall is enough to hold up the balloon.

[Image: balloons electrostatically clinging to a wall.]

The big mystery is: how and why do charges transfer between materials when they are rubbed together? As I wrote about once before, this is still not understood, despite more than 2500 years of observations. The electrostatic potentials that can be built up through triboelectricity are not small. They can be tens of kV, enough that electrons accelerated across those potentials emit x-rays when they smack into the positively charged surface. Whatever is going on, it's a way to effectively concentrate the energy from mechanical work into displacing charges. This is how Wimshurst machines and Van de Graaff generators work, even though we don't understand the microscopic physics of the charge generation and separation. There are disagreements to this day about the mechanisms at work in triboelectricity, including the role of adsorbates, surface chemistry, whether the transferred charges are electrons or ions, etc. From how electronic charge transfer works between metals, or between metals and semiconductors, it's not crazy to imagine that somehow this should all come down to work functions or the equivalent.
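To get a feel for the energy scale mentioned above: an electron accelerated through a potential difference of tens of kV gains tens of keV of kinetic energy, which sets the maximum (bremsstrahlung cutoff) photon energy when it slams into the surface. A minimal sketch of that arithmetic, using standard physical constants:

```python
import math  # not strictly needed; kept for clarity if you extend this

E_CHARGE = 1.602176634e-19  # elementary charge, C
H_PLANCK = 6.62607015e-34   # Planck constant, J*s
C_LIGHT = 2.99792458e8      # speed of light, m/s

def max_photon_energy_keV(potential_kV):
    """Maximum bremsstrahlung photon energy (keV) for an electron
    accelerated through `potential_kV` kilovolts: E = eV."""
    return potential_kV  # energy in keV equals potential in kV

def cutoff_wavelength_nm(potential_kV):
    """Duane-Hunt short-wavelength limit of the emitted x-rays."""
    energy_J = potential_kV * 1e3 * E_CHARGE
    return H_PLANCK * C_LIGHT / energy_J * 1e9

for V in (10, 20, 50):  # representative triboelectric potentials, kV
    print(f"{V} kV -> up to {max_photon_energy_keV(V)} keV, "
          f"lambda_min = {cutoff_wavelength_nm(V):.4f} nm")
```

These are unambiguously x-ray wavelengths (around 0.1 nm and below), which is why peeling tape in vacuum can genuinely expose x-ray film.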
Depending on the composition and structure of materials, the electrons in them can be bound more tightly (energetically deeper compared to the energy of an electron far away, also called the "vacuum level") or more loosely (energetically shallower, closer to the energy of a free electron). It's credible that bringing two such materials into contact could lead to electrons "falling downhill" from the more loosely binding material into the more tightly binding one. That is clearly not the whole story, though, or this would have been figured out long ago. This week, a new paper revealed an interesting wrinkle: the net preference for picking up or losing charge seems to depend very clearly on the history of repeated contacts. The authors used PDMS silicone rubber, and they find that repeated contacting can deterministically bake in a tendency for charge to flow in one direction. Using various surface spectroscopy methods, they find no obvious differences at the PDMS surface before and after the contacting procedures, but charge transfer is affected. My sneaking suspicion is that adsorbates will turn out to play a huge role in all of this. This may be one of those issues like friction (see here too), where there is a general emergent phenomenon (net charge transfer) that can take place via multiple different underlying pathways. Experiments in ultrahigh vacuum with ultraclean surfaces will undoubtedly show quantitatively different results than experiments in ambient conditions, but both may show triboelectricity.
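The "falling downhill" picture can be made slightly more quantitative with the standard contact-electrification model for conductors: electrons flow until the Fermi levels align, leaving a contact potential difference set by the work-function mismatch. This is a toy sketch only (it does not apply cleanly to an insulator like PDMS, which is part of why triboelectricity remains unresolved), and the numbers below are illustrative assumptions, not values from the paper:

```python
# Toy "work function" model of contact charging between two conductors,
# treated as a parallel-plate capacitor at the moment of separation.
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def contact_charge(phi1_eV, phi2_eV, area_m2, gap_m):
    """Charge (C) transferred to align Fermi levels. The contact
    potential difference V_c = |phi2 - phi1| (in volts, since the work
    functions are in eV) drives charge q = C * V_c onto the capacitor
    C = eps0 * A / d formed by the two surfaces."""
    v_contact = abs(phi2_eV - phi1_eV)
    capacitance = EPS0 * area_m2 / gap_m
    return capacitance * v_contact

# Hypothetical numbers: ~1 eV work-function difference, 1 cm^2 contact
# area, ~1 nm effective separation at contact.
q = contact_charge(4.0, 5.0, 1e-4, 1e-9)
print(f"transferred charge ~ {q:.2e} C")
```

Even this crude estimate gives charge densities in the right ballpark for contact electrification experiments, which is why the work-function intuition is so tempting despite clearly being incomplete.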
The National Science Foundation was created 75 years ago, at the behest of Vannevar Bush, who put together the famed study, Science, The Endless Frontier, in 1945. The NSF has played a critical role in a huge amount of science and engineering research since its inception, including advanced development of the laser, the page rank algorithm that ended up in google, and too many other contributions to list. The NSF funds university research as well as some national facilities. Organizationally, the NSF is an independent agency, meaning that it doesn’t reside under a particular cabinet secretary, though its Director is a presidential appointee who is confirmed by the US Senate. The NSF comprises a number of directorates (most relevant for readers of this blog are probably Mathematical and Physical Sciences; Engineering; and STEM Education, though there are several others). Within the directorates are divisions (for example, MPS → Division of Materials Research; Division of Chemistry; Division of Physics; Division of Mathematics etc.). Within each division are a variety of programs, spanning from individual investigator grants to medium to large center proposals, to group training grants, to individual graduate and postdoctoral fellowships. Each program is administered by one or more program officers who are either scientists who have become civil servants, or "rotators", academics who take a leave of absence from their university positions to serve at the NSF for some number of years. The NSF is the only agency whose mission historically has explicitly included science education. The NSF's budget has been about $9B/yr (though until very recently there was supposedly bipartisan support for large increases), and 94% of its funds are spent on research, education, and related activities. NSF funds more than 1/4 of all basic research done at universities in the US, and it also funds tech development, like small business innovation grants. 
The NSF, more than any other agency that funds physical science and engineering research, relies on peer review. Grants are reviewed by individual reviewers and/or panels. Compared to other agencies, the influence of program officers in the review process is minimal. If a grant doesn't excite the reviewers, it won't get funded. This has its pluses and minuses, but it's less of a personal networking process than at other agencies. The success rate for many NSF programs is low, averaging around 25% in DMR, and 15% or so for graduate fellowships. Every NSF program officer with whom I've ever interacted has been dedicated and professional. Well, yesterday the NSF laid off 11% of its workforce. I had an exchange last night with a long-time NSF program director, who gave permission for me to share the gist, suitably anonymized. (I also corrected typos.) This person says that they want people to be aware of what's going on. They say that NSF leadership is apparently helping with layoffs, and that "permanent Program Directors (feds such as myself) will be undergoing RIF or Reduction In Force process within the next month or so. So far, through buyout and firing today we lost about 16% of the workforce, and RIF is expected to bring it up to 50%." When I asked further, this person said this was "fairly certain". They went on: "Another danger is budget. We do not know what happens after the current CR [continuing resolution] ends March 14. A long shutdown or another CR are possible. For FY26 we are told about plans to reduce the NSF budget by 50%-75% - such reduction will mean no new awards for at least a year, elimination of divisions, merging of programs. Individual researchers and professional societies can help by raising the voice of objection. But realistically, we need to win the midterms to start real change. For now we are losing this battle.
I can only promise you that NSF PDs are united as never before in our dedication to serve our communities of researchers and educators. We will continue to do so as long as we are here." On a related note, here is a thread by an NSF program officer who was just laid off. Note that Congress has historically ignored presidential budget requests to cut NSF, but it's not at all clear that this can be relied upon now. Voluntarily hobbling the NSF is, in my view, a terrible mistake that will take decades to fix. The argument that this is a fiscally responsible thing to do is weak. Total federal budget expenditures in FY24 were $6.75T. The NSF budget was $9B, or 0.13% of the total. The Secretary of Defense today said that the plan is to cut 8% of the DOD budget every year for the next several years. That's a reduction of 9 NSF budgets per year. I fully recognize that many other things are going on in the world right now, and many agencies are under similar pressures, but I wanted to highlight the NSF in particular. Acting like this is business as usual, the kind of thing that happens whenever there is a change of administration, is disingenuous.
While I could certainly write more about what is going on in the US these days (ahh, trying to dismantle organizations you don't understand), instead I want to briefly highlight a very exciting result from my colleagues, published in Nature last month. (I almost titled this post "Lies, Damn Lies, and (para)Statistics", but that sounds like I don't like the paper.) When we teach students about the properties of quantum objects (and about thermodynamics), we often talk about the "statistics" obeyed by indistinguishable particles. I've written about aspects of this before. "Statistics" in this sense means, what happens mathematically to the multiparticle quantum state \(|\Psi\rangle\) when two particles are swapped. If we use the label \(\mathbf{1}\) to mean the set of quantum numbers associated with particle 1, etc., then the question is, how are \(|\Psi(\mathbf{1},\mathbf{2})\rangle\) and \(|\Psi(\mathbf{2},\mathbf{1})\rangle\) related to each other? We know that probabilities have to be conserved, so \(\langle \Psi(\mathbf{1},\mathbf{2}) | \Psi(\mathbf{1},\mathbf{2})\rangle = \langle \Psi(\mathbf{2},\mathbf{1}) | \Psi(\mathbf{2},\mathbf{1})\rangle\). The usual situation is to assume \(|\Psi(\mathbf{2},\mathbf{1})\rangle = c |\Psi(\mathbf{1},\mathbf{2})\rangle\), where \(c\) is a complex number of magnitude 1. If \(c = 1\), which is sort of the "common sense" expectation from classical physics, the particles are bosons, obeying Bose-Einstein statistics. If \(c = -1\), the particles are fermions and obey Fermi-Dirac statistics. In principle, one could have \(c = \exp(i\alpha)\), where \(\alpha\) is some phase angle. Particles in that general case are called anyons, and I wrote about them here. Low energy excitations of electrons (fermions) confined in 2D in the presence of a magnetic field can act like anyons, but it seems there can't be anyons in higher dimensions.
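The exchange phase \(c\) for bosons and fermions can be seen concretely by building two-particle wavefunctions from single-particle ones: the symmetrized combination returns \(c = +1\) under swapping the labels, and the antisymmetrized combination returns \(c = -1\). A small sketch with toy (hypothetical, unnormalized) single-particle states:

```python
import math

def psi_a(x):  # toy single-particle wavefunctions; any two distinct
    return math.exp(-x**2)  # functions would illustrate the same point

def psi_b(x):
    return x * math.exp(-x**2)

def boson(x1, x2):
    """Symmetric two-particle combination (unnormalized)."""
    return psi_a(x1) * psi_b(x2) + psi_a(x2) * psi_b(x1)

def fermion(x1, x2):
    """Antisymmetric two-particle combination (unnormalized)."""
    return psi_a(x1) * psi_b(x2) - psi_a(x2) * psi_b(x1)

x1, x2 = 0.3, 1.1
print(boson(x2, x1) / boson(x1, x2))      # +1.0: swap leaves the state unchanged
print(fermion(x2, x1) / fermion(x1, x2))  # -1.0: swap flips the sign
```

Note also that `fermion(x, x)` vanishes identically, which is the Pauli exclusion principle in miniature: the antisymmetric state has zero amplitude for two fermions at the same coordinates.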
Being imprecise, when particles are "dilute" -- "far" from each other in terms of position and momentum -- we typically don't really need to worry much about what kind of quantum statistics govern the particles. The distribution function - the average occupancy of a typical single-particle quantum state (labeled by a coordinate \(\mathbf{r}\), a wavevector \(\mathbf{k}\), and a spin \(\mathbf{\sigma}\) as one possibility) - is much less than 1. When particles are much more dense, though, the quantum statistics matter enormously. At low temperatures, bosons can all pile into the (single-particle, in the absence of interactions) ground state - that's Bose-Einstein condensation. In contrast, fermions have to stack up into higher energy states, since FD statistics imply that no two indistinguishable fermions can be in the same state - this is the Pauli Exclusion Principle, and it's basically why solids are solid. If a gas of particles is at a temperature \(T\) and a chemical potential \(\mu\), then the distribution function as a function of energy \(\epsilon\) for bosons or fermions is given by \(f(\epsilon,\mu,T) = 1/(\exp((\epsilon-\mu)/k_{\mathrm{B}}T) \pm 1)\), where the \(+\) sign is the fermion case and the \(-\) sign is the boson case. In the paper at hand, the authors take on parastatistics, the question of what happens if, besides spin, there are other "internal degrees of freedom" attached to particles, described by additional indices that obey different algebras. As they point out, this is not a new idea, but what they have done here is show that it is possible to have mathematically consistent versions of this that do not trivially reduce to fermions and bosons and can survive in, say, 3 spatial dimensions. They argue that low energy excitations (quasiparticles) of some quantum spin systems can have these properties.
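The distribution function above is easy to evaluate directly, and doing so shows both the statement about dilute gases and the fermion/boson contrast. A minimal sketch:

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant, eV/K

def fermi_dirac(eps, mu, T):
    """f = 1/(exp((eps-mu)/kT) + 1); occupancy between 0 and 1."""
    return 1.0 / (math.exp((eps - mu) / (K_B * T)) + 1.0)

def bose_einstein(eps, mu, T):
    """f = 1/(exp((eps-mu)/kT) - 1); only defined for eps > mu,
    and the occupancy diverges as eps -> mu (the BEC pile-up)."""
    return 1.0 / (math.exp((eps - mu) / (K_B * T)) - 1.0)

# At eps = mu, the fermion occupancy is exactly 1/2 ...
print(fermi_dirac(0.5, 0.5, 300))  # 0.5
# ... while in the dilute limit (eps - mu >> kT), both reduce to the
# classical Maxwell-Boltzmann factor exp(-(eps-mu)/kT), far below 1,
# which is why quantum statistics stop mattering there.
print(fermi_dirac(1.0, 0.5, 300))
print(bose_einstein(1.0, 0.5, 300))
```

With \(\epsilon - \mu = 0.5\) eV and \(k_{\mathrm{B}}T \approx 0.026\) eV at 300 K, both occupancies come out around \(10^{-9}\) and are numerically indistinguishable, exactly the dilute regime described in the text.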
That's cool but not necessarily surprising - there are quasiparticles in condensed matter systems that are argued to obey a variety of exotic relations originally proposed in the world of high energy theory (Weyl fermions, Majorana fermions, massless Dirac fermions). They also put forward the possibility that elementary particles could obey these statistics as well. (Ideas transferring from condensed matter or AMO physics over to high energy are also not new; see the Anderson-Higgs mechanism, and the concept of unparticles, which has connections to condensed matter systems where electronic quasiparticles may not be well defined.)

[Fig. 1 from the paper, showing distribution functions for fermions, bosons, and the more exotic systems studied in the paper.]

Interestingly, the authors work out what the distribution function can look like for these exotic particles, as shown in Fig. 1. The left panel shows how many particles can occupy a single-particle spatial state for fermions (zero or one), bosons (up to \(\infty\)), and funky parastatistics-obeying particles of different types. The right panel shows the distribution functions for these cases. I think this is very cool. When I've taught statistical physics to undergrads, I've told the students that no one has written down a general distribution function for systems like this. Guess I'll have to revise my statements on this!
According to this article at Politico, there was an all-hands meeting at NSF today (at least for the engineering directorate) where staff were told that there will be layoffs of 25-50% over the next two months. This is an absolute catastrophe if it is accurately reported and comes to pass. NSF is already understaffed. This goes far beyond anything involving DEI, and is essentially a declaration that the US is planning to abrogate the federal role in supporting science and engineering research. Moreover, I strongly suspect that if this conversation is being had at NSF, it is likely also being had at DOE and NIH. I don't even know how to react to this, beyond encouraging my fellow US citizens to call their representatives and senators and make it clear that this would be an unmitigated disaster. Update: it looks like the presidential budget request will be for a 2/3 cut to the NSF. Congress often goes against such recommendations, but this is certainly an indicator of what the executive branch wants.
More in science
Every year, AI models get better at thinking. Could they possibly be capable of feeling? And if they are, how would we know?
Rare and powerful compounds, known as keystone molecules, can build a web of invisible interactions among species. The post A New, Chemical View of Ecosystems first appeared on Quanta Magazine
[Note that this article is a transcript of the video embedded above.] Lewis and Clark Lake, on the border between Nebraska and South Dakota, might not be a lake for much longer. Together with the dam that holds it back, the reservoir provides hydropower, flood control, and supports a robust recreational economy through fishing, boating, camping, birdwatching, hunting, swimming, and biking. All of that faces an existential threat from a seemingly innocuous menace: dirt. Around 5 million tons of it flows down this stretch of the Missouri River every year until it reaches the lake, where it falls out of suspension. Since the 1950s, when the dam was built, the sand and silt have built up a massive delta where the river comes in. The reservoir has already lost about 30 percent of its storage capacity, and one study estimated that, by 2045, it will be half full of sediment. On the surface, this seems like a silly problem, almost elementary. It’s just dirt! But I want to show you why it’s a slow-moving catastrophe with implications that span the globe. And I want you to think of a few solutions to it off the top of your head, because I think you’ll be surprised to learn why none of the ones we’ve come up with so far are easy. I’m Grady, and this is Practical Engineering. I want to clarify that the impacts dams have on sediment movement happen on both sides. Downstream, the impacts are mostly environmental. We think of rivers as carriers of water; it’s right there in the definition. But if you’ve ever seen a river that looks like chocolate milk after a storm, you already know that they are also major movers of sediment. And the natural flow of sediment has important functions in a river system. It transports nutrients throughout the watershed. It creates habitat in riverbeds for fish, amphibians, mammals, reptiles, birds, and a whole host of invertebrates. 
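The scale of the 5-million-tons-per-year figure is easier to grasp as a volume. A rough sketch of the arithmetic; the sediment inflow is from the transcript, but the bulk density and the reservoir capacity below are illustrative assumptions, not figures from the video:

```python
# Back-of-the-envelope reservoir sedimentation arithmetic.
SEDIMENT_TONS_PER_YEAR = 5e6   # inflow at Lewis and Clark Lake (from the text)
BULK_DENSITY_T_PER_M3 = 1.5    # assumed settled sand/silt bulk density
ASSUMED_CAPACITY_M3 = 4e8      # hypothetical reservoir volume, m^3

volume_per_year = SEDIMENT_TONS_PER_YEAR / BULK_DENSITY_T_PER_M3
years_to_half_full = 0.5 * ASSUMED_CAPACITY_M3 / volume_per_year

print(f"incoming sediment: {volume_per_year:.2e} m^3/yr")
print(f"years to fill half the assumed capacity: {years_to_half_full:.0f}")
```

Millions of cubic meters per year against a fixed storage volume: timescales of decades fall out almost regardless of the exact capacity you plug in, which is consistent with a 1950s dam being substantially silted in by the 2040s.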
It fertilizes floodplains, stabilizes river banks, and creates deltas and beaches on the coastline that buffer against waves and storms. Robbing the supply of sediment from a river can completely alter the ecosystem downstream from a dam. But if a river is more than just a water carrier, a reservoir is more than just a water collector. And, of course, I built a model to show how this works. This is my acrylic flume. If you’re familiar with the channel, you’ve probably seen it in action before. I have it tilted up so we get two types of flow. On the right, we have a stream of fast-moving water to simulate a river, and on the left, I’ve built up a little dam. These stoplogs raise the level of the water, slowing it down to a gentle crawl. And there’s some mica powder in the water, so you can really see the difference in velocity. Now let’s add some sediment. I bought these bags of colored sand, and I’m just going to dump them in the sump where my pump is recirculating this water through the flume. And watch what happens in the time lapse. The swift flow of the river carries the sand downstream, but as soon as it transitions into the slow flow of the reservoir, it starts to fall out of suspension. It’s a messy process at first. The sand kind of goes all over the place. But slowly, you can see it start to form a delta right where the river meets the reservoir. Of course, the river speeds up as it climbs over the delta, so the next batch of sediment doesn’t fall out until it’s on the downstream end. And each batch of sand that I dump into the pump just adds to it. The mass of sediment just slowly fills the reservoir, marching toward the dam. This looks super cool. In fact, I thought it was such a nice representation that I worked with an illustrator to help me make a print of it. We’re only going to print a limited run of these, so there's a link to the store down below if you want to pick one up.
But, even though it looks cool, I want to be clear that it’s not a good thing. Some dams are built intentionally to hold sediment back, but in the vast majority of cases, this is an unwanted side effect of impounding water within a river valley. For most reservoirs, the whole point is to store water - for controlling floods, generating electricity, drinking, irrigation, cooling power plants, etc. So, as sediment displaces more and more of the reservoir volume, the value that reservoir provides goes down. And that’s not the only problem it causes. Making reservoirs shallower limits their use for recreation by reducing the navigable areas and fostering more unwanted algal blooms. Silt and sand can clog up gates and outlets to the structure and damage equipment like turbines. Sediment can even add forces to a dam that might not have been anticipated during design. Dirt is heavier than water. Let me prove that to you real quick. It’s a hard enough job to build massive structures that can hold back water, and sediment only adds to the difficulty. But I think the biggest challenge of this issue is that it’s inevitable, right? There are no natural rivers or streams that don’t carry some sediments along with them. The magnitude does vary by location. The world’s a big place, and for better or worse, we’ve built a lot of dams across rivers. There are a lot of factors that affect how quickly this truly becomes an issue at a reservoir, mostly things that influence water-driven erosion on the land upstream. Soil type is a big one; sandy soils erode faster than silts and clays (that’s why I used sand in the model). Land use is another big one. Vegetated areas like forests and grasslands hold onto their soil better than agricultural land or areas affected by wildfires. But in nearly all cases, without intervention, every reservoir will eventually fill up. 
Of course, that’s not good, but I don’t think there’s a lot of appreciation outside of a small community of industry professionals and activists for just how bad it is. Dams are among the most capital-intensive projects that we humans build. We literally pour billions of dollars into them, sometimes just for individual projects. This is kind of its own can of worms, but I’m just speaking generally that society often accepts pretty significant downsides in addition to the monetary costs, like environmental impacts and the risk of failure to downstream people and property in return for the enormous benefits dams can provide. And sedimentation is one of those problems that happens over a lifetime, so it’s easy at the beginning of a project to push it off to the next generation to fix. Well, the heyday of dam construction was roughly the 1930s through the 70s. So here we are starting to reckon with it, while being more dependent than ever on those dams. And there aren’t a lot of easy answers. To some extent, we consider sediment during design. Modern dams are built to withstand the forces, and the reservoir usually has what’s called a “dead pool,” basically a volume that is set aside for sediment from the beginning. Low-level gates sit above the dead pool so they don’t get clogged. But that’s not so much a solution as a temporary accommodation since THIS kind of deadpool doesn’t live forever. I think for most, the simplest idea is this: if there’s dirt in the lake, just take it out. Dredging soil is really not that complicated. We’ve been doing it for basically all of human history. And in some cases, it really is the only feasible solution. You can put an excavator on a barge, or a crane with a clamshell bucket, and just dig. Suction dredgers do it like an enormous vacuum cleaner, pumping the slurry to a barge or onto shore. But that word feasible is the key. 
The whole secret of building a dam across a valley is that you only have to move and place a comparatively small amount of material to get a lot of storage. Depending on the topography and design, every unit of volume of earth or concrete that makes up the dam itself might yield hundreds to tens of thousands of times that volume of storage in the reservoir. But for dredging, it’s one-to-one. For every cubic meter of storage you want back, you have to remove a cubic meter of soil from the reservoir. At that point, it’s just hard for the benefits to outweigh the costs. There’s a reason we don’t usually dig enormous holes to store large volumes of water. I mean, there are a lot of reasons, but the biggest one is just cost. Those 5 million tons of sediment that flow into Lewis and Clark Reservoir would fill around 200,000 end-dump semi-trailers. That’s every year, and it’s assuming you dry it out first, which, by the way, is another challenge of dredging: the spoils aren’t like regular soil. For one, they’re wet. That water adds volume to the spoils, meaning you have more material to haul away or dispose of. It also makes the spoils difficult to handle and move around. There are a lot of ways to dry them out, or “dewater” them as the pros say. One of the most common is to pump spoils into geotubes, large fabric bags that hold the soil inside while letting the water slowly flow out. But it’s still extra work. And for two, sometimes sediments can be contaminated with materials that have washed off the land upstream. In that case, they require special handling and disposal. Many countries have pretty strict environmental rules about dredging and disposal of spoils, so you can see how it really isn’t a simple solution to sedimentation, and in most cases it just isn’t worth the cost. Another option for getting rid of sediment is just letting it flow through the dam.
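The 200,000-trailer figure above checks out with simple division. A quick sketch; the ~25-ton payload is a typical legal-limit assumption for an end-dump semi-trailer, not a figure from the video:

```python
# Hauling arithmetic for dredged sediment.
SEDIMENT_TONS_PER_YEAR = 5e6
TONS_PER_TRAILER = 25  # assumed payload of one end-dump semi-trailer

trailers_per_year = SEDIMENT_TONS_PER_YEAR / TONS_PER_TRAILER
print(f"{trailers_per_year:.0f} trailer loads per year")  # ~200,000
print(f"{trailers_per_year / 365:.0f} loads per day, every day")
```

That works out to more than 500 fully loaded trucks leaving the reservoir every single day, indefinitely, and that's before accounting for the extra water weight in undried spoils.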
This is ideal because, as I mentioned before, sediment serves a lot of important functions in a river system. If you can let it continue on its journey downstream, in many ways, you’ve solved two problems in one, and there are a lot of ways to do this. Some dams have a low-level outlet that consistently releases the turbid water that reaches the dam. But if you remember back to the model, not all of it does. In fact, in most cases, the majority of sediment deposits furthest from the dam, and most of it doesn’t reach the dam until the reservoir is pretty much full. Of course, my model doesn’t tell the whole story; it’s basically a 2D example with only one type of soil. As with all sediment transport phenomena, things are always changing. In fact, I decided to leave the model running with a time-lapse just to see what would happen. You can really get a sense of how dynamic this process can be. Again, it’s a very cool demonstration. But in most cases, much of the sediment that deposits in a reservoir is pretty much going to stay where it falls, or take years and years before it reaches the dam. So, another option is to flush the reservoir. Just set the gates wide open to get the velocity of water fast enough to loosen and scour the sediment, resuspending it so it can move downstream. I tried this in the model, and it worked pretty well. But again, this is just a 2D representation. In a real reservoir that has width, flushing usually just creates a narrow channel, leaving most of the sediment in place. And, inevitably, this requires drawing down the reservoir, essentially wasting all the water. And more importantly than that, it sends a massive plume of sediment-laden water downstream. I’ve harped on the fact that we want sediment downstream of dams and that’s where it naturally belongs, but you can overdo it. Sediment can be considered a pollutant, and in fact, it’s regulated in the US as one. That’s why you see silt fences around construction sites.
So the challenge of releasing sediment from a dam is to match the rate and quantity to what it would be if the dam wasn’t there. And that’s a very tough thing to do because of how variable those rates can be, because sediment doesn’t flow the same in a reservoir as it would in a river, because of the constraints it puts on operations (like the need to draw reservoirs down) and because of the complicated regulatory environment surrounding the release of sediments into natural waterways. The third major option for dealing with the problem is just reducing the amount of sediment that makes it to a reservoir in the first place. There are some innovations in capturing sediment upstream, like bedload interceptors that sit in streams and remove sediment over time. You can fight fire with fire by building check dams to trap sediment, but then you’ve just solved reservoir sedimentation by creating reservoir sedimentation. As I mentioned, those sediment loads depend a lot not only on the soil types in the watershed, but also on the land use or cover. Soil conservation is a huge field, and has played a big role in how we manage land in the US since the Dust Bowl of the 1930s. We have a whole government agency dedicated to the problem and a litany of strategies that reduce erosion, and many other countries have similar resources. A lot of those strategies involve maintaining good vegetation, preventing wildfires, good agricultural practices, and reforestation. But you have to consider the scale. Watersheds for major reservoirs can be huge. Lewis and Clark Reservoir’s catchment is about 16,000 square miles (41,000 square kilometers). That’s larger than all of Maryland! Management of an area that size is a complicated endeavor, especially considering that you have to do it over a long duration. So in many cases, there’s only so much you can do to keep sediment at bay. And really, that’s just an overview. 
I use Lewis and Clark Reservoir as an example, but like I said, this problem extends to essentially every on-channel reservoir across the globe. And the scope of the problem has created a huge variety of solutions I could spend hours talking about. And I think that’s encouraging. Even though most of the solutions aren’t easy, it doesn’t mean we can’t have infrastructure that’s sustainable over the long term, and the engineering lessons learned from past shortsightedness have given us a lot of new tools to make the best use of our existing infrastructure in the future.
A new proof extends the work of the late Maryam Mirzakhani, cementing her legacy as a pioneer of alien mathematical realms. The post Years After the Early Death of a Math Genius, Her Ideas Gain New Life first appeared on Quanta Magazine
Remember CRISPR (clustered regularly interspaced short palindromic repeats) – that new gene-editing system which is faster and cheaper than anything that came before it? CRISPR is derived from bacterial systems that use guide RNA to target a specific sequence on a DNA strand. It is coupled with a Cas (CRISPR Associated) protein which can do […] The post The New TIGR-Tas Gene Editing System first appeared on NeuroLogica Blog.