Some not-actively-discouraging news out of Washington DC yesterday: The Senate appropriations committee is doing its markups of the various funding bills (which all technically originated in the House), and it appears that they have pushed to keep the funding for NASA and NSF (which are bundled in the same bill with the Department of Justice for no obvious reason) at FY24 levels. See here as well. This is not yet a done deal within the Senate, but it's better than many alternatives. If you are a US citizen or permanent resident and one of your senators is on the appropriations committee, please consider calling them to reinforce how devastating massive budget cuts to these agencies would be. I am told that feedback to any other senators is also valuable, but appropriators are particularly important here.

The House appropriations committee has not yet met to mark up their versions. They had been scheduled to do so earlier this week but punted it for an unknown time. Their relevant subcommittee membership is here. Again, if you are a constituent of one of these representatives, your calls would be particularly important, though it doesn't hurt for anyone to make their views heard to their representative. If the House version aligns with the presidential budget request, then a compromise between the two might still lead to 30% cuts to NSF and NASA, which would (IMO) still be catastrophic for the agencies and US science and competitiveness.

This is a marathon, not a sprint. There are still many looming difficulties - staffing cuts are well underway. Spending of already appropriated funds at agencies like NSF is way down, leading to the possibility that the executive branch may just order (or not-order-but-effectively-order) agencies not to spend and then claw back the funds. This year and in future years they could decide to underspend appropriations knowing that any legal resistance will take years and cost a fortune to work its way through the courts. This appropriations battle is also an annual affair - even if the cuts are forestalled for now (it is unlikely that the executive would veto all the spending bills over science agency cuts), this would have to happen again next year, and so on.

Still, right now, there is an opportunity to push against funding cuts. Failing to try would be a surrender. (Obligatory notice: yes, I know that there are large-scale budgetary challenges facing the US; I don't think destroying government investment in science and engineering research is an intelligent set of spending cuts.)
Here are a number of items from the past week or so that I think readers of this blog might find interesting:

- Essentially all the news pertaining to US federal funding of science continues to be awful. This article from Science summarizes the situation well, as does this from The Guardian and this editorial in the Washington Post. I do like the idea of a science fair of cancelled grants as a way to try to get allegedly bipartisan appropriators to notice just how bad the consequences of the proposed cuts would be.
- On a more uplifting note, mathematicians have empirically demonstrated a conjecture originally made by John Conway: that it is possible to make a tetrahedral pyramid that, under gravity, has only one stable orientation. Quanta has a nice piece on this with a cool animated gif, and here is the actual preprint about it. It's all about mass distributions and moments of inertia about edges. As others (including the authors) have pointed out, this could be quite useful for situations like recent lunar lander attempts, which seem to have a difficult time not tipping over.
- A paper last week in Nature uses photons and a microcavity to try to test how long it takes photons to tunnel through a classically forbidden region. In this setup, it is mathematically legit to model the photons as if they have an effective mass, and one can model the barrier they need to traverse in terms of an effective potential energy. Classically, if the kinetic energy of the particle of interest is less than the potential energy of the barrier, the particle is forbidden inside the barrier. (A minimal numerical illustration of this classically forbidden condition follows this list.) I've posted about the issue of tunneling time repeatedly over the years (see here for a 2020 post containing links), because I think it's a fascinating problem both conceptually and as a puzzle for experimentalists (how does one truly do a fair test of this?). The take-away from this paper is that the more classically forbidden the motion, the shorter the deduced tunneling time. This has been seen in other experiments testing this idea. A key element of novelty in the new paper is the authors' claim that the present experiment is not reasonably modeled by Bohmian mechanics. I'd need to read this in more depth to understand it better, as I had thought that Bohmian mechanics applied to problems like this is, basically by design, indistinguishable in its predictions from conventional quantum mechanics.
- In other non-condensed matter news, there is an interstellar comet transiting the solar system right now. This is very cool - it's only the third such object detected by humans, but to be fair we've only really been looking for a few years. This suggests that moderately sized hunks of material are likely passing through from interstellar space all the time, and the Vera C. Rubin Observatory will detect a boatload of them. My inner science fiction fan is hoping that the object changes its orbit at perihelion by mysterious means.
- This week is crunch time for a final push on US congressional appropriators to try to influence science agency budgets in FY26. I urge you to reach out if this matters to you. Likewise, I think it's more than reasonable to ask Congress why the NSF is getting kicked out of its headquarters with no plan for an alternative agency location, so that the HUD secretary can have a palatial second home in that building.
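To make the classically forbidden condition above concrete, here is a minimal sketch - not the authors' photonic-microcavity model, just the textbook rectangular barrier for an electron, with illustrative made-up parameters - showing how the evanescent decay constant inside the barrier and the transmission probability depend on how far the energy sits below the barrier height.

```python
import numpy as np

# Toy numbers, not from the paper: an electron with energy E hitting a
# rectangular barrier of height V0 > E and width L.
hbar = 1.054571817e-34      # J*s
m_e  = 9.1093837015e-31     # kg
eV   = 1.602176634e-19      # J

def kappa(E, V0, m=m_e):
    """Evanescent decay constant inside the classically forbidden region."""
    return np.sqrt(2 * m * (V0 - E)) / hbar

def transmission(E, V0, L, m=m_e):
    """Standard rectangular-barrier transmission for E < V0."""
    k = kappa(E, V0, m)
    return 1.0 / (1.0 + (V0**2 * np.sinh(k * L)**2) / (4 * E * (V0 - E)))

E = 1.0 * eV      # particle energy
L = 1.0e-9        # 1 nm barrier width
for V0 in (1.5 * eV, 3.0 * eV, 6.0 * eV):
    print(f"V0 = {V0/eV:.1f} eV: kappa = {kappa(E, V0):.3e} 1/m, "
          f"T = {transmission(E, V0, L):.3e}")
```

The steep drop in transmission as V0 - E grows is why the strongly forbidden regime is both experimentally hard and conceptually interesting for tunneling-time measurements.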
I participated in a program about 15 years ago that looked at science and technology challenges faced by a subset of the US government. I came away thinking that such problems fall into three broad categories:

- Actual science and engineering challenges, which require foundational research and creativity to solve.
- Technology that may be fervently desired but is incompatible with the laws of nature, economic reality, or both.
- Alleged science and engineering problems that are really human/sociology issues.

Part of science and engineering education and training is giving people the skills to recognize which problems belong to which categories. Confusing these can strongly shape the perception of whether science and engineering research is making progress.

There has been a lot of discussion in the last few years about whether scientific progress (however that is measured) has slowed down or stagnated. For example, see:

- https://www.theatlantic.com/science/archive/2018/11/diminishing-returns-science/575665/
- https://news.uchicago.edu/scientific-progress-slowing-james-evans
- https://www.forbes.com/sites/roberthart/2023/01/04/where-are-all-the-scientific-breakthroughs-forget-ai-nuclear-fusion-and-mrna-vaccines-advances-in-science-and-tech-have-slowed-major-study-says/
- https://theweek.com/science/world-losing-scientific-innovation-research

A lot of the recent talk is prompted by this 2023 study, which argues that despite the world having many more researchers than ever before (behold population growth) and more global investment in research, "disruptive" innovations are somehow coming less often, or are fewer and farther between these days. (Whether this is an accurate assessment is not a simple matter to resolve; more on this below.) There is a whole tech bro culture that buys into this, however. For example, see this interview from last week in the New York Times with Peter Thiel, which points out that Thiel has been complaining about this for a decade and a half.

On some level, I get it emotionally. The unbounded future spun in a lot of science fiction seems very far away. Where is my flying car? Where is my jet pack? Where is my moon base? Where are my fusion power plants, my antigravity machine, my tractor beams, my faster-than-light drive? Why does the world today somehow not seem that different from the world of 1985, while the world of 1985 seems very different from that of 1945?

Some of the folks who buy into this think that science is deeply broken somehow - that we've screwed something up, because we are not getting the future they think we were "promised". Some of these people have this as an internal justification underpinning the dismantling of the NSF, the NIH, and basically a huge swath of the research ecosystem in the US. These same people would likely say that I am part of the problem, and that I can't be objective about this because the whole research ecosystem as it currently exists is a groupthink self-reinforcing spiral of mediocrity.

Science and engineering are inherently human ventures, and I think a lot of these concerns have an emotional component. My take at the moment is this:

- Genuinely transformational breakthroughs are rare. They often require a combination of novel insights, previously unavailable technological capabilities, and luck. They don't come on a schedule.
- There is no hard and fast rule that guarantees continuous exponential technological progress. Indeed, in real life, exponential growth regimes never last. The 19th and 20th centuries were special.
- If we think of research as a quest for understanding, it's inherently hierarchical. Civilizational collapses aside, you can only discover how electricity works once. You can only discover the germ theory of disease, the nature of the immune system, and vaccination once (though in the US we appear to be trying really hard to test that by forgetting everything). You can only discover quantum mechanics once, and doing so doesn't imply that there will be an ongoing (infinite?) chain of discoveries of similar magnitude.
- People are bad at accurately perceiving rare events and their consequences, just as people have a serious problem evaluating risk or telling the difference between correlation and causation.
- We can't always recognize breakthroughs when they happen. Sure, I don't have a flying car. I do have a device in my pocket that weighs only a few ounces, gives me near-instantaneous access to the sum total of human knowledge, lets me video call people around the world, can monitor aspects of my fitness, and makes it possible for me to watch sweet videos about dogs.
- The argument that we don't have transformative, enormously disruptive breakthroughs as often as we used to, or as often as we "should", is in my view based quite a bit on perception.
- Personally, I think we still have a lot more to learn about the natural world. AI tools will undoubtedly be helpful in making progress in many areas, but I think it is definitely premature to argue that the vast majority of future advances will come from artificial superintelligences and that we can therefore go ahead and abandon the strategies that got us the remarkable achievements of the last few decades.
- I think some of the loudest complainers about perceived slowing advancement (Thiel, for example) are software people. People who come from the software development world don't always appreciate that physical infrastructure and understanding are hard, and that there are not always clever or even brute-force ways to get to an end goal. Solving foundational problems in molecular biology or quantum information hardware or photonics or materials is not the same as software development. (The tech folks generally know this on an intellectual level, but I don't think all of them really understand it in their guts. That's why so many of them seem to ignore real-world physical constraints when talking about AI.) Trying to apply software-development-inspired approaches to science and engineering research isn't bad as a component of a many-pronged strategy, but alone it may not give the desired results - as warned in part by this piece in Science this week.

More frequent breakthroughs in our understanding and capabilities would be wonderful. I don't think dynamiting the US research ecosystem is the way to get us there, and hoping that we can dismantle everything because AI will somehow herald a new golden age seems premature at best.
The basis for much of modern electronics is a set of silicon technologies called CMOS, which stands for complementary metal-oxide-semiconductor devices and processes. "Complementary" means using a semiconductor (typically silicon) that is locally chemically doped so that you can have both n-type (carriers are negatively charged electrons in the conduction band) and p-type (carriers are positively charged holes in the valence band) material on the same substrate. With field-effect transistors (using oxide gate dielectrics), you can make very compact, comparatively low-power devices like inverters and logic gates. (A toy logic-level sketch of this complementarity appears at the end of this post.)

There are multiple different approaches that try to implement quantum information processing in solid-state platforms, with the idea that the scaling lessons of microelectronics (in terms of device density and reliability) can be applied. I think that essentially all of these avenues require cryogenic operating conditions: superconducting qubits need ultracold conditions both for superconductivity and to minimize extraneous quasiparticles and other decoherence sources, and semiconductor-based quantum dots (Intel's favorite) similarly need thermal perturbations and decoherence to be minimized. The wealth of solid-state quantum computing research is the driver for the historically enormous (to me, anyway) growth of dilution refrigerator manufacturing (see my last point here).

So you eventually want to have thousands of error-corrected logical qubits at sub-kelvin temperatures, which may involve millions of physical qubits at sub-kelvin temperatures, all of which need to be controlled. Despite the absolute experimental fearlessness of people like John Martinis, you are not going to get this to work by running a million wires from room temperature into your dil fridge.

(Fig. 1 from here.)

The alternative that people in this area have converged upon is to create serious CMOS control circuitry that can work at 4 K or below, so that a lot of the wiring does not need to go from the qubits all the way to room temperature. The materials and device engineering challenges in doing this are substantial! Power dissipation really needs to be minimized, and device properties at cryogenic temperatures are not the same as those optimized for room-temperature operation. There have been major advances in this - examples include Google in 2019, Intel in 2021, IBM in 2024, and, this week, folks at the University of New South Wales supported by Microsoft.

In this most recent work, the aspect that I find most impressive is that the CMOS electronics are essentially a serious logic-based control board operating at millikelvin temperatures right next to the chip with the qubits (in this case, spins in quantum dots). I'm rather blown away that this works, and with sufficiently low power dissipation that the fridge is happy. This is very impressive, and there is likely a very serious future in store for cryogenic CMOS.
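Since "complementary" carries most of the weight in that first paragraph, here is a deliberately oversimplified, logic-level sketch (my own illustration; it ignores all the analog, thermal, and cryogenic device physics that make real cryo-CMOS hard) of why pairing p-type pull-up and n-type pull-down transistors gives compact gates with essentially no static current path.

```python
# Toy logic-level model of complementary switching. In a CMOS inverter the
# pMOS device conducts when the input is low and the nMOS device conducts
# when the input is high, so exactly one of the two is on in steady state
# and there is no direct path from the supply to ground between switching
# events; that is the origin of CMOS's low static power dissipation.

def cmos_inverter(vin: bool) -> bool:
    pmos_on = not vin          # pull-up network conducts when input is low
    nmos_on = vin              # pull-down network conducts when input is high
    assert pmos_on != nmos_on  # complementary: never both on in steady state
    return pmos_on             # output is high iff the pull-up is conducting

def cmos_nand(a: bool, b: bool) -> bool:
    # In real CMOS: two pMOS in parallel (pull-up), two nMOS in series
    # (pull-down). Logically, the gate computes NAND.
    return not (a and b)

if __name__ == "__main__":
    for vin in (False, True):
        print(f"inverter({vin}) -> {cmos_inverter(vin)}")
    for a in (False, True):
        for b in (False, True):
            print(f"nand({a}, {b}) -> {cmos_nand(a, b)}")
```

The same complementary idea underlies the cryogenic control logic described above; the hard part, as the post notes, is getting real devices to behave, and to dissipate little enough power, at 4 K and below.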
As usual, I hope to write more about particular physics topics soon, but in the meantime I wanted to share a sampling of news items:

- First, it's a pleasure to see new long-form writing about condensed matter subjects, in an era where science blogging has unquestionably shrunk compared to its heyday. The new Quantum Matters substack by Justin Wilson (and William Shelton) looks like it will be a fun place to visit often. Similar in spirit, I've also just learned about the Knowmads podcast (here on youtube), put out by Prachi Garella and Bhavay Tyagi, two doctoral students at the University of Houston. Fun interviews with interesting scientists about their science and how they get it done.
- There have been some additional news bits relevant to the present research funding/university-government relations mess. Earlier this week, 200 business leaders published an open letter about how slashing support for university research will seriously harm US economic competitiveness. More of this, please. I continue to be surprised by how quiet technology-related, pharma, and finance companies are being, at least in public. Crushing US science and engineering university research will lead to serious personnel and IP shortages down the line, which is definitely poor for US standing. Again, now is the time to push back on legislators about the cuts mooted in the presidential budget request.
- The would-be 15% indirect cost rate at NSF has been found to be illegal, in a summary court judgment released yesterday. (Brief article here, pdf of the ruling here.) Along these lines, there are continued efforts to develop proposals for how to reform/alter indirect cost rates in a far less draconian manner, backed by collective organizations like the AAU and COGR. If you're interested in this, please go here, read the ideas, and give some feedback. (Note for future reference: the Joint Associations Group (JAG) may want to re-think their acronym. In the local slang where I grew up, the word "jag" does not have pleasant connotations.)
- The punitive attempt to prevent Harvard from taking international students has also been stopped for now in the courts.