More from nanoscale views
Some not-actively-discouraging news out of Washington, DC yesterday: The Senate appropriations committee is doing its markups of the various funding bills (which all technically originated in the House), and it appears that they have pushed to keep the funding for NASA and NSF (which are bundled in the same bill with the Department of Justice for no obvious reason) at FY24 levels. See here as well. This is not yet a done deal within the Senate, but it's better than many alternatives. If you are a US citizen or permanent resident and one of your senators is on the appropriations committee, please consider calling them to reinforce how devastating massive budget cuts to these agencies would be. I am told that feedback to any other senators is also valuable, but appropriators are particularly important here.

The House appropriations committee has not yet met to mark up its versions. It had been scheduled to do so earlier this week but punted to an unspecified later date. The relevant subcommittee membership is here. Again, if you are a constituent of one of these representatives, your calls would be particularly important, though it doesn't hurt for anyone to make their views heard to their representative. If the House version aligns with the presidential budget request, then a compromise between the two might still lead to 30% cuts to NSF and NASA, which would (IMO) still be catastrophic for the agencies and for US science and competitiveness.

This is a marathon, not a sprint, and there are still many looming difficulties. Staffing cuts are well underway. Spending of already-appropriated funds at agencies like NSF is way down, raising the possibility that the executive branch may just order (or not-order-but-effectively-order) agencies not to spend and then claw back the funds. This year and in future years, the executive could decide to underspend appropriations, knowing that any legal resistance will take years and cost a fortune to work its way through the courts. The appropriations battle is also an annual affair: even if the cuts are forestalled for now (it is unlikely that the executive would veto all the spending bills over science agency cuts), this fight would have to happen again next year, and so on.

Still, right now there is an opportunity to push against funding cuts. Failing to try would be a surrender. (Obligatory notice: yes, I know that there are large-scale budgetary challenges facing the US; I don't think destroying government investment in science and engineering research is an intelligent set of spending cuts.)
Here are a number of items from the past week or so that I think readers of this blog might find interesting:

Essentially all the news pertaining to US federal funding of science continues to be awful. This article from Science summarizes the situation well, as does this from The Guardian and this editorial in the Washington Post. I do like the idea of a science fair of cancelled grants as a way to get the attention of allegedly bipartisan appropriators and show just how bad the consequences of the proposed cuts would be.

On a more uplifting note, mathematicians have empirically demonstrated a conjecture originally made by John Conway: that it is possible to make a tetrahedral pyramid that, under gravity, has only one stable orientation. Quanta has a nice piece on this with a cool animated gif, and here is the actual preprint. It's all about mass distributions and moments of inertia about edges. As others (including the authors) have pointed out, this could be quite useful for situations like recent lunar lander attempts that seem to have a difficult time not tipping over.

A paper last week in Nature uses photons and a microcavity to test how long it takes photons to tunnel through a classically forbidden region. In this setup, it is mathematically legitimate to model the photons as if they have an effective mass, and one can model the barrier they need to traverse in terms of an effective potential energy. Classically, if the kinetic energy of the particle of interest is less than the potential energy of the barrier, the particle is forbidden inside the barrier. I've posted about the issue of tunneling time repeatedly over the years (see here for a 2020 post containing links), because I think it's a fascinating problem both conceptually and as a puzzle for experimentalists (how does one truly do a fair test of this?). The take-away from this paper is that the more classically forbidden the motion, the shorter the deduced tunneling time; this has been seen in other experiments testing the idea (see the toy calculation after this roundup). A key element of novelty in the new paper is the authors' claim that the present experiment cannot reasonably be modeled by Bohmian mechanics. I'd need to read this in more depth to better understand it, as I had thought that Bohmian mechanics applied to problems like this is generally indistinguishable in its predictions from conventional quantum mechanics, basically by design.

In other non-condensed-matter news, there is an interstellar comet transiting the solar system right now. This is very cool - it's only the third such object detected by humans, but to be fair we've only really been looking for a few years. This suggests that moderately sized hunks of material are likely passing through from interstellar space all the time, and the Vera C. Rubin Observatory will detect a boatload of them. My inner science fiction fan is hoping that the object changes its orbit at perihelion by mysterious means.

This week is crunch time for a final push on US congressional appropriators to try to influence science agency budgets in FY26. I urge you to reach out if this matters to you. Likewise, I think it's more than reasonable to ask Congress why the NSF is getting kicked out of its headquarters with no plan for an alternative agency location, so that the HUD secretary can have a palatial second home in that building.
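For the curious, here is a minimal sketch of the tunneling-time trend mentioned above. This is the standard textbook calculation (the Wigner phase time for a 1D rectangular barrier), not the Nature paper's actual analysis, and the parameter values (barrier width d, energy E, heights V0) are arbitrary choices of mine in natural units:

```python
import numpy as np

# Toy model of the tunneling-time question: Wigner phase time for a particle
# tunneling through a 1D rectangular barrier. Textbook physics only, NOT the
# experiment's model; hbar = m = 1 and all numbers are illustrative.
hbar = m = 1.0
d, E = 2.0, 1.0   # barrier width and particle energy (assumed values)

def t_amp(E, V0):
    """Transmission amplitude with the free-propagation phase exp(-i*k*d) removed."""
    k = np.sqrt(2 * m * E) / hbar               # wavevector outside the barrier
    kappa = np.sqrt(2 * m * (V0 - E)) / hbar    # decay constant inside (E < V0)
    return 1.0 / (np.cosh(kappa * d)
                  + 1j * (kappa**2 - k**2) / (2 * k * kappa) * np.sinh(kappa * d))

def phase_time(V0, dE=1e-6):
    """tau = hbar * d(arg t)/dE, via a centered finite difference in energy."""
    dphi = np.angle(t_amp(E + dE, V0) * np.conj(t_amp(E - dE, V0)))
    return hbar * dphi / (2 * dE)

for V0 in (1.5, 2.0, 4.0, 8.0):
    print(f"V0 = {V0:4.1f}  ->  phase time = {phase_time(V0):.3f}")
# The deduced time falls as the barrier gets higher (more classically
# forbidden), and it saturates with barrier width - the Hartman effect.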
I participated in a program about 15 years ago that looked at science and technology challenges faced by a subset of the US government. I came away thinking that such problems fall into three broad categories:

1. Actual science and engineering challenges, which require foundational research and creativity to solve.
2. Technology that may be fervently desired but is incompatible with the laws of nature, economic reality, or both.
3. Alleged science and engineering problems that are really human/sociology issues.

Part of science and engineering education and training is giving people the skills to recognize which problems belong to which categories. Confusing them can strongly shape the perception of whether science and engineering research is making progress.

There has been a lot of discussion in the last few years about whether scientific progress (however that is measured) has slowed down or stagnated. For example, see here:

https://www.theatlantic.com/science/archive/2018/11/diminishing-returns-science/575665/
https://news.uchicago.edu/scientific-progress-slowing-james-evans
https://www.forbes.com/sites/roberthart/2023/01/04/where-are-all-the-scientific-breakthroughs-forget-ai-nuclear-fusion-and-mrna-vaccines-advances-in-science-and-tech-have-slowed-major-study-says/
https://theweek.com/science/world-losing-scientific-innovation-research

A lot of the recent talk is prompted by this 2023 study, which argues that despite the world having many more researchers than ever before (behold population growth) and more global investment in research, "disruptive" innovations are somehow coming less often, or are fewer and farther between these days. (Whether this is an accurate assessment is not a simple matter to resolve; more on this below.) There is a whole tech bro culture that buys into this, however. For example, see this interview from last week in the New York Times with Peter Thiel, which points out that Thiel has been complaining about this for a decade and a half.

On some level, I get it emotionally. The unbounded future spun in a lot of science fiction seems very far away. Where is my flying car? Where is my jet pack? Where is my moon base? Where are my fusion power plants, my antigravity machine, my tractor beams, my faster-than-light drive? Why does the world today somehow not seem that different from the world of 1985, while the world of 1985 seems very different from that of 1945?

Some of the folks who buy into this think that science is deeply broken somehow - that we've screwed something up, because we are not getting the future they think we were "promised". Some of them hold this as an internal justification for dismantling the NSF, the NIH, and a huge swath of the research ecosystem in the US. These same people would likely say that I am part of the problem, and that I can't be objective about this because the whole research ecosystem as it currently exists is a groupthink self-reinforcing spiral of mediocrity.

Science and engineering are inherently human ventures, and I think a lot of these concerns have an emotional component. My take at the moment is this:

Genuinely transformational breakthroughs are rare. They often require a combination of novel insights, previously unavailable technological capabilities, and luck. They don't come on a schedule. There is no hard and fast rule that guarantees continuous exponential technological progress. Indeed, in real life, exponential growth regimes never last. The 19th and 20th centuries were special.
If we think of research as a quest for understanding, it's inherently hierarchical. Civilizational collapses aside, you can only discover how electricity works once. You can only discover the germ theory of disease, the nature of the immune system, and vaccination once (though in the US we appear to be trying really hard to test that by forgetting everything). You can only discover quantum mechanics once, and doing so doesn't imply that there will be an ongoing (infinite?) chain of discoveries of similar magnitude.

People are bad at accurately perceiving rare events and their consequences, just as people have a serious problem evaluating risk or telling the difference between correlation and causation. We can't always recognize breakthroughs when they happen. Sure, I don't have a flying car. I do have a device in my pocket that weighs only a few ounces, gives me near-instantaneous access to the sum total of human knowledge, lets me video call people around the world, can monitor aspects of my fitness, and makes it possible for me to watch sweet videos about dogs. The argument that we don't have transformative, enormously disruptive breakthroughs as often as we used to, or as often as we "should", is in my view based quite a bit on perception.

Personally, I think we still have a lot more to learn about the natural world. AI tools will undoubtedly be helpful in making progress in many areas, but I think it is definitely premature to argue that the vast majority of future advances will come from artificial superintelligences, and thus that we can go ahead and abandon the strategies that got us the remarkable achievements of the last few decades.

I think some of the loudest complainers about perceived slowing advancement (Thiel, for example) are software people. People who come from the software development world don't always appreciate that physical infrastructure and understanding are hard, and that there are not always clever or even brute-force ways to get to an end goal. Solving foundational problems in molecular biology or quantum information hardware or photonics or materials is not the same as software development. (The tech folks generally know this on an intellectual level, but I don't think all of them really understand it in their guts. That's why so many of them seem to ignore real-world physical constraints when talking about AI.) Trying to apply software-development-inspired approaches to science and engineering research isn't bad as a component of a many-pronged strategy, but alone it may not give the desired results - as warned in part by this piece in Science this week.

More frequent breakthroughs in our understanding and capabilities would be wonderful. I don't think dynamiting the US research ecosystem is the way to get us there, and hoping that we can dismantle everything because AI will somehow herald a new golden age seems premature at best.
The basis for much of modern electronics is a set of silicon technologies called CMOS, which stands for complementary metal-oxide-semiconductor devices and processes. "Complementary" means using a semiconductor (typically silicon) that is locally chemically doped so that you can have both n-type (carriers are negatively charged electrons in the conduction band) and p-type (carriers are positively charged holes in the valence band) material on the same substrate. With field-effect transistors (using oxide gate dielectrics), you can make very compact, comparatively low-power devices like inverters and logic gates.

There are multiple approaches to implementing quantum information processing in solid-state platforms, with the idea that the scaling lessons of microelectronics (in terms of device density and reliability) can be applied. I think that essentially all of these avenues require cryogenic operating conditions: superconducting qubits need ultracold conditions both for superconductivity and to minimize extraneous quasiparticles and other decoherence sources, and semiconductor-based quantum dots (Intel's favorite) similarly need thermal perturbations and decoherence to be minimized. The wealth of solid-state quantum computing research is the driver of the historically enormous (to me, anyway) growth of dilution refrigerator manufacturing (see my last point here).

So you eventually want to have thousands of error-corrected logical qubits at sub-Kelvin temperatures, which may involve millions of physical qubits at sub-Kelvin temperatures, all of which need to be controlled. Despite the absolute experimental fearlessness of people like John Martinis, you are not going to get this to work by running a million wires from room temperature into your dil fridge (see the toy wire count at the end of this post).

[Fig. 1 from here.]

The alternative people in this area have converged upon is to create serious CMOS control circuitry that can work at 4 K or below, so that a lot of the wiring does not need to go from the qubits all the way to room temperature. The materials and device engineering challenges in doing this are substantial! Power dissipation really needs to be minimized, and material properties at cryogenic conditions are not the same as those optimized for room temperature. There have been major advances in this - examples include Google in 2019, Intel in 2021, IBM in 2024, and, this week, folks at the University of New South Wales supported by Microsoft.

In this most recent work, the aspect that I find most impressive is that the CMOS electronics are essentially a serious logic-based control board operating at millikelvin temperatures right next to the chip with the qubits (in this case, spins in quantum dots). I'm rather blown away that this works, and with sufficiently low power dissipation that the fridge is happy. This is very impressive, and there is likely a very serious future in store for cryogenic CMOS.
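As a rough illustration of why putting control logic next to the qubits matters, here is a back-of-the-envelope wire count. Everything in it is an assumption of mine for illustration (the two-lines-per-qubit figure, the square-array crossbar picture), not a scheme from any of the papers above:

```python
import math

# Toy comparison: wiring a qubit array directly to room temperature versus
# addressing it with cold control circuitry sitting next to the qubits.
# All numbers are illustrative assumptions, not from the work discussed.

def direct_wires(n_qubits, lines_per_qubit=2):
    """Every qubit gets its own control lines running out of the fridge."""
    return lines_per_qubit * n_qubits

def crossbar_wires(n_qubits):
    """Row/column addressing of a square array by on-chip control logic,
    analogous to how a DRAM array is addressed: ~2*sqrt(N) shared lines."""
    side = math.isqrt(n_qubits)
    return 2 * side

for n in (1_000, 1_000_000):
    print(f"{n:>9,} qubits: {direct_wires(n):>9,} direct lines "
          f"vs ~{crossbar_wires(n):,} with cold multiplexing")
```

Even this cartoon makes the point: at a million qubits, direct wiring is millions of lines into the fridge, while multiplexed addressing by cryogenic electronics is thousands.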
As usual, I hope to write more about particular physics topics soon, but in the meantime I wanted to share a sampling of news items:

First, it's a pleasure to see new long-form writing about condensed matter subjects, in an era when science blogging has unquestionably shrunk compared to its heyday. The new Quantum Matters substack by Justin Wilson (and William Shelton) looks like it will be a fun place to visit often. Similar in spirit, I've also just learned about the Knowmads podcast (here on YouTube), put out by Prachi Garella and Bhavay Tyagi, two doctoral students at the University of Houston. Fun interviews with interesting scientists about their science and how they get it done.

There have been some additional news bits relevant to the present research-funding/university-government-relations mess. Earlier this week, 200 business leaders published an open letter about how slashing support for university research will seriously harm US economic competitiveness. More of this, please. I continue to be surprised by how quiet technology, pharma, and finance companies are being, at least in public. Crushing US university research in science and engineering will lead to serious personnel and IP shortages down the line, which is definitely bad for US standing. Again, now is the time to push back on legislators about the cuts mooted in the presidential budget request.

The would-be 15% indirect cost rate at NSF has been found to be illegal in a summary judgment released yesterday. (Brief article here, pdf of the ruling here.) Along these lines, there are continued efforts to develop proposals for reforming/altering indirect cost rates in a far less draconian manner, backed by collective organizations like the AAU and COGR. If you're interested in this, please go here, read the ideas, and give some feedback. (Note for future reference: the Joint Associations Group (JAG) may want to re-think their acronym. In local slang where I grew up, the word "jag" does not have pleasant connotations.)

The punitive attempt to prevent Harvard from taking international students has also been stopped, for now, in the courts.
More in science
[Note that this article is a transcript of the video embedded above.]

In the early 1900s, Seattle was a growing city hemmed in by geography. To the west was Puget Sound, a vital link to the Pacific Ocean. To the east, Lake Washington stood between the city and the farmland and logging towns of the Cascades. As the population grew, pressure mounted for a reliable east-west transportation route. But Lake Washington wasn't easy to cross. Carved by glaciers, the lake is deceptively deep, over 200 feet (60 meters) in some places. And under that deep water sits an even deeper problem: a hundred-foot layer of soft clay and mud. Building bridge piers all the way down to solid ground would have required staggeringly large supports. The cost and complexity made it infeasible to even consider.

But in 1921, an engineer named Homer Hadley proposed something radical: a bridge that didn't rest on the bottom at all. Instead, it would float on massive hollow concrete pontoons, riding on the surface like a ship. It took nearly two decades for his idea to gain traction, but with the New Deal and the Public Works Administration, new possibilities for transportation routes across the country began to open up. Federal funds flowed, and construction finally began on what would become the Lacey V. Murrow Bridge. When it opened in 1940, it was the first floating concrete highway of its kind, a marvel of engineering and a symbol of ingenuity under constraint. But floating bridges, by their nature, carry some unique vulnerabilities. And fifty years later, this span would be swallowed by the very lake it crossed.

In the decades since, the Seattle area has kind of become the floating concrete highway capital of the world. That's not an official designation, at least not yet, but there aren't that many of these structures around the globe, and four of the five longest ones on Earth are clustered in one small area of Washington state: Hood Canal, Evergreen Point, Lacey V. Murrow, and its neighbor, the Homer M. Hadley Memorial Bridge, named for the engineer who floated the idea in the first place. Washington has had some high-profile failures, but also some remarkable successes, including a test of light rail transit over a floating bridge just last month, in June 2025. It's a niche branch of engineering, full of creative solutions and unexpected stories. So I want to take you on a little tour of the hidden engineering behind them. I'm Grady, and this is Practical Engineering.

Floating bridges are basically as old as recorded history. It's not a complicated idea: place pontoons across a body of water, then span them with a deck. For thousands of years, this straightforward solution has provided a fast and efficient way to cross rivers and lakes, particularly in cases where permanent bridges were impractical or the need for a crossing was urgent. In fact, floating bridges have been most widely used in military applications, going all the way back to Xerxes crossing the Dardanelles in 480 BCE. They can be made portable, quick to erect, and flexible to a wide variety of situations, and they generally don't require a lot of heavy equipment. Countless designs have been used worldwide in various military engagements. But most floating bridges, both ancient and modern, weren't meant to last. They're quick to put up, but also quick to take out, either on purpose or by Mother Nature. They provide the means to get in, get across, and get out. So they aren't usually designed for extreme conditions.
Transitioning from temporary military crossings to permanent infrastructure was a massive leap, and it brought with it a host of engineering challenges. An obvious one is navigation. A bridge that floats on the surface of the water is, by default, a barrier to boats. So permanent floating bridges need to make room for maritime traffic. Designers have solved this in several ways, and Washington State offers a few good case studies.

The Evergreen Point Floating Bridge includes elevated approach spans on either end, allowing ships to pass beneath before the road descends to water level. The original Lacey V. Murrow Bridge took a different approach. Near its center, a retractable span could be pulled into a pocket formed by adjacent pontoons, opening a navigable channel. But not only did the movable span create interruptions to vehicle traffic on this busy highway, it also created awkward roadway curves that caused frequent accidents. The mechanism was eventually removed after the East Channel Bridge was replaced to increase its vertical clearance, providing boats with an alternative route between the two sides of Lake Washington. Further west, the Hood Canal Bridge incorporates truss spans for smaller craft, and it has hydraulic lift sections for larger ships. US Naval Base Kitsap is not far away, so sometimes the bridge even has to open for Navy submarines. The movable spans can raise vertically above the pontoons, while adjacent bridge segments slide back underneath. The system is flexible: one side can be opened for tall but narrow vessels, or both for wider ships.

But floating bridges don't just have to make room for boats. In a sense, they are boats. Many historical spans literally floated on boats lashed together. And that comes with its own complications. Unlike fixed structures, floating bridges are constantly interacting with water: waves, currents, and sometimes even tides and ice. They're easiest to implement on calm lakes or rivers with minimal flooding, but water is water, and it's a totally different type of engineering when you're not counting on firm ground to keep things in place.

We don't just stretch floating bridges across the banks and hope for the best. They're actually moored in place, usually by long cables and anchors, to keep materials from being overstressed and to prevent movements that would make the roadway uncomfortable or dangerous. Some anchors use massive concrete slabs placed on the lakebed. Others are tied to piles driven deep into the ground. In particularly deep water or soft soil, anchors are lowered to the bottom with water hoses that jet soil away, allowing the anchor to sink deep into the mud.

These anchoring systems do double duty, providing both structural integrity and day-to-day safety for drivers. But even with them, floating bridges have some unique challenges. They naturally sit low to the water, which means that in high winds, waves can crash directly onto the roadway, obscuring visibility and creating serious risks for road users. Motion from waves and wind can also cause the bridge to flex and shift beneath vehicles, which is especially unnerving for drivers unused to the sensation. In Washington State, all the major floating bridges have been closed at various times due to weather. The DOT enforces wind thresholds for each bridge; if the wind exceeds the threshold, the bridge is closed to traffic. Even if the bridge is structurally sound, these closures reflect the reality that in extreme weather, the bridge itself becomes part of the storm.
But we still haven't addressed the floating elephant in the pool here: the concrete pontoons themselves. Floating bridges have traditionally been made of wood or inflatable rubber, which makes sense if you're trying to stay light and portable. But permanent infrastructure demands something more durable. It might seem counterintuitive to build a buoyant structure out of concrete, but it's not as crazy as it sounds. In fact, civil engineering students compete every year in concrete canoe races hosted by the American Society of Civil Engineers.

Actually, I was doing a little recreational math to find a way to make this intuitive, and I stumbled upon a fun little fact. If you want to build a neutrally buoyant, hollow concrete cube, there's a neat rule of thumb you can use. Just take the wall thickness in inches, and that's your outer dimension in feet. Want 12-inch-thick concrete walls? You'll need a roughly 12-foot cube. This is only fun because of the imperial system, obviously. It's less exciting to say that the two dimensions have a roughly linear relationship with a factor of 12. And I guess it's not really that useful, except that it helps to visualize just how feasible it is to make concrete float.
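For the curious, here is the little calculation behind that rule of thumb, as a sketch. I'm assuming ordinary structural concrete with a specific gravity of about 2.4 (my assumption, not a number from the video): a sealed hollow cube of outer side L and wall thickness t is neutrally buoyant, fully submerged, when the weight of the concrete shell equals the weight of the water the whole cube displaces.

```python
# Neutral buoyancy for a sealed, hollow concrete cube, fully submerged:
#   rho_concrete * (L**3 - (L - 2*t)**3) = rho_water * L**3
# With x = t/L this reduces to (1 - 2*x)**3 = 1 - 1/SG, so L/t is a constant.
SG = 2.4                                   # specific gravity of concrete (assumed)
ratio = 2 / (1 - (1 - 1 / SG) ** (1 / 3))  # L/t at neutral buoyancy
print(f"L/t = {ratio:.1f}")                # ~12.2

t_inches = 12
L_feet = ratio * t_inches / 12             # 12 inches to a foot
print(f"{t_inches}-inch walls -> {L_feet:.1f}-foot cube")  # ~12 ft, as claimed
```

The ratio comes out near 12.2, so measuring t in inches and L in feet makes the two numbers land almost on top of each other, which is exactly the rule of thumb.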
Of course, real pontoons have to do more than just barely float themselves. They have to carry the weight of a deck and whatever crosses it with an acceptable margin of safety. That means they're built much larger than a neutrally buoyant box. But mass isn't the only issue. Concrete is a reliable material, and if you've watched the channel for a while, you know that there are a few things you can count on concrete to do, and one of them is to crack. That's usually not a big deal for a lot of structures, but it's a pretty big problem if you're trying to keep water out of a pontoon.

Designers put enormous effort into preventing leaks. Modern pontoons are subdivided into sealed chambers. Watertight doors are installed between the chambers so they can still be accessed and inspected. Leak detection systems provide early warnings if anything goes wrong. And piping is pre-installed with pumps on standby, so if a leak develops, the chambers can be pumped dry before disaster strikes. The concrete recipe itself gets extra attention. Specialized mixes reduce shrinkage, improve water resistance, and resist abrasion. Even temperature control during curing matters. For the replacement of the Evergreen Point Bridge, contractors embedded heating pipes in the base slabs of the pontoons, allowing them to match the temperature of the walls as they were cast. This enabled the entire structure to cool down at a uniform rate, reducing thermal stresses that could lead to cracking. There were also errors during construction, though. A flaw in the post-tensioning system led to millions of dollars in change orders halfway through construction and delayed the project significantly while they worked out a repair.

But there's a good reason why they were so careful to get the designs right on that project. Of the four floating bridges in Washington state, two of them have sunk. In February 1979, a severe storm caused the western half of the Hood Canal Bridge to lose its buoyancy. Investigations revealed that open hatches had allowed rain and waves to blow in, slowly filling the pontoons and ultimately sinking the western half of the bridge. The DOT had to run a temporary ferry service across the canal for nearly four years while the western span was rebuilt. Then, in 1990, it happened again.

This time, the failure occurred during rehabilitation work on the Lacey V. Murrow Bridge while it was closed. Contractors were using hydrodemolition (high-pressure water jets) to remove old concrete from the road deck. Because the water was considered contaminated, it had to be stored rather than released into Lake Washington. Engineers calculated that the pontoon chambers could hold the runoff safely. To accommodate that, they removed the watertight doors that normally separated the internal compartments. But when a storm hit over Thanksgiving weekend, water flooded into the open chambers. The bridge partially sank, severing cables on the adjacent Hadley Bridge and delaying the project by more than a year - a potent reminder that even small design or operational oversights can have major consequences on this type of structure.

And we still have a lot to learn. Recently, Sound Transit began testing light rail trains on the Homer Hadley Bridge, introducing a whole new set of engineering puzzles. One is electricity. With power running through the rails, there was concern about stray currents damaging the bridge. To prevent this, the track is mounted on insulated blocks, with drip caps to prevent water from creating a conductive path. And then there's the bridge movement. Unlike typical bridges, a floating bridge can roll, pitch, and yaw with weather, lake level, and traffic loads. The joints between the fixed shoreline and the bridge have to accommodate that movement. It's usually not an issue for cars, trucks, bikes, or pedestrians, but trains require very precise track alignment. Engineers had to develop an innovative "track bridge" system. It uses specialized bearings to distribute every kind of movement over a longer distance, keeping the tracks aligned even as the floating structure shifts beneath them. Testing in June went well, but there's more to be done before you can ride the Link light rail across a floating highway.

If floating bridges are the present, floating tunnels might be the future. I talked about immersed tube tunnels in a previous video. They're used around the world, made by lowering precast sections to the seafloor and connecting them underwater. But what if, instead of resting on the bottom, those tunnels floated in the water column? It should be possible to suspend a tunnel with negative buoyancy from surface pontoons, or to tether one with positive buoyancy to the bottom using anchors. In deep water, this could dramatically shorten tunnel lengths, reduce excavation costs, and minimize environmental impacts. Norway has actually proposed such a tunnel across a fjord on its western coast, a project that, if realized, would be the first of its kind. Like floating bridges before it, this tunnel will face a long list of unknowns. But that's the essence of engineering: meeting each challenge with solutions tailored to a specific place and need. There aren't many locations where floating infrastructure makes sense. The conditions have to be just right - calm waters, minimal ice, manageable tides. But where the conditions do allow, floating bridges and their hopefully future descendants open up new possibilities for connection, mobility, and engineering.
Scientists have been scrambling to make sense of a recent acceleration in warming, which may be attributable, they say, to changes in solar output or to shifts in cloud cover. A new study finds the biggest driver may be a drop in air pollution in East Asia, primarily in China. Read more on E360 →
How does a cell know when it’s been damaged? A molecular alarm, set off by mutated RNA and colliding ribosomes, signals danger. From Quanta Magazine: “RNA Is the Cell’s Emergency Alert System.”
Alaska’s Tongass is the world’s largest temperate rainforest and a sanctuary for wildlife. The Trump administration’s plan to rescind a rule banning roads in wild areas of National Forests would open untouched parts of the Tongass and other forests to logging and development. Read more on E360 →
Tony Tyson’s cameras revealed the universe’s dark contents. Now, with the Rubin Observatory’s 3.2-billion-pixel camera, he’s ready to study dark matter and dark energy in unprecedented detail. From Quanta Magazine: “The Biggest-Ever Digital Camera Is This Cosmologist’s Magnum Opus.”