More from nanoscale views
The beginning of a RET poster session

Research Experience for Teachers (RET) programs are an example of the kind of programs the National Science Foundation funds that are focused on K-12 (and broader) education. This summer I hosted a high school physics teacher in my lab for six weeks, where he worked on a brief project with one of my doctoral students helping out in a mentoring role. Just yesterday was the big poster session for all of the participants in the program, and it was very enjoyable to talk with a whole cadre of high school science teachers from across the greater Houston area about their projects and their experiences.

Readers may be more familiar with the sibling Research Experience for Undergraduates (REU) programs, which give undergraduate students the chance to work for ten weeks or so in a lab that is very likely not at their home institution. REUs are a great way for students interested in research to get broad exposure to new topics, meet people, acquire new skills, and, for some, figure out whether they like research (and maybe which topics are exciting to them). The educational goal of REUs is clear: providing direct research experience to interested undergrads, ideally while advancing a research project and, for some small fraction of students, resulting in an eventual publication.

RET programs are different: they are intended as professional development. The teachers are exposed to new topics and, hopefully, a fun research environment, and they are encouraged to think carefully about how they can take the concepts they learn and translate those for the classroom. I am very much not an expert in education research, but there is evidence (see here, for example) that teachers who participate in these programs get a great deal of satisfaction and have lower attrition from the teaching profession. (Note that it's hard to do statistics well on questions like that, since the population of teachers who seek out opportunities like this may be a special subset of the total population of teachers.) An idea that makes sense to me: enhancing the motivation and job satisfaction of a teacher can have a larger cumulative impact on educating students than an individual research project for a single student.

It would be a great shame if RET and REU programs fell victim to large-scale cuts at NSF. The NSF is the only science agency with education as part of its mission (at least historically). All the more reason to try to persuade appropriators not to follow the draconian presidential budget request for the agency.
The US House and Senate appropriations subcommittees have now completed their markups of the bills relevant to the FY26 appropriations for NSF, NASA, and NIST. The AAAS has an interactive dashboard with current information here if you want to click and look at all the science-related agencies. Other agencies still need to go through the Senate subcommittees.

Just a reminder of how this is supposed to work: the House and Senate mark up their own versions of the detailed appropriations bills. In principle these are passed by each chamber (with the Senate versions, for practical purposes, requiring 60/100 votes of support because of the filibuster). Then a conference committee hashes out the differences between the bills, and the conference version of the bills is voted on by each chamber (again needing 60/100 votes to pass in the Senate). Finally, the president signs the spending bills. In the fantasy land of Schoolhouse Rock, which largely described events until the 1990s, these annual spending bills are supposed to be passed in time for the start of the new fiscal year on October 1. In practice, Congress has been deeply dysfunctional for years, and there have been a lot of continuing resolutions, late budgets, and mammoth omnibus spending bills.

To summarize:
NSF - House recommendation = $6.997B (a 20.7% cut from FY25); Senate = $9B (a 2% increase from FY25). These are in sharp contrast to the presidential budget request (PBR) of a 55.8% cut.
NASA - House = flat from FY25; Senate = $24.9B (a 0.2% increase).
NIST - House = $1.28B (a 10.6% increase from FY25); Senate = $1.6B (a 38.3% increase from FY25).
NOAA - House = $5.7B (a 28.3% increase from FY25); Senate = $6.1B (a 36.3% increase from FY25).
DOE has gone through the House, where the Office of Science is recommended for a 1.9% increase, in contrast to a 13.9% cut in the PBR.

If you are eligible and able to do so, please keep pushing. As I wrote a few days ago, this is a long-term project, since appropriations happen every year. While you're making your opinions known, it's also worth pressing representatives and senators to hold the agency leadership accountable for actually spending what Congress appropriates.

A science post soon....
Some not-actively-discouraging news out of Washington, DC yesterday: the Senate appropriations committee is doing its markups of the various funding bills (which all technically originated in the House), and it appears that they have pushed to keep the funding for NASA and NSF (which are bundled in the same bill with the Department of Justice for no obvious reason) at FY24 levels. See here as well. This is not yet a done deal within the Senate, but it's better than many alternatives. If you are a US citizen or permanent resident and one of your senators is on the appropriations committee, please consider calling them to reinforce how devastating massive budget cuts to these agencies would be. I am told that feedback to any other senators is also valuable, but appropriators are particularly important here.

The House appropriations committee has not yet met to mark up its versions. It had been scheduled to do so earlier this week but postponed the markup to an unspecified date. The relevant subcommittee membership is here. Again, if you are a constituent of one of these representatives, your calls would be particularly important, though it doesn't hurt for anyone to make their views heard to their representative. If the House version aligns with the presidential budget request, then a compromise between the two might still lead to 30% cuts to NSF and NASA, which would (IMO) still be catastrophic for the agencies and for US science and competitiveness.

This is a marathon, not a sprint. There are still many looming difficulties. Staffing cuts are well underway. Spending of already-appropriated funds at agencies like NSF is way down, raising the possibility that the executive branch may just order (or not-order-but-effectively-order) agencies not to spend and then claw back the funds. This year and in future years they could decide to underspend appropriations, knowing that any legal resistance will take years and cost a fortune to work its way through the courts. This appropriations battle is also an annual affair: even if the cuts are forestalled for now (it is unlikely that the executive would veto all the spending bills over science agency cuts), this would have to happen again next year, and so on. Still, right now there is an opportunity to push against funding cuts. Failing to try would be a surrender.

(Obligatory notice: yes, I know that there are large-scale budgetary challenges facing the US; I don't think destroying government investment in science and engineering research is an intelligent set of spending cuts.)
Here are a number of items from the past week or so that I think readers of this blog might find interesting:

Essentially all the news pertaining to US federal funding of science continues to be awful. This article from Science summarizes the situation well, as does this from The Guardian and this editorial in the Washington Post. I do like the idea of a science fair of cancelled grants as a way to try to get alleged bipartisan appropriator notice of just how bad the consequences of the proposed cuts would be.

On a more uplifting note, mathematicians have empirically demonstrated a conjecture originally made by John Conway, that it is possible to make a tetrahedron that, under gravity, has only one stable orientation. Quanta has a nice piece on this with a cool animated gif, and here is the actual preprint about it. It's all about mass distributions and moments of inertia about edges. As others, including the authors, have pointed out, this could be quite useful for situations like recent lunar lander attempts that seem to have a difficult time not toppling over.

A paper last week in Nature uses photons and a microcavity to try to test how long it takes photons to tunnel through a classically forbidden region. In this setup, it is mathematically legitimate to model the photons as if they have an effective mass, and one can model the barrier they need to traverse in terms of an effective potential energy. Classically, if the kinetic energy of the particle of interest is less than the potential energy of the barrier, the particle is forbidden inside the barrier (a brief textbook sketch of the quantum consequence appears at the end of this post). I've posted about the issue of tunneling time repeatedly over the years (see here for a 2020 post containing links), because I think it's a fascinating problem, both conceptually and as a puzzle for experimentalists (how does one truly do a fair test of this?). The take-away from this paper is that the more classically forbidden the motion, the shorter the deduced tunneling time. This has been seen in other experiments testing this idea. A key element of novelty in the new paper is the claim that the present experiment seems (according to the authors) not to be reasonably modeled by Bohmian mechanics. I'd need to read this in more depth to better understand it, as I had thought that Bohmian mechanics applied to problems like this is generally indistinguishable in its predictions from conventional quantum mechanics, basically by design.

In other non-condensed-matter news, there is an interstellar comet transiting the solar system right now. This is very cool: it's only the third such object detected by humans, but to be fair we've only really been looking for a few years. This suggests that moderately sized hunks of material are likely passing through from interstellar space all the time, and the Vera C. Rubin Observatory will detect a boatload of them. My inner science fiction fan is hoping that the object changes its orbit at perihelion by mysterious means.

This week is crunch time for a final push on US congressional appropriators to try to influence science agency budgets in FY26. I urge you to reach out if this matters to you. Likewise, I think it's more than reasonable to ask Congress why the NSF is getting kicked out of its headquarters with no plan for an alternative agency location, so that the HUD secretary can have a palatial second home in that building.
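To unpack "classically forbidden" a bit: here is a minimal textbook sketch, assuming the standard rectangular-barrier toy model rather than the paper's actual microcavity geometry. Take a particle of effective mass m and energy E incident on a barrier of height V_0 > E and width L (all generic symbols, not quantities from the paper). Inside the barrier the wavefunction does not oscillate; it decays evanescently, and the transmission probability falls off exponentially with barrier width:

\[
\psi(x) \propto e^{-\kappa x}, \qquad
\kappa = \frac{\sqrt{2m\,(V_0 - E)}}{\hbar}, \qquad
T \sim e^{-2\kappa L} \quad (\kappa L \gg 1).
\]

The contested question in the tunneling-time literature is not this decay itself but how to define and measure the time a particle spends in that forbidden region, which is what experiments like the one above try to probe.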
I participated in a program about 15 years ago that looked at science and technology challenges faced by a subset of the US government. I came away thinking that such problems fall into three broad categories: (1) actual science and engineering challenges, which require foundational research and creativity to solve; (2) technology that may be fervently desired but is incompatible with the laws of nature, economic reality, or both; and (3) alleged science and engineering problems that are really human/sociology issues. Part of science and engineering education and training is giving people the skills to recognize which problems belong to which categories. Confusing these can strongly shape the perception of whether science and engineering research is making progress.

There has been a lot of discussion in the last few years about whether scientific progress (however that is measured) has slowed down or stagnated. For example, see here:
https://www.theatlantic.com/science/archive/2018/11/diminishing-returns-science/575665/
https://news.uchicago.edu/scientific-progress-slowing-james-evans
https://www.forbes.com/sites/roberthart/2023/01/04/where-are-all-the-scientific-breakthroughs-forget-ai-nuclear-fusion-and-mrna-vaccines-advances-in-science-and-tech-have-slowed-major-study-says/
https://theweek.com/science/world-losing-scientific-innovation-research

A lot of the recent talk is prompted by this 2023 study, which argues that despite the world having many more researchers than ever before (behold population growth) and more global investment in research, somehow "disruptive" innovations are coming less often, or are fewer and farther between these days. (Whether this is an accurate assessment is not a simple matter to resolve; more on this below.) There is a whole tech-bro culture that buys into this, however. For example, see this interview from last week in the New York Times with Peter Thiel, which points out that Thiel has been complaining about this for a decade and a half.

On some level, I get it emotionally. The unbounded future spun in a lot of science fiction seems very far away. Where is my flying car? Where is my jet pack? Where is my moon base? Where are my fusion power plants, my antigravity machine, my tractor beams, my faster-than-light drive? Why does the world today somehow not seem that different from the world of 1985, while the world of 1985 seems very different from that of 1945?

Some of the folks who buy into this think that science is deeply broken somehow, that we've screwed something up, because we are not getting the future they think we were "promised". Some of these people have this as an internal justification underpinning the dismantling of the NSF, the NIH, and basically a huge swath of the research ecosystem in the US. These same people would likely say that I am part of the problem, and that I can't be objective about this because the whole research ecosystem as it currently exists is a groupthink self-reinforcing spiral of mediocrity.

Science and engineering are inherently human ventures, and I think a lot of these concerns have an emotional component. My take at the moment is this: Genuinely transformational breakthroughs are rare. They often require a combination of novel insights, previously unavailable technological capabilities, and luck. They don't come on a schedule. There is no hard and fast rule that guarantees continuous exponential technological progress. Indeed, in real life, exponential growth regimes never last. The 19th and 20th centuries were special.
If we think of research as a quest for understanding, it's inherently hierarchical. Civilizational collapses aside, you can only discover how electricity works once. You can only discover the germ theory of disease, the nature of the immune system, and vaccination once (though in the US we appear to be trying really hard to test that by forgetting everything). You can only discover quantum mechanics once, and doing so doesn't imply that there will be an ongoing (infinite?) chain of discoveries of similar magnitude.

People are bad at accurately perceiving rare events and their consequences, just as people have a serious problem evaluating risk or telling the difference between correlation and causation.

We can't always recognize breakthroughs when they happen. Sure, I don't have a flying car. I do have a device in my pocket that weighs only a few ounces, gives me near-instantaneous access to the sum total of human knowledge, lets me video call people around the world, can monitor aspects of my fitness, and makes it possible for me to watch sweet videos about dogs. The argument that we don't have transformative, enormously disruptive breakthroughs as often as we used to, or as often as we "should", is in my view based quite a bit on perception.

Personally, I think we still have a lot more to learn about the natural world. AI tools will undoubtedly be helpful in making progress in many areas, but I think it is definitely premature to argue that the vast majority of future advances will come from artificial superintelligences and that we can therefore go ahead and abandon the strategies that got us the remarkable achievements of the last few decades.

I think some of the loudest complainers about perceived slowing advancement (Thiel, for example) are software people. People who come from the software development world don't always appreciate that physical infrastructure and understanding are hard, and that there are not always clever or even brute-force ways to get to an end goal. Solving foundational problems in molecular biology or quantum information hardware or photonics or materials is not the same as software development. (The tech folks generally know this on an intellectual level, but I don't think all of them really understand it in their guts. That's why so many of them seem to ignore real-world physical constraints when talking about AI.) Trying to apply approaches inspired by software development to science and engineering research isn't bad as a component of a many-pronged strategy, but alone it may not give the desired results, as warned in part by this piece in Science this week.

More frequent breakthroughs in our understanding and capabilities would be wonderful. I don't think dynamiting the US research ecosystem is the way to get us there, and hoping that we can dismantle everything because AI will somehow herald a new golden age seems premature at best.
More in science
On 29 August 1949, the Soviet Union successfully tested its first nuclear weapon. Over the next year and a half, U.S. President Harry S. Truman resurrected the Office of Civilian Defense (which had been abolished at the end of World War II) and signed into law the Federal Civil Defense Act of 1950, which mobilized government agencies to plan for the aftermath of a global nuclear war. With the Cold War underway, that act kicked off a decades-long effort to ensure that at least some Americans survived nuclear armageddon.

As the largest civilian federal agency with a presence throughout the country, the U.S. Post Office Department was in a unique position to monitor local radiation levels and shelter residents. By the end of 1964, approximately 1,500 postal buildings had been designated as fallout shelters, providing space and emergency supplies for 1.3 million people. Occupants were expected to remain in the shelters until the radioactivity outside was deemed safe. By 1968, about 6,000 postal employees had been trained to use radiological equipment, such as the CD V-700 pictured at top, to monitor beta and gamma radiation. And a group of postal employees organized a volunteer ham radio network to help with communications should the regular networks go down.

What was civil defense in the Cold War?

The basic premise of civil defense was that many people would die immediately in cities directly targeted by nuclear attacks. (Check out Alex Wellerstein’s interactive Nukemap for an estimate of casualties and impact should your hometown—or any location of your choosing—be hit.) It was the residents of other cities, suburbs, and rural communities outside the blast area that would most benefit from civil defense preparations. With enough warning, they could shelter in a shielded site and wait for the worst of the fallout to decay. Anywhere from a day or two to a few weeks after the attack, they could emerge and aid any survivors in the harder-hit areas.

In 1957, a committee of the Office of Defense Mobilization drafted the report Deterrence and Survival in the Nuclear Age for President Dwight D. Eisenhower. Better known as the Gaither Report, it called for the creation of a nationwide network of fallout shelters to protect civilians. Government publications such as The Family Fallout Shelter encouraged Americans who had the space, the resources, and the will to construct shelters for their homes. City dwellers in apartment buildings warranted only half a page in the booklet, with the suggestion to head to the basement and cooperate with other residents.

This model fallout shelter from 1960 was designed for four to six people. Bettmann/Getty Images

Ultimately, very few homeowners actually built a fallout shelter. But Rod Serling, creator of the television series “The Twilight Zone,” saw an opportunity for pointed social commentary. Aired in the fall of 1961, the episode “The Shelter” showed how quickly civilization (epitomized by a suburban middle-class family and their friends) broke down over decisions about who would be saved and who would not.

Meanwhile, President John F. Kennedy had started to shift the national strategy from individual shelters to community shelters. At his instruction, the U.S. Army Corps of Engineers began surveying existing buildings suitable for public shelters. Post offices, especially ones with basements capable of housing at least 50 people, were a natural fit.
Each local postmaster was designated as the shelter manager and granted complete authority to operate the shelter, including determining who would be admitted or excluded. The Handbook for Fallout Shelter Management gave guidance for everything from sleeping arrangements to sanitation standards. Shelters were stocked with food and water, medicine, and, of course, radiological survey instruments.

What to do in case of a nuclear attack

These community fallout shelters were issued a standard kit for radiation detection. The kit came in a cardboard box that contained two radiation monitors, the CD V-700 (a Geiger counter, pictured at top) and the CD V-715 (a simple ion chamber survey meter); two cigar-size CD V-742 dosimeters, to measure a person’s total exposure while wearing the device; and a charger for the dosimeters. Also included was the Handbook for Radiological Monitors, which provided instructions on how to use the equipment and report the results.

Post office fallout shelters were issued standard kits for measuring radioactivity after a nuclear attack. National Postal Museum/Smithsonian Institution

The shelter radiation kit included two radiation monitors, two cigar-size dosimeters, and a charger for the dosimeters. Photoquest/Getty Images

In the event of an attack, the operator would take readings with the CD V-715 at selected locations in the shelter. Then, within three minutes of finishing the indoor measurements, he would go outside and take a reading at least 25 feet (7.6 meters) from the building. If the radiation level outside was high, there were procedures for decontamination upon returning to the shelter. The “protection factor” of the shelter was calculated by dividing the outside reading by the inside reading. (Today the Federal Emergency Management Agency, FEMA, recommends a PF of at least 40 for a fallout shelter; a small worked example of this arithmetic appears below.) Operators were directed to retake the measurements and recalculate the protection factor at least once every 24 hours, or more frequently if the radiation levels changed rapidly. The CD V-700 was intended for detecting beta and gamma radiation during cleanup and decontamination operations, and also for detecting any radioactive contamination of food, water, and personnel.

RELATED: DIY Gamma-Ray Spectroscopy With a Raspberry Pi Pico

Each station would report its dose rates to a regional control center, so that the civil defense organization could determine when people could leave their shelter, where they could go, what routes to take, and what facilities needed decontamination. But if you’ve lived through a natural or manmade disaster, you’ll know that in the immediate aftermath, communications don’t always work so well. Indeed, the Handbook for Radiological Monitors acknowledged that a nuclear attack might disrupt communications. Luckily, the U.S. Post Office Department had a backup plan.

In May 1958, Postmaster General Arthur E. Summerfield made an appeal to all postal employees who happened to be licensed amateur radio operators to form an informal network that would provide emergency communications in the event of the collapse of telephone and telegraph networks and commercial broadcasting. The result was Post Office Net (PON), a voluntary group of ham radio operators; by 1962, about 1,500 postal employees in 43 states had signed on. That year, PON was opened up to nonemployees who had the necessary license.
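As a concrete illustration of the protection-factor arithmetic mentioned above, here is a minimal sketch in Python. This is my own illustration, not something taken from the handbook; the function name and the example readings are hypothetical, and only the outside-divided-by-inside definition and the FEMA guideline of 40 come from the text above.

```python
def protection_factor(outside_reading: float, inside_reading: float) -> float:
    """Shelter protection factor: outside dose rate divided by inside dose rate.

    Both readings are assumed to be in the same units (e.g., roentgens per hour
    as read from the CD V-715 survey meter).
    """
    if inside_reading <= 0:
        raise ValueError("inside reading must be positive")
    return outside_reading / inside_reading


# Hypothetical example readings: 80 R/hr outside, 1.6 R/hr inside the shelter.
pf = protection_factor(80.0, 1.6)
print(f"PF = {pf:.0f}")                              # PF = 50
print("Meets FEMA guideline (PF >= 40):", pf >= 40)  # True
```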
RELATED: The Uncertain Future of Ham Radio

Although PON was never activated due to a nuclear threat, it did transmit messages during other emergencies. For example, in January 1967, after an epic blizzard blanketed Illinois and Michigan with heavy snow, the Michigan PON went into action, setting up liaisons with county weather services and relaying emergency requests, such as rescuing people stranded in vehicles on Interstate 94.

A 1954 civil defense fair featured a display of amateur radios. The U.S. Post Office recruited about 1,500 employees to operate a ham radio network in the event that regular communications went down. National Archives

The post office retired the network on 30 June 1974 as part of its shift away from civil defense preparedness. (A volunteer civil emergency-response ham radio network still exists, under the auspices of the American Radio Relay League.) And by 1977, laboratory tests indicated that most of the food and medicine stockpiled in post office basements was no longer fit for human consumption. In 1972 the Office of Civil Defense was replaced by the Defense Civil Preparedness Agency, which was eventually folded into FEMA. And with the end of the Cold War, the civil defense program officially ended in 1994, fortunately without ever being needed for a nuclear attack.

Do we still need civil defense?

The idea for this column came to me last fall, when I was doing research at the Linda Hall Library, in Kansas City, Mo., and I kept coming across articles about civil defense in magazines and journals from the 1950s and ’60s. I knew that the Smithsonian’s National Postal Museum, in Washington, D.C., had several civil defense artifacts (including the CD V-700 and a great “In Time of Emergency” public service announcement record album). As a child of the late Cold War, I remember being worried by the prospect of nuclear war. But then the Cold War ended, and so did my fears.

I envisioned this month’s column capturing the intriguing history of civil defense and the earnest preparations of the era. That chapter of history, I assumed, was closed. Little did I imagine that by the time I began to write this, the prospect of a nuclear attack, if not an all-out war, would suddenly become much more real. These days, I understand the complexities and nuances of nuclear weapons much better than when I was a child. But I’m just as concerned that a nuclear conflict is imminent. Here’s hoping that history repeats itself, and it does not come to that.

Part of a continuing series looking at historical artifacts that embrace the boundless potential of technology.

An abridged version of this article appears in the August 2025 print issue.

References

The November 1951 issue of Electrical Engineering summarized a civil defense conference held at the General Electric Co.’s Electronics Park in Syracuse, N.Y., earlier that year. Two hundred eighty federal, state, county, and city officials from across the United States and Canada attended, which got me thinking about the topic.

Many of the government’s civil defense handbooks are available through the Internet Archive. The U.S. Postal Bulletins have also been digitized, and the USPS historian’s office wrote a great account, “The Postal Service’s Role in Civil Defense During the Cold War.” Although I’ve highlighted artifacts from the National Postal Museum, the Smithsonian Institution has many other objects across multiple museums.
Eric Green has been collecting civil defense material since 1978 and has made much of it available through his virtual Civil Defense Museum. Alex Wellerstein, a historian of nuclear technology at the Stevens Institute of Technology, writes the Substack newsletter Doomsday Machines, where he gives thoughtful commentary on how we think about the end of times, in both fiction and reality. His interactive Nukemap is informative and scary.
After finding the homeschooling life confining, the teen petitioned her way into a graduate class at Berkeley, where she ended up disproving a 40-year-old conjecture. The post At 17, Hannah Cairo Solved a Major Math Mystery first appeared on Quanta Magazine
The Titanic lies about 12,500 feet under the ocean. The pressure down there is so immense that even submersibles supposedly built for those conditions can, as we know, tragically fail. Now imagine taking a sub nearly three times deeper. Read more on E360 →
The next section of Chapter 3 of Stewart Brand’s Maintenance on Books in Progress
The Japan Atomic Energy Agency reported earlier this year that it has developed and tested a battery with depleted uranium as the active material of the negative electrode. Why would they do this, and what role could such a battery play? First let’s look at the details (which are sparse). The battery uses depleted uranium […] The post Depleted Uranium Batteries first appeared on NeuroLogica Blog.