More from nanoscale views
Recent events are very dire for research at US universities, and I will write further about those, but first a quick unrelated survey for those at such institutions. Back in the day, it was common for physics and some other (mechanical engineering?) departments to have machine shops with professional staff. In the last 15-20 years, there has been a huge growth in maker-spaces on campuses to modernize and augment those capabilities, though often maker-spaces are aimed at undergraduate design courses rather than doing work to support sponsored research projects (and grad students, postdocs, etc.). At the same time, it is now easier than ever (modulo tariffs) to upload CAD drawings to a website and get a shop in another country to ship finished parts to you. Quick questions: Does your university have a traditional or maker-space-augmented machine shop available to support sponsored research? If so, who administers this - a department, a college/school, the office of research? Does the shop charge competitive rates relative to outside vendors? Are grad students trained to do work themselves, and are there professional machinists - how does that mix work? Thanks for your responses. Feel free to email me if you'd prefer to discuss offline.
This NY Times feature lets you see how each piece of NSF's funding has been reduced this year relative to the normalized average over the last decade. Note: this fiscal year, thanks to the continuing resolution, the agency's budget has not actually been cut like this. They are just not spending congressionally appropriated agency funds. The agency, fearing/assuming that its budget will get hammered next fiscal year, does not want to start awards that it won't be able to fund in out-years. The result is that this is effectively obeying in advance the presidential budget request for FY26. (And it's highly likely that some will point to unspent funds later in the year and use that as a justification for cuts, when in fact it's anticipation of possible cuts that has led to unspent funds. I'm sure the Germans have a polysyllabic word for this. In English, "Catch-22" is close.) I encourage you to click the link and go to the article where the graphic is interactive (if it works in your location - I'm not sure whether the link works internationally). The different colored regions correspond approximately to the NSF directorates (in their old organizational structure). Each subsection is a particular program. It seems like whoever designed the graphic was a fan of Tufte, and the scaling of the shaded areas does quantitatively reflect the funding changes. However, most people have a tough time estimating relative areas of irregular polygons. Award funding in physics (the left-most section of the middle region) is down 85% relative to past years. Math is down 72%. Chemistry is down 57%. Materials is down 63%. Earth sciences is down 80%. Polar programs (you know, those folks who run all the amazing experiments in Antarctica) are down 88%. I know my readers are likely tired of me harping on NSF, but it's both important and a comparatively transparent example of what is also happening at other agencies. If you are a US citizen and think that this is the wrong path, then push on your congressional delegation about the upcoming budget.
Apologies for slow posting. Real life has been very intense, and I was also rather concerned when one of my readers mentioned last weekend that these days my blog reads like concentrated doom-scrolling. I will have more to say about the present university research crisis later, but first I wanted to give a hopefully diverting example of the kind of problem-solving and following-your-nose that crops up in research.

Recently in my lab we have had a need to measure very small changes in the electrical resistance of some devices, at the level of a few milliOhms out of kiloOhms - parts in \(10^6\). One of my students put together a special kind of resistance bridge to do this, and it works very well (a toy numerical sketch of the general idea appears at the end of this post). Note to interested readers: if you want to do this, make sure that you use components with very low temperature coefficients of their properties (e.g., resistors with a very small \(dR/dT\)), because otherwise your bridge becomes an extremely effective thermometer for your lab. It's kind of cool to be able to see the lab temperature drift around by milliKelvins, but it's not great for measuring your sample of interest.

There are a few ways to measure resistance. The simplest is the two-terminal approach, where you drive current through and measure the voltage across your device with the same two wires. This is easy, but it means that the voltage you measure includes contributions from the contacts those wires make with the device. A better alternative is the four-terminal method, where one pair of wires supplies and collects the current and a separate pair senses the voltage, so that the contact resistances largely drop out of the measurement.

Anyway, in the course of doing some measurements of a particular device's resistance as a function of magnetic field at low temperatures, we saw something weird. Below a rather low temperature, when we measured in a 2-terminal arrangement, we saw a jump up in resistance of around 20 milliOhms (out of a couple of kOhms) as the magnetic field was swept up from zero, and a small amount of resistance hysteresis with magnetic field sweep that vanished above maybe 0.25 T. This vanished completely in a 4-terminal arrangement, and also disappeared above about 3.4 K. What was this? It turns out, I think, that we accidentally rediscovered the superconducting transition in indium. While the contact pads on our sample mount looked clean to the unaided eye, they had previously had indium on them. The magic temperature is very close to the bulk \(T_{c}\) for indium.

For one post, rather than dwelling on the terrible news about the US science ecosystem: does anyone out there have other, similar fun experimental anecdotes? Glitches that turned out to be something surprising? Please share in the comments.
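An aside for readers who like to play with numbers: below is a minimal Python sketch of how a simple four-resistor bridge converts a part-per-million resistance change into a voltage, and why resistor temperature coefficients matter. This is my own toy illustration, not the circuit my student actually built; the drive voltage, resistor values, and assumed tempco are all made up for the example.

# Toy model of a four-resistor bridge. Component values, drive voltage, and
# temperature coefficient below are illustrative assumptions, not the lab setup.

def bridge_output(v_drive, r_sample, r_ref, r_top=1.0e3):
    """Differential voltage between the two bridge midpoints."""
    return v_drive * (r_sample / (r_sample + r_top) - r_ref / (r_ref + r_top))

v_drive = 1.0       # V across the bridge (assumed)
r0 = 2.0e3          # sample resistance, ~2 kOhm as in the post
delta_r = 20e-3     # the 20 mOhm jump seen in the 2-terminal measurement

signal = bridge_output(v_drive, r0 + delta_r, r0) - bridge_output(v_drive, r0, r0)
print(f"20 mOhm out of 2 kOhm -> bridge signal of {signal * 1e6:.2f} uV")

# Why low-tempco components matter: a 100 ppm/K reference resistor drifting by
# just 1 mK mimics a ~0.2 mOhm resistance change, a noticeable fraction of a
# few-mOhm signal of interest.
tempco, dT = 100e-6, 1e-3
r_ref_drifted = r0 * (1 + tempco * dT)
spurious = bridge_output(v_drive, r0, r_ref_drifted) - bridge_output(v_drive, r0, r0)
print(f"1 mK drift of a 100 ppm/K resistor -> spurious {abs(spurious) * 1e9:.0f} nV")

Seen this way, the temperature-coefficient warning in the post is just the second number threatening to swamp the first.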
Lots of news in the last few days regarding federal funding of university research: NSF has now frozen all funding for new and continuing awards. This is not good; just how bad it is depends on the definition of "until further notice". Here is an open letter from the NSF employees union to the basically-silent-so-far National Science Board, asking for the NSB to support the agency. Here is a grassroots SaveNSF website with good information and suggestions for action - please take a look. NSF also wants to cap indirect cost rates at 15% for higher ed institutions for new awards. This will almost certainly generate a lawsuit from the AAU and others. Speaking of the AAU, last week there was a hearing in the Massachusetts district court regarding the lawsuits about the DOE setting indirect cost rates to 15% for active and new awards. There had already been a temporary restraining order in place nominally stopping the change; the hearing resulted in that order being extended "until a further order is issued resolving the request for a temporary injunction." (See here, the entry for April 29.) In the meantime, the presidential budget request has come out, and if enacted it would be devastating to the science agencies. Proposed cuts include 55% to NSF, 40% to NIH, 33% to USGS, 25% to NOAA, etc. If these cuts went through, we would be talking about more than $35B, at a rough eyeball estimate. And here is a letter from former NSF directors and NSB chairs to the appropriators in Congress, asking them to ignore that budget request and continue to support government-sponsored science and engineering research.

Unsurprisingly, during these times there is a lot of talk about the need for universities to diversify their research portfolios - that is, expanding non-federally-supported ways to continue generating new knowledge, training the next generation of the technically literate workforce, and producing IP and entrepreneurial startup companies. (Let's take it as read that it would be economically and societally desirable to continue these things, for the purposes of this post.) Philanthropy is great, and foundations do fantastic work in supporting university research, but philanthropy can't come close to making up for sharp drawdowns of federal support. The numbers just don't work. The endowment of the Moore Foundation, for example, is around $10B, implying an annual payout of $500M or so, which is great but only around 1.4% of the cuts being envisioned (a quick numerical version of this scale argument is at the end of this post).

Industry seems like the only non-governmental possibility that could in principle muster resources on a scale that would make a large difference. Consider the estimated profits (not revenues) of different industrial sectors. The US semiconductor market had revenues last year of around $500B with an annualized net margin of around 17%, giving $85B/yr of profit. US aerospace and defense similarly have an annual profit of around $70B. The financial/banking sector, which has historically benefitted greatly from PhD-trained quants, has an annual net income of $250B. I haven't even listed numbers for the energy and medical sectors, because those are challenging to parse (but large). All of those industries have been helped greatly by university research, directly and indirectly. It's the source of trained people. It's the source of initial work that is too long-term for corporations to be able to support without short-time-horizon shareholders getting annoyed. It's the source of many startup companies that sometimes grow and other times get gobbled up by bigger fish.
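For concreteness, here is the back-of-envelope arithmetic from the paragraphs above as a few lines of Python. Every number is an estimate quoted in this post (plus a conventional ~5% foundation payout assumption), not authoritative data.

# Back-of-envelope scale comparison using the rough figures quoted in this post.
proposed_cuts = 35e9                    # rough eyeball total of proposed cuts, $/yr

moore_endowment = 10e9                  # Moore Foundation endowment estimate
moore_payout = 0.05 * moore_endowment   # conventional ~5% annual payout assumption
print(f"Foundation-scale payout: ${moore_payout / 1e9:.1f}B/yr "
      f"({100 * moore_payout / proposed_cuts:.1f}% of the proposed cuts)")

semis = 500e9 * 0.17                    # US semiconductor revenue x net margin
aero_defense = 70e9                     # US aerospace & defense annual profit
banking = 250e9                         # financial sector annual net income
print(f"Selected industry profits: ${(semis + aero_defense + banking) / 1e9:.0f}B/yr "
      f"vs roughly ${proposed_cuts / 1e9:.0f}B/yr of proposed cuts")

Nothing rigorous here, just the scale argument: philanthropy is an order of magnitude too small, while the profits of a few industrial sectors are an order of magnitude larger than the gap.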
Encouraging greater industrial sponsorship of university research is a key challenge. The value proposition must be made clear to both the companies and the universities. The market is unforgiving and exerts pressure to worry about the short term rather than the long term. Given how Congress is functioning, it does not look like there are going to be changes to the tax code put in place that could incentivize long-term investment. Cracking this and meaningfully growing the scale of industrial support for university research could be enormously impactful. Something to ponder.
More in science
[Note that this article is a transcript of the video embedded above.] "The big black stacks of the Ilium Works of the Federal Apparatus Corporation spewed acid fumes and soot over the hundreds of men and women who were lined up before the red-brick employment office." That's the first line of one of my favorite short stories, written by Kurt Vonnegut in 1955. It paints a picture of a dystopian future that, thankfully, didn't really come to be, in part because of those stacks. In some ways, air pollution is kind of a part of life. I'd love to live in a world where the systems, materials and processes that make my life possible didn't come with any emissions, but it's just not the case... From the time that humans discovered fire, we've been methodically weighing the benefits of warmth, comfort, and cooking against the disadvantages of carbon monoxide exposure and particulate matter less than 2.5 microns in diameter… Maybe not in that exact framework, but basically, since the dawn of humanity, we've had to deal with smoke one way or another. Since we can't accomplish much without putting unwanted stuff into the air, the next best thing is to manage how and where it happens to try and minimize its impact on public health. Of course, any time you have a balancing act involving technical issues, the engineers get involved, not so much to help decide where to draw the line, but to develop systems that can stay below it. And that's where the smokestack comes in. Its function probably seems obvious; you might have a chimney in your house that does a similar job. But I want to give you a peek behind the curtain into the Ilium Works of the Federal Apparatus Corporation of today and show you what goes into engineering one of these stacks at a large industrial facility. I'm Grady, and this is Practical Engineering.

We put a lot of bad stuff in the air, and in a lot of different ways. There are roughly 200 regulated hazardous air pollutants in the United States, many with names I can barely pronounce. In many cases, the industries that would release these contaminants are required to deal with them at the source. A wide range of control technologies are put into place to clean dangerous pollutants from the air before it's released into the environment. One example is coal-fired power plants. Coal, in particular, releases a plethora of pollutants when combusted, so, in many countries, modern plants are required to install control systems. Catalytic reactors remove nitrogen oxides. Electrostatic precipitators collect particulates. Scrubbers use lime (the mineral, not the fruit) to strip away sulfur dioxide. And I could go on. In some cases, emission control systems can represent a significant proportion of the costs involved in building and operating a plant. But these primary emission controls aren't always feasible for every pollutant, at least not for 100 percent removal.

There's a very old saying that "the solution to pollution is dilution." It's not really true on a global scale. Case in point: There's no way to dilute the concentration of carbon dioxide in the atmosphere, or rather, it's already as dilute as it's going to get. But it can be true on a local scale. Many pollutants that affect human health and the environment are short-lived; they chemically react or decompose in the atmosphere over time instead of accumulating indefinitely. And, for a lot of chemicals, there are concentration thresholds below which the consequences for human health are negligible.
In those cases, dilution, or really dispersion, is a sound strategy to reduce their negative impacts, and so, in some cases, that's what we do, particularly at major point sources like factories and power plants. One of the tricks to dispersion is that many plumes are naturally buoyant. Naturally, I'm going to use my pizza oven to demonstrate this. Not all, but most pollutants we care about are a result of combustion; burning stuff up. So the plume is usually hot. We know hot air is less dense, so it naturally rises. And the hotter it is, the faster that happens. You can see when I first start the fire, there's not much air movement. But as the fire gets hotter in the oven, the plume speeds up, ultimately rising higher into the air. That's the whole goal: get the plume high above populated areas where the pollutants can be dispersed to a minimally-harmful concentration. It sounds like a simple solution - just run our boilers and furnaces super hot to get enough buoyancy for the combustion products to disperse. The problem with that solution is that the whole reason we combust things is usually to recover the heat. So if you're sending a lot of that heat out of the system, just because it makes the plume disperse better, you're losing thermodynamic efficiency. It's wasteful.

That's where the stack comes in. Let me put mine on and show you what I mean. I took some readings with the anemometer with the stack on and off. The airspeed with the stack on was around double what it was with the stack off: about two meters per second compared with one. But it's a little tougher to understand why. It's intuitive that as you move higher in a column of fluid, the pressure goes down (since there's less weight of the fluid above). The deeper you dive in a pool, the more pressure you feel. The higher you fly in a plane or climb a mountain, the lower the pressure. The slope of that pressure-versus-height line is proportional to the fluid's density. You don't feel much of a pressure difference climbing a set of stairs because air isn't very dense. If you travel the same distance in water, you'll definitely notice the difference.

So let's look at two columns of fluid. One is the ambient air and the other is the air inside a stack. Since it's hotter, the air inside the stack is less dense. Both columns start at the same pressure at the bottom, but the higher you go, the more the pressures diverge. It's kind of like deep sea diving in reverse. In water, the deeper you go into the dense water, the greater the pressure you feel. In a stack, the higher you are in a column of hot air, the more buoyant you feel compared to the outside air. This is the genius of a smokestack. It creates this difference in pressure between the inside and outside that drives greater airflow for a given temperature. Here's the basic equation for the stack effect (a toy numerical version is sketched just below). I like to look at equations like this divided into what we can control and what we can't. We don't get to adjust the atmospheric pressure or the outside temperature, and this term is just a constant. But you can see, with a stack, an engineer now has two knobs to turn: the temperature of the gas inside and the height of the stack. I did my best to keep the temperature constant in my pizza oven and took some airspeed readings. First with no stack. Then with the stock stack. Then with a megastack. By the way, this melted my anemometer; should have seen that coming. Thankfully, I got the measurements before it melted.
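The transcript refers to the stack-effect equation without reproducing it; the underlying relationship is just buoyancy, with a draft pressure of roughly g times height times the density difference between the outside and inside air. Here is a minimal Python sketch of that relationship and the ideal (friction-free, no-cooling) draft velocity it implies; the stack heights, gas temperature, and discharge coefficient are assumptions for illustration, not measurements from the video.

# Minimal stack-effect sketch. Heights, temperatures, and the discharge
# coefficient are illustrative assumptions.
import math

G = 9.81            # m/s^2
RHO_20C = 1.204     # kg/m^3, air density at ~20 C and 1 atm

def air_density(t_celsius):
    """Ideal-gas scaling of air density with temperature at fixed pressure."""
    return RHO_20C * 293.15 / (t_celsius + 273.15)

def draft_pressure(height_m, t_gas_c, t_ambient_c=20.0):
    """Buoyancy pressure difference: dP = g * h * (rho_outside - rho_inside)."""
    return G * height_m * (air_density(t_ambient_c) - air_density(t_gas_c))

def ideal_draft_velocity(height_m, t_gas_c, t_ambient_c=20.0, c_discharge=0.7):
    """Rough exit velocity, ignoring friction and cooling along the stack."""
    t_gas = t_gas_c + 273.15
    t_amb = t_ambient_c + 273.15
    return c_discharge * math.sqrt(2 * G * height_m * (t_gas - t_amb) / t_amb)

for h in (0.3, 1.0, 2.0):               # assumed stack heights, meters
    print(f"h = {h:3.1f} m: draft ~ {draft_pressure(h, 200.0):4.2f} Pa, "
          f"ideal velocity ~ {ideal_draft_velocity(h, 200.0):3.1f} m/s")

The two "knobs" from the transcript show up directly: a taller stack or hotter gas both raise the pressure difference and hence the draft.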
My megastack nearly doubled the airspeed again, at around three-and-a-half meters per second versus the two with just the stack that came with the oven. There's something really satisfying about this stack effect to me. No moving parts or fancy machinery. Just put on a longer pipe and you've fundamentally changed the physics of the whole situation. And it's a really important tool in the environmental engineer's toolbox to increase airflow upward, allowing contaminants to flow higher into the atmosphere where they can disperse. But this is not particularly revolutionary… unless you're talking about the Industrial Revolution. When you look at all the pictures of the factories in the 19th century, those stacks weren't there to improve air quality, if you can believe it. The increased airflow generated by a stack just created more efficient combustion for the boilers and furnaces. Any benefits to air quality in the cities were secondary. With the advent of diesel and electric motors, we could use forced drafts, reducing the need for a tall stack to increase airflow. That marked the decline of the forests of industrial chimneys that dominated the landscape in the 19th century. But they're obviously not all gone, because that secondary benefit of air quality turned into the primary benefit as environmental rules about air pollution became stricter.

Of course, there are some practical limits that aren't taken into account by that equation I showed. The plume cools down as it moves up the stack to the outside, so its density isn't constant all the way up. I let my fire die down a bit so it wouldn't melt the thermometer (learned my lesson), and then took readings inside the oven and at the top of the stack. You can see my pizza oven flue gas is around 210 degrees at the top of the megastack, but it's roughly 250 inside the oven. After the success of the megastack on my pizza oven, I tried the super-megastack, with not much improvement in airflow: about 4 meters per second. The warm air just got too cool by the time it reached the top, and I suspect that frictional drag in the longer pipe also contributed. So, really, depending on how insulating your stack is, our graph of height versus pressure actually ends up looking like this (a toy numerical illustration of this diminishing return appears below). And this can be its own engineering challenge. Maybe you've gotten backdrafts in your fireplace at home because the fire wasn't big or hot enough to create that large difference in pressure.

You can see there are a lot of factors at play in designing these structures, but so far, all we've done is get the air moving faster. But that's not the end goal. The purpose is to reduce the concentration of pollutants that we're exposed to. So engineers also have to consider what happens to the plume once it leaves the stack, and that's where things really get complicated. In the US, we have National Ambient Air Quality Standards that regulate six so-called "criteria" pollutants that are relatively widespread: carbon monoxide, lead, nitrogen dioxide, ozone, particulates, and sulfur dioxide. We have hard limits on all these compounds with the intention that they are met at all times, in all locations, under all conditions. Unfortunately, that's not always the case. You can go on EPA's website and look at the so-called "non-attainment" areas for the various pollutants. But we do strive to meet the standards through a list of measures that is too long to go into here. And that is not an easy thing to do.
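To see the diminishing returns described above as numbers, here is a toy Python extension of the same stack-effect idea in which the gas temperature relaxes exponentially toward ambient along the stack. The cooling length scale and inlet temperature are assumptions, and friction is still ignored.

# Toy model of why an ever-taller stack stops helping: the flue gas cools toward
# ambient as it rises, so extra height adds less and less buoyancy.
# The cooling length scale below is an assumption, and friction is ignored.
import math

G = 9.81
T_AMB = 293.15       # K, ambient (~20 C)
T_GAS_IN = 473.15    # K, gas entering the stack (~200 C, assumed)
L_COOL = 1.5         # m, assumed Newton-cooling length scale for a thin pipe
RHO_AMB = 1.204      # kg/m^3 at ambient temperature

def draft_with_cooling(height_m, n_steps=1000):
    """Integrate g * (rho_ambient - rho_gas(z)) dz with T_gas(z) decaying to ambient."""
    dz = height_m / n_steps
    dp = 0.0
    for i in range(n_steps):
        z = (i + 0.5) * dz
        t_gas = T_AMB + (T_GAS_IN - T_AMB) * math.exp(-z / L_COOL)
        rho_gas = RHO_AMB * T_AMB / t_gas       # ideal-gas scaling
        dp += G * (RHO_AMB - rho_gas) * dz
    return dp

for h in (0.5, 1.0, 2.0, 4.0):                  # assumed stack heights, meters
    print(f"h = {h:3.1f} m: draft ~ {draft_with_cooling(h):4.2f} Pa")

Once the gas has cooled most of the way to ambient, doubling the stack height no longer comes close to doubling the draft, consistent with the super-megastack's modest improvement in the video.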
Not every source of pollution comes out of a big stationary smokestack where it's easy to measure and control. Cars, buses, planes, trucks, trains, and even rockets create lots of contaminants that vary by location, season, and time of day. And there are natural processes that contribute as well. Forests and soil microbes release volatile organic compounds that can lead to ozone formation. Volcanic eruptions and wildfires release carbon monoxide and sulfur dioxide. Even dust storms put particulates in the air that can travel across continents. And hopefully you're seeing the challenge of designing a smokestack. The primary controls like scrubbers and precipitators get most of the pollutants out, and hopefully all of the ones that can't be dispersed. But what's left over and released has to avoid pushing concentrations above the standards. That design has to work within the very complicated and varying context of air chemistry and atmospheric conditions that a designer has no control over.

Let me show you a demo. I have a little fog generator set up in my garage with a small fan simulating the wind. This isn't a great example because the airflow from the fan is pretty turbulent compared to natural winds. You occasionally get some fog at the surface, but you can see my plume mainly stays above the surface, dispersing as it moves with the wind. But watch what happens when I put a building downstream. The structure changes the airflow, creating a downwash effect and pulling my plume with it. Much more frequently you see the fog at the ground level downstream. And this is just a tiny example of how complex the behavior of these plumes can be. Luckily, there's a whole field of engineering to characterize it.

There are really just two major transport processes for air pollution. Advection describes how contaminants are carried along by the wind. Diffusion describes how those contaminants spread out through turbulence. Gravity also affects air pollution, but it doesn't have a significant effect except on heavier-than-air particulates. With some math and simplifications of those two processes, you can do a reasonable job of predicting the concentration of any pollutant at any point in space as it moves and disperses through the air. Here's the basic equation for that, and if you'll join me for the next 2 hours, we'll derive it and learn the meaning of each term… Actually, it might take longer than that, so let's just look at a graphic (a toy numerical version of this formula also appears at the end of this transcript). You can see that as the plume gets carried along by the wind, it spreads out in what's basically a bell curve, or Gaussian distribution, in the planes perpendicular to the wind direction. But even that is a bit too simplified to make any good decisions with, especially when the consequences of getting it wrong fall on public health. A big reason for that is atmospheric stability. And this can make things even more complicated, but I want to explain the basics, because the effect on plumes of gas can be really dramatic. You probably know that air expands as it moves upward; there's less pressure as you go up because there is less air above you. And as any gas expands, it cools down. So there's this relationship between height and temperature we call the adiabatic lapse rate. It's about 10 degrees Celsius for every kilometer up, or about 28 Fahrenheit for every mile up. But the actual atmosphere doesn't always follow this relationship. For example, rising air parcels can cool more slowly than the surrounding air.
This makes them warmer and less dense than their surroundings, so they keep rising, promoting vertical motion in a positive feedback loop called atmospheric instability. You can even get a temperature inversion, where you have cooler air below warmer air, something that can happen in the early morning when the ground is cold. And as the environmental lapse rate deviates from the adiabatic lapse rate, the plumes from stacks change. In stable conditions, you usually get a coning plume, similar to what our Gaussian distribution from before predicts. In unstable conditions, you get a lot of mixing, which leads to a looping plume. And things really get weird for temperature inversions, because they basically act like lids for vertical movement. You can get a fanning plume that rises to a point, but then only spreads horizontally. You can also get a trapping plume, where the air gets stuck between two inversions. You can have a lofting plume, where the air is above the inversion, with stable conditions below and unstable conditions above. And worst of all, you can have a fumigating plume, when there are unstable conditions below an inversion, trapping and mixing the plume toward the ground surface. And if you pay attention to smokestacks, fires, and other types of emissions, you can identify these different types of plumes pretty easily.

Hopefully you're seeing now how much goes into this. Engineers have to keep track of advection and diffusion, wind speed and direction, atmospheric stability, the effects of terrain and buildings on all those factors, plus the pre-existing concentrations of all the criteria pollutants from other sources, which vary in time and place. All that to demonstrate that your new source of air pollution is not going to push the concentrations at any place, at any time, under any conditions, beyond what the standards allow. That's a tall order, even for someone who loves Gaussian distributions. And often the answer to that tall order is an even taller smokestack. But to make sure, we use software. The EPA has developed models that can take all these factors into account to simulate, essentially, what would happen if you put a new source of pollution into the world, and at what height.

So why are smokestacks so tall? I hope you'll agree with me that it turns out to be a pretty complicated question. And it's important, right? These stacks are expensive to build and maintain. Those costs trickle down to us through the costs of the products and services we buy. They have a generally negative visual impact on the landscape. And they have a lot of other engineering challenges too, like resonance in the wind. And on the other hand, we have public health, arguably one of the most critical design criteria that can exist for an engineer. It's really important to get this right. I think our air quality regulations do a lot to make sure we strike a good balance here. There are even rules limiting how much credit you can get for building a stack higher for greater dispersion, to make sure that we're not using excessively tall stacks in lieu of more effective, but often more expensive, emission controls and strategies. In a perfect world, none of the materials or industrial processes that we rely on would generate concentrated plumes of hazardous gases. We don't live in that perfect world, but we are pretty fortunate that, at least in many places on Earth, air quality is something we don't have to think too much about.
And we have a relatively small industry of environmental professionals to thank for it, people who do think about it, a whole lot. You know, for a lot of people, this is their whole career; what they ponder from 9 to 5 every day. Something most of us would rather keep out of mind, they face head-on, developing engineering theories, professional consensus, sensible regulations, modeling software, and more - just so we can breathe easy.
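A small appendix to the transcript above: the "bell curve" dispersion model it describes is the textbook Gaussian plume formula. Below is a hedged Python sketch of it; the dispersion widths are crude assumed power laws standing in for the published stability-class curves, and the source strength, wind speed, and stack heights are made-up numbers, so this illustrates the shape of the formula rather than anything you could permit a plant with (real work uses EPA models such as AERMOD).

# Textbook Gaussian plume with ground reflection. Dispersion widths and all
# input numbers are illustrative assumptions, not regulatory coefficients.
import math

def gaussian_plume(q, u, x, y, z, stack_height):
    """Concentration (g/m^3) at (x downwind, y crosswind, z height) from a
    steady point source of strength q (g/s) in wind speed u (m/s)."""
    sigma_y = 0.08 * x     # assumed crosswind spread vs distance
    sigma_z = 0.06 * x     # assumed vertical spread vs distance
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - stack_height)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + stack_height)**2 / (2 * sigma_z**2)))  # image source = ground reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centerline concentration 2 km downwind for a short vs tall stack.
for h_stack in (30.0, 100.0):          # meters, assumed
    c = gaussian_plume(q=100.0, u=5.0, x=2000.0, y=0.0, z=0.0, stack_height=h_stack)
    print(f"H = {h_stack:5.1f} m: C(ground, 2 km downwind) ~ {c * 1e6:.0f} ug/m^3")

Even in this crude form you can see the lever the transcript describes: raising the release height lowers the ground-level concentration downwind, which is exactly what a taller stack buys.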
A conversation about EHRs, who their customers actually are, and building apps
In the movie Blade Runner 2049 (an excellent film I highly recommend), Ryan Gosling’s character, K, has an AI “wife”, Joi, played by Ana de Armas. K is clearly in love with Joi, who is nothing but software and holograms. In one poignant scene, K is viewing a giant ad for AI companions and sees […] The post AI Therapists first appeared on NeuroLogica Blog.
Why are buildings today austere, while buildings of the past were ornate and elaborately ornamented?
The lush forests that have long sustained Cambodia’s Indigenous people have steadily fallen to illicit logging. Now, community members face intimidation and risk arrest as they patrol their forests to document the losses and try to push the government to stop the cutting. Read more on E360 →