(A post summarizing recent US science-related events will be coming later.  For now, here is my promised post about multiferroics, inspired in part by a recent visit to Rice by Yoshi Tokura.) Electrons carry spins and therefore magnetic moments (that is, they can act in some ways like little bar magnets), and as I was teaching undergrads this past week, under certain conditions some of the electrons in a material can spontaneously develop long-range magnetic order.  That is, rather than being, on average, randomly oriented, below some critical temperature the spins take on a pattern that repeats throughout the material.  In the ordered state, if you know the arrangement of spins in one (magnetic) unit cell of the material, that pattern is repeated over many (perhaps all, if the system is a single domain) of the unit cells.  In picking out this pattern, the overall symmetry of the material is lowered compared to the non-ordered state.  (There can be local moment magnets, when the...
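One compact way to write down such an ordering pattern (my shorthand, not the post's): describe the thermally averaged spin on lattice site \(i\) at position \(\mathbf{r}_i\) by an ordering wavevector \(\mathbf{Q}\),

\[ \langle \mathbf{S}_i \rangle = \mathbf{m}\,\cos(\mathbf{Q}\cdot\mathbf{r}_i) \quad \text{for } T < T_c, \qquad \langle \mathbf{S}_i \rangle = \mathbf{0} \quad \text{for } T > T_c. \]

Here \(\mathbf{Q} = 0\) gives a ferromagnet (all moments aligned), while a zone-boundary \(\mathbf{Q}\) gives a simple antiferromagnet (moments alternating from cell to cell); either choice repeats from unit cell to unit cell and lowers the symmetry relative to the disordered state, as described above.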
a month ago

More from nanoscale views

How badly has NSF funding already been effectively cut?

This NY Times feature lets you see how each piece of NSF's funding has been reduced this year relative to its normalized average over the last decade.  Note: this fiscal year, thanks to the continuing resolution, the agency budget has not actually been cut like this; the agency is just not spending congressionally appropriated funds.  The agency, fearing/assuming that its budget will get hammered next fiscal year, does not want to start awards that it won't be able to fund in out-years. The result is that this is effectively obeying in advance the presidential budget request for FY26.  (And it's highly likely that some will point to unspent funds later in the year and use that as a justification for cuts, when in fact it's anticipation of possible cuts that has led to unspent funds.  I'm sure the Germans have a polysyllabic word for this.  In English, "Catch-22" is close.)

I encourage you to click the link and go to the article, where the graphic is interactive (if it works in your location - I'm not sure whether the link works internationally).  The different colored regions correspond approximately to the NSF directorates (in their old organizational structure), and each subsection is a particular program.  Seems like whoever designed the graphic was a fan of Tufte, and the scaling of the shaded areas does quantitatively reflect funding changes.  However, most people have a tough time estimating relative areas of irregular polygons.

Award funding in physics (the left-most section of the middle region) is down 85% relative to past years.  Math is down 72%.  Chemistry is down 57%.  Materials is down 63%.  Earth sciences is down 80%.  Polar programs (you know, those folks who run all the amazing experiments in Antarctica) are down 88%.

I know my readers are likely tired of me harping on NSF, but it's both important and a comparatively transparent example of what is also happening at other agencies.  If you are a US citizen and think that this is the wrong path, then push on your congressional delegation about the upcoming budget.

yesterday 2 votes
A science anecdote palate cleanser

Apologies for slow posting.  Real life has been very intense, and I also was rather concerned when one of my readers mentioned last weekend that these days my blog was like concentrated doom-scrolling.  I will have more to say about the present university research crisis later, but first I wanted to give a hopefully diverting example of the kind of problem-solving and following-your-nose that crops up in research.

Recently in my lab we have had a need to measure very small changes in the electrical resistance of some devices, at the level of a few milliOhms out of kiloOhms - parts in \(10^6\).  One of my students put together a special kind of resistance bridge to do this, and it works very well.  Note to interested readers: if you want to do this, make sure that you use components with very low temperature coefficients of their properties (e.g., resistors with a very small \(dR/dT\)), because otherwise your bridge becomes an extremely effective thermometer for your lab.  It's kind of cool to be able to see the lab temperature drift around by milliKelvins, but it's not great for measuring your sample of interest.

There are a few ways to measure resistance.  The simplest is the two-terminal approach, where you drive currents through and measure voltages across your device with the same two wires.  This is easy, but it means that the voltage you measure includes contributions from the contacts those wires make with the device.  A better alternative is the four-terminal method, where one pair of wires supplies/collects the current and a separate pair measures the voltage, so that the contact resistances drop out of the measurement.

Anyway, in the course of doing some measurements of a particular device's resistance as a function of magnetic field at low temperatures, we saw something weird.  Below some rather low temperature, when we measured in a 2-terminal arrangement, we saw a jump up in resistance by around 20 milliOhms (out of a couple of kOhms) as the magnetic field was swept up from zero, and a small amount of resistance hysteresis with magnetic field sweep that vanished above maybe 0.25 T.  This vanished completely in a 4-terminal arrangement, and also disappeared above about 3.4 K.  What was this?  Turns out that I think we accidentally rediscovered the superconducting transition in indium.  While our contact pads on our sample mount looked clean to the unaided eye, they had previously had indium on them.  The magic temperature is very close to the bulk \(T_{c}\) for indium.

For one post, rather than dwelling on the terrible news about the US science ecosystem, does anyone out there have other, similar fun experimental anecdotes?  Glitches that turned out to be something surprising?  Please share in the comments.
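To put a rough number on the tempco warning above, here is a back-of-the-envelope sketch (mine, not from the post; the bridge resistance, tempco, and drift values are assumed illustrative numbers):

```python
# Back-of-the-envelope: apparent resistance shift in a Wheatstone-type bridge
# caused by the temperature coefficient of the bridge resistors.
# All component values and tempcos below are illustrative assumptions,
# not numbers from the post.

R = 2.0e3                 # bridge arm resistance, ohms (sample is "a couple of kOhms")
tempco_ppm_per_K = 10.0   # assumed resistor tempco, ppm/K (precision parts are ~0.2-25 ppm/K)
dT = 5e-3                 # assumed lab temperature drift, kelvin (a few milliKelvin)

# Fractional change of one bridge arm due to the drift:
dR_over_R = tempco_ppm_per_K * 1e-6 * dT
dR = dR_over_R * R

print(f"fractional drift dR/R = {dR_over_R:.2e}")    # ~5e-8 for these numbers
print(f"apparent shift dR     = {dR*1e3:.3f} mOhm")  # ~0.1 mOhm out of 2 kOhm

# Compare with the signal of interest: a few mOhm out of a few kOhm,
# i.e. parts in 1e6.  With an ordinary ~100 ppm/K resistor and a few tens
# of mK of drift, the thermal artifact itself reaches parts in 1e6 and
# competes with the signal, which is why low-tempco components matter.
```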

4 days ago 7 votes
Updates, thoughts about industrial support of university research

Lots of news in the last few days regarding federal funding of university research:

NSF has now frozen all funding for new and continuing awards.  This is not good; just how bad it is depends on the definition of "until further notice".  Here is an open letter from the NSF employees union to the basically-silent-so-far National Science Board, asking for the NSB to support the agency.  Here is a grass roots SaveNSF website with good information and suggestions for action - please take a look.  NSF also wants to cap indirect cost rates at 15% for higher ed institutions for new awards.  This will almost certainly generate a lawsuit from the AAU and others.

Speaking of the AAU, last week there was a hearing in the Massachusetts district court regarding the lawsuits about the DOE setting indirect cost rates to 15% for active and new awards.  There had already been a temporary restraining order in place nominally stopping the change; the hearing resulted in that order being extended "until a further order is issued resolving the request for a temporary injunction."  (See here, the entry for April 29.)

In the meantime, the presidential budget request has come out, and if enacted it would be devastating to the science agencies.  Proposed cuts include 55% to NSF, 40% to NIH, 33% to USGS, 25% to NOAA, etc.  If these cuts went through, we would be talking about more than $35B, at a rough eyeball estimate.  And here is a letter from former NSF directors and NSB chairs to the appropriators in Congress, asking them to ignore that budget request and continue to support government-sponsored science and engineering research.

Unsurprisingly, during these times there is a lot of talk about the need for universities to diversify their research portfolios - that is, expanding non-federally-supported ways to continue generating new knowledge, training the next generation of the technically literate workforce, and producing IP and entrepreneurial startup companies.  (Let's take it as read that it would be economically and societally desirable to continue these things, for the purposes of this post.)

Philanthropy is great, and foundations do fantastic work in supporting university research, but philanthropy can't come close to making up for sharp drawdowns of federal support.  The numbers just don't work.  The endowment of the Moore Foundation, for example, is around $10B, implying an annual payout of $500M or so, which is great but only around 1.4% of the cuts being envisioned.

Industry seems like the only non-governmental possibility that could in principle muster resources on a scale that could make a large difference.  Consider the estimated profits (not revenues) of different industrial sectors.  The US semiconductor market had revenues last year of around $500B with an annualized net margin of around 17%, giving $85B/yr of profit.  US aerospace and defense similarly have an annual profit of around $70B.  The financial/banking sector, which has historically benefitted greatly from PhD-trained quants, has an annual net income of $250B.  I haven't even listed numbers for the energy and medical sectors, because those are challenging to parse (but large).  All of those industries have been helped greatly by university research, directly and indirectly.  It's the source of trained people.  It's the source of initial work that is too long-term for corporations to support without short-time-horizon shareholders getting annoyed.  It's the source of many startup companies that sometimes grow and other times get gobbled up by bigger fish.

Encouraging greater industrial sponsorship of university research is a key challenge.  The value proposition must be made clear to both the companies and the universities.  The market is unforgiving and exerts pressure to worry about the short term rather than the long term.  Given how Congress is functioning, it does not look like there are going to be changes to the tax code put in place that could incentivize long-term investment.  Cracking this and meaningfully growing the scale of industrial support for university research could be enormously impactful.  Something to ponder.
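As a quick sanity check on the scale argument, here is my own arithmetic using the figures quoted above; the ~5% annual foundation payout rate is an assumption, not a number from the post:

```python
# Back-of-the-envelope scale comparison using the figures quoted above.
# The ~5% payout rate is an assumption, not a number from the post.

proposed_cuts = 35e9            # rough total of proposed agency cuts, $/yr

moore_endowment = 10e9          # Moore Foundation endowment, $
payout_rate = 0.05              # assumed annual payout fraction
moore_payout = moore_endowment * payout_rate
print(f"Moore-scale payout: ${moore_payout/1e9:.1f}B/yr "
      f"= {100*moore_payout/proposed_cuts:.1f}% of the proposed cuts")

# Sector profits quoted in the post, $/yr
semiconductor_profit = 500e9 * 0.17   # ~$500B revenue at ~17% net margin
aerospace_defense_profit = 70e9
finance_net_income = 250e9
total = semiconductor_profit + aerospace_defense_profit + finance_net_income
print(f"Combined profits of the three sectors: ${total/1e9:.0f}B/yr, "
      f"vs. ~${proposed_cuts/1e9:.0f}B/yr of proposed cuts")
```

For these numbers the foundation payout is roughly 1.4% of the proposed cuts, while the three sectors' combined profits are an order of magnitude larger than the cuts, which is the point about where non-governmental capacity could plausibly come from.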

2 weeks ago 4 votes
NSF, quo vadis?

There is a lot going on.  Today, some words about NSF.

Yesterday Sethuraman Panchanathan, the director of the National Science Foundation, resigned 16 months before the end of his six-year term.  The relevant Science article raises the possibility that this is because, as an executive branch appointee, he would effectively have to endorse the upcoming presidential budget request, which is rumored to be a 55% cut to the agency budget (from around $9B/yr to $4B/yr) and a 50% reduction in agency staffing.  (Note:  actual appropriations are set by Congress, which has ignored presidential budget requests in the past.)  This comes at the end of a week when all new awards were halted at the agency while non-agency personnel conducted "a second review" of all grants, and many active grants have been terminated.  Bear in mind, awards this year from NSF are already down 50% from last year, even without official budget cuts.  Update:  Here is Nature's reporting from earlier today.

The NSF has been absolutely critical to a long list of scientific and technological advances over the last 70 years (see here while it's still up).  As mentioned previously, government support of basic research has a great return on investment for the national economy, and it's a tiny fraction of government spending.  Less than three years ago, the CHIPS & Science Act was passed with supposed bipartisan support in Congress, authorizing the doubling of the NSF budget.  Last summer I posted in frustration that this support seemed to be an illusion when it came to actual funding.

People can have disagreements about the "right" level of government support for science in times of fiscal challenges, but as far as I can tell, no one (including and especially Congress so far) voted for the dismantling of the NSF.  If you think the present trajectory is wrong, contact your legislators and make your voices heard.

4 weeks ago 5 votes
A Grand Bargain and its chaotic dissolution

After World War II, under the influence (direct and indirect) of people like Vannevar Bush, a "grand bargain" was effectively struck between the US government and the nation's universities.  The war had demonstrated how important science and engineering research could be, through the Manhattan Project and the development of radar, among other things.  University researchers had effectively and sometimes literally been conscripted into the war effort.  In the postwar period, with more citizens than ever going to college because of the GI Bill, universities went through a period of rapid growth, and the government began funding research at universities on a large scale.

This was a way of accomplishing multiple goals.  This funding got hundreds of scientists and engineers to work on projects that advisors and the academic community itself (through peer review) thought would be important but perhaps were of such long-term or indirect economic impact that industry would be unlikely to support them.  It trained the next generation of researchers and of the technically skilled workforce.  It accomplished this as a complement to national laboratories and direct federal agency work.

After Sputnik, there was an enormous ramp-up of investment.  This figure (see here for an interactive version) shows different contributions to investment in research and development in the US from 1953 through 2021:

Figure from NSF report on US R&D investment

A couple of days ago, the New York Times published a related figure, showing the growth in dollars of total federal funds sent to US universities, but I think this is a more meaningful graph (hat tip to Prof. Elizabeth Popp Berman at Michigan for her discussion of this).  In 2021, federal investment in research (the large majority of which is happening at universities) as a percentage of GDP was at its lowest level since 1953, and it was sinking further even before this year (for those worried about US competitiveness....  Also, industry does a lot more D than they do long-term R.).

There are many studies by economists showing that federal investment in research has a large return (for example, here is one by the Federal Reserve Bank of Dallas saying that returns to the US economy on federal research expenditures are between 150% and 300%).  Remember, these funds are not just given to universities - they are in the form of grants and contracts, for which specific work is done and reported.  These investments also helped make US higher education the envy of much of the world and made the education of international students a tremendously effective export business for the country.

Of course, like any system created organically by people, there are problems.  Universities are complicated and full of (ugh) academics.  Higher education is too expensive.  Compliance bureaucracy can be onerous.  Any deliberative process like peer review trades efficiency for collective expertise but also the hazards of group-think.  At the same time, the relationship between federally sponsored research and universities has led to an enormous amount of economic, technological, and medical benefit over the last 70 years.

Right now it looks like this whole apparatus is being radically altered, if not dismantled in part or in whole.  Moreover, this is not happening as a result of a debate or discussion about the proper role and scale of federal spending at universities, or an in-depth look at the flaws and benefits of the historically developed research ecosystem.  It's happening because "elections have consequences", and I'd be willing to bet that very very few people in the electorate cast their votes even secondarily because of this topic.  Sincere people can have differing opinions about these issues, but decisions of such consequence and magnitude should not be taken lightly or incidentally.

(I am turning off comments on this one b/c I don't have time right now to pay close attention.  Take it as read that some people would comment that US spending must be cut back and that this is a consequence.)

a month ago 16 votes

More in science

Animals as chemical factories

The price of purple

2 hours ago 1 vote
32 Bits That Changed Microprocessor Design

In the late 1970s, a time when 8-bit processors were state of the art and CMOS was the underdog of semiconductor technology, engineers at AT&T’s Bell Labs took a bold leap into the future. They made a high-stakes bet to outpace IBM, Intel, and other competitors in chip performance by combining cutting-edge 3.5-micron CMOS fabrication with a novel 32-bit processor architecture. Although their creation—the Bellmac-32 microprocessor—never achieved the commercial fame of earlier ones such as Intel’s 4004 (released in 1971), its influence has proven far more enduring. Virtually every chip in smartphones, laptops, and tablets today relies on the complementary metal-oxide-semiconductor principles that the Bellmac-32 pioneered.

As the 1980s approached, AT&T was grappling with transformation. For decades, the telecom giant—nicknamed “Ma Bell”—had dominated American voice communications, with its Western Electric subsidiary manufacturing nearly every telephone found in U.S. homes and offices. The U.S. federal government was pressing for antitrust-driven divestiture, but AT&T was granted an opening to expand into computing. With computing firms already entrenched in the market, AT&T couldn’t afford to play catch-up; its strategy was to leap ahead, and the Bellmac-32 was its springboard.

The Bellmac-32 chip series has now been honored with an IEEE Milestone. Dedication ceremonies are slated to be held this year at the Nokia Bell Labs campus in Murray Hill, N.J., and at the Computer History Museum in Mountain View, Calif.

A chip like no other

Rather than emulate the industry standard of 8-bit chips, AT&T executives challenged their Bell Labs engineers to deliver something revolutionary: the first commercially viable microprocessor capable of moving 32 bits in one clock cycle. It would require not just a new chip but also an entirely novel architecture—one that could handle telecommunications switching and serve as the backbone for future computing systems. “We weren’t just building a faster chip,” says Michael Condry, who led the architecture team at Bell Labs’ Holmdel facility in New Jersey. “We were trying to design something that could carry both voice and computation into the future.”

This configuration of the Bellmac-32 microprocessor had an integrated memory management unit optimized for Unix-like operating systems. AT&T Archives and History Center

At the time, CMOS technology was seen as a promising—but risky—alternative to the NMOS and PMOS designs then in use. NMOS chips, which relied solely on N-type transistors, were fast but power-hungry. PMOS chips, which depend on the movement of positively charged holes, were too slow. CMOS, with its hybrid design, offered the potential for both speed and energy savings. The benefits were so compelling that the industry soon saw that doubling the number of transistors (an NMOS and a PMOS device for each gate) was worth the tradeoff. As transistor sizes shrank along with the rapid advancement of semiconductor technology described by Moore’s Law, the cost of the doubled transistor count soon became manageable and eventually negligible. But when Bell Labs took its high-stakes gamble, large-scale CMOS fabrication was still unproven and looked to be comparatively costly.

That didn’t deter Bell Labs. By tapping expertise from its campuses in Holmdel and Murray Hill as well as in Naperville, Ill., the company assembled a dream team of semiconductor engineers. The team included Condry; Sung-Mo “Steve” Kang, a rising star in chip design; Victor Huang, another microprocessor chip designer; and dozens of AT&T Bell Labs employees. They set out in 1978 to master a new CMOS process and create a 32-bit microprocessor from scratch.

Designing the architecture

The architecture group led by Condry, an IEEE Life Fellow who would later become Intel’s CTO, focused on building a system that would natively support the Unix operating system and the C programming language. Both were in their infancy but destined for dominance. To cope with the era’s memory limitations—kilobytes were precious—they introduced a complex instruction set that required fewer steps to carry out and could be executed in a single clock cycle.

The engineers also built the chip to support the VersaModule Eurocard (VME) parallel bus, enabling distributed computing so several nodes could handle data processing in parallel. Making the chip VME-enabled also allowed it to be used for real-time control. The group wrote its own version of Unix, with real-time capabilities to ensure that the new chip design was compatible with industrial automation and similar applications.

The Bell Labs engineers also invented domino logic, which ramped up processing speed by reducing delays in complex logic gates. Additional testing and verification techniques were developed and introduced via the Bellmac-32 Module, a sophisticated multi-chipset verification and testing project led by Huang that allowed the complex chip fabrication to have zero or near-zero errors. This was the first of its kind in VLSI testing. The Bell Labs engineers’ systematic plan for double- and triple-checking their colleagues’ work ultimately made the total design of the multiple-chipset family work together seamlessly as a complete microcomputer system.

Then came the hardest part: actually building the chip.

Floor maps and colored pencils

“The technology for layout, testing, and high-yield fabrication just wasn’t there,” recalls Kang, an IEEE Life Fellow who later became president of the Korea Advanced Institute of Science and Technology (KAIST) in Daejeon, South Korea. With no CAD tools available for full-chip verification, Kang says, the team resorted to printing oversize Calcomp plots. The schematics showed how the transistors, circuit lines, and interconnects should be arranged inside the chip to provide the desired outputs. The team assembled them on the floor with adhesive tape to create a massive square map more than 6 meters on a side. Kang and his colleagues traced every circuit by hand with colored pencils, searching for breaks, overlaps, or mishandled interconnects.

Getting it made

Once the physical design was locked in, the team faced another obstacle: manufacturing. The chips were fabricated at a Western Electric facility in Allentown, Pa., but Kang recalls that the yield rates (the percentage of chips on a silicon wafer that meet performance and quality standards) were dismal. To address that, Kang and his colleagues drove from New Jersey to the plant each day, rolled up their sleeves, and did whatever it took, including sweeping floors and calibrating test equipment, to build camaraderie and instill confidence that the most complicated product the plant workers had ever attempted to produce could indeed be made there. “The team-building worked out well,” Kang says. “After several months, Western Electric was able to produce more than the required number of good chips.”

The first version of the Bellmac-32, which was ready by 1980, fell short of expectations. Instead of hitting a 4-megahertz performance target, it ran at just 2 MHz. The engineers discovered that the state-of-the-art Takeda Riken testing equipment they were using was flawed, with transmission-line effects between the probe and the test head leading to inaccurate measurements, so they worked with a Takeda Riken team to develop correction tables that rectified the measurement errors. The second generation of Bellmac chips had clock speeds that exceeded 6.2 MHz, sometimes reaching 9 MHz. That was blazing fast for its time: the 16-bit Intel 8088 processor inside IBM’s original PC, released in 1981, ran at 4.77 MHz.

Why Bellmac-32 didn’t go mainstream

Despite its technical promise, the Bellmac-32 did not find wide commercial use. According to Condry, AT&T’s pivot toward acquiring equipment manufacturer NCR, which it began eyeing in the late 1980s, meant the company chose to back a different line of chips. But by then, the Bellmac-32’s legacy was already growing. “Before Bellmac-32, NMOS was dominant,” Condry says. “But CMOS changed the market because it was shown to be a more effective implementation in the fab.”

In time, that realization reshaped the semiconductor landscape. CMOS would become the foundation for modern microprocessors, powering the digital revolution in desktops, smartphones, and more. The audacity of Bell Labs’ bet—to take an untested fabrication process and leapfrog an entire generation of chip architecture—stands as a landmark moment in technological history. As Kang puts it: “We were on the frontier of what was possible. We didn’t just follow the path—we made a new one.” Huang, an IEEE Life Fellow who later became deputy director of the Institute of Microelectronics, Singapore, adds: “This included not only chip architecture and design, but also large-scale chip verification—with CAD but without today’s digital simulation tools or even breadboarding [which is the standard method for checking whether a circuit design for an electronic system that uses chips works before making permanent connections by soldering the circuit elements together].”

Condry, Kang, and Huang look back fondly on that period and express their admiration for the many AT&T employees whose skill and dedication made the Bellmac-32 chip series possible.

Administered by the IEEE History Center and supported by donors, the Milestone program recognizes outstanding technical developments around the world. The IEEE North Jersey Section sponsored the nomination.
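To illustrate the power argument in rough numbers, here is my own illustrative sketch (the gate count, currents, and supply voltage are assumptions, not figures from the article):

```python
# Rough illustration of why CMOS's extra transistors paid off:
# an NMOS gate with a depletion-load pull-up draws static current whenever
# its output is pulled low, while a CMOS gate has no static path from
# supply to ground (only leakage and switching current).
# All numbers below are illustrative assumptions, not from the article.

supply_voltage = 5.0         # volts, typical for early-1980s logic
static_current_nmos = 50e-6  # amps per gate when the output is low (assumed)
gates = 100_000              # assumed gate count, order of magnitude only
fraction_low = 0.5           # assume half the gates sit with output low

p_nmos = gates * fraction_low * static_current_nmos * supply_voltage
p_cmos_static = gates * 1e-9 * supply_voltage  # ~nA-level leakage per gate (assumed)

print(f"NMOS-style static power: {p_nmos:.1f} W")              # ~12.5 W for these numbers
print(f"CMOS static power:       {p_cmos_static*1e3:.2f} mW")  # ~0.5 mW

# CMOS still burns dynamic power (roughly C * V^2 * f per switching node),
# but the near-zero static draw is what made the doubled transistor count
# a worthwhile tradeoff as feature sizes shrank.
```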

yesterday 4 votes
In Test, A.I. Weather Model Fails to Predict Freak Storm

Artificial intelligence is powering weather forecasts that are generally more accurate than conventional forecasts and are faster and cheaper to produce. But new research shows A.I. may fail to predict unprecedented weather events, a troubling finding as warming fuels new extremes. Read more on E360 →

yesterday 2 votes
Graduate Student Solves Classic Problem About the Limits of Addition

A new proof illuminates the hidden patterns that emerge when addition becomes impossible. The post Graduate Student Solves Classic Problem About the Limits of Addition first appeared on Quanta Magazine

yesterday 2 votes
My very busy week

I’m not sure who scheduled ODSC and PyConUS during the same week, but I am unhappy with their decisions. Last Tuesday I presented a talk and co-presented a workshop at ODSC, and on Thursday I presented a tutorial at PyCon. If you would like to follow along with my very busy week, here are the resources: Practical Bayesian Modeling with PyMC, co-presented with Alex Fengler for ODSC East 2025. In this tutorial, we explore Bayesian regression using PyMC – the... Read More The post My very busy week appeared first on Probably Overthinking It.
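For readers who have not used PyMC, here is a minimal Bayesian linear-regression sketch of the general kind such a tutorial covers (a toy example of mine, not material from the actual tutorial; the data and priors are arbitrary):

```python
import numpy as np
import pymc as pm

# Toy data: a noisy line (entirely made up for illustration)
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 1.0 + 2.0 * x + rng.normal(0, 0.3, size=x.size)

with pm.Model() as model:
    # Weakly informative priors (arbitrary choices)
    intercept = pm.Normal("intercept", mu=0, sigma=10)
    slope = pm.Normal("slope", mu=0, sigma=10)
    sigma = pm.HalfNormal("sigma", sigma=1)

    # Likelihood: y ~ Normal(intercept + slope * x, sigma)
    pm.Normal("obs", mu=intercept + slope * x, sigma=sigma, observed=y)

    # Draw posterior samples with the default NUTS sampler
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=0)

# Posterior means should land near the true intercept (1.0) and slope (2.0)
print(idata.posterior[["intercept", "slope"]].mean())
```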

yesterday 5 votes