While I've been absolutely buried under deadlines, it's been a crazy week for US science, and things are unlikely to calm down anytime soon.  As I've written before, I largely try to keep my political views off here, since that's not what people want to read from me, and I want to keep the focus on the amazing physics of materials and nanoscale systems.  (Come on, this is just cool - using light to dynamically change the chirality of crystals?  That's really nifty.)    Still, it's hard to be silent, even just limiting the discussion to science-related issues.  Changes of presidential administrations always carry a certain amount of perturbation, as the heads of many federal agencies are executive branch appointees who serve at the pleasure of the president.  That said, the past week was exceptional for multiple reasons, including pulling the US out of the WHO as everyone frets about H5N1 bird flu; a highly disruptive freeze of activity within HHS (lots of negative consequences even if...
yesterday

More from nanoscale views

This week in the arXiv: quantum geometry, fluid momentum "tunneling", and pasta sauce

Three papers caught my eye on the arXiv at the start of the new year:

arXiv:2501.00098 - J. Yu et al., "Quantum geometry in quantum materials" - I hope to write up something about quantum geometry soon, but I wanted to point out this nice review even if I haven't done my legwork yet.  The ultrabrief point: the single-particle electronic states in crystalline solids may be written as Bloch waves, of the form \(u_{n \mathbf{k}}(\mathbf{r}) \exp(i \mathbf{k} \cdot \mathbf{r})\), where the (crystal) momentum is given by \(\hbar \mathbf{k}\) and \(u_{n \mathbf{k}}\) is a function with the real-space periodicity of the crystal lattice that carries an implicit \(\mathbf{k}\) dependence.  You can get very far in understanding solid-state physics without worrying about this, but it turns out that a number of very important phenomena originate from the oft-neglected \(\mathbf{k}\) dependence of \(u_{n \mathbf{k}}\): the anomalous Hall effect, the (intrinsic) spin Hall effect, the orbital Hall effect, etc.  Basically, the \(\mathbf{k}\) dependence of \(u_{n \mathbf{k}}\), in the form of derivatives, defines an internal "quantum" geometry of the electronic structure (the standard construction is written out after these three summaries).  This review looks at the consequences of quantum geometry for things like superconductivity, magnetic excitations, excitons, and Chern insulators in quantum materials.

[Fig. 1 from arXiv:2501.01253]

arXiv:2501.01253 - B. Coquinot et al., "Momentum tunnelling between nanoscale liquid flows" - In electronic materials there is a phenomenon known as Coulomb drag, in which a current driven through one electronic system (often a 2D electron gas) leads, through Coulomb interactions, to a current in an adjacent but otherwise electrically isolated electronic system (say, another 2D electron gas separated from the first by a few-nm insulating layer).  This paper argues that there should be a similar-in-spirit phenomenon when a polar liquid (like water) flows on one side of a thin membrane (like one- or few-layer graphene, which can support electronic excitations like plasmons): that flow could drive flow of a polar fluid on the other side of the membrane (see figure).  The authors cast this in the language of momentum tunneling across the membrane, but the point is that it's some inelastic scattering process mediated by excitations in the membrane.  Neat idea.

arXiv:2501.00536 - G. Bartolucci et al., "Phase behavior of Cacio and Pepe sauce" - Cacio e pepe is a wonderful Italian pasta dish with a sauce made from pecorino cheese, pepper, and hot pasta cooking water that contains dissolved starch.  When prepared well, it's incredibly creamy, smooth, and satisfying.  The authors perform a systematic study of the sauce properties as a function of temperature and of starch concentration relative to cheese content, finding the part of parameter space to avoid if you don't want the sauce to "break" (condensing out clumps of cheese-rich material and ruining the sauce texture).  That's cool, but what is impressive is that they are actually able to model the phase stability mathematically and come up with a scientifically justified version of the recipe.  Very fun.
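Returning to the quantum geometry review for a moment, here is the standard construction referred to above (a textbook definition, not something specific to this particular paper): the quantum geometric tensor for band \(n\), built from \(\mathbf{k}\)-derivatives of \(|u_{n\mathbf{k}}\rangle\),

\[ Q^{n}_{\mu\nu}(\mathbf{k}) = \left\langle \partial_{k_\mu} u_{n\mathbf{k}} \right| \left( 1 - |u_{n\mathbf{k}}\rangle\langle u_{n\mathbf{k}}| \right) \left| \partial_{k_\nu} u_{n\mathbf{k}} \right\rangle, \qquad g^{n}_{\mu\nu} = \mathrm{Re}\, Q^{n}_{\mu\nu}, \qquad \Omega^{n}_{\mu\nu} = -2\,\mathrm{Im}\, Q^{n}_{\mu\nu}. \]

Its real part \(g^{n}_{\mu\nu}\) is the quantum metric, and its imaginary part gives the Berry curvature \(\Omega^{n}_{\mu\nu}\); the latter is what enters the anomalous and spin Hall responses mentioned above, while the quantum metric is the piece implicated in, for example, the superfluid weight of flat-band superconductors.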

3 weeks ago 34 votes
End of the year thoughts - scientific philanthropy and impact

As we head into 2025, with the prospects for increased (US) government investment in science, engineering, and STEM education looking very limited, I wanted to revisit a topic that I wrote about over a decade ago (!!!): the role of philanthropy and foundations in these areas.  Personally, I think the case for government support of scientific research and education is overwhelmingly clear; while companies depend on having an educated technical workforce (at least for now) and on continually improving technology, they are under great short-term financial pressure, and genuinely long-term investment in research is rare.  Foundations are not a substitute for nation-state levels of support, but they are a critical component of the research and education landscape.

[Figure: annual citations of the EPR paper, from Web of Science; a case study in the long-term impact of some "pure" scientific research, and a source of hope to practicing scientists that surely our groundbreaking work will be appreciated sixty years after publication.]

A key question I've wondered about for a long time is how to properly judge the impact that research-supporting foundations are making.  The Science Philanthropy Alliance is a great organization that considers these issues deeply.  The nature of long-term research is that it often takes a long time for its true impact (I don't mean just citation counts, but those are an indicator of activity) to be felt.  One (admittedly extreme) example is shown here: the citations per year (from Web of Science) of the 1935 Einstein/Podolsky/Rosen paper about entanglement.  (Side note: you have to love the provocative title, "Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?", which from the point of view of the authors at the time satisfies Betteridge's Law of Headlines.)  There are few companies that would be willing to invest in supporting research that won't have its true heyday for five or six decades.

One additional tricky bit is that grants are usually given to people and organizations who are already active.  It's often not simple to point to some clear result or change in output that absolutely would not have happened without foundation support.  This is exacerbated by the fact that grants in science and engineering are often given to people and organizations who are not just active but already very well supported; betting on an odds-on favorite is a low-risk strategy.  Many foundations do think very carefully about what areas to support, because they want to "move the needle".  For example, some scientific foundations are consciously reluctant to support closer-to-clinical-stage cancer research, since the total annual investment by governments and pharmaceutical companies in that area runs to many billions of dollars, and a modest foundation contribution would be a tiny delta on top of that.

Here is a list of the wealthiest charitable foundations (only a few of which support scientific research and/or education) and their endowments.  Nearly all of the science-related ones are also plugged in here.  A rough estimate of annual expenditures from endowed entities is about 5% of their holdings.  Recently I've come to think about private universities as one crude comparator for impact.  If a foundation has the same size endowment as a prestigious research university, I think it's worth thinking about the relative downstream impacts of those entities.  (The Novo Nordisk Foundation has an endowment three times the size of Harvard's.)
Another comparator would be the annual research expenditures of a relevant funding agency.  The US NSF put forward $234M for major research instrumentation and facilities in FY2024.  A foundation with a $5B endowment could in principle support all of that from endowment returns (at the rough 5% payout rate above, $5B yields about $250M per year).  This lets me make my semiregular pitch about foundation or corporate support for research infrastructure and user facilities around the US.  The entire annual budget for the NSF's NNCI, which supports shared nanofabrication and characterization facilities around the US, is about $16M; at a 5% payout, an endowment of roughly $320M could cover that indefinitely.  That's a niche where comparatively modest foundation (or corporate) support could have serious impact on interdisciplinary research and education across the country.  I'm sure there are other similar areas out there, and I hope someone is thinking about this.

Anyway, thanks to my readers - this is now the 20th year of this blog's existence (!!! again), and I hope to be able to keep it up well in the new year.

3 weeks ago 36 votes
Technological civilization and losing object permanence

In the grand tradition of physicists writing about areas outside their expertise, I wanted to put down some thoughts on a societal trend.  This isn't physics or nanoscience, so feel free to skip this post.

Object permanence is a term from developmental psychology.  A person (or animal) has object permanence if they understand that something still exists even if they can't directly see it or interact with it in the moment.  If a kid realizes that their toy still exists even though they can't see it right now, they've got the concept.

I'm wondering if modern technological civilization has an issue with an analog of object permanence.  Let me explain what I mean and why it's a serious problem, and then end on a hopeful note by pointing out that even if this is the case, we have the tools needed to overcome it.

By the standards of basically any previous era, a substantial fraction of humanity lives in a golden age.  We have a technologically advanced, globe-spanning civilization.  A lot of people (though geographically very unevenly distributed) have grown up with comparatively clean water; comparatively plentiful food available through means other than subsistence agriculture; electricity; and access to radio, television, and, for the last couple of decades, nearly instant communications and a big fraction of the sum total of human factual knowledge.

Whether it's just human nature or a consequence of relative prosperity, there seems to be some timescale, on the order of a few decades, over which a non-negligible fraction of even the most fortunate seem to forget the hard lessons that got us to this point.  If they haven't seen something with their own eyes or experienced it directly, they decide it must not be a real issue.  I'm not talking about Holocaust deniers or conspiracy theorists who think the moon landings were fake.  There are a bunch of privileged people who have never personally known a time when tens of thousands of their neighbors died from childhood disease (you know, like 75 years ago, when 21,000 Americans were paralyzed every year from polio (!), proportionately like 50,000 today), who now think we should get rid of vaccines, and maybe germs aren't real.  Most people alive today were not alive the last time nuclear weapons were used, so some of them argue that nuclear weapons really aren't that bad (e.g., setting off 2000 one-megaton bombs spread across the US would directly destroy less than 5% of the land area, so we're good, right?).  Or: we haven't had massive bank runs in the US since the 1930s, so some people now think that insuring bank deposits is a waste of resources and should stop.  I'll stop the list here, before veering into even more politically fraught territory.  I think you get my point, though - somehow chunks of modern society seem to develop collective amnesia, as if problems that we've never personally witnessed must have been overblown before or don't exist at all.  (Interestingly, this does not seem to happen for most technological problems.  You don't see many people saying, you know, maybe building fires weren't that big a deal, let's go back to the good old days before smoke alarms and sprinklers.)

While the internet has downsides, including the ability to spread disinformation very effectively, all the available and stored knowledge also has an enormous benefit: it should make it much harder than ever before for people to collectively forget the achievements of our species.
Sanitation, pasteurization, antibiotics, vaccinations - these are absolutely astonishing technical capabilities that were hard-won and have saved many millions of lives.  It's unconscionable that we are literally risking mass death by voluntarily forgetting or ignoring that.  Nuclear weapons are, in fact, terrible.  Insuring bank deposits, with proper supervision of risk, is a key factor that has helped stabilize economies for the last century.  We need to remember historical problems and their solutions, and make sure that the people setting policy are educated about these things.

They say that those who cannot remember the past are doomed to repeat it.  As we look toward the new year, I hope that those who are familiar with the hard-earned lessons of history are able to make themselves heard over the part of the populace that simply doesn't believe that old problems were real and could return.

a month ago 47 votes
Items for discussion, including google's latest quantum computing result

As we head toward the end of the calendar year, a few items:

Google published a new result in Nature a few days ago.  This made a big news splash, including this accompanying press piece from Google themselves, this nice article in Quanta, and the always thoughtful blog post by Scott Aaronson.  The short version: physical qubits as made today in the superconducting platform favored by Google don't have the low error rates that you'd really like if you want to run general quantum algorithms on a quantum computer, which could easily require millions of steps.  The hope of the community is to get around this using quantum error correction, where some number of physical qubits are used to function as one "logical" qubit.  If physical qubit error rates are sufficiently low, and these errors can be corrected with enough efficacy, the logical qubits can function better than the physical qubits, ideally being able to undergo sequential operations indefinitely without degradation of their information.

One technique for this is called a surface code.  Google has implemented this in their most recent 105-physical-qubit chip ("Willow"), and they seem to have crossed a huge threshold: when they increase the size of their correction scheme (going from a 3 \(\times\) 3 array of physical qubits to 5 \(\times\) 5 to 7 \(\times\) 7), the error rates of the resulting logical qubits fall as hoped (a generic back-of-envelope version of this scaling is sketched at the end of this post).  This is a big deal, as it implies that larger chips, if they could be implemented, should scale toward the desired performance.  This does not mean that general-purpose quantum computers are just around the corner, but it's very encouraging.  There are many severe engineering challenges still in place.  For example, the present superconducting qubits must be tweaked and tuned.  The reason Google only has 105 of them on the Willow chip is not that they can't fit more; it's that they have to have the wires and control capacity to tune and run them.  A few thousand really good logical qubits would be needed to break RSA encryption, and there is no practical way to put millions of wires down a dilution refrigerator.  Rather, one will need cryogenic control electronics.

On a closely related point, Google's article talks about how it would take a classical computer ten septillion years to do what its Willow chip can do.  This is based on a very particular choice of problem (as I mentioned here five years ago) called random circuit sampling: looking at the statistical properties of the outcome of applying random gate sequences to a quantum computer.  From what I can tell, this is very different from what most people mean when they think of a problem to benchmark a quantum computer's advantage over a classical computer.  I suspect the typical tech-literate person considering quantum computing wants to know: if I ask a quantum computer and a classical computer to factor huge numbers or do some optimization problem, how much faster is the quantum computer for a given size of problem?  Random circuit sampling feels instead much more to me like comparing an experiment to a classical theory calculation.  For a purely classical analog, consider putting an airfoil in a wind tunnel and measuring turbulent flow, and comparing with a computational fluid dynamics calculation.  Yes, the wind tunnel can get you an answer very quickly, but it's not "doing" a calculation, from my perspective.
This doesn't mean random circuit sampling is a poor benchmark, just that people should understand it's rather different from the kind of quantum/classical comparison they may envision.

On one unrelated note: thanks to a timely inquiry from a reader, I have now added a search bar to the top of the blog.  (Just in time to capture the final decline of science blogging?)

On a second unrelated note: I'd be curious to hear from my academic readers about how they are approaching generative AI, both on the instructional side (e.g., should we abandon traditional assignments and take-home exams?  How do we check to see whether students are really learning vs. becoming dependent on tools that have dubious reliability?) and on the research side (e.g., what level of generative AI tool use is acceptable in paper or proposal writing?  What aspects of these tools are proving genuinely useful to PIs?  To students?  Clearly generative AI's ability to help with coding is very nice indeed!)
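For readers who want a rough feel for what "crossing the threshold" buys you, here is the generic back-of-envelope scaling mentioned above (a standard surface-code estimate, not numbers from the Willow paper).  Below threshold, the logical error rate per cycle for a distance-\(d\) code (a \(d \times d\) array of data qubits) is expected to fall roughly as

\[ \epsilon_d \approx A \left( \frac{p}{p_{\mathrm{th}}} \right)^{(d+1)/2}, \]

where \(p\) is the physical error rate and \(p_{\mathrm{th}}\) is the threshold.  Each step from \(d\) to \(d+2\) (i.e., 3 \(\times\) 3 to 5 \(\times\) 5 to 7 \(\times\) 7) then divides the logical error rate by a roughly constant factor \(\Lambda \approx p_{\mathrm{th}}/p\).  If \(\Lambda = 2\), for example, going from \(d = 3\) to \(d = 7\) cuts the logical error rate by a factor of 4; above threshold (\(p > p_{\mathrm{th}}\)), the same scaling means that adding more physical qubits makes the logical qubit worse, which is why seeing the error rate actually fall with increasing \(d\) is such a big deal.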

a month ago 54 votes

More in science

New Book-Sorting Algorithm Almost Reaches Perfection

The library sorting problem is used across computer science for organizing far more than just books. A new solution is less than a page-width away from the theoretical ideal. (From Quanta Magazine.)

2 days ago 4 votes
Final: So Where Should We Build Ten New Cities in the US?

Final article in the series

2 days ago 3 votes
Launching Version 14.2 of Wolfram Language & Mathematica: Big Data Meets Computation & AI

The Drumbeat of Releases Continues… · Notebook Assistant Chat inside Any Notebook · Bring Us Your Gigabytes! Introducing Tabular · Manipulating Data in Tabular · Getting Data into Tabular · Cleaning Data for Tabular · The Structure of Tabular · Tabular Everywhere · Algebra with Symbolic Arrays · Language Tune-Ups · Brightening Our Colors; Spiffing Up for 2025 · LLM Streamlining & Streaming · Streamlining Parallel Computation: […]

3 days ago 13 votes
The Jagged, Monstrous Function That Broke Calculus

In the late 19th century, Karl Weierstrass invented a fractal-like function that was decried as nothing less than a “deplorable evil.” In time, it would transform the foundations of mathematics. (From Quanta Magazine.)
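The blurb doesn't show the function itself; for reference (this is the classic example Weierstrass published, not something taken from the Quanta article), one version is

\[ W(x) = \sum_{n=0}^{\infty} a^{n} \cos\!\left( b^{n} \pi x \right), \qquad 0 < a < 1, \]

with \(b\) a positive odd integer satisfying \(ab > 1 + \tfrac{3\pi}{2}\).  The series converges uniformly, so \(W\) is continuous everywhere, yet under these conditions it is differentiable nowhere, which is what earned it its "monstrous" reputation.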

3 days ago 7 votes