Over the past couple of weeks (and more) I have found a number of things to read that I wanted to pass on.  First, if you'd like a break from the seemingly continual stream of bad news in the world and enjoy good "think like a physicist"/dimensional analysis/order-of-magnitude estimate/Fermi problem discussions, I suggest: This paper (jstor link here) by Weisskopf, which does a great job of explaining quite a bit about matter, such as why mountains have the heights they do.  I had previously recommended it back in 2018.  That led me to a collection of Weisskopf's series of articles in the American Journal of Physics from back in the day, all under the name "Search for Simplicity".  Here is a link to a pdf file from the Minot Lab at Oregon State, which also has a bunch of other content along those lines.  The famed Edward Purcell also wrote a number of AJP pieces about similar estimates; a page with links to those is here from the AAPT. The amazingly versatile and articulate Anthony Zee...
a year ago

More from nanoscale views

Turbulent times

While I've been absolutely buried under deadlines, it's been a crazy week for US science, and things are unlikely to calm down anytime soon.  As I've written before, I largely try to keep my political views off here, since that's not what people want to read from me, and I want to keep the focus on the amazing physics of materials and nanoscale systems.  (Come on, this is just cool - using light to dynamically change the chirality of crystals?  That's really nifty.)  Still, it's hard to be silent, even just limiting the discussion to science-related issues.

Changes of presidential administrations always carry a certain amount of perturbation, as the heads of many federal agencies are executive branch appointees who serve at the pleasure of the president.  That said, the past week was exceptional for multiple reasons, including pulling the US out of the WHO as everyone frets about H5N1 bird flu; a highly disruptive freeze of activity within HHS (lots of negative consequences even if it wraps up quickly); and the immediate purging of various agency websites of any programs or language related to DEI, with threatened punishment for employees who fail to report colleagues engaged in any continued DEI-related activities.

Treating other people with respect, trying to make science (and engineering) welcoming to all, and trying to engage and educate the widest possible population in expanding human knowledge should not be controversial positions.  Neither should saying that we should try to broaden the technical workforce, or that medical trials should involve women and multiple races.  What I wrote eight years ago is still true.

It is easier to break things than to build things.  Rash steps very often have lingering unintended consequences.  Panic is not helpful.  Doomscrolling is not helpful.  Getting through challenging times requires determination, focus, and commitment to not losing one's principles.

Ok, enough out of me.  Next week (deadlines permitting) I'll be back with some science, because that's what I do.

a week ago 9 votes
This week in the arXiv: quantum geometry, fluid momentum "tunneling", and pasta sauce

Three papers caught my eye the other day on the arXiv at the start of the new year:

arXiv:2501.00098 - J. Yu et al., "Quantum geometry in quantum materials" - I hope to write up something about quantum geometry soon, but I wanted to point out this nice review even if I haven't done my legwork yet.  The ultrabrief point:  The single-particle electronic states in crystalline solids may be written as Bloch waves, of the form \(u_{n \mathbf{k}}(\mathbf{r}) \exp(i \mathbf{k} \cdot \mathbf{r})\), where the (crystal) momentum is given by \(\hbar \mathbf{k}\) and \(u_{n \mathbf{k}}\) is a function that has the real-space periodicity of the crystal lattice and contains an implicit \(\mathbf{k}\) dependence.  You can get very far in understanding solid-state physics without worrying about this, but it turns out that there are a number of very important phenomena that originate from the oft-neglected \(\mathbf{k}\) dependence of \(u_{n \mathbf{k}}\).  These include the anomalous Hall effect, the (intrinsic) spin Hall effect, the orbital Hall effect, etc.  Basically, the \(\mathbf{k}\) dependence of \(u_{n \mathbf{k}}\), in the form of derivatives, defines an internal "quantum" geometry of the electronic structure (see the toy numerical sketch below).  This review is a look at the consequences of quantum geometry for things like superconductivity, magnetic excitations, excitons, Chern insulators, etc. in quantum materials.

[Fig. 1 from arXiv:2501.01253]

arXiv:2501.01253 - B. Coquinot et al., "Momentum tunnelling between nanoscale liquid flows" - In electronic materials there is a phenomenon known as Coulomb drag, in which a current driven through one electronic system (often a 2D electron gas) leads, through Coulomb interactions, to a current in an adjacent but otherwise electrically isolated electronic system (say, another 2D electron gas separated from the first by a few-nm insulating layer).  This paper argues that there should be a similar-in-spirit phenomenon when a polar liquid (like water) flows on one side of a thin membrane (like one- or few-layer graphene, which can support electronic excitations like plasmons) - that this could drive flow of a polar fluid on the other side of the membrane (see figure).  They cast this in the language of momentum tunneling across the membrane, but the point is that it's some inelastic scattering process mediated by excitations in the membrane.  Neat idea.

arXiv:2501.00536 - G. Bartolucci et al., "Phase behavior of Cacio and Pepe sauce" - Cacio e pepe is a wonderful Italian pasta dish with a sauce made from pecorino cheese, pepper, and hot pasta cooking water that contains dissolved starch.  When prepared well, it's incredibly creamy, smooth, and satisfying.  The authors here perform a systematic study of the sauce properties as a function of temperature and starch concentration relative to cheese content, finding the part of parameter space to avoid if you don't want the sauce to "break" (condensing out clumps of cheese-rich material and ruining the sauce texture).  That's cool, but what is impressive is that they are actually able to model the phase stability mathematically and come up with a scientifically justified version of the recipe.  Very fun.
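Since the idea of geometry hiding in the \(\mathbf{k}\) dependence of \(u_{n \mathbf{k}}\) can feel abstract, here is a toy numerical sketch (my own illustration, assuming a Qi-Wu-Zhang-style two-band model; it is not taken from the review).  It computes the gauge-invariant quantum geometric tensor \(Q_{ij} = \mathrm{Tr}[P\, \partial_{k_i} P\, \partial_{k_j} P]\) of the lower band, whose real part is the quantum metric and whose imaginary part gives the Berry curvature, which in turn integrates to the Chern number.

```python
import numpy as np

# Toy sketch (illustrative only): quantum geometric tensor of the lower band of
# a two-band Qi-Wu-Zhang-style Chern insulator, H(k) = d(k) . sigma (an assumed
# toy model, chosen because it is about the simplest lattice model with
# nontrivial quantum geometry).
#   Q_ij(k) = Tr[ P (dP/dk_i) (dP/dk_j) ],   P(k) = |u_k><u_k|
#   quantum metric  g_ij = Re Q_ij
#   Berry curvature F_ij = -2 Im Q_ij

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def hamiltonian(kx, ky, m=-1.0):
    dx, dy, dz = np.sin(kx), np.sin(ky), m + np.cos(kx) + np.cos(ky)
    return dx * sx + dy * sy + dz * sz

def projector(kx, ky):
    """Projector onto the lower band; gauge invariant, unlike u_k itself."""
    _, vecs = np.linalg.eigh(hamiltonian(kx, ky))
    u = vecs[:, 0]
    return np.outer(u, u.conj())

def qgt_xy(kx, ky, dk=1e-4):
    """Q_xy at one k point via finite differences of the band projector."""
    P = projector(kx, ky)
    dPx = (projector(kx + dk, ky) - projector(kx - dk, ky)) / (2 * dk)
    dPy = (projector(kx, ky + dk) - projector(kx, ky - dk)) / (2 * dk)
    Q = np.trace(P @ dPx @ dPy)
    return Q.real, -2.0 * Q.imag   # (quantum metric g_xy, Berry curvature F_xy)

# Integrating the Berry curvature over the Brillouin zone gives 2*pi * (Chern number);
# for this model and m = -1 the result should come out near an integer (here +/-1).
N = 101
ks = np.linspace(-np.pi, np.pi, N, endpoint=False)
berry = np.array([[qgt_xy(kx, ky)[1] for kx in ks] for ky in ks])
print("Chern number ~", berry.sum() * (2 * np.pi / N) ** 2 / (2 * np.pi))
```

The band projector \(P(\mathbf{k})\) is used instead of the Bloch vectors themselves because it is gauge invariant, so naive finite differences are safe even though the phases returned by the diagonalization routine are arbitrary.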

3 weeks ago 36 votes
End of the year thoughts - scientific philanthropy and impact

As we head into 2025, and the prospects for increased (US) government investment in science, engineering, and STEM education seem very limited, I wanted to revisit a topic that I wrote about over a decade ago (!!!), the role of philanthropy and foundations in these areas.  Personally, I think the case for government support of scientific research and education is overwhelmingly clear; while companies depend on having an educated technical workforce (at least for now) and continually improving technology, they are under great short-term financial pressures, and genuinely long-term investment in research is rare.  Foundations are not a substitute for nation-state levels of support, but they are a critical component of the research and education landscape.

[Figure: Annual citations of the EPR paper from Web of Science - a case study in the long-term impact of some "pure" scientific research, and a source of hope to practicing scientists that surely our groundbreaking work will be appreciated sixty years after publication.]

A key question I've wondered about for a long time is how to properly judge the impact that research-supporting foundations are making.  The Science Philanthropy Alliance is a great organization that considers these issues deeply.  The nature of long-term research is that it often takes a long time for its true impact (I don't mean just citation counts, but those are an indicator of activity) to be felt.  One (admittedly extreme) example is shown here, the citations-per-year (from Web of Science) of the 1935 Einstein/Podolsky/Rosen paper about entanglement.  (Side note:  You have to love the provocative title, "Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?", which from the point of view of the authors at the time satisfies Betteridge's Law of Headlines.)  There are few companies that would be willing to invest in supporting research that won't have its true heyday for five or six decades.

One additional tricky bit is that grants are usually given to people and organizations who are already active.  It's often not simple to point to some clear result or change in output that absolutely would not have happened without foundation support.  This is exacerbated by the fact that grants in science and engineering are often given to people and organizations who are not just active but already very well supported - betting on an odds-on favorite is a low-risk strategy.  Many foundations do think very carefully about which areas to support, because they want to "move the needle".  For example, some scientific foundations are consciously reluctant to support closer-to-clinical-stage cancer research, since the total annual investment by governments and pharmaceutical companies in that area runs to many billions of dollars, and a modest foundation contribution would be a tiny delta on top of that.

Here is a list of the wealthiest charitable foundations (only a few of which support scientific research and/or education) and their endowments.  Nearly all of the science-related ones are also plugged in here.  A rough estimate of annual expenditures from endowed entities is about 5% of their holdings.  Recently I've come to think about private universities as one crude comparator for impact.  If a foundation has the same size endowment as a prestigious research university, I think it's worth thinking about the relative downstream impacts of those entities.  (The Novo Nordisk Foundation has an endowment three times the size of Harvard's.)
Another comparator would be the annual research expenditures of a relevant funding agency.  The US NSF put about $234M into major research instrumentation and facilities in FY2024.  A foundation with a $5B endowment could in principle support all of that from endowment returns (a quick back-of-envelope sketch appears at the end of this post).  This lets me make my semiregular pitch about foundation or corporate support for research infrastructure and user facilities around the US.  The entire annual budget for the NSF's NNCI, which supports shared nanofabrication and characterization facilities around the country, is about $16M.  That's a niche where comparatively modest foundation (or corporate) support could have serious impact on interdisciplinary research and education across the country.  I'm sure there are other similar areas out there, and I hope someone is thinking about this.

Anyway, thanks to my readers - this is now the 20th year of this blog's existence (!!! again), and I hope to be able to keep it up well in the new year.
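For what it's worth, here is the back-of-envelope arithmetic behind the endowment comparison above, as a tiny sketch (my own; the only inputs are the roughly 5% payout rule of thumb and the dollar figures quoted in the post):

```python
# Back-of-envelope endowment arithmetic (illustrative only).
# Assumes the ~5% annual payout rule of thumb mentioned above; the dollar
# figures are the ones quoted in the post.

PAYOUT_RATE = 0.05  # rough sustainable annual spending rate for an endowed entity

def annual_payout(endowment):
    """Approximate sustainable annual spending from an endowment of this size."""
    return endowment * PAYOUT_RATE

# A $5B endowment supports roughly $250M/yr, comparable to NSF's ~$234M
# FY2024 spending on major research instrumentation and facilities.
print(f"$5B endowment -> ~${annual_payout(5e9) / 1e6:.0f}M per year")

# Endowment needed to sustain the NNCI's ~$16M annual budget: roughly $320M.
print(f"NNCI-sized program (~$16M/yr) -> ~${16e6 / PAYOUT_RATE / 1e6:.0f}M endowment")
```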

a month ago 38 votes
Technological civilization and losing object permanence

In the grand tradition of physicists writing about areas outside their expertise, I wanted to put down some thoughts on a societal trend.  This isn't physics or nanoscience, so feel free to skip this post.

Object permanence is a term from developmental psychology.  A person (or animal) has object permanence if they understand that something still exists even if they can't directly see it or interact with it in the moment.  If a kid realizes that their toy still exists even though they can't see it right now, they've got the concept.  I'm wondering if modern technological civilization has an issue with an analog of object permanence.  Let me explain what I mean, why it's a serious problem, and end on a hopeful note by pointing out that even if this is the case, we have the tools needed to overcome this.

By the standards of basically any previous era, a substantial fraction of humanity lives in a golden age.  We have a technologically advanced, globe-spanning civilization.  A lot of people (though geographically very unevenly distributed) have grown up with comparatively clean water; comparatively plentiful food available through means other than subsistence agriculture; electricity; access to radio, television, and for the last couple of decades nearly instant access to communications and a big fraction of the sum total of human factual knowledge.

Whether it's just human nature or a consequence of relative prosperity, there seems to be some timescale on the order of a few decades over which a non-negligible fraction of even the most fortunate seem to forget the hard lessons that got us to this point.  If they haven't seen something with their own eyes or experienced it directly, they decide it must not be a real issue.  I'm not talking about Holocaust deniers or conspiracy theorists who think the moon landings were fake.  There are a bunch of privileged people who have never personally known a time when tens of thousands of their neighbors died from childhood disease (you know, like 75 years ago, when 21,000 Americans were paralyzed every year from polio (!), proportionately like 50,000 today), who now think we should get rid of vaccines, and maybe germs aren't real.  Most people alive today were not alive the last time nuclear weapons were used, so some of them argue that nuclear weapons really aren't that bad (e.g. setting off 2000 one-megaton bombs spread across the US would directly destroy less than 5% of the land area, so we're good, right?).  Or, we haven't had massive bank runs in the US since the 1930s, so some people now think that insuring bank deposits is a waste of resources and should stop.  I'll stop the list here, before veering into even more politically fraught territory.  I think you get my point, though - somehow chunks of modern society seem to develop collective amnesia, as if problems that we've never personally witnessed must have been overblown before or don't exist at all.  (Interestingly, this does not seem to happen for most technological problems.  You don't see many people saying, you know, maybe building fires weren't that big a deal, let's go back to the good old days before smoke alarms and sprinklers.)

While the internet has downsides, including the ability to spread disinformation very effectively, all the available and stored knowledge also has an enormous benefit:  It should make it much harder than ever before for people to collectively forget the achievements of our species.  Sanitation, pasteurization, antibiotics, vaccinations - these are absolutely astonishing technical capabilities that were hard-won and have saved many millions of lives.  It's unconscionable that we are literally risking mass death by voluntarily forgetting or ignoring that.  Nuclear weapons are, in fact, terrible.  Insuring bank deposits with proper supervision of risk is a key factor that has helped stabilize economies for the last century.  We need to remember historical problems and their solutions, and make sure that the people setting policy are educated about these things.

They say that those who cannot remember the past are doomed to repeat it.  As we look toward the new year, I hope that those who are familiar with the hard-earned lessons of history are able to make themselves heard over the part of the populace who simply don't believe that old problems were real and could return.

a month ago 48 votes
Items for discussion, including google's latest quantum computing result

As we head toward the end of the calendar year, a few items:

Google published a new result in Nature a few days ago.  This made a big news splash, including this accompanying press piece from google themselves, this nice article in Quanta, and the always thoughtful blog post by Scott Aaronson.  The short version:  Physical qubits as made today in the superconducting platform favored by google don't have the low error rates that you'd really like if you want to run general quantum algorithms on a quantum computer, which could certainly require millions of steps.  The hope of the community is to get around this using quantum error correction, where some number of physical qubits are used to function as one "logical" qubit.  If physical qubit error rates are sufficiently low, and these errors can be corrected with enough efficacy, the logical qubits can function better than the physical qubits, ideally being able to undergo sequential operations indefinitely without degradation of their information.  One technique for this is called a surface code.  Google have implemented this in their most recent 105-physical-qubit chip ("Willow"), and they seem to have crossed a huge threshold:  When they increase the size of their correction scheme (going from a 3 (physical qubit) \(\times\) 3 (physical qubit) array to 5 \(\times\) 5 to 7 \(\times\) 7), the error rates of the resulting logical qubits fall as hoped.  This is a big deal, as it implies that larger chips, if they could be implemented, should scale toward the desired performance (a rough numerical sketch of that scaling is at the end of this post).  This does not mean that general purpose quantum computers are just around the corner, but it's very encouraging.  There are many severe engineering challenges still in place.  For example, the present superconducting qubits must be tweaked and tuned.  The reason google only has 105 of them on the Willow chip is not that they can't fit more - it's that they have to have the wires and control capacity to tune and run them.  A few thousand really good logical qubits would be needed to break RSA encryption, and there is no practical way to put millions of wires down a dilution refrigerator.  Rather, one will need cryogenic control electronics.

On a closely related point, google's article talks about how it would take a classical computer ten septillion years to do what its Willow chip can do.  This is based on a very particularly chosen problem (as I mentioned here five years ago) called random circuit sampling, looking at the statistical properties of the outcome of applying random gate sequences to a quantum computer.  From what I can tell, this is very different from what most people mean when they think of a problem to benchmark a quantum computer's advantage over a classical computer.  I suspect the typical tech-literate person considering quantum computing wants to know: if I ask a quantum computer and a classical computer to factor huge numbers or do some optimization problem, how much faster is the quantum computer for a given size of problem?  Random circuit sampling feels instead much more to me like comparing an experiment to a classical theory calculation.  For a purely classical analog, consider putting an airfoil in a wind tunnel and measuring turbulent flow, and comparing with a computational fluids calculation.  Yes, the wind tunnel can get you an answer very quickly, but it's not "doing" a calculation, from my perspective.  This doesn't mean random circuit sampling is a poor benchmark, just that people should understand it's rather different from the kind of quantum/classical comparison they may envision.

On one unrelated note:  Thanks to a timely inquiry from a reader, I have now added a search bar to the top of the blog.  (Just in time to capture the final decline of science blogging?)

On a second unrelated note:  I'd be curious to hear from my academic readers on how they are approaching generative AI, both on the instructional side (e.g., should we abandon traditional assignments and take-home exams?  How do we check to see if students are really learning vs. becoming dependent on tools that have dubious reliability?) and on the research side (e.g., what level of generative AI tool use is acceptable in paper or proposal writing?  What aspects of these tools are proving genuinely useful to PIs?  To students?  Clearly generative AI's ability to help with coding is very nice indeed!)
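Going back to the surface-code item above, here is a minimal sketch (my own illustration; the threshold, prefactor, and physical error rate below are assumptions, not Google's published fit values) of why crossing the threshold matters: below threshold, increasing the code distance suppresses the logical error rate exponentially.

```python
# Minimal sketch of below-threshold surface-code scaling (illustrative only;
# p_th, A, and p are assumed numbers, not the values reported by Google).
#
# Standard rough model:  eps_L(d) ~ A * (p / p_th) ** ((d + 1) / 2)
# so each increase of the code distance d by 2 divides the logical error rate
# by a factor Lambda ~ p_th / p.

def logical_error_rate(p, d, p_th=0.01, A=0.1):
    """Approximate logical error rate of a distance-d surface code."""
    return A * (p / p_th) ** ((d + 1) / 2)

for d in (3, 5, 7):   # the code distances compared in the Willow experiment
    print(f"d = {d}: eps_L ~ {logical_error_rate(p=0.003, d=d):.2e}")

# With p < p_th the printed rates fall as d grows (the threshold-crossing
# behavior the Nature paper reports); with p > p_th, adding more physical
# qubits would make the logical qubit worse, not better.
```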

a month ago 56 votes

More in science

Chatbot Software Begins to Face Fundamental Limitations

Recent results show that large language models struggle with compositional tasks, suggesting a hard limit to their abilities. (Quanta Magazine)

yesterday 2 votes
Links in Progress: We can still build beautifully

A tour of interesting developments built in the last two decades

yesterday 2 votes
The Value of Foreign Diplomas

Is that immigrant high-skilled or do they just have a fancy degree?

yesterday 8 votes
Incorruptible Skepticism

Everything, apparently, has a second life on TikTok. At least this keeps us skeptics busy – we have to redebunk everything we have debunked over the last century because it is popping up again on social media, confusing and misinforming another generation. This video is a great example – a short video discussing the "incorruptibility" […] (NeuroLogica Blog)

2 days ago 2 votes
Sony Kills Recordable Blu-Ray And Other Vintage Media

Physical media fans need not panic yet—you'll still be able to buy new Blu-Ray movies for your collection. But for those who like to save copies of their own data onto the discs, the remaining options just became more limited: Sony announced last week that it's ending all production of several recordable media formats—including Blu-Ray discs, MiniDiscs, and MiniDV cassettes—with no successor models. "Considering the market environment and future growth potential of the market, we have decided to discontinue production," a representative of Sony said in a brief statement to IEEE Spectrum.

Though availability is dwindling, most Blu-Ray discs are unaffected. The discs being discontinued are currently only available to consumers in Japan and some professional markets elsewhere, according to Sony. Many consumers in Japan use blank Blu-Ray discs to save TV programs, Sony separately told Gizmodo. Sony, which prototyped the first Blu-Ray discs in 2000, has been selling commercial Blu-Ray products since 2006. Development of Blu-Ray was started by Philips and Sony in 1995, shortly after Toshiba's DVD was crowned the winner of the battle to replace the VCR, notes engineer Kees Immink, whose coding work was instrumental in the development of optical formats such as CDs, DVDs, and Blu-Ray discs. "Philips [and] Sony were so frustrated by that loss that they started a new disc format, using a blue laser," Immink says.

Blu-Ray's Short-Lived Media Dominance

The development took longer than expected, but when it was finally introduced a decade later, Blu-Ray was on its way to becoming the medium for distributing video, as DVD discs and VHS tapes had done in their heydays. In 2008, Spectrum covered the moment when Blu-Ray's major competitor, HD-DVD, surrendered. But the timing was unfortunate, as the rise of streaming made it an empty victory. Still, Blu-Rays continue to have value as collector's items for many film buffs who want high-quality recordings not subject to the compression artifacts that can arise with streaming, not to mention those wary of losing access to movies due to the vagaries of streaming services' licensing deals.

Sony's recent announcement does, however, cement the death of the MiniDV cassette and the MiniDisc. MiniDV, a magnetic cassette meant to replace VHS tapes at one-fifth the size, was once a popular digital video format. The MiniDisc, an erasable magneto-optical disc that can hold up to 80 minutes of digitized audio, still has a small following. The 64-millimeter (2.5-inch) discs, held in a plastic cartridge similar to a floppy disk, were developed in the mid-1980s as a replacement for analog cassette tapes. Sony finally released the product in 1992, and it was popular in Japan into the 2000s.

To record data onto optical storage like CDs and Blu-Rays, lasers etch microscopic pits into the surface of the disc to represent ones and zeros. Lasers are also used to record data onto MiniDiscs, but instead of making indentations, they are used to change the magnetization of the material: the laser heats a spot on the disc, making the material susceptible to a magnetic field, which can then set the magnetization of the heated area. In playback, the polarization of the reflected light translates that magnetization into a one or a zero.

When the technology behind media storage formats like the MiniDisc and Blu-Ray was first being developed, the engineers involved believed the technology would be used well into the future, says optics engineer Joseph Braat. His research at Philips with Immink served as the basis of the MiniDisc. Despite that optimism, "the density of information in optical storage was limited from the very beginning," Braat says. Despite using the compact wavelengths of blue light, Blu-Ray soon hit a limit on how much data could be stored. Even dual-layer Blu-Ray discs can only hold 50 gigabytes per side; that amount of data will give you about 50 hours of standard-definition streaming on Netflix, or about seven hours of 4K video content (a quick sanity check of those numbers appears at the end of this article).

MiniDiscs still have a small, dedicated niche of enthusiasts, with active social media communities and in-person disc swaps. But since Sony stopped production of MiniDisc devices in 2013, the retro format has effectively been on technological hospice care, with the company only offering blank discs and repair services. Now, it seems, it's officially over.
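As a quick sanity check of those capacity figures, here is a small sketch (my own; the gigabytes-per-hour rates are rough streaming-style assumptions, not official specifications):

```python
# Rough check of the "~50 hours of SD / ~7 hours of 4K" claim for a 50 GB disc.
# The GB-per-hour figures are ballpark streaming-style estimates (assumptions).

DISC_CAPACITY_GB = 50.0          # dual-layer Blu-Ray capacity quoted in the article

GB_PER_HOUR = {
    "SD video": 1.0,             # ~1 GB/hour for standard definition (assumed)
    "4K video": 7.0,             # ~7 GB/hour for 4K (assumed)
}

for label, rate in GB_PER_HOUR.items():
    hours = DISC_CAPACITY_GB / rate
    print(f"{label}: ~{hours:.0f} hours on a {DISC_CAPACITY_GB:.0f} GB disc")
```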

2 days ago 2 votes