While employees are thankful for their employer's largesse in providing free coffee, any cursory cost-benefit analysis shows that employers get back more than they spend. Coffee's caffeine content boosts a person's focus and attention, and it reduces mental fatigue, all of which lead to increased productivity. Coffee extraction in a typical coffeemaker happens at a high temperature, about 100 °C, and the heat modifies the coffee's chemicals. Like any other chemical process, coffee extraction has a temperature-dependent rate, and the purpose of using heated water is to get a rapid extraction. Making cold brew coffee, for which the coffee is extracted at room temperature or lower, takes 24 hours or more. Cold brew coffee is less acidic, and it tastes less bitter than regular drip coffee. Australian scientists have invented a process that reduces cold brew time to the same several minutes required for drip coffee brewing. Their cold brew process uses an ultrasonic reactor. For their...
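The hot/cold extraction gap is easy to ballpark with the Arrhenius equation. The sketch below is illustrative only: the activation energy is an assumed, representative value for coffee solubles, not a figure from the article.

```python
import math

# Illustrative Arrhenius estimate of how much faster extraction runs in hot
# water than at room temperature. Ea is an assumed, representative value for
# coffee solubles, not a measured one.
R = 8.314          # gas constant, J/(mol*K)
Ea = 40e3          # assumed activation energy, J/mol

def rate(T_kelvin, A=1.0):
    """Arrhenius rate k = A * exp(-Ea / (R*T)); the prefactor A cancels in ratios."""
    return A * math.exp(-Ea / (R * T_kelvin))

T_hot, T_cold = 373.15, 293.15   # ~100 C drip vs ~20 C cold brew
ratio = rate(T_hot) / rate(T_cold)
print(f"Hot/cold extraction rate ratio: ~{ratio:.0f}x")
```

With these assumed numbers, hot extraction runs a few tens of times faster than room-temperature extraction, which is the right order of magnitude for minutes of drip brewing versus a day of cold steeping.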
11 months ago


More from Tikalon Blog by Dev Gualtieri

Tikalon Blog Archive

Tikalon Blog is now in archive mode. Here's a directory of links to easily printed and saved articles. If you're willing to wait a while for the download, a zip file of all the blog articles can be found at the link below. Note, however, that these articles are copyrighted and can't be used to train artificial intelligence agents. Individuals are free to republish single articles on their personal websites.

9 months ago 32 votes
Spiderweb Microphone

Microphones convert sound into an electrical signal for subsequent amplification, as in auditorium public address systems, or transmission, as in landline and mobile phones. The most common types of microphones are carbon (used in early telephones), condenser, electret, dynamic, ribbon, crystal, and MEMS. All of these microphones operate as transducers that convert sound pressure into an electrical signal. This also makes them sensitive to noise caused by air molecules bouncing against their diaphragms. In an effort to solve this thermal noise problem, a team of mechanical engineers has investigated a sound-sensing approach that uses viscous air flow rather than sound pressure. Viscous flow is what vibrates spiderwebs in gentle breezes: air flowing past a thread of a spiderweb drags the thread along. They demonstrated sound detection by a simulated spiderweb, an array of thin cantilever beams. The beams were 0.5 micrometer thick silicon nitride placed over a hole in a silicon wafer, and a laser was used to measure the displacement of the microbeams, first in response to thermal noise, and then in response to sound waves from 100 to 1000 Hz. The cantilever velocity matched that of the sound wave, irrespective of the length or width of the beam. The demonstrated cantilever microphone is about 50 dB less sensitive than the best pressure-based microphones; but pressure microphones have been perfected over a span of 150 years. As the lead author of the paper comments, "Detecting air flow as a way to sense sound has largely been ignored by researchers, but the principles show that it's worth considering."
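To get a feel for how small a signal a flow-based sensor must track: for a plane wave in air, the particle velocity is u = p/(ρc). The sketch below uses standard air properties; the example sound levels are my own assumptions, not values from the paper.

```python
import math

# Back-of-envelope check of the flow-sensing idea: a thread or thin cantilever
# driven by viscous drag moves with the air's particle velocity u = p/(rho*c),
# so u sets the scale of motion a laser vibrometer must resolve.
RHO_AIR = 1.2      # kg/m^3, density of air
C_SOUND = 343.0    # m/s, speed of sound in air
P_REF = 20e-6      # Pa, reference pressure for dB SPL

def particle_velocity(spl_db):
    p = P_REF * 10 ** (spl_db / 20)   # acoustic pressure amplitude
    return p / (RHO_AIR * C_SOUND)

for spl in (60, 94):                  # assumed examples: speech level, calibrator level
    print(f"{spl} dB SPL -> u ~ {particle_velocity(spl):.2e} m/s")
```

Even at conversational levels the air moves only tens of micrometers per second, which is why the laser displacement measurement matters.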

9 months ago 33 votes
Adornment

"Form follows function" is a maxim that an object's shape and appearance should be defined only by its purpose or function. A quick perusal of any antique shop will show that this maxim is generally ignored. Humans (Homo sapiens) have been called "naked apes," but we and our close species cousins quickly adopted the concept of wearing the fur skins of animals for protection. Our ancestors were likely much more interested in how they would obtain their next meal than how stylish they appeared in hyena fur. As human culture progressed, people desired to distinguish themselves from others; and, what could be an easier way to do that than through dress. This is accomplished by the simple technique of dyeing drab natural fibers, but the simple sewing needle is a technical innovation that's lead to a means of producing more ornate dress. A recent open access article in Science Advances investigates the use of delicate eyed needles in the Paleolithic as the means for producing refined, ornamented dress. One argument for clothing's becoming a means of decoration is that traditional body decoration, such as body painting with ochre, weren't effective in cold climates, since clothing was needed all the time for survival. Homo sapiens arrived in Europe at around 45,000 BC, and the earliest known eyed needles appeared in Siberia around 40,000 BC, in the Caucasus around 38,000 BC, in West Asia around 30,000 BC, and in Europe around 26,000 BC. Clothing the human body regardless of climate is a social practice that's persisted to this day. The eyed needle combined the processes of hole puncture and threading to allow finer and more efficient sewing.

10 months ago 30 votes
Brain Size

Deep thought is what distinguishes humans from other animals. The brain is the medium for thought, so there's the idea that brain size is important, with larger brains allowing more profound thought. Larger brains in hominids appear to confer an evolutionary advantage, but the largest animals do not have proportionally larger brains. For the last century, conventional wisdom was that the relationship between brain size and body mass in mammals could be described by a power law. A British research team has created a large dataset of brain and body sizes from about 1,500 species to determine the trend in brain size evolution, finding that the relationship between brain size and body mass is not log-linear, but rather log-curvilinear, plateauing at high body mass. The research team found that all groups of mammals demonstrated rapid bursts of evolutionary change, not only towards larger brain size, but smaller as well. Bats very rapidly reduced their brain size, suggesting that flight may have imposed an evolutionary constraint. Brain size in Homo sapiens has evolved more than twenty times faster than in all other mammalian species, resulting in the massive brain of modern man. Primates, rodents, and carnivores show a tendency for increased relative brain size as they evolved. It appears that there is something preventing brains from getting too big, perhaps because brains beyond a certain size are simply too costly to maintain. This upper limit of brain size applies to animals with very different biology.
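The statistical question here is simply whether a log-log plot of brain mass against body mass is better fit by a straight line (a power law) or by a curve that flattens at high body mass. A minimal sketch with synthetic stand-in data, not the team's ~1,500-species dataset:

```python
import numpy as np

# Compare a log-linear fit (classic power law) with a log-quadratic fit that
# can plateau at large body mass. The data are synthetic stand-ins generated
# with a small negative curvature term, purely to illustrate the test.
rng = np.random.default_rng(0)
log_body = rng.uniform(-2, 8, 300)                       # log10 body mass
log_brain = 0.75 * log_body - 0.03 * log_body**2 + rng.normal(0, 0.2, 300)

lin = np.polyfit(log_body, log_brain, 1)                 # log-linear model
quad = np.polyfit(log_body, log_brain, 2)                # log-curvilinear model

resid_lin = log_brain - np.polyval(lin, log_body)
resid_quad = log_brain - np.polyval(quad, log_body)
print("RSS, linear:   ", np.sum(resid_lin**2))
print("RSS, quadratic:", np.sum(resid_quad**2))          # lower if curvature is real
```

A negative quadratic coefficient in log-log space is exactly a slope that falls off at high body mass, i.e., the reported plateau.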

10 months ago 25 votes
Ice Formation

In today's bigger-is-better world, you don't order a large coffee, you order a 20 fluid ounce Venti coffee. From 1987 through 2004, McDonald's restaurants had a supersize option for larger-than-large portions of its French fries and soft drinks. The prefix super has also been applied to supercooling, in which a liquid is cooled below its freezing point without undergoing the expected phase change to a solid. Water has many unusual properties, and these are most probably the result of the water molecule being small and of the force holding these molecules together in a liquid or solid arising from hydrogen bonding. Supercooled water is a hazard to aviation, since the supercooled water droplets often present in cumulus and stratus clouds will instantly freeze on aircraft surfaces and plug the Pitot tubes that indicate airspeed. It's easy to create supercooled water in the laboratory: you just need to purify the water to remove contained minerals, since the mineral crystals act as nucleation sites. Bacteria and fungi are efficient natural ice nucleators because of the way their proteins act as ice templates. The best such natural ice nucleator is the Pseudomonas syringae bacterium, which is used to make artificial snow. Larger protein molecules are usually better at ice nucleation, but small fungal proteins are good at ice nucleation when they clump into larger aggregates. Scientists at the University of Utah have developed a model for predicting the nucleation temperature of ice on a given surface. Model parameters include the shapes of surface defects, and appropriately sized and shaped surface bumps and depressions can squeeze water molecules into configurations that make it easier or harder for ice to form.
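For reference, the textbook (classical nucleation theory) account of why surfaces matter: a substrate lowers the free-energy barrier to nucleation by a geometric wetting factor. The Utah model goes beyond this by including defect shape; the classic contact-angle factor below only shows the general idea, and the example angles are arbitrary.

```python
import math

# Classical nucleation theory: a surface reduces the homogeneous nucleation
# barrier by f(theta) = (2 + cos t)(1 - cos t)^2 / 4, where theta is the
# contact angle of the nucleus on the substrate. f -> 0 for a perfect ice
# template, f -> 1 for a surface that gives no help at all.
def hetero_factor(theta_deg):
    t = math.radians(theta_deg)
    return (2 + math.cos(t)) * (1 - math.cos(t)) ** 2 / 4

for theta in (30, 90, 150):   # arbitrary example angles
    print(f"contact angle {theta:3d} deg -> barrier reduced to "
          f"{hetero_factor(theta):.3f} of the homogeneous value")
```

A good ice template (small effective contact angle) cuts the barrier by orders of magnitude, which is why P. syringae proteins can trigger freezing only a few degrees below 0 °C.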

10 months ago 20 votes

More in science

Researchers Uncover Hidden Ingredients Behind AI Creativity

Image generators are designed to mimic their training data, so where does their apparent creativity come from? A recent study suggests that it's an inevitable by-product of their architecture. The post Researchers Uncover Hidden Ingredients Behind AI Creativity first appeared on Quanta Magazine.

15 hours ago 2 votes
Science slow down - not a simple question

I participated in a program about 15 years ago that looked at science and technology challenges faced by a subset of the US government. I came away thinking that such problems fall into three broad categories.

- Actual science and engineering challenges, which require foundational research and creativity to solve.
- Technology that may be fervently desired but is incompatible with the laws of nature, economic reality, or both.
- Alleged science and engineering problems that are really human/sociology issues.

Part of science and engineering education and training is giving people the skills to recognize which problems belong to which categories. Confusing these can strongly shape the perception of whether science and engineering research is making progress.

There has been a lot of discussion in the last few years about whether scientific progress (however that is measured) has slowed down or stagnated. For example, see here:

https://www.theatlantic.com/science/archive/2018/11/diminishing-returns-science/575665/
https://news.uchicago.edu/scientific-progress-slowing-james-evans
https://www.forbes.com/sites/roberthart/2023/01/04/where-are-all-the-scientific-breakthroughs-forget-ai-nuclear-fusion-and-mrna-vaccines-advances-in-science-and-tech-have-slowed-major-study-says/
https://theweek.com/science/world-losing-scientific-innovation-research

A lot of the recent talk is prompted by this 2023 study, which argues that despite the world having many more researchers than ever before (behold population growth) and more global investment in research, somehow "disruptive" innovations are coming less often, or are fewer and farther between these days. (Whether this is an accurate assessment is not a simple matter to resolve; more on this below.) There is a whole tech bro culture that buys into this, however. For example, see this interview from last week in the New York Times with Peter Thiel, which points out that Thiel has been complaining about this for a decade and a half.

On some level, I get it emotionally. The unbounded future spun in a lot of science fiction seems very far away. Where is my flying car? Where is my jet pack? Where is my moon base? Where are my fusion power plants, my antigravity machine, my tractor beams, my faster-than-light drive? Why does the world today somehow not seem that different from the world of 1985, while the world of 1985 seems very different from that of 1945?

Some of the folks who buy into this think that science is deeply broken somehow - that we've screwed something up, because we are not getting the future they think we were "promised". Some of these people have this as an internal justification underpinning the dismantling of the NSF, the NIH, basically a huge swath of the research ecosystem in the US. These same people would likely say that I am part of the problem, and that I can't be objective about this because the whole research ecosystem as it currently exists is a groupthink self-reinforcing spiral of mediocrity.

Science and engineering are inherently human ventures, and I think a lot of these concerns have an emotional component. My take at the moment is this:

- Genuinely transformational breakthroughs are rare. They often require a combination of novel insights, previously unavailable technological capabilities, and luck. They don't come on a schedule.
- There is no hard and fast rule that guarantees continuous exponential technological progress. Indeed, in real life, exponential growth regimes never last.
- The 19th and 20th centuries were special. If we think of research as a quest for understanding, it's inherently hierarchical. Civilizational collapses aside, you can only discover how electricity works once. You can only discover the germ theory of disease, the nature of the immune system, and vaccination once (though in the US we appear to be trying really hard to test that by forgetting everything). You can only discover quantum mechanics once, and doing so doesn't imply that there will be an ongoing (infinite?) chain of discoveries of similar magnitude.
- People are bad at accurately perceiving rare events and their consequences, just as people have a serious problem evaluating risk or telling the difference between correlation and causation. We can't always recognize breakthroughs when they happen. Sure, I don't have a flying car. I do have a device in my pocket that weighs only a few ounces, gives me near-instantaneous access to the sum total of human knowledge, lets me video call people around the world, can monitor aspects of my fitness, and makes it possible for me to watch sweet videos about dogs. The argument that we don't have transformative, enormously disruptive breakthroughs as often as we used to or as often as we "should" is in my view based quite a bit on perception.

Personally, I think we still have a lot more to learn about the natural world. AI tools will undoubtedly be helpful in making progress in many areas, but I think it is definitely premature to argue that the vast majority of future advances will come from artificial superintelligences and thus we can go ahead and abandon the strategies that got us the remarkable achievements of the last few decades.

I think some of the loudest complainers (Thiel, for example) about perceived slowing advancement are software people. People who come from the software development world don't always appreciate that physical infrastructure and understanding are hard, and that there are not always clever or even brute-force ways to get to an end goal. Solving foundational problems in molecular biology or quantum information hardware or photonics or materials is not the same as software development. (The tech folks generally know this on an intellectual level, but I don't think all of them really understand it in their guts. That's why so many of them seem to ignore real-world physical constraints when talking about AI.) Trying to apply software-development-inspired approaches to science and engineering research isn't bad as a component of a many-pronged strategy, but alone it may not give the desired results - as warned in part by this piece in Science this week.

More frequent breakthroughs in our understanding and capabilities would be wonderful. I don't think dynamiting the US research ecosystem is the way to get us there, and hoping that we can dismantle everything because AI will somehow herald a new golden age seems premature at best.

14 hours ago 2 votes
Animals Adapting to Cities

Humans are dramatically changing the environment of the Earth in many ways. Only about 23% of the land surface (excluding Antarctica) is considered to be "wilderness", and this is rapidly decreasing. What wilderness is left is also mostly within managed conservation areas. Meanwhile, about 3% of the surface is considered urban. I could not find a […] The post Animals Adapting to Cities first appeared on NeuroLogica Blog.

17 hours ago 2 votes
Cryogenic CMOS - a key need for solid state quantum information processing

The basis for much of modern electronics is a set of silicon technologies called CMOS, which stands for complementary metal oxide semiconductor devices and processes. "Complementary" means using a semiconductor (typically silicon) that is locally chemically doped so that you can have both n-type (carriers are negatively charged electrons in the conduction band) and p-type (carriers are positively charged holes in the valence band) material on the same substrate. With field-effect transistors (using oxide gate dielectrics), you can make very compact, comparatively low-power devices like inverters and logic gates. There are multiple different approaches to try to implement quantum information processing in solid state platforms, with the idea that the scaling lessons of microelectronics (in terms of device density and reliability) can be applied. I think that essentially all of these avenues require cryogenic operating conditions; all superconducting qubits need ultracold conditions for both superconductivity and to minimize extraneous quasiparticles and other decoherence sources. Semiconductor-based quantum dots (Intel's favorite) similarly need thermal perturbations and decoherence to be minimized. The wealth of solid state quantum computing research is the driver for the historically enormous (to me, anyway) growth of dilution refrigerator manufacturing (see my last point here). So you eventually want to have thousands of error-corrected logical qubits at sub-Kelvin temperatures, which may involve millions of physical qubits at sub-Kelvin temperatures, all of which need to be controlled. Despite the absolute experimental fearlessness of people like John Martinis, you are not going to get this to work by running a million wires from room temperature into your dil fridge. [Fig. 1 from here.] The alternative people in this area have converged upon is to create serious CMOS control circuitry that can work at 4 K or below, so that a lot of the wiring does not need to go from the qubits all the way to room temperature. The materials and device engineering challenges in doing this are substantial! Power dissipation really needs to be minimized, and material properties at cryogenic conditions are not the same as those optimized for room temperature. There have been major advances in this - examples include Google in 2019, Intel in 2021, IBM in 2024, and this week, folks at the University of New South Wales supported by Microsoft. In this most recent work, the aspect that I find most impressive is that the CMOS electronics are essentially a serious logic-based control board operating at millikelvin temperatures right next to the chip with the qubits (in this case, spins in quantum dots). I'm rather blown away that this works, and with sufficiently low power dissipation that the fridge is happy. This is very impressive, and there is likely a very serious future in store for cryogenic CMOS.
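The million-wires point is easy to sanity-check with a back-of-envelope conduction estimate. All the numbers below are rough assumed values (thin stainless wire, a handbook-style thermal conductivity integral), not specs from any fridge or paper:

```python
# Heat conducted into the cold stages by control wiring from room temperature.
# Conduction load per wire: Q = (A/L) * integral of k(T) dT from 4 K to 300 K.
# Every value here is an assumed, representative number for illustration.
N_WIRES = 1_000_000
AREA = 7.9e-9        # m^2, cross-section of ~0.1 mm diameter wire
LENGTH = 0.5         # m, distance from the 300 K plate to the 4 K plate
K_INTEGRAL = 3.0e3   # W/m, conductivity integral for stainless steel, 4-300 K

heat_per_wire = (AREA / LENGTH) * K_INTEGRAL       # conduction load per wire, W
total_load = N_WIRES * heat_per_wire
PULSE_TUBE_AT_4K = 1.5                             # W, typical cooling power at 4 K

print(f"per wire: {heat_per_wire*1e6:.0f} uW, total: {total_load:.0f} W "
      f"vs ~{PULSE_TUBE_AT_4K} W available at 4 K")
```

Even with thin, poorly conducting wire, a million lines would dump tens of watts onto a stage that a pulse tube can cool with only a watt or two, which is why the control electronics have to move inside the fridge.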

3 days ago 5 votes
Why U.S. Geothermal May Advance, Despite Political Headwinds

The Trump administration is outwardly hostile to clean energy sourced from solar and wind. But thanks to close ties to the fossil fuel industry and new technological breakthroughs, U.S. geothermal power may survive the GOP assaults on support for renewables and even thrive. Read more on E360 →

4 days ago 1 vote