More from Tikalon Blog by Dev Gualtieri
Tikalon Blog is now in archive mode. Here's a directory of links to easily printed and saved articles. If you're willing to wait a while for the download, a zip file of all the blog articles can be found at the link below. Note, however, that these articles are copyrighted and can't be used to train artificial intelligence agents. Individuals are free to republish single articles on their personal websites.
Microphones convert sound into an electrical signal for subsequent amplification, as in auditorium public address systems, or transmission, as in landline and mobile phones. The most common types of microphone are carbon (used in early telephones), condenser, electret, dynamic, ribbon, crystal, and MEMS. All of these operate as transducers that convert sound pressure into an electrical signal, which also makes them sensitive to the noise caused by air molecules bouncing against their diaphragms. In an effort to solve this thermal noise problem, a team of mechanical engineers has investigated a sound-sensing approach that uses viscous air flow rather than sound pressure. Viscous flow is what vibrates spiderwebs in gentle breezes: air flowing past a thread of a spiderweb drags the thread along. The team demonstrated sound detection with a simulated spiderweb, an array of thin cantilever beams. The beams were 0.5-micrometer-thick silicon nitride placed over a hole in a silicon wafer, and a laser was used to measure the displacement of the microbeams, first in response to thermal noise, and then in response to sound waves from 100 to 1000 Hz. The cantilever velocity matched that of the sound wave, irrespective of the length or width of the beam. The demonstrated cantilever microphone is about 50 dB less sensitive than the best pressure-based microphones; but pressure microphones have been perfected over a span of 150 years. As the lead author of the paper comments, "Detecting air flow as a way to sense sound has largely been ignored by researchers, but the principles show that it's worth considering."
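To get a feel for why such a thin beam follows the air's motion, here is a rough back-of-the-envelope sketch (my own illustration, not from the paper): it compares the viscous penetration depth of oscillating air, delta = sqrt(2*nu/omega), with the 0.5-micrometer beam thickness quoted above. The kinematic viscosity of air is an assumed textbook value.

```python
import math

# Assumed textbook value: kinematic viscosity of air at room temperature.
NU_AIR = 1.5e-5          # m^2/s
BEAM_THICKNESS = 0.5e-6  # m, silicon nitride beam thickness from the article

def penetration_depth(freq_hz: float) -> float:
    """Viscous penetration depth delta = sqrt(2*nu/omega) of an oscillating flow."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * NU_AIR / omega)

# Check the two ends of the tested band, 100 Hz and 1000 Hz.
for f in (100, 1000):
    delta = penetration_depth(f)
    print(f"{f:5d} Hz: delta ~ {delta * 1e6:6.1f} um, "
          f"~{delta / BEAM_THICKNESS:.0f}x the beam thickness")
```

Across the tested band the viscous layer is over a hundred times thicker than the beam, so drag from the moving air dominates the beam's inertia and carries it along at essentially the air's own velocity, consistent with the reported observation that the cantilever velocity tracked the sound wave regardless of beam dimensions.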
"Form follows function" is a maxim that an object's shape and appearance should be defined only by its purpose or function. A quick perusal of any antique shop will show that this maxim is generally ignored. Humans (Homo sapiens) have been called "naked apes," but we and our close species cousins quickly adopted the concept of wearing the fur skins of animals for protection. Our ancestors were likely much more interested in how they would obtain their next meal than how stylish they appeared in hyena fur. As human culture progressed, people desired to distinguish themselves from others; and, what could be an easier way to do that than through dress. This is accomplished by the simple technique of dyeing drab natural fibers, but the simple sewing needle is a technical innovation that's lead to a means of producing more ornate dress. A recent open access article in Science Advances investigates the use of delicate eyed needles in the Paleolithic as the means for producing refined, ornamented dress. One argument for clothing's becoming a means of decoration is that traditional body decoration, such as body painting with ochre, weren't effective in cold climates, since clothing was needed all the time for survival. Homo sapiens arrived in Europe at around 45,000 BC, and the earliest known eyed needles appeared in Siberia around 40,000 BC, in the Caucasus around 38,000 BC, in West Asia around 30,000 BC, and in Europe around 26,000 BC. Clothing the human body regardless of climate is a social practice that's persisted to this day. The eyed needle combined the processes of hole puncture and threading to allow finer and more efficient sewing.
Deep thought is what distinguishes humans from other animals. The brain is the medium for thought, so there's the idea that brain size is important, with larger brains allowing more profound thought. Larger brains in hominids appear to have conferred an evolutionary advantage, but the largest animals do not have proportionally larger brains. For the last century, conventional wisdom was that brain size in mammals scaled with body mass according to a power law. A British research team has assembled a large dataset of brain and body sizes from about 1,500 species to determine the trend in brain size evolution, finding that the relationship between brain size and body mass is not log-linear, but rather log-curvilinear, plateauing at high body mass. The research team found that all groups of mammals demonstrated rapid bursts of evolutionary change, not only towards larger brain size, but towards smaller as well. Bats very rapidly reduced their brain size, suggesting that flight may have imposed an evolutionary constraint. Brain size in Homo sapiens has evolved more than twenty times faster than in all other mammalian species, resulting in the massive brain of modern man. Primates, rodents, and carnivores show a tendency for increases in relative brain size as they evolved. It appears that there is something preventing brains from getting too big, perhaps because brains beyond a certain size are simply too costly to maintain. This upper limit of brain size applies to animals with very different biology.
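A minimal sketch of what "log-linear" versus "log-curvilinear" means here, using a handful of made-up, order-of-magnitude brain and body masses rather than the team's 1,500-species dataset: fit log brain mass against log body mass with and without a quadratic term; a negative quadratic coefficient is what produces the plateau at high body mass.

```python
import numpy as np

# Illustrative, approximate values only (not the published dataset):
# mouse, rat, cat, human, cow, elephant, sperm whale.
body_kg = np.array([0.02, 0.3, 4.0, 60.0, 450.0, 5000.0, 45000.0])
brain_g = np.array([0.4, 2.0, 30.0, 1350.0, 450.0, 4800.0, 7800.0])

x = np.log10(body_kg)
y = np.log10(brain_g)

# Log-linear model (classic power law): log10(brain) = a + b * log10(body)
b, a = np.polyfit(x, y, 1)

# Log-curvilinear model: adds a quadratic term; a negative quadratic
# coefficient bends the curve downward, i.e. brain size plateaus at
# large body mass.
c2, c1, c0 = np.polyfit(x, y, 2)

print(f"power law fit:   slope {b:.2f}, intercept {a:.2f}")
print(f"curvilinear fit: quadratic term {c2:.3f} (negative => plateau), "
      f"linear term {c1:.2f}, intercept {c0:.2f}")
```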
In today's bigger-is-better world, you don't order a large coffee, you order a 20-fluid-ounce Venti coffee. From 1987 through 2004, McDonald's restaurants had a supersize option for larger-than-large portions of French fries and soft drinks. The prefix super- has also been applied to supercooling, the phenomenon in which liquids are cooled below their freezing points without solidifying. Water has many unusual properties, and these are most probably the result of the water molecule being small and the force holding the molecules together in a liquid or solid arising from hydrogen bonding. Supercooled water is a hazard to aviation, since supercooled water droplets, often present in cumulus and stratus clouds, will instantly freeze on aircraft surfaces and plug the Pitot tubes that indicate airspeed. It's easy to create supercooled water in the laboratory: you just need to purify the water to remove contained minerals, since the mineral crystals act as nucleation sites. Bacteria and fungi are efficient natural ice nucleators because of the way their proteins act as ice templates. The best such natural ice nucleator is the Pseudomonas syringae bacterium, which is used to make artificial snow. Larger protein molecules are usually better at ice nucleation, but small fungal proteins are good at ice nucleation when they clump into larger aggregates. Scientists at the University of Utah have developed a model for predicting the nucleation temperature of ice on a given surface. Model parameters include the shapes of surface defects; appropriately sized and shaped surface bumps and depressions can squeeze water molecules into configurations that make it easier or harder for ice to form.
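The Utah model itself is more detailed, but the underlying idea can be illustrated with textbook classical nucleation theory (my illustration, not the published model): a surface lowers the free-energy barrier for forming an ice embryo by a geometric factor that depends on how well the surface cradles the embryo, here parameterized by a contact angle on an idealized flat surface.

```python
import math

def barrier_reduction_factor(theta_deg: float) -> float:
    """Classical nucleation theory: ratio of the heterogeneous to the
    homogeneous free-energy barrier for a spherical-cap embryo on a flat
    surface, f(theta) = (2 + cos t)(1 - cos t)^2 / 4."""
    c = math.cos(math.radians(theta_deg))
    return (2 + c) * (1 - c) ** 2 / 4

# Smaller contact angle = surface templates ice better = lower barrier,
# so freezing can begin at a warmer (less supercooled) temperature.
for theta in (180, 120, 90, 60, 30):
    print(f"contact angle {theta:3d} deg: barrier reduced to "
          f"{barrier_reduction_factor(theta):.3f} of the homogeneous value")
```

In this simplified picture, a surface feature that templates ice well behaves like a small effective contact angle, shrinking the barrier and raising the temperature at which nucleation becomes likely; the published model goes further by tying this behavior to the actual shapes of surface bumps and depressions.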
More in science
Under the sea ice during the Arctic’s pitch-black polar night, cells power photosynthesis at the lowest light levels ever observed in nature. The post How Does Life Happen When There’s Barely Any Light? first appeared on Quanta Magazine.
It's not just about rents - it's also about the rooms friends and family can't afford to share
Everything, apparently, has a second life on TikTok. At least this keeps us skeptics busy – we have to redebunk everything we have debunked over the last century because it is popping up again on social media, confusing and misinforming another generation. This video is a great example – a short video discussing the “incorruptibility” […] The post Incorruptible Skepticism first appeared on NeuroLogica Blog.
Physical media fans need not panic yet—you’ll still be able to buy new Blu-Ray movies for your collection. But for those who like to save copies of their own data onto the discs, the remaining options just became more limited: Sony announced last week that it’s ending all production of several recordable media formats—including Blu-Ray discs, MiniDiscs, and MiniDV cassettes—with no successor models.

“Considering the market environment and future growth potential of the market, we have decided to discontinue production,” a representative of Sony said in a brief statement to IEEE Spectrum. Though availability is dwindling, most Blu-Ray discs are unaffected. The discs being discontinued are currently only available to consumers in Japan and some professional markets elsewhere, according to Sony. Many consumers in Japan use blank Blu-Ray discs to save TV programs, Sony separately told Gizmodo.

Sony, which prototyped the first Blu-Ray discs in 2000, has been selling commercial Blu-Ray products since 2006. Development of Blu-Ray was started by Philips and Sony in 1995, shortly after Toshiba’s DVD was crowned the winner of the battle to replace the VCR, notes engineer Kees Immink, whose coding was instrumental in developing optical formats such as CDs, DVDs, and Blu-Ray discs. “Philips [and] Sony were so frustrated by that loss that they started a new disc format, using a blue laser,” Immink says.

Blu-Ray’s Short-Lived Media Dominance

The development took longer than expected, but when it was finally introduced a decade later, Blu-Ray was on its way to becoming the medium for distributing video, as DVD discs and VHS tapes had done in their heydays. In 2008, Spectrum covered the moment when Blu-Ray’s major competitor, HD-DVD, surrendered. But the timing was unfortunate, as the rise of streaming made it an empty victory. Still, Blu-Rays continue to have value as collector’s items for many film buffs who want high-quality recordings not subject to the compression artifacts that can arise with streaming, not to mention those wary of losing access to movies due to the vagaries of streaming services’ licensing deals.

Sony’s recent announcement does, however, cement the death of the MiniDV cassette and MiniDisc. MiniDV, magnetic cassettes meant to replace VHS tapes at one-fifth the size, were once a popular format of digital video cassettes. The MiniDisc, an erasable magneto-optical disc that can hold up to 80 minutes of digitized audio, still has a small following. The 64-millimeter (2.5-inch) discs, held in a plastic cartridge similar to a floppy disk, were developed in the mid-1980s as a replacement for analog cassette tapes. Sony finally released the product in 1992, and it was popular in Japan into the 2000s.

To record data onto optical storage like CDs and Blu-Rays, lasers etch microscopic pits into the surface of the disc to represent ones and zeros. Lasers are also used to record data onto MiniDiscs, but instead of making indentations, they’re used to change the magnetization of the material; the lasers heat up one side of the disc, making the material susceptible to a magnetic field, which can then alter the polarity of the heated area. Then in playback, the polarization of reflected light translates to a one or zero.

When the technology behind media storage formats like the MiniDisc and Blu-Ray was first being developed, the engineers involved believed the technology would be used well into the future, says optics engineer Joseph Braat.
His research at Philips with Immink served as the basis of the MiniDisc. Despite that optimism, “the density of information in optical storage was limited from the very beginning,” Braat says. Even with the compact wavelengths of blue light, Blu-Ray soon hit a limit on how much data could be stored. Even dual-layer Blu-Ray discs can hold only 50 gigabytes per side; that amount of data will give you 50 hours of standard-definition streaming on Netflix, or about seven hours of 4K video content (a rough conversion is sketched below). MiniDiscs still have a small, dedicated niche of enthusiasts, with active social media communities and in-person disc swaps. But since Sony stopped production of MiniDisc devices in 2013, the retro format has effectively been on technological hospice care, with the company only offering blank discs and repair services. Now, it seems, it’s officially over.
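As a rough check of those capacity figures, here is the arithmetic; the per-hour data rates are my own assumptions (typical streaming-service estimates), not numbers from the article.

```python
# Rough check of the capacity figures above.
DISC_GB = 50.0  # dual-layer Blu-Ray capacity cited in the article

# Assumed data consumption per hour of streaming (not from the article).
RATES_GB_PER_HOUR = {
    "standard definition": 1.0,  # ~1 GB/hour assumed
    "4K": 7.0,                   # ~7 GB/hour assumed
}

for label, rate in RATES_GB_PER_HOUR.items():
    print(f"{label}: ~{DISC_GB / rate:.0f} hours on a {DISC_GB:.0f} GB disc")
```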
AGI Is Coming Sooner Due to o3, DeepSeek, and Other Cutting-Edge AI Developments