More from Tikalon Blog by Dev Gualtieri
Tikalon Blog is now in archive mode. Here's a directory of links to articles that are easy to print and save. If you're willing to wait a while for the download, a zip file of all the blog articles can be found at the link below. Note, however, that these articles are copyrighted and can't be used to train artificial intelligence agents. Individuals are free to republish single articles on their personal websites.
Microphones convert sound into an electrical signal for subsequent amplification, as in auditorium public address systems, or transmission, as in landline and mobile phones. The most common types of microphones are carbon (used in early telephones), condenser, electret, dynamic, ribbon, crystal, and MEMS. All of these microphones operate as transducers that convert sound pressure into an electrical signal, which also makes them sensitive to noise caused by air molecules bouncing against their diaphragms. In an effort to solve this thermal noise problem, a team of mechanical engineers has investigated a sound-sensing approach that uses viscous air flow rather than sound pressure. Viscous flow is what vibrates spiderwebs in gentle breezes: air flowing past a thread of a spiderweb drags the thread along with it. The team demonstrated sound detection by a simulated spiderweb, an array of thin cantilever beams. The beams were 0.5-micrometer-thick silicon nitride placed over a hole in a silicon wafer, and a laser was used to measure the displacement of the microbeams, first in response to thermal noise, and then in response to sound waves from 100 to 1,000 Hz. The cantilever velocity matched that of the sound wave, irrespective of the length or width of the beam. The demonstrated cantilever microphone is about 50 dB less sensitive than the best pressure-based microphones; but pressure microphones have been perfected over a span of 150 years. As the lead author of the paper comments, "Detecting air flow as a way to sense sound has largely been ignored by researchers, but the principles show that it's worth considering."
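The flow-sensing principle can be made concrete with textbook plane-wave acoustics: in a plane wave, the air's particle velocity is v = p/(ρc), and a thread or beam thin enough to be dragged by viscous flow moves at roughly that velocity. A minimal sketch (standard acoustics, not code from the paper):

```python
# Hedged illustration (not from the paper): for a plane sound wave,
# the air particle velocity v relates to the acoustic pressure p by
# v = p / (rho * c), the specific acoustic impedance of air.
RHO_AIR = 1.2    # kg/m^3, air density at room temperature
C_AIR = 343.0    # m/s, speed of sound in air
P_REF = 20e-6    # Pa, reference pressure for dB SPL

def particle_velocity(spl_db: float) -> float:
    """RMS particle velocity (m/s) of a plane wave at a given dB SPL."""
    p_rms = P_REF * 10 ** (spl_db / 20)
    return p_rms / (RHO_AIR * C_AIR)

for spl in (40, 60, 94):  # quiet room, conversation, 1-Pa calibration tone
    print(f"{spl} dB SPL -> {particle_velocity(spl):.3e} m/s")
```

Even a loud 94 dB SPL tone corresponds to a particle velocity of only a few millimeters per second, which hints at why flow sensing demands such delicate structures.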
"Form follows function" is a maxim that an object's shape and appearance should be defined only by its purpose or function. A quick perusal of any antique shop will show that this maxim is generally ignored. Humans (Homo sapiens) have been called "naked apes," but we and our close species cousins quickly adopted the concept of wearing the fur skins of animals for protection. Our ancestors were likely much more interested in how they would obtain their next meal than how stylish they appeared in hyena fur. As human culture progressed, people desired to distinguish themselves from others; and, what could be an easier way to do that than through dress. This is accomplished by the simple technique of dyeing drab natural fibers, but the simple sewing needle is a technical innovation that's lead to a means of producing more ornate dress. A recent open access article in Science Advances investigates the use of delicate eyed needles in the Paleolithic as the means for producing refined, ornamented dress. One argument for clothing's becoming a means of decoration is that traditional body decoration, such as body painting with ochre, weren't effective in cold climates, since clothing was needed all the time for survival. Homo sapiens arrived in Europe at around 45,000 BC, and the earliest known eyed needles appeared in Siberia around 40,000 BC, in the Caucasus around 38,000 BC, in West Asia around 30,000 BC, and in Europe around 26,000 BC. Clothing the human body regardless of climate is a social practice that's persisted to this day. The eyed needle combined the processes of hole puncture and threading to allow finer and more efficient sewing.
Deep thought is what distinguishes humans from other animals. The brain is the medium for thought; so, there's the idea that brain size is important, with larger brains allowing more profound thought. Larger brains in hominids appear to confer an evolutionary advantage, but the largest animals do not have proportionally larger brains. For the last century, conventional wisdom held that brain size in mammals scales with body mass as a power law; that is, as a straight line on a log-log plot. A British research team has created a large dataset of brain and body sizes from about 1,500 species to determine the trend in brain size evolution, finding that the relationship between brain size and body mass is not log-linear, but rather log-curvilinear, plateauing at high body mass. The research team found that all groups of mammals demonstrated rapid bursts of evolutionary change, not only towards larger brain size, but smaller as well. Bats very rapidly reduced their brain size, suggesting that flight may have imposed an evolutionary constraint. Homo sapiens has evolved more than twenty times faster than all other mammalian species, resulting in the massive brain size of modern man. Primates, rodents, and carnivores show a tendency for increase in relative brain size as they evolved. It appears that there is something preventing brains from getting too big, perhaps because big brains beyond a certain size are simply too costly to maintain. This upper limit of brain size applies to animals with very different biology.
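The log-linear versus log-curvilinear distinction is easy to see in a fit. A minimal sketch on synthetic data (not the team's dataset): a quadratic term in log-log space with a negative coefficient produces exactly the kind of plateau at high body mass that the team reports.

```python
import numpy as np

# Hedged sketch with synthetic data (not the study's ~1,500-species dataset):
# compare a log-linear (power-law) fit with a log-curvilinear fit, i.e. a
# quadratic in log-log space, which can plateau at high body mass.
rng = np.random.default_rng(0)
log_body = rng.uniform(0, 8, 300)                      # log10 body mass
log_brain = 0.75 * log_body - 0.04 * log_body**2 + rng.normal(0, 0.1, 300)

lin = np.polyfit(log_body, log_brain, 1)               # power law
quad = np.polyfit(log_body, log_brain, 2)              # curvilinear

def rss(coeffs):
    """Residual sum of squares for a polynomial fit in log-log space."""
    return np.sum((log_brain - np.polyval(coeffs, log_body)) ** 2)

print("RSS, log-linear fit:     ", round(rss(lin), 2))
print("RSS, log-curvilinear fit:", round(rss(quad), 2))
```

On data with a genuine plateau, the quadratic fit's markedly lower residual error is the statistical signature of curvilinearity.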
In today's bigger is better world, you don't order a large coffee, you order a 20 fluid ounce Venti coffee. From 1987 through 2004, McDonald's restaurants had a supersize option for larger than large portions of its French fries and soft drinks. The prefix, super, has been used to describe supercooling, the unexpected phenomenon in which a liquid is cooled below its freezing point without solidifying. Water has many unusual properties, and these are most probably the result of the water molecule being small, and of the force holding these molecules together in a liquid or solid arising from hydrogen bonding. Supercooled water is a hazard to aviation, since supercooled water droplets, often present in cumulus and stratus clouds, will instantly freeze on aircraft surfaces and plug the Pitot tubes that indicate airspeed. It's easy to create supercooled water in the laboratory: you just need to purify the water to remove dissolved minerals, since mineral crystals act as nucleation sites. Bacteria and fungi are efficient natural ice nucleators because of the way their proteins act as ice templates. The best such natural ice nucleator is the bacterium Pseudomonas syringae, which is used to make artificial snow. Larger protein molecules are usually better at ice nucleation, but small fungal proteins are good at ice nucleation when they clump into larger aggregates. Scientists at the University of Utah have developed a model for prediction of the nucleation temperature of ice on a given surface. Model parameters include the shapes of surface defects, and appropriately sized and shaped surface bumps and depressions can squeeze water molecules into configurations that make it easier or harder for ice to form.
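As background to surface-controlled nucleation, textbook classical nucleation theory already captures how a surface makes ice formation easier: a nucleating surface reduces the free-energy barrier by a geometric factor that depends on the contact angle the ice embryo makes with it. A minimal sketch of that standard factor (the Utah model goes further, adding the shapes of surface defects):

```python
import numpy as np

# Hedged illustration of textbook classical nucleation theory (not the Utah
# group's model): a surface lowers the homogeneous nucleation barrier by a
# geometric factor f(theta) depending on the ice-surface contact angle.
def barrier_factor(theta_deg: float) -> float:
    """f(theta) = (2 + cos t)(1 - cos t)^2 / 4, as a fraction of the
    homogeneous nucleation barrier."""
    t = np.radians(theta_deg)
    return (2 + np.cos(t)) * (1 - np.cos(t)) ** 2 / 4

for theta in (30, 60, 90, 150, 180):
    print(f"contact angle {theta:3d} deg -> barrier reduced to "
          f"{barrier_factor(theta):.2f} of the homogeneous value")
```

A factor near zero (small contact angle) means ice forms readily on the surface; a factor of 1 (at 180 degrees) means the surface provides no help at all.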
More in science
In the late 1970s, a time when 8-bit processors were state of the art and CMOS was the underdog of semiconductor technology, engineers at AT&T’s Bell Labs took a bold leap into the future. They made a high-stakes bet to outpace IBM, Intel, and other competitors in chip performance by combining cutting-edge 3.5-micron CMOS fabrication with a novel 32-bit processor architecture. Although their creation—the Bellmac-32 microprocessor—never achieved the commercial fame of earlier ones such as Intel’s 4004 (released in 1971), its influence has proven far more enduring. Virtually every chip in smartphones, laptops, and tablets today relies on the complementary metal-oxide semiconductor principles that the Bellmac-32 pioneered.

As the 1980s approached, AT&T was grappling with transformation. For decades, the telecom giant—nicknamed “Ma Bell”—had dominated American voice communications, with its Western Electric subsidiary manufacturing nearly every telephone found in U.S. homes and offices. The U.S. federal government was pressing for antitrust-driven divestiture, but AT&T was granted an opening to expand into computing. With computing firms already entrenched in the market, AT&T couldn’t afford to play catch-up; its strategy was to leap ahead, and the Bellmac-32 was its springboard.

The Bellmac-32 chip series has now been honored with an IEEE Milestone. Dedication ceremonies are slated to be held this year at the Nokia Bell Labs campus in Murray Hill, N.J., and at the Computer History Museum in Mountain View, Calif.

A chip like no other

Rather than emulate the industry standard of 8-bit chips, AT&T executives challenged their Bell Labs engineers to deliver something revolutionary: the first commercially viable microprocessor capable of moving 32 bits in one clock cycle. It would require not just a new chip but also an entirely novel architecture—one that could handle telecommunications switching and serve as the backbone for future computing systems. “We weren’t just building a faster chip,” says Michael Condry, who led the architecture team at Bell Labs’ Holmdel facility in New Jersey. “We were trying to design something that could carry both voice and computation into the future.”

[Photo: This configuration of the Bellmac-32 microprocessor had an integrated memory management unit optimized for Unix-like operating systems. Credit: AT&T Archives and History Center]

At the time, CMOS technology was seen as a promising—but risky—alternative to the NMOS and PMOS designs then in use. NMOS chips, which relied solely on N-type transistors, were fast but power-hungry. PMOS chips, which depend on the movement of positively charged holes, were too slow. CMOS, with its hybrid design, offered the potential for both speed and energy savings. The benefits were so compelling that the industry soon saw that the need for double the number of transistors (NMOS and PMOS for each gate) was worth the tradeoff. As transistor sizes shrank along with the rapid advancement of semiconductor technology described by Moore’s Law, the cost of doubling up the transistor density soon became manageable and eventually became negligible.

But when Bell Labs took its high-stakes gamble, large-scale CMOS fabrication was still unproven and looked to be comparatively costly. That didn’t deter Bell Labs. By tapping expertise from its campuses in Holmdel and Murray Hill as well as in Naperville, Ill., the company assembled a dream team of semiconductor engineers.
The team included Condry; Sung-Mo “Steve” Kang, a rising star in chip design; Victor Huang, another microprocessor chip designer; and dozens of AT&T Bell Labs employees. They set out in 1978 to master a new CMOS process and create a 32-bit microprocessor from scratch.

Designing the architecture

The architecture group led by Condry, an IEEE Life Fellow who would later become Intel’s CTO, focused on building a system that would natively support the Unix operating system and the C programming language. Both were in their infancy but destined for dominance. To cope with the era’s memory limitations—kilobytes were precious—they introduced a complex instruction set that required fewer steps to carry out and could be executed in a single clock cycle. The engineers also built the chip to support the VersaModule Eurocard (VME) parallel bus, enabling distributed computing so several nodes could handle data processing in parallel. Making the chip VME-enabled also allowed it to be used for real-time control. The group wrote its own version of Unix, with real-time capabilities to ensure that the new chip design was compatible with industrial automation and similar applications.

The Bell Labs engineers also invented domino logic, which ramped up processing speed by reducing delays in complex logic gates. Additional testing and verification techniques were developed and introduced via the Bellmac-32 Module, a sophisticated multi-chipset verification and testing project led by Huang that allowed the complex chip fabrication to have zero or near-zero errors. This was the first of its kind in VLSI testing. The Bell Labs engineers’ systematic plan for double- and triple-checking their colleagues’ work ultimately made the total design of the multiple-chipset family work together seamlessly as a complete microcomputer system.

Then came the hardest part: actually building the chip.

Floor maps and colored pencils

“The technology for layout, testing, and high-yield fabrication just wasn’t there,” recalls Kang, an IEEE Life Fellow who later became president of the Korea Advanced Institute of Science and Technology (KAIST) in Daejeon, South Korea. With no CAD tools available for full-chip verification, Kang says, the team resorted to printing oversize Calcomp plots. The schematics showed how the transistors, circuit lines, and interconnects should be arranged inside the chip to provide the desired outputs. The team assembled them on the floor with adhesive tape to create a massive square map more than 6 meters on a side. Kang and his colleagues traced every circuit by hand with colored pencils, searching for breaks, overlaps, or mishandled interconnects.

Getting it made

Once the physical design was locked in, the team faced another obstacle: manufacturing. The chips were fabricated at a Western Electric facility in Allentown, Pa., but Kang recalls that the yield rates (the percentage of chips on a silicon wafer that meet performance and quality standards) were dismal. To address that, Kang and his colleagues drove from New Jersey to the plant each day, rolled up their sleeves, and did whatever it took, including sweeping floors and calibrating test equipment, to build camaraderie and instill confidence that the most complicated product the plant workers had ever attempted to produce could indeed be made there.
“The team-building worked out well,” Kang says. “After several months, Western Electric was able to produce more than the required number of good chips.”

The first version of the Bellmac-32, which was ready by 1980, fell short of expectations. Instead of hitting a 4-megahertz performance target, it ran at just 2 MHz. The engineers discovered that the state-of-the-art Takeda Riken testing equipment they were using was flawed, with transmission-line effects between the probe and the test head leading to inaccurate measurements, so they worked with a Takeda Riken team to develop correction tables that rectified the measurement errors. The second generation of Bellmac chips had clock speeds that exceeded 6.2 MHz, sometimes reaching 9 MHz. That was blazing fast for its time. The 16-bit Intel 8088 processor inside IBM’s original PC, released in 1981, ran at 4.77 MHz.

Why Bellmac-32 didn’t go mainstream

Despite its technical promise, the Bellmac-32 did not find wide commercial use. According to Condry, AT&T’s pivot toward acquiring equipment manufacturer NCR, which it began eyeing in the late 1980s, meant the company chose to back a different line of chips. But by then, the Bellmac-32’s legacy was already growing. “Before Bellmac-32, NMOS was dominant,” Condry says. “But CMOS changed the market because it was shown to be a more effective implementation in the fab.” In time, that realization reshaped the semiconductor landscape. CMOS would become the foundation for modern microprocessors, powering the digital revolution in desktops, smartphones, and more.

The audacity of Bell Labs’ bet—to take an untested fabrication process and leapfrog an entire generation of chip architecture—stands as a landmark moment in technological history. As Kang puts it: “We were on the frontier of what was possible. We didn’t just follow the path—we made a new one.” Huang, an IEEE Life Fellow who later became deputy director of the Institute of Microelectronics, Singapore, adds: “This included not only chip architecture and design, but also large-scale chip verification—with CAD but without today’s digital simulation tools or even breadboarding [which is the standard method for checking whether a circuit design for an electronic system that uses chips works before making permanent connections by soldering the circuit elements together].”

Condry, Kang, and Huang look back fondly on that period and express their admiration for the many AT&T employees whose skill and dedication made the Bellmac-32 chip series possible. Administered by the IEEE History Center and supported by donors, the Milestone program recognizes outstanding technical developments around the world. The IEEE North Jersey Section sponsored the nomination.
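The CMOS trade-off described above (twice the transistors per gate, in exchange for almost no static power draw) is easy to put into rough numbers. A back-of-the-envelope sketch with assumed, illustrative values, not Bellmac-32 specifications:

```python
# Hedged back-of-the-envelope comparison (illustrative, assumed values; not
# Bellmac-32 specifications). A static NMOS gate draws current through its
# load whenever its output is low; a CMOS gate ideally draws current only
# while switching, with dynamic power P = alpha * C * V^2 * f.
V = 5.0          # supply voltage, volts (typical for the era)
F = 2e6          # clock frequency, Hz (first Bellmac-32 silicon ran at 2 MHz)
C = 0.5e-12      # assumed switched capacitance per gate, farads
ALPHA = 0.1      # assumed fraction of gates switching per cycle
I_LOAD = 50e-6   # assumed NMOS pull-up load current, amps
N_GATES = 100_000

cmos_dynamic = ALPHA * C * V**2 * F * N_GATES   # watts, switching only
nmos_static = 0.5 * I_LOAD * V * N_GATES        # outputs low ~half the time

print(f"CMOS dynamic power: {cmos_dynamic:.2f} W")
print(f"NMOS static power:  {nmos_static:.2f} W")
```

Even with these rough numbers, the static dissipation of an NMOS design dwarfs the switching power of an equivalent CMOS one, which is why the doubled transistor count was judged worth the cost.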
Artificial intelligence is powering weather forecasts that are generally more accurate than conventional forecasts and are faster and cheaper to produce. But new research shows AI may fail to predict unprecedented weather events, a troubling finding as warming fuels new extremes. Read more on E360 →
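The failure mode has a simple analogue in any data-driven model. A toy sketch (not the study's weather models, which are far more sophisticated): a tree-based regressor trained on a limited range cannot predict values beyond the largest it has seen, much as a model trained on historical weather may undershoot unprecedented extremes.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Toy illustration (not the study's forecast models): tree ensembles cannot
# predict beyond the range of their training targets, one reason purely
# data-driven forecasts can miss unprecedented extremes.
rng = np.random.default_rng(1)
x_train = rng.uniform(0, 30, 500).reshape(-1, 1)   # e.g., observed temperatures
y_train = 1.5 * x_train.ravel() + rng.normal(0, 1, 500)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(x_train, y_train)

x_new = np.array([[25.0], [40.0]])            # 40 is outside the training range
print("predictions:", model.predict(x_new))   # second prediction saturates
print("true values:", 1.5 * x_new.ravel())    # ~37.5 and 60.0
```

The in-range prediction lands close to the truth, while the out-of-range one saturates near the largest training value: the model has never seen anything like the new extreme.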
A new proof illuminates the hidden patterns that emerge when addition becomes impossible. Read "Graduate Student Solves Classic Problem About the Limits of Addition" at Quanta Magazine.
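Assuming the "classic problem about the limits of addition" concerns sum-free sets, that is, sets containing no solution to x + y = z among their elements, here is a minimal sketch of the definition:

```python
from itertools import combinations_with_replacement

# Hedged aside: assuming the problem concerns sum-free sets, i.e. sets with
# no solution to x + y = z among their elements (x and y may be equal).
def is_sum_free(s: set[int]) -> bool:
    return all(x + y not in s for x, y in combinations_with_replacement(s, 2))

print(is_sum_free({1, 3, 5, 7}))   # True: odd + odd is even, never in the set
print(is_sum_free({1, 2, 3}))      # False: 1 + 2 = 3
```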
I’m not sure who scheduled ODSC and PyCon US during the same week, but I am unhappy with their decisions. Last Tuesday I presented a talk and co-presented a workshop at ODSC, and on Thursday I presented a tutorial at PyCon. If you would like to follow along with my very busy week, here are the resources: Practical Bayesian Modeling with PyMC, co-presented with Alex Fengler for ODSC East 2025. In this tutorial, we explore Bayesian regression using PyMC – the... Read the full post, "My very busy week," at Probably Overthinking It.
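For readers who want a taste before opening the tutorial materials, here is a minimal sketch of the kind of Bayesian regression one builds in PyMC (illustrative data and priors, not the tutorial's actual notebook):

```python
import numpy as np
import pymc as pm

# Hedged sketch of a simple Bayesian linear regression in PyMC
# (illustrative only; not the tutorial's notebook).
rng = np.random.default_rng(42)
x = np.linspace(0, 10, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 1.5, size=x.size)   # synthetic data

with pm.Model():
    slope = pm.Normal("slope", mu=0, sigma=10)        # weakly informative priors
    intercept = pm.Normal("intercept", mu=0, sigma=10)
    sigma = pm.HalfNormal("sigma", sigma=5)
    mu = slope * x + intercept
    pm.Normal("obs", mu=mu, sigma=sigma, observed=y)  # likelihood
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=42)

print(idata.posterior["slope"].mean().item())         # should be near 2.0
```

The posterior over the slope and intercept quantifies uncertainty about the fit, which is the core payoff of the Bayesian approach over a point estimate.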