We’re not going to post things on Twitter/X anymore. The new owner keeps doing awful stuff. If you have enjoyed our mostly-daily curated links via the aforementioned collapsing service, we invite you to bookmark our curated links page, or follow us in a number of other ways. Rather than linger any longer on this tedious topic, here are some home-grown dad jokes. If there is any order in this universe, the comments section will fill with more of the same.

Q: What is the flavor of a chair? Do you even know the meaning of the word ‘rhetorical?’ Don’t answer that!

My friend bought an alarm clock that makes loud farting sounds in the morning. He’s in for a rude awakening.

You’re right, these ARE my orthopedic shoes. I stand corrected.

I want a good game of hide and seek, but skilled players are hard to find.

Like tight sweaters, corporate acquisitions are hard to pull off.

I was offered a job at the mirror factory. I could see myself working there.

Did you hear about the farmer in Colorado...
a year ago

More from Damn Interesting

A Trail Gone Cold

Iceland is known to the rest of the world as the land of Vikings and volcanos, an island caught between continents at the extremities of the map. Remote and comparatively inhospitable, it was settled only as long ago as the 9th century, and has seen little additional in-migration since. Even today, more than 90 percent of Iceland’s 390,000 residents can trace their ancestry back to the earliest permanent inhabitants, a Nordic-Celtic mix. The tradition of the Norse sagas lives on in the form of careful record-keeping about ancestry—and a national passion for genealogy. In other words, it is not the place to stumble upon old family mysteries.

But growing up in the capital city of Reykjavík in the 1950s, neurologist Dr. Kári Stefánsson heard stories that left him curious. Stefánsson’s father had come from Djúpivogur, an eastern coastal town where everyone still spoke of a Black man who had moved there early in the 19th century. “Hans Jónatan”, they called him—a well-liked shopkeeper who had arrived on a ship, married a spirited woman from a local farm, and became a revered member of the community. The local census did record a man by the name of Hans Jónatan, born in the Caribbean, who was working at the general store in Djúpivogur in the 19th century—but that was all. No images of the man had survived, and his time in Iceland was well before any other humans with African ancestry are known to have visited the island. If tiny, remote Djúpivogur did have a Black man arrive in the 19th century, the circumstances must have been unusual indeed.

It was an intriguing puzzle—and solid grounds for a scientific investigation. Given the amount of homogeneity in the baseline Icelandic population, the genetic signature of one relative newcomer with distinct ancestry might still stand out across a large sample of his descendants. Geneticists thus joined locals and history scholars, and they pieced together a story that bridged three continents. Continue reading ▶

a year ago 96 votes
Breaking a Bit

It’s been a busy summer, and the large shortfall in donations last month has been demoralizing, so we’re taking a week off to rest and recuperate. The curated links section will be (mostly) silent, and behind the scenes we’ll be taking a brief break from our usual researching, writing, editing, illustrating, narrating, sound designing, coding, et cetera. We plan to return to normalcy on the 11th of September.

(The word “normalcy” was not considered an acceptable alternative to “normality” until 14 May 1920, when then-presidential-candidate Warren G. Harding misused the mathematical term in a campaign speech, stating that America needed, “not nostrums, but normalcy.” He then integrated this error into his campaign slogan, “Return to Normalcy.” Also, the G in Warren G. Harding stood for “Gamaliel.”)

While we are away, on 06 September 2023, Damn Interesting will be turning 18 years old. To celebrate, here are the first emojis to ever appear in the body of a Damn Interesting post: 🎂🎉🎁

If you become bored while we are away, you might try a little mobile game we’ve been working on called Wordwhile. It can be played alone, or with a friend. If you enjoy games like Scrabble and Wordle, you may find this one ENJOYABLE (75 points). Launch Wordwhile →

And, as always, there are lots of ways to explore our back-catalog. View this post ▶

a year ago 87 votes
Journey to the Invisible Planet

In the late 17th century, natural philosopher Isaac Newton was deeply uneasy with a new scientific theory that was gaining currency in Europe: universal gravitation. In correspondence with a scientific contemporary, Newton complained that it was “an absurdity” to suppose that “one body may act upon another at a distance through a vacuum.” The scientist who proposed this preposterous theory was Isaac Newton. He first articulated the idea in his widely acclaimed magnum opus Principia, wherein he explained, “I have not yet been able to discover the cause of these properties of gravity from phenomena and I feign no hypotheses […] It is enough that gravity does really exist and acts according to the laws I have explained.”

Newton proposed that celestial bodies were not the sole sources of gravity in the universe; rather, all matter attracts all other matter with a force that corresponds to mass and diminishes rapidly with distance. He had been studying the motions of the six known planets–Mercury, Venus, Earth, Mars, Jupiter, and Saturn–and by expanding upon the laws of planetary motion developed by Johannes Kepler about eight decades earlier, he arrived at an equation for gravitational force F that seemed to match decades of data:

F = G × m1 × m2 / r²

Where m1 and m2 are the masses of the objects, r is the distance between their centers of mass, and G is the gravitational constant (~0.0000000000667408). But this is only an approximation; humanity may never know the precise value because it is impossible to isolate any measuring apparatus from all of the gravity in the universe.

Fellow astronomers found that Newton’s theory seemed to be accurate–universal gravitation appeared to reliably forecast the sometimes irregular motion of the planets even more closely than Kepler’s laws. In 1705, Queen Anne knighted Isaac Newton to make him Sir Isaac Newton (though this honor was due to his work in politics, not for his considerable contributions to math or science).

In the century that followed, Newton’s universal gravitation performed flawlessly. Celestial bodies appeared to adhere to the elegant theory, and in scientific circles, it began to crystallize into a law of nature. But in the early 19th century, cracks began to appear. When astronomer Alexis Bouvard used Newton’s equations to carefully calculate future positions of Jupiter and Saturn, they proved spectacularly accurate. However, when he followed up in 1821 with astronomical tables for Uranus–the outermost known planet–subsequent observations revealed that the planet was crossing the sky substantially slower than projected. The fault was not in Bouvard’s math; Uranus appeared to be violating the law of universal gravitation.

Newton’s theory was again called into question in 1843 by a 32-year-old assistant astronomer at the Paris Observatory, Urbain Le Verrier. Le Verrier had been following the Uranus perturbations with great interest, while also compiling a painstaking record of the orbit of Mercury–the innermost known planet. He found that Mercury also departed from projections made by universal gravitation. Was universal gravitation a flawed theory? Or might undiscovered planets lurk in extra-Uranian and intra-Mercurial space, disturbing the orbits of the known planets? Astronomers around the world scoured the skies, seeking out whatever was perturbing the solar system. The answer, it turned out, was more bizarre than they could have supposed. Continue reading ▶
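As a quick illustration of the formula in the excerpt above, here is a short Python sketch that plugs in example values; the Earth and Moon figures below are standard reference numbers supplied for illustration, not taken from the article:

# Newton's law of universal gravitation: F = G * m1 * m2 / r**2
# Example bodies (assumed for illustration): Earth and the Moon.
G = 6.67408e-11   # gravitational constant, m^3 kg^-1 s^-2
m1 = 5.972e24     # mass of Earth, kg
m2 = 7.348e22     # mass of the Moon, kg
r = 3.844e8       # mean Earth-Moon distance, m

F = G * m1 * m2 / r**2   # attractive force in newtons
print(f"{F:.3e} N")      # prints roughly 1.98e+20 N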

a year ago 40 votes
From Where the Sun Now Stands

An American Indian man on horseback stood outlined against a steely sky past midday on 05 October 1877. Winter was already settling into the prairies of what would soon become the state of Montana. Five white men stood in the swaying grass on the other side of the field, watching the horse move closer. Four wore blue uniforms; the fifth was in civilian attire.

One of the uniformed men was tall and stout, with bright blue eyes and a large, curling mustache. He watched the proceedings with an air of self-importance. The surrender of the man on horseback might have been inevitable, sure, but it was nevertheless a nice feather in his cap. Perhaps his superiors would finally grant him that promotion after this whole affair was over.

The other four men were more apprehensive. All of them were experienced in fighting American Indians on the frontier, but this opponent had been different. One man, with a full, dark beard and right arm missing below the elbow, looked at the approaching chief with grudging respect. The man had lost his arm in the American Civil War 15 years earlier, so he knew battle well. And in his opinion, the man across the field was a tactical genius, a “Red Napoleon.” Despite overwhelming odds, this Red Napoleon had wormed his way out of battle after battle, somehow always coming out on top. Continue reading ▶

over a year ago 34 votes

More in science

Animals as chemical factories

The price of purple

2 days ago 3 votes
32 Bits That Changed Microprocessor Design

In the late 1970s, a time when 8-bit processors were state of the art and CMOS was the underdog of semiconductor technology, engineers at AT&T’s Bell Labs took a bold leap into the future. They made a high-stakes bet to outpace IBM, Intel, and other competitors in chip performance by combining cutting-edge 3.5-micron CMOS fabrication with a novel 32-bit processor architecture. Although their creation—the Bellmac-32 microprocessor—never achieved the commercial fame of earlier ones such as Intel’s 4004 (released in 1971), its influence has proven far more enduring. Virtually every chip in smartphones, laptops, and tablets today relies on the complementary metal-oxide semiconductor principles that the Bellmac-32 pioneered.

As the 1980s approached, AT&T was grappling with transformation. For decades, the telecom giant—nicknamed “Ma Bell”—had dominated American voice communications, with its Western Electric subsidiary manufacturing nearly every telephone found in U.S. homes and offices. The U.S. federal government was pressing for antitrust-driven divestiture, but AT&T was granted an opening to expand into computing. With computing firms already entrenched in the market, AT&T couldn’t afford to play catch-up; its strategy was to leap ahead, and the Bellmac-32 was its springboard.

The Bellmac-32 chip series has now been honored with an IEEE Milestone. Dedication ceremonies are slated to be held this year at the Nokia Bell Labs’ campus in Murray Hill, N.J., and at the Computer History Museum in Mountain View, Calif.

A chip like no other

Rather than emulate the industry standard of 8-bit chips, AT&T executives challenged their Bell Labs engineers to deliver something revolutionary: the first commercially viable microprocessor capable of moving 32 bits in one clock cycle. It would require not just a new chip but also an entirely novel architecture—one that could handle telecommunications switching and serve as the backbone for future computing systems.

“We weren’t just building a faster chip,” says Michael Condry, who led the architecture team at Bell Labs’ Holmdel facility in New Jersey. “We were trying to design something that could carry both voice and computation into the future.”

[Image: This configuration of the Bellmac-32 microprocessor had an integrated memory management unit optimized for Unix-like operating systems. Credit: AT&T Archives and History Center]

At the time, CMOS technology was seen as a promising—but risky—alternative to the NMOS and PMOS designs then in use. NMOS chips, which relied solely on N-type transistors, were fast but power-hungry. PMOS chips, which depend on the movement of positively-charged holes, were too slow. CMOS, with its hybrid design, offered the potential for both speed and energy savings. The benefits were so compelling that the industry soon saw that the need for double the number of transistors (NMOS and PMOS for each gate) was worth the tradeoff. As transistor sizes shrank along with the rapid advancement of semiconductor technology described by Moore’s Law, the cost of doubling up the transistor density soon became manageable and eventually became negligible.

But when Bell Labs took its high-stakes gamble, large-scale CMOS fabrication was still unproven and looked to be comparatively costly. That didn’t deter Bell Labs. By tapping expertise from its campuses in Holmdel and Murray Hill as well as in Naperville, Ill., the company assembled a dream team of semiconductor engineers.
The team included Condry; Sung-Mo “Steve” Kang, a rising star in chip design; Victor Huang, another microprocessor chip designer; and dozens of AT&T Bell Labs employees. They set out in 1978 to master a new CMOS process and create a 32-bit microprocessor from scratch.

Designing the architecture

The architecture group led by Condry, an IEEE Life Fellow who would later become Intel’s CTO, focused on building a system that would natively support the Unix operating system and the C programming language. Both were in their infancy but destined for dominance. To cope with the era’s memory limitations—kilobytes were precious—they introduced a complex instruction set that required fewer steps to carry out and could be executed in a single clock cycle.

The engineers also built the chip to support the VersaModule Eurocard (VME) parallel bus, enabling distributed computing so several nodes could handle data processing in parallel. Making the chip VME-enabled also allowed it to be used for real-time control. The group wrote its own version of Unix, with real-time capabilities to ensure that the new chip design was compatible with industrial automation and similar applications.

The Bell Labs engineers also invented domino logic, which ramped up processing speed by reducing delays in complex logic gates. Additional testing and verification techniques were developed and introduced via the Bellmac-32 Module, a sophisticated multi-chipset verification and testing project led by Huang that allowed the complex chip fabrication to have zero or near-zero errors. This was the first of its kind in VLSI testing. The Bell Labs engineers’ systematic plan for double- and triple-checking their colleagues’ work ultimately made the total design of the multiple-chipset family work together seamlessly as a complete microcomputer system.

Then came the hardest part: actually building the chip.

Floor maps and colored pencils

“The technology for layout, testing, and high-yield fabrication just wasn’t there,” recalls Kang, an IEEE Life Fellow who later became president of the Korea Advanced Institute of Science and Technology (KAIST) in Daejeon, South Korea. With no CAD tools available for full-chip verification, Kang says, the team resorted to printing oversize Calcomp plots. The schematics showed how the transistors, circuit lines, and interconnects should be arranged inside the chip to provide the desired outputs. The team assembled them on the floor with adhesive tape to create a massive square map more than 6 meters on a side. Kang and his colleagues traced every circuit by hand with colored pencils, searching for breaks, overlaps, or mishandled interconnects.

Getting it made

Once the physical design was locked in, the team faced another obstacle: manufacturing. The chips were fabricated at a Western Electric facility in Allentown, Pa., but Kang recalls that the yield rates (the percentage of chips on a silicon wafer that meet performance and quality standards) were dismal. To address that, Kang and his colleagues drove from New Jersey to the plant each day, rolled up their sleeves, and did whatever it took, including sweeping floors and calibrating test equipment, to build camaraderie and instill confidence that the most complicated product the plant workers had ever attempted to produce could indeed be made there.
“The team-building worked out well,” Kang says. “After several months, Western Electric was able to produce more than the required number of good chips.”

The first version of the Bellmac-32, which was ready by 1980, fell short of expectations. Instead of hitting a 4-megahertz performance target, it ran at just 2 MHz. The engineers discovered that the state-of-the-art Takeda Riken testing equipment they were using was flawed, with transmission-line effects between the probe and the test head leading to inaccurate measurements, so they worked with a Takeda Riken team to develop correction tables that rectified the measurement errors.

The second generation of Bellmac chips had clock speeds that exceeded 6.2 MHz, sometimes reaching 9 MHz. That was blazing fast for its time. The 16-bit Intel 8088 processor inside IBM’s original PC, released in 1981, ran at 4.77 MHz.

Why Bellmac-32 didn’t go mainstream

Despite its technical promise, the Bellmac-32 did not find wide commercial use. According to Condry, AT&T’s pivot toward acquiring equipment manufacturer NCR, which it began eyeing in the late 1980s, meant the company chose to back a different line of chips. But by then, the Bellmac-32’s legacy was already growing.

“Before Bellmac-32, NMOS was dominant,” Condry says. “But CMOS changed the market because it was shown to be a more effective implementation in the fab.” In time, that realization reshaped the semiconductor landscape. CMOS would become the foundation for modern microprocessors, powering the digital revolution in desktops, smartphones, and more.

The audacity of Bell Labs’ bet—to take an untested fabrication process and leapfrog an entire generation of chip architecture—stands as a landmark moment in technological history. As Kang puts it: “We were on the frontier of what was possible. We didn’t just follow the path—we made a new one.”

Huang, an IEEE Life Fellow who later became deputy director of the Institute of Microelectronics, Singapore, adds: “This included not only chip architecture and design, but also large-scale chip verification—with CAD but without today’s digital simulation tools or even breadboarding [which is the standard method for checking whether a circuit design for an electronic system that uses chips works before making permanent connections by soldering the circuit elements together].”

Condry, Kang, and Huang look back fondly on that period and express their admiration for the many AT&T employees whose skill and dedication made the Bellmac-32 chip series possible.

Administered by the IEEE History Center and supported by donors, the Milestone program recognizes outstanding technical developments around the world. The IEEE North Jersey Section sponsored the nomination.
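To make the CMOS trade-off described earlier in the article a bit more concrete, here is a toy Python sketch, an illustration assumed for this page rather than anything from the Bellmac-32 design, of a static CMOS NAND gate: each input drives one transistor in the NMOS pull-down network and one in the PMOS pull-up network, which is why the transistor count doubles relative to NMOS-only logic while steady-state current paths disappear.

# Toy behavioral model of a static CMOS NAND gate (illustrative sketch only).
def cmos_nand(a: int, b: int) -> int:
    # Pull-down network: series NMOS transistors conduct only when both
    # inputs are high, tying the output node to ground (logic 0).
    pull_down = (a == 1) and (b == 1)
    # Pull-up network: parallel PMOS transistors conduct when either
    # input is low, tying the output node to the supply (logic 1).
    pull_up = (a == 0) or (b == 0)
    # Exactly one network conducts for any input combination, so an ideal
    # static CMOS gate draws no steady-state current -- the energy advantage
    # over NMOS mentioned in the article.
    assert pull_up != pull_down
    return 1 if pull_up else 0

# Truth table check: outputs are 1, 1, 1, 0 (NAND).
for a in (0, 1):
    for b in (0, 1):
        print(a, b, cmos_nand(a, b))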

3 days ago 7 votes
In Test, A.I. Weather Model Fails to Predict Freak Storm

Artificial intelligence is powering weather forecasts that are generally more accurate than conventional forecasts and are faster and cheaper to produce. But new research shows A.I. may fail to predict unprecedented weather events, a troubling finding as warming fuels new extremes. Read more on E360 →

3 days ago 4 votes
Graduate Student Solves Classic Problem About the Limits of Addition

A new proof illuminates the hidden patterns that emerge when addition becomes impossible. The post Graduate Student Solves Classic Problem About the Limits of Addition first appeared on Quanta Magazine.

3 days ago 4 votes
My very busy week

I’m not sure who scheduled ODSC and PyConUS during the same week, but I am unhappy with their decisions. Last Tuesday I presented a talk and co-presented a workshop at ODSC, and on Thursday I presented a tutorial at PyCon. If you would like to follow along with my very busy week, here are the resources:

Practical Bayesian Modeling with PyMC
Co-presented with Alex Fengler for ODSC East 2025
In this tutorial, we explore Bayesian regression using PyMC – the... Read More

The post My very busy week appeared first on Probably Overthinking It.
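For readers unfamiliar with the topic, here is a minimal sketch of Bayesian linear regression in PyMC; the synthetic data and priors below are illustrative assumptions, not material from the tutorial itself:

import numpy as np
import pymc as pm

# Synthetic data (assumed for illustration): y ≈ 2x + 1 plus noise.
rng = np.random.default_rng(42)
x = rng.normal(size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=100)

with pm.Model():
    # Weakly informative priors on the slope, intercept, and noise scale.
    slope = pm.Normal("slope", mu=0, sigma=10)
    intercept = pm.Normal("intercept", mu=0, sigma=10)
    sigma = pm.HalfNormal("sigma", sigma=1)

    # Likelihood of the observed data under the linear model.
    mu = slope * x + intercept
    pm.Normal("obs", mu=mu, sigma=sigma, observed=y)

    # Sample the posterior with the default NUTS sampler.
    idata = pm.sample(1000, tune=1000, chains=2)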

3 days ago 7 votes