Millions of people worldwide have reason to be thankful that Swedish engineer Rune Elmqvist decided not to practice medicine. Although qualified as a doctor, he chose to invent medical equipment instead. In 1949, while working at Elema-Schonander (later Siemens-Elema), in Stockholm, he applied for a patent for the Mingograph, the first inkjet printer. Its movable nozzle deposited an electrostatically controlled jet of ink droplets on a spool of paper.

Rune Elmqvist qualified to be a physician, but he devoted his career to developing medical equipment, like this galvanometer. Håkan Elmqvist/Wikipedia

Elmqvist demonstrated the Mingograph at the First International Congress of Cardiology in Paris in 1950. It could record physiological signals from a patient’s electrocardiogram or electroencephalogram in real time, aiding doctors in diagnosing heart and brain conditions. Eight years later, he worked with cardiac surgeon Åke Senning to develop the first fully implantable pacemaker. So whether you’re running documents through an inkjet printer or living your best life due to a pacemaker, give a nod of appreciation to the inventive Dr. Elmqvist.

The world’s first inkjet printer

Rune Elmqvist was an inquisitive person. While still a student, he invented a specialized potentiometer to measure pH and a portable multichannel electrocardiograph. In 1940, he became head of development at the Swedish medical electronics company Elema-Schonander.

Before the Mingograph, electrocardiograph machines relied on a writing stylus to trace the waveform on a moving roll of paper. But friction between the stylus and the paper prevented small changes in the electrical signal from being accurately recorded.

Elmqvist’s initial design was a modified oscillograph. Traditionally, an oscillograph used a mirror to reflect a beam of light (converted from the electrical signal) onto photographic film or paper. Elmqvist swapped out the mirror for a small, movable glass nozzle that continuously sprayed a thin stream of liquid onto a spool of paper. The electrical signal electrostatically controlled the jet.

The Mingograph was originally used to record electrocardiograms of heart patients. It soon found use in many other fields. Siemens Healthineers Historical Institute

By eliminating the friction of a stylus, the Mingograph (which the company marketed as the Mingograf) was able to record more detailed changes of the heartbeat. The machine had three paper-feed speeds: 10, 25, and 50 millimeters per second. The speed could be preset or changed while in operation.
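Those feed speeds determine how much paper a given stretch of signal occupies. The short Python sketch below models the recording geometry described above, with the moving paper supplying the time axis and the electrostatically deflected jet supplying the amplitude axis; the deflection gain and the synthetic waveform are invented for illustration, not taken from the instrument’s specifications.

```python
# Illustrative model of the Mingograph's recording geometry (not a circuit
# diagram of the actual instrument): the paper feeds at a constant speed
# while the ink jet is deflected in proportion to the input voltage.
# The gain and the synthetic "heartbeat" below are hypothetical.
import numpy as np

PAPER_SPEEDS_MM_S = (10, 25, 50)   # the three documented feed speeds

def trace_positions(signal_mv, sample_rate_hz, paper_speed_mm_s, gain_mm_per_mv=10.0):
    """Map signal samples to (x, y) ink positions on the paper, in millimeters."""
    t = np.arange(len(signal_mv)) / sample_rate_hz
    x = paper_speed_mm_s * t           # horizontal axis: the moving paper
    y = gain_mm_per_mv * signal_mv     # vertical axis: electrostatic deflection
    return x, y

# One second of a synthetic 1-millivolt waveform sampled at 500 Hz:
t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 1.2 * t) * np.exp(-((t % (1 / 1.2)) * 8) ** 2)

for speed in PAPER_SPEEDS_MM_S:
    x, _ = trace_positions(signal, 500, speed)
    print(f"1 s of signal spans {x[-1]:.1f} mm of paper at {speed} mm/s")
```

Doubling the feed speed stretches the same second of signal across twice as much paper, which is how the faster settings bought more temporal detail.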
RELATED: The Inventions That Made Heart Disease Less Deadly

An analog input jack on the Mingograph could be used to take measurements from other instruments. Researchers in disciplines far afield from medicine took advantage of this input to record pressure or sound. Phoneticians used it to examine the acoustic aspects of speech, and zoologists used it to record birdsongs. Throughout the second half of the 20th century, scientists cited the Mingograph in their research papers as an instrument for their experiments.

Today, the Mingograph isn’t that widely known, but the underlying technology, inkjet printing, is ubiquitous. Inkjets dominate the home printer market, and specialized printers print DNA microarrays in labs for genomics research, create electrical traces for printed circuit boards, and much more, as Phillip W. Barth and Leslie A. Field describe in their 2024 IEEE Spectrum article “Inkjets Are for More Than Just Printing.”

The world’s first implantable pacemaker

Despite the influence of the Mingograph on the evolution of printing, it is arguably not Elmqvist’s most important innovation. The Mingograph helped doctors diagnose heart conditions, but it couldn’t save a patient’s life by itself. One of Elmqvist’s other inventions could and did: the first fully implantable, rechargeable pacemaker.

The first implantable pacemaker [left] from 1958 had batteries that needed to be recharged once a week. The 1983 pacemaker [right] was programmable, and its batteries lasted several years. Siemens Healthineers Historical Institute

Like many stories in the history of technology, this one was brought to fruition at the urging of a woman, in this case Else-Marie Larsson. Else-Marie’s 43-year-old husband, Arne, suffered from scarring of his heart tissue due to a viral infection. His heart beat so slowly that he constantly lost consciousness, a condition known as Stokes-Adams syndrome. Else-Marie refused to accept his death sentence and searched for an alternative. After reading a newspaper article about an experimental implantable pacemaker being developed by Elmqvist and Senning at the Karolinska Hospital in Stockholm, she decided that her husband would be the perfect candidate to test it out, even though it had been tried only on animals up until that point.

External pacemakers—that is, devices outside the body that regulated the heartbeat by applying electricity—already existed, but they were heavy, bulky, and uncomfortable. One early model plugged directly into a wall socket, so the user risked electric shock. By comparison, Elmqvist’s pacemaker was small enough to be implanted in the body and posed no shock risk.

Fully encased in an epoxy resin, the disk-shaped device had a diameter of 55 mm and a thickness of 16 mm—the dimensions of the Kiwi Shoe Polish tin in which Elmqvist molded the first prototypes. It used silicon transistors to pace a pulse with an amplitude of 2 volts and a duration of 1.5 milliseconds, at a rate of 70 to 80 beats per minute (the average adult heart rate). The pacemaker ran on two rechargeable 60-milliampere-hour nickel-cadmium batteries arranged in series. A silicon diode connected the batteries to a coil antenna. A 150-kilohertz radio loop antenna outside the body charged the batteries inductively through the skin. The charge lasted about a week, but it took 12 hours to recharge. Imagine having to stay put that long.
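Those published figures invite a back-of-envelope check. The sketch below estimates how much of the weekly charge the pacing pulses themselves could have consumed; the 500-ohm lead-and-tissue load is an assumed round number, not a documented specification.

```python
# Back-of-envelope energy budget for the 1958 pacemaker, using the figures
# in the text: 60 mAh cells, 2 V / 1.5 ms pulses at about 75 beats per
# minute, and roughly one week per charge. The 500-ohm lead/tissue load
# is an assumed ballpark value, not a documented specification.
PULSE_VOLTS = 2.0
PULSE_SECONDS = 1.5e-3
RATE_HZ = 75 / 60            # ~75 beats per minute
LOAD_OHMS = 500.0            # assumption, for illustration only

pulse_current = PULSE_VOLTS / LOAD_OHMS              # ~4 mA while pulsing
charge_per_pulse = pulse_current * PULSE_SECONDS     # ~6 microcoulombs
pacing_current = charge_per_pulse * RATE_HZ          # average pacing draw

capacity_coulombs = 60e-3 * 3600                     # 60 mAh expressed in coulombs
week_seconds = 7 * 24 * 3600
implied_total_current = capacity_coulombs / week_seconds

print(f"average pacing current: {pacing_current * 1e6:.1f} microamps")
print(f"average draw implied by a one-week charge: {implied_total_current * 1e6:.0f} microamps")
```

Under these assumptions the pulses draw only a few microamps of a roughly 360-microamp average budget, which suggests that most of each week’s charge went to the rest of the circuit and to the NiCd cells’ own self-discharge.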
In 1958, over 30 years before this photo, Arne Larsson [right] received the first implantable pacemaker, developed by Rune Elmqvist [left] at Siemens-Elema. Åke Senning [center] performed the surgery. Sjöberg Bildbyrå/ullstein bild/Getty Images

Else-Marie’s persuasion and persistence pushed Elmqvist and Senning to move from animal tests to human trials, with Arne as their first case study. During a secret operation on 8 October 1958, Senning placed the pacemaker in Arne’s abdominal wall with two leads implanted in the myocardium, a layer of muscle in the wall of the heart. The device lasted only a few hours. But its replacement, which happened to be the only spare at the time, worked perfectly for six weeks and then off and on for several more years.

Arne Larsson lived another 43 years after his first pacemaker was implanted. Shown here are five of the pacemakers he received. Sjöberg Bildbyrå/ullstein bild/Getty Images

Arne Larsson clearly was happy with the improvement the pacemaker made to his quality of life, because he endured 25 more operations over his lifetime to replace each failing pacemaker with a new, improved iteration. He managed to outlive both Elmqvist and Senning, finally dying at the age of 86 on 28 December 2001. Thanks to the technological intervention of his numerous pacemakers, his heart never gave out. His cause of death was skin cancer.

Today, more than a million people worldwide have pacemakers implanted each year, and an implanted device can last up to 15 years before needing to be replaced. (Some pacemakers in the 1980s used nuclear batteries, which could last even longer, but the radioactive material was problematic. See “The Unlikely Revival of Nuclear Batteries.”) Some pacemakers also incorporate a defibrillator to shock the heart back to a normal rhythm when it gets too far out of sync. This lifesaving device certainly has come a long way from its humble start in a shoe polish tin.

Rune Elmqvist’s legacy

Whenever I start researching the object of the month for Past Forward, I never know where the story will take me or how it might hit home. My dad lived with congestive heart failure for more than two decades and absolutely loved his pacemaker. He had a great relationship with his technician, Francois, and they worked together to fine-tune the device and maximize its benefits. And just like Arne Larsson, my dad died from an unrelated cause. An engineer to the core, he would have delighted in learning about the history of this fantastic invention. And he probably would have been tickled by the fact that the same person also invented the inkjet printer. My dad was not a fan of inkjets, but I’m sure he would have greatly admired Rune Elmqvist, who saw problems that needed solving and came up with elegantly engineered solutions.

Part of a continuing series looking at historical artifacts that embrace the boundless potential of technology. An abridged version of this article appears in the September 2025 print issue.

References

There is frustratingly little documented information about the Mingograph’s origin story or functionality other than its patent. I pieced together how it worked by reading the methodology sections of various scientific papers, such as Alf Nachemson’s 1960 article in Acta Orthopaedica Scandinavica, “Lumbar Intradiscal Pressure: Experimental Studies on Post-mortem Material”; Ingemar Hjorth’s 1970 article in the Journal of Theoretical Biology, “A Comment on Graphic Displays of Bird Sounds and Analyses With a New Device, the Melograph Mona”; and Paroo Nihalani’s 1975 article in Phonetica, “Velopharyngeal Opening in the Formation of Voiced Stops in Sindhi.” Such sources reveal how this early inkjet printer moved from cardiology into other fields. Descriptions of Elmqvist’s pacemaker were much easier to find, with Mark Nicholls’s 2007 profile “Pioneers of Cardiology: Rune Elmqvist, M.D.,” in Circulation: Journal of the American Heart Association, being the main source. Siemens also pays tribute to the pacemaker on its website; see, for example, “A Lifesaver in a Plastic Cup.”
The last time I used a dial-up modem came sometime around 2001. Within just a few years, dial-up had exited my life, never to return. I haven’t even had a telephone line in my house for most of my adult life. But I still feel a strong tinge of sadness to know that AOL is finally retiring the ol’ hobbyhorse. At the end of September, it’s gone.

The timeline is almost on-the-nose fitting: The widespread access to the Internet AOL’s service brought in the 1990s is associated with a digital phenomenon called the Eternal September. Before AOL allowed broad access to Usenet—a precursor to today’s online discussion forums—most new users appeared each September, when new college students frequently joined the platform. Thanks to AOL, they began showing up daily starting around September 1993.

The fact that AOL’s dial-up is still active in the first place highlights a truism of technology: Sometimes, the important stuff sticks around well after it’s obsolete.

Why AOL is ditching dial-up now

It’s no surprise that dial-up has lingered for close to a quarter-century. Despite not having needed a dial-up modem myself since the summer of 2001, I was once so passionate about dial-up that I begged to get a modem for my 13th birthday.

Modems are hard to shake, and not just because we fondly remember waiting so long for them to do their thing. Originally, the telephone modem was a hack. It was pushed into public consciousness partly by Deaf users who worked around the phone industry’s monopolistic regulations to develop the teletypewriter, a system to communicate over phone lines via text. Along the way, the community invented technologies like the acoustic coupler.

To make that hack function, modems had to do multiple conversions in real time—from data to audio and back again, in two directions. As I put it in a piece that compared the modem to the telegraph:

The modem, at least in its telephone-based forms, represents a dance between sound and data. By translating information into an aural signal, then into current, then back into an aural signal, then back into data once again, the modulation and demodulation going on is very similar to the process used with the original telegraph, albeit done manually.

Modems like this one from U.S. Robotics work by converting data to audio and back again. Jphill19/Wikimedia Commons

With telegraphs, the information was input by a person, translated into electric pulses, and received by another person. Modems work the same way, just without human translators.

The result of all this back and forth was that modems had to give up a hell of a lot of speed to make this all work. The need to connect over a medium built for audio meant that data was at risk of getting lost over the line. (This is why error correction was an essential part of the modem’s evolution; often data needed to be shared more than once to ensure it got through. Without error correction, dial-up modems would be even slower.)
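That dance between sound and data is easy to sketch in code. The example below shows the modulating half using Bell 103-style frequency-shift keying, the scheme used by early 300-bit-per-second modems, in which each bit becomes a short burst of one of two audio tones. It is a simplified illustration: a real modem keeps the phase continuous from bit to bit, and of course also implements the demodulating half, typically with a pair of filters tuned to the two tones.

```python
# A minimal sketch of the "data to audio" half of a dial-up modem:
# Bell 103-style frequency-shift keying, where each bit becomes a short
# tone burst (1270 Hz for a 1, 1070 Hz for a 0, on the originating side).
import numpy as np

SAMPLE_RATE = 8000           # telephone-grade sampling
BAUD = 300                   # Bell 103 signaling rate
MARK_HZ, SPACE_HZ = 1270.0, 1070.0

def fsk_modulate(bits):
    """Turn a bit sequence into an audio waveform, one tone burst per bit."""
    samples_per_bit = SAMPLE_RATE // BAUD   # ~26 samples per bit at 300 baud
    t = np.arange(samples_per_bit) / SAMPLE_RATE
    tones = [np.sin(2 * np.pi * (MARK_HZ if b else SPACE_HZ) * t) for b in bits]
    return np.concatenate(tones)

audio = fsk_modulate([1, 0, 1, 1, 0])
print(f"{len(audio)} samples = {len(audio) / SAMPLE_RATE * 1000:.0f} ms of audio for 5 bits")
```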
Remember that sound? It marked many users’ first experience getting online. AdventuresinHD/YouTube

Telephone lines were a hugely inefficient system for data because they were built for voice and heavily compressed audio. Voices are still clear and recognizable after being compressed, but audio compression can wreak havoc on data connections. Plus, there was the problem of line access. With a call, you could not easily share a connection. That meant you couldn’t make phone calls while using dial-up, leading to some homes getting a second line. And at the Internet Service Provider level, having multiple lines got very complex, very fast.

The phone industry knew this, but its initial solution, ISDN, did not take off among mainstream consumers. (A later one, DSL, had better uptake, and is likely one of the few Internet options rural users currently have.) So the industry moved to other solutions to get consumers Internet—coaxial cable, which was already widespread because of cable TV, and fiber, which wasn’t.

The problem is, coax never reached quite as far as telephone wires did, in part because cable television wasn’t technically a utility in the way electricity or water were. In recent years, many attempts have been made to classify Internet access as a public utility, though the most recent one was struck down by an appeals court earlier this year.

The public utility regulation is important. The telephone had struggled to reach rural communities in the 1930s, and only did so after a series of regulations, including one that led to the creation of the Federal Communications Commission, were put into effect. So too did electricity, which needed a dedicated law to expand its reach. But the reach of broadband is frustratingly incomplete, as highlighted by the fact that many areas of the country are not properly covered by cellular signals. And getting new wires hung can be an immensely difficult task, in part because companies that sell fiber, like Verizon and Google, often stop investing due to the high costs. (Though, to Google’s credit, it started expanding again in 2022 after a six-year rollback.)

So, in some areas of the United States, dial-up remains the best option—the result of decades of poor investment in Internet infrastructure. This, for years, has propped up companies like AOL, which has evolved numerous times since it foolishly merged with Time Warner a quarter-century ago.

The first PC-based client called America Online appeared on the graphical operating system GeoWorks. This screenshot shows the DOS AOL client that was distributed with GeoWorks 2.01. Ernie Smith

But AOL is not the company it was. After multiple acquisitions and spin-outs, it is now a mere subsidiary of Yahoo, and it long ago transitioned into a Web-first property. Oh, it still has subscriptions, but they’re effectively fancy analogues for unnecessary security software. And its email client, though defeated by the likes of Gmail years ago, still has its fans.

When I posted the AOL news on social media, about 90 percent of the responses were jokes or genuine notes of respect. But there was a small contingent, maybe 5 percent, that talked about how much this was going to screw over far-flung communities. I don’t think it’s AOL’s responsibility to keep this model going forever. Instead, it looks like the job is going to fall to two companies: Microsoft, whose MSN Dial-Up Internet Access costs US $179.95 per year, and the company United Online, which still operates the longtime dial-up players Juno and NetZero. Satellite Internet is also an option, with older services like HughesNet and newer ones like Starlink picking up the slack.

It’s not AOL’s fault. But AOL is the face of this failing.

AOL dropping dial-up is part of a long fade-out

As technologies go, the dial-up modem has not lasted quite as long as the telegram, which has been active in one form or another for 181 years.
But the modem, which was first used in 1958 as part of an air-defense system, has stuck around for a good 67 years. That makes it one of the oldest pieces of computer-related technology still in modern use. To give you an idea of how old that is: 1958 is also the year that the integrated circuit, an essential building block of any modern computer, was invented. The disk platter, which became the modern hard drive, was invented a year earlier. The floppy disk came a decade later. (It should be noted that the modem itself is not dying—your smartphone has one—but the connection your landline has to your modem, the really loud one, has seen better days.)

The news that AOL is dropping its service might be seen as the end of the line for dial-up, but the story of the telegram hints that this may not be the case. In 2006, much hay was made about Western Union sending its final telegram. But Western Union was never the only company sending telegrams, and another company picked up the business. You can still send a telegram via International Telegram in 2025. (It’s not cheap: A single message, sent the same day, is $34, plus 75 cents per word.)

In many ways, AOL dropping the service is a sign that this already niche use case is going to get more niche. But niche use cases have a way of staying relevant, given the right audience. It’s sort of like why doctors continue to use pagers. As a Planet Money episode from two years ago noted, the additional friction of using pagers worked well with the way doctors functioned, because it ensured that they knew the messages they were getting didn’t compete with anything else.

Dial-up is likely never going to totally die, unless the landline phone system itself gets knocked offline, which AT&T has admittedly been itching to do. It remains one of the cheapest options to get online, outside of drinking a single coffee at a Panera and logging onto the Wi-Fi.

But AOL? While dial-up may have been the company’s primary business earlier in its life, it hasn’t really been its focus in quite a long time. AOL is now a highly diversified company, whose primary focus over the past 15 years has been advertising. It still sells subscriptions, but those subscriptions are about to lose their most important legacy feature.

AOL is simply too weak to support the next generation of Internet service itself. Its inroad to broadband was supposed to be Time Warner Cable; that didn’t work out, so it pivoted to something else, but kept around the legacy business while it was still profitable. It’s likely that emerging technologies, like Microsoft’s Airband Initiative, which relies on distributing broadband over unused “white spaces” on the television dial, stand a better shot. 5G connectivity will also likely improve over time (T-Mobile already promotes its 5G home Internet as a rural option), and perhaps more satellite-based options will emerge.

Technologies don’t die. They just slowly become so irrelevant that they might as well be dead.

The monoculture of the AOL login experience

When I posted the announcement, hidden in an obscure link on the AOL website sent to me by a colleague, it immediately went viral on Bluesky and Mastodon. That meant I got to see a lot of people react to this news in real time. Most had the same comment: I didn’t even know it was still around. Others made modem jokes, or talked about AOL’s famously terrible customer service. What was interesting was that most people said roughly the same thing about the service.
That is not the case with most online experiences, which usually reflect myriad points of view. I think it speaks to the fact that while the Internet was the ultimate monoculture killer, the experience of getting online for the first time was largely monocultural. Usually, it started with a modem connecting to a phone number and dropping us into a single familiar place.

We have lost a lot of Internet Service Providers over the years. Few spark the passion and memories of America Online, a network that somehow beat out more innovative and more established players to become the onramp to the Information Superhighway, for all the good and bad that represents.

AOL must be embarrassed by that history. It barely even announced its closure.
On 29 August 1949, the Soviet Union successfully tested its first nuclear weapon. Over the next year and a half, U.S. President Harry S. Truman resurrected the Office of Civilian Defense (which had been abolished at the end of World War II) and signed into law the Federal Civil Defense Act of 1950, which mobilized government agencies to plan for the aftermath of a global nuclear war. With the Cold War underway, that act kicked off a decades-long effort to ensure that at least some Americans survived nuclear armageddon.

As the largest civilian federal agency with a presence throughout the country, the U.S. Post Office Department was in a unique position to monitor local radiation levels and shelter residents. By the end of 1964, approximately 1,500 postal buildings had been designated as fallout shelters, providing space and emergency supplies for 1.3 million people. Occupants were expected to remain in the shelters until the radioactivity outside was deemed safe. By 1968, about 6,000 postal employees had been trained to use radiological equipment, such as the CD V-700 pictured at top, to monitor beta and gamma radiation. And a group of postal employees organized a volunteer ham radio network to help with communications should the regular networks go down.

What was civil defense in the Cold War?

The basic premise of civil defense was that many people would die immediately in cities directly targeted by nuclear attacks. (Check out Alex Wellerstein’s interactive Nukemap for an estimate of casualties and impact should your hometown—or any location of your choosing—be hit.) It was the residents of other cities, suburbs, and rural communities outside the blast area that would most benefit from civil defense preparations. With enough warning, they could shelter in a shielded site and wait for the worst of the fallout to decay. Anywhere from a day or two to a few weeks after the attack, they could emerge and aid any survivors in the harder-hit areas.

In 1957, a committee of the Office of Defense Mobilization drafted the report Deterrence and Survival in the Nuclear Age for President Dwight D. Eisenhower. Better known as the Gaither Report, it called for the creation of a nationwide network of fallout shelters to protect civilians. Government publications such as The Family Fallout Shelter encouraged Americans who had the space, the resources, and the will to construct shelters for their homes. City dwellers in apartment buildings warranted only half a page in the booklet, with the suggestion to head to the basement and cooperate with other residents.

This model fallout shelter from 1960 was designed for four to six people. Bettmann/Getty Images

Ultimately, very few homeowners actually built a fallout shelter. But Rod Serling, creator of the television series “The Twilight Zone,” saw an opportunity for pointed social commentary. Aired in the fall of 1961, the episode “The Shelter” showed how quickly civilization (epitomized by a suburban middle-class family and their friends) broke down over decisions about who would be saved and who would not.

Meanwhile, President John F. Kennedy had started to shift the national strategy from individual shelters to community shelters. At his instruction, the U.S. Army Corps of Engineers began surveying existing buildings suitable for public shelters. Post offices, especially ones with basements capable of housing at least 50 people, were a natural fit.
Each postmaster was designated as the local shelter manager and granted complete authority to operate the shelter, including determining who would be admitted or excluded. The Handbook for Fallout Shelter Management gave guidance for everything from sleeping arrangements to sanitation standards. Shelters were stocked with food and water, medicine, and, of course, radiological survey instruments.

What to do in case of a nuclear attack

These community fallout shelters were issued a standard kit for radiation detection. The kit came in a cardboard box that contained two radiation monitors, the CD V-700 (a Geiger counter, pictured at top) and the CD V-715 (a simple ion chamber survey meter); two cigar-size CD V-742 dosimeters, to measure a person’s total exposure while wearing the device; and a charger for the dosimeters. Also included was the Handbook for Radiological Monitors, which provided instructions on how to use the equipment and report the results.

Post office fallout shelters were issued standard kits for measuring radioactivity after a nuclear attack. National Postal Museum/Smithsonian Institution

The shelter radiation kit included two radiation monitors, two cigar-size dosimeters, and a charger for the dosimeters. Photoquest/Getty Images

In the event of an attack, the operator would take readings with the CD V-715 at selected locations in the shelter. Then, within three minutes of finishing the indoor measurements, he would go outside and take a reading at least 25 feet (7.6 meters) from the building. If the radiation level outside was high, there were procedures for decontamination upon returning to the shelter.

The “protection factor” of the shelter was calculated by dividing the outside reading by the inside reading. (Today the Federal Emergency Management Agency, FEMA, recommends a PF of at least 40 for a fallout shelter.) Operators were directed to retake the measurements and recalculate the protection factor at least once every 24 hours, or more frequently if the radiation levels changed rapidly.
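The protection-factor procedure amounts to a one-line calculation. Here it is as a small sketch; the sample readings are invented, and the PF-40 threshold is the modern FEMA guideline mentioned above.

```python
# The shelter-monitoring arithmetic described above: the protection factor
# (PF) is the outside dose rate divided by the inside dose rate. Sample
# readings are invented; FEMA's modern guideline is PF >= 40.
def protection_factor(outside_reading, inside_reading):
    """Both readings in the same units (e.g., roentgens per hour)."""
    if inside_reading <= 0:
        raise ValueError("inside reading must be positive")
    return outside_reading / inside_reading

outside, inside = 48.0, 0.8   # hypothetical CD V-715 readings
pf = protection_factor(outside, inside)
print(f"PF = {pf:.0f} -> {'meets' if pf >= 40 else 'falls short of'} the PF-40 guideline")
```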
The CD V-700 was intended for detecting beta and gamma radiation during cleanup and decontamination operations, and also for detecting any radioactive contamination of food, water, and personnel.

RELATED: DIY Gamma-Ray Spectroscopy With a Raspberry Pi Pico

Each station would report its dose rates to a regional control center, so that the civil defense organization could determine when people could leave their shelter, where they could go, what routes to take, and what facilities needed decontamination. But if you’ve lived through a natural or manmade disaster, you’ll know that in the immediate aftermath, communications don’t always work so well. Indeed, the Handbook for Radiological Monitors acknowledged that a nuclear attack might disrupt communications. Luckily, the U.S. Post Office Department had a backup plan.

In May 1958, Postmaster General Arthur E. Summerfield made an appeal to all postal employees who happened to be licensed amateur radio operators, to form an informal network that would provide emergency communications in the event of the collapse of telephone and telegraph networks and commercial broadcasting. The result was Post Office Net (PON), a voluntary group of ham radio operators; by 1962, about 1,500 postal employees in 43 states had signed on. That year, PON was opened up to nonemployees who had the necessary license.

RELATED: The Uncertain Future of Ham Radio

Although PON was never activated due to a nuclear threat, it did transmit messages during other emergencies. For example, in January 1967, after an epic blizzard blanketed Illinois and Michigan with heavy snow, the Michigan PON went into action, setting up liaisons with county weather services and relaying emergency requests, such as rescuing people stranded in vehicles on Interstate 94.

A 1954 civil defense fair featured a display of amateur radios. The U.S. Post Office recruited about 1,500 employees to operate a ham radio network in the event that regular communications went down. National Archives

The post office retired the network on 30 June 1974 as part of its shift away from civil defense preparedness. (A volunteer civil emergency-response ham radio network still exists, under the auspices of the American Radio Relay League.) And by 1977, laboratory tests indicated that most of the food and medicine stockpiled in post office basements was no longer fit for human consumption. In 1972 the Office of Civil Defense was replaced by the Defense Civil Preparedness Agency, which was eventually folded into FEMA. And with the end of the Cold War, the civil defense program officially ended in 1994, fortunately without ever being needed for a nuclear attack.

Do we still need civil defense?

The idea for this column came to me last fall, when I was doing research at the Linda Hall Library, in Kansas City, Mo., and I kept coming across articles about civil defense in magazines and journals from the 1950s and ’60s. I knew that the Smithsonian’s National Postal Museum, in Washington, D.C., had several civil defense artifacts (including the CD V-700 and a great “In Time of Emergency” public service announcement record album). As a child of the late Cold War, I remember being worried by the prospect of nuclear war. But then the Cold War ended, and so did my fears. I envisioned this month’s column capturing the intriguing history of civil defense and the earnest preparations of the era. That chapter of history, I assumed, was closed. Little did I imagine that by the time I began to write this, the prospect of a nuclear attack, if not an all-out war, would suddenly become much more real. These days, I understand the complexities and nuances of nuclear weapons much better than when I was a child. But I’m just as concerned that a nuclear conflict is imminent. Here’s hoping that history repeats itself, and it does not come to that.

Part of a continuing series looking at historical artifacts that embrace the boundless potential of technology. An abridged version of this article appears in the August 2025 print issue.

References

The November 1951 issue of Electrical Engineering summarized a civil defense conference held at the General Electric Co.’s Electronics Park in Syracuse, N.Y., earlier that year. Two hundred eighty federal, state, county, and city officials from across the United States and Canada attended, which got me thinking about the topic. Many of the government’s civil defense handbooks are available through the Internet Archive. The U.S. Postal Bulletins have also been digitized, and the USPS historian’s office wrote a great account, “The Postal Service’s Role in Civil Defense During the Cold War.” Although I’ve highlighted artifacts from the National Postal Museum, the Smithsonian Institution has many other objects across multiple museums.
Eric Green has been collecting civil defense material since 1978 and has made much of it available through his virtual Civil Defense Museum. Alex Wellerstein, a historian of nuclear technology at the Stevens Institute of Technology, writes the Substack newsletter Doomsday Machines, where he gives thoughtful commentary on how we think about the end of times, in both fiction and reality. His interactive Nukemap is informative and scary.
As I try to write this article, my friend and I have six different screens attached to three types of devices. We’re working in the same room but on our own projects—separate yet together, a comfortable companionship. I had never really thought of the proliferation of screens as a peacekeeping tool until I stumbled across one of Allen B. DuMont’s 1950s dual-screen television sets. DuMont’s idea was to let two people in the same room watch different programs. It reminded me of my early childhood and my family’s one TV set, and the endless arguments with my sisters and parents over what to watch. Dad always won, and his choice was rarely mine.

The DuMont Duoscopic Was 2 TVs in 1

Allen B. DuMont was a pioneer of commercial television in the United States. His eponymous company manufactured cathode-ray tubes and in 1938 introduced one of the earliest electronic TV sets. He understood how human nature and a shortage of TV screens could divide couples, siblings, and friends. Accordingly, he built at least two prototype TVs that could play two shows at once.

In the 1945 prototype shown at top, DuMont retrofitted a maple-finished cabinet that originally held a single 15-inch Plymouth TV receiver to house two black-and-white 12-inch receivers. Separate audio could be played with or without earpieces. Viewers used a 10-turn dial to tune into TV channel 1 (which went off the air in 1948) and VHF channels 2 through 13. As radio was still much more popular than television, the dial also included FM from 88 to 108 megahertz, plus a few channels used for weather and aviation. The lower left drawer held a phonograph. It was an all-in-one entertainment center.

To view their desired programs on the DuMont Duoscopic TV set, this family wore polarized glasses and listened through earpieces. Allen DuMont/National Museum of American History/Smithsonian

In 1954, DuMont introduced a different approach. With the DuMont Duoscopic, two different channels were broadcast on a single screen. To the naked eye, the images appeared superimposed on one another. But a viewer who wore polarized glasses or looked at the screen through a polarized panel saw just one of the images. Duoscopic viewers could use an earpiece to listen to the audio of their choice. You could also use the TV set to watch a single program by selecting only one channel and playing the audio through one speaker.

DuMont seemed committed to the idea that family members should spend time together, even if they were engaged in different activities. An image of the Duoscopic sent out by the Associated Press Wirephoto Service heralded “No more lonely nights for the missus.” According to the caption, she could join “Hubby,” who was already relaxing in his comfy armchair enjoying his favorite show, but now watch something of her own choosing. “Would you believe it?” a Duoscopic brochure asks. “While HE sees and hears the fights, SHE sees and hears her play…. Separate viewing and solo sound allows your family a choice.”

The technology to separate and isolate the images and audio was key. The Duoscopic had two CRTs, each with its own feed, set at right angles to each other. A half-silvered mirror superimposed the two images onto a single screen, which could then be filtered with polarized glasses or screens.
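The separation relies on what physicists call Malus’s law: an ideal polarizing filter passes a fraction of the light equal to the squared cosine of the angle between the light’s polarization and the filter’s axis. The sketch below applies that rule to two superimposed images polarized at right angles; it is an idealized model that ignores losses in the mirror, screen, and glasses.

```python
# Idealized model of the Duoscopic's channel separation. Malus's law:
# transmitted intensity = incident intensity * cos^2(angle between the
# light's polarization and the filter axis). Real mirrors and filters
# would leak and absorb some light.
import math

def transmitted(intensity, polarization_deg, filter_deg):
    theta = math.radians(filter_deg - polarization_deg)
    return intensity * math.cos(theta) ** 2

# Channel A polarized at 0 degrees, channel B at 90 degrees, equal brightness.
for filter_deg in (0, 90, 45):
    a = transmitted(1.0, 0, filter_deg)
    b = transmitted(1.0, 90, filter_deg)
    print(f"filter at {filter_deg:2d} degrees: channel A = {a:.2f}, channel B = {b:.2f}")
```

With the filter aligned to either channel, the other channel disappears entirely; at 45 degrees the two images blend equally, which is qualitatively the superimposed view a viewer without glasses saw.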
TV pioneer Allen B. DuMont designed and manufactured cathode ray tubes and TV sets and launched an early TV network. Science History Images/Alamy

A separate box could be conveniently placed nearby to control the volume of each program. Users could toggle between the two programs with the flick of a switch. Each set came with eight earpieces with long cords.

A short note in the March 1954 issue of Electrical Engineering praises the engineers who crafted the sound system to eliminate sound bleed from the speakers. It notes that a viewer “very easily could watch one television program and listen to the audio content of a second.” Or, as a United Press piece published in the Panama City News Herald suggested, part of the family could use the earpieces to watch and listen to the TV while others in the room could “read, play bridge, or just sit and brood.” I suspect the brooders were the children who still didn’t get to watch their favorite show.

Of course, choice was a relative matter. In the 1950s, many U.S. television markets were lucky to have even two channels. Only in major metropolitan areas were there more programming options.

The only known example of DuMont’s side-by-side version resides at the South Carolina State Museum, in Columbia. But sources indicate that DuMont planned to manufacture about 30 Duoscopics for demonstration purposes, although it’s unclear how many were actually made. (The Smithsonian’s National Museum of American History has a Duoscopic in its collections.) Alas, neither version ever went into mainstream production. Perhaps that’s because the economics didn’t make sense: Even in the early 1950s, it would have been easier and cheaper for families to simply purchase two television sets and watch them in different rooms.

Who Was Early TV Pioneer Allen DuMont?

DuMont is an interesting figure in the history of television because he was actively engaged in the full spectrum of the industry. Not only did he develop and manufacture receivers, he also conducted broadcasting experiments, published papers on transmission and reception, ran a television network, and produced programming.

After graduating from Rensselaer Polytechnic Institute in 1924 with a degree in electrical engineering, DuMont worked in a plant that manufactured vacuum tubes. Four years later, he joined the De Forest Radio Co. as chief engineer. With Lee de Forest, DuMont helped design an experimental mechanical television station, but he was unconvinced by the technology and advocated for all-electronic TV for its crisper image.

RELATED: In 1926, TV Was Mechanical

When the Radio Corporation of America acquired De Forest Radio in 1931, DuMont started his own laboratory in his basement, where he worked on improving cathode ray tubes. In 1932 he invented the “magic eye,” a vacuum tube that was a visual tuning aid in radio receivers. He sold the rights to RCA.

In 1935, DuMont moved the operation to a former pickle factory in Passaic, N.J., and incorporated it as the Allen B. DuMont Laboratories. The company produced cathode ray oscilloscopes, which helped finance his experiments with television. He debuted the all-electronic DuMont 180 TV set in June 1938. It cost US $395, or almost $9,000 today—so not exactly an everyday purchase for most people. Although DuMont was quick to market, RCA and the Television Corp. of America were right on his tail.

RELATED: RCA’s Lucite Phantom Teleceiver Introduced the Idea of TV

Of course, if companies were going to sell televisions, consumers had to have programs to watch. So in 1939, DuMont launched his own television network, starting with station W2XWV, broadcasting from Passaic. The Federal Communications Commission licensed W2XWV as an experimental station for television research.
DuMont received a commercial license and changed its call sign to WABD on 2 May 1944, three years after NBC’s and CBS’s commercial stations went into operation in New York City. Due to wartime restrictions and debates over industry standards, television remained mostly experimental during World War II. As of September 1944, there were only six stations operating—three in New York City and one each in Chicago, Los Angeles, and Philadelphia. There were approximately 7,000 TV sets in personal use.

The DuMont Television Network’s variety show hosted by Jackie Gleason [left, hands raised] featured a recurring skit that later gave rise to “The Honeymooners.” Left: CBS/Getty Images; Right: Garry Winogrand/Picture Post/Hulton Archive/Getty Images

While other networks focused on sports, movies, or remote broadcasts, the DuMont Television Network made its mark with live studio broadcasts. In April 1946, WABD moved its studios to the Wanamaker Department Store in Manhattan. DuMont converted the 14,200-cubic-meter (500,000-cubic-foot) auditorium into the world’s largest television studio.

The network’s notable programming included “The Original Amateur Hour,” which started as a radio program; “The Johns Hopkins Science Review,” which had a surprisingly progressive take on women’s health; “Life Is Worth Living,” a devotional show hosted by Catholic Bishop Fulton Sheen that garnered DuMont’s only Emmy Award; “Cavalcade of Stars,” a variety show hosted by Jackie Gleason that birthed “The Honeymooners”; and “Captain Video and His Video Rangers,” a children’s science fiction series, the first of its genre. My grandmother, who loved ballroom dancing, was a big fan of “The Arthur Murray Party,” a dance show hosted by Arthur’s wife, Kathryn; my mom fondly recalls Kathryn’s twirling skirts.

While NBC, CBS, and the other major television players built their TV networks on their existing radio networks, DuMont was starting fresh. To raise capital for his broadcast station, he sold a half-interest in his company to Paramount Pictures in 1938. The partnership was contentious from the start. There were disputes over money, the direction of the venture, and stock. But perhaps the biggest conflict was when Paramount and some of its subsidiaries began applying for FCC licenses in the same markets as DuMont’s. This ate into the DuMont network’s advertising and revenue and its plans to expand. In August 1955, Paramount gained full control over the DuMont network and proceeded to shut it down.

DuMont continued to manufacture television receivers until 1958, when he sold the business to the Emerson Radio & Phonograph Corp. Two years later, the remainder of DuMont Labs merged with the Fairchild Camera and Instrument Corp. (whose founder, Sherman Fairchild, had in 1957 helped a group of ambitious young scientists and engineers known as the “Traitorous Eight” set up Fairchild Semiconductor). Allen DuMont served as general manager of the DuMont division for a year and then became a technical consultant to Fairchild. He died in 1965.

One Thing Allen DuMont Missed

My family eventually got a second and then a third television, but my dad always had priority. He watched the biggest set from his recliner in the family room, while my mom made do with the smaller sets in the kitchen and bedroom. He was relaxing, while she was usually doing chores. As a family, we would watch different shows in separate places.
An ad for the DuMont Duoscopic touted it as a device for household harmony: “While HE sees and hears the fights, SHE sees and hears her play.” National Museum of American History/Smithsonian

These days, with so many screens on so many devices and so many programming options, we may have finally achieved DuMont’s vision of separate but together. While I was writing this piece, my friend was watching the French Open on the main TV, muted so she didn’t disturb me. She streamed the same channel on her tablet and routed the audio to her headset. We both worked on our respective laptops and procrastinated by checking messages on our phones.

But there’s one aspect of human nature that DuMont’s prototypes and promotional materials failed to address—that moment when someone sees something so exciting that they just have to share it. Sarah and I were barely getting any work done in this separate-but-together setting because we kept interrupting each other with questions, comments, and the occasional tennis update. We’ve been friends too long; we can’t help but chitchat. The only way for me to actually finish this article will be to go to a room by myself with no other screens or people to distract me.

Part of a continuing series looking at historical artifacts that embrace the boundless potential of technology. An abridged version of this article appears in the July 2025 print issue as “The 2-in-1 TV.”

References

I first learned about the Duoscopic in a short article in the March 1954 issue of Electrical Engineering, a precursor publication to Spectrum. My online research turned up several brochures and newspaper articles from the Early Television Museum, which surprisingly led me to the dual-screen DuMont at the South Carolina State Museum in my hometown of Columbia, S.C. Museum objects are primary sources, and I was fortunate to be able to visit this amazing artifact and examine it with Director of Collections Robyn Thiesbrummel. I also consulted the museum’s accession file, which gave additional information about the receiver from the time of acquisition. I took a look at Gary Newton Hess’s 1960 dissertation, An Historical Study of the Du Mont Television Network, as well as several of Allen B. DuMont’s papers published in the Proceedings of the IRE and Electrical Engineering.
Back in the dawn of the 21st century, the American Chemical Society founded a new journal, Nano Letters, to feature letter-length papers about nanoscience and nanotechnology. This was coincident with the launch of the National Nanotechnology Initiative, and it was back before several other publishers put out their own nano-focused journals. For a couple of years now I've been an associate editor at NL, and it was a lot of fun to work with my fellow editors on putting together this roadmap, intended to give a snapshot of what we think the next quarter century might hold. I think some of my readers will get a kick out of it.