More from Evan Hahn's blog
A roundup of my notes from April. I’ve done this for the last few months: March, February, January.

Things I published

I published a small UI tip about rounding percentages. In short, I don’t think you should show “100%” to the user unless it’s truly done, or “0%” unless it truly hasn’t started. Though this is a bit of a lie, I think it’s clearer to users.

I posted clippings from Life in Code: A Personal History of Technology, a book of essays by Ellen Ullman. The book criticizes Silicon Valley (where I was born and raised!) and the modern tech scene. Yet Ullman seems to retain hope that these tools can be part of a better world. Perhaps I’m projecting, because that’s basically how I feel.

I read the Economist’s style guide book and published my main takeaways. I think my writing is better after reading!

Not something I published, but I was featured on DWeb’s social media and they chose a truly dreadful photo of me. Also, an old post of mine was featured on Remember The Milk’s blog.

Things I wrote for Zelda Dungeon

This was my first full month writing for Zelda Dungeon, and I published four articles. The month was defined by the announcement of the Switch 2, which is most of what I wrote about: “Improved Pro Controller Announced for Switch 2”, “Echoes of Wisdom, Link’s Awakening, & Other Select Switch Games to Receive Free Updates to ‘Improve Playability’ on Switch 2”, and “Daily Debate: What Did You Think of Today’s Nintendo Switch 2 Direct?”

I also published some thoughts about subtle references between Zelda games, which is one of my favorite parts of the series.

Tech news I read

It’s still bleak out there!

Last month, I wrote about small players getting hurt by AI scraping bots. Big organizations like Wikimedia are affected, too.

American tech companies build software that kills innocent civilians. Microsoft fired an employee who protested this. Relatedly, an indie dev pulled their game from Microsoft’s Xbox in protest.

In case you didn’t know whether the US Immigration and Customs Enforcement agency (ICE) was evil, its director wants it to function “like [Amazon] Prime, but with human beings”. That’s something a cartoon villain would say.

Seems like Google Chrome is keeping third-party cookies after all. Third-party cookies are bad for privacy, so I was sad to see this.

“If Bitcoin were—as true believers often say—a government-free currency, Donald Trump’s idiot tariffs should have strengthened it.” That’s not what happened.

Aftermath posted their “official Editorial Values”. Cool to see these spelled out explicitly, and I wish more publications did this.

Links, links, and more links

A roundup of links from April:

I think ads are poisonous to society, so I loved “What If We Made Advertising Illegal?”. I think the author’s goal is to shift the Overton window, rather than realistically propose a ban. (A near-ban is proposed in Digital Degrowth, a pretty radical book I read this month, which further fueled my fire.)

“AI ambivalence” and “Why I stopped using AI code editors” buck against the trend of loving AI for coding. My favorite quote: “I acknowledge that these tools are incredibly powerful, I’ve even started incorporating them into my work in certain limited ways […], but I absolutely hate them.”

“Good Intentions Don’t Pave Roads: The Need For a New Strategy in Free-Software” argues that the free software movement needs to change its approach, because it’s losing.
Soft Skills episode 454 had a great quote about measuring developer productivity: “many teams have killed the ability to do capacity planning by using story points as a performance metric.”

“Is ‘ethical AI’ an oxymoron?” asks some ethical questions about generative AI. Refreshingly, it also gives some answers.

“It hurts me to know that the tools I share such a deep connection with are made by corporations that exploit workers in developing countries, greenwash their products while generating tons of electronic waste, fight against the rights of people to repair their possessions, engage in malicious compliance when governments try to regulate them, spy on their users, hold their users’ data hostage, and commit a long list of other crimes that would take too long to recount here.” Quotes like this and more in “The tools I love are made by awful people”.

SELF is a platformer game. It’s short—maybe only 15 minutes—but left me wanting more. I think it’s the best-feeling 2D platformer I’ve ever played! I found it on Itch.io’s “randomizer”, a feature that shuffles you through different games to try.

“Two Years of Rust” was good enough for me to personally email a thank-you to the author. I want more high-level descriptions of really using a tool. I want to know what it feels like to be proficient, and know what the pain points are. This post did exactly that!

I learned about asarotos oikos, an ancient Roman mosaic style that looks like there’s a bunch of garbage on the floor.

Hope you had a good April.
In short: maybe don’t round to 0% or 100% in your UI.

I am not a UI expert. But I sometimes build user interfaces, and I sometimes want to render a percentage to the user. For example, something like “you’ve downloaded 45% of this file”.

In my experience, it’s often better to round this number but avoid rounding to 0% or 100%.

Rounding to 0% is bad because the user may think there’s been no progress. Even the smallest nonzero ratio, like 0.00001%, should render as 1%.

Rounding to 100% is bad because the user may think things are done when they aren’t, and it’s better to show 99%. Ratios like 99.9% should still render as 99%, even if they technically round to 100%.

For example, in your UI:

Ratio (out of 1)    Rendered
0                   0%
0.00001             1%
0.01                1%
0.02                2%
0.99                99%
0.99999             99%
1                   100%

Here’s some Python code that demonstrates the algorithm I like to use:

def render_ratio(ratio):
    if ratio <= 0:
        return "0%"
    if ratio >= 1:
        return "100%"
    if ratio <= 0.01:
        return "1%"
    if ratio >= 0.99:
        return "99%"
    return f"{round(ratio * 100)}%"

This isn’t right for all apps, of course. Sometimes you want to show the exact percentage to the user, and sometimes you don’t want the app to appear “stuck” at 1% or 99%. But I’ve found this little trick to be useful.
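If you want to sanity-check the behavior, here’s a minimal sketch (assuming the render_ratio function above is in scope) that runs it over the sample ratios from the table; the expected results in the comments follow from the rules above.

# Quick sanity check: run render_ratio (defined above) over the
# sample ratios from the table. Expected results shown in comments.
for ratio in [0, 0.00001, 0.01, 0.02, 0.99, 0.99999, 1]:
    print(f"{ratio} -> {render_ratio(ratio)}")
# 0 -> 0%
# 1e-05 -> 1%
# 0.01 -> 1%
# 0.02 -> 2%
# 0.99 -> 99%
# 0.99999 -> 99%
# 1 -> 100%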
I’ve been trying to improve my writing, so I read Writing with Style, the Economist’s style guide book. Here were my main takeaways:

Use short sentences. They’re more memorable. They’re easier to read. They’re generally easier to write.

Colons are for setup and delivery. They describe them as “dramatic”.

One thought per paragraph. The paragraph is a “unit of thought”, according to this book and to H.W. Fowler. Sometimes, you have a one-sentence paragraph because the thought fits into a single sentence.

Prefer simpler terms. Use “get” instead of “obtain”, “make” instead of “manufacture”, or “give up” instead of “relinquish”. Ask if you ever use the word when talking to friends. And don’t soften difficult topics: “a poor person has no more money, opportunity or dignity when described as ‘deprived’, ‘disadvantaged’ or ‘underprivileged’.”

The right word can eliminate others. More specific words let you “dispense with adjectives and adverbs entirely. Consider the difference between ‘walk’ and ‘strut’, or ‘say’ and ‘murmur’.”

Find big-picture issues with a “reverse outline”. When editing, they recommend extracting the main point from each paragraph. This can catch structural issues.

Watch out for differences between English dialects. I knew about a lot of these, like how I’d spell it “color” and a Brit would spell it “colour”. But I didn’t know about “quite”: in American English, it’s a synonym for “very”; in British English, it can mean “fairly”. (The book failed to mention other dialects of English, to my disappointment.)

There were things I didn’t like about the book. It seemed allergic to whimsy. A lot of its rules felt arbitrary. The Economist writes for a different audience than I do. But these disagreements helped me clarify my own writing style, so they were still helpful.

I think my writing is better as a result of this book. I recommend checking it out!
Life in Code: A Personal History of Technology is a book of essays by Ellen Ullman.

In the book, Ullman laments the bad parts of computers and the internet. These systems eroded privacy, deepened income inequality, and enabled the rise of modern fascism. And they were built by a tiny subset of people—young men, mostly white and Asian, mostly wealthy—to the exclusion of almost everyone else. Despite all this, she maintains a hopeful fascination with technology. Perhaps humanity can use these tools as part of a better world. I share this sentiment, I think.

Many of the stories are old by Silicon Valley standards, but they feel prescient. The book is filled with ideas that could be written today, if you modernized a few incidental details.

These are my notes and quotes from the book.

“Outside of Time” (1994)

Ullman on the idea that low-level development is more respected: “If you want money and prestige, you need to write code that only machines or other programmers understand.” Oh, and these prestigious and lucrative jobs are primarily held by young men.

And these boys impart their ideas into the systems they build:

As the computer’s pretty, helpfully waiting face […] penetrates deeply into daily life, the cult of the boy engineer comes with it. The engineer’s assumptions and presumptions are in the code.

“Come in, CQ” (1996)

Learned about The WELL, an online community that’s been around since 1985. I also learned that the elm email client was succeeded by Pine, another tree name. Pine was then succeeded by Alpine, another piece of wordplay.

Quips like this resonate with me:

I do believe that the operational definition of a thing—how it works—is its most eloquent self-expression.

A lot of user interfaces seem to encourage immediate action:

Although we seemed to be delaying, prolonging the time of imagination, the email was only rushing us. I read a message. The prompt then sat there, the cursor blinking. It was waiting for me to type “r” for “reply.” The whole system is designed for it, is pressing me, is pulsing, insisting: Reply. Reply right now. Even though I meant to hold the message awhile, even though I wanted to treat it as if it were indeed a “letter”—something to hold in my hand, read again, mull over—I cannot resist the voice of the software, which was murmuring, murmuring: Go ahead. You know you want to. Reply right now.

A poignant paragraph about the demise of Morse code:

The [Morse] code had a personality to it, a signature in the touch and rhythm on the key. For Turner, the signature’s origin was no mystery: “It’s coming from a person’s hand.”

Makes me think about the things you trade for convenience, and the information that’s lost when you measure.

“The Dumbing Down of Programming” (1998)

This essay was originally published in Salon.

I learned what “BIOS” stands for: Basic Input/Output System. Never thought about it before!

To anyone who laments the messy design of modern terminals:

[…] we build our computers the way we build our cities—over time, without a plan, on top of ruins.

This was written in 1998 and sounds similar to modern opinions about LLMs:

My programming tools were full of wizards. Little dialogue boxes waiting for me to click “Next” and “Next” and “Finish.” Click and drag, and—shazzam—thousands of lines of working code. No need to get into the “hassle” of remembering the language. No need even to learn it. It is a powerful siren-song lure: You can make your program do all these wonderful and complicated things, and you don’t really need to understand.
[…] This not-knowing is a seduction. I feel myself drifting up, away from the core of what I’ve known programming to be: text that talks to the system and its other software, talk that depends upon knowing the system as deeply as possible. What a sweet temptation it is to succumb: Wizard, dazzle me.

Ullman explains the risks of these systems. When something inevitably goes wrong, you may be powerless to debug it.

I liked this bit which acknowledged the tradeoffs engineers have to make:

We were reminded that software engineering was not about right and wrong but only better and worse, solutions that solved some problems while ignoring or exacerbating others.

“What We Were Afraid of As We Feared Y2K” (1999—2000)

This essay was heavily adapted from a 1999 Wired article.

This essay made me think I should read an entire book about the history of Y2K. (If you know of a good one, let me know.)

“The Museum of Me” (1998)

Related to an earlier point about the developers encoding their worldview into the software they build:

I have long believed that the ideas embedded in technology have a way of percolating up and outward into the nontechnical world at large, and that technology is made by people with intentions and, as such, is not neutral.

The author talks about how the Internet glorified self-service, and only the very rich could afford human help.

Here’s one of those prescient passages. Remember that this was written 27 years ago:

But now, without leaving home, from the comfort of your easy chair, you can divorce yourself from the consensus on what constitutes “truth.” Each person can live in a private thought bubble, reading only those websites that reinforce his or her desired beliefs, joining only those online groups that give sustenance when the believer’s courage flags.

“Fiber Optic Nights” (1999)

You might be skeptical of the tech world. But when you’re surrounded by the techno-optimism of Silicon Valley, it’s hard to resist:

At this stage of inebriation, I can’t resist the atmosphere of wild optimism. I let myself fall under the delicious cloud of dreams: the great global internet that will change human life—indeed, change humans themselves.

Ullman laments how San Francisco’s diversity made way for tech startups, a “colonization” I noticed myself when I lived there. She expands on this much more in the final essay.

And another sentence talking about how engineers only value “hard” engineering:

Any serious software engineer would scoff at my dragging in philosophy, the fuzz of the humanities.

“Off the High” (2000)

Sad that this is still true 25 years later:

Maybe what has put the damper on this year’s conference is that, after the Canadians pass their law, the United States will be the sole nation in the highly industrialized world without legal data-protections. Or maybe it’s the fact of being in Canada, where everyone who is an American knows that, on crossing back into the United States, they will lose their constitutional right not to be subjected to unreasonable searches.

“Programming the Post-Human” (2002)

This essay originally appeared in Harper’s Magazine in a slightly different form.

Comparing Moore’s Law to software development:

[…] there is no Moore’s Law for software. On the contrary, as systems increase in complexity, it becomes harder—very much harder—to write reliable code.

This essay is mostly about AI, and what it means to be alive and conscious.
I think this quote succinctly sums up the whole thing:

The more I thought about it, the more I decided that huge swaths of existence would be impenetrable—indescribable, un-programmable, utterly unable to be represented—to a creature that did not eat or shit.

“Dining with Robots” (2004)

Ullman rejects the often-used comparison that programming is like a recipe. Could a computer understand many of the subtle, and perhaps ancillary, parts of cooking? (To be fair, I’m a bad cook, so I probably can’t either.)

The world resists the rigidity of software:

The world, the actual world we inhabit, showed itself to be too marvelously varied, too ragged, too linked and interconnected, to be sorted into any set of frames or classes or problem spaces.

Reminds me of a point repeatedly made in Beyond Measure, another book I took notes on.

Computers are described as “fast, efficient, untiring, correct, standardized, organized”.

“Close to the Mainframe” (2014)

Ullman describes the intoxicating feeling of being sucked in by a tricky bug. This is one of the sweetest parts of computer programming!

“The Party Line” (2015)

Ullman talks about a small farm being affected by technological “efficiency”. This farm needed to start putting their milk in something called a bulk tank, or be left behind.

Technology promises efficiency, but it also messes things up:

Bulk-tank collection was surely more efficient [than] picking up individual cans. Consumers might benefit from the lower costs of production. It was technology at what it does best: standardize and homogenize and monetize, create efficiencies in sales and markets and distribution chains. It was also technology at its worst. The coming of the bulk tank was another of those ruptures in society. Yet this one did not widen the scope of individual freedoms. The tank would effectively drive the small family dairy farm out of existence.

“Programming for the Millions” (2016)

Ullman describes a programmer’s job as that of a “translator”. That’s sometimes how it feels! It reminds me of “meeting the computer halfway”.

I liked this bit about breaking down the divide between humanities and software:

I dare to imagine the general public learning how to write code. I do not mean that knowledge of programming should be elevated to the ranks of the other subjects that form basic literacy: languages, literature, history, psychology, sociology, economics, the basics of science and mathematics. I mean it the other way around. What I hope is that those with knowledge of the humanities will break into the closed society where code gets written: invade it.

“Boom Two: A Farewell” (January 2017)

The final essay really laments how San Francisco has changed. This quote sums it up best:

The startup culture has overtaken San Francisco. It was once a place for kids running away from home, where people in their teens and early twenties came to get away from the lives they were supposed to lead but didn’t want to, to be gay or bisexual or other combinations of sexuality, all looking for some version of the old, wild, open San Francisco: the Beats, hippies, free love, the gay revolution. Yet nothing abides forever, and now we live in a city whose former identities, however mythical, have been swept away. A new wave of youthful seekers has come a-searching for yet another mythical San Francisco: a place where dreams of founding a successful internet startup are born, and fulfilled.
There’s also a short passage about someone pitching their tech as being easy to use, with a phrase like “Even Grandma can use it.” Ullman (rightly) calls this out as sexist and ageist. I used to say stuff like this and am embarrassed by that!

I’ll end with a quote about tech saviorism:

How far away was and is the true work of creating a more egalitarian world, the slow, hard job of organizing, the hours of contentious community meetings: the clash of need against need. Only those who work close to that ground, and take the code into their own hands, can tell us what technology is good for.
More in technology
I moved recently, and so did my home server. You might have noticed it due to the downtime. This time I have built a dedicated shelf for it, which allows for more flexibility and room for additional expensive ideas.

The internet connection is a fiber line, which is fantastic for a place that’s generally considered to be in the countryside. At my last place in Tallinn (the capital of Estonia), I had to hire a guy, with my own money, to pull a fiber line from the basement to the apartment, so I’m very happy that I don’t have to do that here.

And yes, the ThinkPad T430 is still a solid home server. I had an issue where my battery calibration script resulted in the machine being turned off; I fixed it by disabling the script, at the cost of the battery probably dying soon. It seems like a tlp and/or Linux kernel issue that has surfaced recently, as it also happened on a different ThinkPad laptop when I last tried it. I can’t really remove the battery, because the “power on with AC attach” setting only works when the battery is connected and charged.

The server/wardrobe/closet room is slightly chillier than the rest of the apartment, so the machine’s temperatures are also slightly lower. I also have an option to do some crazy ventilation experiments in the winter, but that will have to wait for a bit, mainly because it’s spring.

I’m genuinely surprised that the Wi-Fi 5 signal comes through the closet quite adequately: the whole apartment is covered with at least 50 Mbit/s, and over 300 Mbit/s near the closet, which is about the maximum speed I can achieve from the access point in ideal conditions.
When your machines run smoothly, your business can go far. That’s why condition monitoring – once a “nice to have” – is quickly becoming a must in maintenance strategies across industrial settings. But most dedicated systems can be complex to set up or difficult to scale. To make things easier, we’re introducing the Arduino Rileva ME Opta […] The post Optimize maintenance with the Arduino Rileva ME Opta Bundle appeared first on Arduino Blog.
They discuss interface design and Raskin's hatred of the mouse.
Air traffic control has been in the news lately, on account of my country's declining ability to do it. Well, that's a long-term trend, resulting from decades of under-investment, severe capture by our increasingly incompetent defense-industrial complex, no small degree of management incompetence in the FAA, and long-lasting effects of Reagan crushing the PATCO strike. But that's just my opinion, you know, maybe airplanes got too woke. In any case, it's an interesting time to consider how weird parts of air traffic control are. The technical, administrative, and social aspects of ATC all seem two notches more complicated than you would expect. ATC is heavily influenced by its peculiar and often accidental development, a product of necessity that perpetually trails behind the need, and a beneficiary of hand-me-down military practices and technology.

Aviation Radio

In the early days of aviation, there was little need for ATC---there just weren't many planes, and technology didn't allow ground-based controllers to do much of value. There was some use of flags and signal lights to clear aircraft to land, but for the most part ATC had to wait for the development of aviation radio. The impetus for that work came mostly from the First World War. Here we have to note that the history of aviation is very closely intertwined with the history of warfare. Aviation technology has always rapidly advanced during major conflicts, and as we will see, ATC is no exception.

By 1913, the US Army Signal Corps was experimenting with the use of radio to communicate with aircraft. This was pretty early in radio technology, and the aircraft radios were huge and awkward to operate, but it was also early in aviation and "huge and awkward to operate" could be similarly applied to the aircraft of the day. Even so, radio had obvious potential in aviation. The first military application for aircraft was reconnaissance. Pilots could fly past the front to find artillery positions and otherwise provide useful information, and then return with maps. Well, even better than returning with a map was providing the information in real-time, and by the end of the war medium-frequency AM radios were well developed for aircraft.

Radios in aircraft led naturally to another wartime innovation: ground control. Military personnel on the ground used radio to coordinate the schedules and routes of reconnaissance planes, and later to inform on the positions of fighters and other enemy assets. Without any real way to know where the planes were, this was all pretty primitive, but it set the basic pattern that people on the ground could keep track of aircraft and provide useful information.

Post-war, civil aviation rapidly advanced. The early 1920s saw numerous commercial airlines adopting radio, mostly for business purposes like schedule coordination. Once you were in contact with someone on the ground, though, it was only logical to ask about weather and conditions. Many of our modern practices like weather briefings, flight plans, and route clearances originated as more or less formal practices within individual airlines.

Air Mail

The government was not left out of the action. The Post Office operated what may have been the largest commercial aviation operation in the world during the early 1920s, in the form of Air Mail. The Post Office itself did not have any aircraft; all of the flying was contracted out---initially to the Army Air Service, and later to a long list of regional airlines.
Air Mail was considered a high priority by the Post Office and proved very popular with the public. When the transcontinental route began proper operation in 1920, it became possible to get a letter from New York City to San Francisco in just 33 hours by transferring it between airplanes in a nearly non-stop relay race. The Post Office's largesse in contracting the service to private operators provided not only the funding but the very motivation for much of our modern aviation industry. Air travel was not very popular at the time, being loud and uncomfortable, but the mail didn't complain. The many contract mail carriers of the 1920s grew and consolidated into what are now some of the United States' largest companies. For around a decade, the Post Office almost singlehandedly bankrolled civil aviation, and passengers were a side hustle [1].

Air mail ambition was not only of economic benefit. Air mail routes were often longer and more challenging than commercial passenger routes. Transcontinental service required regular flights through sparsely populated parts of the interior, challenging the navigation technology of the time and making rescue of downed pilots a major concern. Notably, air mail operators did far more nighttime flying than any other commercial aviation in the 1920s.

The Post Office became the government's de facto technical leader in civil aviation. Besides the network of beacons and markers built to guide air mail between cities, the Post Office built 17 Air Mail Radio Stations along the transcontinental route. The Air Mail Radio Stations were the company radio system for the entire air mail enterprise, and the closest thing to a nationwide, public air traffic control service to then exist. They did not, however, provide what we would now call control. Their role was mainly to provide pilots with information (including, critically, weather reports) and to keep loose tabs on air mail flights so that a disappearance would be noticed in time to send search and rescue.

In 1926, the Air Commerce Act created the Aeronautics Branch of the Department of Commerce. The Aeronautics Branch assumed a number of responsibilities, but one of them was the maintenance of the Air Mail routes. Similarly, the Air Mail Radio Stations became Aeronautics Branch facilities, and took on the new name of Flight Service Stations. No longer just for the contract mail carriers, the Flight Service Stations made up a nationwide network of government-provided services to aviators. They were the first edifices in what we now call the National Airspace System (NAS): a complex combination of physical facilities, technologies, and operating practices that enable safe aviation.

In 1935, the first en-route air traffic control center opened, a facility in Newark owned by a group of airlines. The Aeronautics Branch, since renamed the Bureau of Air Commerce, supported the airlines in developing this new concept of en-route control that used radio communications and paperwork to track which aircraft were in which airways. The rising number of commercial aircraft made in-air collisions a bigger problem, so the Newark control center was quickly followed by more facilities built on the same pattern. In 1936, the Bureau of Air Commerce took ownership of these centers, and ATC became a government function alongside the advisory and safety services provided by the flight service stations.
En route center controllers worked off of position reports from pilots via radio, but needed a way to visualize and track aircraft's positions and their intended flight paths. Several techniques helped: first, airlines shared their flight planning paperwork with the control centers, establishing "flight plans" that corresponded to each aircraft in the sky. Controllers adopted a work aid called a "flight strip," a small piece of paper with the key information about an aircraft's identity and flight plan that could easily be handed between stations. By arranging the flight strips on display boards full of slots, controllers could visualize the ordering of aircraft in terms of altitude and airway. Second, each center was equipped with a large plotting table map where controllers pushed markers around to correspond to the position reports from aircraft. A small flag on each marker gave the flight number, so it could easily be correlated to a flight strip on one of the boards mounted around the plotting table. This basic concept of air traffic control, of a flight strip and a position marker, is still in use today.

Radar

The Second World War changed aviation more than any other event of history. Among the many advancements were two British inventions of particular significance: first, the jet engine, which would make modern passenger airliners practical. Second, the radar, and more specifically the magnetron. This was a development of such significance that the British government treated it as a secret akin to nuclear weapons; indeed, the UK effectively traded radar technology to the US in exchange for participation in US nuclear weapons research.

Radar created radical new possibilities for air defense, and complemented previous air defense development in Britain. During WWI, the organization tasked with defending London from aerial attack had developed a method called "ground-controlled interception" or GCI. Under GCI, ground-based observers identify possible targets and then direct attack aircraft towards them via radio. The advent of radar made GCI tremendously more powerful, allowing a relatively small number of radar-assisted air defense centers to monitor for inbound attack and then direct defenders with real-time vectors.

In the first implementation, radar stations reported contacts via telephone to "filter centers" that correlated tracks from separate radars to create a unified view of the airspace---drawn in grease pencil on a preprinted map. Filter center staff took radar and visual reports and updated the map by moving the marks. This consolidated information was then provided to air defense bases, once again by telephone.

Later technical developments in the UK made the process more automated. The invention of the "plan position indicator" or PPI, the type of radar scope we are all familiar with today, made the radar far easier to operate and interpret. Radar sets that automatically swept over 360 degrees allowed each radar station to see all activity in its area, rather than just aircraft passing through a defensive line. These new capabilities eliminated the need for much of the manual work: radar stations could see attacking aircraft and defending aircraft on one PPI, and communicated directly with defenders by radio. It became routine for a radar operator to give a pilot navigation vectors by radio, based on real-time observation of the pilot's position and heading. A controller took strategic command of the airspace, effectively steering the aircraft from a top-down view.
The ease and efficiency of this workflow was a significant factor in the end of the Battle of Britain, and its remarkable efficacy was noticed in the US as well.

At the same time, changes were afoot in the US. WWII was tremendously disruptive to civil aviation; while aviation technology rapidly advanced due to wartime needs, those same pressing demands led to a slowdown in nonmilitary activity. A heavy volume of military logistics flights and flight training, as well as growing concerns about defending the US from an invasion, meant that ATC was still a priority. A reorganization of the Bureau of Air Commerce replaced it with the Civil Aeronautics Authority, or CAA. The CAA's role greatly expanded as it assumed responsibility for airport control towers and commissioned new en route centers.

As WWII came to a close, CAA en route control centers began to adopt GCI techniques. By 1955, the name Air Route Traffic Control Center (ARTCC) had been adopted for en route centers and the first air surveillance radars were installed. In a radar-equipped ARTCC, the map where controllers pushed markers around was replaced with a large tabletop PPI built to a Navy design. The controllers still pushed markers around to track the identities of aircraft, but they moved them based on their corresponding radar "blips" instead of radio position reports.

Air Defense

After WWII, post-war prosperity and wartime technology like the jet engine led to huge growth in commercial aviation. During the 1950s, radar was adopted by more and more ATC facilities (both "terminal" at airports and "en route" at ARTCCs), but there were few major changes in ATC procedure. With more and more planes in the air, tracking flight plans and their corresponding positions became labor intensive and error-prone. A particular problem was the increasing range and speed of aircraft, and correspondingly longer passenger flights, which meant that many aircraft passed from the territory of one ARTCC into another. This required that controllers "hand off" the aircraft, informing the "next" ARTCC of the flight plan and position at which the aircraft would enter their airspace.

In 1956, 128 people died in a mid-air collision of two commercial airliners over the Grand Canyon. In 1958, 49 people died when a military fighter struck a commercial airliner over Nevada. These were not the only such incidents in the mid-1950s, and public trust in aviation started to decline. Something had to be done.

First, in 1958 the CAA gave way to the Federal Aviation Administration. This was more than just a name change: the FAA's authority was greatly increased compared to the CAA, most notably by granting it authority over military aviation. This is a difficult topic to explain succinctly, so I will only give broad strokes. Prior to 1958, military aviation was completely distinct from civil aviation, with no coordination and often no communication at all between the two. This was, of course, a factor in the 1958 collision. Further, the 1956 collision, while it did not involve the military, did result in part from communications issues between separate distinct CAA facilities and the airline's own control facilities. After 1958, ATC was completely unified into one organization, the FAA, which assumed the work of the military controllers of the time and some of the role of the airlines.
The military continues to have its own air controllers to this day, and military aircraft continue to include privileges such as (practical but not legal) exemption from transponder requirements, but military flights over the US are still beholden to the same ATC as civil flights. Some exceptions apply, void where prohibited, etc.

The FAA's suddenly increased scope only made the practical challenges of ATC more difficult, and commercial aviation numbers continued to rise. As soon as the FAA was formed, it was understood that there needed to be major investments in improving the National Airspace System. While the first couple of years were dominated by the transition, the FAA's second director (Najeeb Halaby) prepared two lengthy reports examining the situation and recommending improvements. One of these, the Beacon report (also called Project Beacon), specifically addressed ATC. The Beacon report's recommendations included massive expansion of radar-based control (called "positive control" because of the controller's access to real-time feedback on aircraft movements) and new control procedures for airways and airports. Even better, for our purposes, it recommended the adoption of general-purpose computers and software to automate ATC functions.

Meanwhile, the Cold War was heating up. US air defense, a minor concern in the few short years after WWII, became a higher priority than ever before. The Soviet Union had long-range aircraft capable of reaching the United States, and nuclear weapons meant that only a few such aircraft had to make it through to cause massive destruction. The vast size of the United States (and, considering the new unified air defense command between the United States and Canada, all of North America) made this a formidable challenge.

During the 1950s, the newly minted Air Force worked closely with MIT's Lincoln Laboratory (an important center of radar research) and IBM to design a computerized, integrated, networked system for GCI. When the Air Force committed to purchasing the system, it was christened the Semi-Automatic Ground Environment, or SAGE.

SAGE is a critical juncture in the history of the computer and computer communications, the first system to demonstrate many parts of modern computer technology and, moreover, perhaps the first large-scale computer system of any kind. SAGE is an expansive topic that I will not take on here; I'm sure it will be the focus of a future article, but it's a pretty well-known and well-covered topic. I have not so far felt like I had much new to contribute, despite it being the first item on my "list of topics" for the last five years. But one of the things I want to tell you about SAGE, which is perhaps not so well known, is that SAGE was not used for ATC.

SAGE was a purely military system. It was commissioned by the Air Force, and its numerous operating facilities (called "direction centers") were located on Air Force bases along with the interceptor forces they would direct. However, there was obvious overlap between the functionality of SAGE and the needs of ATC. SAGE direction centers continuously received tracks from remote data sites using modems over leased telephone lines, and automatically correlated multiple radar tracks to a single aircraft. Once an operator entered information about an aircraft, SAGE stored that information for retrieval by other radar operators.
When an aircraft with associated data passed from the territory of one direction center to another, the aircraft's position and related information were automatically transmitted to the next direction center by modem.

One of the key demands of air defense is the identification of aircraft---any unknown track might be routine commercial activity, or it could be an inbound attack. The air defense command received flight plan data on commercial flights (and more broadly all flights entering North America) from the FAA and entered them into SAGE, allowing radar operators to retrieve "flight strip" data on any aircraft on their scope.

Recognizing this interconnection with ATC, as soon as SAGE direction centers were being installed the Air Force started work on an upgrade called SAGE Air Traffic Integration, or SATIN. SATIN would extend SAGE to serve the ATC use-case as well, providing SAGE consoles directly in ARTCCs and enhancing SAGE to perform non-military safety functions like conflict warning and forward projection of flight plans for scheduling. Flight strips would be replaced by teletype output, and in general made less necessary by the computer's ability to filter the radar scope. Experimental trial installations were made, and the FAA participated readily in the research efforts. Enhancement of SAGE to meet ATC requirements seemed likely to meet the Beacon report's recommendations and radically improve ARTCC operations, sooner and cheaper than development of an FAA-specific system.

As it happened, well, it didn't happen. SATIN became interconnected with another planned SAGE upgrade to the Super Combat Centers (SCC), deep underground combat command centers with greatly enhanced SAGE computer equipment. SATIN and SCC planners were so confident that the last three Air Defense Sectors scheduled for SAGE installation, including my own Albuquerque, were delayed under the assumption that the improved SATIN/SCC equipment should be installed instead of the soon-obsolete original system. SCC cost estimates ballooned, and the program's ambitions were reduced month by month until it was canceled entirely in 1960. Albuquerque never got a SAGE installation, and the Albuquerque air defense sector was eliminated by reorganization later in 1960 anyway.

Flight Service Stations

Remember those Flight Service Stations, the ones that were originally built by the Post Office? One of the oddities of ATC is that they never went away. FSS were transferred to the CAB, to the CAA, and then to the FAA. During the 1930s and 1940s many more were built, expanding coverage across much of the country. Throughout the development of ATC, the FSS remained responsible for non-control functions like weather briefing and flight plan management. Because aircraft operating under instrument flight rules must closely comply with ATC, the involvement of FSS in IFR flights is very limited, and FSS mostly serve VFR traffic.

As ATC became common, the FSS gained a new and somewhat odd role: playing go-between for ATC. FSS were more numerous and often located in sparser areas between cities (while ATC facilities tended to be in cities), so especially in the mid-century, pilots were more likely to be able to reach an FSS than ATC. It was, for a time, routine for FSS to relay instructions between pilots and controllers. This is still done today, although improved communications have made the need much less common.
As weather dissemination improved (another topic for a future post), FSS gained access to extensive weather conditions and forecasting information from the Weather Service. This connectivity is bidirectional; during the midcentury FSS not only received weather forecasts by teletype but transmitted pilot reports of weather conditions back to the Weather Service. Today these communications have, of course, been computerized, although the legacy teletype format doggedly persists.

There has always been an odd schism between the FSS and ATC: they are operated by different departments, out of different facilities, with different functions and operating practices. In 2005, the FAA cut costs by privatizing the FSS function entirely. Flight service is now operated by Leidos, one of the largest government contractors. All FSS operations have been centralized to one facility that communicates via remote radio sites. While flight service is still available, increasing automation has made the stations far less important, and the general perception is that flight service is in its last years. Last I looked, Leidos was not hiring for flight service and the expectation was that they would never hire again, retiring the service along with its staff.

Flight service does maintain one of my favorite internet phenomena, the phone number domain name: 1800wxbrief.com. One of the odd manifestations of the FSS/ATC schism and the FAA's very partial privatization is that Leidos maintains an online aviation weather portal that is separate from, and competes with, the Weather Service's aviationweather.gov. Since Flight Service traditionally has the responsibility for weather briefings, it is honestly unclear to what extent Leidos vs. the National Weather Service should be investing in aviation weather information services. For its part, the FAA seems to consider aviationweather.gov the official source, while it pays for 1800wxbrief.com. There's also weathercams.faa.gov, which duplicates a very large portion (maybe all?) of the weather information on Leidos's portal and some of the NWS's. It's just one of those things. Or three of those things, rather.

Speaking of duplication due to poor planning...

The National Airspace System

Left in the lurch by the Air Force, the FAA launched its own program for ATC automation. While the Air Force was deploying SAGE, the FAA had mostly been waiting, and various ARTCCs had adopted a hodgepodge of methods ranging from one-off computer systems to completely paper-based tracking. By 1960 radar was ubiquitous, but different radar systems were used at different facilities, and correlation between radar contacts and flight plans was completely manual. The FAA needed something better, and with growing congressional support for ATC modernization, they had the money to fund what they called National Airspace System En Route Stage A.

Further bolstering historical confusion between SAGE and ATC, the FAA decided on a practical, if ironic, solution: buy their own SAGE.

In an upcoming article, we'll learn about the FAA's first fully integrated computerized air traffic control system. While the failed detour through SATIN delayed the development of this system, the nearly decade-long delay between the design of SAGE and the FAA's contract allowed significant technical improvements.
This "New SAGE," while directly based on SAGE at a functional level, used later off-the-shelf computer equipment including the IBM System/360, giving it far more resemblance to our modern world of computing than SAGE with its enormous, bespoke AN/FSQ-7. And we're still dealing with the consequences today!

[1] It also laid the groundwork for the consolidation of the industry, with a 1930 decision that took air mail contracts away from most of the smaller companies and awarded them instead to the precursors of United, TWA, and American Airlines.
Exploring a peculiar bit-twiddling hack at the intersection of 1980s geek sensibilities.