ThinkPad keyboards were once well known for their great layouts, feel and functionality, including the media playback control keys. On the ThinkPad T430, the new chiclet keyboard layout moved the media keys to the function row: still there, but less convenient to access. The ThinkPad L390 Yoga doesn’t have any visible function keys for controlling media playback. However, I found that the play/pause and stop buttons are still functional on the up/down arrow keys, and the left/right arrow keys act as Home and End keys. evtest output for the play/pause key combination:

Event: time 1714716211.222697, type 4 (EV_MSC), code 4 (MSC_SCAN), value e3
Event: time 1714716211.222697, type 1 (EV_KEY), code 143 (KEY_WAKEUP), value 1
Event: time 1714716211.222697, -------------- SYN_REPORT ------------
Event: time 1714716212.291293, type 4 (EV_MSC), code 4 (MSC_SCAN), value a2
Event: time 1714716212.291293, type 1 (EV_KEY), code 164 (KEY_PLAYPAUSE), value 1
Event: time 1714716212.291293, --------------...
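If you wanted to react to these hidden media keys programmatically, one way on Linux is the python-evdev library. This is a minimal sketch of my own, not code from the post; the /dev/input/event3 path is a placeholder you would replace with your keyboard's actual event node (find it with evtest or /proc/bus/input/devices):

    # Sketch: watch for the hidden media key events with python-evdev
    # (pip install evdev). Needs read access to the input device,
    # e.g. root or membership in the `input` group.
    from evdev import InputDevice, ecodes

    dev = InputDevice("/dev/input/event3")  # placeholder device path

    for event in dev.read_loop():
        # value 1 means key press (0 = release, 2 = autorepeat)
        if event.type == ecodes.EV_KEY and event.value == 1:
            if event.code == ecodes.KEY_PLAYPAUSE:
                print("play/pause combo pressed")
            elif event.code == ecodes.KEY_STOPCD:
                print("stop combo pressed")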
a year ago


More from ./techtipsy

Home is where the home server is

I moved recently, and so did my home server. You might have noticed it due to the downtime. This time I have built a dedicated shelf for it, which allows for more flexibility and room for additional expensive ideas.

The internet connection is a fiber line, which is fantastic for a place that’s generally considered to be in the countryside. At the last place, in Tallinn (the capital of Estonia), I had to hire a guy to pull a fiber line from the basement to the apartment, with my own money, so I’m very happy that I don’t have to do it here.

And yes, the ThinkPad T430 is still a solid home server. I had an issue with my battery calibration script resulting in the machine being turned off, but I fixed it by disabling the script, at the cost of the battery probably dying soon. It seems like a tlp and/or Linux kernel issue that has surfaced recently, as it also happened on a different ThinkPad laptop when I last tried it. I can’t really remove the battery, because the “power on with AC attach” setting only works when the battery is connected and charged.

The server/wardrobe/closet room is slightly chillier than the rest of the apartment, which keeps component temperatures slightly lower. I also have the option to do some crazy ventilation experiments in the winter, but that will have to wait for a bit, mainly because it’s spring.

I’m genuinely surprised that the Wi-Fi 5 signal comes through the closet quite adequately: the whole apartment is covered with at least 50 Mbit/s, and over 300 Mbit/s near the closet, which is about the maximum speed I can achieve from the access point in ideal conditions.

3 hours ago 1 votes
The coffee machine ran out of memory

After looking into an incident involving Kubernetes nodes running out of memory, I took a trip to the office kitchen to take a break and get a cup of the good stuff. My teammate got their drink first, and then it was my turn. Why is there a Windows 98 themed pop-up on the screen?

I wanted to get my coffee, so I tapped on the small OK button. That may have forced the poor coffee machine to start swapping, for which I felt a little bit guilty. The UI was catching up with the previous animations, and I got to the drink selection. None of the buttons worked. I reckon something critical crashed in the background.

After looking into an incident involving a coffee machine running out of memory, I took a trip to the other office kitchen to take a break and get a cup of the good stuff. That one was fine. Guess it ran on something other than Java. laugh_track.mp3

3 weeks ago 17 votes
About the time I trashed my mother's laptop

Around 2003, my mother had a laptop: the Compaq Armada 1592DT. It ran Windows Me, the worst Windows to ever exist, and had a whopping 96 MB of RAM and a 3 GB hard drive. My mother used it for important stuff, and I played games on it. Given the limitations of the 3 GB hard drive, this soon led to a conflict: there was no room to store any new games!

I did my best to make additional room by running the disk cleaner utility, disabling unnecessary Windows features and deleting some PDF catalogues that my mother had downloaded, but there was still a constant lack of space. Armed with a lack of knowledge about computers, I went further and found a tool that promised to make more room on the hard drive. I can’t remember what it was, but it had a nice graphical user interface where the space on the drive was represented as a pie chart. To my amazement, I could slide that pie chart to make it so that 90% of the drive was free space! I went full speed ahead with it. What followed was a crash, and upon rebooting I was presented with a black screen. Oops.

My mother ended up taking it to a repair shop for 1200 EEK, which was a lot of money at the time. The repair shop ended up installing Windows 98 SE on it, which felt like a downgrade at the time, but in retrospect it was an improvement over Windows Me.

I had no idea what I was doing back then, but I assume that the tool I was playing with was some sort of partition manager that had no safeguards in place to avoid shrinking and reformatting operating system partitions. Or if it did, it made ignoring the big warning signs way too easy. Still 100% user error on my part. If only I had known that reinstalling Windows was a relatively simple operation; it took a solid 4-5 years until I did my first installation of Windows all by myself.

a month ago 23 votes
Fairphone Fairbuds XL review: admirable goals, awful product

I bought the Fairphone Fairbuds XL with my own money at a recent sale for 186.75 EUR, plus 15 EUR for shipping to Estonia. The normal price for these headphones is 239 EUR. This post is not sponsored.

I admire what Fairphone wants to achieve, even going as far as getting the Fairphone 5 as a replacement for my iPhone X. Failing to repair my current headphones, I went ahead and decided to get the Fairphone Fairbuds XL, as they also advertise the active noise-cancelling feature, and I like the Fairphone brand. Disclaimer: this review is going to be entirely subjective and based on my opinions and experiences with other audio products in the past. I also have tinnitus.1 I consulted the rtings.com review before purchasing the product to get an idea about what to expect as a consumer.

The comparison headphones

The main point of comparison for this review is going to be the Sony WH-1000XM3, premium high-end wireless Bluetooth headphones with active noise cancelling (before that feature broke). These headphones retailed at a higher price during 2020 (about 300-400 EUR), so they are technically a tier above the Fairbuds XL, but given that their successor, the WH-1000XM4, can be bought for 239 EUR new (and often about 200-ish EUR on sale!), it is a fair comparison in my view.

After I replaced the ear cushions on my Sony WH-1000XM3 headset, the active noise-cancelling feature started being flaky (popping and loud noises occurring with NC on). No amount of cleaning or calibrating fixed it, and even the authorized repair shop could not do anything about it. I diagnosed the issue as being with the internal noise-cancelling microphones and found that these failing is a very common issue for these headsets, even for newer revisions. I am unable to compare the active noise-cancelling performance side by side, but I can say that the NC performance on the Sony WH-1000XM3 was simply excellent when it did work, no doubt about it.

The Fairphone shop experience

The first issue I had with the product was actually buying it. For some reason, the form would not accept my legal name, which has the letter “Õ” in it, a common vowel in Estonia. Knowing how poorly JavaScript-based client-side validation can be built, I pulled a pro gamer move and copy-pasted my name into the form, which bypassed the faulty check altogether. A similar issue occurred with the address field, as we also have the letter “Ä” (and “Ö” and “Ü”, for that matter).

The name

I can understand why Fairphone went with the name “Fairbuds XL”: it kind of makes sense in their audio product line, and Apple set a precedent with AirPods Max. However, there is a big missed opportunity here: they could’ve called the product… Fairphones. Yes, it would cause some confusion with their other product line, the Fairphone, but at least I would find the name more amusing.

Packaging

The packaging for the headphones is quite similar to what you’d get with the Fairphone 5: lots of cardboard and seemingly no plastic or otherwise problematic materials. Aside from the headphones themselves, you also get a nice egg bag, meant to protect your headphones when travelling with them. It’s okay, but nothing special, and it won’t protect your headphones from physical damage should they fall or get thrown around in a backpack. The Sony headphones come with a solid hard case, which has done a fantastic job of protecting the headphones over the last 4 years.
Longevity of a device depends both on repairability and durability, which is why a hard case would benefit the Fairbuds XL a lot.

Factory defect

My experience with the Fairbuds XL was off to a rocky start. I noticed that the USB-C cable that connects both sides of the headphones was inserted incorrectly. The headphones worked fine, but you could feel the flat USB-C cable being twisted inside the headband. The fix was to carefully push the headband back, disconnect the USB-C cable from the headphones, flip the cable around and reconnect it. Not a good first impression, but at least the fix was simple enough.

Fit and feel

The Fairbuds XL are not as comfortable as the reference headphones. The ear cushions and headrest are quite hard and not as soft as on the Sony WH-1000XM3. If you get the fit just right, you probably won’t have issues wearing these for a few hours at a time, but I found myself adjusting them often to stop them from hurting my ears and head, even during a short test. The ear cups lack any kind of swiveling, which likely contributes to the comparatively poor fit. Our ears are angled ever so slightly forwards, and the Sony WH-1000XM3 feel so much better on the ears as a result of their swiveling ear cups.

I also noticed that you can hear some components inside the headphones rattling when moving your head. This noise is very noticeable even during music playback, and you don’t need to move your head a lot to hear it. In my view, this is a serious defect in the product.

When the headphones are folded in, the USB-C cable gets bent and forced against one of the ear cushions. I suspect that within months or years of use, either the cable will fail or the ear cushion will get a permanent imprint of the USB-C cable.

The sound

I’m not impressed with the sound that the Fairbuds XL produce. They are not in the same class as the Sony WH-1000XM3, with the default equalizer sounding incredibly bland. Most instruments and sounds are muddled and not as clear; that’s the best way I can describe it. The Fairbuds app can be used to tune the sound via the equalizer, and out of all the presets I’ve found “Boston” to be the most pleasant one to use. Unfortunately, the UI does not show how the presets set the values in the equalizer, which makes tweaking a preset that much harder. Compared to the Sony WH-1000XM3, I miss the crispy sound and the all-encompassing bass that can really bring out all the satisfying details. Given that I had used the Sony headphones for almost 5 years at this point, I may also just have gotten used to how they sound.

Active noise-cancelling

The active noise-cancelling performance is nowhere near that of the Sony WH-1000XM3. The effect is very minor, and you’ll still hear most of the surrounding sounds. Touching the active noise-cancelling microphones on the sides of the headphones also makes a loud sound in the speaker, and walking around a room results in the headphones making wind noises. Because of this, I consider the active noise-cancelling functionality to be functionally broken.

Microphone quality

I used the Fairbuds XL in a work call, and based on feedback from other attendees, the microphone quality over Bluetooth can be categorized as barely passable, a solid 2 points out of 5. To be fair, Bluetooth microphone quality is also not great on the Sony WH-1000XM3, but compared to the Fairphone Fairbuds XL, it is still subjectively better.
Fairbuds app

The Fairbuds app is very simple; you’d mainly use it for the equalizer settings and firmware upgrades. The rest of the functionality seems to be a bunch of links to Fairphone articles and guides.

The first time I installed the app, it told me that firmware upgrade V90 was available. During the first attempt, the progress bar stalled. On the second attempt it almost reached the end, and afterwards it no longer reported an upgrade as available. The third attempt came after I had reinstalled the app, and there it was: the V90 update, again. This time it got stuck at 1%. I’m probably still on the older version of the firmware, but I honestly can’t tell.

Bluetooth multi-device connecting

This is a feature that I didn’t know I needed in my life. With the reference Sony WH-1000XM3, whenever I wanted to switch where I listened to music, I had to disconnect from my phone and then reconnect on the desktop, an annoying manual process. With the Fairbuds XL, I can connect the headphones to both my laptop and my phone and play media on either; the headphones switch to whichever device I’m actually using! This, too, has its quirks, and there might be a small delay when playback starts on the other device, but I’ve grown so accustomed to this feature that I can’t imagine going back to anything else. It is not unique to the Fairbuds XL, as other modern wireless headphones likely boast it as well, but this is the first time I’ve had the opportunity to try it myself. It’s a tremendous quality-of-life improvement for me.

However, this, too, is not perfect. If I have the headphones connected to my phone and laptop, and I switch to headset mode on the laptop for a meeting, playback on the phone is butchered until I completely disconnect the headphones from the laptop. This seems like a firmware issue to me.

The controls

The Fairbuds XL have one button and one joystick. The button controls the active noise-cancelling settings (NC on, Ambient sound, NC off), plus the Bluetooth pairing mode. The joystick is used to turn the device on, switch songs and control the volume, and likely some other functions relating to accepting calls and the like.

Coming from the Sony WH-1000XM3, I have to say that I absolutely LOVE having physical buttons again! It’s so much easier to change the volume, skip songs and start/stop playback with a physical control than with the asinine touch surface that Sony has going on. The joystick is not perfect: skipping a song can be a little tricky due to how the joystick is positioned, and you can’t always get a good grip because your fingers hit the rest of the headphone assembly. That’s the only concern I have with it. If the joystick were a little larger and concave, some of these actions might be easier for those of us with modest-to-large thumbs.

The audio cue for skipping songs is a bit annoying and seemingly cannot be disabled. The sound effect resembles someone hitting a golf ball with a very poor driver. The ANC settings button is alright, but it’s not possible to quickly cycle between the three modes: you have to fully listen to the nice lady speaking before you can move on to the next setting. I wish that clicking the button in rapid succession would skip through the modes faster.
USB-C port functionality

I was curious to see whether the Fairbuds XL would work as normal headphones if I just connected them to my PC with a USB-C cable. To my surprise, they did! The audio quality was not as good as over Bluetooth, and the volume controls depended on which virtual device you select in your operating system. The Sony WH-1000XM3 do not work like this; their USB-C port is for charging only as far as I’ve tested, though they do have an actual 3.5 mm port for wired use.

If you connect a charging cable while connected over Bluetooth, the Fairbuds XL pause momentarily and then continue playback while charging the battery. This is incredibly handy for a wireless device, especially when you have an important meeting coming up and you’re just about to run out of battery. The Sony WH-1000XM3 simply power off when you connect a charger cable, rendering them unusable while charging.

Annoying issues

For some reason, whenever I charge my Fairbuds XL, they magically turn on again and I have to shut them off a second time. I’m never quite sure whether I’ve managed to shut the headphones off. They play the jingle that indicates they’re powered off, but then I come back later and find that they’re powered on again.

Customer care experience

I was so unhappy with the product that I tried out the refund process for the Fairphone Fairbuds XL. I ordered the Fairbuds XL on 2025-02-10 and received them on 2025-02-14, shipped to Estonia. According to Fairphone’s own materials, I can return the headphones without any questions asked, assuming that my use of them matches what can be done at a physical store:

For Fairphone Products, including gift cards, you purchased on the Fairphone Webshop, you have a legal right to change your mind within 14 days and receive a refund amounting to the purchase price of the products and the costs of delivery and return. You are entitled to cancel your purchase within fourteen (14) days from the day the products were delivered to you, without explanation and without any penalties. In the case of a Cool-off, Fairphone may reduce the refund of the purchase price (including delivery costs) to reflect any reduction in the value of the Products, if this has been caused by your handling them in a way which would not normally be permitted in a shop. This means You are entitled to turn on and inspect Your purchased device to familiarise yourself with its properties and ensure that it is working correctly – comparable to the conditions that are permitted within a shop.

I followed their instructions and filed a support ticket on 2025-02-16. On 2025-02-25, having not yet received any contact from Fairphone, I asked them again under the same ticket. On 2025-03-07, I received an automated message that apologized for the delay and asked me not to open additional tickets on the matter. Over a month later, I’m still waiting for an update on the support ticket while the headphones sit in their original packaging. Based on the experiences of others in the Fairphone community forum, it seems that unacceptably long delays in customer service are the norm for Fairphone.

Fairphone, if you want to succeed as a company, the one part of your company that directly interfaces with your actual paying customers needs to be appropriately staffed and resourced. A bad customer support experience can turn off a brand evangelist overnight.
Closing thoughts

I want Fairphone to succeed in their mission, but products like these do not further the cause. The feature set of the Fairbuds XL seems competent, and I’m willing to give a pass on a few minor issues if the overall experience is good, but the unimpressive sound profile, broken active noise-cancelling mode, multiple quality issues and poor customer service mean that I can’t in good conscience recommend the Fairphone Fairbuds XL, not even on sale. Perhaps fewer resources should be spent on rebranding and more on engineering good products.

Remember dubstep being a thing? Yeah, so do I. That, plus a little bit of mandatory military service, can do a lot of damage to hearing. ↩︎

a month ago 25 votes
I yearn for the perfect home server

I’ve changed my home server setup a lot over the past decade, mainly because I keep changing the goals all the time. I’ve now realized why that keeps happening: I want the perfect home server.

What is the perfect home server? I’d phrase it like this: the perfect home server uses very little power, offers plenty of affordable storage and provides a lot of performance when it’s actually being relied upon. In my case, low power means less than 5 W while idling, storage means 10+ TB of redundant storage for data resilience and integrity, and performance means about 4 modern CPU cores’ worth (low-to-midrange desktop CPU performance).

I seem to only ever get one, or two at most. Low power usage? Your performance will likely suffer, and you can’t run too many storage drives. You can run SSD-s, but they are not affordable if you need higher capacities. Lots of storage? Well, there goes the low power consumption goal, especially if you run 3.5" hard drives. Lots of performance? Lots of power consumed! There’s just something that annoys me whenever I do things on my home server and have to wait longer than I should, and yet I’m bothered when my monitoring tells me that my home server is using 50+ watts.1

I keep an eye out for developments in the self-hosting and home server spaces in the hope that I’ll one day stumble upon the holy grail, that one server that fits all my needs. I’ve gotten close, but no matter what setup I have, there’s always something that keeps bothering me. I’ve seen a few attempts at the perfect home server, covered by various tech reviewers, but they always have at least one critical flaw. Sometimes the whole package is actually great and the functionality rocks, and then you find that the hardware contains prototype-level solutions that balloon the power consumption to over 30 W. Or the price is over 1000 USD/EUR, not including the drives. Or it’s only available in certain markets, and the shipping and import duties destroy its value proposition. There is no affordable platform out there that provides great performance, flexibility and storage space, all while being quiet and using very little power.2

Desktop PC-s repurposed as home servers can provide room for a lot of storage, and they are by design very flexible, but the trade-off is the higher power consumption of the setup. Single board computers use very little power, but they can’t provide a lot of performance, and connecting storage to them gets tricky and is overall limited. They can also get surprisingly expensive. NAS boxes provide a lot of storage space and are generally low power if you exclude the power consumption of hard drives, but the cheaper ones are not that performant, and the performant ones cost almost as much as a high-end PC. Laptops can be used as home servers; they are quite efficient and performant, but they lack the flexibility and storage options of desktop PC-s and NAS boxes. You can attach a USB-based DAS to add storage, but I’ve had poor experiences with those under high load, meaning that the approach can’t be relied on if you care about your data and server stability.

Then there’s the option of buying used versions of all of the above. Great bang for the buck, but you’re likely taking a hit on power efficiency, due to the simple fact that technology keeps evolving and getting more efficient. I’m still hopeful that one day a device will exist that ticks all the boxes while also being priced affordably, but I’m afraid that it’s just a pipe dream.
There are builds out there that fill almost every need, but the parts lists are very specific, and the bulk of the power consumption wins come from using SSD-s instead of hard drives, which makes them less affordable. In the meantime I guess I’ll keep rocking my ThinkPad-as-a-server approach and praying that the USB-attached storage does not cause major issues.

Perhaps it’s an undiagnosed medical condition. Homeserveritis? ↩︎

If there is one, then let me know, you can find the contact details below! ↩︎
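For a rough sense of why the idle-power goal matters, here is a back-of-the-envelope sketch (my own, not from the post; the 0.25 EUR/kWh electricity price is an assumed figure, adjust for your tariff):

    # Back-of-the-envelope: what an always-on server costs per year.
    HOURS_PER_YEAR = 24 * 365        # 8760 hours
    PRICE_EUR_PER_KWH = 0.25         # assumed electricity price

    def yearly_cost_eur(watts: float) -> float:
        """Energy cost of a constant load running all year."""
        kwh = watts * HOURS_PER_YEAR / 1000
        return kwh * PRICE_EUR_PER_KWH

    for watts in (5, 30, 50):
        print(f"{watts:>3} W -> {yearly_cost_eur(watts):6.2f} EUR/year")

At that assumed price, the 5 W target works out to roughly 11 EUR per year, while a 50 W box costs around 110 EUR per year, which is a meaningful difference for an always-on machine.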

2 months ago 31 votes

More in technology

Optimize maintenance with the Arduino Rileva ME Opta Bundle

When your machines run smoothly, your business can go far. That’s why condition monitoring – once a “nice to have” – is quickly becoming a must in maintenance strategies across industrial settings. But most dedicated systems can be complex to set up or difficult to scale. To make things easier, we’re introducing the Arduino Rileva ME Opta […]

2 days ago 3 votes
Dr. Dobb's Journal Interviews Jef Raskin (1986)

They discuss interface design and Raskin's hatred of the mouse.

3 days ago 5 votes
2025-05-11 air traffic control

Air traffic control has been in the news lately, on account of my country's declining ability to do it. Well, that's a long-term trend, resulting from decades of under-investment, severe capture by our increasingly incompetent defense-industrial complex, no small degree of management incompetence in the FAA, and long-lasting effects of Reagan crushing the PATCO strike. But that's just my opinion, you know, maybe airplanes got too woke. In any case, it's an interesting time to consider how weird parts of air traffic control are. The technical, administrative, and social aspects of ATC all seem two notches more complicated than you would expect. ATC is heavily influenced by its peculiar and often accidental development, a product of necessity that perpetually trails behind the need, and a beneficiary of hand-me-down military practices and technology.

Aviation Radio

In the early days of aviation, there was little need for ATC---there just weren't many planes, and technology didn't allow ground-based controllers to do much of value. There was some use of flags and signal lights to clear aircraft to land, but for the most part ATC had to wait for the development of aviation radio. The impetus for that work came mostly from the First World War. Here we have to note that the history of aviation is very closely intertwined with the history of warfare. Aviation technology has always rapidly advanced during major conflicts, and as we will see, ATC is no exception.

By 1913, the US Army Signal Corps was experimenting with the use of radio to communicate with aircraft. This was pretty early in radio technology, and the aircraft radios were huge and awkward to operate, but it was also early in aviation and "huge and awkward to operate" could be similarly applied to the aircraft of the day. Even so, radio had obvious potential in aviation. The first military application for aircraft was reconnaissance. Pilots could fly past the front to find artillery positions and otherwise provide useful information, and then return with maps. Well, even better than returning with a map was providing the information in real time, and by the end of the war medium-frequency AM radios were well developed for aircraft.

Radios in aircraft led naturally to another wartime innovation: ground control. Military personnel on the ground used radio to coordinate the schedules and routes of reconnaissance planes, and later to inform on the positions of fighters and other enemy assets. Without any real way to know where the planes were, this was all pretty primitive, but it set the basic pattern: people on the ground could keep track of aircraft and provide useful information.

Post-war, civil aviation rapidly advanced. The early 1920s saw numerous commercial airlines adopting radio, mostly for business purposes like schedule coordination. Once you were in contact with someone on the ground, though, it was only logical to ask about weather and conditions. Many of our modern practices like weather briefings, flight plans, and route clearances originated as more or less formal practices within individual airlines.

Air Mail

The government was not left out of the action. The Post Office operated what may have been the largest commercial aviation operation in the world during the early 1920s, in the form of Air Mail. The Post Office itself did not have any aircraft; all of the flying was contracted out---initially to the Army Air Service, and later to a long list of regional airlines.
Air Mail was considered a high priority by the Post Office and proved very popular with the public. When the transcontinental route began proper operation in 1920, it became possible to get a letter from New York City to San Francisco in just 33 hours by transferring it between airplanes in a nearly non-stop relay race. The Post Office's largesse in contracting the service to private operators provided not only the funding but the very motivation for much of our modern aviation industry. Air travel was not very popular at the time, being loud and uncomfortable, but the mail didn't complain. The many contract mail carriers of the 1920s grew and consolidated into what are now some of the United States' largest companies. For around a decade, the Post Office almost singlehandedly bankrolled civil aviation, and passengers were a side hustle [1].

Air mail ambition was not only of economic benefit. Air mail routes were often longer and more challenging than commercial passenger routes. Transcontinental service required regular flights through sparsely populated parts of the interior, challenging the navigation technology of the time and making rescue of downed pilots a major concern. Notably, air mail operators did far more nighttime flying than any other commercial aviation in the 1920s. The Post Office became the government's de facto technical leader in civil aviation. Besides the network of beacons and markers built to guide air mail between cities, the Post Office built 17 Air Mail Radio Stations along the transcontinental route. The Air Mail Radio Stations were the company radio system for the entire air mail enterprise, and the closest thing to a nationwide, public air traffic control service that existed at the time. They did not, however, provide what we would now call control. Their role was mainly to provide pilots with information (including, critically, weather reports) and to keep loose tabs on air mail flights so that a disappearance would be noticed in time to send search and rescue.

In 1926, the Air Commerce Act created the Aeronautics Branch of the Department of Commerce. The Aeronautics Branch assumed a number of responsibilities, but one of them was the maintenance of the Air Mail routes. Similarly, the Air Mail Radio Stations became Aeronautics Branch facilities, and took on the new name of Flight Service Stations. No longer just for the contract mail carriers, the Flight Service Stations made up a nationwide network of government-provided services to aviators. They were the first edifices in what we now call the National Airspace System (NAS): a complex combination of physical facilities, technologies, and operating practices that enable safe aviation.

In 1935, the first en-route air traffic control center opened, a facility in Newark owned by a group of airlines. The Aeronautics Branch, since renamed the Bureau of Air Commerce, supported the airlines in developing this new concept of en-route control that used radio communications and paperwork to track which aircraft were in which airways. The rising number of commercial aircraft made in-air collisions a bigger problem, so the Newark control center was quickly followed by more facilities built on the same pattern. In 1936, the Bureau of Air Commerce took ownership of these centers, and ATC became a government function alongside the advisory and safety services provided by the flight service stations.
En route center controllers worked off of position reports from pilots via radio, but needed a way to visualize and track aircraft's positions and their intended flight paths. Several techniques helped. First, airlines shared their flight planning paperwork with the control centers, establishing "flight plans" that corresponded to each aircraft in the sky. Controllers adopted a work aid called a "flight strip," a small piece of paper with the key information about an aircraft's identity and flight plan that could easily be handed between stations. By arranging the flight strips on display boards full of slots, controllers could visualize the ordering of aircraft in terms of altitude and airway. Second, each center was equipped with a large plotting table map where controllers pushed markers around to correspond to the position reports from aircraft. A small flag on each marker gave the flight number, so it could easily be correlated to a flight strip on one of the boards mounted around the plotting table. This basic concept of air traffic control, of a flight strip and a position marker, is still in use today.

Radar

The Second World War changed aviation more than any other event of history. Among the many advancements were two British inventions of particular significance: first, the jet engine, which would make modern passenger airliners practical. Second, the radar, and more specifically the magnetron. This was a development of such significance that the British government treated it as a secret akin to nuclear weapons; indeed, the UK effectively traded radar technology to the US in exchange for participation in US nuclear weapons research.

Radar created radical new possibilities for air defense, and complemented previous air defense development in Britain. During WWI, the organization tasked with defending London from aerial attack had developed a method called "ground-controlled interception" or GCI. Under GCI, ground-based observers identify possible targets and then direct attack aircraft towards them via radio. The advent of radar made GCI tremendously more powerful, allowing a relatively small number of radar-assisted air defense centers to monitor for inbound attack and then direct defenders with real-time vectors.

In the first implementation, radar stations reported contacts via telephone to "filter centers" that correlated tracks from separate radars to create a unified view of the airspace---drawn in grease pencil on a preprinted map. Filter center staff took radar and visual reports and updated the map by moving the marks. This consolidated information was then provided to air defense bases, once again by telephone.

Later technical developments in the UK made the process more automated. The invention of the "plan position indicator" or PPI, the type of radar scope we are all familiar with today, made the radar far easier to operate and interpret. Radar sets that automatically swept over 360 degrees allowed each radar station to see all activity in its area, rather than just aircraft passing through a defensive line. These new capabilities eliminated the need for much of the manual work: radar stations could see attacking aircraft and defending aircraft on one PPI, and communicated directly with defenders by radio. It became routine for a radar operator to give a pilot navigation vectors by radio, based on real-time observation of the pilot's position and heading. A controller took strategic command of the airspace, effectively steering the aircraft from a top-down view.
The ease and efficiency of this workflow was a significant factor in the end of the Battle of Britain, and its remarkable efficacy was noticed in the US as well. At the same time, changes were afoot in the US. WWII was tremendously disruptive to civil aviation; while aviation technology rapidly advanced due to wartime needs, those same pressing demands led to a slowdown in nonmilitary activity. A heavy volume of military logistics flights and flight training, as well as growing concerns about defending the US from an invasion, meant that ATC was still a priority. A reorganization of the Bureau of Air Commerce replaced it with the Civil Aeronautics Authority, or CAA. The CAA's role greatly expanded as it assumed responsibility for airport control towers and commissioned new en route centers.

As WWII came to a close, CAA en route control centers began to adopt GCI techniques. By 1955, the name Air Route Traffic Control Center (ARTCC) had been adopted for en route centers and the first air surveillance radars were installed. In a radar-equipped ARTCC, the map where controllers pushed markers around was replaced with a large tabletop PPI built to a Navy design. The controllers still pushed markers around to track the identities of aircraft, but they moved them based on their corresponding radar "blips" instead of radio position reports.

Air Defense

After WWII, post-war prosperity and wartime technology like the jet engine led to huge growth in commercial aviation. During the 1950s, radar was adopted by more and more ATC facilities (both "terminal" at airports and "en route" at ARTCCs), but there were few major changes in ATC procedure. With more and more planes in the air, tracking flight plans and their corresponding positions became labor intensive and error-prone. A particular problem was the increasing range and speed of aircraft, and correspondingly longer passenger flights, which meant that many aircraft passed from the territory of one ARTCC into another. This required that controllers "hand off" the aircraft, informing the "next" ARTCC of the flight plan and position at which the aircraft would enter their airspace.

In 1956, 128 people died in a mid-air collision of two commercial airliners over the Grand Canyon. In 1958, 49 people died when a military fighter struck a commercial airliner over Nevada. These were not the only such incidents in the mid-1950s, and public trust in aviation started to decline. Something had to be done.

First, in 1958 the CAA gave way to the Federal Aviation Administration. This was more than just a name change: the FAA's authority was greatly increased compared to the CAA's, most notably by granting it authority over military aviation. This is a difficult topic to explain succinctly, so I will only give broad strokes. Prior to 1958, military aviation was completely distinct from civil aviation, with no coordination and often no communication at all between the two. This was, of course, a factor in the 1958 collision. Further, the 1956 collision, while it did not involve the military, did result in part from communications issues between separate CAA facilities and the airline's own control facilities. After 1958, ATC was completely unified into one organization, the FAA, which assumed the work of the military controllers of the time and some of the role of the airlines.
The military continues to have its own air controllers to this day, and military aircraft continue to enjoy privileges such as (practical but not legal) exemption from transponder requirements, but military flights over the US are still beholden to the same ATC as civil flights. Some exceptions apply, void where prohibited, etc.

The FAA's suddenly increased scope only made the practical challenges of ATC more difficult, and commercial aviation numbers continued to rise. As soon as the FAA was formed, it was understood that there needed to be major investments in improving the National Airspace System. While the first couple of years were dominated by the transition, the FAA's second administrator (Najeeb Halaby) prepared two lengthy reports examining the situation and recommending improvements. One of these, the Beacon report (also called Project Beacon), specifically addressed ATC. The Beacon report's recommendations included massive expansion of radar-based control (called "positive control" because of the controller's access to real-time feedback on aircraft movements) and new control procedures for airways and airports. Even better, for our purposes, it recommended the adoption of general-purpose computers and software to automate ATC functions.

Meanwhile, the Cold War was heating up. US air defense, a minor concern in the few short years after WWII, became a higher priority than ever before. The Soviet Union had long-range aircraft capable of reaching the United States, and nuclear weapons meant that only a few such aircraft had to make it through to cause massive destruction. The vast size of the United States (and, considering the new unified air defense command between the United States and Canada, all of North America) made this a formidable challenge.

During the 1950s, the newly minted Air Force worked closely with MIT's Lincoln Laboratory (an important center of radar research) and IBM to design a computerized, integrated, networked system for GCI. When the Air Force committed to purchasing the system, it was christened the Semi-Automatic Ground Environment, or SAGE. SAGE is a critical juncture in the history of the computer and computer communications, the first system to demonstrate many parts of modern computer technology and, moreover, perhaps the first large-scale computer system of any kind. SAGE is an expansive topic that I will not take on here; I'm sure it will be the focus of a future article, but it's a pretty well-known and well-covered topic. I have not so far felt like I had much new to contribute, despite it being the first item on my "list of topics" for the last five years.

But one of the things I want to tell you about SAGE, that is perhaps not so well known, is that SAGE was not used for ATC. SAGE was a purely military system. It was commissioned by the Air Force, and its numerous operating facilities (called "direction centers") were located on Air Force bases along with the interceptor forces they would direct. However, there was obvious overlap between the functionality of SAGE and the needs of ATC. SAGE direction centers continuously received tracks from remote data sites using modems over leased telephone lines, and automatically correlated multiple radar tracks to a single aircraft. Once an operator entered information about an aircraft, SAGE stored that information for retrieval by other radar operators.
When an aircraft with associated data passed from the territory of one direction center to another, the aircraft's position and related information were automatically transmitted to the next direction center by modem. One of the key demands of air defense is the identification of aircraft---any unknown track might be routine commercial activity, or it could be an inbound attack. The air defense command received flight plan data on commercial flights (and more broadly all flights entering North America) from the FAA and entered them into SAGE, allowing radar operators to retrieve "flight strip" data on any aircraft on their scope.

Recognizing this interconnection with ATC, as soon as SAGE direction centers were being installed, the Air Force started work on an upgrade called SAGE Air Traffic Integration, or SATIN. SATIN would extend SAGE to serve the ATC use case as well, providing SAGE consoles directly in ARTCCs and enhancing SAGE to perform non-military safety functions like conflict warning and forward projection of flight plans for scheduling. Flight strips would be replaced by teletype output, and in general made less necessary by the computer's ability to filter the radar scope. Experimental trial installations were made, and the FAA participated readily in the research efforts. Enhancement of SAGE to meet ATC requirements seemed likely to meet the Beacon report's recommendations and radically improve ARTCC operations, sooner and cheaper than development of an FAA-specific system.

As it happened, well, it didn't happen. SATIN became interconnected with another planned SAGE upgrade, the Super Combat Centers (SCC): deep underground combat command centers with greatly enhanced SAGE computer equipment. SATIN and SCC planners were so confident that the last three Air Defense Sectors scheduled for SAGE installation, including my own Albuquerque, were delayed under the assumption that the improved SATIN/SCC equipment should be installed instead of the soon-obsolete original system. SCC cost estimates ballooned, and the program's ambitions were reduced month by month until it was canceled entirely in 1960. Albuquerque never got a SAGE installation, and the Albuquerque air defense sector was eliminated by reorganization later in 1960 anyway.

Flight Service Stations

Remember those Flight Service Stations, the ones that were originally built by the Post Office? One of the oddities of ATC is that they never went away. FSS were transferred to the CAB, to the CAA, and then to the FAA. During the 1930s and 1940s many more were built, expanding coverage across much of the country. Throughout the development of ATC, the FSS remained responsible for non-control functions like weather briefing and flight plan management. Because aircraft operating under instrument flight rules must closely comply with ATC, the involvement of FSS in IFR flights is very limited, and FSS mostly serve VFR traffic.

As ATC became common, the FSS gained a new and somewhat odd role: playing go-between for ATC. FSS were more numerous and often located in sparser areas between cities (while ATC facilities tended to be in cities), so especially in the mid-century, pilots were more likely to be able to reach an FSS than ATC. It was, for a time, routine for FSS to relay instructions between pilots and controllers. This is still done today, although improved communications have made the need much less common.
As weather dissemination improved (another topic for a future post), FSS gained access to extensive weather conditions and forecasting information from the Weather Service. This connectivity is bidirectional; during the midcentury, FSS not only received weather forecasts by teletype but transmitted pilot reports of weather conditions back to the Weather Service. Today these communications have, of course, been computerized, although the legacy teletype format doggedly persists.

There has always been an odd schism between the FSS and ATC: they are operated by different departments, out of different facilities, with different functions and operating practices. In 2005, the FAA cut costs by privatizing the FSS function entirely. Flight service is now operated by Leidos, one of the largest government contractors. All FSS operations have been centralized to one facility that communicates via remote radio sites. While flight service is still available, increasing automation has made the stations far less important, and the general perception is that flight service is in its last years. Last I looked, Leidos was not hiring for flight service, and the expectation was that they would never hire again, retiring the service along with its staff.

Flight service does maintain one of my favorite internet phenomena, the phone number domain name: 1800wxbrief.com. One of the odd manifestations of the FSS/ATC schism and the FAA's very partial privatization is that Leidos maintains an online aviation weather portal that is separate from, and competes with, the Weather Service's aviationweather.gov. Since Flight Service traditionally has the responsibility for weather briefings, it is honestly unclear to what extent Leidos vs. the National Weather Service should be investing in aviation weather information services. For its part, the FAA seems to consider aviationweather.gov the official source, while it pays for 1800wxbrief.com. There's also weathercams.faa.gov, which duplicates a very large portion (maybe all?) of the weather information on Leidos's portal and some of the NWS's. It's just one of those things. Or three of those things, rather. Speaking of duplication due to poor planning...

The National Airspace System

Left in the lurch by the Air Force, the FAA launched its own program for ATC automation. While the Air Force was deploying SAGE, the FAA had mostly been waiting, and various ARTCCs had adopted a hodgepodge of methods ranging from one-off computer systems to completely paper-based tracking. By 1960 radar was ubiquitous, but different radar systems were used at different facilities, and correlation between radar contacts and flight plans was completely manual. The FAA needed something better, and with growing congressional support for ATC modernization, they had the money to fund what they called National Airspace System En Route Stage A.

Further bolstering historical confusion between SAGE and ATC, the FAA decided on a practical, if ironic, solution: buy their own SAGE. In an upcoming article, we'll learn about the FAA's first fully integrated computerized air traffic control system. While the failed detour through SATIN delayed the development of this system, the nearly decade-long delay between the design of SAGE and the FAA's contract allowed significant technical improvements.
This "New SAGE," while directly based on SAGE at a functional level, used later off-the-shelf computer equipment including the IBM System/360, giving it far more resemblance to our modern world of computing than SAGE with its enormous, bespoke AN/FSQ-7. And we're still dealing with the consequences today! [1] It also laid the groundwork for the consolidation of the industry, with a 1930 decision that took air mail contracts away from most of the smaller companies and awarded them instead to the precursors of United, TWA, and American Airlines.

4 days ago 8 votes
Sierpiński triangle? In my bitwise AND?

Exploring a peculiar bit-twiddling hack at the intersection of 1980s geek sensibilities.
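For context, the pattern the title alludes to is easy to reproduce. A minimal sketch of my own (assuming the article covers the classic trick, where cells with x & y == 0, i.e. no set bits shared between the coordinates, trace out a Sierpiński triangle):

    # Print a Sierpinski triangle from bitwise AND: mark every cell
    # whose coordinates share no set bits.
    SIZE = 32  # use a power of two to get whole recursion levels

    for y in range(SIZE):
        print("".join("#" if x & y == 0 else " " for x in range(SIZE)))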

5 days ago 8 votes