I have a bad habit of testing things whenever a “good” idea pops into my head. This is a short overview of one of them. The Orange Pi Zero is an SBC (single-board computer) with a slow quad-core 32-bit ARM CPU, 512 MB of RAM and no display output. It’s actually quite OK for many tasks, such as acting as a reverse proxy (assuming 100 Mbit/s is enough for you), a low-performance NAS (assuming you are fine with 10-12 MB/s file transfer speeds) or a Syncthing relay. Oh, and you can also turn it into a Wi-Fi access point. Why? Because Armbian made it really easy to test out and I was interested in seeing what kind of performance an old USB Wi-Fi dongle could offer. This cheap AP could also come in handy in situations where my main access point dies for some reason. How? Go to the Armbian device page for your SBC and download the latest image, write it to your microSD card using the tool of your choice, start your SBC and finish the initial setup. After all that is done, start armbian-config as root....
over a year ago

More from ./techtipsy

We get laptops with annoying cooling fans because we keep buying them

I don’t like laptops with loud cooling fans in them. Quite a controversial position, I know. But really, they do suck. A laptop can be great to use, have a fantastic keyboard, sharp display, lots of storage and a fast CPU, and all of that can be ruined by one component: the cooling fan.

Laptop fans are small, meaning that they have to run faster to have any meaningful cooling effect, which means that they are usually very loud and often have a high-pitched whine to them, making them especially obnoxious. Sometimes it feels like a deliberate attack on one of my senses. Fans also introduce a maintenance burden. They keep taking in dust, which tends to accumulate at the heat sink. If you skip maintenance, you’ll see your performance drop and the laptop will get notably hot, which may contribute to a complete hardware failure.

We’ve seen tremendous progress in the world of consumer CPUs over the last decade. Power consumption is much lower while idle, processors can do a lot more work in the same power envelope, and yet most laptops that I see in use are still actively cooled by an annoying-ass cooling fan.1 And yet we keep buying them. But it doesn’t have to be this way.

My colleagues who have switched to Apple Silicon laptops are sometimes surprised to hear the fan on their laptop because it’s a genuinely rare occurrence for them. Most of the time it just sits there doing nothing, and when it does come on, it’s whisper-quiet. And to top it off, some models, such as the MacBook Air series, are completely fanless. Meanwhile, those colleagues who run Lenovo ThinkPads with Ryzen 5000 and 7000 series APUs (that includes me) have audible fans, and at the same time the build times for the big Java monolith that we maintain are significantly slower (~15%) compared to the fan-equipped MacBooks.2

We could fix this if we really wanted to. As a first step, you can change to a power-saving mode on your current laptop. This will likely result in your CPU and GPU running more efficiently, which also helps avoid turning the cooling fan on. You will have to sacrifice some performance as a result of this change, which will not be a worthwhile trade-off for everyone. If you are OK with risking damage to your hardware, you can also play around with setting your own fan curve. The CPU and GPU throttling technology is quite advanced nowadays, so you will likely be fine in this area, but other components in the laptop, such as the battery, may not be very happy with higher temperatures.

After doing all that, the next step is to avoid buying a laptop that abuses your sense of hearing. That’s the only signal we can send to manufacturers that they will actually listen to. Money speaks louder than words. What alternative options do we have? Well, there are the Apple Silicon MacBooks, and, uhh, that one ThinkPad with an ARM CPU, and a bunch of Chromebooks, and a few Windows tablets I guess. I’ll be honest, I have not kept a keen eye on recent developments, but a quick search online for fanless laptops pretty much looks as I described. Laptops that you’d actually want to get work done on are completely missing from that list, unless you like Apple.3 In a corporate environment the choice of laptop might not be fully up to you, but you can do your best to influence the decision-makers.

There’s one more alternative: ask your software vendor to not write shoddily thrown together software that performs like shit. Making a doctor’s appointment should not make my cooling fan go crazy.
Not only is slow and inefficient software discriminatory toward those who cannot afford decent computer hardware, it’s also directly contributing to the growing e-waste generation problem by continuously raising the minimum hardware requirements for the software that we rely on every day.

Written on a Lenovo ThinkPad X395 that just won’t stop heating up and making annoying fan noises.

1. Passive vs active cooling? More like passive vs annoying cooling. ↩︎
2. I dream of a day when Asahi Linux runs perfectly on an Apple Silicon MacBook. It’s not production ready right now, but the developers have done an amazing job so far! ↩︎
3. I like the hardware that Apple produces; it’s the operating system that I heavily dislike. ↩︎

a week ago 16 votes
Home is where the home server is

I moved recently, and so did my home server. You might have noticed it due to the downtime. This time I have built a dedicated shelf for it, which allows for more flexibility and room for additional expensive ideas. The internet connection is a fiber line, which is fantastic for a place that’s generally considered to be in the countryside. At the last place in Tallinn (the capital of Estonia) I had to hire a guy, with my own money, to pull a fiber line from the basement to the apartment, so I’m very happy that I don’t have to do it here. And yes, the ThinkPad T430 is still a solid home server. I had an issue with my battery calibration script resulting in the machine being turned off, but I fixed it by disabling the script, at the cost of the battery probably dying soon. It seems like a tlp and/or Linux kernel issue that has surfaced recently, as it also happened on a different ThinkPad laptop when I last tried it. I can’t really remove the battery, because the “power on with AC attach” setting only works when the battery is connected and charged. The server/wardrobe/closet room is slightly chillier than the rest of the apartment, which means the server temperatures are also slightly lower. I also have an option to do some crazy ventilation experiments in the winter, but that will have to wait for a bit, mainly because it’s spring. I’m genuinely surprised that the Wi-Fi 5 signal is coming through the closet quite adequately, with the whole apartment being covered with at least 50 Mbit/s speeds, and over 300 Mbit/s when near the closet, which is about the maximum speed that I can achieve from the access point in ideal conditions.

2 weeks ago 19 votes
The coffee machine ran out of memory

After looking into an incident involving Kubernetes nodes running out of memory, I took a trip to the office kitchen to take a break and get a cup of the good stuff. My teammate got their drink first, and then it was my turn. Why is there a Windows 98 themed pop-up on the screen? I wanted to get my coffee, so I tapped on the small OK button. That may have forced the poor coffee machine to start swapping, for which I felt a little bit guilty. The UI was catching up with previous animations, and I got to the drink selection. None of the buttons worked. I reckon something critical crashed in the background. After looking into an incident involving a coffee machine running out of memory, I took a trip to the other office kitchen to take a break and get a cup of the good stuff. That one was fine. Guess it ran on something other than Java. laugh_track.mp3

a month ago 22 votes
About the time I trashed my mother's laptop

Around 2003, my mother had a laptop: the Compaq Armada 1592DT. It ran Windows Me, the worst Windows to ever exist, and had a whopping 96 MB of RAM and a 3 GB hard drive. My mother used it for important stuff, and I played games on it. Given the limitations of the 3 GB hard drive, this soon led to a conflict: there was no room to store any new games! I did my best to make additional room by running the disk cleaner utility, disabling unnecessary Windows features and deleting some PDF catalogues that my mother had downloaded, but there was still a constant lack of space. Armed with a lack of knowledge about computers, I went further and found a tool that promised to make more room on the hard drive. I can’t remember what it was, but it had a nice graphical user interface where the space on the drive was represented as a pie chart. To my amazement, I could slide that pie chart to make it so that 90% of the drive was free space! I went full speed ahead with it. What followed was a crash, and upon rebooting I was presented with a black screen. Oops. My mother ended up taking it to a repair shop for 1200 EEK, which was a lot of money at the time. The repair shop ended up installing Windows 98 SE on it, which felt like a downgrade at the time, but in retrospect it was an improvement over Windows Me. I had no idea what I was doing back then, but I assume that the tool I was playing with was some sort of a partition manager that had no safeguards in place to avoid shrinking and reformatting operating system partitions. Or if it did, then it made ignoring the big warning signs way too easy. Still 100% user error on my part. If only I had known that reinstalling Windows is a relatively simple operation; it took a solid 4-5 years until I did my first installation of Windows all by myself.

a month ago 26 votes
Fairphone Fairbuds XL review: admirable goals, awful product

I bought the Fairphone Fairbuds XL with my own money at a recent sale for 186.75 EUR, plus 15 EUR for shipping to Estonia. The normal price for these headphones is 239 EUR. This post is not sponsored. I admire what Fairphone wants to achieve, even going as far as getting the Fairphone 5 as a replacement for my iPhone X. Failing to repair my current headphones, I went ahead and decided to get the Fairphone Fairbuds XL, as they also advertise an active noise-cancelling feature and I like the Fairphone brand. Disclaimer: this review is going to be entirely subjective and based on my opinions and experiences with other audio products in the past. I also have tinnitus.1 I consulted the rtings.com review before purchasing the product to get an idea about what to expect as a consumer.

The comparison headphones

The main point of comparison for this review is going to be the Sony WH-1000XM3, which are premium high-end wireless Bluetooth headphones with active noise-cancelling (before that feature broke). These headphones retailed at a higher price during 2020 (about 300-400 EUR), so they are technically a tier above the Fairbuds XL, but given that their successor, the WH-1000XM4, can be bought for 239 EUR new (and often about 200-ish EUR on sale!), it is a fair comparison in my view. After I replaced the ear cushions on my Sony WH-1000XM3 headset, the active noise-cancelling feature started being flaky (popping and loud noises occurring with NC on). No amount of cleaning or calibrating fixed it, and even the authorized repair shop could not do anything about it. I diagnosed the issue to be with the internal noise-cancelling microphones and found that these failing is a very common issue for these headsets, even for newer versions of them. I am unable to compare the active noise-cancelling performance side-by-side, but I can say that the NC performance on the Sony WH-1000XM3 was simply excellent when it did work, no doubt about it.

The Fairphone shop experience

The first issue I had with the product was actually buying it. For some reason, the form would not accept my legal name, which has the letter “Õ” in it, a common vowel in Estonia. Knowing how poorly JavaScript-based client-side validation can be built, I pulled a pro gamer move and copy-pasted my name into the form, which bypassed the faulty check altogether. A similar issue occurred with the address field, as we also have the letter “Ä” (and “Ö” and “Ü”, for that matter).

The name

I can understand why Fairphone went with the name “Fairbuds XL”; it kind of made sense in their audio product line, and Apple set a precedent with AirPods Max. However, there is such a big missed opportunity here: they could’ve called the product… Fairphones. Yes, it would cause some confusion with their other product line, which is the Fairphone, but at least I would find the name more amusing.

Packaging

The packaging for the headphones is quite similar to what you’d get with the Fairphone 5: lots of cardboard and seemingly no plastic or otherwise problematic materials. Aside from the headphones themselves, you also get a nice egg bag, meant to protect your headphones when travelling with them. It’s okay, but nothing special, and it won’t protect your headphones from physical damage should they fall or get thrown around in a backpack. The Sony headphones come with a solid hardcase, which has done a fantastic job of protecting the headphones over the last 4 years. Longevity of a device depends both on repairability and durability, which is why a hard case would benefit the Fairbuds XL a lot.

Factory defect

My experience with the Fairbuds XL was off to a rocky start. I noticed that the USB-C cable that connects both sides of the headphones was inserted incorrectly. The headphones worked fine, but you could feel the flat USB-C cable being twisted inside the headband. The fix was to carefully push the headband back, disconnect the USB-C cable from the headphones, flip the cable around and reconnect it. Not a good first impression, but at least the fix was simple enough.

Fit and feel

The Fairbuds XL are not as comfortable as the reference headphones. The ear cushions and headband are quite hard and not as soft as on the Sony WH-1000XM3. If you get the fit just right, then you probably won’t have issues with wearing these for a few hours at a time, but I found myself adjusting them often to stop them from hurting my ears and head even during a short test. The ear cups lack any kind of swiveling, which is likely contributing to the comparatively poor fit. Our ears are angled ever-so-slightly forwards, and the Sony WH-1000XM3 feel so much better on the ears as a result of their swiveling design. I also noticed that you can hear some components inside the headphones rattling when moving your head. This noise is very noticeable even during music playback, and you don’t need to move your head a lot to hear that rattling. In my view, this is a serious defect in the product. When the headphones are folded in, the USB-C cable gets bent in the process and forced against one of the ear cushions. I suspect that within months or years of use, either the cable will fail or the ear cushion will get a permanent imprint of the USB-C cable.

The sound

I’m not impressed with the sound that the Fairbuds XL produce. They are not in the same class as the Sony WH-1000XM3, with the default equalizer sounding incredibly bland. Most instruments and sounds are bland and not as clear; that’s the best way I can describe it. The Fairbuds app can be used to tune the sound via the equalizer, and out of all the presets I’ve found “Boston” to be the most pleasant one to use. Unfortunately the UI does not show how the presets customize the values in the equalizer, which makes tweaking a preset all that much harder. Compared to the Sony WH-1000XM3, I miss the crispy sound and the all-encompassing bass, which can really bring out all the satisfying details. Then again, having used the Sony headphones for almost 5 years at this point may also just mean that I have gotten used to how they sound.

Active noise-cancelling

The active noise-cancelling performance is nowhere near that of the Sony WH-1000XM3. The effect is very minor, and you’ll be hearing most of the surrounding sounds. Touching the active noise-cancelling microphones on the sides of the headphones will also make a loud sound inside the speaker, and walking around in a room will result in the headphones making wind noises. Because of this, I consider the active noise-cancelling functionality to be functionally broken.

Microphone quality

I used the Fairbuds XL in a work call, and based on feedback from other attendees, the microphone quality over Bluetooth can be categorized as barely passable, getting a solid 2 points out of 5. To be fair, Bluetooth microphone quality is also not great on the Sony WH-1000XM3, but compared to the Fairphone Fairbuds XL, they are still subjectively better.

Fairbuds app

The Fairbuds app is very simple, and you’d mainly want to use it for setting the equalizer settings and upgrading the firmware. The rest of the functionality seems to be a bunch of links to Fairphone articles and guides. The first time I installed the app, it told me that a firmware upgrade to version V90 was available. During the first attempt, the progress bar stopped. The second attempt almost reached the end and did not complain about a firmware upgrade being available after that. The third attempt came after I had reinstalled the app. And there it was, the version V90 update, again. This time it got stuck at 1%. I’m probably still on the older version of the firmware, but I honestly can’t tell.

Bluetooth multi-device connecting

This is a feature that I didn’t know I needed in my life. With the reference Sony WH-1000XM3, whenever I wanted to switch where I listen to music from, I had to disconnect from my phone and then reconnect on the desktop, which was an annoying and manual process. With the Fairbuds XL, I can connect the headphones to both my laptop and phone and play media wherever; the headphones will switch to whichever device I’m actually using! This, too, has its quirks, and there might be a small delay when playing media on the other device, but I’ve grown so accustomed to using this feature that I can’t imagine going back to anything else. This feature is not unique to the Fairbuds XL, as other modern wireless headphones are also likely to boast it, but this is the first time I’ve had the opportunity to try it out myself. It’s a tremendous quality-of-life improvement for me. However, this, too, is not perfect. If I have the headphones connected to my phone and laptop, and I change to headset mode on the laptop for a meeting, then the playback on the phone will be butchered until I completely disconnect the headphones from the laptop. This seems like a firmware issue to me.

The controls

The Fairbuds XL have one button and one joystick. The button controls the active noise-cancelling settings (NC on, Ambient sound, NC off), plus the Bluetooth pairing mode. The joystick is used to turn the device on, switch songs and control the volume, and likely some other settings that relate to accepting calls and the like. Coming from the Sony WH-1000XM3, I have to say that I absolutely LOVE having physical buttons again! It’s so much easier to change the volume level, skip songs and start/stop playback with a physical button compared to the asinine touch surface solution that Sony has going on. The joystick is not perfect; skipping a song can be a little bit tricky due to how the joystick is positioned, and you can’t always get a good grip because your fingers hit the rest of the headphone assembly. That’s the only concern I have with it. If the joystick were a little bit larger and concave, some of these actions would be easier for those of us with modest/large thumbs. The audio cue for skipping songs is a bit annoying and seemingly cannot be disabled. The sound effect resembles someone hitting a golf ball with a very poor driver. The ANC settings button is alright, but it’s not possible to quickly cycle between the three modes; you have to fully listen to the nice lady speaking before you can move on to the next setting. I wish that clicking the button in rapid succession would skip through the modes faster.

USB-C port functionality

I was curious to see if the Fairbuds XL worked as normal headphones if I just connected them to my PC using a USB-C cable. To my surprise, they did! The audio quality was not as good as with Bluetooth, and the volume controls depended on which virtual device you select in your operating system. The Sony WH-1000XM3 do not work like this; the USB-C port is for charging only as far as I’ve tested, but they do have an actual 3.5 mm port for wired use. If you connect a charging cable while the Fairbuds XL are connected over Bluetooth, they will pause momentarily and then continue playback while charging the battery. This is incredibly handy for a wireless device, especially in situations where you have an important meeting coming up and you’re just about to run out of battery. The Sony WH-1000XM3 will simply power off when you connect a charger cable, rendering them unusable while charging.

Annoying issues

For some reason, whenever I charge my Fairbuds XL, they magically turn on again and I have to shut them off a second time. I’m never quite sure if I’ve managed to shut the headphones off. They play the jingle that indicates they’re powered off, but then I come back later and find that they’re powered on again.

Customer care experience

I was so unhappy with the product that I tried out the refunding process for the Fairphone Fairbuds XL. I ordered the Fairbuds XL on 2025-02-10 and I received them on 2025-02-14, shipped to Estonia. According to Fairphone’s own materials, I can return the headphones without any questions asked, assuming that my use of them matches what can be done at a physical store:

For Fairphone Products, including gift cards, you purchased on the Fairphone Webshop, you have a legal right to change your mind within 14 days and receive a refund amounting to the purchase price of the products and the costs of delivery and return. You are entitled to cancel your purchase within fourteen (14) days from the day the products were delivered to you, without explanation and without any penalties. In the case of a Cool-off, Fairphone may reduce the refund of the purchase price (including delivery costs) to reflect any reduction in the value of the Products, if this has been caused by your handling them in a way which would not normally be permitted in a shop. This means You are entitled to turn on and inspect Your purchased device to familiarise yourself with its properties and ensure that it is working correctly – comparable to the conditions that are permitted within a shop.

I followed their instructions and filed a support ticket on 2025-02-16. On 2025-02-25, I had not yet received any contact from Fairphone, so I asked them again under the same ticket. On 2025-03-07, I received an automated message that apologized for the delay and asked me not to make any additional tickets on the matter. I’m still waiting for an update on the support ticket over a month later, while the headphones sit in their original packaging. Based on the experiences of others in the Fairphone community forum, it seems that unacceptably large delays in customer service are the norm for Fairphone. Fairphone, if you want to succeed as a company, you need to make sure that the one part of your company that directly interfaces with your actual paying customers is appropriately staffed and resourced. A bad customer support experience can turn off a brand evangelist overnight.

Closing thoughts

I want Fairphone to succeed in their mission, but products like these do not further the cause. The feature set of the Fairbuds XL seems competent, and I’m willing to give a pass on a few minor issues if the overall experience is good, but the unimpressive sound profile, broken active noise-cancelling mode, multiple quality issues and poor customer service mean that I can’t in good conscience recommend the Fairphone Fairbuds XL, not even on sale. Perhaps fewer resources should be spent on rebranding and more on engineering good products.

1. Remember dubstep being a thing? Yeah, so do I. That, plus a little bit of mandatory military service, can do a lot of damage to hearing. ↩︎

2 months ago 31 votes

More in technology

The Tandy Corporation, Part 2

Trash-80s get Colorful, and Trash-80s get into your pockets

8 hours ago 2 votes
Plus Post: Sony SMC-777C

One person, Eight languages. The 8bit.

yesterday 2 votes
Harpoom: of course the Apple Network Server can be hacked into running Doom

Can you get Doom running on a $10,000+ Apple server running IBM AIX? Of course you can. Well, you can now. Now, let's go ahead and get the grumbling out of the way. No, the ANS is not running Linux or NetBSD. No, this is not a backport of NCommander's AIX Doom, because that runs on AIX 4.3. The Apple Network Server could run no version of AIX later than 4.1.5 and there are substantial technical differences. (As it happens, the very fact it won't run on an ANS was what prompted me to embark on this port in the first place.) And no, this is not merely an exercise in flogging a geriatric compiler into building Doom Generic, though we'll necessarily do that as part of the conversion. There's no AIX sound driver for ANS audio, so this port is mute, but at the end we'll have a Doom executable that runs well on the ANS console under CDE and has no other system prerequisites. We'll even test it on one of IBM's PowerPC AIX laptops as well. Because we should.

The ANS was, almost by default, Apple's first true Unix server since the A/UX-based Workgroup Server 95, but IBM AIX has a long history of its own dating back to the 1986 IBM RT PC. That machine was based on the IBM ROMP CPU as derived from the IBM 801, generally considered the first modern RISC design. AIX started as an early port of UNIX System V Release 3 and is thus a true Unix descendent, but also merged in some code from BSD 4.2 and 4.3. The RT PC ran AIX v1 and v2 as its primary operating systems, though IBM also supported 4.3BSD (ported by IBM as the Academic Operating System) and a spin of Pick OS. Although a truly full-fledged workstation with aggressive support from IBM, it ended up a failure in the market due to comparatively poor performance and notorious problems with its floating point support. Nevertheless, AIX's workstation roots persisted even through the substantial rewrite that became version 3 in 1989, and it was likewise the primary operating system for IBM's next-generation technical workstations, now based on POWER. AIX 3 introduced AIXwindows, a licensed port of X.desktop from IXI Limited (later acquired by another licensee, SCO) initially based on X11R3 with Motif as the toolkit. In 1993 the "SUUSHI" partnership — named for its principals, Sun, Unix System Laboratories, the Univel joint initiative of Novell and AT&T, SCO, HP and IBM — negotiated an armistice in the Unix Wars, their previous hostilities now being seen as counterproductive against common enemy Microsoft. This partnership led to the Common Open Software Environment (COSE) initiative and the Common Desktop Environment (CDE), derived from HP VUE, which was also Motif-based. AIX might have been the next Mac OS. For that matter, OS/2 was still a thing on the desktop (as Warp 4) despite Workplace OS's failure, Ultimedia was a major IBM initiative in desktop multimedia, and the Common User Access model was part of CDE too. AIX 4 had multimedia capabilities as well through its own native port of Ultimedia, supporting applications like video capture and desktop video conferencing, and even featured several game ports IBM themselves developed — two for AIX 4.1 (Quake and Abuse) and one later for 4.3 (Quake II). The 4.1 game ports run well on ANS AIX with the Ultimedia libraries installed, though oddly Doom was never one of them. IBM cancelled all this with AIX 5L and never looked back. ANS "Harpoon" AIX saw only two standard releases, 4.1.4.1 and 4.1.5, prior to Gil Amelio cancelling the line in April 1997.

However, ANS AIX is almost entirely binary-compatible with regular 4.1 and there is pretty much no reason not to run 4.1.5, so we'll make that our baseline. Although AIX 4.3 was a big jump forward, for our purposes the major difference is support for X11R6, as 4.1 only supports X11R5. Upgrading the X11 libraries is certainly possible but leads to a non-standard configuration, and anyway, the official id Linux port by Dave Taylor hails from 1994, when many X11R5 systems would have still been out there. We would also rather avoid forcing people to install Ultimedia. There shouldn't be anything about basic Doom that would require anything more than the basic operating system we have. NCommander's AIX Doom port is based on Chocolate Doom, taking advantage of SDL 1.2's partial support for AIX. Oddly, the headers for the MIT Shared Memory Extension were reportedly missing on 4.3 despite the X server being fully capable of it, and he ended up cribbing them from a modern copy of Xorg to get it to build. Otherwise, much of his time was spent bringing in other necessary SDL libraries and getting the sound working, neither of which we're going to even attempt. Owing to the ANS' history as a heavily modified Power Macintosh 9500, it uses AWACS audio, for which no driver was ever written for AIX, and AIX 4.1 only supports built-in audio on IBM hardware. Until that changes or someone™ figures out an alternative, the most audio playback you'll get from Harpoon AIX is the server quacking on beeps (yes, I said quacking, the same as the Mac alert sound). However, Doom Generic is a better foundation for exotic Doom ports because it assumes very little about the hardware and has straight-up Xlib support, meaning we won't need SDL or even MIT-SHM. It also removes architecture-specific code and is endian-neutral, important because AIX to this day remains big-endian, though this is less of an issue with Doom specifically since it was originally written on big-endian NeXTSTEP 68K and PA-RISC.

We now need to install a toolchain, since Harpoon AIX does not include an xlC license, and I'd be interested to hear from anyone trying to build this with it. Although the venerable AIXPDSLIB archive formerly at UCLA has gone to the great bitbucket in the sky, there are some archives of it around and I've reposted the packages I personally kept for 4.1 and 3.2.5 on the Floodgap gopher server. The most recent compiler AIXPDSLIB had for 4.1 was gcc 2.95.2, though for fun I installed the slightly older egcs 2.91.66, and you will also need GNU make, for which 3.81 is the latest available. These compilers use the on-board assembler and linker. I did not attempt to build a later compiler with this compiler. It may work and you get to try that yourself. Optionally you can also install gdb 5.3, which I did to stomp out some glitches. These packages are all uncompressed and un-tarred from the root directory in place; they don't need to be installed through smit. I recommend symlinking /usr/local/bin/make as /usr/local/bin/gmake so we know which one we're using. Finally, we'll need a catchy name for our port. Originally it was going to be ANS Doom, but that sounded too much like Anus Doom, which I proffer as a free metal band name and I look forward to going to one of their concerts soon. Eventually I settled on Harpoom, which I felt was an appropriate nod to its history while weird enough to be notable.

All of my code is on GitHub along with pre-built binaries, and all development was done on stockholm, my original Apple Network Server 500 that I've owned continuously since 1998, with a 200MHz PowerPC 604e, 1MB of cache, 512MB of parity RAM and a single disk, here running a clean install of 4.1.5. Starting with Doom Generic out of the box, we'll begin with a Makefile to create a basic Doom that I can run over remote X for convenience. (Since the ANS runs big-endian, if you run a recent little-endian desktop as I do with my POWER9, you'll need to start your local X server with +byteswappedclients or a configuration file change, or the connection will fail.) I copied Makefile.freebsd and stripped it down; I also removed -Wl,-Map,$(OUTPUT).map from the link step in advance because AIX ld will barf on that. gmake understood the Makefile fine but the compile immediately bombed. It's time to get out that clue-by-four and start bashing the compiler with it.

There is, in fact, no inttypes.h or stdint.h on AIX 4.1. So let's create an stdint.h! We could copy it from somewhere else, but I wanted this to only specify what it needed to. After several false starts I arrived at a final draft, and we include that instead of inttypes.h. Please note this is only valid for 32-bit systems like this one. Obviously we'll also change the include from <stdint.h> to "stdint.h". doomtype.h has its own definition for a boolean. Despite this definition, undef isn't actually used in the codebase anywhere, and if C++ bool is available then it just typedefs it to boolean. But egcs and gcc come with their own definition in stdbool.h, which is almost identical. Since we know we don't really need undef, we comment out the old definition in doomtype.h, #include <stdbool.h> and just typedef bool boolean like C++.
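For a rough idea of what such a minimal 32-bit stdint.h could contain, here is a sketch. The contents are my own assumptions for an ILP32 host like AIX 4.1, not the actual header from the Harpoom repository:

/* stdint.h - hypothetical minimal stand-in for a 32-bit (ILP32) host
   such as AIX 4.1; a sketch only, the real Harpoom header may differ. */
#ifndef DOOM_STDINT_H
#define DOOM_STDINT_H

typedef signed char        int8_t;
typedef unsigned char      uint8_t;
typedef short              int16_t;
typedef unsigned short     uint16_t;
typedef int                int32_t;
typedef unsigned int       uint32_t;
/* long long is a GNU extension that egcs/gcc 2.95 already understand */
typedef long long          int64_t;
typedef unsigned long long uint64_t;

typedef int                intptr_t;   /* pointers are 32 bits here */
typedef unsigned int       uintptr_t;

#define INT16_MIN  (-32767 - 1)
#define INT16_MAX  32767
#define UINT16_MAX 65535U
#define INT32_MIN  (-2147483647 - 1)
#define INT32_MAX  2147483647
#define UINT32_MAX 4294967295U

#endif /* DOOM_STDINT_H */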
The col_t type is an AIX-specific problem that conflicts with AIX locales. Since col_t is only found in i_video.c, we'll just change it in four places to doomcol_t. The last problem was a bit of code at the end of I_InitGraphics(). Here we can cheat, being pre-C99, by merely removing the declaration. This is aided by the fact I_InitInput neither passes nor returns anything. The compiler accepted that. X11R5 does not support the X Keyboard Extension (Xkb). To make the compile go a bit farther I switched out X11/XKBlib.h for X11/keysym.h. We're going to have some missing symbols at link time, but we'll deal with that momentarily. DG_Init() is naughty and didn't declare all its variables at the beginning. This version of the compiler can't cope with that and I had to rework the function. Although my revisions compiled, the link failed, as expected, on the missing Xkb symbols. XkbSetDetectableAutoRepeat tells the keyboard driver to not generate synthetic KeyRelease events for this X client when a key is auto-repeating. X11R5 doesn't have this capability, so the best we can do is XAutoRepeatOff, which still gives us single KeyPress and KeyRelease events, but that's because it disables key repeat globally. (A footnote on this later on.) There's not a lot we can do about that, though we can at least add an atexit handler to ensure the previous keyboard repeat status is restored on quit. Similarly, there is no exact equivalent for XkbKeycodeToKeysym, though we can sufficiently approximate it for our purposes with XLookupKeysym in both places. That was enough to link Doom Generic.

Nevertheless, with our $DISPLAY properly set and using the shareware WAD, it immediately fails with an error coming from a block of code in w_wad.c. With some debugging printfs, we discover the value of additional lumps we're being asked to allocate is totally bogus. This nonsense number is almost certainly caused by an unconverted little-endian value. Values in WADs are stored little-endian, even in the native Macintosh port. Doom Generic does have primitives for handling byteswaps, however, so it seems to have incorrectly detected us as little-endian. After some grepping, this detection quite logically comes from i_swap.h. As we have intentionally not enabled sound, for some reason (probably an oversight) this file ends up defaulting to little-endian. Ordinarily this would be a great place to use gcc's byteswap intrinsics, buuuuuuuuut this compiler doesn't have them (and I was pretty sure this would happen), so we're going to have to write some. Since they've been defined as quasi-functions, I decided to do this as actual inlineable functions with a sprinkling of inline PowerPC assembly. The semantics of static inline here are intended to take advantage of the way gcc of this era handled it. These snippets are very nearly the optimal code sequences, at least if the value is already in a register. If the value was being fetched from memory, you can do the conversion in one step with single instructions (lwbrx or lhbrx), but the way the code is structured we won't know where the value is coming from, so this is the best we can do for now. Atypically, these conversions must be signed. If you miss this detail and only handle the unsigned case, as yours truly did in a first draft, you get weird things: because the sign was not extended, 16-bit values started picking up wacky negative quantities as the most significant byte eventually became all ones, and 32-bit PowerPC GPRs are 32 bits, all the time. Properly extending the sign after conversion was enough to fix it.
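To illustrate the shape of these helpers, here is a sketch of signed byteswap routines written as static inline functions with gcc-style PowerPC inline assembly. The names and exact instruction choices are illustrative assumptions on my part, not code lifted from the Harpoom source:

/* Sketch of signed byteswap helpers; hypothetical names, illustrative
   instruction sequences (rotate-and-insert), assuming a 32-bit PowerPC
   target and the stdint.h typedefs from earlier. */
static inline int16_t doom_swap_s16(int16_t value)
{
    uint32_t in = (uint16_t)value;
    uint32_t out;

    /* byte-reverse the low halfword of 'in' into the low halfword of 'out' */
    __asm__("rlwinm %0,%1,8,16,23\n\t"
            "rlwimi %0,%1,24,24,31"
            : "=&r"(out)
            : "r"(in));
    return (int16_t)out;          /* the cast back restores the sign */
}

static inline int32_t doom_swap_s32(int32_t value)
{
    uint32_t in = (uint32_t)value;
    uint32_t out;

    /* classic three-instruction 32-bit byte reverse */
    __asm__("rlwinm %0,%1,8,0,31\n\t"
            "rlwimi %0,%1,24,0,7\n\t"
            "rlwimi %0,%1,24,16,23"
            : "=&r"(out)
            : "r"(in));
    return (int32_t)out;
}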
For the console we also want Doom Generic's 256-colour CMAP256 mode. Since this is a compile-time choice, and we want to support both remote X and console X, we'll just make two builds. I rebuilt the executable this time adding -DCMAP256 to the CFLAGS in the Makefile. The console X server only offers PseudoColor 8-bit visuals, which need a colourmap, yet we must not be creating one for the window nor updating it, and indeed there is none in the Doom Generic source code. Fortunately, there is in the original O.G. Linux Doom, so I cribbed some code from it. I added a new function DG_Upload_Palette to accept a 256-colour internal palette from the device-independent portion, turn it into an X Colormap, and push it to the X server with XStoreColors. Because the engine changes the entire palette every time (on damage, artifacts, etc.), we must set every colour change flag in the Colormap, which we do and cache on first run just like O.G. Linux Doom did. The last step is to tag the Colormap we create to the X window using XSetWindowColormap. (Much the same goes for the other AIX games, by the way.) Here are some direct grabs from the framebuffer using xwd -icmap -screen -root -out.

The Command keys map to nothing, not Meta, Super or even Hyper, in Harpoon's X server. Instead, when pressed or released each Command key generates an XEvent with an unexpected literal keycode of zero. After some experimentation, it turns out that no other key (tested with a full Apple Extended Keyboard II) on a connected ADB keyboard will generate this keycode. I believe this was most likely an inadvertent bug on Apple's part, but I decided to take advantage of it. I don't think it's a good idea to do this if you're running a remote X session, and the check is disabled there, but if you run the 256-colour version on the console, you can use the Command keys to strafe instead (Alt works in either version). Lastly, I added some code to check the default or available visuals so that you can't (easily) run the wrong version in the wrong place, and bumped the optimization level to -O3. And that's the game.

Here's a video of it on the console, though I swapped in an LCD display so that the CRT flicker won't set you off. This is literally me pointing my Pixel 7 Pro camera at the screen. It also runs on one of IBM's PowerPC AIX laptops, a RISC ThinkPad-like laptop that isn't, technically, a ThinkPad. You might see this machine in a future entry. Precompiled builds for both 24-bit and 8-bit colour are available on GitHub. Like Doom Generic and the original Doom, Harpoom is released under the GNU General Public License v2.

yesterday 4 votes
Embedding Godot games in iOS is easy

Recently there have been very exciting developments in the Godot game engine that have allowed really easy and powerful integration into an existing, normal iOS or Mac app. I couldn’t find a lot of documentation or discussion about this, so I wanted to shine some light on why this is so cool, and how easy it is to do!

What’s Godot?

For the uninitiated, Godot is an engine for building games; other common ones you might know of are Unity and Unreal Engine. It’s risen in popularity a lot over the last couple of years due to its open nature: it’s completely open source, MIT licensed, and worked on in the open. But beyond that, it’s also a really well made tool for building games (both 2D and 3D), with a great UI, beautiful iconography, a ton of tutorials and resources, and as a bonus, it’s very lightweight. I’ve had a lot of fun playing around with it (considering potentially integrating it into Pixel Pals), and while Unity and Unreal Engine are also phenomenal tools, Godot has felt lightweight and approachable in a really nice way. As an analogy, Godot feels closer to Sketch and Figma whereas Unity and Unreal feel more like Photoshop/Illustrator or other kinda bulky Adobe products. Even Apple has taken interest in it, contributing a substantial pull request for visionOS support in Godot.

Why use it with iOS?

You’ve always been able to build a game in Godot and export it to run on iOS, but recently, thanks to advancements in the engine and work by amazing folks like Miguel de Icaza, you can now embed a Godot game in an existing normal SwiftUI or UIKit app just as you would an extra UITextView or ScrollView. Why is this important? Say you want to build a game or experience, but you don’t want it to feel just like another port; you want it to integrate nicely with iOS and feel at home there through use of some native frameworks and UI here and there to anchor the experience (share sheets, local notifications, a simple SwiftUI sheet for adding a friend, etc.). Historically your options have been very limited or difficult. You no longer have to have “a Godot game” or “an iOS app”, you can have the best of both worlds: a fun game built entirely in Godot, while having your share sheets, Settings screens, your paywall, home screen widgets, onboarding, iCloud sync, etc. all in native Swift code, dynamically choosing which tool you want for the job. (Again, this was technically possible before and with other engines, but was much, much more complicated. Unity’s in particular seems to have been last updated during the first Obama presidency.) And truly, this doesn’t only benefit “game apps”. Heck, if the user is doing something that will take a while to complete (uploading a video, etc.) you could give them a small game to play in the interim. Or just for some fun you could embed a little side scroller easter egg in one of your Settings screens to delight spelunking users. Be creative!

SpriteKit?

A quick aside. It wouldn’t be an article about game dev on iOS without mentioning SpriteKit, Apple’s native 2D game framework (Apple also has SceneKit for 3D). SpriteKit is well done, and actually what I built most of Pixel Pals in. But it has a lot of downsides versus a proper, dedicated game engine:

- Godot has a wealth of tutorials on YouTube and elsewhere, and bustling Discord communities for help, whereas SpriteKit, being a lot more niche, can be quite hard to find details on.
- The obvious one: SpriteKit only works on Apple platforms, so if you want to port your game to Android or Windows you’re probably not going to have a great time, whereas Godot is fully cross platform.
- Godot, being a full-on game engine, has a lot more tools for game development that can be handy, from animation tools, to sprite sheet editors, controls that make experimenting a lot easier, handy tools for creating shaders, and so much more than I could hope to go over in this article. If you ever watch a YouTube video of someone building a game in a full engine, the wealth of tools they have for speeding up development is bonkers.
- Godot is updated frequently by a large team of employees and volunteers; SpriteKit conversely isn’t exactly one of Apple’s most loved frameworks (I don’t think it’s been mentioned at WWDC in years) and kinda feels like something Apple isn’t interested in putting much more work into. Maybe that’s because it does everything Apple wants and is considered “finished” (if so I think that would be incorrect, see the previous point for many things that it would be helpful for SpriteKit to have), but if you were to encounter a weird bug I’d feel much better about the likelihood of it getting fixed in Godot than in SpriteKit.

I’m a big fan of using the right tool for the job. For iOS apps, most of the time that’s building something incredible in SwiftUI and UIKit. But for building a game, be it small or large, using something purpose built to be incredible at that seems like the play to me, and Godot feels like a great candidate there.

Setup

Simply add the SwiftGodotKit package to your Xcode project by selecting your project in the sidebar, ensuring your project is selected in the new sidebar, selecting the Package Dependencies tab, clicking the +, then pasting the GitHub link. After adding it, you will also need to select the target that you added it to in the sidebar, select the Build Settings tab, then select “Other Linker Flags” and add -lc++. Lastly, with that same target, under the General tab add MetalFX.framework to Frameworks, Libraries, and Embedded Content. (Yeah you got me, I don’t know why we have to do that.) After that, you should be able to import SwiftGodotKit.

Usage

Now we’re ready to use Godot in our iOS app! What excites me most, and what I want to focus on, is embedding an existing Godot game in your iOS app and communicating back and forth with it from your iOS app. This way, you can do the majority of the game development in Godot without even opening Xcode, and then sprinkle in delightful iOS integration by communicating between iOS and Godot where needed. To start, we’ll build a very simple game called IceCreamParlor, where we select from a list of ice cream options in SwiftUI, which then gets passed into Godot. Godot will have a button the user can tap to send a message back to SwiftUI with the total amount of ice cream. This will not be an impressive “game” by any stretch of the imagination, but it should be easy to set up and understand the concepts so you can apply them to an actual game. To accomplish our communication, in essence we’ll be recreating iOS’ NotificationCenter to send messages back and forth between Godot and iOS, and like NotificationCenter, we’ll create a simple singleton to accomplish this.

Those messages will be sent via Signals. This is Godot’s system for, well, signaling that an event occurred, and it can be used to signify everything from a button press, to a player taking damage, to a timer ending. Keeping with the NotificationCenter analogy, this would be the Notification that gets posted (except in Godot, it’s used for everything, whereas in iOS land you really wouldn’t use NotificationCenter for a button press). And similar to Notification, which has a userInfo field to provide more information about the notification, Godot signals can also take an argument that provides more information. (For example, if the notification was “player took damage” the argument might be an integer that includes how much damage they took.) Like userInfo, this is optional, however, and you can also fire off a signal with no further information, something like “userUnlockedPro” for when they activate Pro after your SwiftUI paywall. For our simple example, we’re going to send a “selectedIceCream” signal from iOS to Godot, and an “updatedIceCreamCount” signal from Godot to iOS. The former will have a string argument for which ice cream was selected, and the latter will have an integer argument with the updated count.

Setting up our Godot project

Open Godot.app (available to download from their website) and create a new project. I’ll type in IceCreamParlor, choose the Mobile renderer, then click Create. Godot defaults to a 3D scene, so I’ll switch to 2D at the top, and then in the left sidebar click 2D Scene to create that as our root node. I’ll right-click the sidebar to add a child node, and select Label. We’ll set the text to “Ice cream:”. In the right sidebar, we’ll go to Theme Overrides and increase the font size to 80 to make it more visible, and we’ll also rename it in the left sidebar from Label to IceCreamLabel. We’ll also do the same to add a Button to the scene, which we’ll call UpdateButton and set its text to “Update Ice Cream Count”. If you click the Play button in the top right corner of Godot, it will run and you can click the button, but as of now it doesn’t do anything. We’ll select our root node (Node2D) in the sidebar, right click, and select “Attach Script”. Leave everything as default, and click Create. This will now present us with an area where we can actually code in GDScript, and we can refer to the objects in our scene by prefixing their name with a dollar sign. Inside our script, we’ll implement the _ready function, which is essentially Godot’s equivalent of viewDidLoad, and inside we’ll connect to the simple signal we discussed earlier. We’ll do this by grabbing a reference to our singleton, then referencing the signal we want, then connecting to it by passing a function we want to be called when the signal is received. And of course the function takes a String as a parameter because our signal includes what ice cream was selected.

extends Node2D

var ice_cream: Array[String] = []

func _ready() -> void:
    var singleton = Engine.get_singleton("GodotSwiftMessenger")
    singleton.ice_cream_selected.connect(_on_ice_cream_selected_signal_received)

func _on_ice_cream_selected_signal_received(new_ice_cream: String) -> void:
    # We received a signal! Probably should do something…
    pass

Note that we haven’t actually created the singleton yet, but we will shortly. Also note that normally in Godot you have to declare custom signals like the ones we’re using, but we’re going to declare them in Swift. As long as they’re declared somewhere, Godot is happy!

Let’s also hook up our button by going back to our scene, selecting our button in the canvas, selecting the “Node” tab in the right sidebar, and double-clicking the pressed() option. We can then select that same Node2D script and name the function _on_update_button_pressed to add a function that executes when the button is pressed (fun fact: the button being pressed event is also powered by signals).

func _on_update_button_pressed() -> void:
    pass

Setting up our iOS/Swift project

Let’s jump over to Xcode and create a new SwiftUI project there as well, also calling it IceCreamParlor. We’ll start by adding the Swift package for SwiftGodotKit to Swift Package Manager, add -lc++ to our “Other Linker Flags” under “Build Settings”, add MetalFX, then go to ContentView.swift and add import SwiftGodotKit at the top. From here, let’s create a simple SwiftUI view so we can choose from some ice cream options.

var body: some View {
    HStack {
        Button {
        } label: {
            Text("Chocolate")
        }
        Button {
        } label: {
            Text("Strawberry")
        }
        Button {
        } label: {
            Text("Vanilla")
        }
    }
    .buttonStyle(.bordered)
}

We’ll also create a new file in Xcode called GodotSwiftMessenger.swift. This will be where we implement our singleton that is akin to NotificationCenter.

import SwiftGodot

@Godot
class GodotSwiftMessenger: Object {
    public static let shared = GodotSwiftMessenger()

    @Signal var iceCreamSelected: SignalWithArguments<String>
    @Signal var iceCreamCountUpdated: SignalWithArguments<Int>
}

We first import SwiftGodot (minus the Kit), essentially because this part is purely about interfacing with Godot through Godot, and doesn’t care about whether or not it’s embedded in an iOS app. For more details on SwiftGodot see its section below. Then, we annotate our class with the @Godot Swift Macro, which basically just says “Hey, make Godot aware that this class exists”. The class is a subclass of Object, as everything in Godot needs to inherit from Object; it’s essentially the parent class of everything. Following that is your bog standard Swift singleton initialization. Then, with another Swift Macro, we annotate the variables we want to be our signals, which signifies to Godot that they are Signals. You can either specify the type as Signal or SignalWithArguments<T> depending on whether or not the specific signal also sends any data alongside it. Both of our signals carry data: iceCreamSelected includes a string for which ice cream was selected, and iceCreamCountUpdated includes an integer with the updated count. Note that we used “ice_cream_selected” in Godot but “iceCreamSelected” in Swift. This is because the underscore convention is used in Godot, and SwiftGodotKit will automatically map the camelCase Swift convention to it.

Now we need to tell Godot about this singleton we just made. We want Godot to know about it as soon as possible, otherwise if things aren’t hooked up, Godot might emit a signal that we wouldn’t receive in Swift, or vice-versa. So, we’ll hook it up very early in our app cycle. In SwiftUI, you might do this in the init of your main App struct as I’ll show below, and in UIKit in applicationDidFinishLaunching.

@main
struct IceCreamParlor: App {
    init() {
        initHookCb = { level in
            guard level == .scene else { return }
            register(type: GodotSwiftMessenger.self)
            Engine.registerSingleton(name: "GodotSwiftMessenger", instance: GodotSwiftMessenger.shared)
        }
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

In addition to the boilerplate code Xcode gives us, we’ve added an extra step to the initializer, where we set a callback on initHookCb. This is just a callback that fires as Godot is set up, and it specifies what level of setup has occurred. We want to wait until the scene level is reached, which means the game is ready to go (you could set it up at an even earlier level if you see that as beneficial). Then, we just tell Godot about this type by calling register, and then we register the singleton itself with a name we want it to be accessible under. Again, we want to do this early: if Godot were already set up in our app and only then we set initHookCb, its contents would never fire and thus we wouldn’t register anything. But don’t worry, this hook won’t fire until we first initialize our Godot game in iOS ourselves, so as long as this code is called before then, we’re golden.

Lastly, everything is registered in iOS land, but there’s still nothing that emits or receives signals. Let’s change that by going to ContentView.swift and changing our body to the following:

import SwiftUI
import SwiftGodotKit
import SwiftGodot

struct ContentView: View {
    @State var totalIceCream = 0
    @State var godotApp: GodotApp = GodotApp(packFile: "main.pck")

    var body: some View {
        VStack {
            GodotAppView()
                .environment(\.godotApp, godotApp)
            Text("Total Ice Cream: \(totalIceCream)")
            HStack {
                Button {
                    GodotSwiftMessenger.shared.iceCreamSelected.emit("chocolate")
                } label: {
                    Text("Chocolate")
                }
                Button {
                    GodotSwiftMessenger.shared.iceCreamSelected.emit("strawberry")
                } label: {
                    Text("Strawberry")
                }
                Button {
                    GodotSwiftMessenger.shared.iceCreamSelected.emit("vanilla")
                } label: {
                    Text("Vanilla")
                }
            }
            .buttonStyle(.bordered)
        }
        .onAppear {
            GodotSwiftMessenger.shared.iceCreamCountUpdated.connect { newTotalIceCream in
                totalIceCream = newTotalIceCream
            }
        }
    }
}

There’s quite a bit going on here, but let’s break it down because it’s really quite simple. We have two new state variables. The first is to keep track of the ice cream count. Could we just do this ourselves purely in SwiftUI? Totally, but for fun we’re going to be relying on Godot to keep us updated there, and we’ll just reflect that in SwiftUI to show the communication. Secondly, and more importantly, we need to declare a variable for our actual game file so we can embed it. We do this embedding at the top of the VStack by creating a GodotAppView, a handy SwiftUI view we can now leverage, and we do so by just setting its environment variable to the game we just declared. Then, we change our buttons to actually emit the selections via signals, and when the view appears, we make sure we connect to the signal that keeps us updated on the count so we can reflect that in the UI. Note that we don’t also connect to the iceCreamSelected signal, because we don’t care to receive it in SwiftUI; we’re just firing that one off for Godot to handle.

Communicating

Let’s update our GDScript in Godot to take advantage of these changes.

func _on_ice_cream_selected_signal_received(new_ice_cream: String) -> void:
    ice_cream.append(new_ice_cream)
    $IceCreamLabel.text = "Ice creams: " + ", ".join(ice_cream)

func _on_update_button_pressed() -> void:
    var singleton = Engine.get_singleton("GodotSwiftMessenger")
    singleton.ice_cream_count_updated.emit(ice_cream.size())

Not too bad! We now receive the signal from SwiftUI and update our internal state in Godot accordingly, as well as the UI, by turning our ice cream into a comma-separated list. And when the user taps the update button, we send (emit) a signal back to SwiftUI with the updated count.

Running

To actually see this live, first make sure you have an actual iOS device plugged in. Unfortunately Godot doesn’t work with the iOS simulator. Secondly, in Godot, select the Project menu bar item, then Export, then click the Add button and select “iOS”. This will bring you to a screen with a bunch of options, but my understanding is that most of this only applies if you’re building your app entirely in Godot: you can plug in all the things you’d otherwise plug into Xcode here instead, and Godot will handle them for you. That doesn’t apply to us; we’re going to do all that normally in Xcode anyway. We just want the game files, so ignore all that and select “Export PCK/ZIP…” at the bottom. It’ll ask you where you want to save it (I just keep it in the Godot project directory); make sure “Godot Project Pack (*.pck)” is selected in the dropdown, and then save it as main.pck. That’s our “game” bundled up, as meager as it is! We’ll then drop that into Xcode, making sure to add it to our target, and then we can run it on the device! Here we’ll see that choosing the ice cream flavor at the bottom in SwiftUI beams it into the Godot game that’s just chilling like a SwiftUI view, and then we can tap the update button in Godot land to beam the new count right back to SwiftUI to be displayed. Not exactly a AAA game, but enough to show the basics of communication 😄 Look at you go! Take this as a leaping off point for all the cool SwiftUI and Godot interoperability that you can accomplish, be it tapping a Settings icon in Godot to bring up a beautifully designed, native SwiftUI settings screen, or confirming to your game that the user upgraded to the Pro version through your SwiftUI paywall.

Bonus: SwiftGodot (minus the “Kit”)

An additional fun option (that sits at the heart of SwiftGodotKit) is SwiftGodot, which allows you to actually build your entire Godot game with Swift as the programming language if you so choose. Swift for iOS apps, Swift on the server, Swift for game dev. Swift truly is everywhere. For me, I’m enjoying playing around in GDScript, which is Godot’s native programming language, but it’s a really cool option to know about.

Embed size

A fear might be that embedding Godot into your app might bloat the binary and result in an enormous app download size. Godot is very lightweight, though: adding it to your codebase adds a relatively meager (at least by 2025 standards) 30MB to your binary size. That’s a lot larger than SpriteKit’s 0MB, but for all the benefits Godot offers that’s a pretty compelling trade. (30MB was measured by handy blog sponsor, Emerge Tools.)

Tips

Logging

If you log something in Godot/GDScript via print("something"), it will also print to the Xcode console. Handy!

Quickly embedding the pck into iOS

Exporting the pck file from Godot to Xcode is quite a few clicks, so if you’re doing it a lot it would be nice to speed that up. We can use the command line to make this a lot nicer. Godot.app also has a headless mode you can use by going inside the .app file, then Contents > MacOS > Godot. But typing the full path to that binary is no fun, so let’s symlink the binary to /usr/local/bin.

sudo ln -s "/Applications/Godot.app/Contents/MacOS/Godot" /usr/local/bin/godot

Now we can simply type godot anywhere in the Terminal to either open the Godot app, or we can use godot --headless for some command line goodness. My favorite way to do this is to run something like the following within your Godot project directory:

godot --headless --export-pack "iOS" /path/to/xcodeproject/target/main.pck

This will handily export the pck and add it to our Xcode project, overwriting any existing pck file, from which point we can simply compile our iOS app.

Wrapping it up

I really think Godot’s new interoperability with iOS is an incredibly exciting avenue for building games on iOS, be it a full-fledged game or a small little easter egg integrated into an existing iOS app, and hats off to all the folks who did the hard work getting it working. Hopefully this serves as an easy way to get things up and running! It might seem like a lot at first glance, but most of the code shown above is just boilerplate to get an example Godot and iOS project up and running; the actual work to embed a game and communicate across them is so delightfully simple! (Also big shout out to Chris Backas and Miguel de Icaza for help getting this tutorial off the ground.)

2 days ago 5 votes
Precision Clock Mk IV

[Hardware] GPS synchronised, millisecond precision, automatic timezones and more!

2 days ago 4 votes