I’ve changed my home server setup a lot over the past decade, mainly because I keep changing the goals. I’ve now realized why that keeps happening: I want the perfect home server. What is the perfect home server? I’d phrase it like this: the perfect home server uses very little power, offers plenty of affordable storage and provides a lot of performance when it’s actually being relied upon. In my case, low power means less than 5 W while idling, storage means 10+ TB of redundant capacity for data resilience and integrity, and performance means about 4 modern CPU cores’ worth (low-to-midrange desktop CPU performance). I seem to only ever get one or two of these at most. Low power usage? Your performance will likely suffer, and you can’t run too many storage drives. You can run SSDs, but they are not affordable at higher capacities. Lots of storage? Well, there goes the low power consumption goal, especially if you run 3.5" hard drives. Lots of performance? Lots of power...
2 days ago


More from ./techtipsy

Turns out that I'm a 'prolific open-source influencer' now

Yes, you read that right. I’m a prolific open-source influencer now. Some years ago I set up a Google Alert with my name, for fun. Who knows what it might show one day? On 7th of February, it fired an alert. Turns out that my thoughts on Ubuntu were somewhat popular, and they ended up being ingested by an AI slop generator over at Fudzilla, with no links back to the source or anything.1 Not only that, but their choice of spicy autocomplete confabulation bot (a large language model) completely butchered the article, leaving out critical information, which led to one reader gloating about Windows. Not linking back to the original source? Not a good start. Misrepresenting my work? Insulting. Giving a Windows user the opportunity to boast about how happy they are with using it? Absolutely unacceptable. Here’s the full article in case they ever delete their poor excuse of a “news” “article”. Two can play at that game. ↩︎

2 weeks ago 15 votes
IODD ST400 review: great idea, good product, terrible firmware

I’ve written about abusing USB storage devices in the past, with a passing mention that I’m too cheap to buy an IODD device. Then I bought one. I’ve always liked the promise of tools like Ventoy: you only need to carry the one storage device that boots anything you want. Unfortunately I still can’t trust Ventoy, so I’m forced to look elsewhere.

The hardware

I decided to get the IODD ST400 for 122 EUR (about 124 USD) off of Amazon Germany, since it was for some reason cheaper than getting it from iodd.shop directly. SATA SSDs are cheap and plentiful, so the ST400 made the most sense to me. The device came with one USB cable, with type A and type C ends. The device itself has a USB type C port, which I like a lot. The buttons are functional and clicky, but incredibly loud.

Setting it up

Before you get started with this device, I highly recommend glancing over the official documentation. The text is poorly translated in some parts, but overall it gets the job done. Inserting the SSD was reasonably simple: it slotted in well and would not move around after assembly. Getting the back cover off was tricky, but I’d rather have that than deal with a loose back cover that comes off when it shouldn’t. The most important step is the filesystem choice. You can choose between NTFS, FAT32 or exFAT. Due to the 4 GB maximum file size on FAT32, you will probably want to go with either NTFS or exFAT. Once you have a filesystem on the SSD, you can start copying various installers and tools onto it and mount them! The interface is unintuitive. I had to keep the manual close when testing mine, but eventually I figured out what I can and cannot do.
Device emulation

Whenever you connect the IODD device to a powered-on PC, it presents itself as multiple devices:

normal hard drive: the whole IODD filesystem is visible here, and you can also store other files and backups here if you want to
optical media drive: this is where your installation media (ISO files) will end up, read-only
virtual drives (up to 3 at a time): VHD files that represent virtual hard drives, but are seen as actual storage devices on the PC

This combination of devices is incredibly handy. For example, you can boot an actual Fedora Linux installation as one of the virtual drives, and make a backup of the files on the PC right to the IODD storage itself. S.M.A.R.T. information also seems to be passed through properly for the disk that’s inside.

Tech tip: to automatically mount your current selection of virtual drives and ISO file at boot, hold down the “9” button for about 3 seconds. The button also has an exit logo on it. Without this step, booting an ISO or virtual drive becomes tricky, as you’ll have to spam the “select boot drive” key on the PC while navigating the menus on the IODD device to mount the ISO.

The performance is okay. The drive speeds are limited to SATA II, which means that read/write speeds cap out at about 250 MB/s. Latency will depend a lot on the drive, but it stays mostly in the sub-millisecond range on my SSD. The GNOME Disks benchmark does show a notable chunk of reads having a 5-millisecond latency. The drive does not seem to exhibit any throttling under sustained loads, so at least it’s better than a normal USB stick. The speeds seem to be the same for all emulated devices, with latencies and speeds being within spitting distance of each other.

The firmware sucks, actually

The IODD ST400 is a great idea that’s been turned into a good product, but the firmware is terrible enough to almost make me regret the purchase.
The choice of filesystems available (FAT32, NTFS, exFAT) is very Windows-centric, but at least it comes with the upside of being supported on most popular platforms, including Linux and Mac. Not great, not terrible.

The folder structure has some odd limitations. For example, you can only have 32 items within a folder. If you have more than that, you have to use nested folders. This sounds like a hard cap written somewhere within the device firmware itself. I’m unlikely to hit such limits myself, and it doesn’t seem to affect the actual storage; the device itself just isn’t able to handle that many files within a directory listing.

The most annoying issue has turned out to be defragmentation. In 2025! It’s a known limitation that’s handily documented in the IODD documentation. On Windows, you can fix it by using a disk defragmentation tool, which is really not recommended on an SSD. On Linux, I have not yet found a way to do that, so I’ve resorted to simply making a backup of the contents of the drive, formatting the disk, and copying it all back again. This is a frustrating issue that only comes up when you try to use a virtual hard drive. It would absolutely suck to hit this error while in the field.

The way virtual drives are handled is also less than ideal. You can only use fixed VHD files that are not sparse, which seems to again be a limitation of the firmware.

Tech tip: if you’re on Linux and want to convert a raw disk image (such as a disk copied with dd) to a fixed VHD file, you can use a command like this one:

qemu-img convert -f raw -O vpc -o subformat=fixed,force_size source.img target.vhd

The firmware really is the worst part of this device. What I would love to see is a device like IODD but with free and open source firmware. Ventoy has proven that there is a market for a solution that makes juggling installation media easy, but it can’t emulate hardware devices. An IODD-like device can.
Encryption and other features

I didn’t test these because I don’t really need them; I really don’t need to protect my Linux installers from prying eyes.

Conclusion

The IODD ST400 is a good device with a proven market, but the firmware makes me refrain from outright recommending it to everyone, at least not at this price. If it cost something like 30-50 EUR/USD, I would not mind the firmware issues at all.

3 weeks ago 14 votes
Feature toggles: just roll your own!

When you’re dealing with a particularly large service with a slow deployment pipeline (15-30 minutes) and a rollback delay of up to 10 minutes, you’re going to need feature toggles (some also call them feature flags) to turn those half-an-hour nerve-wracking major incidents into a small whoopsie-daisy that you can fix in a few seconds. Make a change, gate it behind a feature toggle, release, enable the feature toggle and monitor the impact. If there is an issue, you can immediately roll it back with one HTTP request (or database query 1). If everything looks good, you can remove the usage of the feature toggle from your code and move on with other work. Need to roll out the new feature gradually? Implement the feature toggle as a percentage and increase it as you go.

It’s really that simple, and you don’t have to pay 500 USD a month to get similar functionality from a service provider and make critical paths in your application depend on them.2 As my teammate once said, our service is perfectly capable of breaking down on its own.

All you really need is one database table containing the keys and values for the feature toggles, and two HTTP endpoints: one to GET the current value of a feature toggle, and one to POST a new value for an existing one. New feature toggles can be introduced using tools like Flyway or Liquibase, and the same method can be used for deleting them later on. You can also add convenience columns containing timestamps, such as created and modified, to track when each toggle was introduced and last changed.

However, there are a few considerations to take into account when setting up such a system. Feature toggles implemented as database table rows can work fantastically, but you should also monitor how often they get used. If you implement a feature toggle on a hot path in your service, then you can easily generate thousands of queries per second.
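To make the shape of this concrete, here is a minimal sketch of the one-table approach, using SQLite for brevity. The table name, columns and the percentage-rollout helper are my illustrative assumptions, not a prescribed schema; in a real service the two functions would sit behind the GET and POST endpoints.

```python
import sqlite3

def init_db(conn: sqlite3.Connection) -> None:
    # One table: toggle key, value, plus convenience timestamp columns.
    conn.execute("""
        CREATE TABLE IF NOT EXISTS feature_toggle (
            key      TEXT PRIMARY KEY,
            value    TEXT NOT NULL,
            created  TEXT DEFAULT CURRENT_TIMESTAMP,
            modified TEXT DEFAULT CURRENT_TIMESTAMP
        )
    """)

def get_toggle(conn, key: str, default: str = "off") -> str:
    # Backs the GET endpoint: read the current value of one toggle.
    row = conn.execute(
        "SELECT value FROM feature_toggle WHERE key = ?", (key,)
    ).fetchone()
    return row[0] if row else default

def set_toggle(conn, key: str, value: str) -> None:
    # Backs the POST endpoint: update an existing toggle's value.
    conn.execute(
        "UPDATE feature_toggle SET value = ?, modified = CURRENT_TIMESTAMP "
        "WHERE key = ?", (value, key)
    )

def enabled_for(conn, key: str, user_id: int) -> bool:
    # Gradual rollout: store a percentage (0-100) as the value and
    # bucket users deterministically by their id.
    return user_id % 100 < int(get_toggle(conn, key, "0"))

conn = sqlite3.connect(":memory:")
init_db(conn)
# Migration tools like Flyway/Liquibase would run an INSERT like this:
conn.execute("INSERT INTO feature_toggle (key, value) VALUES ('new_checkout', '25')")
set_toggle(conn, "new_checkout", "50")      # one "HTTP request" to roll forward
print(get_toggle(conn, "new_checkout"))     # current value: "50"
print(enabled_for(conn, "new_checkout", 42))  # 42 % 100 < 50 -> True
```

Rolling back is then just another set_toggle call; the deterministic user bucketing means a given user stays in or out of the rollout as the percentage only moves up.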
A properly set up feature toggles system can sustain that without any issues on any competent database engine, but you should still monitor the impact and remove unused feature toggles as soon as possible. For hot code paths (1000+ requests/second) you might be better off implementing feature toggles as application properties. There’s no call to the database and reading a static property is darn fast, but you lose the ability to update it while the application is running. Alternatively, you can rely on the same database-based feature toggles system and keep a cached copy in memory, refreshing it from time to time. Toggling won’t be as responsive, as it will depend on the cache expiry time, but the reduced load on the database is often worth it.

If your service receives contributions from multiple teams, or you have very anxious product managers that fill your backlog faster than you can say “story points”, then it’s a good idea to also introduce expiration dates for your feature toggles, with ample warning time to properly remove them. Using this method, you can make sure that old feature toggles get properly removed, as there is no better prioritization reason than a looming major incident. You don’t want them to stick around for years on end; that’s just wasteful and clutters up your codebase.

If your feature toggling needs are a bit more complicated, then you may need to invest more time in your DIY solution, or you can use one of the SaaS options if you really want to; just account for the added expense and reliance on yet another third-party service. At work, I help manage a business-critical monolith that handles thousands of requests per second during peak hours, and the simple approach has served us very well. All it took was one motivated developer and about a day to implement, document and communicate the solution to our stakeholders. Skip the latter two steps, and you can be done within two hours, tops.
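The cached-copy idea can be sketched as a small wrapper that serves toggle reads from memory and only hits the database when the cached entry is older than a TTL. The class, the fetch callable and the 30-second TTL are illustrative assumptions, not a specific implementation from the post.

```python
import time

class CachedToggles:
    """Serve toggle reads from a local copy; refresh stale entries from the DB."""

    def __init__(self, fetch_from_db, ttl_seconds: float = 30.0):
        self._fetch = fetch_from_db      # callable: key -> value, queries the DB
        self._ttl = ttl_seconds
        self._cache = {}                 # key -> (value, fetched_at)

    def get(self, key: str) -> str:
        entry = self._cache.get(key)
        now = time.monotonic()
        if entry is None or now - entry[1] > self._ttl:
            # Cache miss or stale entry: this is the only path that
            # actually queries the database.
            value = self._fetch(key)
            self._cache[key] = (value, now)
            return value
        return entry[0]

# Usage with a dict standing in for the feature toggle table:
db = {"new_checkout": "on"}
toggles = CachedToggles(lambda key: db[key], ttl_seconds=30.0)
print(toggles.get("new_checkout"))  # first read hits the "DB" and caches "on"
db["new_checkout"] = "off"
print(toggles.get("new_checkout"))  # still "on" until the TTL expires
```

This is the trade-off described above: a toggle flip takes up to one TTL to propagate, but a hot path doing thousands of reads per second generates at most one database query per key per TTL window.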
1. Letting inexperienced developers touch the production database is a fantastic way to take down your service, and a very expensive way to learn about database locks. ↩︎

2. I hate to refer to specific Hacker News comments like this, but there’s just something about paying 6000 USD a year for such a service that I just can’t understand. Has the Silicon Valley mindset gone too far? Or are US-based developers just way too expensive, resulting in these types of services sounding reasonable? You can hire a senior developer in Estonia for that amount of money for 2-3 weeks (including all taxes), and they can pop in and implement a feature toggles system in a few hours at most. The response comment with the status page link that’s highlighting multiple outages for LaunchDarkly is the cherry on top. ↩︎

3 weeks ago 14 votes
I'm done with Ubuntu

I liked Ubuntu. For a very long time, it was the sensible default option. Around 2016, I used the Ubuntu GNOME flavor, and after Canonical ditched the Unity desktop environment, GNOME became the default option. I was really happy with it, both for work and personal computing needs. Estonian ID card software was also officially supported on Ubuntu, which made it a good choice for family members. But then something changed.

Upgrades suck

Like many Ubuntu users, I stuck to the long-term support releases and upgraded every two years to the next major version. There was just one tiny little issue: every upgrade broke something. Usually it was a relatively minor issue, with some icons, fonts or themes being a bit funny. Sometimes things went completely wrong.

The worst upgrade was the one I did on my mother’s laptop. During the upgrade process from Ubuntu 20.04 to 22.04, everything blew up spectacularly. The UI froze and the machine was completely unresponsive. After a 30-minute wait and a forced restart, the installation was absolutely fucked. In frustration, I ended up installing Windows so that I don’t have to support Ubuntu.

Another family member, another upgrade. This is one that they did themselves on Lubuntu 18.04, upgrading to the latest version. The result: Firefox shortcuts stopped working, the status bar contained duplicate icons, and random errors popped up after logging in. After making sure that the ID card software works on Fedora 40, I installed that instead. All they need is a working browser, and that’s too difficult for Ubuntu to handle.

Snaps ruined Ubuntu

Snaps. I hate them. They sound great in theory, but the poor implementation and heavy-handed push by Canonical have been a mess. Snaps auto-update by default. Great for security1, but horrible for users who want to control what their personal computer is doing.
Snaps get forced upon users as more and more system components are switched from Debian-based packages to Snaps, which breaks compatibility and functionality and introduces a lot of new issues. You can upgrade your Ubuntu installation and then discover that your browser is now contained within a Snap, the desktop shortcut for it doesn’t work, and your government ID card no longer works for logging in to your bank.

Snaps also destroy productivity. A colleague was struggling to get any work done because the desktop environment on their Ubuntu installation was flashing certain UI elements, being unresponsive and blocking them from doing any work. Apparently the whole GNOME desktop environment is a Snap now, and that led to issues. The fix was super easy, barely an inconvenience:

roll back to the previous version of the GNOME Snap
restart
still broken
update to the latest version again
restart
still broken
restart again
it is fixed now

What was the issue? Absolutely no clue, but a day’s worth of developer productivity was completely wasted. Some of these issues have probably been fixed by now, but if I executed migration projects at my day job with a similar track record, I would be fired.2

Snaps done right: Flatpak

Snaps can be implemented in a way that doesn’t suck for end users. It’s called Flatpak. Flatpaks work reasonably well, you can update them whenever you want, and they are optional. Your Firefox installation won’t suddenly turn into a Flatpak overnight. On the Steam Deck, Flatpaks are the main distribution method for user-installed apps, and I don’t mind that at all. The only issue is the software selection: not every app is available as a Flatpak just yet.

Consider Fedora

Fedora works fine. It’s not perfect, but I like it. At this point I’ve used it for longer than Ubuntu, and unless IBM ruins it for all of us, I think it will remain a perfectly cromulent distro to get work done on.
Hopefully it’s not too late for Canonical to reconsider their approach to building a Linux distro.

1. The xz backdoor demonstrated that getting the latest versions of all software can also be problematic from the security angle. ↩︎

2. Technical failures themselves are not the issue, but not responding to users’ feedback and not testing things certainly is, especially if you keep repeatedly making the same mistake. ↩︎

a month ago 23 votes

More in technology

Humanities Crash Course Week 10: Greek Drama

Week 10 of the humanities crash course had me reading (and listening to) classic Greek plays. I also listened to the blues and watched a movie starring a venerable, recently departed actor. How do they connect? Perhaps they don’t. Let’s find out.

Readings

The plan for this week included five classic Greek tragedies and one comedy: Sophocles’s Oedipus Rex, Oedipus at Colonus, and Antigone, Aeschylus’s Agamemnon, Euripides’s The Bacchae, and Aristophanes’s Lysistrata.

The tragedies by Sophocles form a trilogy. Oedipus Rex is by far the most famous: the titular character discovers he’s not just responsible for his father’s death, but inadvertently married his widowed mother in its wake. Much sadness ensues. The other two plays continue the story. Oedipus at Colonus has him and his daughters seeking protection in a foreign land as his sons duke it out over his throne. In Antigone, Oedipus’s daughter faces the consequences of burying her brother after his demise in that struggle. In both plays, sadness ensues.

Agamemnon dramatizes a story we’ve already encountered in the Odyssey: the titular king returns home only to be betrayed and murdered by his wife and her lover. The motive? The usual: revenge, lust, power. Sadness ensues.

The Bacchae centers on the cult of the demigod Dionysus. He comes to Thebes to avenge a slanderous rumor and spread his own cult. Not recognizing him, King Pentheus arrests him and persecutes his followers, a group of women that includes Pentheus’s mother, Agave. In ecstatic frenzy, Agave and the women tear Pentheus apart. Again, not light fare.

Lysistrata, a comedy, was a respite. Looking to end the Peloponnesian War, a group of women led by the titular character convince the women of Greece to go on a sex strike until the men stop the fighting. For such an old play, it’s surprisingly funny. (More on this below.)

These plays are very famous, but I’d never read them.
This time, I heard dramatizations of Sophocles’s plays and an audiobook of The Bacchae, and read ebooks of the remaining two. The dramatizations were the most powerful and understandable, but reading Lysistrata helped me appreciate the puns.

Audiovisual

Music: Gioia recommended classic blues tunes. I listened to Apple Music collections for Blind Lemon Jefferson and Blind Willie Johnson. I also revisited an album of blues music compiled for Martin Scorsese’s film series, The Blues. My favorite track here is Lead Belly’s C.C. Rider, a song that’s lived rent-free in my brain the last several days.

Art: Gioia recommended looking at Greek pottery. I studied some of this in college and didn’t spend much time looking again.

Cinema: rather than something related to the readings, I sought out a movie starring Gene Hackman, who died a couple of weeks ago. I opted for Francis Ford Coppola’s THE CONVERSATION, which is about the ethics of privacy-invading technologies. Even though the movie is fifty-one years old, that description should make it clear that it’s highly relevant today.

Reflections

I was surprised by the freshness of the plays. Yes, most namechecks are meaningless without notes. (That’s an advantage books have over audiobooks.) But the stories deal with timeless themes: truth-seeking, repression, free will vs. predestination, the influence of religious belief on our actions, relations between the sexes, etc.

Unsurprisingly, some of these themes are also central to THE CONVERSATION. I sensed parallels between Oedipus and the film’s protagonist, Harry Caul. ChatGPT provided useful insights. (Spoilers here for both the play and the movie – but c’mon, these are old works!) Both characters investigate the truth only to find painful revelations about themselves. Both believe that gaining knowledge will help them control events – but their efforts only lead to self-destruction. Both misunderstand key pieces of evidence.
Both end up “isolated, ruined by their own knowledge, and stripped of their former identity.” (I liked how ChatGPT phrased this!) Both stories explore the limits of perception: it’s possible to see (and record) and remain ignorant of the truth. Heavy stuff – as is wont in drama.

But for me, the bigger surprise in exploring these works was Lysistrata. Humor is highly contextual: even contemporary stuff doesn’t play well across cultures. But this ancient Greek play is filled with randy situations and double entendres that are still funny. Much rides on the translation. The edition I read was translated by Jack Lindsay, and I marveled at his skills. It must’ve been challenging to get the rhymes and puns in and still make the story work. A note in the text mentioned that the Spartans in the story were translated to sound like Scots to make them relatable to the intended English audience. (!)

Obviously, none of these ancient texts I’ve been reading were written in English. That will change in the latter stages of the course. I’m wondering if I should read texts originally written in Spanish and Italian in those languages, since I can. (But what would that do to my notes and running interactions with the LLMs? It’s an opportunity to explore…)

Notes on Note-taking

Part of why I’m undertaking this course is to experiment with note-taking and LLMs. This week, I tried a few new things.

First, before reading each play, I read through its synopsis on Wikipedia. This helped me understand the narrative thread and themes and generally get oriented in unfamiliar terrain.

Second, I tried a new cadence for capturing notes. These are short plays; I read one per day. (Except The Bacchae, which I read over two days.) During my early morning journaling sessions, I wrote down a synopsis of the play I’d read the previous day. Then I asked GPT-4o for comments on the synopsis. The LLM invariably pointed out important things I’d missed.
The point wasn’t making more complete notes, but helping me understand and remember better by writing down my fresh memories and reviewing them through a “third party.” I was forced to be clear and complete, since I knew I’d be asking for feedback.

Third, I added new sections to my notes for each work. After the synopsis, I asked GPT-4o for an outline explaining why the work is considered important. I read these outlines and reflected on them. Then I asked for criticisms, both modern and contemporary, that could be leveled against these works.

Frankly, this is risky. One of my guidelines has been to stick to prompts where I can verify the LLM’s output. If I ask for a summary of a work I’ve just read, I’ll have a better shot at knowing whether the LLM is hallucinating. But in this case, I’m asking for stuff that I won’t be able to validate. Still, I’m not using these prompts to generate authoritative texts. Instead, the answers help me consider the work from different perspectives. The LLM helps me step outside my experience – and that’s one of the reasons for studying the humanities.

Up Next

Gioia scheduled Marcus Aurelius and Epictetus for week 11. I’ve read Meditations twice and loved it, and will revisit it now more systematically. But since I’m already familiar with this work, I’ll also spend more time with the Bible – the Book of Job, in particular. In addition to Job itself, I plan to read Mark Larrimore’s The Book of Job: A Biography, which explores its background. It’ll be the first time in the course that I read a work about a work. (As you may surmise, I’m keen on Job.) This will also be the first physical book I read in the course.

Otherwise, I’m sticking with Gioia’s recommendations. Check out his post for the full syllabus. Again, there’s a YouTube playlist for the videos I’m sharing here. I’m also sharing these posts via Substack if you’d like to subscribe and comment. See you next week!

7 hours ago 1 votes
Reading List 03/08/2025

China’s industrial diplomacy, streetlights and crime, deorbiting Starlink satellites, a proposed canal across Thailand, a looming gas turbine shortage, and more.

yesterday 1 votes
iPhone 16e review in progress: battery life

You can never do too much battery testing, but after a week with this phone I've got some impressions to share.

yesterday 2 votes
Real WordPress Security

One thing you’ll see on every host that offers WordPress is claims about how secure they are; however, they don’t put their money where their mouth is. When you dig deeper, you’ll find that if your site actually gets hacked, they’ll hit you with remediation fees that can range from hundreds to thousands of dollars. They may try … Continue reading Real WordPress Security →

yesterday 1 votes
Odds and Ends #61: Fake Woolly Mammoths

Plus why intelligence is social, Land Registry open data, and some completely invisible VFX

2 days ago 2 votes