Simple Does Not Mean Ugly 2019-03-26

I see new blog posts popping up now and again advocating for designers to keep their products as simple as possible - and I couldn’t agree more. A lot of designers tend to think they need to reinvent the wheel when it comes to UI concepts that are standard defaults and expected by most users. Not only does this add extra work to your design and development cycles, it also increases the potential for frustration for your users when they are using your product. Your job as a designer is to focus on the user experience journey and understand what those users expect to happen - not what you want to happen. This is a very delicate balance of design “give and take”, hence why simple designs always seem to work best. But simple does not mean “ugly”.

Ugly Simple

Anyone who has read some of my opinion pieces on here in regards to UI knows that I have a profound distaste for the overused “flat design” trend. Many designers consider this trend a clean and...
over a year ago

More from bt RSS Feed

Installing OpenBSD on Linveo KVM VPS

Installing OpenBSD on Linveo KVM VPS 2024-10-21

I recently came across an amazing deal for a VPS on Linveo. For just $15 a year they provide:

AMD KVM 1GB
1024 MB RAM
1 CPU Core
25 GB NVMe SSD
2000 GB Bandwidth

It’s a pretty great deal and I suggest you look more into it if you’re interested! But this post is more focused on setting up OpenBSD via the custom ISO option in the KVM dashboard. Linveo already provides several Linux OS options, along with FreeBSD by default (which is great!). Since there is no OpenBSD template we need to do things manually.

Getting Started

Once you have your initial VPS up and running, log in to the main dashboard and navigate to the Media tab. Under CD/DVD-ROM you’ll want to click “Custom CD/DVD” and enter the direct link to the install76.iso:

https://cdn.openbsd.org/pub/OpenBSD/7.6/amd64/install76.iso

The "Media" tab of the Linveo Dashboard. Use the official ISO link and set the Boot Order to CD/DVD.

Select “Insert”, then set your Boot Order to CD/DVD and click “Apply”. Once complete, restart your server.

Installing via VNC

With the server rebooting, jump over to Options and click on “Browser VNC” to launch the web-based VNC client. From here we will boot into the OpenBSD installer and get things going! Follow the installer as you normally would when installing OpenBSD (if you’re unsure, I have a step-by-step walkthrough) until you reach the IPv4 selection. At this point you will want to input your server’s IPv4 and IPv6 addresses, found under the Network section of your dashboard. Next you will want to set the IPv6 route to the first default option listed (not “none”). After that is complete, choose cd0 for your install media (don’t worry about http yet). Continue with the rest of the install (make users if desired, etc.) until it tells you to reboot the machine. Go back to the Linveo Dashboard, switch your Boot Order back to “Harddrive” and reboot the machine directly.

Booting into OpenBSD

Load into the VNC client again. If you did everything correctly you should be greeted with the OpenBSD login prompt. There are a few tweaks we still need to make, so log in as the root user.

Remember how we installed our sets directly from cd0? We’ll want to change that. Since we are running OpenBSD “virtually” through KVM, our target network interface will be vio0. Edit the /etc/hostname.vio0 file and add the following:

dhcp
!route add default <your_gateway_ip>

The <your_gateway_ip> can be found under the Network tab of your dashboard.

The next file we need to tweak is /etc/resolv.conf. Add the following to it:

nameserver 8.8.8.8
nameserver 1.1.1.1

These nameservers are based on your selected IPs under the Resolvers section of Network in the Linveo dashboard. Change these as you see fit, so long as they match what you place in the resolv.conf file.

Finally, the last file we need to edit is /etc/pf.conf. Like the others, add the following:

pass out proto { tcp, udp } from any to any port 53

Final Stretch

Now just reboot the server. Log back in as your desired user and everything should be working as expected! You can perform a simple test to check:

ping openbsd.org

This should work - meaning your network is up and running! Now you’re free to enjoy the beauty that is OpenBSD.

3 months ago 51 votes
Vertical Tabs in Safari

Vertical Tabs in Safari 2024-09-26

I use Firefox as my main browser (specifically the Nightly build) which has vertical tabs built-in. There are instances where I need to use Safari, such as debugging or testing iOS devices, and in those instances I prefer to have a similar experience to that of Firefox. Luckily, Apple has finally made it fairly straightforward to do so.

Click the Sidebar icon in the top left of the Safari browser
Right click and group your current tab(s) (I normally name mine something uninspired like “My Tabs” or simply “Tabs”)
For an extra “clean look”, remove the horizontal tabs by right clicking the top bar, selecting Customize Toolbar and dragging the tabs out

When everything is set properly, you’ll have something that looks like this:

One minor drawback is not having access to a direct URL input, since we have removed the horizontal tab bar altogether. Using a set of curated bookmarks could help avoid the need for direct input, along with setting our new tab page to DuckDuckGo or any other search engine.

4 months ago 55 votes
Build and Deploy Websites Automatically with Git

Build and Deploy Websites Automatically with Git 2024-09-20

I recently began the process of setting up my self-hosted[1] cgit server as my main code forge. Updating repos via cgit on NearlyFreeSpeech on its own has been simple enough, but it lacked the “wow-factor” of having some sort of automated build process. I looked into a bunch of different tools that I could add to my workflow and automate deploying changes. The problem was they all seemed to be fairly bloated or overly complex for my needs.

Then I realized I could simply use post-receive hooks, which are already built into git! You can’t get more simple than that… So I thought it would be best to document my full process. These notes are more for my future self when I inevitably forget this, but hopefully others can benefit from it!

Before We Begin

This “tutorial” assumes that you already have a git server set up. It shouldn’t matter what kind of forge you’re using, so long as you have access to the hooks/ directory and have the ability to write a custom post-receive script. For my purposes I will be running standard git via the web through cgit, hosted on NearlyFreeSpeech (FreeBSD based).

Overview

Here is a quick rundown of what we plan to do:

Write a custom post-receive script in the repo of our choice
Build and deploy our project when a remote push to master is made

Nothing crazy. Once you get the hang of things it’s really simple.

Prepping Our Servers

Before we get into the nitty-gritty, there are a few items we need to take care of first:

Your main git repo needs ssh access to your web hosting (deploy) server. Make sure to add your public key and run a connection test first (before running the post-receive hook) in order to approve the “fingerprinting”.
You will need to git clone your main git repo in a private/admin area of your deploy server. In the examples below, mine is cloned under /home/private/_deploys

Once you do both of those tasks, continue with the rest of the article!

The post-receive Script

I will be using my own personal website as the main project for this example. My site is built with wruby, so the build instructions are specific to that generator. If you use Jekyll or something similar, you will need to tweak those commands for your own purposes.

Head into your main git repo (not the cloned one on your deploy server), navigate under the hooks/ directory and create a new file named post-receive containing the following:

#!/bin/bash

# Get the branch that was pushed
while read oldrev newrev ref
do
    branch=$(echo $ref | cut -d/ -f3)

    if [ "$branch" == "master" ]; then
        echo "Deploying..."

        # Build on the remote server
        ssh user@deployserver.net << EOF
        set -e  # Stop on any error
        cd /home/private/_deploys/btxx.org
        git pull origin master
        gem install 'kramdown:2.4.0' 'rss:0.3.0'
        make build
        rsync -a build/* ~/public/btxx.org/
EOF
        echo "Build synced to the deployment server."
        echo "Deployment complete."
    fi
done

Let’s break everything down.

First we check if the branch being pushed to the remote server is master. Only if this is true do we proceed. (Feel free to change this if you prefer something like production or deploy)

if [ "$branch" == "master" ]; then

Then we ssh into the server (ie. deployserver.net) which will perform the build commands and also host these built files.

ssh user@deployserver.net << EOF

Setting set -e ensures that the script stops if any errors are triggered.

set -e  # Stop on any error

Next, we navigate into the previously mentioned “private” directory, pull the latest changes from master, and run the required build commands (in this case installing gems and running make build):

cd /home/private/_deploys/btxx.org
git pull origin master
gem install 'kramdown:2.4.0' 'rss:0.3.0'
make build

Finally, rsync is run to copy just the build directory to our public-facing site directory.

rsync -a build/* ~/public/btxx.org/

With that saved and finished, be sure to give this file proper permissions:

chmod +x post-receive

That’s all there is to it!

Time to Test!

Now make changes to your main git project and push those up into master. You should see the post-receive commands printing out into your terminal successfully. Now check out your website to see the changes. Good stuff.

Still Using sourcehut

My go-to code forge was previously handled through sourcehut, which will now be used for mirroring my repos and handling mailing lists (since I don’t feel like hosting something like that myself - yet!). This switch over was nothing against sourcehut itself but more of a “I want to control all aspects of my projects” mentality.

I hope this was helpful and please feel free to reach out with suggestions or improvements!

[1] By self-hosted I mean a NearlyFreeSpeech instance ↩

4 months ago 63 votes
Burning & Playing PS2 Games without a Modded Console

Burning & Playing PS2 Games without a Modded Console 2024-09-02

Important: I do not support pirating or obtaining illegal copies of video games. This process should only be used to copy your existing PS2 games for backup, in case of accidental damage to the original disc.

Requirements

Note: This tutorial is tailored towards macOS users, but most things should work similarly on Windows or Linux.

You will need:

An official PS2 game disc (the one you wish to copy)
A PS2 Slim console
An Apple device with an optical DVD drive (or a portable USB DVD drive)
Some time and a coffee! (or tea)

Create an ISO Image of Your PS2 Disc:

Insert your PS2 disc into your optical drive.
Open Disk Utility (Applications > Utilities)
In Disk Utility, select your PS2 disc from the sidebar
Click on the File menu, then select New Image > Image from [Disc Name]
Choose a destination to save the ISO file and select the format as DVD/CD Master
Name your file and click Save. Disk Utility will create a .cdr file, which is essentially an ISO file

Before we move on, we will need to convert that newly created cdr file into ISO. Navigate to the directory where the .cdr file is located and use the hdiutil command to convert the .cdr file to an ISO file:

hdiutil convert yourfile.cdr -format UDTO -o yourfile.iso

You’ll end up with a file named yourfile.iso.cdr. Rename it by removing the .cdr extension to make it an .iso file:

mv yourfile.iso.cdr yourfile.iso

Done and done.

Getting Started

For Mac and Linux users, you will need to install Wine in order to run the patcher:

# macOS
brew install wine-stable

# Linux (Debian)
apt install wine

Clone & Run the Patcher

Clone the FreeDVDBoot ESR Patcher:

git clone https://git.sr.ht/~bt/fdvdb-esr

Navigate to the cloned project folder:

cd /path/to/fdvdb-esr

Then run the executable:

wine FDVDB_ESR_Patcher.exe

Now you need to select your previously created ISO file, use the default Payload setting and then click Patch!. After a few seconds your file should be patched.

Burning Our ISO to DVD

It’s time for the main event! Insert a blank DVD-R into your disc drive and mount it. Then right click on your patched ISO file and run “Burn Disk Image to Disc...”. From here, you want to make sure you select the slowest write speed and enable verification.

Once the file is written to the disc and verified (verification might fail - it is safe to ignore) you can remove the disc from the drive.

Before Playing the Game

Make sure you change the PS2 disc speed from Standard to Fast in the main “Browser” setting before you put the game into your console. After that, enjoy playing your cloned PS2 game!

5 months ago 47 votes
"This Key is Useless Now. Discard?"

“This Key is Useless Now. Discard?” 2024-08-28

The title of this article probably triggers nostalgic memories for old school Resident Evil veterans like myself. My personal favourite in the series (not that anyone asked) was the original, 1998 version of Resident Evil 2 (RE2). I believe that game stands the test of time and is very close to a masterpiece. The recent remake lost a lot of the charm and nuance that made the original so great, which is why I consistently fire up the PS1 version on my PS2 Slim.

Resident Evil 2 (PS1) running on my PS2, hooked up to my Toshiba CRT TV.

But the point of this post isn’t to gush over RE2. Instead I would like to discuss how well RE2 handled its interface and user experience across multiple in-game systems.

HUD? What HUD?

Just like the first Resident Evil that came before it, RE2 has no in-game HUD (heads-up display) to speak of. It’s just your playable character and the environment. No ammo-counters. No health bars. No “quest” markers. Nothing.

This is how the game looks while you play. Zero HUD elements.

The game does provide you with an inventory system, which holds your core items, weapons and notes you find along your journey. Opening up this sub-menu allows you to heal, reload weapons, combine objects or puzzle items, or read through previously collected documents. Not only is this more immersive (HUDs don’t exist for us in the real world, we need to look through our packs as well…) it also gets out of the way.

The main inventory screen. Shows everything you need to know, only when you need it. (I can hear this screenshot...)

I don’t need a visual element in the bottom corner showing me a list of “items” I can cycle through. I don’t want an ammo counter cluttering up my screen with information I only need to see in combat or while manually reloading. If those are pieces of information I need, I’ll explicitly open and look for them. Don’t make assumptions about what is important to me on screen. Capcom took this concept of less visual clutter even further in regards to maps and the character health status.

Where We’re Going, We Don’t Need Roads

Mini-maps: a great deal of newer games come pre-packaged with a mini-map on the main interface. In certain instances this works just fine, but most are 100% UI clutter. Something to add to the screen. I can only assume some devs believe it is “helpful”. Most times it’s simply a distraction. Thank goodness most games allow you to disable them.

As for RE2, you collect maps throughout your adventure and, just like most other systems in the game, you need to consciously open the map menu to view them. You know, just like in real life. This creates a higher tension as well, since you need to constantly reference your map (on initial playthroughs) to figure out where the heck to go. You feel the pressure of someone frantically pulling out a physical map and scanning their surroundings. It also helps the player build a mental model in their head, thus providing even more of that sweet, sweet immersion.

The map of the Raccoon City Police Station.

No Pain, No Gain

The game doesn’t display any health bar or player status information. In order to view your current status (symbolized by “Fine”, “Caution” or “Danger”) you need to open your inventory screen. From here you can heal yourself (if needed) and see the status type change in real-time.

The "condition" health status. This is fine.

But that isn’t the only way to visually see your current status. Here’s a scenario: you’re traveling down a hallway, turn a corner and run right into the arms of a zombie. She takes a couple good bites out of your neck before you push her aside. You unload some handgun rounds into her and down she goes. As you run over her body she reaches out and chomps on your leg as a final “goodbye”. You break free and move along but notice something different in your character’s movement - they’re holding their stomach and limping.

Here we can see the character "Hunk" holding his stomach and limping, indicating an injury without the need for a custom HUD element.

If this was your first time playing, you would most likely instinctively open the inventory menu, where your character’s health is displayed, and (in this instance) be greeted with a “Caution” status. This is another example of subtle UX design. I don’t need to know the health status of my character until an action is required (in this example: healing). The health system is out of the way but not hidden. This keeps the focus on immersion, not baby-sitting the game’s interface.

A Key to Every Lock

Hey! This section is in reference to the title of the article. We made it!

…But yes, discarding keys in RE2 is a subtle example of fantastic user experience. As a player, I know for certain this key is no longer needed. I can safely discard it and free up precious space from my inventory. There is also a sense of accomplishment, a feeling of “I’ve completed a task” or an internal checkbox being ticked. Progress has been made!

Don’t overlook how powerful an interaction this little text prompt is. Ask any veteran of the series and they will tell you this prompt is almost euphoric.

The game's prompt asking if you'd like to discard a useless key. Perfection.

Inspiring Greatness

RE2 is certainly not the first or last game to implement these “minimal” game systems. A more “modern” example is Dead Space (DS), along with its very faithful remake. In DS the character’s health is displayed directly on the character model itself, and a similar inventory screen is used to manage items. An ammo-counter is visible but only when the player aims their weapon. Pretty great stuff and another masterpiece of survival horror.

In Dead Space, the character's health bar is set as part of their spacesuit.

The Point

I guess my main takeaway is that designers and developers should try their best to keep user experience intuitive. I know that sounds extremely generic but it is a lot more complex than one might think. Try to be as direct as possible while remaining subtle. It’s a delicate balance but experiences like RE2 show us it is attainable. I’d love to talk more, but I have another play-through of RE2 to complete…

5 months ago 43 votes

More in programming

Adding auto-generated cover images to EPUBs downloaded from AO3

I was chatting with a friend recently, and she mentioned an annoyance when reading fanfiction on her iPad. She downloads fic from AO3 as EPUB files, and reads it in the Kindle app – but the files don’t have a cover image, and so the preview thumbnails aren’t very readable:

She’s downloaded several hundred stories, and these thumbnails make it difficult to find things in the app’s “collections” view.

This felt like a solvable problem. There are tools to add cover images to EPUB files, if you already have the image. The EPUB file embeds some key metadata, like the title and author. What if you had a tool that could extract that metadata, auto-generate an image, and use it as the cover? So I built that.

It’s a small site where you upload EPUB files you’ve downloaded from AO3, the site generates a cover image based on the metadata, and it gives you an updated EPUB to download. The new covers show the title and author in large text on a coloured background, so they’re much easier to browse in the Kindle app:

If you’d find this helpful, you can use it at alexwlchan.net/my-tools/add-cover-to-ao3-epubs/

Otherwise, I’m going to explain how it works, and what I learnt from building it. There are three steps to this process:

Open the existing EPUB to get the title and author
Generate an image based on that metadata
Modify the EPUB to insert the new cover image

Let’s go through them in turn.

Open the existing EPUB

I’ve not worked with EPUB before, and I don’t know much about it. My first instinct was to look for Python EPUB libraries on PyPI, but there was nothing appealing. The results were either very specific tools (convert EPUB to/from format X) or very unmaintained (the top result was last updated in April 2014). I decided to try writing my own code to manipulate EPUBs, rather than using somebody else’s library.

I had a vague memory that EPUB files are zips, so I changed the extension from .epub to .zip and tried unzipping one – and it turns out that yes, it is a zip file, and the internal structure is fairly simple. I found a file called content.opf which contains metadata as XML, including the title and author I’m looking for:

<?xml version='1.0' encoding='utf-8'?>
<package xmlns="http://www.idpf.org/2007/opf" version="2.0" unique-identifier="uuid_id">
  <metadata xmlns:opf="http://www.idpf.org/2007/opf" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:dcterms="http://purl.org/dc/terms/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:calibre="http://calibre.kovidgoyal.net/2009/metadata">
    <dc:title>Operation Cameo</dc:title>
    <meta name="calibre:timestamp" content="2025-01-25T18:01:43.253715+00:00"/>
    <dc:language>en</dc:language>
    <dc:creator opf:file-as="alexwlchan" opf:role="aut">alexwlchan</dc:creator>
    <dc:identifier id="uuid_id" opf:scheme="uuid">13385d97-35a1-4e72-830b-9757916d38a7</dc:identifier>
    <meta name="calibre:title_sort" content="operation cameo"/>
    <dc:description><p>Some unusual orders arrive at Operation Mincemeat HQ.</p></dc:description>
    <dc:publisher>Archive of Our Own</dc:publisher>
    <dc:subject>Fanworks</dc:subject>
    <dc:subject>General Audiences</dc:subject>
    <dc:subject>Operation Mincemeat: A New Musical - SpitLip</dc:subject>
    <dc:subject>No Archive Warnings Apply</dc:subject>
    <dc:date>2023-12-14T00:00:00+00:00</dc:date>
  </metadata>
  …

That dc: prefix was instantly familiar from my time working at Wellcome Collection – this is Dublin Core, a standard set of metadata fields used to describe books and other objects. I’m unsurprised to see it in an EPUB; this is exactly how I’d expect it to be used.

I found an article that explains the structure of an EPUB file, which told me that I can find the content.opf file by looking at the full-path attribute of the rootfile element inside the mandatory META-INF/container.xml file that every EPUB must contain. I wrote some code to find the content.opf file, then a few XPath expressions to extract the key fields, and I had the metadata I needed.
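As a rough illustration of that step, here is what the extraction might look like in Python (the language the first prototype was written in), using only the standard library. The file name and the exact queries are my own guesses for the sake of the example, not the tool’s actual code:

import zipfile
import xml.etree.ElementTree as ET

NAMESPACES = {
    "cn": "urn:oasis:names:tc:opendocument:xmlns:container",
    "dc": "http://purl.org/dc/elements/1.1/",
    "opf": "http://www.idpf.org/2007/opf",
}

def get_title_and_author(epub_path):
    with zipfile.ZipFile(epub_path) as zf:
        # container.xml points at the OPF file via its full-path attribute
        container = ET.fromstring(zf.read("META-INF/container.xml"))
        rootfile = container.find(".//cn:rootfile", NAMESPACES)
        opf_path = rootfile.attrib["full-path"]

        # The OPF package document holds the Dublin Core metadata
        opf = ET.fromstring(zf.read(opf_path))
        title = opf.findtext(".//dc:title", namespaces=NAMESPACES)
        author = opf.findtext(".//dc:creator", namespaces=NAMESPACES)

    return title, author

print(get_title_and_author("operation-cameo.epub"))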
Generate a cover image

I sketched a simple cover design which shows the title and author. I wrote the first version of the drawing code in Pillow, because that’s what I’m familiar with. It was fine, but the code was quite flimsy – it didn’t wrap properly for long titles, and I couldn’t get custom fonts to work.

Later I rewrote the app in JavaScript, so I had access to the HTML canvas element. This is another tool that I haven’t worked with before, so a fun chance to learn something new. The API felt fairly familiar, similar to other APIs I’ve used to build HTML elements.

This time I did implement some line wrapping – there’s a measureText() API for canvas, so you can see how much space text will take up before you draw it. I break the text into words, and keep adding words to a line until measureText tells me the line is going to overflow the page. I have lots of ideas for how I could improve the line wrapping, but it’s good enough for now.

I was also able to get fonts working, so I picked Georgia to match the font used for titles on AO3. Here are some examples:

I had several ideas for choosing the background colour. I’m trying to help my friend browse her collection of fic, and colour would be a useful way to distinguish things – so how do I use it? I realised I could get the fandom from the EPUB file, so I decided to use that. I use the fandom name as a seed to a random number generator, then I pick a random colour. This means that all the fics in the same fandom will get the same colour – for example, all the Star Wars stories are a shade of red, while Star Trek are a bluey-green.

This was a bit harder than I expected, because it turns out that JavaScript doesn’t have a built-in seeded random number generator – I ended up using some snippets from a Stack Overflow answer, where bryc has written several pseudorandom number generators in plain JavaScript.

I didn’t realise until later, but I designed something similar to the placeholder book covers in the Apple Books app. I don’t use Apple Books that often so it wasn’t a deliberate choice to mimic this style, but clearly it was somewhere in my subconscious. One difference is that Apple’s app seems to be picking from a small selection of background colours, whereas my code can pick a much nicer variety of colours. Apple’s choices will have been pre-approved by a designer to look good, but I think mine is more fun.
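The finished tool does this in client-side JavaScript with one of bryc’s PRNGs, but the idea is easy to sketch in Python: hash the fandom name to get a stable seed, then let a seeded generator pick the hue. Everything below (the function name, the fixed lightness and saturation, the example fandom strings) is my own illustration rather than the tool’s real code:

import colorsys
import hashlib
import random

def fandom_colour(fandom):
    # Hash the fandom name so the same fandom always yields the same seed.
    # (Python's built-in hash() is salted per process, so use hashlib instead.)
    seed = int(hashlib.sha256(fandom.encode("utf-8")).hexdigest(), 16)
    rng = random.Random(seed)

    # Pick a random hue, but keep lightness and saturation fixed so every
    # colour stays readable behind dark text.
    hue = rng.random()
    r, g, b = colorsys.hls_to_rgb(hue, 0.7, 0.6)
    return "#{:02x}{:02x}{:02x}".format(int(r * 255), int(g * 255), int(b * 255))

# Same fandom, same colour, every time:
print(fandom_colour("Star Wars - All Media Types"))
print(fandom_colour("Star Trek"))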
Add the cover image to the EPUB

My first attempt to add a cover image used pandoc:

pandoc input.epub --output output.epub --epub-cover-image cover.jpeg

This approach was no good: although it added the cover image, it destroyed the formatting in the rest of the EPUB. This made it easier to find the fic, but harder to read once you’d found it.

An EPUB file I downloaded from AO3, before/after it was processed by pandoc.

So I tried to do it myself, and it turned out to be quite easy! I unzipped another EPUB which already had a cover image. I found the cover image in OPS/images/cover.jpg, and then I looked for references to it in content.opf. I found two elements that referred to cover images:

<?xml version="1.0" encoding="UTF-8"?>
<package xmlns="http://www.idpf.org/2007/opf" version="3.0" unique-identifier="PrimaryID">
  <metadata xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:opf="http://www.idpf.org/2007/opf">
    <meta name="cover" content="cover-image"/>
    …
  </metadata>
  <manifest>
    <item id="cover-image" href="images/cover.jpg" media-type="image/jpeg" properties="cover-image"/>
    …
  </manifest>
</package>

This gave me the steps for adding a cover image to an EPUB file: add the image file to the zipped bundle, then add these two elements to the content.opf.
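Translated into the Python of the original prototype, those two steps might look something like the sketch below. The file names, the cover-image id, the naive string splice (rather than a proper XML edit), and the assumption that content.opf sits at the root of the archive are all mine; the finished tool actually does this in the browser with JSZip.

import zipfile

COVER_META = '<meta name="cover" content="cover-image"/>'
COVER_ITEM = ('<item id="cover-image" href="images/cover.jpg" '
              'media-type="image/jpeg" properties="cover-image"/>')

def add_cover(epub_path, cover_path, out_path, opf_name="content.opf"):
    with zipfile.ZipFile(epub_path) as src, zipfile.ZipFile(out_path, "w") as dst:
        for info in src.infolist():
            data = src.read(info.filename)

            if info.filename.endswith(opf_name):
                # Splice the two cover elements into the package document.
                # (A real tool would edit the XML properly; string replacement
                # just keeps this sketch short.)
                text = data.decode("utf-8")
                text = text.replace("</metadata>", COVER_META + "</metadata>")
                text = text.replace("</manifest>", COVER_ITEM + "</manifest>")
                data = text.encode("utf-8")

            # EPUBs require the mimetype entry to be stored uncompressed;
            # everything else can be deflated as usual.
            compression = zipfile.ZIP_STORED if info.filename == "mimetype" else zipfile.ZIP_DEFLATED
            dst.writestr(info, data, compress_type=compression)

        # Add the generated cover image itself. The href above is relative to
        # the OPF file, so this assumes content.opf sits at the archive root.
        dst.write(cover_path, "images/cover.jpg")

add_cover("story.epub", "cover.jpg", "story-with-cover.epub")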
Where am I going to deploy this?

I wrote the initial prototype of this in Python, because that’s the language I’m most familiar with. Python has all the libraries I need:

The zipfile module can unpack and modify the EPUB/ZIP
The xml.etree or lxml modules can manipulate XML
The Pillow library can generate images

I built a small Flask web app: you upload the EPUB to my server, my server does some processing, and sends the EPUB back to you. But for such a simple app, do I need a server?

I tried rebuilding it as a static web page, doing all the processing in client-side JavaScript. That’s simpler for me to host, and it doesn’t involve a round-trip to my server. That has lots of other benefits – it’s faster, less of a privacy risk, and doesn’t require a persistent connection. I love static websites, so can they do this? Yes! I just had to find a different set of libraries:

The JSZip library can unpack and modify the EPUB/ZIP, and is the only third-party code I’m using in the tool
Browsers include DOMParser for manipulating XML
I’ve already mentioned the HTML <canvas> element for rendering the image

This took a bit longer because I’m not as familiar with JavaScript, but I got it working. As a bonus, this makes the tool very portable. Everything is bundled into a single HTML file, so if you download that file, you have the whole tool. If my friend finds this tool useful, she can save the file and keep a local copy of it – she doesn’t have to rely on my website to keep using it.

What should it look like?

My first design was very “engineer brain” – I just put the basic controls on the page. It was fine, but it wasn’t good. That might be okay, because the only person I need to be able to use this app is my friend – but wouldn’t it be nice if other people were able to use it? If they’re going to do that, they need to know what it is – most people aren’t going to read a 2,500 word blog post to understand a tool they’ve never heard of. (Although if you have read this far, I appreciate you!)

I started designing a proper page, including some explanations and descriptions of what the tool is doing. I got something that felt pretty good, including FAQs and acknowledgements, and I added a grey area for the part where you actually upload and download your EPUBs, to draw the user’s eye and make it clear this is the important stuff. But even with that design, something was missing. I realised I was telling you I’d create covers, but not showing you what they’d look like. Aha! I sat down and made up a bunch of amusing titles for fanfic and fanfic authors, so now you see a sample of the covers before you upload your first EPUB:

This makes it clearer what the app will do, and was a fun way to wrap up the project.

What did I learn from this project?

Don’t be scared of new file formats

My first instinct was to look for a third-party library that could handle the “complexity” of EPUB files. In hindsight, I’m glad I didn’t find one – it forced me to learn more about how EPUBs work, and I realised I could write my own code using built-in libraries. EPUB files are essentially ZIP files, and I only had basic needs. I was able to write my own code.

Because I didn’t rely on a library, now I know more about EPUBs, I have code that’s simpler and easier for me to understand, and I don’t have a dependency that may cause problems later. There are definitely some file formats where I need existing libraries (I’m not going to write my own JPEG parser, for example) – but I should be more open to writing my own code, and not jumping to add a dependency.

Static websites can handle complex file manipulations

I love static websites and I’ve used them for a lot of tasks, but mostly read-only display of information – not anything more complex or interactive. But modern JavaScript is very capable, and you can do a lot of things with it. Static pages aren’t just for static data.

One of the first things I made that got popular was find untagged Tumblr posts, which was built as a static website because that’s all I knew how to build at the time. Somewhere in the intervening years, I forgot just how powerful static sites can be. I want to build more tools this way.

Async JavaScript calls require careful handling

The JSZip library I’m using has a lot of async functions, and this is my first time using async JavaScript. I got caught out several times, because I forgot to wait for async calls to finish properly. For example, I’m using canvas.toBlob to render the image, which is an async function. I wasn’t waiting for it to finish, and so the zip would be repackaged before the cover image was ready to add, and I got an EPUB with a missing image. Oops. I think I’ll always prefer the simplicity of synchronous code, but I’m sure I’ll get better at async JavaScript with practice.

Final thoughts

I know my friend will find this helpful, and that feels great. Writing software that’s designed for one person is my favourite software to write. It’s not hyper-scale, it won’t launch the next big startup, and it’s usually not breaking new technical ground – but it is useful. I can see how I’m making somebody’s life better, and isn’t that what computers are for? If other people like it, that’s a nice bonus, but I’m really thinking about that one person. Normally the one person I’m writing software for is me, so it’s extra nice when I can do it for somebody else.

If you want to try this tool yourself, go to alexwlchan.net/my-tools/add-cover-to-ao3-epubs/

If you want to read the code, it’s all on GitHub.

an hour ago 1 votes
Non-alcoholic apéritifs

I’ve been doing Dry January this year. One thing I missed was something for apéro hour, a beverage to mark the start of the evening. Something complex and maybe bitter, not like a drink you’d have with lunch. I found some good options.

Ghia sodas are my favorite. Ghia is an NA apéritif based on grape juice but with enough bitterness (gentian) and sourness (yuzu) to be interesting. You can buy a bottle and mix it with soda yourself but I like the little cans with extra flavoring. The Ginger and the Sumac & Chili are both great.

Another thing I like are low-sugar fancy soda pops. Not diet drinks, they still have a little sugar, but typically 50 calories a can. De La Calle Tepache is my favorite. Fermented pineapple is delicious and they have some fun flavors. Culture Pop is also good.

A friend gave me the Zero book, a drinks cookbook from the fancy restaurant Alinea. This book is a little aspirational but the recipes are doable, it’s just a lot of labor. Very fancy high end drink mixing, really beautiful flavor ideas. The only thing I made was their gin substitute (mostly junipers extracted in glycerin) and it was too sweet for me. Need to find the right use for it, a martini definitely ain’t it.

An easier homemade drink is this Nonalcoholic Dirty Lemon Tonic. It’s basically a lemonade heavily flavored with salted preserved lemons, then mixed with tonic. I love the complexity and freshness of this drink and enjoy it on its own merits.

Finally, non-alcoholic beer has gotten a lot better in the last few years thanks to manufacturing innovations. I’ve been enjoying NA Black Butte Porter, Stella Artois 0.0, Heineken 0.0. They basically all taste just like their alcoholic uncles, no compromise.

One thing to note about non-alcoholic substitutes is they are not cheap. They’ve become a big high end business. Expect to pay the same for an NA drink as one with alcohol even though they aren’t taxed nearly as much.

2 days ago 5 votes
It burns

The first time we had to evacuate Malibu this season was during the Franklin fire in early December. We went to bed with our bags packed, thinking they'd probably get it under control. But by 2am, the roaring blades of fire choppers shaking the house got us up. As we sped down the canyon towards Pacific Coast Highway (PCH), the fire had reached the ridge across from ours, and flames were blazing large out the car windows. It felt like we had left the evacuation a little too late, but they eventually did get Franklin under control before it reached us.

Humans have a strange relationship with risk and disasters. We're so prone to wishful thinking and bad pattern matching. I remember people being shocked when the flames jumped the PCH during the Woolsey fire in 2018. IT HAD NEVER DONE THAT! So several friends of ours had to suddenly escape a nightmare scenario, driving through burning streets, in heavy smoke, with literally their lives on the line. Because the past had failed to predict the future.

I fell into that same trap for a moment with the dramatic proclamations of wind and fire weather in the days leading up to January 7. Warning after warning of "extremely dangerous, life-threatening wind" coming from the City of Malibu, and that overly-bureaucratic-but-still-ominous "Particularly Dangerous Situation" designation. Because, really, how much worse could it be? Turns out, a lot.

It was a little before noon on the 7th when we first saw the big plumes of smoke rise from the Palisades fire. And immediately the pattern matching ran astray. Oh, it's probably just like Franklin. It's not big yet, they'll get it out. They usually do. Well, they didn't.

By the late afternoon, we had once more packed our bags, and by then it was also clear that things actually were different this time. Different worse. Different enough that even Santa Monica didn't feel like it was assured to be safe. So we headed far North, to be sure that we wouldn't have to evacuate again. Turned out to be a good move. Because by now, into the evening, few people in the connected world hadn't started to see the catastrophic images emerging from the Palisades and Eaton fires. Well over 10,000 houses would ultimately burn. Entire neighborhoods leveled. Pictures that could be mistaken for World War II. Utter and complete destruction.

By the night of the 7th, the fire reached our canyon, and it tore through the chaparral and brush that'd been building since the last big fire that area saw in 1993. Out of some 150 houses in our immediate vicinity, nearly a hundred burned to the ground. Including the first house we moved to in Malibu back in 2009. But thankfully not ours. That's of course a huge relief. This was and is our Malibu Dream House. The site of that gorgeous home office I'm so fond to share views from. Our home.

But a house left standing in a disaster zone is still a disaster. The flames reached all the way up to the base of our construction, incinerated much of our landscaping, and devoured the power poles around it to dysfunction. We have burnt-out buildings every which way the eye looks. The national guard is still stationed at road blocks on the access roads. Utility workers are tearing down the entire power grid to rebuild it from scratch. It's going to be a long time before this is comfortably habitable again.

So we left. That in itself feels like defeat. There's an urge to stay put, and to help, in whatever helpless ways you can. But with three school-age children who've already missed over a month's worth of learning from power outages, fire threats, actual fires, and now mudslide dangers, it was time to go.

None of this came as a surprise, mind you. After Woolsey in 2018, Malibu life always felt like living on borrowed time to us. We knew it, even accepted it. Beautiful enough to be worth the risk, we said. But even if it wasn't a surprise, it's still a shock. The sheer devastation, especially in the Palisades, went far beyond our normal range of comprehension. Bounded, as it always is, by past experiences.

Thus, we find ourselves back in Copenhagen. A safe haven for calamities of all sorts. We lived here for three years during the pandemic, so it just made sense to use it for refuge once more. The kids' old international school accepted them right back in, and past friendships were quickly rebooted.

I don't know how long it's going to be this time. And that's an odd feeling to have, just as America has been turning a corner, and just as the optimism is back in so many areas. Of the twenty years I've spent in America, this feels like the most exciting time to be part of the exceptionalism that the US of A offers. And of course we still are. I'll still be in the US all the time on business, racing, and family trips. But it won't be exclusively so for a while, and it won't be from our Malibu Dream House. And that burns.

2 days ago 6 votes
Slow, flaky, and failing

Thou shalt not suffer a flaky test to live, because it’s annoying, counterproductive, and dangerous: one day it might fail for real, and you won’t notice. Here’s what to do.

3 days ago 6 votes
Name that Ware, January 2025

The ware for January 2025 is shown below. Thanks to brimdavis for contributing this ware! …back in the day when you would get wares that had “blue wires” in them… One thing I wonder about this ware is…where are the ROMs? Perhaps I’ll find out soon! Happy year of the snake!

3 days ago 4 votes