I’ve been Adobe Illustrator free for almost six months now, and what I miss most about it are some of the nice effects it ships with. I love Sketch, but I don’t think these effects will ever make their way into Sketch. Sketch does have the ability to add plugins, though, and I’ve always wanted to try my hand at creating one. What I really miss is Illustrator’s Roughen effect, as it’s really useful for creating that trendy-old-timey-beat-up look. Since a Sketch plugin that roughens a path doesn’t already exist (that I could find), I decided to create one! I’m going to be up front: this is mostly a decompression of all the frustration I felt while working on the plugin.

Delete and reinstall * Infinity

I spent a lot of time just uninstalling and reinstalling the plugin any time I made a change to the plugin script. This was super annoying, but I’m not entirely sure if there was a way around this. Using the Custom Plugin menu option was quick and useful for testing out snippets of code, but it’s not a...
over a year ago


More from Daniel Marino

Making an Escape Room with only HTML and CSS

Beware! This post includes spoilers!

I recently built an escape room game called CSScape Room. This isn’t my first JavaScript-free web game, but HTML and CSS have evolved significantly since my previous attempts, with newer additions allowing for more complex selectors and native interactions. Rather than saving this idea for a game jam, I built it purely for fun, which freed me from theme constraints and time pressure. I’ve enjoyed escape room games since childhood, and it was nostalgic to recreate that experience myself. This project pushed my artistic limits while challenging me to design puzzles and translate them into complex HTML and CSS. The learning process was fun, challenging, and sometimes tedious—mostly through trial and error.

Process

My creative process isn’t linear—it’s a blend of designing, puzzle creation, and coding that constantly influence each other. I frequently had to redesign or recode elements as the project evolved. There was also that time I accidentally deleted half my CSS because I wasn’t backing up to GitHub... lesson learned! 😬 This might sound chaotic, and honestly, it was. If you’re wondering where to start with a project like this, I began by prototyping the room navigation system. I figured that was the minimum viable feature—if I couldn’t make that work, I’d abandon the project. The solution I eventually found seems simple in retrospect, but I went through several iterations to discover it. This flexible approach makes sense for my creative projects. As I build something, both the in-progress work and my growing skills inevitably influence the entire project. I’m comfortable with this non-linear process—it also suits my ADHD brain, where I tend to lose interest if I work on the same thing for too long.

Artwork

I’d wanted to design a pixel art-styled game for some time but never felt confident enough to attempt it during a game jam because of the learning curve. I watched tutorials from Adam Yunis and Mort to get a crash course in pixel art best practices. Initially, progress was slow. I had to figure out 2D perspective with vanishing points, determine a color palette, practice shading techniques, and decide how much detail to include. While I tried to adhere to pixel art “rules,” I definitely broke some along the way. One challenge I set for myself was using only 32 colors to capture the feeling of an older gaming console. Once I got comfortable with shading and dithering, working within this constraint became easier. An added benefit of using 32 colors was smaller image sizes—the game’s 79 images account for only about 25% of the total payload. I attempted to design sprites using dimensions in multiples of eight, but I’ll admit I became less strict about this as the project progressed. At a certain point, I was struggling enough with the color and styling limitations that this guideline became more of a starting point than a rule. I considered creating my own font, but after exhausting myself with all the artwork, I opted for Google’s Pixelify Sans instead. Almost all animation frames were individually drawn (except for the “one” TV animation). This was tedious, but I was determined to stay true to old-school techniques! I did use CSS to streamline some animations—for instance, I used animation-direction: alternate on the poster page curl to create a palindrome effect, halving the number of required sprites.

Mechanics

Like my previous game Heiro, this project primarily uses checkbox and radio button mechanics.
However, the addition of the :has() pseudo-selector opened up many more possibilities. I also utilized the Popover API to create more detailed interactions.

Checkbox and Radio Selection

Triggering interactions by toggling checkboxes and radio buttons isn’t new, but the :has() selector is a game-changer! Before it existed, you had to structure your markup so interactive elements were siblings. The :has() selector makes this far more flexible because you no longer need to rely on a specific HTML structure.

#element { display: none; }
:has(#checkbox:checked) #element { display: block; }

Using this pattern, :has() looks for #checkbox anywhere on the page, meaning you don’t have to rely on #checkbox, its corresponding <label>, or #element being siblings. The markup structure is no longer a constraint. Most of this game functions on toggling checkboxes and radios to unlock, collect, and use items.

Navigation

I almost gave up on the current implementation and nearly settled for basic compass notation, which would have avoided visual transitions between directions. After several failed attempts, I found a solution. The tricky part was determining how to transition into a direction from either the left or the right, depending on which arrow was clicked. My solution is conceptually simple but difficult to explain! First, I used radio buttons to determine which direction you’re facing (since you can only face one direction at a time). Second, I needed a way to determine whether you’re entering a direction from the west or the east. This required eight radio buttons—two for each direction. For example, if you’re facing east (having come from facing north), you have two possible directions to go: west (returning to face north) or east (to face south). I needed to make visible the radio buttons that would take you north from east, and south from west. The CSS looks something like this:

:has(#east-from-west:checked) :is(
  [for="south-from-west"],
  [for="north-from-east"]) {
  display: block;
}

This pattern was implemented for each direction, along with animations to ensure each room view slid in and out correctly.

Zooming In

I initially focused so much on checkbox mechanics that I assumed I’d need the same approach for zooming in on specific areas. Then I had a "Duh!" moment and realized the Popover API would be perfect. Here’s the basic markup for looking at an individual book:

<button popovertarget="book">Zoom in</button>
<div id="book" popover>
  <!-- Book content goes here -->
  <button popovertarget="book" popovertargetaction="hide">Close</button>
</div>

Turning the Lights Off

I procrastinated on implementing this feature because I thought I’d need to create darkened variations of all the artwork. I don’t recall what inspired me to try blend modes, but I’m glad I did—the solution was surprisingly simple. When the light switch checkbox is toggled, a <div> becomes visible with a dark background color and mix-blend-mode: multiply. This multiplies the colors of the blending and base layers, resulting in a darker appearance.

Playing the Crossword

This required surprisingly complex CSS. Each square has three letters plus a blank tile, meaning four radio buttons. The :checked letter has a z-index of 3 to display above the other letters, but also has pointer-events: none so clicks pass through to the next letter underneath (with z-index: 2). The remaining tiles have a z-index of 1. The CSS becomes even trickier when the last tile is :checked, requiring some creative selector gymnastics to target the first radio button in the stack again.
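As a rough illustration of that stacking trick, here is a minimal, hypothetical sketch of one crossword square, assuming four radio buttons sharing a name inside a .square wrapper; the class name, sizing, and selectors are my own guesses at the pattern described above, not the game’s actual CSS:

.square {
  position: relative;
  width: 32px;
  height: 32px;
}

.square input {
  position: absolute;
  inset: 0;
  margin: 0;
  z-index: 1; /* unselected tiles sit at the bottom of the stack */
}

/* The checked tile is shown on top, but clicks pass straight through it. */
.square input:checked {
  z-index: 3;
  pointer-events: none;
}

/* The next tile in the stack receives the click, advancing the letter. */
.square input:checked + input {
  z-index: 2;
}

/* When the last tile is checked, cycle back to the first radio button. */
.square:has(input:last-of-type:checked) input:first-of-type {
  z-index: 2;
}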
Tools

I created all the artwork using Aseprite, which is specifically designed for pixel art. I probably only used a fraction of its features, and I’m not sure it actually made my life easier—it might have made things more difficult at times. I’m not giving up on it yet, though. I suspect I’ll occasionally discover features that make me think, “Oh, that’s way easier than what I was doing!”

I started coding with basic HTML and CSS but eventually found navigation difficult with such a long HTML file. It also became tedious writing the same attributes for every <img /> element. I migrated the project to Eleventy to improve organization and to create custom shortcodes that simplify component creation (a rough sketch of the idea appears at the end of this post). I used the html-minifier-terser npm package, which integrates well with Eleventy.

I chose native CSS over Sass for several reasons:

- CSS now has native nesting, for better organization and leaner code
- CSS has built-in variables
- HTTP/2 handles asset loading efficiently, eliminating the major benefit of bundling CSS files

The game uses 12 CSS files with 12 <link rel="stylesheet" /> tags. The only Sass feature I missed was the ability to loop through style patterns for easier maintenance, but this wasn’t a significant issue.

The game is hosted on GitHub Pages. During deployment, it runs an npm command to minify the CSS using Lightning CSS. I mentioned accidentally deleting half my CSS earlier—this happened because I initially used Eleventy’s recommended approach with the clean-css npm package. I strongly advise against using it! That package doesn’t work with native CSS nesting. While losing code was frustrating, I rewrote much of it more efficiently, so there was a silver lining.

Nice to Haves

I initially wanted to make this game fully accessible, but the navigation system doesn’t translate well for screen reader users. I tried implementing a more compass-like navigation approach for keyboard users, but it proved unreliable and conflicted with the side-to-side approach. Adding text labels for interactive elements was challenging because you can’t track the :focus state of a <label> element. While you can track the :focus of the corresponding <input />, it wasn’t consistently reliable. The main keyboard accessibility issue is that the game exists as one long HTML page. When you navigate to face a different direction, keyboard focus remains elsewhere on the page, requiring extensive tabbing to reach navigation elements or item selection. I ultimately decided to make the game deliberately inaccessible by adding tabindex="-1" to all keyboard-accessible elements. I’d rather users recognize immediately that they can’t play with assistive technology than become frustrated with a partially broken experience.

Sound would have been a nice addition, but I encountered the same issues as with my previous game Heiro. You can toggle the visibility of an <embed> element, but once it’s visible, you can’t hide it again—meaning there’s no way to toggle sound on and off.

Conclusion

CSScape Room was a fun but exhausting four-month project. It began as an experiment to see if creating a JavaScript-free escape room was possible—and the answer is definitely yes. I’ve only touched on some aspects here, so if you’re interested in the technical details, check out the source code on GitHub. Finally, I’d like to thank all my playtesters for their valuable feedback!
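As mentioned in the Tools section, here is a rough sketch of the kind of Eleventy shortcode that cuts down on repeated <img /> attributes; the shortcode name, path, and attribute values are hypothetical, not taken from the game’s source:

// .eleventy.js — a hypothetical "sprite" shortcode for stamping out <img /> tags
module.exports = function (eleventyConfig) {
  eleventyConfig.addShortcode("sprite", (name, alt = "", size = 64) => {
    return `<img src="/images/${name}.png" alt="${alt}" width="${size}" height="${size}" loading="lazy" draggable="false" />`;
  });
};

In a Nunjucks template, {% sprite "book", "A dusty book" %} would then expand to the full <img /> element with all of its attributes filled in.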

5 months ago 67 votes
Self-avoiding Walk

I’m a bit late to this, but back in summer 2024 I participated in the OST Composing Jam. The goal of this jam is to compose an original soundtrack (minimum of 3 minutes) of any style for an imaginary game. While I’ve composed a lot of video game music, I’ve never created an entire soundtrack around a single concept.

Self Avoiding Walk by Daniel Marino

To be honest, I wasn’t entirely sure where to start. I was torn between trying to come up with a story for a game to inspire the music, and just messing around with some synths and noodling on the keyboard. I did a little bit of both, but nothing really materialized.

Synth + Metal ≈ Synthmetal

Feeling a bit paralyzed, I fired up the ol’ RMG sequencer for inspiration. I saved a handful of randomized melodies and experimented with them in Reaper. After a day or two I landed on something I liked, which became roughly the first 30 seconds of the second track, "Defrag." I love metal bands like Tesseract, Periphery, The Algorithm, Car Bomb, and Meshuggah, so I experimented with incorporating syncopated guttural guitar sounds into the synths. After several more days I finished "Defrag"—which also included "Kernel Panic" before I split that into its own track. I didn’t have a clue what to do next, nor did I have a concept. Composing the rest of the music was a bit of a blur because I bounced around from song to song—iterating on the leitmotif over and over with different synths, envelopes, time signatures, rhythmic displacement, pitch shifting, and tweaks to the underlying chord structures.

Production

The guitars were recorded DI with my Fender Squier and Behringer interface. I primarily used the ML Sound Labs Amped Roots Free amp sim because the metal presets are fantastic and rarely need much fuss to sound good. I also used Blue Cat Audio’s free amp sim for clean guitars. All the other instruments were MIDI tracks, either programmed via the piano roll or recorded with my Arturia MiniLab MKII. I used a variety of synth effects from my library of VSTs. I recorded this music before acquiring my Fender Squier bass guitar, so the bass was also programmed.

Theme and Story

At some point I had five songs that all sounded like they could be from the same game. The theme for this particular jam was "Inside my world." I had to figure out how to write a story that corresponded with the theme and could align with the songs. I somehow landed on the idea of the main character realizing his addiction to AI and embarking on a journey to "unplug." The music reflects his path to recovery, capturing the emotional and psychological evolution as he seeks to overcome his dependency. After figuring this out, I thought it would be cool to name all the songs using computer terms that could be metaphors for the different stages of recovery.

Track listing

Worm – In this dark and haunting opening track, the character grapples with his addiction to AI, realizing he can no longer think independently.
Defrag – This energetic track captures the physical and emotional struggles of the early stages of recovery.
Kernel Panic – Menacing and eerie, this track portrays the character’s anxiety and panic attacks as he teeters on the brink during the initial phases of recovery.
Dæmons – With initial healing achieved, the real challenge begins. The ominous and chaotic melodies reflect the emotional turbulence the character endures.
Time to Live – The character, having come to terms with himself, experiences emotional growth. The heroic climax symbolizes the realization that recovery is a lifelong journey.

Album art

At the time I was messing around with self-avoiding walks in my generative artwork explorations. I felt like the whole concept of avoiding the self, within the context of addiction and recovery, worked metaphorically. So I tweaked some algorithms and generated the self-avoiding walk using JavaScript and the p5.js library (a sketch of the idea follows at the end of this post). I then layered the self-avoiding walk over a photo I found visually interesting on Unsplash, using a CSS blend mode.

Jam results

I placed around the top 50% out of over 600 entries. I would have liked to place higher, but despite my ranking, I thoroughly enjoyed composing the music! I’m very happy with the music and its production quality, and I also learned a lot. I would certainly participate in this style of composition jam again!
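For anyone curious how a self-avoiding walk can be generated, here is a minimal p5.js sketch of the general idea: a grid walk that never revisits a cell and stops when it boxes itself in. It’s an illustration of the concept, not the actual code behind the album art.

const step = 10; // grid cell size in pixels
let visited;     // set of "x,y" keys for cells already used
let x, y;

function setup() {
  createCanvas(400, 400);
  background(255);
  stroke(20);
  x = width / 2;
  y = height / 2;
  visited = new Set([`${x},${y}`]);
}

function draw() {
  // Neighboring grid cells that are on the canvas and not yet visited.
  const options = [
    [x + step, y], [x - step, y], [x, y + step], [x, y - step],
  ].filter(([nx, ny]) =>
    nx >= 0 && nx <= width && ny >= 0 && ny <= height && !visited.has(`${nx},${ny}`)
  );

  if (options.length === 0) {
    noLoop(); // the walk is stuck: every neighbor has been visited
    return;
  }

  const [nx, ny] = random(options); // p5's random() picks one element of an array
  line(x, y, nx, ny);
  visited.add(`${nx},${ny}`);
  x = nx;
  y = ny;
}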

5 months ago 55 votes
What I’m Using in 2025

I’ve always been fascinated to see what apps and workflows other people are using in their day-to-day lives. Every now and then I learn about a new app or some cool trick I didn’t previously know. I doubt anyone seriously cares about what I’m using, but I figured I’d list it out anyway—if for no other reason than to keep a historical record at this point in time.

Applications

Alfred — I have a lifelong license, and I like it. No point in fixing something that isn’t broken. I primarily use it for app switching, but also for math and for searching for gifs.
Aseprite — Sometimes I do pixel art! Even if the UI is clunky and some keyboard shortcuts aren’t always convenient, there are some unique features that help facilitate creating pixel art.
Audacity — I rarely use it, but sometimes it’s easier to make quick audio edits in Audacity than to fire up a full-blown DAW.
Bear — This is the note-taking, task-tracking app I’ve landed on. The UI is beautiful and it feels snappy. It syncs, so I can use it on my iPhone too.
Chrome — I used Arc for the better part of 2024, but after they announced they were done working on it to focus on a new AI-powered browser, I peaced out. There are a couple of features I really missed, but I was able to find extensions to fill those gaps: Copy Current Tab URL, Meetings Page Auto Closer for Zoom, Open Figma app, and JSON Formatter.
Figma — I use it because it’s what we use at work. I’m happy enough with Figma.
iTerm2 — It has a few features I like that make me choose it over Mac’s native Terminal app.
Pixelmator Pro — I haven’t paid the Adobe tax in a long time, and it feels good. I started using Pixelmator because at the time it was the best alternative available. I’m comfortable with Pixelmator at this point. I don’t really use image editors often these days, so I probably won’t switch anytime soon.
Reaper — My DAW of choice when composing music. It’s very customizable, easy-ish to learn, and the price is right. It also has a die-hard community, so I’m always able to find help when I need it.
VS Code — I’ve tried a lot of code editors. I prefer Sublime’s UI over VS Code’s, but VS Code does a lot of things more easily than Sublime does, so I put up with the UI.
YouTube Music — I still miss Rdio. YouTube Music works well enough, I guess. Paying for YouTube Music has the benefit of not seeing ads on YouTube.

Command-line Tools

These aren’t apps per se, but they’re some tools I use to help manage packages or that I use regularly when developing.

Deno
Eleventy
Homebrew
pure
statikk
Vite
Volta
yt-dlp

Equipment

I have one computer and I use it for everything, and I’m okay with that. It’s more than powerful enough for work, composing music, making games, and occasionally playing games. Although I have a dedicated home office, lately I tend to work more on the go, often with just my laptop—whether that’s at a cafe, a coworking space, or even just moving around the house.

2021 M1 MacBook Pro
AKG K240 Studio Headphones
Arturia MiniLab MKII Controller
Behringer UMC202HD USB Audio Interface
Fender Squier Strat Guitar
Fender Squier Bass Guitar
Shure SM57

Virtual Instruments

This is quite specific to composing music, so if that doesn’t interest you, feel free to stop reading here. This list is not exhaustive, as I’m regularly trying out new VSTs, but these are some staples that I use:

🎹 Arturia Analog Lab V (Intro) — My Arturia controller came with this software. It has over 500 presets, and I love exploring the variety of sounds.
🎸 Bass Grinder (Free) — I recently came across this VST, and it has a great crunchy overdrive sound for bass guitar.
🥁 Manda Audio Power Drum Kit — Even though you can use this for free, I paid the $9 because it is fantastic. The drums sound real and are great for all styles of music.
🎸 ML Amped Roots (Free) — What I like about this is that I get a great metal guitar sound out of the box without having to add pedals or chain other effects.
🥁 ML Drums (Free) — I just started experimenting with this, and the drum tones are amazing. The free setup is pretty limited, but I like that I can add on to the base drum kit to meet my needs rather than having to buy one big extensive drum VST.
🎹 Spitfire LABS — More variety of eclectic sounds.

I also use several of Reaper’s built-in VSTs for delay, EQ, reverb, pitch-shifting, and other effects. They’re powerful enough for my needs and are much less CPU-intensive.

7 months ago 82 votes
Daily Inspirational Word

Over the past couple of years I’ve gotten into journaling. Recently I’ve been using a method where you’re given a single inspirational word as a prompt, and go from there. Unfortunately, the process of finding, saving, and accessing inspirational words was a bit of a chore:

1. Google a list of “366 inspirational words.”
2. Get taken to a blog bloated with ads and useless content, all in an effort to generate SEO cred.
3. Copy/paste the words into Notion.
4. Fix how the words get formatted, because Notion is weird and I have OCD about formatting text.

While this gets the job done, I felt like there was room to make it a more pleasant experience. All I really wanted was a small website that serves a single inspirational word on a daily basis, without cruft or ads. This would let me get the content I want without having to scroll through a long list. I also don’t want to manage or store the list of words. Once I’ve curated a list of words, I want to be done with it.

Creating a microsite

I love a good microsite, so I decided to create one that takes all the chore out of obtaining a daily inspirational word (https://starzonmyarmz.github.io/daily-inspirational-word/). The website is built with all vanilla tech and doesn’t use any frameworks! It’s nice and lean, and its footprint is only 6.5 KB.

Inspirational words

While I’m not a huge fan of AI, I did leverage ChatGPT to obtain 366 inspirational words. The benefit of ChatGPT was being able to get it to return the words as an array, cutting down on the tedium of converting the words I already had into one. The words are stored in their own JSON file, and I use an async/await function to pull them in and then process the data on return (see the sketch at the end of this post).

Worth the effort

I find these little projects fun and exciting because the scope is super tight, and they make for a great opportunity to learn new things. It’s definitely an overengineered solution to my problem, but it’s a much more pleasant experience. And perhaps it will serve other people as well.
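As referenced above, a minimal sketch of that fetch-and-pick approach might look like the following; the file name, element ID, and day-of-year math are assumptions rather than the site’s actual code:

// Hypothetical sketch: fetch the word list and show today's word.
async function loadWordOfTheDay() {
  const response = await fetch("./words.json"); // assumed path to the 366-word list
  const words = await response.json();

  // Day of the year (1–366), used as an index into the word list.
  const now = new Date();
  const startOfYear = new Date(now.getFullYear(), 0, 0);
  const dayOfYear = Math.floor((now - startOfYear) / 86_400_000);

  document.querySelector("#word").textContent = words[(dayOfYear - 1) % words.length];
}

loadWordOfTheDay();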

a year ago 119 votes

More in programming

first-class merges and cover letters

Although it looks really good, I have not yet tried the Jujutsu (jj) version control system, mainly because it’s not yet clearly superior to Magit. But I have been following jj discussions with great interest. One of the things that jj has not yet tackled is how to do better than git refs / branches / tags. As I understand it, jj currently has something like Mercurial bookmarks, which are more like raw git ref plumbing than a high-level porcelain feature. In particular, jj lacks signed or annotated tags, and it doesn’t have branch names that always automatically refer to the tip. This is clearly a temporary state of affairs, because jj is still incomplete and under development and these gaps are going to be filled. But the discussions have led me to think about how git’s branches are unsatisfactory, and what could be done to improve them.

Contents: branch · merge · rebase · squash · fork · cover letters · previous branch · workflow · questions

branch

One of the huge improvements in git compared to Subversion was git’s support for merges. Subversion proudly advertised its support for lightweight branches, but a branch is not very useful if you can’t merge it: an un-mergeable branch is not a tool you can use to help with work-in-progress development. The point of this anecdote is to illustrate that rather than trying to make branches better, we should try to make merges better, and branches will get better as a consequence. Let’s consider a few common workflows and how git makes them all unsatisfactory in various ways. Skip to “cover letters” and “previous branch” below, where I eventually get to the point.

merge

A basic merge workflow is:

- create a feature branch
- hack, hack, review, hack, approve
- merge back to the trunk

The main problem is that when it comes to the merge, there may be conflicts due to concurrent work on the trunk. Git encourages you to resolve conflicts while creating the merge commit, which tends to bypass the normal review process. Git also gives you an ugly, useless canned commit message for merges, which hides what you did to resolve the conflicts. If the feature branch is a linear record of the work, then it can be cluttered with commits that address comments from reviewers and fix mistakes. Some people like an accurate record of the history, but others prefer the repository to contain clean logical changes that will make sense in years to come, keeping the clutter in the code review system.

rebase

A rebase-oriented workflow deals with the problems of the merge workflow but introduces new problems. Primarily, rebasing is intended to produce a tidy, logical commit history. And when a feature branch is rebased onto the trunk before it is merged, a simple fast-forward check makes it trivial to verify that the merge will be clean (whether it uses a separate merge commit or directly fast-forwards the trunk). However, it’s hard to compare the state of the feature branch before and after the rebase. The current and previous tips of the branch (amongst other clutter) are recorded in the reflog of the person who did the rebase, but they can’t share their reflog. A force-push erases the previous branch from the server. Git forges sometimes make it possible to compare a branch before and after a rebase, but it’s usually very inconvenient, which makes it hard to see whether review comments have been addressed. And a reviewer can’t fetch past versions of the branch from the server to review them locally. You can mitigate these problems by adding commits in --autosquash format and delaying the rebase until just before the merge.
However, that reintroduces the problem of merge conflicts: if the autosquash doesn’t apply cleanly, the branch should have another round of review to make sure the conflicts were resolved OK.

squash

When the trunk consists of a sequence of merge commits, the --first-parent log is very uninformative. A common way to make the history of the trunk more informative, and to deal with the problems of cluttered feature branches and poor rebase support, is to squash the feature branch into a single commit on the trunk instead of merging it. This encourages merge requests to be roughly the size of one commit, which is arguably a good thing. However, it can be uncomfortably confining for larger features, or cause extra busy-work co-ordinating changes across multiple merge requests. And squashed feature branches have the same merge conflict problem as rebase --autosquash.

fork

Feature branches can’t always be short-lived. In the past I have maintained local hacks that were used in production but were not (not yet?) suitable to submit upstream. I have tried keeping a stack of these local patches on a git branch that gets rebased onto each upstream release. With this setup, the problem of reviewing successive versions of a merge request becomes the bigger problem of keeping track of how the stack of patches evolved over longer periods of time.

cover letters

Cover letters are common in the email patch workflow that predates git, and they are supported by git format-patch. GitHub and other forges have a webby version of the cover letter: the message that starts off a pull request or merge request. In git, cover letters are second-class citizens: they aren’t stored in the repository. But many of the problems I outlined above have neat solutions if cover letters become first-class citizens, with a Jujutsu twist.

A first-class cover letter starts off as a prototype for a merge request, and becomes the eventual merge commit. Instead of unhelpful auto-generated merge commits, you get helpful and informative messages. No extra work is needed, since we’re already writing cover letters. Good merge commit messages make good --first-parent logs.

The cover letter’s subject line works as a branch name. No more need to invent filename-compatible branch names! Jujutsu doesn’t make you name branches, giving them random names instead. It shows the subject line of the topmost commit as a reminder of what the branch is for. If there’s an explicit cover letter, its subject line will be a better summary of the branch as a whole. I often find the last commit on a branch is some post-feature cleanup, and that kind of commit has a subject line that is never a good summary of its feature branch.

As a prototype for the merge commit, the cover letter can contain the resolution of all the merge conflicts in a way that can be shared and reviewed. In Jujutsu, where conflicts are first class, the cover letter commit can contain unresolved conflicts: you don’t have to clean them up when creating the merge; you can leave that job until later. If you can share a prototype of your merge commit, then it becomes possible for your collaborators to review any merge conflicts and how you resolved them.

To distinguish a cover letter from a merge commit object, a cover letter object has a “target” header, which is a special kind of parent header. A cover letter also has a normal parent commit header that refers to earlier commits in the feature branch. The target is what will become the first parent of the eventual merge commit.
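To make that concrete, a first-class cover letter object might look something like this; the layout mimics a git commit object, but the hashes and author are placeholders, and the “target” header is the proposal above, not an existing git feature:

tree 8f3c1e...          (snapshot of the work in progress)
parent 41d2b7...        (tip of the feature branch)
target 7acd90...        (commit on the trunk that the branch will merge into)
author A. Contributor <contributor@example.com>

    Subject line summarizing the whole branch (doubles as the branch name)

    Body of the cover letter: why the change is being made, review notes,
    and how any merge conflicts were resolved. This text becomes the merge
    commit message when the branch lands on the trunk.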
previous branch

The other ingredient is to add a “previous branch” header, another special kind of parent commit header. The previous branch header refers to an older version of the cover letter and, transitively, an older version of the whole feature branch. Typically the previous branch header will match the last shared version of the branch, i.e. the commit hash of the server’s copy of the feature branch.

The previous branch header isn’t changed during normal work on the feature branch. As the branch is revised and rebased, the commit hash of the cover letter will change fairly frequently. These changes are recorded in git’s reflog or jj’s oplog, but not in the “previous branch” chain. You can use the previous branch chain to examine diffs between versions of the feature branch as a whole. If commits have Gerrit-style or jj-style change-IDs, then it’s fairly easy to find and compare previous versions of an individual commit. The previous branch header supports interdiff code review, and allows you to retain past iterations of a patch series.

workflow

Here are some sketchy notes on how these features might work in practice.

One way to use cover letters is jj-style, where it’s convenient to edit commits that aren’t at the tip of a branch, and easy to reshuffle commits so that a branch has a deliberate narrative.

When you create a new feature branch, it starts off as an empty cover letter with both target and parent pointing at the same commit. Alternatively, you might start a branch ad hoc and later cap it with a cover letter. If this is a small change and rebase + fast-forward is allowed, you can edit the “cover letter” to contain the whole change. Otherwise, you can hack on the branch any which way. Shuffle the commits that should be part of the merge request so that they occur before the cover letter, and edit the cover letter to summarize the preceding commits.

When you first push the branch, there’s (still) no need to give it a name: the server can see that this is (probably) going to be a new merge request because the top commit has a target branch and its change-ID doesn’t match an existing merge request. Also, when you push, your client automatically creates a new instance of your cover letter, adding a “previous branch” header to indicate that the old version was shared. The commits on the branch that were pushed are now immutable; rebases and edits affect the new version of the branch.

During review there will typically be multiple iterations of the branch to address feedback. The chain of previous branch headers allows reviewers to see how commits were changed to address feedback, interdiff style. The branch can be merged when the target header matches the current trunk and there are no conflicts left to resolve.

When the time comes to merge the branch, there are several options:

- For a merge workflow, the cover letter is used to make a new commit on the trunk, changing the target header into the first parent commit and dropping the previous branch header. Or, if you like to preserve more history, the previous branch chain can be retained.
- Or you can drop the cover letter and fast-forward the branch on to the trunk.
- Or you can squash the branch on to the trunk, using the cover letter as the commit message.

questions

This is a fairly rough idea: I’m sure that some of the details won’t work in practice without a lot of careful work on compatibility and deployability.

- Do the new commit headers (“target” and “previous branch”) need to be headers? What are the compatibility issues with adding new headers that refer to other commits?
- How would a server handle a push of an unnamed branch? How could someone else pull a copy of it?
- How feasible is it to use cover letter subject lines instead of branch names?
- The previous branch header is doing a similar job to a remote tracking branch. Is there an opportunity to simplify how we keep a local cache of the server state?

Despite all that, I think something along these lines could make branches / reviews / reworks / merges less awkward. How you merge should be a matter of your project’s preferred style, without interference from technical limitations that force you to trade off one annoyance against another. There remains a non-technical limitation: I have assumed that contributors are comfortable enough with version control to use a history-editing workflow effectively. I’ve lost all perspective on how hard this is for a newbie to learn; I expect (or hope?) jj makes it much easier than git rebase.

15 hours ago 5 votes
Performant Full-Disk Encryption on a Raspberry Pi, but Foiled by Twisty UARTs

In my post yesterday (“ARM is great, ARM is terrible (and so is RISC-V)”), I described my desire to find ARM hardware with AES instructions to support full-disk encryption, and the poor state of the OS ecosystem around the newer ARM boards. I was anticipating buying either a newer ARM SBC or an x86 mini … Continue reading Performant Full-Disk Encryption on a Raspberry Pi, but Foiled by Twisty UARTs →

2 hours ago 2 votes
Words are not violence

Debates, at their finest, are about exploring topics together in search of truth. That probably sounds hopelessly idealistic to anyone who’s ever perused a comment section on the internet, but ideals are there to remind us of what’s possible, to inspire us to reach higher — even if reality falls short.

I’ve been reaching for those debating ideals for thirty years on the internet. I’ve argued with tens of thousands of people, first on Usenet, then in blog comments, then Twitter, now X, and also LinkedIn — as well as a million other places that have come and gone. It’s mostly been about technology, but occasionally about society and morality too.

There have been plenty of heated moments during those three decades. It doesn’t take much for a debate between strangers on the internet to escalate into something far lower than a "search for truth", and I’ve often felt willing to settle for just a cordial tone! But for the majority of that time, I never felt like things might escalate beyond the keyboards and into the real world.

That was until we had our big blow-up at 37signals back in 2021. I suddenly got to see a different darkness from the most vile corners of the internet. Heard from those who seem to prowl for a mob-sanctioned opportunity to threaten and intimidate those they disagree with. It fundamentally changed me. But I used the experience as a mirror to reflect on the ways my own engagement with the arguments occasionally felt too sharp, too personal. And I’ve since tried to refocus way more of my efforts on the positive and the productive. I’m by no means perfect, and the internet often tempts the worst in us, but I resist better now than I did then.

What I cannot come to terms with, though, is the modern equation of words with violence. The growing sense of permission that if the disagreement runs deep enough, then violence is a justified answer to settle it. That sounds so obvious that we shouldn’t need to state it in a civil society, but clearly it is not. Not even in technology. Not even in programming. There are plenty of factions here who’ve taken to justifying their violent fantasies by referring to their ideological opponents as "nazis", "fascists", or "racists", and then following that up with a call to "punch a nazi" or worse. When you hear something like that often enough, it’s easy to grow glib about it. That it’s just a saying. They don’t mean it. But I’m afraid many of them really do.

Which brings us to Charlie Kirk. And the technologists who named drinks at their bar after his mortal wound just hours after his death, to name but one of the many morbid celebrations of the famous conservative debater’s death. It’s sickening. Deeply, profoundly sickening. And my first instinct was exactly what such people would delight in: to watch the rest of us recoil, then retract, and perhaps even eject. To leave the internet for a while or forever.

But I can’t do that. We shouldn’t do that. Instead, we should double down on the opposite. Continue to show up with our ideals held high while we debate strangers in that noble search for the truth. Where we share our excitement, our enthusiasm, and our love of technology, country, and humanity.

I think that’s what Charlie Kirk did so well. Continued to show up for the debate. Even on hostile territory. Not because he thought he was ever going to convince everyone, but because he knew he’d always reach some with a good argument, a good insight, or at least a different perspective. You could agree or not. Counter or be quiet.
But the earnest exploration of the topics in a live exchange with another human is as fundamental to our civilization as Socrates himself. Don't give up, don't give in. Keep debating.

30 minutes ago 1 votes
ARM is great, ARM is terrible (and so is RISC-V)

I’ve long been interested in new and different platforms. I ran Debian on an Alpha back in the late 1990s and was part of the Alpha port team; then I helped bootstrap Debian on amd64. I’ve got somewhere around 8 Raspberry Pi devices in active use right now, and the free NNCPNET Internet email service … Continue reading ARM is great, ARM is terrible (and so is RISC-V) →

yesterday 3 votes
Many Hard Leetcode Problems are Easy Constraint Problems

In my first interview out of college I was asked the change counter problem: Given a set of coin denominations, find the minimum number of coins required to make change for a given number. I.e., for US coinage and 37 cents, the minimum number is four (quarter, dime, 2 pennies). I implemented the simple greedy algorithm and immediately fell into the trap of the question: the greedy algorithm only works for "well-behaved" denominations. If the coin values were [10, 9, 1], then making 37 cents would take 10 coins in the greedy algorithm but only 4 coins optimally (10+9+9+9). The "smart" answer is to use a dynamic programming algorithm, which I didn’t know how to do. So I failed the interview.

But you only need dynamic programming if you’re writing your own algorithm. It’s really easy if you throw it into a constraint solver like MiniZinc and call it a day.

int: total;
array[int] of int: values = [10, 9, 1];
array[index_set(values)] of var 0..: coins;
constraint sum (c in index_set(coins)) (coins[c] * values[c]) == total;
solve minimize sum(coins);

You can try this online here. It’ll give you a prompt to put in total and then give you successively better solutions:

coins = [0, 0, 37];
----------
coins = [0, 1, 28];
----------
coins = [0, 2, 19];
----------
coins = [0, 3, 10];
----------
coins = [0, 4, 1];
----------
coins = [1, 3, 0];
----------

Lots of similar interview questions are this kind of mathematical optimization problem, where we have to find the maximum or minimum of a function subject to constraints. They’re hard in programming languages because programming languages are too low-level. They are also exactly the problems that constraint solvers were designed to solve. Hard leetcode problems are easy constraint problems. [1] Here I’m using MiniZinc, but you could just as easily use Z3 or OR-Tools or whatever your favorite generalized solver is.

More examples

This was a question in a different interview (which I thankfully passed): Given a list of stock prices through the day, find the maximum profit you can get by buying one stock and selling one stock later. It’s easy to do in O(n^2) time, or, if you are clever, you can do it in O(n). Or you could be not clever at all and just write it as a constraint problem:

array[int] of int: prices = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8];
var int: buy;
var int: sell;
var int: profit = prices[sell] - prices[buy];
constraint sell > buy;
constraint profit > 0;
solve maximize profit;

Reminder, link to trying it online here. While working at that job, one interview question we tested out was: Given a list, determine if three numbers in that list can be added or subtracted to give 0. This is a satisfaction problem, not an optimization problem: we don’t need the "best" answer, any answer will do. We eventually decided against it for being too tricky for the engineers we were targeting. But it’s not tricky in a solver:

include "globals.mzn";
array[int] of int: numbers = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8];
array[index_set(numbers)] of var {0, -1, 1}: choices;
constraint sum(n in index_set(numbers)) (numbers[n] * choices[n]) = 0;
constraint count(choices, -1) + count(choices, 1) = 3;
solve satisfy;

Okay, one last one, a problem I saw last year at Chipy AlgoSIG. Basically they pick some leetcode problems and we all do them. I failed to solve this one: Given an array of integers heights representing the histogram’s bar heights, where the width of each bar is 1, return the area of the largest rectangle in the histogram. The "proper" solution is a tricky thing involving tracking lots of bookkeeping state, which you can completely bypass by expressing it as constraints:

array[int] of int: numbers = [2,1,5,6,2,3];
var 1..length(numbers): x;
var 1..length(numbers): dx;
var 1..: y;
constraint x + dx <= length(numbers);
constraint forall (i in x..(x+dx)) (y <= numbers[i]);
var int: area = (dx+1)*y;
solve maximize area;
output ["(\(x)->\(x+dx))*\(y) = \(area)"]

There’s even a way to automatically visualize the solution (using vis_geost_2d), but I didn’t feel like figuring it out in time for the newsletter.

Is this better?

Now, if I actually brought these questions to an interview, the interviewee could ruin my day by asking "what’s the runtime complexity?" Constraint solver runtimes are unpredictable and almost always worse than an ideal bespoke algorithm, because solvers are more expressive, in what I refer to as the capability/tractability tradeoff. But even so, they’ll do way better than a bad bespoke algorithm, and I’m not experienced enough in handwriting algorithms to consistently beat a solver.

The real advantage of solvers, though, is how well they handle new constraints. Take the stock picking problem above. I can write the O(n²) algorithm in a few minutes, and the O(n) algorithm if you give me some time to think. Now change the problem to: Maximize the profit by buying and selling up to max_sales stocks, but you can only buy or sell one stock at a given time and you can only hold up to max_hold stocks at a time. That’s a way harder problem to write even an inefficient algorithm for! While the constraint problem is only a tiny bit more complicated:

include "globals.mzn";
int: max_sales = 3;
int: max_hold = 2;
array[int] of int: prices = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8];
array [1..max_sales] of var int: buy;
array [1..max_sales] of var int: sell;
array [index_set(prices)] of var 0..max_hold: stocks_held;
var int: profit = sum(s in 1..max_sales) (prices[sell[s]] - prices[buy[s]]);
constraint forall (s in 1..max_sales) (sell[s] > buy[s]);
constraint profit > 0;
constraint forall(i in index_set(prices)) (stocks_held[i] = (count(s in 1..max_sales) (buy[s] <= i) - count(s in 1..max_sales) (sell[s] <= i)));
constraint alldifferent(buy ++ sell);
solve maximize profit;
output ["buy at \(buy)\n", "sell at \(sell)\n", "for \(profit)"];

Most constraint solving examples online are puzzles, like Sudoku or "SEND + MORE = MONEY". Solving leetcode problems would be a more interesting demonstration. And you get more interesting opportunities to teach optimizations, like symmetry breaking.

[1] Because my dad will email me if I don’t explain this: "leetcode" is slang for "tricky algorithmic interview questions that have little-to-no relevance in the actual job you’re interviewing for." It’s from leetcode.com. ↩

yesterday 4 votes