More from alexwlchan
Consider the following JSON object:

    { "sides": 4, "colour": "red", "sides": 5, "colour": "blue" }

Notice that sides and colour both appear twice. This looks invalid, but I learnt recently that this is actually legal JSON syntax! It’s unusual and discouraged, but it’s not completely forbidden.

This was a big surprise to me. I think of JSON objects as key/value pairs, and I associate them with data structures like a dict in Python or a Hash in Ruby – both of which only allow unique keys. JSON has no such restriction, and I started thinking about how to handle it.

What does the JSON spec say about duplicate names?

JSON is described by several standards, which Wikipedia helpfully explains for us:

    After RFC 4627 had been available as its “informational” specification since 2006, JSON was first standardized in 2013, as ECMA‑404. RFC 8259, published in 2017, is the current version of the Internet Standard STD 90, and it remains consistent with ECMA‑404. That same year, JSON was also standardized as ISO/IEC 21778:2017.

The ECMA and ISO/IEC standards describe only the allowed syntax, whereas the RFC covers some security and interoperability considerations. All three of these standards explicitly allow the use of duplicate names in objects.

ECMA‑404 and ISO/IEC 21778:2017 have identical text to describe the syntax of JSON objects, and they say (emphasis mine):

    An object structure is represented as a pair of curly bracket tokens surrounding zero or more name/value pairs. […] The JSON syntax does not impose any restrictions on the strings used as names, does not require that name strings be unique, and does not assign any significance to the ordering of name/value pairs. These are all semantic considerations that may be defined by JSON processors or in specifications defining specific uses of JSON for data interchange.

RFC 8259 goes further and strongly recommends against duplicate names, but the use of SHOULD means it isn’t completely forbidden:

    The names within an object SHOULD be unique.

The same document warns about the consequences of ignoring this recommendation:

    An object whose names are all unique is interoperable in the sense that all software implementations receiving that object will agree on the name-value mappings. When the names within an object are not unique, the behavior of software that receives such an object is unpredictable. Many implementations report the last name/value pair only. Other implementations report an error or fail to parse the object, and some implementations report all of the name/value pairs, including duplicates.

So it’s technically valid, but it’s unusual and discouraged. I’ve never heard of a use case for JSON objects with duplicate names. I’m sure there was a good reason for it being allowed by the spec, but I can’t think of it.

Most JSON parsers – including jq, JavaScript, and Python – will silently discard all but the last instance of a duplicate name. Here’s an example in Python:

    >>> import json
    >>> json.loads('{"sides": 4, "colour": "red", "sides": 5, "colour": "blue"}')
    {'colour': 'blue', 'sides': 5}

What if I wanted to decode the whole object, or throw an exception if I see duplicate names?

This happened to me recently. I was editing a JSON file by hand, and I’d copy/paste objects to update the data. I also had scripts which could update the file. I forgot to update the name on one of the JSON objects, so there were two name/value pairs with the same name. When I ran the script, it silently erased the first value.
I was able to recover the deleted value from the Git history, but I wondered how I could prevent this happening again. How could I make the script fail, rather than silently delete data?

Decoding duplicate names in Python

When Python decodes a JSON object, it first parses the object as a list of name/value pairs, then it turns that list of name/value pairs into a dictionary.

We can see this by looking at the JSONObject function in the CPython source code: it builds a list pairs, and at the end of the function, it calls dict(pairs) to turn the list into a dictionary. This relies on the fact that dict() can take an iterable of key/value tuples and create a dictionary:

    >>> dict([('sides', 4), ('colour', 'red')])
    {'colour': 'red', 'sides': 4}

The docs for dict() tell us that it will discard duplicate keys: “if a key occurs more than once, the last value for that key becomes the corresponding value in the new dictionary”.

    >>> dict([('sides', 4), ('colour', 'red'), ('sides', 5), ('colour', 'blue')])
    {'colour': 'blue', 'sides': 5}

We can customise what Python does with the list of name/value pairs. Rather than calling dict(), we can pass our own function to the object_pairs_hook parameter of json.loads(), and Python will call that function on the list of pairs. This allows us to parse objects in a different way.

For example, we can just return the literal list of name/value pairs:

    >>> import json
    >>> json.loads(
    ...     '{"sides": 4, "colour": "red", "sides": 5, "colour": "blue"}',
    ...     object_pairs_hook=lambda pairs: pairs
    ... )
    [('sides', 4), ('colour', 'red'), ('sides', 5), ('colour', 'blue')]

We could also use the multidict library to get a dict-like data structure which supports multiple values per key. This is based on HTTP headers and URL query strings, two environments where it’s common to have multiple values for a single key:

    >>> from multidict import MultiDict
    >>> md = json.loads(
    ...     '{"sides": 4, "colour": "red", "sides": 5, "colour": "blue"}',
    ...     object_pairs_hook=lambda pairs: MultiDict(pairs)
    ... )
    >>> md
    <MultiDict('sides': 4, 'colour': 'red', 'sides': 5, 'colour': 'blue')>
    >>> md['sides']
    4
    >>> md.getall('sides')
    [4, 5]

Preventing silent data loss

If we want to throw an exception when we see duplicate names, we need a longer function. Here’s the code I wrote:

    import collections
    import typing


    def dict_with_unique_names(
        pairs: list[tuple[str, typing.Any]]
    ) -> dict[str, typing.Any]:
        """
        Convert a list of name/value pairs to a dict, but only if the
        names are unique.

        If there are non-unique names, this function throws a ValueError.
        """
        # First try to parse the object as a dictionary; if it's the same
        # length as the pairs, then we know all the names were unique and
        # we can return immediately.
        pairs_as_dict = dict(pairs)

        if len(pairs_as_dict) == len(pairs):
            return pairs_as_dict

        # Otherwise, let's work out what the repeated name(s) were, so we
        # can throw an appropriate error message for the user.
        name_tally = collections.Counter(n for n, _ in pairs)

        repeated_names = [n for n, count in name_tally.items() if count > 1]
        assert len(repeated_names) > 0

        if len(repeated_names) == 1:
            raise ValueError(
                f"Found repeated name in JSON object: {repeated_names[0]}"
            )
        else:
            raise ValueError(
                f"Found repeated names in JSON object: {', '.join(repeated_names)}"
            )

If I use this as my object_pairs_hook when parsing an object which has all unique names, it returns the normal dict I’d expect:

    >>> json.loads(
    ...     '{"sides": 4, "colour": "red"}',
    ...     object_pairs_hook=dict_with_unique_names
    ... )
    {'colour': 'red', 'sides': 4}

But if I’m parsing an object with one or more repeated names, the parsing fails and throws a ValueError:

    >>> json.loads(
    ...     '{"sides": 4, "colour": "red", "sides": 5}',
    ...     object_pairs_hook=dict_with_unique_names
    ... )
    Traceback (most recent call last):
    […]
    ValueError: Found repeated name in JSON object: sides

    >>> json.loads(
    ...     '{"sides": 4, "colour": "red", "sides": 5, "colour": "blue"}',
    ...     object_pairs_hook=dict_with_unique_names
    ... )
    Traceback (most recent call last):
    […]
    ValueError: Found repeated names in JSON object: sides, colour

This is precisely the behaviour I want – throwing an exception, not silently dropping data.

Encoding non-unique names in Python

It’s hard to think of a use case, but this post feels incomplete without at least a brief mention. If you want to encode custom data structures with Python’s JSON library, you can subclass JSONEncoder and define how those structures should be serialised. Here’s a rudimentary attempt at doing that for a MultiDict:

    class MultiDictEncoder(json.JSONEncoder):

        def encode(self, o: typing.Any) -> str:
            # If this is a MultiDict, we need to construct the JSON string
            # manually -- first encode each name/value pair, then construct
            # the JSON object literal.
            if isinstance(o, MultiDict):
                name_value_pairs = [
                    f'{super().encode(str(name))}: {self.encode(value)}'
                    for name, value in o.items()
                ]

                return '{' + ', '.join(name_value_pairs) + '}'

            return super().encode(o)

and here’s how you use it:

    >>> md = MultiDict([('sides', 4), ('colour', 'red'), ('sides', 5), ('colour', 'blue')])
    >>> json.dumps(md, cls=MultiDictEncoder)
    '{"sides": 4, "colour": "red", "sides": 5, "colour": "blue"}'

This is rough code, and you shouldn’t use it – it’s only an example. I’m constructing the JSON string manually, so it doesn’t handle edge cases like indentation or special characters. There are almost certainly bugs, and you’d need to be more careful if you wanted to use this for real.

In practice, if I had to encode a multi-dict as JSON, I’d encode it as a list of objects which each have a key and a value field. For example:

    [
      {"key": "sides",  "value": 4},
      {"key": "colour", "value": "red"},
      {"key": "sides",  "value": 5},
      {"key": "colour", "value": "blue"}
    ]

This is a pretty standard pattern, and it won’t trip up JSON parsers which aren’t expecting duplicate names.

Do you need to worry about this?

This isn’t a big deal. JSON objects with duplicate names are pretty unusual – this is the first time I’ve ever encountered one, and it was a mistake. Trying to account for this edge case in every project that uses JSON would be overkill. It would add complexity to my code and probably never catch a single error.

This started when I made a copy/paste error that introduced the initial duplication, and then a script modified the JSON file and caused some data loss. That’s a somewhat unusual workflow, because most JSON files are modified exclusively by computers, where this wouldn’t be an issue.

I’ve added this error handling to my javascript-data-files library, but I don’t anticipate adding it to other projects. I use that library for my static website archives, which is where I had this issue.

Although I won’t use this code exactly, it’s been good practice at writing custom encoders/decoders in Python. That is something I do all the time – I’m often encoding native Python types as JSON, and I want to get the same type back when I decode later. I’ve been writing my own subclasses of JSONEncoder and JSONDecoder for a while.
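As a small, simplified illustration of that kind of round-trip – this sketch is not from the original post, and the DatetimeEncoder and as_datetime names are just illustrative (it also uses object_hook rather than a full JSONDecoder subclass, for brevity) – you can tag datetime values on the way out and rebuild them on the way in:

    import datetime
    import json
    import typing


    class DatetimeEncoder(json.JSONEncoder):
        """Encode datetime values as a tagged JSON object."""

        def default(self, o: typing.Any) -> typing.Any:
            if isinstance(o, datetime.datetime):
                return {"__datetime__": o.isoformat()}
            return super().default(o)


    def as_datetime(d: dict) -> typing.Any:
        """Turn tagged objects back into datetime values while decoding."""
        if d.keys() == {"__datetime__"}:
            return datetime.datetime.fromisoformat(d["__datetime__"])
        return d

Used together, the two halves give you the same Python type back after a trip through JSON:

    >>> s = json.dumps({"posted_at": datetime.datetime(2025, 4, 15, 19, 45)}, cls=DatetimeEncoder)
    >>> s
    '{"posted_at": {"__datetime__": "2025-04-15T19:45:00"}}'
    >>> json.loads(s, object_hook=as_datetime)
    {'posted_at': datetime.datetime(2025, 4, 15, 19, 45)}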
Now I know a bit more about how Python decodes JSON, and object_pairs_hook is another tool I can consider using. This was a fun deep dive for me, and I hope you found it helpful too.
I store a lot of data in SQLite databases on remote servers, and I often want to copy them to my local machine for analysis or backup. When I’m starting a new project and the database is near-empty, this is a simple rsync operation:

    $ rsync --progress username@server:my_remote_database.db my_local_database.db

As the project matures and the database grows, this gets slower and less reliable. Downloading a 250MB database from my web server takes about a minute over my home Internet connection, and that’s pretty small – most of my databases are multiple gigabytes in size.

I’ve been trying to make these copies go faster, and I recently discovered a neat trick.

What really slows me down is my indexes. I have a lot of indexes in my SQLite databases, which dramatically speed up my queries, but also make the database file larger and slower to copy. (In one database, there’s an index which single-handedly accounts for half the size on disk!)

The indexes don’t store anything unique – they just duplicate data from other tables to make queries faster. Copying the indexes makes the transfer less efficient, because I’m copying the same data multiple times. I was thinking about ways to skip copying the indexes, and I realised that SQLite has built-in tools to make this easy.

Dumping a database as a text file

SQLite allows you to dump a database as a text file. If you use the .dump command, it prints the entire database as a series of SQL statements. This text file can often be significantly smaller than the original database.

Here’s the command:

    $ sqlite3 my_database.db .dump > my_database.db.txt

And here’s what the beginning of that file looks like:

    PRAGMA foreign_keys=OFF;
    BEGIN TRANSACTION;
    CREATE TABLE IF NOT EXISTS "tags" (
      [name] TEXT PRIMARY KEY,
      [count_uses] INTEGER NOT NULL
    );
    INSERT INTO tags VALUES('carving',260);
    INSERT INTO tags VALUES('grass',743);
    …

Crucially, this reduces the large and disk-heavy indexes into a single line of text – it’s an instruction to create an index, not the index itself:

    CREATE INDEX [idx_photo_locations] ON [photos] ([longitude], [latitude]);

This means that I’m only storing each value once, rather than the many times it may be stored across the original table and my indexes. This is how the text file can be smaller than the original database.

If you want to reconstruct the database, you pipe this text file back to SQLite:

    $ cat my_database.db.txt | sqlite3 my_reconstructed_database.db

Because the SQL statements are very repetitive, this text responds well to compression:

    $ sqlite3 explorer.db .dump | gzip -c > explorer.db.txt.gz

To give you an idea of the potential savings, here’s the relative disk size for one of my databases:

    File                                                             Size on disk
    original SQLite database                                         3.4 GB
    text file (sqlite3 my_database.db .dump)                         1.3 GB
    gzip-compressed text (sqlite3 my_database.db .dump | gzip -c)    240 MB

The gzip-compressed text file is 14× smaller than the original SQLite database – that makes downloading the database much faster.

My new ssh+rsync command

Rather than copying the database directly, now I create a gzip-compressed text file on the server, copy that to my local machine, and reconstruct the database.
Like so:

    # Create a gzip-compressed text file on the server
    ssh username@server "sqlite3 my_remote_database.db .dump | gzip -c > my_remote_database.db.txt.gz"

    # Copy the gzip-compressed text file to my local machine
    rsync --progress username@server:my_remote_database.db.txt.gz my_local_database.db.txt.gz

    # Remove the gzip-compressed text file from my server
    ssh username@server "rm my_remote_database.db.txt.gz"

    # Uncompress the text file
    gunzip my_local_database.db.txt.gz

    # Reconstruct the database from the text file
    cat my_local_database.db.txt | sqlite3 my_local_database.db

    # Remove the local text file
    rm my_local_database.db.txt

A database dump is a stable copy source

This approach fixes another issue I’ve had when copying SQLite databases. If it takes a long time to copy a database and it gets updated midway through, rsync may give me an invalid database file. The first half of the file is pre-update, the second half is post-update, and they don’t match. When I try to open the database locally, I get an error:

    database disk image is malformed

By creating a text dump before I start the copy operation, I’m giving rsync a stable copy source. That text dump isn’t going to change midway through the copy, so I’ll always get a complete and consistent text file.

This approach has saved me hours when working with large databases, and made my downloads both faster and more reliable. If you have to copy around large SQLite databases, give it a try.
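If you’d rather produce the dump from Python instead of the sqlite3 command-line tool, the standard library can do the same thing: Connection.iterdump() yields the database as a series of SQL statements, much like the .dump command. Here’s a minimal sketch, not from the original post – the function name and file paths are just placeholders:

    import gzip
    import sqlite3
    from contextlib import closing


    def dump_database(db_path: str, dest_path: str) -> None:
        """Write a gzip-compressed SQL dump of an SQLite database."""
        with closing(sqlite3.connect(db_path)) as conn:
            with gzip.open(dest_path, "wt", encoding="utf-8") as out:
                # iterdump() yields SQL statements equivalent to the
                # .dump command, so the output can be piped back into sqlite3.
                for statement in conn.iterdump():
                    out.write(statement + "\n")


    dump_database("my_database.db", "my_database.db.txt.gz")

Reconstructing the database from that file is the same as before: uncompress it and pipe the text into sqlite3.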
I support dark mode on this site, and as part of the dark theme, I have a colour-inverted copy of the default background texture. I like giving my website a subtle bit of texture, which I think makes it stand out from a web which is mostly solid-colour backgrounds. Both my textures are based on the “White Waves” pattern made by Stas Pimenov.

I was setting these images as my background with two CSS rules, using the prefers-color-scheme: dark media feature to use the alternate image in dark mode:

    body {
      background: url('https://alexwlchan.net/theme/white-waves-transparent.png');
    }

    @media (prefers-color-scheme: dark) {
      body {
        background: url('https://alexwlchan.net/theme/black-waves-transparent.png');
      }
    }

This works, mostly. But I prefer light mode, so while I wrote this CSS and I do some brief testing whenever I make changes, I’m not using the site in dark mode. I know how dark mode works in my local development environment, not how it feels as a day-to-day user.

Late last night I was using my phone in dark mode to avoid waking the other people in the house, and I opened my site. I saw a brief flash of white, and then the dark background texture appeared. That flash of bright white is precisely what you don’t want when you’re using dark mode, but it happened anyway. I made a note to work it out in the morning, then I went to bed.

Now I’m fully awake, it’s obvious what happened. Because my only background is the image URL, there’s a brief gap between the CSS being parsed and the background image being loaded. In that time, the browser doesn’t have anything to put in the background, so you just get pure white.

This was briefly annoying in the moment, but it would be even worse if the background texture never loaded. I have light text on black in dark mode, but without the background image it’s just light text on white, which is barely readable.

I never noticed this in local development, because I’m usually working in a well-lit room where that white flash would be far less obvious. I’m also using a local version of the site, which loads near-instantly and where the background image is almost certainly saved in my browser cache.

I’ve made two changes to prevent this happening again.

First, I’ve added a colour to use as a fallback until the image loads. The CSS background property supports adding a colour, which is used until the image loads, or as a fallback if it doesn’t. I already use this in a few places, and now I’ve added it to my body background:

    body {
      background: url('https://…/white-waves-transparent.png') #fafafa;
    }

    @media (prefers-color-scheme: dark) {
      body {
        background: url('https://…/black-waves-transparent.png') #0d0d0d;
      }
    }

This avoids the flash of unstyled background before the image loads – the browser will use a solid dark background until it gets the texture.

Second, I’ve added rel="preload" elements to the head of the page, so the browser will start loading the background textures faster. These elements are a clue to the browser that these resources are going to be useful when it renders the page, so it should start loading them as soon as possible:

    <link
      rel="preload"
      href="https://alexwlchan.net/theme/white-waves-transparent.png"
      as="image"
      type="image/png"
      media="(prefers-color-scheme: light)"
    />

    <link
      rel="preload"
      href="https://alexwlchan.net/theme/black-waves-transparent.png"
      as="image"
      type="image/png"
      media="(prefers-color-scheme: dark)"
    />

This means the browser is downloading the appropriate texture at the same time as it’s downloading the CSS file.
Previously it had to download the CSS file, parse it, and only then would it know to start downloading the texture. With the preload, it’s a bit faster!

The difference is probably imperceptible if you’re on a fast connection, but it’s a small win and I can’t see any downside (as long as I scope the preload correctly, and don’t preload resources I don’t end up using). I’ve seen a lot of sites using <link rel="preload"> and I’ve only half-understood what it is and why it’s useful – I’m glad to have a chance to use it myself, so I can understand it better.

This bug reminds me of a phenomenon called flash of unstyled text. Back when custom fonts were fairly new, you’d often see web pages appear briefly with the default font before custom fonts finished loading. There are well-understood techniques for preventing this, so it’s unusual to see that brief unstyled text on modern web pages – but the same issue is affecting me in dark mode. I avoided using custom fonts on the web to avoid tackling this issue, but it got me anyway!

In these dark times for the web, old bugs are new again.
I’ve been writing some internal dashboards recently, and one hard part is displaying timestamps. Our server does everything in UTC, but the team is split across four different timezones, so the server timestamps aren’t always easy to read.

For most people, it’s harder to understand a UTC timestamp than a timestamp in your local timezone. Did that event happen just now, an hour ago, or much further back? Was that at the beginning of your working day? Or at the end?

Then I remembered that I tried to solve this five years ago at a previous job. I wrote a JavaScript snippet that converts UTC timestamps into human-friendly text. It displays times in your local time zone, and adds a short suffix if the time happened recently. For example:

    today @ 12:00 BST (1 hour ago)

In my old project, I was writing timestamps in a <div> and I had to opt into the human-readable text for every date on the page. It worked, but it was a bit fiddly.

Doing it again, I thought of a more elegant solution. HTML has a <time> element for expressing datetimes, which is a more meaningful wrapper than a <div>. When I render the dashboard on the server, I don’t know the user’s timezone, so I include the UTC timestamp in the page like so:

    <time datetime="2025-04-15 19:45:00Z">
      Tue, 15 Apr 2025 at 19:45 UTC
    </time>

I put a machine-readable date and time string with a timezone offset string in the datetime attribute, and then a more human-readable string in the text of the element.

Then I add this JavaScript snippet to the page:

    window.addEventListener("DOMContentLoaded", function() {
      document.querySelectorAll("time").forEach(function(timeElem) {
        // Set the `title` attribute to the original text, so a user
        // can hover over a timestamp to see the UTC time.
        timeElem.setAttribute("title", timeElem.innerText);

        // Replace the display text with a human-friendly date string
        // which is localised to the user's timezone.
        timeElem.innerText = getHumanFriendlyDateString(
          timeElem.getAttribute("datetime")
        );
      })
    });

This updates any <time> element on the page to use a human-friendly date string, which is localised to the user’s timezone. For example, I’m in the UK so that becomes:

    <time datetime="2025-04-15 19:45:00Z" title="Tue, 15 Apr 2025 at 19:45 UTC">
      Tue, 15 Apr 2025 at 20:45 BST
    </time>

In my experience, these timestamps are easier and more intuitive for people to read. I always include a timezone string (e.g. BST, EST, PDT) so it’s obvious that I’m showing a localised timestamp. If you really need the UTC timestamp, it’s in the title attribute, so you can see it by hovering over it. (Sorry, mouseless users, but I don’t think any of my team are browsing our dashboards from their phone or tablet.)

If the JavaScript doesn’t load, you see the plain old UTC timestamp. It’s not ideal, but the page still loads and you can see all the information – this behaviour is an enhancement, not essential.

To me, this is the unfulfilled promise of the <time> element. In my fantasy world, web page authors would write the time in a machine-readable format, and browsers would show it in a way that makes sense for the reader. They’d take into account their language, locale, and time zone. I understand why that hasn’t happened – it’s much easier said than done. You need so much context to know what’s the “right” thing to do when dealing with datetimes, and guessing without that context is at the heart of many datetime bugs.

These sorts of human-friendly, localised timestamps are very handy sometimes, and a complete mess at other times.
In my staff-only dashboards, I have that context. I know what these timestamps mean, who’s going to be reading them, and I think they’re a helpful addition that makes the data easier to read.
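If it helps to picture the server side, here’s a rough sketch – not from the original post, and the helper name and formats are purely illustrative – of how a Python dashboard might emit that <time> element for a UTC timestamp:

    import datetime


    def time_element(ts: datetime.datetime) -> str:
        """Render a UTC timestamp as a <time> element like the one above."""
        # Machine-readable value for the datetime attribute,
        # e.g. "2025-04-15 19:45:00Z".
        machine = ts.strftime("%Y-%m-%d %H:%M:%SZ")

        # Human-readable fallback text, shown if the JavaScript doesn't run,
        # e.g. "Tue, 15 Apr 2025 at 19:45 UTC".
        human = ts.strftime("%a, %d %b %Y at %H:%M UTC")

        return f'<time datetime="{machine}">\n  {human}\n</time>'


    print(time_element(datetime.datetime(2025, 4, 15, 19, 45)))

The JavaScript snippet above then rewrites the element text in the reader’s local timezone.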
More in programming
“Earth to Dave.” Brian rapped his knuckles lightly against Dave’s head. He snapped back into it. “This nice lady wants to know what kind of bagel you want” It was so like Brian to call the woman at the counter a nice lady. At some point, he’d watched American Pie and thought Stifler was how people should be. Or maybe he was just always like that and the movie made him think it was acceptable. He called himself a gentleman, but not in the creepy Elliot Rodger way, or in anything resembling the real meaning of the word. I think he just thought it was funny how that word got a rise out of people. Dave replied, “uhhh a cinnamon raisin…with uhh…butter…yea butter.” Even the nice lady knew Dave was high, Brian and I ordered normal bagel sandwiches and here was Dave ordering dessert. Brian paid for the bagels with his mom’s credit card and told us we could Venmo him later. His mom’s credit card and we Venmo him. That’s the type of guy he was. “He didn’t leave last night. Passed out right where he was sitting,” Brian spoke about Dave like he wasn’t there. This would happen sometimes. Like we would smoke and get high, but for Dave it was a different thing. He would smoke till he was catatonic. He was also the first one in our friend group to start smoking. Dave’s older brother killed himself when we were freshmen, and that’s around when it started. A coping strategy. It wasn’t just the loss of his brother, his mom was never the same afterward. I don’t think I saw her out of the house after that; Dave said she barely left her room. The overall downer atmosphere was too much for his sister and she moved to California as soon as she could. He was all that was left. Dave got quiet after that. Maybe it was the weed, maybe it was the pain, but he started riding his bike to school every day and doing really well in classes. His brother was some kind of wunderkind; maybe he thought if he was like that his mother would stop moping. He graduated valedictorian and she never did. Maybe that wasn’t enough? “I thought we were gonna have a party last night,” I mentioned offhandedly. Brian’s parties weren’t the Met Gala, but it was usually more than just the three of us. I was hoping Ari was going to be there. “Hoping Ari was gonna be there?” Oh God even when I say something so innocuous Brian knows what I’m thinking. “Nah, we found the Volcano and didn’t really want to text anyone else after that.” “Understandable” “I don’t know if Dave slept. When I came downstairs he had the bag inflated on top of the machine.” He ribbed Dave with his elbow while he said this. “I slept,” mumbled Dave. “It’s called wake and bake.” Dave and Brian both liked 2000s movies, stoner comedies, movies about high school and prom, getting laid. Something about it being a simpler time. They were closer than I was with either of them. Dave would go to dark places and Brian couldn’t be brought there. So it worked. Even though Brian was kind of an asshole, he was a good guy to have around. His dad was a trucker and he wasn’t around much. This is one of the things we bonded over. They called our name and we sat down at a table with the bagels. In the morning the visit from the cops didn’t really seem like much of a big deal, so I thought I’d tell the guys about it. Upon doing so, Brian immediately pointed out something I missed. He asked, “What’s your dad’s name?” and I realized right away. 
While he went by Jonathan and I went by John, it sure made a lot more sense that the cops were looking for him when they wrote “John.” Lazy cops didn’t even write his full name. He was in town, and he was on Long Island for the meeting. It wasn’t me! “Hey McFly!” Brian mocked. It was clear he’d watched Back to the Future recently and now I see where he got the knuckle rapping too. “You know you gotta stop being such a little pussy all the time.” He put on a high pitched voice that was supposed to resemble mine, “I’m a little pussy, I hide in my room when the cops come and they aren’t even looking for me they are looking for my weird ass dad.” I was too relieved to care about his mocking. Brian continued, “You know that place was into some weird shit. Where your dad worked before he bugged out. Dave’s brother worked there too.” “You mean the one who killed himself?” Immediately I regretted how I said it and looked over at Dave. He was too transfixed by the swirls of cinnamon in his bagel to register anything. Brian scolded me, “dude” and I felt embarrassed. I was so giddy from realizing the cops weren’t looking for me that this crossed over in my head to a true crime podcast, where I was more interested in the mystery than the characters, forgetting that these were real people with real lives. I felt distant enough from my father to view him that way. I don’t know how Dave felt about his brother anymore. I came back at Brian, trying to move past my faux pas. “Well at least my dad did something. Your dad doesn’t even drive the truck! He sits there and watches it drive. A real union man.” Dave now had finished his bagel and put his head down on the table. Brian smiled at the retort, “At least my dad isn’t some kind of Kaczynski freak.” This was the nature of our friendship. Last night I thought my world was closing in, now I realize this just all isn’t my problem. My dad was sort of a Kaczynski freak. Maybe he mailed bombs to people and the cops caught him. Maybe he has some lame manifesto about why he did it. I love reading manifestos. I took out my phone and venmoed Brian for my and Dave’s bagel. We’d venmo him about half the time; Brian’s mom was rich from the settlement and we knew she didn’t care. My mom still wasn’t back when I got home.
Here on a summer night in the grass and lilac smell
Drunk on the crickets and the starry sky,
Oh what fine stories we could tell
With this moonlight to tell them by.
A summer night, and you, and paradise,
So lovely and so filled with grace,
Above your head, the universe has hung its …

Continue reading Dreams of Late Summer →
The first Rails World in Amsterdam was a roaring success back in 2023. Tickets sold out in 45 minutes, the atmosphere was electric, and The Rails Foundation set a new standard for conference execution in the Ruby community. So when we decided to return to the Dutch Capital for the third edition of the conference this year, the expectations were towering.

And yet, Amanda Perino, our executive director and event organizer extraordinaire, managed to outdo herself, and produced an even better show this year. The venue we returned to was already at capacity the first time around, but Amanda managed to fit a third more attendees by literally using slimmer chairs! And I didn't hear any complaints from the folks who had to sit a little closer together in order for more people to enjoy the gathering.

The increased capacity didn't come close to satisfying the increased demand, though. This year, tickets sold out in less than two minutes. Crazy. But for the 800+ people who managed to secure a pass, I'm sure it felt worth the refresh-the-website scramble to buy a ticket.

And, as in years past, Amanda's recording crew managed to turn around post-production on my keynote in less than 24 hours, so anyone disappointed with missing out on a ticket could at least be in the loop on all the awesome new Rails stuff we were releasing up to and during the conference. Every other session was recorded too, and will soon be on the Rails YouTube channel.

You can't stream the atmosphere, the enthusiasm, and the genuine love of Ruby on Rails, though. I was once again blown away by just how many incredible people and stories we have in this ecosystem. From entrepreneurs who've built million (or billion!) dollar businesses on Rails, to programmers who've been around the framework for decades, to people who just picked it up this year. It was a thrill to meet all of them, to take hundreds of selfies, and to talk about Ruby, Rails, and the Omarchy expansion pack for hours on the hallway track!

I've basically stopped doing prepared presentations at conferences, but Rails World is the one exception. I really try my best to put on a good show, present the highlights of what we've been working on in the past year at 37signals, and transfer the never-ending enthusiasm I continue to feel for this framework, this programming language, and this ecosystem. True, I may occasionally curse that commitment in the weeks leading up to the conference, but the responsibility is always rewarded during and after the execution with a deep sense of satisfaction.

Not everyone is so lucky as I've been to find their life's work early in their career, and see it continue to blossom over the decades. I'm eternally grateful that I have. Of course, there's been ups and downs over the years — nothing is ever just a straight line of excitement up and to the right! — but we're oh-so-clearly on the up-up-up part of that curve at the moment.

I don't know whether it's just the wind or the whims, but Rails is enjoying an influx of a new generation of programmers at the moment. No doubt it helps when I get to wax poetically about Ruby for an hour with Lex Fridman in front of an audience of millions. No doubt Shopify's continued success eating the world of ecommerce helps. No doubt the stability, professionalism, and execution from The Rails Foundation is an aid. There are many auxiliary reasons why we're riding a wave at the moment, but key to it all is also that Ruby on Rails is simply really, really good!
Next year, with RailsConf finished, it's time to return to the US. Amanda has picked a great spot in Austin, we're planning to dramatically expand the capacity, but I also fully expect that demand will continue to rise, especially in the most prosperous and successful market for Rails.

Thanks again to all The Rails Foundation members who believed in the vision for a new institution back in 2022. It looks like a no-brainer to join such a venture now, given the success of Rails World and everything else, but it actually took guts to sign on back then. I approached quite a few companies at that time who could see the value, but couldn't find the courage to support our work, as our industry was still held hostage to a band of bad ideas and terrible ideologies.

All that nonsense is thankfully now long gone in the Rails world. We're enjoying a period of peak unity, excitement, progress, and determination to continue to push for end-to-end problem solving, open source, and freedom.

I can't tell you how happy it makes me feel when I hear from yet another programmer who credits Ruby on Rails with finding joy and beauty in writing web applications because of what I started over 22 years ago. It may sound trite, but it's true: It's an honor and a privilege. I hope to carry this meaningful burden for as long as my intellectual legs still let me stand.

See you next year in Austin? I hope so!
I hadn’t lost my virginity yet. And it wasn’t for lack of trying; it seemed like the rest of my generation was no longer interested in sex. On some level, I understood where they were coming from, the whole act did seem kind of pointless. But after a few beers, that wasn’t how my mind was working. I turned 19 last week. Dad flew in from Idaho, and it was the first time he was in the house I shared with my mother. He left when I was 12, and it was always apparent that parenting wasn’t the top thing on his mind. There was some meeting on Long Island. That’s probably why he was there, in addition to the fact he knew mom wouldn’t make him sleep on the couch. He had many reasons to be in New York that weren’t me. My birthday was just a flimsy pretense. He’d worked on Wall Street the whole time he was around, a quant. He wrote programs that made other people rich. But something happened to him right before he left. A crisis of conscience perhaps; he was spiraling for weeks, cursing the capitalist system, calling my mother a gold-digging whore (which was mostly true), and saying things needed to change. Then he packed a single backpack and left for Idaho. I visited him out there once my sophomore year. He had a camouflaged one room cabin in the middle of a spruce forest, but instead of the hunting or fishing stuff you might expect, the walls were adorned with electrical test equipment and various things that looked like they were out of a biology or chemistry lab. I didn’t know much about this stuff and that wasn’t what he wanted to talk about anyway. He wanted to talk about “man shit” like nature and women and not being life’s bitch. I tried to act like I did, but I didn’t really listen. All I remember is how eerily quiet the night was, I could hear every animal movement outside. My dad said you get used to it. Brian was having a party tonight. Well okay, party is a lofty way to describe it. He’d replaced the fluorescent lights in his mom’s basement with blacklights, and we’d go over there to drink beer and smoke weed and sit around on our phones and scroll. And sometimes someone would laugh at something and share with the group. I had a case of Bud Light left over from the last party and drank two of them today. Hence the thinking about sex and not thinking that thinking about sex was stupid. People wouldn’t be going over there for a few more hours, so I laid in my bed, drank, and loosely beat off to YouTube. Celebrity gossip, internet gossip, speedrun videos, nothing even arousing. I liked the true crime videos about the hot female teachers who slept with their students. Yea yea yea terrible crime and they all act holier than thou about what if the genders were reversed, but the genders weren’t reversed. Maybe they just don’t want to get demonetized. There were never women at these parties. Okay maybe one or two. But nobody ever slept with them or much thought about them that way. They were the agendered mass like the rest of us. Fellow consumers, not providers. Fuck I should just go visit a hooker. I didn’t know much about that, were hookers real? I’d never met one, and there wasn’t a good way to find out about stuff like this anymore. The Internet was pretty much all “advertiser friendly” now, declawed, sanitized. Once the algorithms got good enough and it was technically easy to censor, there was nothing holding them back. It wasn’t actually censored, it would just redirect you elsewhere. And if you didn’t pay careful attention, you wouldn’t even notice it happening. 
I tried asking ChatGPT about hookers and it told me to call them sex workers. And this was kind of triggering. Who the fuck does this machine think it is? But then I was lost on this tangent, the algorithms got a rise out of me and I went back to comfort food YouTube. Look this guy beat Minecraft starting with only one block. The doorbell rang. This always gives me anxiety. And it was particularly anxiety inducing since I was the only one home. Normally I could just know that the door of my room was locked and someone else would get it and this would be a downstairs issue. But it was just me at home. My heart rate jumped. I waited for it to ring again, but prayed that it wouldn’t. Please just go away. But sure enough, it rang again. I went to my window, my room was on the second floor. There was a black Escalade in the driveway that I hadn’t seen before, and I could see two men at the door. They were wearing suits. I ducked as to make sure they wouldn’t look up at me, making as little noise as possible. Peering over the window sill I could see one opening the screen door, and it looked like he stuck something to the main door. My heart was beating even faster now. It was Saturday night, why were there two men in suits? And why were they here? It felt longer, but 3 minutes later they drove off. I waited another 3 for good measure, just watching the clock on my computer until it hit 6:57. I doubled checked out the window to make sure they were actually gone, and crept down the stairs to retrieve whatever they left on the door. It was a business card, belonging to a “Detective James Reese” of the Nassau County Police. And on the back of the card, there was handwriting. “John – call me” John was my name.