More from Posts on Made of Bugs
About a month ago, the CPython project merged a new implementation strategy for their bytecode interpreter. The initial headline results were very impressive, showing a 10-15% performance improvement on average across a wide range of benchmarks on a variety of platforms. Unfortunately, as I will document in this post, these impressive performance gains turned out to be primarily due to inadvertently working around a regression in LLVM 19. When benchmarked against a better baseline (such as GCC, clang-18, or LLVM 19 with certain tuning flags), the performance gain drops to 1-5% or so depending on the exact setup.
Earlier this month, I used Claude to port (parts of) an Emacs package into Rust, shrinking the execution time by a factor of 1000 or more (in one concrete case: from 90s to about 15ms). This is a variety of yak-shave that I do somewhat routinely, both professionally and in service of my personal computing environment. However, this time, Claude was able to execute substantially the entire project under my supervision, with me writing almost no code myself, making the project far faster than doing it by hand.
Suppose we have a large collection of documents, and we wish to identify which documents are approximately the same as each other. For instance, we may have crawled the web over some period of time, and expect to have fetched the “same page” several times, but to see slight differences in metadata, or that we have several revisions of a page following small edits. In this post I want to explore the method of approximate deduplication via Jaccard similarity and the MinHash approximation trick.
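A minimal sketch of the MinHash idea described above (the function names, shingle size, and hash construction here are my own illustration, not code from the post): each document is reduced to a set of character shingles, and for each of several seeded hash functions we keep only the minimum hash value over that set. The fraction of positions where two signatures agree is an unbiased estimate of the Jaccard similarity of the underlying shingle sets.

```python
import hashlib


def shingles(text, k=5):
    # The set of character k-shingles (overlapping substrings) of a document.
    return {text[i:i + k] for i in range(len(text) - k + 1)}


def minhash_signature(doc, num_hashes=64):
    # One seeded hash function per signature slot; each slot records the
    # minimum hash value seen over the document's shingles.
    sig = []
    for seed in range(num_hashes):
        salt = seed.to_bytes(16, "little")  # blake2b accepts a 16-byte salt
        sig.append(min(
            int.from_bytes(
                hashlib.blake2b(s.encode(), digest_size=8, salt=salt).digest(),
                "big",
            )
            for s in shingles(doc)
        ))
    return sig


def estimate_jaccard(sig_a, sig_b):
    # The probability that a single min-hash slot matches equals the Jaccard
    # similarity, so the fraction of matching slots estimates it.
    matches = sum(a == b for a, b in zip(sig_a, sig_b))
    return matches / len(sig_a)
```

With 64 hash functions the estimate has a standard error of roughly `sqrt(J(1-J)/64)`, so near-duplicates (high J) are separated cleanly from unrelated documents; real systems typically add locality-sensitive hashing on top so candidate pairs can be found without comparing every pair of signatures.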
I worked at Stripe for about seven years, from 2012 to 2019. Over that time, I used and contributed to many generations of Stripe’s developer environment – the tools that engineers used daily to write and test code. I think Stripe did a pretty good job designing and building that developer experience, and since leaving, I’ve found myself repeatedly describing features of that environment to friends and colleagues. This post is an attempt to record the salient features of that environment as I remember it.
I was recently introduced to the paper “Seeing the Invisible: Perceptual-Cognitive Aspects of Expertise” by Gary Klein and Robert Hoffman. It’s excellent and I recommend you read it when you have a chance. Klein and Hoffman discuss the ability of experts to “see what is not there”: in addition to observing data and cues that are present in the environment, experts perceive implications of these cues, such as the absence of expected or “typical” information, the typicality or atypicality of observed data, and likely/possible past and future time trajectories of a system based on a point-in-time snapshot or limited duration of observation.
More in technology
A conventional model rocket engine is simply a combustible solid fuel (black powder or more advanced composites) molded into a cylinder, which uses expanding gas to produce thrust. Though it is minimal, there is some danger there. An alternative is compressed gas, which will also expand to produce thrust — just without the explosive chemical reaction. […] The post A homemade launchpad for compressed air-powered rockets appeared first on Arduino Blog.
Last week I wrote two main posts in reaction to the Nintendo Switch 2 announcement, and today we have new info on some of the open questions that lingered waaaay back then. Game-key cards I saw a remarkable number of arguments against this new "physical" purchase option based
Plus the terrifying story of what actually happened to Boeing's Starliner capsule, and a terrible Star Trek opinion.
Oh boy, Meta released their latest llama models this weekend (I guess because they leaked, hence the rushed weekend release?), and it benchmarked quite well on LMArena. However, as Kyle Wiggers reported for TechCrunch, not all may have been on the up and up: Meta’s Benchmarks for Its
Around 2003, my mother had a laptop: the Compaq Armada 1592DT. It ran Windows Me, the worst Windows to ever exist, with a whopping 96 MB of RAM and a 3 GB hard drive. My mother used it for important stuff, and I played games on it. Given the limitations of the 3 GB hard drive, this soon led to a conflict: there was no room to store any new games! I did my best to make additional room by running the disk cleaner utility, disabling unnecessary Windows features and deleting some PDF catalogues that my mother had downloaded, but there was still a constant lack of space.

Armed with a lack of knowledge about computers, I went further and found a tool that promised to make more room on the hard drive. I can’t remember what it was, but it had a nice graphical user interface where the space on the drive was represented as a pie chart. To my amazement, I could slide that pie chart to make it so that 90% of the drive was free space! I went full speed ahead with it. What followed was a crash, and upon rebooting I was presented with a black screen. Oops.

My mother ended up taking it to a repair shop for 1200 EEK, which was a lot of money at the time. The repair shop ended up installing Windows 98 SE on it, which felt like a downgrade at the time, but in retrospect it was an improvement over Windows Me. I had no idea what I was doing at the time, but I assume that the tool I was playing with was some sort of a partition manager that had no safeguards in place to avoid shrinking and reformatting operating system partitions. Or if it did, then it made ignoring the big warning signs way too easy. Still 100% user error on my part. If only I had known then that reinstalling Windows was a relatively simple operation — but it took a solid 4-5 years until I did my first installation of Windows all by myself.