More from Jim Nielsen’s Blog
Adam Silver has an article titled "Do you trust design advice from ChatGPT?" wherein he prompted the LLM: "How do you add hint text to radio buttons?" It gave various suggestions, each of which Adam breaks down. Here's an example response from ChatGPT:

If you want the hint to appear when the user hovers on the radio button, use a tooltip for a cleaner design

Adam's response:

'If you want' — Design is not about what you want. It's about what users need.

'use a tooltip' — If a hint is useful, why hide it behind a difficult-to-use and inaccessible interaction?

'for a cleaner design' — Design is about clarity, not cleanliness.

Adam's point-by-point breakdowns are excellent. The entire article is a great example of how plausible-sounding ideas can quickly fall apart under scrutiny from an expert who reframes the issue.

It's funny how prevalent this dynamic is in our age of fast-paced information overload. You read an argument and it seems rational — that is, if you don't think about it too long (and who has the time?). But an expert with deep experience can quickly refute these mediocre rationales and offer a more informed perspective that leaves you wondering how you ever nodded along to the original argument in the first place.

Humorously, it reminds me of the culture of conspiracy theories, where the burden of proof is on you to disprove the bare assertions being made (a time-consuming job). Hence the value of experience (and what's experience but an investment of time?) to pierce through these kinds of middle-of-the-road rationales. Experience helps clarify and articulate what lesser experience cannot see, let alone articulate.

That all leads me back to Adam:

ChatGPT pulls unreliable, uninformed and untrustworthy design advice from the internet and delivers it with confidence. I mean you can certainly listen to its advice. But I think it's better to develop the instinct to ask the right questions and be able to recognise bad advice when you see it.

There's no shortcut to gaining experience. You can't consume enough content to get it. You have to do.
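To make the contrast with the tooltip suggestion concrete, here's a minimal sketch of the kind of visible-hint pattern accessibility practitioners generally recommend instead: the hint stays on screen for everyone and is programmatically associated with the radio group. The markup is illustrative (loosely in the spirit of the GOV.UK hint-text pattern), not necessarily Adam's exact recommendation:

    <!-- Illustrative only: the hint is visible to everyone and
         associated with the whole radio group. -->
    <fieldset aria-describedby="contact-hint">
      <legend>How would you like to be contacted?</legend>
      <p id="contact-hint">We only use this to confirm your order.</p>
      <label><input type="radio" name="contact" value="email"> Email</label>
      <label><input type="radio" name="contact" value="phone"> Phone</label>
    </fieldset>

Screen reader support for aria-describedby on a fieldset varies, which is one more reason to keep the hint visible in the normal flow rather than hiding it behind a hover.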
I recently finished Carlo Rovelli's book "The Order of Time" and, of course, had a few web-adjacent thoughts come to mind. Who says lessons from physics can't be applied to making software? (I know, nobody is actually dying on that hill.)

A Weakness of Being Data-Driven

Being data-driven is the most scientific way of building products, right? Hold that thought:

The ability to understand something before it's observed is at the heart of scientific thinking.

If you can only imagine that which you can observe, understand, and measure, you're limiting yourself. If you can only believe that which you can observe, then you'll only ever understand that which you can see.

Abstract thought can anticipate by centuries hypotheses that find use — or confirmation — in scientific inquiry.

Beware the Prejudice of the Self-Evident

The things that seemed self-evident to us were really no more than prejudices.

The earth is flat. The sun revolves around the earth. These were mistakes determined by our perspective. There are undoubtedly more things that seem self-evident now, but as we progress in experience and knowledge we will realize that what seems self-evident is merely a prejudice of our perspective, given our time and place in the world. There's always room to be wrong.

Children grow up and discover that the world is not as it seemed from within the four walls of their homes. Humankind as a whole does the same thing.

Asking the Wrong Questions

When we cannot formulate a problem with precision, it is often not because the problem is profound; it's because the problem is false.

Incredibly relevant to building software. If you can't clearly explain a problem (and your intended solution), it's probably not a real problem.

Objectivity Is Overrated

When we do science, we want to describe the world in the most objective way possible. We try to eliminate distortions and optical illusions deriving from our point of view. Science aspires to objectivity, to a shared point of view about which it is possible to be in agreement. This is admirable, but we need to be wary about what we lose by ignoring the point of view from which we do the observing. In its anxious pursuit of objectivity, science must not forget that our experience of the world comes from within. Every glance that we cast toward the world is made from a particular perspective.

I love this idea. Constantly striving for complete and total objectivity is like trying to erase yourself from existence. As Einstein showed, point of view is everything in a measurement. Your frame of reference is important because it's yours, however subjective, and you cannot escape it. What we call "objectivity" may merely be the interplay between different subjective perspectives. As Matisse said, "I don't paint things. I paint the relationship between things."
Web developers have been waiting years for traction in styling HTML form controls. Is it possible the day has come? Here's Jen Simmons on Mastodon:

My team is working on a solution — you'll apply appearance: base and switch to new interoperable, consistent controls with easy-to-override default CSS. They inherit much more of what you already have going on. And then you can override (even more of) those styles using new pseudo-elements.

If you want the details, check out the working draft. It's pretty cool what they've come up with, especially in the face of what is undoubtedly a Herculean task: balancing developer desire against user preference while preserving accessibility standards. I applaud all involved 👏

That said, I have thoughts. Not new ones. I've voiced them before. And I'll do it again.

As developers, we've long been clamoring for this functionality: "We want to override defaults, give us more control!" But I wish there were equal voice for: "We want better defaults, not more control!"

More control means you have to do more work. I don't want to do more work, especially for basic computing controls. There are too many edge cases to think about across the plethora of devices, etc., that exist on the world wide web — it's overwhelming if you stop to think about them all, let alone write them down. I want to respect user choice (which includes respecting what hardware and OS they've chosen to use) and build web user interfaces on top of stable OS primitives. Give me better APIs for leveraging OS primitives rather than APIs to opt out of them completely.

That's me, the developer, talking. But there's a user-centric point to be made here too: when you re-invent the look, appearance, and functionality of basic form inputs for every website you're in charge of, every user is forced to encounter inconsistent form controls across the plethora of websites they visit.

I'm not saying don't do this. The web is a big place. There's undoubtedly a need for it. But not all websites need it, and I'm afraid it'll become the default posture for handling form controls. I don't need different radio controls for every healthcare form, shopping cart, and bank account website I use. As a user, I'd prefer a familiar, consistent experience based on the technology choices (hardware, OS, etc.) I've made. As a developer, I don't want to constantly "re-invent the wheel" of basic form controls.

Sure, sometimes I may need the ability to opt out of browser defaults. But increasingly I want instead to opt in to better browser (and OS) defaults. Fewer UI primitive resets and more UI primitive customizations. I want to build on top of stable UI pace layers.
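For a concrete sense of the proposal, here's a rough sketch of what opting in might look like, based on my reading of the working draft. Both the base value and the pseudo-element name are proposals and may well change before anything ships:

    /* Sketch only: values and pseudo-element names come from a
       working draft and are subject to change. */
    input[type="checkbox"] {
      appearance: base;          /* opt in to the new stylable control */
      inline-size: 1.25em;
      block-size: 1.25em;
      border: 2px solid currentColor;
      border-radius: 4px;
    }

    /* The draft proposes pseudo-elements for a control's inner parts,
       e.g. the check glyph (name illustrative): */
    input[type="checkbox"]:checked::checkmark {
      color: white;
    }

The contrast with today's appearance: none approach is the point: instead of erasing the native control and rebuilding it from scratch, you keep interoperable defaults and override only what you need.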
Heydon Pickering has an intriguing video dealing with the question: "Why is everything binary?" The gist of the video, to me, distills to this insight:

The idea that [everything] belongs to one of two archetypes is seductive in its simplicity, so we base everything that we do and make on this false premise.

That rings true to me. I tend to believe binary thinking is so prevalent because it's the intellectual path of least resistance, and we humans love to be lazy. The fact is, as I'm sure any professional with any experience in any field will tell you, answers are always full of nuance and best explained with the statement "it depends". The answers we're all looking for are not found exclusively in one of two binary values, but in the contrast between them. In other words, when you test the accuracy of binary assertions, the truth loves to reveal itself somewhere in between.[1] For example: peak design or development is found in the intermingling of form and function. Not form instead of function, nor function instead of form.

Working on the web, we're faced with so many binary choices every day:

Do we need a designer or a developer?
Do we make a web site or a web app?
Should we build this on the client or the server?
Are we driven by data or intuition?
Does this work online or offline?

And answering these questions is not helped by the byproduct of binary thinking, which, as Heydon points out, results in intellectually and organizationally disparate structures like "Design" and "Development":

Design thinking, but not about how to do the thing you are thinking about. Development doing, but without thinking about why the hell anyone would do this in the first place.

It's a good reminder to be consistently on guard against our own binary thinking. And when we catch ourselves, to strive to look at the contrast between the two options for the answer we seek.

[1] There's a story that illustrates how you can reject binaries and invert the assumption that only two choices exist. It goes like this: A King told a condemned prisoner: "You may make one final statement. If it is true, you will be shot. If it is false, you will be hanged." The prisoner answered, "I will be hanged." This leaves the King unable to carry out either sentence: the prisoner manipulates the King's logic to make both options impossible and reveal a third possible outcome.
More in programming
My April Cools is out! Gaming Games for Non-Gamers is a 3,000-word essay on video games worth playing if you've never enjoyed a video game before. Patreon notes here. (April Cools is a project where we write genuine content on non-normal topics. You can see all the other April Cools posted so far here. There's still time to submit your own!)
Everyone wants the software they work on to produce quality products, but what does that mean? In addition, how do you know when you have it? This is the longest single blog post I have ever written. I spent four decades writing software used by people (most of the server […]
The Ware for March 2025 is shown below. I was just taking this thing apart to see what went wrong, and thought it had some merit as a name that ware. But perhaps more interestingly, I was also experimenting with my cross-polarized imaging setup. This is a technique a friend of mine told me about […]
Picasso got it right: Great artists steal. Even if he didn't actually say it, and we all just repeat the quote because Steve Jobs used it. Because it strikes at the heart of creativity: None of it happens in a vacuum. Everything is inspired by something. The best ideas, angles, techniques, and tones are stolen to build everything that comes after the original.

Furthermore, the way to learn originality is to set it aside while you learn to perfect a copy. You learn to draw by imitating the masters. I learned photography by attempting to recreate great compositions. I learned to program by aping the Ruby standard library. Stealing good ideas isn't a detour on the way to becoming a master — it's the straight route. And it's nothing to be ashamed of.

This, by the way, doesn't just apply to art but to the economy as well. Japan became an economic superpower in the 80s by first poorly copying Western electronics in the decades prior. China is now following exactly the same playbook to even greater effect. You start with a cheap copy, then you learn how to make a good copy, and then you don't need to copy at all.

AI has sped through the phase of cheap copies. It's now firmly established in the realm of good copies. You're a fool if you don't believe originality is a likely next step. In all likelihood, it's a matter of when, not if. (And we already have plenty of early indications that it's actually already here, on the edges.)

Now, whether that's good is a different question. Whether we want AI to become truly creative is a fair question — albeit a theoretical or, at best, moral one. Because it's going to happen if it can happen, and it almost certainly can (or even has).

Ironically, I think the peanut gallery disparaging recent advances — like the Ghibli fever — over minor details in the copying effort will only accelerate the quest toward true creativity. AI builders, like the Japanese and Chinese economies before them, will be eager to demonstrate that they can exceed the originals.

All that is to say that AI is in the "Good Copy" phase of its creative evolution. Expect "The Great Artist" to emerge at any moment.