Code forces humans to be precise. That’s good—computers need precision. But it also forces humans to think like machines. For decades we tried to fix this by making programming more human-friendly. Higher-level languages. Visual interfaces. Each step helped, but we were still translating human thoughts into computer instructions.

AI was supposed to change everything. Finally, plain English could be a programming language—one everyone already knows. No syntax. No rules. Just say what you want.

The first wave of AI coding tools squandered this opportunity. They make flashy demos but produce garbage software. People call them “great for prototyping,” which means “don’t use this for anything real.” Many blame the AI models, saying we just need them to get smarter. This is wrong. Yes, better AI will make better guesses about what you mean. But when you’re building serious software, you don’t want guesses—even smart ones. You want to know exactly what you’re building.

Current AI tools pretend writing software is like having a conversation. It’s not. It’s like writing laws. You’re using English, but you’re defining terms, establishing rules, and managing complex interactions between everything you’ve said. Try writing a tax code in chat messages. You can’t. Even simple tax codes are too complex to keep in your head. That’s why we use documents—they let us organize complexity, reference specific points, and track changes systematically. Chat reduces you to memory and hope.

This is the core problem. You can’t build real software without being precise about what you want. Every successful programming tool in history reflects this truth. AI briefly fooled us into thinking we could just chat our way to working software. We can’t. You don’t program by chatting. You program by writing documents.
When your intent is in a document instead of scattered across a chat log, English becomes a real programming language:

- You can see your whole system at once
- You can clarify and improve your intent
- You can track changes properly
- Teams can work on the system together
- Requirements become their own quality checks
- Changes start from clear specifications

The first company to get this will own the next phase of AI development tools. They’ll build tools for real software instead of toys. They’ll make everything available today look like primitive experiments.
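To make “requirements become their own quality checks” concrete, here is a minimal sketch of the idea—a requirements document whose entries double as executable checks. Every name and rule below is invented for illustration; it is not from any particular tool.

```python
# Hypothetical sketch: a requirements "document" where each numbered
# requirement pairs its plain-English statement with an executable check.
# All requirement IDs, wording, and rules here are invented.

REQUIREMENTS = {
    "R1": ("Usernames are 3-20 characters long",
           lambda u: 3 <= len(u) <= 20),
    "R2": ("Usernames contain only lowercase ASCII letters and digits",
           lambda u: u.isascii() and u.isalnum() and u == u.lower()),
}

def violations(username: str) -> list[str]:
    """Return the IDs of the requirements that the username fails."""
    return [rid for rid, (_, check) in REQUIREMENTS.items()
            if not check(username)]
```

Because the statement and the check live in the same entry, editing the document and editing the quality gate are the same act—which is the property the list above is pointing at.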
The Challenge

The computing power that runs the world is hidden away in data centers that few people get to see. While many data centers are lights-out operations most of the time, people are still needed to update them, keep them running, and prevent and resolve outages. Those people need to know where their critical assets are in the labyrinth that is their global data center network. They need to know when areas get too hot, or get so cold and humid that condensation becomes a worry.

In addition to data centers, large enterprises will also have smaller compute sites scattered across the nation or the world. Those sites are often physically unmanned, with poor visibility into the health of critical systems. Operators need to know when potential issues arise and how to prioritize them. I help solve both of those problems.

My Process

Every design challenge starts with research. I put together extensive design research presentations with photos and video from inside real, working data centers. These included profiles of specific data center operators, personas and archetypes extracted from them, and detailed notes on the pain points that customers face. Due to confidentiality concerns, heavily redacted and anonymized excerpts are available for eyes-only review upon request.

Once the context and specific challenges are understood, it’s time to start rapidly prototyping solutions. I like to use sketches to validate ideas quickly, without a lot of investment in the wrong direction. Once I’ve put ideas in front of customers and gotten enough feedback to be confident in a direction, I produce specs for engineers to build the real thing. This frequently involves extensive annotation. In many cases the sketch is sufficient, because the visual design of reusable elements has already been defined as part of a component library or the product design guidelines.
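The condensation worry above is, at bottom, a dew-point comparison. As a hypothetical sketch (not part of the case study’s actual monitoring logic), an alert might estimate dew point with the Magnus approximation and flag surfaces that get too close to it; the coefficient values are standard, but the function names and the 2 °C margin are invented.

```python
import math

# Illustrative only: flag condensation risk by comparing a surface (or
# supply-air) temperature to the dew point of the surrounding air.
# Magnus-approximation coefficients for temperatures in Celsius:
A, B = 17.62, 243.12

def dew_point_c(air_temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (Celsius) from air temperature and RH%."""
    gamma = A * air_temp_c / (B + air_temp_c) + math.log(rel_humidity_pct / 100.0)
    return B * gamma / (A - gamma)

def condensation_risk(surface_temp_c: float, air_temp_c: float,
                      rel_humidity_pct: float, margin_c: float = 2.0) -> bool:
    """True when the surface is within margin_c of the air's dew point."""
    return surface_temp_c <= dew_point_c(air_temp_c, rel_humidity_pct) + margin_c
```

For example, air at 30 °C and 80 % relative humidity has a dew point of roughly 26 °C, so a 14 °C chilled surface in that room would be flagged while the same surface in 25 °C, 40 % RH air would not.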
Of course, while sketches can convey functionality, if new elements are used for which I don’t already have a visual design specification, it’s important to provide fully realized mockups. Once the appropriate specifications are produced, I work extensively with software engineers. I write stories in JIRA, collaborate on Slack to find clever solutions to performance problems, and even contribute CSS here and there—whatever I can do to ensure that the finished product is as good as our intentions.
In 2018 I worked with argodesign on an artificial intelligence client project, and Fast Company published an article on our work: This Is The World’s First Graphical AI Interface. For confidentiality reasons I can’t publicly go into more detail on the project than to link to that article. For the full case study, please contact me at hello@danieldelaney.net. AIGA Event During my time at argodesign we held an event with AIGA, the professional association for design, during which the team explained our point of view on artificial intelligence, and the way we approached designing interfaces for new technologies.