More from Christopher Butler
Our world treats information like it’s always good. More data, more content, more inputs – we want it all without thinking twice. To say that the last twenty-five years of culture have centered around info-maximalism wouldn’t be an exaggeration. I hope we’re coming to the end of that phase. More than ever before, it feels like we have to – that we just can’t go on like this. But the solution cannot come from within; it won’t be a better tool or even better information that gets us out of this mess. It will be us, feeling and acting differently.

Think about this comparison: information is to wisdom what pornography is to real intimacy. I’m not here to moralize, so I make the comparison to pornography with all the necessary trepidation. Without judgement, it’s my observation that pornography depicts physical connection while creating emotional distance. I think information is like that. There’s a difference between information and wisdom that hinges on volume. More information promises to show us more of reality, but too much of it can easily hide the truth. Information can be pornography – a simulation that, when consumed without limits, can weaken our ability to experience the real thing.

When we feel overwhelmed by information – anxious and unable to process what we’ve already taken in – we’re realizing that “more” doesn’t help us find truth. But because we have also established information as a fundamental good in our society, failure to keep up with it, make sense of it, and even profit from it feels like a personal moral failure.

There is only one way out of that. We don’t need another filter. We need a different emotional response to information. We should not only question why our accepted spectrum of emotional response to information – in the general sense – is mostly limited to the space between curiosity and desire, but actively develop a capacity for disgust when it becomes too much. And it has become too much.

Some people may say that we just need better information skills and tools, not less information. But this misses how fundamentally our minds need space and time to turn information into understanding. When every moment is filled with new inputs, we can’t fully absorb, process, and reflect upon what we’ve consumed. Reflection, not consumption, creates wisdom. Reflection requires quiet, isolation, and inactivity. Some people say that while technology has expanded over the last twenty-five years, culture hasn’t. If they needed a good defense for that idea, well, I think this is it: a world without idleness is truly a world without creativity.

I’m using loaded moral language here for a purpose – to illustrate an imbalance in our information-saturated culture. Idleness is a pejorative these days, though it needn’t be. We don’t refer to compulsive information consumption as gluttony, though we should. And if attention is our most precious resource – as an information-driven economy would imply – why isn’t its commercial exploitation condemned as avarice?

As I ask these questions I’m really looking for where individuals like you and me have leverage. If our attention is our currency, then leverage will come with the capacity to not pay it. To not look, to not listen, to not react, to not share. And as has always been true of us human beings, actions are feelings echoed outside the body. We must learn not just to withhold our attention but to feel disgust at ceaseless claims to it.
How elimination, curation, and optimization can help us see through the technological mirror.

Technology functions as both mirror and lens – reflecting our self-image while simultaneously shaping how we see everything else. This metaphor of recursion, while perhaps obvious once stated, is one that most people instinctively resist. Why this resistance? I think it is because the observation is not only about a kind of recursion, but is itself recursive. The contexts in which we discuss technology’s distorting effects tend to be highly technological – internet-based forums, messaging, social media, and the like. It’s difficult to clarify from within, isn’t it? When we try to analyze or critique a technology while using it to do so, it’s as if we’re critiquing the label from inside the bottle. And these days, the bottle is another apt metaphor; it often feels like technology is something we are trapped within.

And that’s just at the surface – the discussion layer. It goes much deeper. It’s astounding to confront the reality that nearly all the means by which we see and understand ourselves are technological. So much of modern culture is in its artifacts, and the rest couldn’t be described without them. There have been oral traditions, of course, but once we started making things, they grew scarce. For a human in the twenty-first century, self-awareness, cultural identification, and countless other aspects of existence are all, in some way or another, technological. It’s difficult to question the mirror’s image when we’ve never seen ourselves without it. The interfaces through which we perceive ourselves and interpret the world are so integrated into our experience that recognizing their presence, let alone their distorting effects, requires an almost impossible perspective shift.

Almost impossible. Because of course it can be done. In fact, I think it’s a matter of small steps evenly distributed throughout a normal lifestyle. It’s not a matter of secret initiation or withdrawing from society, though I think it can sometimes feel that way. How, then, can one step outside the mirror’s view? I’ve found three categories of action particularly helpful:

Elimination

One option we always have is to simply not use a thing. I often think about how fascinating it is that to not use a particular technology in our era seems radical – truly counter-cultural. The more drastic rejecting any given technology seems, the better an example it is of how dependent we have become upon it. Imagine how difficult a person’s life would be today if they were to entirely reject the internet. There’s no law in our country against opting out of the internet, but the countless day-to-day dependencies upon it nearly amount to a cumulative obligation to be connected to it. Nevertheless, a person could do it. Few would, but they could.

This kind of “brute force” response to technology has become a YouTube genre – the “I Went 30 Days Without ____” video is quite popular. And this is obviously because of how much effort it requires to eliminate even comparatively minor technologies from one’s life. Not the entire internet, but just social media, or just streaming services, or just a particular device or app. Elimination isn’t easy, but I’m a fan of it.

The Amish are often thought of as simply rejecting modernity, but that’s not an accurate description of what actually motivates their way of life.
Religion plays a foundational role, of course, but each Amish community works together to decide upon many aspects of how they live, including which technologies they adopt. Their guiding principle is whether a thing or practice strengthens their community. And their decision is a collective one. I find that inspiring. When I reject a technology, I do so because I either don’t feel I need it or because I feel that it doesn’t help me live the way I want to live. It’s not forever, and it isn’t with judgement for anyone else but me. These are probably my most radical eliminations: most social media (I still reluctantly have a LinkedIn profile), streaming services (except YouTube), all “smart home” devices of any kind, smartwatches, and, for the last decade and counting, laptops. Don’t @ me because you can’t ;)

Curation

What I have in mind here is curation of information, not of technologies. Since it is simply impossible to consume all information, we all curate in some way, whether we’re aware of it or not. For some, though, this might actually be a matter of what technologies they use – for example, if a person only uses Netflix, then they only see what Netflix shows them. That’s curation, but Netflix is doing the work. However, I think it’s a good exercise to do a bit more curation of one’s own. I believe that if curation is going to be beneficial, it must involve being intentional about one’s entire media diet – what information we consume, from which sources, how frequently, and why. This last part requires the additional work of discerning what motivates and funds various information sources. Few, if any, are truly neutral.

The reality is that as information grows in volume, the challenge of creating useful filters for it increases to near impossibility. Information environments that operate on algorithms filter information for you based upon all kinds of factors, some of which align with your preferences and many of which don’t. There are many ways to avoid this; they are all more inconvenient than a social media news feed, and it is imperative that more people make the effort to do them. They range from subscribing to carefully chosen sources to using specialized apps, feed readers, ad- and tracker-blocking browsers, and VPNs to control how information gets to you (a rough sketch of the feed-reader approach follows at the end of this section). I recommend all of that and a constant vigilance because, sadly, there is no filter that will only show you the true stuff.

Optimization

Finally, there’s optimization – the fine-tuning you can do to nearly anything and everything you use. I’ve become increasingly active in seeking out and adjusting even the most detailed application and device settings, shaping my experiences to be quieter, more limited, and aligned with my intentions rather than the manufacturers’ defaults. I recently spent thirty minutes redesigning nearly my entire experience in Slack in ways I hadn’t been aware were even possible. It’s made a world of difference to me. Just the other day, I found a video with several recommendations for altering default settings in macOS that have completely solved daily annoyances I had just tolerated for years. I am always adjusting the way I organize files, the apps I use, and the way I use them, because I think optimization is always worthwhile. And if I can’t optimize it, I’m likely to eliminate it.

None of these approaches offers perfect protection from technological mediation, but together they create meaningful space for more direct control over your experience.
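To make the feed-reader idea a bit more concrete, here is a minimal sketch in Python using the feedparser library of what “subscribing to carefully chosen sources” can look like in practice. The URLs, the FEEDS list, and the latest() function are placeholders of my own for illustration, not a recommendation of any particular tool or source, and this is only one of many ways to do it.

```python
# A minimal personal feed reader: pull only from sources you have deliberately
# chosen, in the order you chose them, with no ranking or recommendations.
# Requires the third-party "feedparser" package (pip install feedparser).
import feedparser

# Placeholder URLs for illustration; replace with your own chosen sources.
FEEDS = [
    "https://example.com/essays/feed.xml",
    "https://example.org/news/rss",
]

def latest(feed_urls, per_feed=5):
    """Print the newest few entries from each chosen source and nothing else."""
    for url in feed_urls:
        parsed = feedparser.parse(url)
        title = parsed.feed.get("title", url)
        print(f"\n{title}")
        for entry in parsed.entries[:per_feed]:
            print(f"  - {entry.get('title', '(untitled)')}: {entry.get('link', '')}")

if __name__ == "__main__":
    latest(FEEDS)
```

The specific tool matters less than the shift it represents: the list of sources, and the decision to check it, stays with you rather than with a recommendation algorithm.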
But perhaps most important is creating physical spaces that remain relatively untouched by digital technology. I often think back to long trips I took before the era of ubiquitous computing and connection. During a journey from Providence to Malaysia in 2004, I stowed my laptop and cell phone knowing they’d be useless to me during 24 hours of transit. There was no in-cabin wifi, no easy way to have downloaded movies to my machine in advance, no place to even plug anything in. I spent most of that trip looking out the window, counting minutes, and simply thinking – a kind of unoccupied time that has become nearly extinct since then.

What makes technological discernment in the digital age particularly challenging is that we’re drowning in a pit of rope where the only escape is often another rope. Information technology is designed to be a nearly wraparound lens on reality; it often feels like the only way to keep using a thing is to use another thing that limits the first thing. People who know me well have probably heard me rant for years about phone cases – “why do I need a case for my case?!” These days, the sincere answer to many people’s app overwhelm is another app. It’s almost funny.

And yet, I do remain enthusiastic about technology’s creative potential. The ability to shape our world by making new things is an incredible gift. But we’ve gone overboard, creating new technologies simply because we can, without a coherent idea of how they’ll shape the world. This makes us bystanders to what Kevin Kelly describes as “what technology wants” – the agenda inherent in digital technology that makes it far from neutral.

What we ultimately seek isn’t escape from technology itself, but recovery of certain human experiences that technology tends to overwhelm: sustained attention, silence, direct observation, unstructured thought, and the sense of being fully present rather than partially elsewhere. The most valuable skill in our digital age isn’t technical proficiency but technological discernment – the wisdom to know when to engage, when to disconnect, and how to shape our tools to serve our deeper human needs rather than allowing ourselves to be shaped by them.

“It does us no good to make fantastic progress if we do not know how to live with it.” – Thomas Merton
There’s a psychological burden of digital life even heavier than distraction.

When the iPhone was first introduced in 2007, the notion of an “everything device” was universally celebrated. A single object that could serve as phone, camera, music player, web browser, and so much more promised unprecedented convenience and connectivity. It was, quite literally, the dream of the nineties. But the better part of twenty years later, we’ve gained enough perspective to recognize that this revolutionary vision came with costs we did not anticipate.

Distraction, of course, is the one we can all relate to first. An everything device has the problem of being useful nearly all the time, and, when in use, all-consuming. When you use it to do one thing, it pushes you toward others. In order to avoid this, you must disable functions. That’s an interesting turn of events, isn’t it? We have made a thing that does more than we need, more often than we desire. Because system-wide, duplicative notifications are enabled by default, the best thing you could say about the device’s design is that it lacks a point of view about how to prioritize what it does. The worst thing you could say is that it is distracting by design.

(I find it fascinating how many people – myself included – attempt to reduce the features of their smartphone to the point of replicating a “dumbphone” experience in order to save themselves from distraction, but don’t actually go so far as to use a lesser-featured phone because a few key features are just too good to give up. A dumbphone is less distracting, but it’s a nightmare for text messaging and has a lousy camera. It turns out I don’t want a phone at all, but a camera that texts – and ideally one smaller than anything on the market now. I know I’m not alone, and yet this product will not be made.)

This kind of distraction is direct distraction. It’s the kind we are increasingly aware of, and as its accumulating stress puts pressure on our inner and outer lives, we can combat it with various choices and optimizations. But there is another kind of distraction that is less direct, though just as cumulative and, I believe, just as toxic. I’ve come to think of it as the “digital echo.”

On a smartphone, every single thing it is used to do generates information that goes elsewhere. The vast majority of this is unseen – though not unfelt – by us. Everyone knows that there is no privacy within a digital device, nor within its “listening” range. We are all aware that as much information as a smartphone provides to us, exponentially more is generated for someone else – someone watching, listening, measuring, and monetizing. The “digital echo” is more than just the awareness of this; it is the cognitive burden of knowing that our actions generate data elsewhere. The echo exists whenever we use connected technology, creating a subtle but persistent awareness that what we do isn’t just our own.

A smartphone has always generated a “digital echo,” but many other devices now do as well. Comparing two different motor vehicles illustrates this well. In a car like a Tesla, which we might think of as a “smartcar” since it’s a computer you can drive, every function produces a digital signal. Adjusting the air conditioning, making a turn, opening a door – the car knows and records it all, transmitting this information to distant servers. By contrast, my 15-year-old Honda performs all of its functions without creating these digital echoes. The operations remain private, existing only in the moment they occur.
In our increasingly digital world, I have begun to feel the SCIF-like isolation of the cabin of my car, and I like it.

(The “smartcar,” of course, won’t remain simply a computer you can drive. The ultimate “smartcar” drives itself. The self-driving car represents perhaps the most acute expression of how digital culture values attention and convenience above all else, especially control and ownership. As a passenger of a self-driving car, you surrender control over the vehicle’s operation in exchange for the “freedom” to direct your attention elsewhere, most likely to some digital signal either on your own device or on screens within the vehicle. I can see the value in this; driving can be boring, and most times I am behind the wheel I’d rather be doing something else. But currently, truly autonomous vehicles are offered as services like Waymo, meaning we also relinquish ownership. The benefits of that also seem obvious: no insurance premiums, no maintenance costs. But not every advantage is worth its cost. The economics of self-driving cars are not clear-cut. There’s a real debate to be had about attention, convenience, and ownership that I hope will play out before we have no choice but to be a passenger in someone else’s machine.)

When I find myself looking for new ways to throttle my smartphone’s functions, or when I sit in the untapped isolation of my car, I often wonder about the costs of the “digital echo.” What is the psychological cost of knowing that your actions aren’t just your own, but create information that can be observed and analyzed by others? As more aspects of our lives generate digital echoes, they force an ambient awareness of being perpetually witnessed rather than simply existing. This transforms even solitary activities into implicit social interactions. It forces us to maintain awareness of our “observed self” alongside our “experiencing self,” creating a kind of persistent self-consciousness. We become performers in our own lives rather than merely participants.

I think this awareness contributes to a growing interest in returning to single-focus devices and analog technologies. Record players and film cameras aren’t experiencing a resurgence merely out of nostalgia, but because they offer fundamentally different relationships with media – relationships characterized by intention, presence, and focus.

In my own life, this recognition has led to deliberate choices about which technologies to embrace and which to avoid. Here are three off the top of my head:

- Replacing streaming services with owned media formats (CDs, Blu-rays) that remain accessible on my terms, not subject to platform changes or content disappearance
- Preferring printed books while using dedicated e-readers for digital texts – in this case, accepting certain digital echoes when the benefits (in particular, access to otherwise unavailable material) outweigh the costs
- Rejecting smart home devices entirely, recognizing that their convenience rarely justifies the added complexity and surveillance they introduce

You’ve probably made similarly motivated decisions, perhaps in other areas of your life or in relation to other things entirely. What matters, I think, is that these choices aren’t about rejecting technology but about creating spaces for more intentional engagement. They represent a search for balance in a world that increasingly defaults to maximum connectivity.
I had a conversation recently with a friend who mused, “What are these the early days of?” What a wonderful question that is; we are, I hope, always living in the early days of something. Perhaps now, we’re witnessing the beginning of a new phase in our relationship with technology. The initial wave of digital transformation prioritized connecting everything possible; the next wave may be more discriminating about what should be connected and what’s better left direct and immediate. I hope to see operating systems truly designed around focus rather than multitasking, interfaces that respect attention rather than constantly competing for it, and devices that serve discrete purposes exceptionally well instead of performing multiple functions adequately. The digital echoes of our actions will likely continue to multiply, but we can choose which echoes we’re willing to generate and which activities deserve to remain ephemeral — to exist only in the moment they occur and then in the memories of those present. What looks like revision or retreat may be the next wave of innovation, borne out of having learned the lessons of the last few decades and desiring better for the next.
Back in 2012, when my first (and only) book was published, a friend reacted by exclaiming, “You wrote a book?!?” and then added, “oh yeah… you don’t have kids.” I was put off by that statement. I played it cool, but my unspoken reaction was, “Since when does having kids or not determine one’s ability to write a book?” I was proud of my accomplishment, and his reaction seemed to communicate that anyone could do such a thing if they didn’t have other priorities.

Thirteen years and two children later, I’ve had plenty of opportunities to reflect upon that moment. I’ve come to a surprising conclusion: he was kind of right.

My first child was perhaps ten minutes old before I began learning that my time would never be spent or managed the same way again. I was in the delivery room holding her while my phone vibrated in my pocket because work emails were coming in. Normally, I’d have responded right away. Not anymore. The constraints of parenthood are real and immediate, and it takes some time to get used to the pinch. But they’re also transformative in unexpected ways. These days, my measure of how I spend my time comes down to a single idea: I will not make my children orphans to my ambition. If I prioritize anything over them, I require a very good reason which cannot benefit me alone.

Yet this transformation runs deeper than simply having less time day to day. Entering your forties has a profound effect on your perception of your entire lifespan. Suddenly, you find that memories that are actually decades old are of things you experienced as an adult. The combination of parenthood and midlife can create a powerful perspective shift that makes you more intentional about what truly matters. There are times when I feel that I am able to do less than I did in the past, but what I’ve come to realize is that I am actually doing more of the things that matter to me. A more acute focus on limited time results in using that time much more intentionally. I’m more productive today than I was in 2012, but it’s not because of time, it’s because of choices.

The constraints of parenthood haven’t just changed what I choose to do with my time, but what I create as well. Having less time to waste means I make a more critical judgment of whether something is working or worth pursuing much earlier in the process than I did before. In the past – if I’m dreadfully honest – I took pride in being the guy who started early and stayed late. Today, I take pride in producing the best thing I can. The less time that takes, the better.

But parenthood has also reminded me of the pleasures and benefits of creativity purely as a means of thinking aloud, learning, exploring, and play. There’s a beautiful tension in this evolution – becoming both more critically discerning and more playfully exploratory at the same time. My children have inadvertently become my teachers, reconnecting me with the foundational joy of making without judgment or expectation.

This integration of play and discernment has enriched my professional work. My creative output is far more diverse than it was before. The playful exploration I engage in with my children has opened new pathways in my professional thinking, allowing me to approach design problems from fresh perspectives. I’ve found that the best creative work feels effortless to viewers when the creation process itself was enjoyable.
This enjoyment manifests for creators as what psychologists call a “flow state” – that immersive experience where time seems to vanish and work feels natural and intuitive. The more I embrace playful exploration with ideas, techniques, and tools, the more easily I can access this flow state in my professional work.

My friend’s comment, while perhaps a bit lacking in tact, touched on a reality about the economics of attention and time. The book I wrote wasn’t just the product of writing skills – it was also the product of having the temporal and mental space to create it. (I’m not sure I’ll have that again, and if I do, I’m not sure a book is what I’ll choose to use it for.) What I didn’t understand then was that parenthood wouldn’t end my creative life, but transform it into something richer, more focused, and ultimately more meaningful. The constraints haven’t diminished my creativity but refined it.
More in design
AMA Design created the Häfele showroom in Bucharest, featuring meticulously designed spaces showcasing furniture components with special movable display...
I like the job title “Design Engineer”. When required to label myself, I feel partial to that term (I should, I’ve written about it enough). Lately I’ve felt like the term is becoming more mainstream which, don’t get me wrong, is a good thing. I appreciate the diversification of job titles, especially ones that look to stand in the middle of a binary. But – and I admit this is a me issue – once a title starts becoming mainstream, I want to use it less and less.

I was never totally sure why I felt this way. Shouldn’t I be happy a title I prefer is gaining acceptance and understanding? Do I just want to rebel against being labeled? Why do I feel this way? These were the thoughts simmering in the back of my head when I came across an interview with the comedian Brian Regan where he talks about his own penchant for not wanting to be easily defined:

I’ve tried over the years to write away from how people are starting to define me. As soon as I start feeling like people are saying “this is what you do” then I would be like “Alright, I don't want to be just that. I want to be more interesting. I want to have more perspectives.” [For example] I used to crouch around on stage all the time and people would go “Oh, he’s the guy who crouches around back and forth.” And I’m like, “I’ll show them, I will stand erect! Now what are you going to say?” And then they would go “You’re the guy who always feels stupid.” So I started [doing other things].

He continues, wondering aloud whether this aversion to being easily defined has actually hurt his career in terms of commercial growth:

I never wanted to be something you could easily define. I think, in some ways, that it’s held me back. I have a nice following, but I’m not huge. There are people who are huge, who are great, and deserve to be huge. I’ve never had that and sometimes I wonder, ”Well maybe it’s because I purposely don’t want to be a particular thing you can advertise or push.”

That struck a chord with me. It puts into words my current feelings towards the job title “Design Engineer” – or any job title for that matter. Seven or so years ago, I would’ve enthusiastically said, “I’m a Design Engineer!” To which many folks would’ve said, “What’s that?” But today I hesitate. If I say “I’m a Design Engineer” there are fewer follow-up questions. Nowadays that title elicits fewer questions and more (presumed) certainty. I think I enjoy a title that elicits a “What’s that?” response, which allows me to explain myself in more than two or three words, without being put in a box. But once a title becomes mainstream, once people begin to assume they know what it means, I don’t like it anymore (speaking for myself, personally). As Brian says, I like to be difficult to define. I want to have more perspectives. I like a title that befuddles, that doesn’t provide a presumed sense of certainty about who I am and what I do. And I get it: that runs counter to the very purpose of a job title, which is why I don’t think it’s good for your career to have the attitude I do, lol.

I think my own career evolution has gone something like what Brian describes:

Them: “Oh, you’re a Designer? So you make mock-ups in Photoshop and somebody else implements them.”
Me: “I’ll show them, I’ll implement them myself! Now what are you gonna do?”
Them: “Oh, so you’re a Design Engineer? You design and build user interfaces on the front-end.”
Me: “I’ll show them, I’ll write a Node server and set up a database that powers my designs and interactions on the front-end. Now what are they gonna do?”
Them: “Oh, well, I’m not sure we have a term for that yet. Maybe Full-stack Design Engineer?”
Me: “Oh yeah? I’ll frame up a user problem, interface with stakeholders, explore the solution space with static designs and prototypes, implement a high-fidelity solution, and then be involved in testing, measuring, and refining said solution. What are you gonna call that?”

[As you can see, I have some personal issues I need to work through…]

As Brian says, I want to be more interesting. I want to have more perspectives. I want to be something that’s not so easily definable, something you can’t sum up in two or three words. I’ve felt this tension my whole career making stuff for the web. I think it has led me to work on smaller teams where boundaries are much more permeable and crossing them is encouraged rather than discouraged.

All that said, I get it. I get why titles are useful in certain contexts (corporate hierarchies, recruiting, etc.) where you’re trying to take something as complicated and nuanced as an individual human being and reduce them to a label that can be categorized in a database. I find myself avoiding those contexts where so much emphasis is placed on the usefulness of those labels. “I’ve never wanted to be something you could easily define” stands at odds with the corporate attitude of, “Here’s the job req. for the role (i.e. cog) we’re looking for.”

Email · Mastodon · Bluesky
The Department of Bed Intentions is the world’s first fully sustainable microbiome-friendly prebiotic personal lubricant. The brand shakes up the...
Excuse my rant. The Nobel Prize-winning CEO of DeepMind, Demis Hassabis, was on 60 Minutes and floored me when he predicted:

We can cure all diseases with the help of AI. [The end of disease] is within reach, maybe within the next decade or so. I don't see why not.

“I don’t see why not” is doing a lot of work in that sentence. As I’m sure you know from working on problems, “I don’t see why not” moments are usually followed by, “Actually this is going to be a bit harder than we thought…” If you want to call me a skeptic, that’s fine. But “the end of disease” in the next decade is some ostentatious claim chowder IMHO. As one of the YouTube comments says:

The goodies are always just another 5-10 years ahead, aren't they

Generally speaking, I tend to regard us humans as incredibly short-sighted. So if I had to place a wager, I’d put my money on the end of disease not happening in the next decade (against my wishes, of course). But that’s not really how AI predictions work. You can’t put wagers on them, because AI predictions aren’t things you get held accountable for. “Yeah, when I said that, I added ‘I don’t see why not’ but we quickly realized that X was going to be an issue and now I’m going to have to qualify that prediction. Once we solve X, I don’t see why not.” And then “once we solve Y”. And then Z. “Ok, phew, we solved Z, we’re close.” And then AA. And AB. And AC. And…

I get it, it’s easy to sit here and play the critic. I’m not the “man in the arena”. I’m not a Nobel Prize winner. I just want to bookmark this prediction for an accountability follow-up in April 2035. If I’m wrong, HOORAY! DISEASE IS ENDED!!! I WILL GLADLY EAT MY HAT! But if not, does anyone’s credibility take a hit? You can’t just say stuff that’s not true and continue having credibility. Unless you’re AI, of course.

Email · Mastodon · Bluesky
Weekly curated resources for designers — thinkers and makers.