A screen isn’t a technological distraction to overcome but a powerful cognitive prosthetic for external memory.

Screens get a lot of blame these days. They’re accused of destroying attention spans, ruining sleep, enabling addiction, isolating us from one another, and eroding our capacity for deep thought. “Screen time” has become shorthand for everything wrong with modern technology and its grip on our lives. As a result, those of us in design- and technology-focused spheres now face persistent propaganda that screens are an outmoded interaction device, holding us back from some sort of immersive techno-utopia. They are not, and that utopia is a fantasy.

The screen itself is obviously not to blame; what’s on the screen is. When we use “screen” as a catch-all for our digital dissatisfaction, we’re conflating the surface with what it displays. It’s like blaming paper for misleading news. We might dismiss this as mere semantics, but language creates understanding and behavior. The more we sum up the culture of what screens display with the word “screens,” the more we push ourselves toward the wrong solution. The most recent version of this is the idea of the “screenless interface” and the recurring nonsense of clickbait platitudes like “The best interface is no interface.”

What we mean when we talk about the “screen” matters. So it’s worth asking: what is a screen, really? And why can’t we seem to get “past” screens when it comes to human-computer interaction? For all our talk of ambient computing, voice interfaces, and immersive realities, screens remain central to our digital lives. Even as companies like Apple and Meta pour billions into developing headsets meant to replace screens, what do they actually deliver? Heavy headgear that just places smaller screens closer to our eyes. Sure, they can provide a persistent immersive experience that a stationary panel cannot.
But a headset’s persistent immersion doesn’t make a panel’s stationary nature a bug. What makes a screen especially useful is not what it projects at you, but what happens when you look away from it. It is then that a screen serves a fundamental cognitive purpose that dates back to the earliest human experiences and tools.

A screen is a memory surrogate: a surface that holds information so we don’t have to keep it all in our heads. In this way, it’s the direct descendant of some of humanity’s most transformative devices: the dirt patch where our ancestors scratched out the first symbols, the cave wall that preserved their visions, the clay tablet that tracked their trades, the papyrus that extended their memories, the parchment that connected them across distances, the chalkboard that multiplied their teaching.

Think of Einstein’s office at Princeton, with its blackboards covered in equations. Those boards weren’t distractions from his thought; they were extensions of it. They allowed him to externalize complex ideas, manipulate them visually, and free his mind from the burden, the impossibility, of holding every variable simultaneously.

Our digital screens serve the same purpose, albeit with far greater complexity and interactivity. They hold vast amounts of information that would overwhelm our working memory. They visualize data in ways our minds can grasp. They show us possibilities we couldn’t otherwise envision. And they hold it all in place for us, so that we can look away and then easily find it again when we return our gaze.

Comparing screens to Einstein’s chalkboards is, of course, a limited metaphor. Screens also display endless streams of addictive content designed to capture and hold our attention. But that’s not an inherent property of screens themselves; it’s a consequence of the business models driving what appears on them. The screen isn’t the attention thief; it’s merely the scene of the crime.
(And yes, I do think that future generations will regard today’s attention economy the same way we regard other past norms as injustices.)

The connection between screens and attention matters because our brains have evolved to prioritize visual processing. We can absorb and interpret visual information with remarkable efficiency; simply scanning a screen can convey more, faster, than listening to the same content read aloud. Visual processing also operates somewhat independently of our verbal reasoning, allowing us to think about what we’re seeing rather than spending that cognitive capacity on processing incoming language. We can scan at the speed of thought, but we can only listen at the speed of speech.

This is why efforts to create “screenless” interfaces often end up feeling limiting rather than liberating. Voice assistants work beautifully for discrete, simple tasks but become frustrating when dealing with complex information or multiple options. Information conveyed in sound has no place to be held; it can only be repeated. The screen persists because it matches fundamental aspects of human cognition: among other things, it offers us persistence, a place to hold information.

None of this is to dismiss legitimate concerns about how we currently use screens. The content displayed, the contexts of use, and the business models driving development all deserve critical examination. But blaming the screen itself misses the point, misdirects our efforts to build healthier relationships with technology, and wastes our time on ridiculous technological fetch-quests for the next big device. Perhaps instead of dreaming about moving “beyond screens,” we should focus on creating better screens and better screen experiences. “Better screens” is a problem of materials, longevity, energy consumption, light, and heat. There are so many things we could improve!
“Better screen experiences” is a matter of cultural evolution: a generational project we can undertake together, right now, by thinking about what kind of information is worth being held for us by screens, as opposed to what kind of information is capable of holding our gaze captive.

The screen isn’t the problem. It’s one of our most powerful cognitive prosthetics, a brain buffer. Our screens are, together, a platform for cultural creation, the latest in a long line of surfaces that have enriched human existence. De-screening is not just a bad idea that misunderstands how brains work, and not just an insincere sales pitch for a new gadget. It’s an entirely wrong turn toward a worse future with more of the same, only noisier.
Contributed by Florian Hardwig

Csillagok háborúja (Star Wars), 1979. The custom acute accents are simple squares. The secondary typeface is ITC Avant Garde Gothic. More info on StarWarsMoviePoster.com. Source: movieposters.ha.com. Image: Heritage Auctions. License: All Rights Reserved.

Tibor Helényi (1946–2014) was a Hungarian painter, graphic designer, and poster artist. Among his most famous works are the posters he created for the original Star Wars trilogy, commissioned by MOKÉP, Hungary’s state-owned film distributor. Today, the posters are sought-after collector’s items.

The typeface Helényi used for the titles is Langdon Biform. Characterized by triangular notches, the boxy design is by John Langdon (b. 1946). To most people, the graphic designer and retired typography professor is best known for his ambigrams, especially those he made for Dan Brown’s 2000 novel, Angels & Demons. Langdon Biform is an early work of his, drawn in 1971 when he was in his mid-twenties, years before he embarked on a career as a freelance logo designer, type specialist, and lettering artist. Langdon submitted the design to a competition organized by the Californian phototype company Lettergraphics, which added it to its library of typefaces. It didn’t take long before it was copied by other type providers. I’m aware of at least six digitizations, under various names including Lampoon, Harpoon ART, and Dominion, none of which were authorized by the original designer.

In a 2014 interview, Helényi was asked about a debate among fans who wondered whether he had even watched Star Wars before designing the poster; after all, his art includes creatures that don’t appear in the film. Helényi laughingly replied that he had indeed seen the film, and that he had a lot of fun designing the poster. In addition to his impressions from the advance screening, he also worked from lobby cards.
You can learn more about Helényi and see more of his work at his official website (maintained by his daughter Flora) and at Budapest Poster Gallery.

A Birodalom visszavág (The Empire Strikes Back), 1982. Subtitle and credits are added in Univers Bold. More info on StarWarsMoviePoster.com. Source: movieposters.ha.com. Image: Heritage Auctions. License: All Rights Reserved.

A Jedi visszatér (Return of the Jedi), 1984. The secondary typeface for this poster is Univers Extended. More info on StarWarsMoviePoster.com. Source: movieposters.ha.com. Image: Heritage Auctions. License: All Rights Reserved.

The original painted art created for the posters was sold in Budapest Poster Gallery’s Tibor Helényi Estate Auction in 2015, alongside many other items by the artist. Source: www.liveauctioneers.com. Image: Budapest Poster Gallery. License: All Rights Reserved.

Glyph set for Langdon Biform with its fifteen alternates, as shown in the “Do a Comp” fan by Lettergraphics International Inc., 1968–1975. Image: Stephen Coles. License: CC BY-NC-SA.

This post was originally published at Fonts In Use.