I like to try different apps. What makes trying different apps incredible is a layer of interoperability — standardized protocols, data formats, etc. When I can bring my data from one app to another, that’s cool. Cool apps are interoperable. They work with my data, rather than own it. For example, the other day I was itching to try a new RSS reader. I’ve used Reeder (Classic) for ages. But every once in a while I like to try something different. This is super easy because lots of clients support syncing to Feedbin. It’s worth pointing out: Feedbin has their own app. But they don’t force you to use it. You’re free to use any RSS client you want that supports their service. So all I have to do is download a new RSS client, log in to Feedbin, and boom! An experience of my data in a totally different app from a totally different developer. That’s amazing! And you know how long it took? Seconds. No data export. No account migration. Doing stuff with my blog is similar. If I want to try a...
2 days ago

More from Jim Nielsen’s Blog

Ductility in Software

I learned a new word: ductile. Do you know it? I’m particularly interested in its usage in a physics/engineering setting when talking about materials. Here’s an answer on Quora to: “What is ductile?”

Ductility is the ability of a material to be permanently deformed without cracking. In engineering we talk about elastic deformation as deformation which is reversed once the load is removed for example a spring, conversely plastic deformation isn’t reversed. Ductility is the amount (usually expressed as a ratio) of plastic deformation that a material can undergo before it cracks or tears.

I read that and started thinking about the “ductility” of languages like HTML, CSS, and JS. Specifically: how much deformation can they undergo before breaking?

HTML, for example, is famously forgiving. It can be stretched, drawn out, or deformed in a variety of ways without breaking. Take this short snippet of HTML:

<!doctype html>
<title>My site</title>
<p>Hello world!
<p>Nice to meet you

That is valid HTML. But it can also be “drawn out” for readability without losing any of its meaning. It’ll still render the same in the browser:

<!doctype html>
<html>
  <head>
    <title>My site</title>
  </head>
  <body>
    <p>Hello world!</p>
    <p>Nice to meet you.</p>
  </body>
</html>

This capacity for the language to undergo a change in form without breaking is its “ductility”. HTML has some pull before it breaks.

JS, on the other hand, doesn’t have the same kind of ductility. Forget a quotation mark and boom! Stretch it a little and it breaks.

console.log('works!');
// -> works!

console.log('works!);
// Uncaught SyntaxError: Invalid or unexpected token

I suppose some would say “this isn’t ductility, this is merely forgiving error-parsing”. Ok, sure. Nevertheless, I’m writing here because I learned this new word that has very practical meaning in another discipline to talk about the ability of materials to be stretched and deformed without breaking. I think we need more of that in software. More resiliency. More malleability. More ductility — prioritized in our materials (tools, languages, paradigms) so we can talk more about avoiding sudden failure.
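As an aside (not part of the original post): a quick way to check the claim that both HTML snippets render the same is to parse them with the browser’s DOMParser and compare the resulting DOM. A minimal sketch, with variable names of my own choosing:

// Paste into a browser console. Both snippets come from the post above.
const terse = '<!doctype html><title>My site</title><p>Hello world!<p>Nice to meet you';
const verbose = `<!doctype html>
<html>
  <head><title>My site</title></head>
  <body>
    <p>Hello world!</p>
    <p>Nice to meet you.</p>
  </body>
</html>`;

const parse = (html) => new DOMParser().parseFromString(html, "text/html");

// In both cases the parser builds the same structure: a <title> in
// <head> and two <p> elements in <body>.
console.log(parse(terse).body.children.length);   // 2
console.log(parse(verbose).body.children.length); // 2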

6 days ago 12 votes
Background Image Opacity in CSS

The other day I was working on something where I needed to use CSS to apply multiple background images to an element, e.g.

<div>
  My content with background images.
</div>
<style>
  div {
    background-image: url(image-one.jpg), url(image-two.jpg);
    background-position: top right, bottom left;
    /* etc. */
  }
</style>

As I was tweaking the appearance of these images, I found myself wanting to control the opacity of each one. A voice in my head from circa 2012 chimed in, “Um, remember Jim, there is no background-opacity rule. Can’t be done.” Then that voice started rattling off the alternatives:

- You’ll have to use opacity but that will apply to the entire element, which you have text in, so that won’t work.
- You’ll have to create a new empty element, apply the background images there, then use opacity.
- Or: You can use pseudo elements (:before & :after), apply the background images to those, then use opacity.

Then modern me interrupted this old guy. “I haven’t reached for background-opacity in a long time. Surely there’s a way to do this with more modern CSS?” So I started searching and found this StackOverflow answer which says you can use background-color in combination with background-blend-mode to achieve a similar effect, e.g.

div {
  /* Use some images */
  background-image: url(image-one.jpg), url(image-two.jpg);

  /* Turn down their 'opacity' by blending them into the background color */
  background-color: rgba(255,255,255,0.6);
  background-blend-mode: lighten;
}

Worked like a charm! It probably won’t work in every scenario like a dedicated background-image-opacity might, but for my particular use case at that moment in time it was perfect!

I love little moments like this where I reach to do something in CSS that was impossible back when I really cut my teeth on the language, and now there’s a one- or two-line modern solution!

[Sits back and gets existential for a moment.]

We all face moments like this where we have to balance leveraging hard-won expertise with seeking new knowledge and greater understanding, which requires giving up the lessons of previous experience in order to make room for incorporating new lessons of experiences. It’s hard to give up the old, but it’s the only way to make room for the new — death of the old is birth of the new.
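A side note that isn’t from the post: background-blend-mode is widely supported these days, but if you need to be defensive you can feature-detect it at runtime with CSS.supports and fall back to a plain overlay colour. A rough sketch; the .banner selector and the fallback colour are made up for illustration:

// If blending isn't supported, drop the images and use a solid
// overlay colour so any text on top stays readable. Purely illustrative.
if (!CSS.supports("background-blend-mode", "lighten")) {
  document.querySelectorAll(".banner").forEach((el) => {
    el.style.backgroundImage = "none";
    el.style.backgroundColor = "rgba(255, 255, 255, 0.9)";
  });
}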

a week ago 10 votes
Tag, You’re It

I saw these going around, but didn’t think I’d ever see myself get tagged — then Eric assuaged my FOMO. As I’ve done elsewhere talking about how I blog, I’m gonna try and impose a character limit on my answers (~240). I’m not sure if that makes my job as the writer easier or harder, but it should make your job as the reader easier.

Why did you start blogging in the first place?
I think I started because everything I learned about building on the web came from reading other people’s blogs online, so I wanted to be a “web person” like them.

What platform are you using to manage your blog and why did you choose it?
At the time of this writing (April 2025): I write in iA Writer. Code for my blog and notes is on GitHub. Deployment/hosting is via Netlify. I’ve arrived at this setup from a combination of choice and evolution. As me and my writing evolve, my process and tools evolve too.

Have you blogged on other platforms before?
Blogspot, way back in the day. It’s no longer up, which is probably for the best. I was posting stuff I made from following “make this in Photoshop” tutorials. Or I’d practice trying to visually express silly puns. Or I’d make visual mashups of culture at the time.

How do you write your posts? For example, in a local editing tool, or in a panel/dashboard that’s part of your blog?
For a detailed history of changes on how I blog, I blog about blogging under #myBlog and I blog about microblogging under #myNotes. Read any of those posts for insights into my ever-changing process.

When do you feel most inspired to write?
When I read other people’s thoughts.

Do you publish immediately after writing, or do you let it simmer a bit as a draft?
I’m a simmerer. Rarely does a post go from thought to published in one sitting. For example, here’s a screenshot of my current simmering drafts (note my sophisticated editorial process of assigning each draft a letter prefix for sorting based on my appetite for finishing it).

What are you generally interested in writing about?
Stuff I make. Or stuff others make. Or thoughts I think while reading thoughts others think. I have a tags page that tries to capture what I write categorically — for example, I blog notes from books I read and podcasts I listen to — but TBH it’s not the greatest taxonomy of my writing. Reductively: I blog about web design and development.

Who are you writing for?
Whoa, that question got me more introspective than I expected. Gonna move on before this becomes an existential crisis.

What’s your favorite post on your blog?
I used to highlight some of my favs on my home page, but I stopped. Choosing favorites is hard. My blog posts are like my kids: I love them all equally, lol. I suppose my favorite blog post is the one I’ll publish next.

Any future plans for your blog? Maybe a redesign, a move to another platform, or adding a new feature?
Will I redesign? Lol, the question is: when will you redesign?

Tag ’em
Sorry if I mention someone who’s already been tagged:

Piper Haywood — Love Piper’s mix of the personal and professional. Still have bookmarked to try grandma’s recipe.
Tyler Gaw — Have loved and respected this dude since I met him at my first “real” webdev job in NYC.
David Bushnell — Been enjoying David’s short- and long-form writing a lot as of late. Plus we feel the same about Deno & HTTP modules.
Katie Langerman — Ah gotcha, that’s not a blog link. It’s Bluesky. But I’ve followed Katie on the socials and always enjoy her perspective. Not sure she has a personal blog, so this is a vote of confidence in her starting one :)
Jan Miksovsky — Jan is doing really cool stuff with Web Origami (also just a super nice guy).

Sorry, I’m not gonna ping any of these folks. If they read my blog, they’ll see their names. Otherwise, dear reader, consider it a suggestion to go subscribe to their stuff.

a week ago 13 votes
Flow State and Surfing

Jack Johnson is on Rick Rubin’s podcast Tetragrammaton talking about music, film making, creativity, and surfing. At one point (~24:30) Johnson talks about his love for surfing and the beautiful flow state it puts him in:

Sometimes I’ll see a friend riding a wave while I’m paddling out, and the thing I’ll see them do just seems like magic...I’ll think, “How in the world did they just do that?” And then on your next ride you’re doing the exact same thing without thinking but it’s all muscle memory and it’s all in this flow that you get into. That’s a really beautiful state to get into, to do something that feels like a magic trick, like something you shouldn’t be able to do, but all of the sudden you’re doing it.

I’m not a surfer, and I can’t do effortlessly cool. But I know what a flow state feels like. Johnson’s description reminds me of that feeling when you get a little time on a personal project — riding the wave of working on your personal website.

You open your laptop. You start paddling out. Maybe you see an internet friend who was doing something cool and you want to try it but you have no idea if you’ll be able to do it as well as they did. And before you know it, you’re in that flow state where muscle memory takes over and you’re doing stuff without even consciously thinking about it — stuff that others might look at and perceive as magic (cough anything on the command line cough) but it’s not magic to you. Intuition and experience just take over while you ride the wave.

Ok, I’m a nerd. But I don’t care. It’s a great feeling, regardless of whether it’s playing an instrument, or surfing, or programming. That feeling of sinking into a craft you’ve worked at your whole life that you don’t have to think about anymore.

a week ago 16 votes

More in programming

Tech hiring: is this an inflection point?

We might be seeing the end of remote coding interviews as we know them, and a return of in-person interviews, trial days and longer trial periods. Could hiring be returning to pre-pandemic norms?

21 hours ago 4 votes
Localising the `<time>` with JavaScript

I’ve been writing some internal dashboards recently, and one hard part is displaying timestamps. Our server does everything in UTC, but the team is split across four different timezones, so the server timestamps aren’t always easy to read.

For most people, it’s harder to understand a UTC timestamp than a timestamp in your local timezone. Did that event happen just now, an hour ago, or much further back? Was that at the beginning of your working day? Or at the end?

Then I remembered that I tried to solve this five years ago at a previous job. I wrote a JavaScript snippet that converts UTC timestamps into human-friendly text. It displays times in your local time zone, and adds a short suffix if the time happened recently. For example:

today @ 12:00 BST (1 hour ago)

In my old project, I was writing timestamps in a <div> and I had to opt into the human-readable text for every date on the page. It worked, but it was a bit fiddly.

Doing it again, I thought of a more elegant solution. HTML has a <time> element for expressing datetimes, which is a more meaningful wrapper than a <div>. When I render the dashboard on the server, I don’t know the user’s timezone, so I include the UTC timestamp in the page like so:

<time datetime="2025-04-15 19:45:00Z">
  Tue, 15 Apr 2025 at 19:45 UTC
</time>

I put a machine-readable date and time string with a timezone offset string in the datetime attribute, and then a more human-readable string in the text of the element.

Then I add this JavaScript snippet to the page:

window.addEventListener("DOMContentLoaded", function() {
  document.querySelectorAll("time").forEach(function(timeElem) {
    // Set the `title` attribute to the original text, so a user
    // can hover over a timestamp to see the UTC time.
    timeElem.setAttribute("title", timeElem.innerText);

    // Replace the display text with a human-friendly date string
    // which is localised to the user's timezone.
    timeElem.innerText = getHumanFriendlyDateString(
      timeElem.getAttribute("datetime")
    );
  })
});

This updates any <time> element on the page to use a human-friendly date string, which is localised to the user’s timezone. For example, I’m in the UK so that becomes:

<time datetime="2025-04-15 19:45:00Z" title="Tue, 15 Apr 2025 at 19:45 UTC">
  Tue, 15 Apr 2025 at 20:45 BST
</time>

In my experience, these timestamps are easier and more intuitive for people to read. I always include a timezone string (e.g. BST, EST, PDT) so it’s obvious that I’m showing a localised timestamp.

If you really need the UTC timestamp, it’s in the title attribute, so you can see it by hovering over it. (Sorry, mouseless users, but I don’t think any of my team are browsing our dashboards from their phone or tablet.) If the JavaScript doesn’t load, you see the plain old UTC timestamp. It’s not ideal, but the page still loads and you can see all the information – this behaviour is an enhancement, not an essential.

To me, this is the unfulfilled promise of the <time> element. In my fantasy world, web page authors would write the time in a machine-readable format, and browsers would show it in a way that makes sense for the reader. They’d take into account their language, locale, and time zone. I understand why that hasn’t happened – it’s much easier said than done. You need so much context to know what’s the “right” thing to do when dealing with datetimes, and guessing without that context is at the heart of many datetime bugs. These sorts of human-friendly, localised timestamps are very handy sometimes, and a complete mess at other times.

In my staff-only dashboards, I have that context. I know what these timestamps mean, who’s going to be reading them, and I think they’re a helpful addition that makes the data easier to read.
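The post never shows getHumanFriendlyDateString itself. Below is a minimal sketch of what such a helper could look like, using Intl.DateTimeFormat and Intl.RelativeTimeFormat. It is an illustration of the idea, not the author’s actual implementation, and its output won’t exactly match the “today @ 12:00 BST (1 hour ago)” example above:

// Not the author's implementation: a minimal sketch of a helper with
// the same name and purpose, built on Intl. Assumes datetime strings
// like "2025-04-15 19:45:00Z" (as used in the post).
function getHumanFriendlyDateString(datetimeString) {
  const then = new Date(datetimeString.replace(" ", "T"));
  const now = new Date();

  // Localised absolute time in the reader's timezone,
  // e.g. "15 Apr 2025, 20:45 BST"
  const absolute = new Intl.DateTimeFormat(undefined, {
    day: "numeric",
    month: "short",
    year: "numeric",
    hour: "2-digit",
    minute: "2-digit",
    timeZoneName: "short",
  }).format(then);

  // Add a relative suffix for recent timestamps, e.g. "(1 hour ago)"
  const diffInHours = Math.round((then - now) / (1000 * 60 * 60));
  if (Math.abs(diffInHours) < 24) {
    const relative = new Intl.RelativeTimeFormat(undefined, {
      numeric: "auto",
    }).format(diffInHours, "hour");
    return `${absolute} (${relative})`;
  }

  return absolute;
}

In practice you’d probably also want to handle minutes and days, but for recent dashboard timestamps an hourly granularity gets the point across.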

16 hours ago 3 votes
Normal boyhood is ADHD

Nearly a quarter of seventeen-year-old boys in America have an ADHD diagnosis. That's crazy. But worse than the diagnosis is that the majority of them end up on amphetamines, like Adderall or Ritalin. These drugs allow especially teenage boys (diagnosed at 2-3x the rate of girls) to do what their mind would otherwise resist: study subjects they find boring for long stretches of time. Hurray?

Except, it doesn't even work. Because taking Adderall or Ritalin doesn't actually help you learn more, it merely makes trying tolerable. The kids might feel like the drugs are helping, but the test scores say they're not. It's Dunning-Kruger — the phenomenon where low-competence individuals overestimate their abilities — in a pill.

Furthermore, even this perceived improvement is short-term. The sudden "miraculous" ability to sit still and focus on boring school work wanes in less than a year on the drugs. In three years, pill poppers are doing no better than those who didn't take amphetamines at all.

These are all facts presented in a blockbuster story in The New York Times Magazine entitled Have We Been Thinking About A.D.H.D. All Wrong?, which unpacks all the latest research on ADHD. It's depressing reading. Not least because the definition of ADHD is so subjective and situational. The NYTM piece is full of anecdotes from kids with an ADHD diagnosis whose symptoms disappeared when they stopped pursuing a school path out of step with their temperament.

And just look at these ADHD markers from the DSM-5:

Inattention
- Difficulty staying focused on tasks or play.
- Frequently losing things needed for tasks (e.g., toys, school supplies).
- Easily distracted by unrelated stimuli.
- Forgetting daily activities or instructions.
- Trouble organizing tasks or completing schoolwork.
- Avoiding or disliking tasks requiring sustained mental effort.

Hyperactivity
- Fidgeting, squirming, or inability to stay seated.
- Running or climbing in inappropriate situations.
- Excessive talking or inability to play quietly.
- Acting as if "driven by a motor," always on the go.

Impulsivity
- Blurting out answers before questions are completed.
- Trouble waiting for their turn.
- Interrupting others' conversations or games.

The majority of these so-called symptoms are what I'd classify as "normal boyhood". I certainly could have checked off a bunch of them, and you only need six over six months for an official ADHD diagnosis. No wonder a quarter of those seventeen-year-old boys in America qualify!

Borrowing from Erich Fromm's The Sane Society, I think we're looking at a pathology of normalcy, where healthy boys are defined as those who can sit still, focus on studies, and suppress kinetic energy. Boys with low intensity and low energy. What a screwy ideal to chase for all.

This is all downstream from an obsession with getting as many kids through as much safety-obsessed schooling as possible. While the world still needs electricians, carpenters, welders, soldiers, and a million other occupations that exist outside the narrow educational ideal of today.

Now I'm sure there is a small number of really difficult cases where even the short-term break from severe symptoms that amphetamines can provide is welcome. The NYTM piece quotes the doctor who did one of the most consequential studies on ADHD as thinking that's around 3% — a world apart from the quarter of seventeen-year-olds discussed above.

But as ever, there is no free lunch in medicine. Long-term use of amphetamines acts as a growth inhibitor, resulting in kids up to an inch shorter than they otherwise would have been. On top of the awful downs that often follow amphetamine highs. And the loss of interest, humor, and spirit that frequently comes with the treatment too.

This is all eerily similar to what happened in America when a bad study from the 1990s convinced a generation of doctors that opioids actually weren't addictive. By the time they realized the damage, they'd set in motion an overdose and addiction cascade that's presently killing over 100,000 Americans a year. The book Empire of Pain chronicles that tragedy well.

Or how about the surge in puberty-blocker prescriptions, which has now been arrested in the UK, following the Cass Review, as well as in Finland, Norway, Sweden, France, and elsewhere. Doctors are supposed to first do no harm, but they're as liable to be swept up in bad paradigms, social contagions, and ideological echo chambers as the rest of us. And this insane over-diagnosis of ADHD fits that liability to a T.

yesterday 2 votes
Rediscovering the origins of my Lisp journey

My journey to Lisp began in the early 1990s. Over three decades later, a few days ago I rediscovered the first Lisp environment I ever used back then, which contributed to my love for the language. Here it is, PC Scheme running under DOSBox-X on my Linux PC:

[Screenshot: the PC Scheme Lisp development environment for MS-DOS by Texas Instruments, running under DOSBox-X on Linux Mint Cinnamon.]

Using PC Scheme again brought back lots of great memories and made me reflect on what the environment taught me about Lisp and Lisp tooling.

As a Computer Science student at the University of Milan, Italy, around 1990 I took an introductory computers and programming class taught by Prof. Stefano Cerri. The textbook was the first edition of Structure and Interpretation of Computer Programs (SICP), and Texas Instruments PC Scheme for MS-DOS was the recommended PC implementation. I installed PC Scheme under DR-DOS on a 20 MHz 386 Olidata laptop with 2 MB RAM and a 40 MB hard disk drive.

Prior to the class I had read about Lisp here and there but never played with the language. SICP and its use of Scheme as an elegant executable formalism instantly fascinated me. It was Lisp love at first sight. The class covered the first three chapters of the book, but I later read the rest on my own. I did lots of exercises, using PC Scheme to write and run them. Soon I became one with PC Scheme.

The environment enabled a tight development loop thanks to its Emacs-like EDWIN editor, which was well integrated with the system. The Lisp awareness of EDWIN blew my mind, as it was the first such tool I encountered. The editor auto-indented and reformatted code, matched parentheses, and supported evaluating expressions and code blocks. Typing a closing parenthesis made EDWIN blink the corresponding opening one and briefly show a snippet of the beginning of the matched expression. Paying attention to the matching and the snippets made me familiar with the shape and structure of Lisp code, giving me a visual feel for whether code looks syntactically right or off. Within hours of starting to use EDWIN the parentheses ceased to be a concern and disappeared from my conscious attention. Handling parentheses came naturally. I actually ended up loving parentheses and the aesthetics of classic Lisp.

Parenthesis matching suggested a technique to me for writing syntactically correct Lisp code with pen and paper. When writing a closing parenthesis with the right hand, I rested the left hand on the paper with the index finger pointed at the corresponding opening parenthesis, moving the hands in sync to match the current code. This way it was fast and easy to write moderately complex code.

PC Scheme spoiled me and set the baseline of what to expect in a Lisp environment. After the class I moved to PCS/Geneva, a more advanced PC Scheme fork developed at the University of Geneva. Over the following decades I encountered and learned Common Lisp, Emacs Lisp, and Interlisp. These experiences cemented my passion for Lisp.

In the mid-1990s Texas Instruments released the executable and sources of PC Scheme. I didn't know it at the time, or if I noticed I long forgot. Until a few days ago, when nostalgia came knocking and I rediscovered the PC Scheme release.

I installed PC Scheme under the DOSBox-X MS-DOS emulator on my Linux Mint Cinnamon PC. It runs well and I enjoy going through the system to rediscover what it can do. Playing with PC Scheme after decades of Lisp experience and hindsight on computing evolution shines new light on the environment.

I didn't fully realize it at the time, but the product packed amazing value for the price. It cost $99 in the US, and I paid about 150,000 lira for it in Italy. Costing as much as two or three textbooks, the software was affordable even to students and hobbyists. PC Scheme is a rich, fast, and surprisingly capable environment with features such as a Lisp-aware editor, a good compiler, a structure editor and other tools, many Scheme extensions such as engines and OOP, text windows, graphics, and a lot more. The product came with an extensive manual, a thick book in a massive 3-ring binder I read cover to cover more than once. A paper on the implementation of PC Scheme sheds light on how good the system is given the platform constraints.

Using PC Scheme now lets me put into focus what it taught me about Lisp and Lisp systems: the convenience and productivity of Lisp-aware editors; interactive development and exploratory programming; and a rich Lisp environment with a vast toolbox of libraries and facilities — this is your grandfather's batteries-included language.

Three decades after PC Scheme, a similar combination of features, facilities, and classic aesthetics drew me to Medley Interlisp, my current daily driver for Lisp development.

#Lisp #MSDOS #retrocomputing

yesterday 4 votes
Thoughts on releasing our first indie game

Two weeks ago we released Dice’n Goblins, our first game on Steam. This project allowed me to discover and learn a lot of new things about game development and the industry. I will use this blog post to write down what I consider to be the most important lessons from the months spent working on this.

The development started around 2 years ago when Daphnée started prototyping a dungeon crawler featuring a goblin protagonist. After a few iterations, the game combat started featuring dice, and then those dice could be used to make combos. In May 2024, the game was baptized Dice’n Goblins, and a Steam page was created featuring some early gameplay screenshots and footage. I joined the project full-time around this period. Almost one year later, after amassing more than 8000 wishlists, the game finally released on Steam on April 4th, 2025. It was received positively by the gaming press, with great reviews from PCGamer and LadiesGamers. It now sits at 92% positive reviews from players on Steam.

Building RPGs isn’t easy

As you can see from the above timeline, building this game took almost two years and two programmers. This is actually not that long if you consider that other indie RPGs have taken more than 6 years to come out. The main issue with the genre is that you need to create a believable world. In practice, this requires programming many different systems that will interact together to give the impression of a cohesive universe. Every time you add a new system, you need to think about how it will fit all the existing game features.

For example, players typically expect an RPG to have a shop system. Of course, this means designing a shop non-player character (or building) and creating a UI that is displayed when you interact with it. But this also means thinking through a lot of other systems: combat needs to be changed to reward the player with gold, every item needs a price tag, chests should sometimes reward the player with gold, etc…

Adding too many systems can quickly get into scope creep territory and make the development exponentially longer. But you can only get away with removing so much until your game stops being an RPG. Making a game without a shop might be acceptable, but the experience still needs to have more features than “walking around and fighting monsters” to feel complete.

RPGs are also, by definition, narrative experiences. While some games have managed to get away with procedurally generating 90% of the content, in general you’ll need to get your hands dirty, write a story, and design a bunch of maps. Creating enough content for a game to fill 12+ hours without having the player go through repetitive grind will by itself take a lot of time.

Having said all that, I definitely wouldn’t make any other kind of game than RPGs, because this is what I enjoy playing. I don’t think I would be able to nail what makes other genres fun if I don’t play them enough to understand what separates the good from the mediocre.

Marketing isn’t that complicated

Everyone in the game dev community knows that there are way too many games releasing on Steam. To stand out amongst the 50+ games coming out every day, it’s important not only to have a finished product but also to plan a marketing campaign well in advance. For most people coming from a software engineering background, like me, this can feel extremely daunting. Our education and jobs do not prepare us well for this kind of task. In practice, it’s not that complicated. If your brain is able to provision a Kubernetes cluster, then you are most definitely capable of running a marketing campaign. Like anything else, it’s a skill that you can learn over time by practicing it and iteratively improving your methods.

During the 8 months following the Steam page release, we tried basically everything you can think of as a way to promote the game. Every time something was having a positive impact, we would do it more, and we quickly stopped things with low impact.

The most important thing to keep in mind is your target audience. If you know who wants the game you’re making, it is very easy to find where they hang out and talk to them. This is however not an easy question to answer for every game. For a long while, we were not sure who would like Dice’n Goblins. Is it people who like Etrian Odyssey? Fans of Dicey Dungeons? Nostalgic players of Paper Mario? For us, the answer was mostly #1, with a bit of #3. Once we figured out what our target audience was, how to communicate with them, and, most importantly, had a game that was visually appealing enough, marketing became very straightforward. This is why we really struggled to get our first 1000 wishlists, but getting the last 5000 was actually not that complicated.

Publishers aren’t magic

At some point, balancing the workload of actually building the game and figuring out how to market it felt like too much for a two-person team. We therefore did what many indie studios do, and decided to work with a publisher. We worked with Rogue Duck Interactive, who previously published Dice & Fold, a fairly successful dice roguelike. Without getting too much into details, it didn’t work out as planned and we decided, by mutual agreement, to go back to self-publishing Dice’n Goblins.

The issue simply came from the audience question mentioned earlier. Even though Dice & Fold and Dice’n Goblins share some similarities, they target a different audience, which requires a completely different approach to marketing. The lesson learned is that when picking a publisher, the most important thing you can do is to check that their current game catalog really matches the idea you have of your own game. If you’re building a fast-paced FPS, a publisher that only has experience with cozy simulation games will not be able to help you efficiently. In our situation, a publisher with experience in roguelikes and casual strategy games wasn’t a good fit for an RPG.

In addition to that, I don’t think the idea of using a publisher to remove marketing toil and focus on making the game is that good an idea in the long term. While it definitely helps to remove the pressure of handling social media accounts and ad campaigns, new effort will be required in communicating and negotiating with the publishing team. In the end, the difference between the work saved and the work gained might not have been worth selling a chunk of your game.

Conclusion

After all this was said and done, one big question I haven’t answered is: would I do it again? The answer is definitely yes. Not only was building this game an extremely satisfying endeavor, but so much has been learned and built while doing it that it would be a shame not to go ahead and do a second one.

yesterday 4 votes