More from Founder's blog
TL;DR The "build vs. buy" equation has flipped. Businesses used to buy SaaS because it was cheaper than building their own. AI has changed that—building your own is now more affordable than ever. The discovery problem. AI recommendations default to well-established solutions. Think SEO is a long game? Try LLM SEO. Everyone worries about AI taking developer jobs, but what if AI wipes out the entire off-the-shelf software industry? The "Why Buy?" Problem Six months ago, we needed an AI-powered code review tool. We explored several options and ultimately "vibe-coded" our own GitHub Action—a simple Bash script that takes a git log, sends it to Claude via curl, and posts the results to Slack. Done. The best part? AI wrote the entire thing faster than it would take to sign up for a SaaS. How long until every company realizes they can do this? Need a simple "CRUD" CRM with JIRA-style tasks? Done. Need a mobile time-tracking app for remote employees? AI will spit out a React Native iOS build in minutes. Why pay for yet another SaaS when you can "vibe-code" something in a week? And mark my words, LLM providers are one step away from actually hosting the code they generate. Who needs to spawn an AWS server if you can just ask OpenAI to host the code it just wrote? - "Hey Siri! build me a Basecamp, but with green buttons, also register a domain, spawn a server and host it all there, charge this credit card when you're done" - "Absolutely, that'd be $1.17 per hour" The Discovery Problem AI doesn’t just make it easier to build software—it makes it harder for new SaaS products to get discovered. When you ask AI for recommendations, it defaults to the biggest names. And not just in SaaS, by the way, in open source too. Imagine launching a killer new JS framework today. AI coding assistants and tools like Cursor will just default to React anyway. And not even the latest version of it! In a recent tweet Adam Wathan, the creator of Tailwind, asked: "Has anyone migrated to Tailwind 4.0 yet?" The most popular response was "Nah! we're still waiting for LLMs to learn it." AI isn’t just "the next internet moment." It’s more like "the social network moment." Echo chambers get louder, big names get bigger, and smaller ones disappear into the noise. What Can SaaS Companies Do? 1. Become an Industry Standard Or at least a "go-to" product in a niche. If your app becomes something people mention on their CVs or job descriptions, you win. Examples: Slack. HubSpot. Salesforce etc. A salesperson moving to a new company simply expects Salesforce to be there. That kind of lock-in ensures survival. 2. Build Moats: Infrastructure & Vendor Lock-In SaaS products that are just CRUD apps will die. The ones that survive will own infrastructure or at least some part of it. Instead of building another AI voice assistant, create one with built-in VoIP and provide landline numbers to customers. Examples: Transistor.fm – Not just a SaaS, but also a podcast hosting and publishing pipeline. Postmark (or any transactional email service really) – yes, AI can code an email-sending app, but it can't get you a 10-year old high-reputation sender IP address trusted by Gmail and Outlook. SignWell, SavvyCal and similar "inter-business" file-sharing, communication & escrow apps that own the communication part (and frankly, are literally easier to use than vibe-code your own). But prepare for tthousands of clones. Which SaaS Will Die First? 
Which SaaS Will Die First?

Side-project-scale, "one simple tool" SaaS products that used to be easy wins—Calendly replacements, form builders, schedulers, basic dashboards, simple workflow apps—those days are over. If AI can generate it in an afternoon, no one is paying a subscription for it. Oh, and "no code" is toast too. The SaaS graveyard is about to get a lot more crowded. I give it four years. Software consulting is making a comeback, though. Someone has to clean up the vibe-coded chaos.
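P.S. For the curious, here's a minimal sketch of what that kind of code-review step can look like, assuming an Anthropic API key and a Slack incoming webhook are exposed as environment variables. The model name, prompt, and variable names are placeholders for illustration, not the exact Action described above.

```bash
#!/usr/bin/env bash
# Sketch of a git log -> Claude -> Slack review step. ANTHROPIC_API_KEY,
# SLACK_WEBHOOK_URL, the model name, and the prompt are placeholders.
set -euo pipefail

# 1. Grab the latest commit as a patch, trimmed so the prompt stays small.
diff=$(git log -1 --patch)
diff=${diff:0:100000}

# 2. Build the request body for the Anthropic Messages API.
body=$(jq -n --arg diff "$diff" '{
  model: "claude-3-5-sonnet-latest",
  max_tokens: 1024,
  messages: [{
    role: "user",
    content: ("Review this commit and point out likely bugs or risks:\n\n" + $diff)
  }]
}')

# 3. Ask Claude for a review and pull the text out of the response.
review=$(curl -sS https://api.anthropic.com/v1/messages \
  -H "x-api-key: ${ANTHROPIC_API_KEY}" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d "$body" | jq -r '.content[0].text')

# 4. Post the review to a Slack channel via an incoming webhook.
jq -n --arg text "$review" '{text: $text}' |
  curl -sS -X POST -H "content-type: application/json" -d @- "${SLACK_WEBHOOK_URL}"
```

Wrap something like this in a workflow step that runs on pull requests and that's the whole "product."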
TL;DR The "build vs. buy" equation has flipped. Businesses used to buy SaaS because it was cheaper than building their own. AI has changed that—building your own is now more affordable than ever. The discovery problem. AI recommendations default to well-established solutions. Think SEO is a long game? Try LLM SEO. Everyone worries about AI taking developer jobs, but what if AI wipes out the entire off-the-shelf software industry? The "Why Buy?" Problem Six months ago, we needed an AI-powered code review tool. We explored several options, tested them all, and ultimately "vibe-coded" our own GitHub Action—a simple Bash script that takes a git log, sends it to Claude via curl, and posts the results to Slack. Done. The best part? AI wrote the entire thing—faster than it took to sign up for another SaaS. How long until every company realizes they can do this? Need a simple CRM with JIRA-style tasks? Done. Need a mobile time-tracking app for remote employees? AI will spit out a React Native iOS build in minutes. Why pay for yet another SaaS when you can "vibe-code" something in a week? The Discovery Problem AI doesn’t just make it easier to build software—it makes it harder for new SaaS products to get discovered. When you ask AI for recommendations, it defaults to the biggest names. Here’s an open-source analogy: imagine launching a game-changing JS framework today. AI coding assistants and tools like Cursor will still default to React. And not even the latest version! Adam Wathan recently asked on Twitter, "Has anyone migrated to Tailwind 4.0 yet?" The most popular response was "Nah! we're still waiting for LLMs to learn it." AI isn’t just "the next internet moment." It’s more like "the social network moment." Echo chambers get louder, big names get bigger, and smaller ones disappear into the noise. What Can SaaS Companies Do? 1. Become an Industry Standard Or at least a "go-to" product in a niche. If your app becomes something people mention on their CVs or job descriptions, you win. Examples: Slack. HubSpot. Salesforce etc. A salesperson moving to a new company simply expects Salesforce to be there. That kind of lock-in ensures survival. 2. Build Moats: Infrastructure & Vendor Lock-In SaaS products that are just CRUD apps will die. The ones that survive will own infrastructure. Examples: Transistor.fm – Not just a SaaS, but also a podcast hosting and distribution pipeline. Postmark (or any transactional email service really) – AI can code an email-sending app, but it can't get you a 10-year old high-reputation sender IP address trusted by Gmail and Outlook. SignWell and similar B2B file-sharing apps (literally easier to use then code your own). Don't just build another CRUD sales CRM, build a CRM with an inbound VoIP number – because AI can’t replace telco infrastructure (yet). Which SaaS Will Die First? Side-project-scale, "one simple tool" SaaS products that used to be easy wins—Calendly replacements, form builders, schedulers, basic dashboards, simple workflow apps—those days are over. If AI can generate it in an afternoon, no one is paying a subscription for it. Oh, and "no code" is toasted too. The SaaS graveyard is about to get a lot more crowded. I give it 4 years. Software consulting is making a comeback though. Someone has to clean up the vibe-coded chaos.
I mean, it is! But the whole story about the stock market reacting to the news about DeepSeek V3 and R1 is a fine example of the knee-jerk nature of mass consciousness in the era of clickbait economics. Briefly, by points:

No, DeepSeek isn't "head and shoulders above" every other model. The results vary across benchmarks, but on average, GPT-4o and Gemini-2 are better. You can see this on ChatBot Arena, for example (Reddit thread). Even in the results published by DeepSeek's authors themselves (benchmark graph), you can see that in several tests the model lags behind GPT-4o from May 2024—which, mind you, is currently ranked 16th on ChatBot Arena.

No, training DeepSeek didn't cost $6 million, "100 times less than GPT-4." The $6 million figure refers only to the final training run of the published model. It doesn't include any prior experiments, earlier versions, or R&D costs. This is just the raw computational cost of that final training run. And guess what? That figure is pretty much in line with models of the same class.

No, Nvidia did not deserve this hit. Not that we're shedding tears for them—they could use a push to lower hardware prices. And let's not forget that DeepSeek was still trained on Nvidia's own hardware. And no, their GPUs aren't suddenly obsolete. DeepSeek's computational budget is fairly standard for training, and inference for such a massive model (reminder: it's an MoE with 671 billion parameters, 37 billion of which are active per token generated) requires a ton of hardware. Inference costs are roughly on par with a 70B dense model. Naturally, they'll scale this success by throwing even more hardware at it and making the model bigger. Not to mention that DeepSeek makes LLMs more accessible for on-prem customers, which means smaller businesses will buy more GPUs, which is still good for NVDA, am I right?

Does this mean the model is bad? No, the model is very, VERY good. It outperforms the vast majority of open-source models, which is fantastic. DeepSeek used 8-bit floating point numbers (FP8) throughout the entire training process, sacrificing some precision to save memory and boost performance. Additionally, they employed a multi-token prediction system and innovative GPU clustering/connectivity techniques. These are clever and practical engineering choices that undoubtedly contributed to their success.

In the end, though, stocks will recover, ideas will spread, models will get better, and progress will march on (hopefully).
After years of working with the "big" Visual Studio, I've had enough. It's buggy, slow, and frustrating, and I've decided to make the switch to Visual Studio Code. While as a C# developer I'm still unsure if I can replicate every aspect of my workflow in VS Code, I'm willing to give it a shot—and so far, I'm really impressed.

1. Performance
Visual Studio 2022 performance has been a constant issue. It's sluggish and feels increasingly bloated with every new update. It's like watching paint dry every time I open a project. In contrast, Visual Studio Code feels lightweight and incredibly fast. The first time I opened my large project in VS Code, I was shocked — it loaded in less than a second, literally, even with extensions like "C#" and "C# Dev Kit" installed.

2. Better Developer Experience
Running dotnet watch run in VS Code's terminal has been a revelation. It's fast, responsive, and actually works consistently. Visual Studio's "hot reload" feature, on the other hand, has been a constant source of frustration for me. Half the time it doesn't work, and I'm left restarting debugging sessions over and over again. I can't tell you how many hours I've lost to that unreliable feature.

3. Fewer Bugs, Less Frustration
The minor editor bugs in Visual Studio have been endless and exhausting. I remember one particularly infuriating bug where syntax highlighting would break in Razor and .cshtml files whenever I used certain HTML tags or even just adjusted the indentation. It drove me up the wall! Not to mention the bizarre issues with JavaScript formatting that never seemed to get fixed. Since switching to VS Code, I've encountered far fewer bugs. It just feels like an environment that respects my time and sanity.

4. A Thriving Ecosystem
The VS Code extension ecosystem is alive and thriving. Need Tailwind CSS IntelliSense? There's an extension for that, and it works beautifully. Want to visualize the Git history of a particular line (a better version of git-blame)? The Git History extension has you covered. In "big" Visual Studio, I'd report issues through the "feedback hub" and wait months — or even years — for a response. With VS Code, the community is constantly contributing new tools and improvements. It's energizing (and sometimes exhausting) to be part of such an active ecosystem.

5. Cross-Platform Flexibility
One of the biggest advantages I've found with Visual Studio Code is its true cross-platform support. Whether I'm on my Windows gaming rig at home or my MacBook while traveling, VS Code runs smoothly and keeps my workflow consistent. Visual Studio's limited macOS version just doesn't cut it for me. Being able to switch between machines without missing a beat has been a game-changer.

I have to admit, I was skeptical at first. I've always had a bit of a grudge against Electron-based apps — they've often felt sluggish and bloated. But VS Code has completely changed my perspective. It's fast, responsive, and flexible enough to let me build the development environment that works best for me. Switching to VS Code has rekindled my passion for coding; it reminds me why I fell in love with development in the first place. While Visual Studio will always have its strengths, I need a tool that evolves with me—not one that holds me back.
More in programming
The previous series of articles about UART was initially motivated by an error I was getting when using the ESP-Prog. I could have jumped straight to the conclusion, but I took the time to really understand what was going on, and we are finally reaching the end of this investigation. Connecting to “real” UART again … Continue reading The serial TX path seems to be down →
One day, I got a chance. It just seemed to show up. It acted like it knew me, as if it wanted something. This is how Kobi Yamada's book What do you do with a chance? starts. I've been reading that beautiful book to the boys at bedtime since it came out in 2018. It continues: It fluttered around me. It brushed up against me. It circled me as if it wanted me to grab it. What a mesmerizing mental image of a chance, fluttering about. What do you do with a chance? is a great book exactly because it's not just for the boys, but for me too. A poetic reminder of what being open to chance looks like, and what to do when it shows up. Right now, Omarchy feels like that chance. Like Linux fluttered into my hands and said "let's take a trip to the moon". I joked on X that perhaps it was the new creatine routine I picked up from Pieter Levels, but I only started that a few days ago, so I really do think it's actually Linux! This exhilaration of The Chance reminds me of the 1986 cult classic Highlander. There's a fantastic montage in the middle where Sean Connery is teaching Christopher Lambert to fight for the prize of immortality, and in it, he talks about The Quickening. Feeling the stag, feeling the opportunity. That's the feeling I have when I wake up in the morning at the moment: The Quickening. There's something so exciting here, so energizing, that I simply must get to the keyboard and chase wherever it flutters to.
Your monthly updates post is here! This month we have a couple of releases for our developer tools, plus plenty of improvements to Bluetooth, as well as a hardware enablement boost from Ubuntu and plenty to talk about in Early Access. Let’s dive in!

System Settings

The previously mentioned redesign of Bluetooth Settings has arrived! This redesign not only brings a bit more visual separation between paired devices and nearby devices, but also improves the keyboard navigation and screen reader experience. Plus, you can now double click rows to activate them. We resolved an issue where devices would sometimes be duplicated in the list and fixed issues when a pairing request requires entering passcodes—like with some keyboards. You’ll now also see fewer unnamed devices when discovering; enabling and disabling Bluetooth on devices that have been hardware locked should now work reliably; and to top it all off, performance when listing lots of devices has also been improved.

Bluetooth settings has a new design

Leonhard and Ryo fixed a couple of issues with sidebar selections when navigating directly to a setting from search. Ryo fixed an issue where Sharing Settings lost its window controls when a network was not connected. And there’s now an action to jump directly to the System Updates page from the context menu in the Dock or Applications Menu or via search.

Code

Working with Git projects continues to get better thanks to Jeremy! There’s now a new feature to clone Git repositories directly from inside Code via the projects menu in the sidebar. The item for opening project folders has moved there as well, so managing your open projects now happens all in one place no matter where they come from.

You can now clone Git projects in Code

He also fixed an issue with blank tooltips appearing in empty sidebar folders, a crash when deleting selected text while using the “Highlight Selection” plugin, and a freeze when editing lists with the “Markdown” plugin. Plus, the Symbols sidebar now shows a loading spinner when searching symbols takes longer than usual, and filters have been fixed for C symbols.

Terminal

Terminal will now warn about pasted commands that include options to skip confirmation, like -y, --interactive=never, and --force. Plus, we now make sure to show all found warnings about a pasted command, not just the first one found. For example, if a command like sudo apt update && sudo apt install -y fuse is pasted, we will warn about the use of admin privileges, multiple commands, and the fact that it skips confirmations, not just that it uses admin privileges.

Terminal warns about more potentially dangerous commands

Corentin fixed an issue where long commands could resize windows. Jeremy fixed an issue where tab labels didn’t properly update when using screen or ssh. And he made sure we properly close tabs when using the exit command.

And More

A few small bug fixes for our Window Manager: Corentin resolved a potential crasher, Leo improved dock hide animations, and Leonhard fixed an issue with revealing the panel over fullscreen apps. A new Hardware Enablement stack has arrived thanks to Ubuntu! This includes Linux 6.14 and Mesa 25, which brings support for newer hardware as well as some big performance gains and potential battery life improvements. OMG! Ubuntu! breaks down all the nerdy details here.

Get These Updates

As always, pop open System Settings → System on elementary OS 8 and hit “Update All” to get these updates plus your regular security, bug fix, and translation updates.
Or set up automatic updates and get a notification when updates are ready to install!

Early Access

Maps

Last month, Ryo made the last release of Atlas on AppCenter because we’ll soon be shipping it by default in elementary OS as Maps! As part of our work to improve the experience on computers you take with you—like notebooks and tablets—we’ve been working on features that use your location, and shipping a Maps app is part of that work.

An in-development version of the new Maps

Our hope with Maps is to improve support for mapping and location features in our platform libraries and to improve experiences in other apps like Tasks and Calendar as well as 3rd party apps in AppCenter. We’ve also already made a tiny improvement to the wider Freedesktop ecosystem by documenting a standard location icon used across desktops and in Portals. We’re really looking forward to getting your feedback and learning how we can improve experiences for apps that use your location in elementary OS.

Window Manager & Dock

First up is some new eye candy. We’ve heard requests for transparency and blur over the years, and I’m happy to report we’re now experimenting with some new effects in shell elements like the Alt + Tab Window Switcher. We want to make sure to carefully balance shiny effects with performance and legibility, so be sure to send in your feedback. We’re also on track to apply some blur behind the Dock soon, so watch out for that.

Blur effects have landed in the window switcher and are coming to the Dock

Speaking of the Dock, you may have noticed that it now sticks around in Multitasking View! We’ve replaced the old workspace switcher, and you can now launch apps from the Dock directly into different workspaces to quickly get things set up exactly how you like. We’ve also merged in a new feature to monitor background apps that use the cross-platform Background Portal. Here you can not only manage background apps, but also see an explanation of what exactly they’re doing while running. With these features, we’re seeing years of design and development work come together: an improved way to multitask on elementary OS whether you use a mouse at your desktop, multitouch gestures on your laptop or tablet, or rely on keyboard shortcuts to get the job done. I’m extremely proud of what the team has done here and look forward to hearing more from you about it!

Background apps now show in the Dock

And that’s not all. Building on the previously mentioned Gesture Controller, the new Touchpad backend for multitouch gestures has also landed. This replaces Touchégg in the Secure Session as the way to track multitouch gestures, fixes bugs, and enables new features like two-dimensional swiping between workspaces and the full Multitasking View. So if you’re a fan of gestures on your notebook, we’d love for you to try it out and report back before we ship it for everyone.

Hardware Support

Last but not least, we’re now building universal EFI install images for ARM64 processors. This means that instead of building unique ARM images for every hardware platform, we can build a single universal image for platforms like Pinebook, Raspberry Pi, and M-series Macs. These builds are still experimental and come with a few bugs, but we’d love folks to give them a spin in a virtual machine or on a spare computer and report back. If everything looks good, we may be able to offer stable ARM64 downloads starting with OS 8.1 later this year.
Sponsors

I want to give special thanks this month to Ryan Prior for his extremely generous one-time sponsorship! Ryan noted that this sponsorship was dedicated to the hard work of Renato, who has been translating elementary OS into Brazilian Portuguese. Thanks a ton for your work, Renato! At the moment we’re at 22% of our monthly funding goal and 332 Sponsors on GitHub! Shoutouts to everyone helping us reach our goals here. Your monthly sponsorship funds development and makes sure we have the resources we need to give you the best version of elementary OS we can! Monthly release candidate builds and daily Early Access builds are available to GitHub Sponsors from any tier! Beware that Early Access builds are not considered stable and you will encounter fresh issues when you run them. We’d really appreciate it if you report any problems you encounter, either with the Feedback app or directly on GitHub.
I’ve been listening to Poor Charlie’s Almanack, which is a compilation of talks by Charlie Munger, legendary vice-chairman at Berkshire Hathaway. One thing Charlie talks about is what he calls “sit on your ass investing”, which is the opposite of day trading. Rather than being in the market every day (chasing trends, reacting to fluctuations, and trying to time transactions), Charlie advocates spending most of your time “sitting on your ass”. That doesn’t mean you’re doing nothing. It means that instead of constantly trading, you’re spending your time in research and preparation for trading. Eventually, a top-tier opportunity will come along and your preparation will make you capable of recognizing it and betting big. That’s when you trade. After that, you’re back to “sitting on your ass”. Trust your research. Trust your choices. Don’t tinker. Don’t micromanage. Don’t panic. Just let the compounding effects of a good choice work in your favor.

Day Trading, Day Developing

As a day trader, your job is to trade daily (it’s right there in the job title). If you’re not trading every day — following trends, reacting to fluctuations, timing trades — then what are you even doing? Not your job, apparently. I think it’s easy to view “development” this way. You’re a developer. Your job is to develop programs — to write code. If you’re not doing that every single day, then what are you even doing? From this perspective, it becomes easy to think that writing endless code for ever-changing software paradigms is just how one develops websites. But it doesn’t have to be that way. Granted, there’s cold-blooded and warm-blooded software. Sometimes you can’t avoid that. But I also think there’s a valuable lesson in Charlie’s insight. You don’t have to chase “the market” of every new framework or API, writing endless glue code for features that already exist or that will soon exist in browsers. Instead, you can make a few select, large bets on the web platform and then “sit on your ass” until the payoff comes later!

An Example: Polyfills

I think polyfills are a great example of a “sit on your ass” approach to web development. Your job as a developer is to know enough to make a bet on a particular polyfill that aligns with the future of the web platform. Once implemented, all you have to do is sit on your ass while other really smart people who are building browsers do their part to ship the polyfilled feature in the platform. Once shipped, you “sell” your investment by stripping out the polyfill and reap the reward of having your application get lighter and faster with zero additional effort. A big part of the payoff is in the waiting — in the “sitting on your ass”. You make a smart bet, then you sit patiently while others run around endlessly writing and rewriting more code (meanwhile the only thing left for you will be to delete code). Charlie’s business partner Warren Buffett once said that it’s “better to buy a wonderful company at a fair price, than a fair company at a wonderful price”. Similarly, I’d say it’s better to build on a polyfill aligned with the future of the platform than to build on a framework re-inventing a feature of the platform.

Get Out Of Your Own Way

Want to do “Day Trading Development”? Jump tools and frameworks constantly — “The next one will solve all our problems!” Build complex, custom solutions that duplicate work the web platform is already moving towards solving. Commit code that churns with time, rather than compounds with it.

Want to do “Sit on Your Ass Development”?
Do the minimum necessary to bridge the gap until browsers catch up. Build on forward-facing standards, then sit back and leverage the compounding effects of browser makers and standards bodies that iteratively improve year over year (none of whom you have to pay). As Alex Russell recommends, spend as little time as possible in your own code and instead focus on gluing together “the big C++/Rust subsystems” of the browser. In short: spend less time gluing together tools and frameworks on top of the browser, and more time bridging tools and APIs inside of the browser. Then get out of your own way and go sit on your ass. You might find yourself more productive than ever!
We're going all-in on Omarchy at 37signals. Over the next three years, as the regular churn of hardware invites it, we're switching everyone on our Ops and Ruby programming teams to our own Arch-derived Linux distribution (and of course sharing all the improvements we make along the way with everyone else on Omarchy!).

It's funny how nobody bats an eye when the company mandate is to use Macs or Windows, but when the prescription is Linux, it's suddenly surprising. It really shouldn't be. Your ability to control your own destiny with Linux is far superior to what you'll get from a closed-source, commercial operating system. Of course it is! The code is literally all there! True, you might face more challenges, and there won't be a vendor to call (unless you hop into the Enterprise Linux camp, which doesn't appeal to me either). But I've never given a damn about that. I started using Ruby to build Basecamp when we could barely fill a room in America with professional Ruby programmers. This is what we do here!

This also means giving up on MacBooks and choosing Framework laptops as the new standard-issue equipment, along with desktop machines from both Framework and Beelink. PC hardware has gotten incredibly good over the last few years, as AMD in particular has managed to extract many of the same processor improvements from TSMC as Apple did so well with the M series. At least in terms of performance. Again, true, there is still a gap on the efficiency front. Nobody can currently beat Apple on performance per watt (but the gap is closing fast!), so battery life on Linux using a Framework is currently a bit shorter. I get about 6 hours of mixed use from my Framework 13, so whenever I suspect that might be a problem, I bring a small 20K mAh Anker battery in the bag, and now I have double the capacity. A small concession on a rare occasion, but nothing like the performance AND battery deficit we willingly endured for decades on the Mac before Apple switched to their own chips. Because we wanted to run OSX. It was worth sacrificing a few other concerns for. Just like Linux is today.

On the flip side, we'll get a massive boost in productivity from being able to run our Ruby on Rails test suites locally much faster. For our HEY application, even the fastest Mac, an M4 Max, is almost twice as slow as a Framework Desktop machine running Linux, which can do Docker natively.

It's an exciting new adventure for us. Omarchy is already far and away my favorite computing environment, right up there in joy and wonder with the old Amiga days or early OSX. It's been a blast learning that so many other early adopters have found the same feeling. Very reminiscent of the excitement in the early Ruby days: knowing you'd found something super special that wasn't yet widely distributed (but poised to be). I spoke about all of this with Kimberly on a bonus episode of The REWORK Podcast. Give it a listen if you're curious about the why, the how, and the inevitable objections.