More from Quentin Santos
The serial TX path seems to be down: The previous series of articles about UART was initially motivated by an error I was getting when using the ESP-Prog. I could have jumped straight to the conclusion, but I took the time to really understand what was going on, and we are finally reaching the end of this investigation. Connecting to "real" UART again …
Overanalyzing a minor quirk of Espressif's reset circuit: In the previous article, I briefly mentioned a slight difference between the ESP-Prog and the reproduced circuit when it comes to EN. Focusing on EN, it looks like the voltage level goes back to 3.3V much faster on the ESP-Prog than on the breadboard circuit. The grid is horizontally spaced at 2ms, so …
Transistors in reverse and redundant circuits
The missing part of Espressif's reset circuit: In the previous article, we peeked at the reset circuit of the ESP-Prog with an oscilloscope and reproduced it with basic components. We observed that it did not behave quite as expected. In this article, we'll look into the missing pieces. For a hint, we'll first look a bit more closely at the …
Reproducing Espressif's reset circuit: I recently discussed how Espressif implements automatic reset, a feature that lets users easily update the code on an Espressif microcontroller. There are actually more subtleties than a quick look would suggest, and I spent a fair bit of time investigating them. This article and the next two present what I have learned. The current …
More in programming
The 2025 edition of the TokyoDev Developer Survey is now live! If you're a software developer living in Japan, please take a few minutes to participate. All questions are optional, and it should take less than 10 minutes to complete. The survey will remain open until September 30th.

Last year, we received over 800 responses. Highlights included:

- Median compensation remained stable.
- The pay gap between international and Japanese companies narrowed to 47%.
- Fewer respondents had the option to work fully remotely.

For 2025, we've added several new questions, including a dedicated section on one of the most talked-about topics in development today: AI. The survey is completely anonymous, and only aggregated results will be shared, never personally identifiable information. The more responses we get, the deeper and more meaningful our insights will be. Please help by taking the survey and sharing it with your peers!
Greetings everyone! You might have noticed that it's September and I don't have the next version of Logic for Programmers ready. As penance, here are ten free copies of the book.

So a few months ago I wrote a newsletter about how we use nondeterminism in formal methods. The overarching idea: nondeterminism is when multiple paths are possible from a starting state. A system preserves a property if it holds on all possible paths. If even one path violates the property, then we have a bug. An intuitive model of this is that when faced with a nondeterministic choice, the system always makes the worst possible choice. This is sometimes called demonic nondeterminism and is favored in formal methods because we are paranoid to a fault.

The opposite would be angelic nondeterminism, where the system always makes the best possible choice. A property then holds if any possible path satisfies that property.[1] This is not as common in FM, but it still has its uses! "Players can access the secret level" or "We can always shut down the computer" are reachability properties: that something is possible even if not actually done. In broader computer science research, I'd say that angelic nondeterminism is more popular, due to its widespread use in complexity analysis and programming languages.

Complexity Analysis

P is the set of all "decision problems" (basically, boolean functions) that can be solved in polynomial time: there's an algorithm that's worst-case in O(n), O(n²), O(n³), etc.[2] NP is the set of all problems that can be solved in polynomial time by an algorithm with angelic nondeterminism.[3] For example, the question "does list l contain x" can be solved in O(1) time by a nondeterministic algorithm:

    fun is_member(l: List[T], x: T): bool {
      if l == [] { return false; }
      guess i in 0..<len(l);  // angelically pick any valid index
      return l[i] == x;
    }

Say we call is_member([a, b, c, d], c). The best possible choice would be to guess i = 2, which would correctly return true. Now call is_member([a, b], d). No matter what we guess, the algorithm correctly returns false. Ergo, O(1). NP stands for "Nondeterministic Polynomial". (And I just now realized something pretty cool: you can say that P is the set of all problems solvable in polynomial time under demonic nondeterminism, which is a nice parallel between the two classes.)

Computer scientists have proven that angelic nondeterminism doesn't give us any more "power": there are no problems solvable with AN that aren't also solvable deterministically. The big question is whether AN is more efficient: it is widely believed, but not proven, that there are problems in NP but not in P. Most famously, "Is there any variable assignment that makes this boolean formula true?" A polynomial AN algorithm is again easy:

    fun SAT(f(x1, x2, …: bool): bool): bool {
      N = num_params(f)
      for i in 1..=N {
        guess x_i in {true, false}
      }
      return f(x_1, x_2, …)
    }

The best deterministic algorithms we have to solve the same problem are worst-case exponential in the number of boolean parameters. This is a really frustrating problem, because real computers don't have angelic nondeterminism, so problems like SAT remain hard. We can solve most "well-behaved" instances of the problem in reasonable time, but the worst-case instances get intractable real fast.

Means of Abstraction

We can directly turn an AN algorithm into a (possibly much slower) deterministic algorithm, such as by backtracking. This makes AN a pretty good abstraction over what an algorithm is doing.
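To make that translation concrete, here is a minimal sketch in Python (my own illustration, not code from the newsletter): each guess becomes a loop over every possible choice, and the whole thing succeeds as soon as any branch does, which is exactly the angelic semantics.

    from itertools import product
    from typing import Callable, List, TypeVar

    T = TypeVar("T")

    def is_member(l: List[T], x: T) -> bool:
        # The angelic "guess i in 0..<len(l)" becomes a loop over all
        # indices; any() succeeds if ANY branch succeeds (angelic), at
        # the cost of O(n) work instead of the nondeterministic O(1).
        return any(l[i] == x for i in range(len(l)))

    def sat(f: Callable[..., bool], n: int) -> bool:
        # Each "guess x_i in {true, false}" becomes one more dimension
        # of an exhaustive search over all 2**n assignments: the
        # worst-case exponential blow-up described above.
        return any(f(*xs) for xs in product([False, True], repeat=n))

    print(is_member(["a", "b", "c", "d"], "c"))  # True  (the branch i = 2)
    print(is_member(["a", "b"], "d"))            # False (no branch succeeds)
    print(sat(lambda x1, x2: x1 and not x2, 2))  # True  (x1=True, x2=False)
    print(sat(lambda x1: x1 and not x1, 1))      # False (unsatisfiable)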
Does the regex (a+b)\1+ match "abaabaabaab"? Yes, if the regex engine nondeterministically guesses that it needs to start at the third letter and make the group aab. How does my PL's regex implementation find that match? I dunno, backtracking or NFA construction or something; I don't need to know the deterministic specifics in order to use the nondeterministic abstraction. (A quick check of this example appears after the footnotes below.)

Neel Krishnaswami has a great definition of "declarative language": "any language with a semantics has some nontrivial existential quantifiers in it". I'm not sure if this is identical to saying "a language with an angelic nondeterministic abstraction", but they must be pretty close, and all of his examples match:

- SQL's selects and joins
- Parsing DSLs
- Logic programming's unification
- Constraint solving

On top of that I'd add CSS selectors and a planner's actions: all nondeterministic abstractions over a deterministic implementation. He also says that the things programmers hate most in declarative languages are features "that expose the operational model": constraint solver search strategies, Prolog cuts, regex backreferences, etc. Which again matches my experiences with angelic nondeterminism: I dread features that force me to understand the deterministic implementation. But they're necessary, since P probably != NP, and so we need to worry about operational optimizations.

Eldritch Nondeterminism

If you need to know the ratio of good/bad paths, the number of good paths, or a probability, or anything more than "there is a good path" or "there is a bad path", you are beyond the reach of heaven or hell.

1. Angelic and demonic nondeterminism are duals: angelic returns "yes" if some choice: correct, and demonic returns "no" if !(all choice: correct), which is the same as some choice: !correct.
2. Pet peeve about Big-O notation: O(n²) is the set of all algorithms that, for sufficiently large problem sizes, grow no faster than quadratically. "Bubblesort has O(n²) complexity" should be written Bubblesort ∈ O(n²), not Bubblesort = O(n²).
3. To be precise, solvable in polynomial time by a Nondeterministic Turing Machine, a very particular model of computation. We can broadly talk about P and NP without framing everything in terms of Turing machines, but some details of complexity classes (like the existence of "weak NP-hardness") kinda need Turing machines to make sense.
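And here is the promised check of the regex example, assuming Python's re module (whose backtracking engine supports the backreference \1):

    import re

    # (a+b)\1+ captures a run of a's ending in b, then demands that the
    # same captured text repeat at least once more. The engine's "guess"
    # that succeeds is the group "aab" starting at the third letter.
    m = re.search(r"(a+b)\1+", "abaabaabaab")
    print(m.group(1))  # "aab"       -- the guessed group
    print(m.start())   # 2           -- zero-based index of the third letter
    print(m.group(0))  # "aabaabaab" -- "aab" followed by two more repeats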
Learn how to use Vitest’s defaults to eliminate extra configuration and prevent flaky results, letting you write reliable tests with less effort.
In my previous post, I advocated turning against the unproductive. Whenever you decide to turn against a group, it's very important to prevent purity spirals. There needs to be a bright line that doesn't move. Here is that line: you should be, on net, producing more than you are consuming.

You shouldn't feel bad if you are producing less than you could be. But at the end of your life, total it all up. You should have produced more than you consumed.

We used to make shit in this country, build shit. It needs to stop. I have to believe that the average person is net positive, because if they aren't, we're already too far gone, and any prospect of a democracy is over. But if we aren't too far gone, we have to stop the hemorrhaging.

The unproductive rich are in cahoots with the unproductive poor to take from you. And it's really the unproductive rich that are the problem. They loudly frame helping the unproductive as a moral issue of helping the poor, because they know deep down they are unproductive losers. But they aren't beyond saving. They just need to make different choices. This cultural change starts with you.

Private equity, market manipulators, real estate, lawyers, lobbyists: this is no longer okay. You know the type of person I'm talking about. Let's elevate farmers, engineers, manufacturing, miners, construction, food prep, delivery, operations. Jobs that produce value that you can point to.

There's a role for everyone in society, from productive billionaires to the fry cook at McDonald's. They are both good people. But negative-sum jobs need to no longer be socially okay. The days of living off the work of everyone else are over. We live in a society. You have to produce more than you consume.