More from Paolo Amoroso's Journal
Printing rich text to windows is one of the planned features of DandeGUI, the GUI library for Medley Interlisp I'm developing in Common Lisp. I finally got around to this and implemented the GUI:WITH-TEXT-STYLE macro, which controls the attributes of text printed to a window, such as the font family and face.

GUI:WITH-TEXT-STYLE establishes a context in which text printed to the stream associated with a TEdit window is rendered in the style specified by the arguments. The call to GUI:WITH-TEXT-STYLE here extends the square root table example by printing the heading in a 12-point bold sans serif font:

(gui:with-output-to-window (stream :title "Table of square roots")
  (gui:with-text-style (stream :family :sans :size 12 :face :bold)
    (format stream "~&Number~40TSquare Root~2%"))
  (loop for n from 1 to 30
        do (format stream "~&~4D~40T~8,4F~%" n (sqrt n))))

The code produces this window in which the styled column headings stand out:

[Image: Medley Interlisp window of a square root table generated by the DandeGUI GUI library.]

The :FAMILY, :SIZE, and :FACE arguments determine the corresponding text attributes. :FAMILY may be a generic family such as :SERIF for an unspecified serif font, :SANS for a sans serif font, or :FIX for a fixed width font; or a keyword denoting a specific family like :TIMESROMAN.

At the heart of GUI:WITH-TEXT-STYLE is a pair of calls to the Interlisp function PRINTOUT that wrap the macro body, the first for setting the font of the stream to the specified style and the other for restoring the default:

(DEFMACRO WITH-TEXT-STYLE ((STREAM &KEY FAMILY SIZE FACE) &BODY BODY)
  (ONCE-ONLY (STREAM)
    `(UNWIND-PROTECT
         (PROGN
           (IL:PRINTOUT ,STREAM IL:.FONT (TEXT-STYLE-TO-FD ,FAMILY ,SIZE ,FACE))
           ,@BODY)
       (IL:PRINTOUT ,STREAM IL:.FONT DEFAULT-FONT))))

PRINTOUT is an Interlisp function for formatted output, similar to Common Lisp's FORMAT but with additional font control via the .FONT directive. The symbols of PRINTOUT, i.e. its directives and arguments, are in the Interlisp package.

In turn GUI:WITH-TEXT-STYLE calls GUI::TEXT-STYLE-TO-FD, an internal DandeGUI function which passes to .FONT a font descriptor matching the required text attributes. GUI::TEXT-STYLE-TO-FD calls IL:FONTCOPY to build a descriptor that merges the specified attributes with any unspecified ones copied from the default font. The font descriptor is an Interlisp data structure that represents a font in the Medley environment.

#DandeGUI #CommonLisp #Interlisp #Lisp
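As a further illustration of the :FAMILY and :FACE arguments described in the post above, here is a minimal sketch that prints a heading in a fixed-width bold font and then falls back to the window's default font; it uses only the API shown in the post, and the exact rendering depends on the fonts available in the Medley environment:

;; Heading in a fixed-width bold font, then plain text in the default font.
(gui:with-output-to-window (stream :title "Text style sketch")
  (gui:with-text-style (stream :family :fix :size 12 :face :bold)
    (format stream "~&Fixed-width bold heading~%"))
  ;; Outside the WITH-TEXT-STYLE body the default font is restored.
  (format stream "~&Plain text printed with the window's default font.~%"))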
I continued working on DandeGUI, a GUI library for Medley Interlisp I'm writing in Common Lisp. I added two new short public functions, GUI:CLEAR-WINDOW and GUI:PRINT-MESSAGE, and fixed a bug in some internal code.

GUI:CLEAR-WINDOW deletes the text of the window associated with the Interlisp TEXTSTREAM passed as the argument:

(DEFUN CLEAR-WINDOW (STREAM)
  "Delete all the text of the window associated with STREAM. Returns STREAM."
  (WITH-WRITE-ENABLED (STR STREAM)
    (IL:TEDIT.DELETE STR 1 (IL:TEDIT.NCHARS STR)))
  STREAM)

It's little more than a call to the TEdit API function IL:TEDIT.DELETE for deleting text in the editor buffer, wrapped in the internal macro GUI::WITH-WRITE-ENABLED that establishes a context for write access to a window.

I also wrote GUI:PRINT-MESSAGE. This function prints a message to the prompt area of the window associated with the TEXTSTREAM passed as an argument, optionally clearing the area prior to printing. The prompt area is a one-line Interlisp prompt window attached above the title bar of the TEdit window where the editor displays errors and status messages.

(DEFUN PRINT-MESSAGE (STREAM MESSAGE &OPTIONAL DONT-CLEAR-P)
  "Print MESSAGE to the prompt area of the window associated with STREAM.
If DONT-CLEAR-P is non-NIL the area will not be cleared first. Returns STREAM."
  (IL:TEDIT.PROMPTPRINT STREAM MESSAGE (NOT DONT-CLEAR-P))
  STREAM)

GUI:PRINT-MESSAGE just passes the appropriate arguments to the TEdit API function IL:TEDIT.PROMPTPRINT, which does the actual printing. The documentation of both functions is in the API reference on the project repo.

Testing DandeGUI revealed that sometimes text wasn't appended to the end but inserted at the beginning of windows. To address the issue I changed GUI::WITH-WRITE-ENABLED to ensure the file pointer of the stream is set to the end of the file (i.e. -1) prior to passing control to output functions. The fix was to add a call to the Interlisp function IL:SETFILEPTR:

(IL:SETFILEPTR ,STREAM -1)

#DandeGUI #CommonLisp #Interlisp #Lisp
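A minimal usage sketch combining the two new functions described above. GUI:OPEN-WINDOW-STREAM comes from the earlier DandeGUI introduction (below in this feed); its :TITLE keyword is an assumption here, mirroring GUI:WITH-OUTPUT-TO-WINDOW:

;; Print some text, report status in the prompt area, then wipe the window.
(let ((stream (gui:open-window-stream :title "Scratch output")))
  (format stream "~&Some throwaway text~%")
  ;; Report status in the one-line prompt area above the title bar.
  (gui:print-message stream "Output complete")
  ;; Delete all the window text so the window can be reused...
  (gui:clear-window stream)
  ;; ...and print another message without clearing the prompt area first
  ;; (DONT-CLEAR-P is T).
  (gui:print-message stream "Window cleared" t))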
I'm working on DandeGUI, a Common Lisp GUI library for simple text and graphics output on Medley Interlisp. The name, pronounced "dandy guy", is a nod to the Dandelion workstation, one of the Xerox D-machines Interlisp-D ran on in the 1980s.

DandeGUI allows the creation and management of windows for stream-based text and graphics output. It captures typical GUI patterns of the Medley environment such as printing text to a window instead of the standard output. The main window of this screenshot was created by the code shown above it:

[Image: A text output window created with DandeGUI on Medley Interlisp and the Lisp code that generated it.]

The library is written in Common Lisp and exposes its functionality as an API callable from Common Lisp and Interlisp code.

Motivations

In most of my prior Lisp projects I wrote programs that print text to windows. In general these windows are actually not bare Medley windows but running instances of the TEdit rich-text editor. Driving a full editor instead of directly creating windows may be overkill, but I get content scrolling for free, as well as window resizing and repainting, which TEdit handles automatically.

Moreover, TEdit windows have an associated TEXTSTREAM, an Interlisp data structure for text stream I/O. A TEXTSTREAM can be passed to any Common Lisp or Interlisp output function that takes a stream as an argument, such as PRINC, FORMAT, and PRIN1. For example, if S is the TEXTSTREAM associated with a TEdit window, (FORMAT S "~&Hello, Medley!~%") inserts the text "Hello, Medley!" in the window at the position of the cursor. Simple and versatile.

As I wrote more GUI code, recurring patterns and boilerplate emerged. These programs usually create a new TEdit window; set up the title and other options; fetch the associated text stream; and return it for further use. The rest of the program prints application-specific text to the stream and hence to the window. These patterns were ripe for abstracting and packaging in a library that other programs can call. This work is also good experience with API design.

Usage

An example best illustrates what DandeGUI can do and how to use it. Suppose you want to display some text in a window, such as a table of square roots. This code creates the table in the screenshot above:

(gui:with-output-to-window (stream :title "Table of square roots")
  (format stream "~&Number~40TSquare Root~2%")
  (loop for n from 1 to 30
        do (format stream "~&~4D~40T~8,4F~%" n (sqrt n))))

DandeGUI exports all the public symbols from the DANDEGUI package, which has the nickname GUI. The macro GUI:WITH-OUTPUT-TO-WINDOW creates a new TEdit window with the title specified by :TITLE, and establishes a context in which the variable STREAM is bound to the stream associated with the window. The rest of the code prints the table by repeatedly calling the Common Lisp function FORMAT with the stream.

GUI:WITH-OUTPUT-TO-WINDOW is best suited for one-off output, as the stream is no longer accessible outside of its scope. To retain the stream and send output in a series of steps, or from different parts of the program, you need a combination of GUI:OPEN-WINDOW-STREAM and GUI:WITH-WINDOW-STREAM. The former opens and returns a new window stream which may later be used by FORMAT and other stream output functions. These functions must be wrapped in calls to the macro GUI:WITH-WINDOW-STREAM to establish a context in which a variable is bound to the appropriate stream.
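Here is a minimal sketch of that retained-stream pattern. Only the function names come from the post; the exact lambda lists of GUI:OPEN-WINDOW-STREAM and GUI:WITH-WINDOW-STREAM are assumptions for illustration, not taken from the DandeGUI documentation:

;; Open a window once and keep its stream around for later output.
(defvar *log-window* (gui:open-window-stream :title "Log")
  "Window stream kept around so several functions can print to it.")

(defun log-line (text)
  "Append TEXT as a new line to the log window."
  (gui:with-window-stream (stream *log-window*)
    (format stream "~&~A~%" text)))

;; Different parts of the program can now write to the same window:
;; (log-line "Starting up...")
;; (log-line "Done.")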
The DandeGUI documentation on the project repository provides more details, sample code, and the API reference.

Design

DandeGUI is a thin wrapper around the Interlisp system facilities that provide the underlying functionality. The main reason for a thin wrapper is to have a simple API that covers the most common user interface patterns. Despite the simplicity, the library takes care of much of the complexity of managing Medley GUIs, such as content scrolling and window repainting and resizing.

A thin wrapper doesn't hide much of the data structures ubiquitous in Medley GUIs, such as menus and font descriptors. This is a plus, as the programmer can leverage prior knowledge of these facilities. So far I have no clear idea how DandeGUI may evolve, which is one more reason not to deepen the wrapper too much without a clear direction.

The user need not know whether DandeGUI packs TEdit or ordinary windows under the hood. Therefore, another design goal is to hide this implementation detail. For example, DandeGUI disables the main command menu of TEdit and sets the editor buffer to read-only so that typing in the window doesn't change the text accidentally.

Using Medley Common Lisp

DandeGUI relies on basic Common Lisp features. Although the Medley Common Lisp implementation is not ANSI compliant, it provides all I need, with one exception. The function DANDEGUI:WINDOW-TITLE returns the title of a window and allows setting it with a SETF function. However, the SEdit structure editor and the File Manager of Medley don't support or track function names that are lists, such as (SETF WINDOW-TITLE). A good workaround is to define SETF functions with DEFSETF, which Medley does support along with the CLtL macro DEFINE-SETF-METHOD. A sketch of this workaround appears at the end of this post.

Next steps

At present DandeGUI doesn't do much more than what is described here. To enhance this foundation I'll likely allow clearing existing text and give control over where to insert text in windows, such as at the beginning or end. DandeGUI will also have rich text facilities like printing in bold or changing fonts.

The windows of some of my programs have an attached menu of commands and a status area for displaying errors and other messages. I will eventually implement such menu-ed windows.

To support programs that do graphics output I plan to leverage the functionality of Sketch for graphics in a way similar to how I build upon TEdit for text. Sketch is the line drawing editor of Medley. The Interlisp graphics primitives require as an argument a DISPLAYSTREAM, a data structure that represents an output sink for graphics. It is possible to use the Sketch drawing area as an output destination by associating a DISPLAYSTREAM with the editor's window. Like TEdit, Sketch takes care of repainting content as well as window scrolling and resizing. In other words, DISPLAYSTREAM is to Sketch what TEXTSTREAM is to TEdit. DandeGUI will create and manage Sketch windows with associated streams suitable for use as the DISPLAYSTREAM the graphics primitives require.

#DandeGUI #CommonLisp #Interlisp #Lisp
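A hedged sketch of the DEFSETF workaround mentioned under "Using Medley Common Lisp" above. The hash table, the setter name SET-WINDOW-TITLE, and its body are hypothetical stand-ins made up for illustration so the pattern runs in any Common Lisp; only WINDOW-TITLE and the use of DEFSETF come from the post:

;; Toy stand-ins: the real DandeGUI accessor talks to Medley windows,
;; not a hash table.
(defvar *titles* (make-hash-table)
  "Maps window designators to their titles.")

(defun window-title (window)
  "Return the title recorded for WINDOW."
  (gethash window *titles*))

(defun set-window-title (window title)
  "Record TITLE for WINDOW and return TITLE."
  (setf (gethash window *titles*) title))

;; DEFSETF makes (SETF (WINDOW-TITLE w) "...") expand into a call to
;; SET-WINDOW-TITLE, so no list-named (SETF WINDOW-TITLE) function is ever
;; defined for SEdit or the File Manager to trip over.
(defsetf window-title set-window-title)

;; Usage:
;; (setf (window-title 'main) "Table of square roots")
;; (window-title 'main)  ;=> "Table of square roots"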
My journey to Lisp began in the early 1990s. Over three decades later, a few days ago I rediscovered the first Lisp environment I ever used back then, which contributed to my love for the language. Here it is, PC Scheme running under DOSBox-X on my Linux PC:

[Image: Screenshot of the PC Scheme Lisp development environment for MS-DOS by Texas Instruments running under DOSBox-X on Linux Mint Cinnamon.]

Using PC Scheme again brought back lots of great memories and made me reflect on what the environment taught me about Lisp and Lisp tooling.

As a Computer Science student at the University of Milan, Italy, around 1990 I took an introductory computers and programming class taught by Prof. Stefano Cerri. The textbook was the first edition of Structure and Interpretation of Computer Programs (SICP), and Texas Instruments PC Scheme for MS-DOS was the recommended PC implementation. I installed PC Scheme under DR-DOS on a 20 MHz 386 Olidata laptop with 2 MB RAM and a 40 MB hard disk drive.

Prior to the class I had read about Lisp here and there but never played with the language. SICP and its use of Scheme as an elegant executable formalism instantly fascinated me. It was Lisp love at first sight. The class covered the first three chapters of the book, but I later read the rest on my own. I did lots of exercises using PC Scheme to write and run them. Soon I became one with PC Scheme.

The environment enabled a tight development loop thanks to its Emacs-like EDWIN editor that was well integrated with the system. The Lisp awareness of EDWIN blew my mind, as it was the first such tool I encountered. The editor auto-indented and reformatted code, matched parentheses, and supported evaluating expressions and code blocks. Typing a closing parenthesis made EDWIN blink the corresponding opening one and briefly show a snippet of the beginning of the matched expression. Paying attention to the matching and the snippets made me familiar with the shape and structure of Lisp code, giving a visual feel of whether code looks syntactically right or off. Within hours of starting to use EDWIN the parentheses ceased to be a concern and disappeared from my conscious attention. Handling parentheses came naturally. I actually ended up loving parentheses and the aesthetics of classic Lisp.

Parenthesis matching suggested a technique to me for writing syntactically correct Lisp code with pen and paper. When writing a closing parenthesis with the right hand, I rested the left hand on the paper with the index finger pointed at the corresponding opening parenthesis, moving the hands in sync to match the current code. This way it was fast and easy to write moderately complex code.

PC Scheme spoiled me and set the baseline of what to expect in a Lisp environment. After the class I moved to PCS/Geneva, a more advanced PC Scheme fork developed at the University of Geneva. Over the following decades I encountered and learned Common Lisp, Emacs Lisp, and Interlisp. These experiences cemented my passion for Lisp.

In the mid-1990s Texas Instruments released the executable and sources of PC Scheme. I didn't know it at the time, or if I noticed I long forgot. Until a few days ago, when nostalgia came knocking and I rediscovered the PC Scheme release. I installed PC Scheme under the DOSBox-X MS-DOS emulator on my Linux Mint Cinnamon PC. It runs well and I enjoy going through the system to rediscover what it can do. Playing with PC Scheme after decades of Lisp experience and hindsight on computing evolution shines new light on the environment.
I didn't fully realize it at the time, but the product packed amazing value for the price. It cost $99 in the US and I paid about 150,000 lire for it in Italy. Costing as much as two or three textbooks, the software was affordable even to students and hobbyists.

PC Scheme is a rich, fast, and surprisingly capable environment with features such as a Lisp-aware editor, a good compiler, a structure editor and other tools, many Scheme extensions such as engines and OOP, text windows, graphics, and a lot more. The product came with an extensive manual, a thick book in a massive 3-ring binder I read cover to cover more than once. A paper on the implementation of PC Scheme sheds light on how good the system is given the platform constraints.

Using PC Scheme now lets me put into focus what it taught me about Lisp and Lisp systems: the convenience and productivity of Lisp-aware editors; interactive development and exploratory programming; and a rich Lisp environment with a vast toolbox of libraries and facilities — this is your grandfather's batteries-included language.

Three decades after PC Scheme, a similar combination of features, facilities, and classic aesthetics drew me to Medley Interlisp, my current daily driver for Lisp development.

#Lisp #MSDOS #retrocomputing
More in programming
The appeal of "vibe coding" — where programmers lean back and prompt their way through an entire project with AI — appears partly to be based on the fact that so many development environments are deeply unpleasant to work with. So it's no wonder that all these programmers stuck working with cumbersome languages and frameworks can't wait to give up on the coding part of software development. If I found writing code a chore, I'd be looking for retirement too. But I don't. I mean, I used to! When I started programming, it was purely because I wanted programs. Learning to code was a necessary but inconvenient step toward bringing systems to life. That all changed when I learned Ruby and built Rails. Ruby's entire premise is "programmer happiness": that writing code should be a joy. And historically, the language was willing to trade run-time performance, memory usage, and other machine sympathies against the pursuit of said programmer happiness. These days, it seems like you can eat your cake and have it too, though. Ruby, after thirty years of constant improvement, is now incredibly fast and efficient, yet remains a delight to work with. That ethos couldn't shine brighter now. Disgruntled programmers have finally realized that an escape from nasty syntax, boilerplate galore, and ecosystem hyper-churn is possible. That's the appeal of AI: having it hide away all that unpleasantness. Only it's like cleaning your room by stuffing the mess under the bed — it doesn't make it go away! But the instinct is correct: Programming should be a vibe! It should be fun! It should resemble English closely enough that line noise doesn't obscure the underlying ideas and decisions. It should allow a richness of expression that serves the human reader instead of favoring the strictness preferred by the computer. Ruby does. And given that, I have no interest in giving up writing code. That's not the unpleasant part that I want AI to take off my hands. Just so I can — what? — become a project manager for a murder of AI crows? I've had the option to retreat up the manager ladder for most of my career, but I've steadily refused, because I really like writing Ruby! It's the most enjoyable part of the job! That doesn't mean AI doesn't have a role to play when writing Ruby. I'm conversing and collaborating with LLMs all day long — looking up APIs, clarifying concepts, and asking stupid questions. AI is a superb pair programmer, but I'd retire before permanently handing it the keyboard to drive the code. Maybe one day, wanting to write code will be a quaint concept. Like tending to horses for transportation in the modern world — done as a hobby but devoid of any economic value. I don't think anyone knows just how far we can push the intelligence and creativity of these insatiable token munchers. And I wouldn't bet against their advance, but it's clear to me that a big part of their appeal to programmers is the wisdom that Ruby was founded on: Programming should favor and flatter the human.
About half a year ago I encountered a paper bombastically titled “the ultimate conditional syntax”. It has the attractive goal of unifying pattern match with boolean if tests, and its solution is in some ways very nice. But it seems over-complicated to me, especially for something that’s a basic work-horse of programming. I couldn’t immediately see how to cut it down to manageable proportions, but recently I had an idea. I’ll outline it under the “penultimate conditionals” heading below, after reviewing the UCS and explaining my motivation.

what the UCS?
whence UCS
out of scope
penultimate conditionals
dangling syntax
examples
antepenultimate breath

what the UCS?

The ultimate conditional syntax does several things which are somewhat intertwined and support each other.

An “expression is pattern” operator allows you to do pattern matching inside boolean expressions. Like “match” but unlike most other expressions, “is” binds variables whose scope is the rest of the boolean expression that might be evaluated when the “is” is true, and the consequent “then” clause.

You can “split” tests to avoid repeating parts that are the same in successive branches. For example,

if num < 0 then -1
else if num > 0 then +1
else 0

can be written

if num < 0 then -1
       > 0 then +1
       else 0

The example shows a split before an operator, where the left hand operand is the same and the rest of the expression varies. You can split after the operator when the operator is the same, which is common for “is” pattern match clauses.

Indentation-based syntax (an offside rule) reduces the amount of punctuation that splits would otherwise need. An explicit version of the example above is

if { x { { < { 0 then −1 } };
         { > { 0 then +1 } };
         else 0 } }

(This example is written in the paper on one line. I’ve split it for narrow screens, which exposes what I think is a mistake in the nesting.)

You can also intersperse let bindings between splits. I doubt the value of this feature, since “is” can also bind values, but interspersed let does have its uses. The paper has an example using let to avoid rightward drift:

if let tp1_n = normalize(tp1)
   tp1_n is Bot then Bot
   let tp2_n = normalize(tp2)
   tp2_n is Bot then Bot
   let m = merge(tp1_n, tp2_n)
   m is Some(tp) then tp
   m is None then glb(tp1_n, tp2_n)

It’s probably better to use early return to avoid rightward drift. The desugaring uses let bindings when lowering the UCS to simpler constructions.

whence UCS

Pattern matching in the tradition of functional programming languages supports nested patterns that are compiled in a way that eliminates redundant tests. For example, this example checks that e1 is Some(_) once, not twice as written.

if e1 is Some(Left(lv)) then e2
         Some(Right(rv)) then e3
         None then e4

Being cheeky, I’d say UCS introduces more causes of redundant checks, then goes to great effort to eliminate redundant checks again. Splits reduce redundant code at the source level; the bulk of the paper is about eliminating redundant checks in the lowering from source to core language.

I think the primary cause of this extra complexity is treating the is operator as a two-way test rather than a multi-way match. Splits are introduced as a more general (more complicated) way to build multi-way conditions out of two-way tests. There’s a secondary cause: the tradition of expression-oriented functional languages doesn’t like early returns.
A nice pattern in imperative code is to write a function as a series of preliminary calculations and guards with early returns that set things up for the main work of the function. Rust’s ? operator and let-else statement support this pattern directly. UCS addresses the same pattern by wedging calculate-check sequences into if statements, as in the normalize example above.

out of scope

I suspect UCS’s indentation-based syntax will make programmers more likely to make mistakes, and make compilers have more trouble producing nice error messages. (YAML has put me off syntax that doesn’t have enough redundancy to support good error recovery.)

So I wondered if there’s a way to have something like an “is pattern” operator in a Rust-like language, without an offside rule, and without the excess of punctuation in the UCS desugaring. But I couldn’t work out how to make the scope of variable bindings in patterns cover all the code that might need to use them. The scope needs to extend into the consequent then clause, but also into any follow-up tests – and those tests can branch so the scope might need to reach into multiple then clauses.

The problem was the way I was still thinking of the then and else clauses as part of the outer if. That implied the expression has to be closed off before the then, which troublesomely closes off the scope of any is-bound variables. The solution – part of it, at least – is actually in the paper, where then and else are nested inside the conditional expression.

penultimate conditionals

There are two ingredients:

The then and else clauses become operators that cause early return from a conditional expression. They can be lowered to a vaguely Rust syntax with the following desugaring rules. The 'if label denotes the closest-enclosing if; you can’t use then or else inside the expr of a then or else unless there’s another intervening if.

then expr ⟼ && break 'if expr
else expr ⟼ || break 'if expr
else expr ⟼ || _ && break 'if expr

There are two desugarings for else depending on whether it appears in an expression or a pattern. If you prefer a less wordy syntax, you might spell then as => (like match in Rust) and else as || =>. (For symmetry we might allow && => for then as well.)

An is operator for multi-way pattern-matching that binds variables whose scope covers the consequent part of the expression. The basic form is like the UCS,

scrutinee is pattern

which matches the scrutinee against the pattern returning a boolean result. For example,

foo is None

Guarded patterns are like,

scrutinee is pattern && consequent

where the scope of the variables bound by the pattern covers the consequent. The consequent might be a simple boolean guard, for example,

foo is Some(n) && n < 0

or inside an if expression it might end with a then clause,

if foo is Some(n) && n < 0 => -1
// ...

Simple multi-way patterns are like,

scrutinee is { pattern || pattern || … }

If there is a consequent then the patterns must all bind the same set of variables (if any) with the same types. More typically, a multi-way match will have consequent clauses, like

scrutinee is { pattern && consequent || pattern && consequent || => otherwise }

When a consequent is false, we go on to try other alternatives of the match, like we would when the first operand of boolean || is false. To help with layout, you can include a redundant || before the first alternative.
For example,

if foo is {
    || Some(n) && n < 0 => -1
    || Some(n) && n > 0 => +1
    || Some(n) => 0
    || None => 0
}

Alternatively,

if foo is {
    Some(n) && ( n < 0 => -1
              || n > 0 => +1
              || => 0 )
    || None => 0
}

(They should compile the same way.)

The evaluation model is like familiar shortcutting && and || and the syntax is supposed to reinforce that intuition. The UCS paper spends a lot of time discussing backtracking and how to eliminate it, but penultimate conditionals evaluate straightforwardly from left to right.

The paper briefly mentions as patterns, like

Some(Pair(x, y) as p)

which in Rust would be written

Some(p @ Pair(x, y))

The is operator doesn’t need a separate syntax for this feature:

Some(p is Pair(x, y))

For large examples, the penultimate conditional syntax is about as noisy as Rust’s match, but it scales down nicely to smaller matches. However, there are differences in how consequents and alternatives are punctuated which need a bit more discussion.

dangling syntax

The precedence and associativity of the is operator is tricky: it has two kinds of dangling-else problem.

The first kind occurs with a surrounding boolean expression. For example, when b = false, what is the value of this?

b is true || false

It could bracket to the left, yielding false:

(b is true) || false

Or to the right, yielding true:

b is { true || false }

This could be disambiguated by using different spellings for boolean or and pattern alternatives. But that doesn’t help for the second kind, which occurs with an inner match.

foo is Some(_) && bar is Some(_) || None

Does that check foo is Some(_) with an always-true look at bar

( foo is Some(_) ) && bar is { Some(_) || None }

Or does it check bar is Some(_) and waste time with foo?

foo is { Some(_) && ( bar is Some(_) ) || None }

I have chosen to resolve the ambiguity by requiring curly braces {} around groups of alternative patterns. This allows me to use the same spelling || for all kinds of alternation. (Compare Rust, which uses || for boolean expressions, | in a pattern, and , between the arms of a match.) Curlies around multi-way matches can be nested, so the example in the previous section can also be written,

if foo is {
    || Some(n) && n < 0 => -1
    || Some(n) && n > 0 => +1
    || { Some(0) || None } => 0
}

The is operator binds tighter than && on its left, but looser than && on its right (so that a chain of && is gathered into a consequent) and tighter than || on its right so that outer || alternatives don’t need extra brackets.

examples

I’m going to finish these notes by going through the ultimate conditional syntax paper to translate most of its examples into the penultimate syntax, to give it some exercise. Here we use is to name a value n, as a replacement for the |> abs pipe operator, and we use range patterns instead of split relational operators:

if foo(args) is {
    || 0 => "null"
    || n && abs(n) is {
        || 101.. => "large"
        || ..10 => "small"
        || => "medium"
    }
}

In both the previous example and the next one, we have some extra brackets where UCS relies purely on an offside rule.

if x is {
    || Right(None) => defaultValue
    || Right(Some(cached)) => f(cached)
    || Left(input) && compute(input) is {
        || None => defaultValue
        || Some(result) => f(result)
    }
}

This one is almost identical to UCS apart from the spellings of and, then, else.
if name.startsWith("_") && name.tailOption is Some(namePostfix) && namePostfix.toIntOption is Some(index) && 0 <= index && index < arity && => Right([index, name]) || => Left("invalid identifier: " + name) Here are some nested multi-way matches with overlapping patterns and bound values: if e is { // ... || Lit(value) && Map.find_opt(value) is Some(result) => Some(result) // ... || { Lit(value) || Add(Lit(0), value) || Add(value, Lit(0)) } => { print_int(value); Some(value) } // ... } The next few examples show UCS splits without the is operator. In my syntax I need to press a few more buttons but I think that’s OK. if x == 0 => "zero" || x == 1 => "unit" || => "?" if x == 0 => "null" || x > 0 => "positive" || => "negative" if predicate(0, 1) => "A" || predicate(2, 3) => "B" || => "C" The first two can be written with is instead, but it’s not briefer: if x is { || 0 => "zero" || 1 => "unit" || => "?" } if x is { || 0 => "null" || 1.. => "positive" || => "negative" } There’s little need for a split-anything feature when we have multi-way matches. if foo(u, v, w) is { || Some(x) && x is { || Left(_) => "left-defined" || Right(_) => "right-defined" } || None => "undefined" } A more complete function: fn zip_with(f, xs, ys) { if [xs, ys] is { || [x :: xs, y :: ys] && zip_with(f, xs, ys) is Some(tail) => Some(f(x, y) :: tail) || [Nil, Nil] => Some(Nil) || => None } } Another fragment of the expression evaluator: if e is { // ... || Var(name) && Map.find_opt(env, name) is { || Some(Right(value)) => Some(value) || Some(Left(thunk)) => Some(thunk()) } || App(lhs, rhs) => // ... // ... } This expression is used in the paper to show how a UCS split is desugared: if Pair(x, y) is { || Pair(Some(xv), Some(yv)) => xv + yv || Pair(Some(xv), None) => xv || Pair(None, Some(yv)) => yv || Pair(None, None) => 0 } The desugaring in the paper introduces a lot of redundant tests. I would desugar straightforwardly, then rely on later optimizations to eliminate other redundancies such as the construction and immediate destruction of the pair: if Pair(x, y) is Pair(xx, yy) && xx is { || Some(xv) && yy is { || Some(yv) => xv + yv || None => xv } || None && yy is { || Some(yv) => yv || None => 0 } } Skipping ahead to the “non-trivial example” in the paper’s fig. 11: if e is { || Var(x) && context.get(x) is { || Some(IntVal(v)) => Left(v) || Some(BoolVal(v)) => Right(v) } || Lit(IntVal(v)) => Left(v) || Lit(BoolVal(v)) => Right(v) // ... } The next example in the paper compares C# relational patterns. Rust’s range patterns do a similar job, with the caveat that Rust’s ranges don’t have a syntax for exclusive lower bounds. fn classify(value) { if value is { || .. -4.0 => "too low" || 10.0 .. => "too high" || NaN => "unknown" || => "acceptable" } } I tend to think relational patterns are the better syntax than ranges. With relational patterns I can rewrite an earlier example like, if foo is { || Some(< 0) => -1 || Some(> 0) => +1 || { Some(0) || None } => 0 } I think with the UCS I would have to name the Some(_) value to be able to compare it, which suggests that relational patterns can be better than UCS split relational operators. Prefix-unary relational operators are also a nice way to write single-ended ranges in expressions. We could simply write both ends to get a complete range, like >= lo < hi or like if value is > -4.0 < 10.0 => "acceptable" || => "far out" Near the start I quoted a normalize example that illustrates left-aligned UCS expression. 
The penultimate version drifts right like the Scala version:

if normalize(tp1) is {
    || Bot => Bot
    || tp1_n && normalize(tp2) is {
        || Bot => Bot
        || tp2_n && merge(tp1_n, tp2_n) is {
            || Some(tp) => tp
            || None => glb(tp1_n, tp2_n)
        }
    }
}

But a more Rusty style shows the benefits of early returns (especially the terse ? operator) and monadic combinators.

let tp1 = normalize(tp1)?;
let tp2 = normalize(tp2)?;
merge(tp1, tp2)
    .unwrap_or_else(|| glb(tp1, tp2))

antepenultimate breath

When I started writing these notes, my penultimate conditional syntax was little more than a sketch of an idea. Having gone through the previous section’s exercise, I think it has turned out better than I thought it might. The extra nesting from multi-way match braces doesn’t seem to be unbearably heavyweight. However, none of the examples have bulky then or else blocks, which are where the extra nesting is more likely to be annoying. But then, as I said before, it’s comparable to a Rust match:

match scrutinee {
    pattern => { consequent }
}

if scrutinee is {
    || pattern => { consequent }
}

The || lines down the left margin are noisy, but hard to get rid of in the context of a curly-brace language. I can’t reduce them to | like OCaml because what would I use for bitwise OR? I don’t want presence or absence of flow control to depend on types or context. I kind of like Prolog / Erlang’s , for && and ; for ||, but that’s well outside what’s legible to mainstream programmers. So, dunno.

Anyway, I think I’ve successfully found a syntax that does most of what UCS does, but in a much simpler fashion.
I really like RTS games. I pretty much grew up on them, starting with Command&Conquer 3: Kane’s Wrath, moving on to the StarCraft 2 trilogy, and witnessing the downfall of Command&Conquer 4. I never had the disks for any other RTS games during my teenage years. Yes, the disks, the ones you go to the store to buy! I didn’t know Steam existed back then, so this was my only source of games. There is something magical in owning a physical copy of the game. I always liked the art on the front (a mandatory huge face for all RTS!), the game description and screenshots on the back, even the smell of the plastic disk case.
Following up on a previous article I wrote about backwards compatibility, I came across this document from Rick Byers of the Chrome team titled “Blink principles of web compatibility” which outlines how they navigate introducing breaking changes.

“Hold up,” you might say. “Breaking changes? But there’s no breaking changes on the web!?”

Well, as outlined in their Google Doc, “don’t break anyone ever” is a bit unrealistic. Here’s their rationale:

The Chromium project aims to reduce the pain of breaking changes on web developers. But Chromium’s mission is to advance the web, and in some cases it’s realistically unavoidable to make a breaking change in order to do that. Since the web is expected to continue to evolve incrementally indefinitely, it’s essential to its survival that we have some mechanism for shedding some of the mistakes of the past.

Fair enough. We all need ways of shedding mistakes from the past. But let’s not get too personal. That’s a different post.

So when it comes to the web, how do you know when to break something and when not to? The Chrome team looks at the data collected via Chrome’s anonymous usage statistics (you can take a peek at that data yourself) to understand how often “mistake” APIs are still being used. This helps them categorize breaking changes as low-risk or high-risk. What’s wild is that, given Chrome’s ubiquity as a browser, a number like 0.1% is classified as “high-risk”!

As a general rule of thumb, 0.1% of PageVisits (1 in 1000) is large, while 0.001% is considered small but non-trivial. Anything below about 0.00001% (1 in 10 million) is generally considered trivial. There are around 771 billion web pages viewed in Chrome every month (not counting other Chromium-based browsers). So seriously breaking even 0.0001% still results in someone being frustrated every 3 seconds, and so not to be taken lightly!

But the usage stats are merely a guide — a partially blind one at that. The Chrome team openly acknowledges their dataset doesn’t tell the whole story (e.g. Enterprise clients have metrics recording disabled, Google’s metric servers are disabled in China, and Chromium derivatives don’t record metrics at all). And Chrome itself is only part of the story. They acknowledge that a change that would break Chrome but align it with other browsers is a good thing because it’s advancing the whole web while perhaps costing Chrome specifically in the short term — community > corporation??

Breaking changes which align Chromium’s behavior with other engines are much less risky than those which cause it to deviate…In general if a change will break only sites coding specifically for Chromium (eg. via UA sniffing), then it’s likely to be net-positive towards Chromium’s mission of advancing the whole web.

Yay for advancing the web! And the web is open, which is why they also state they’ll opt for open formats where possible over closed, proprietary, “patent-encumbered” ones.

The chromium project is committed to a free and open web, enabling innovation and competition by anyone in any size organization or of any financial means or legal risk tolerance. In general the chromium project will accept an increased level of compatibility risk in order to reduce dependence in the web ecosystem on technologies which cannot be implemented on a royalty-free basis.

One example we saw of a breaking change that excluded the proprietary in favor of the open was Flash. One way of dealing with a breaking change like that is to provide an opt-out.
In the case of Flash, users were given the ability to “opt out” of Flash being deprecated via site settings (in other words, to opt in to using Flash on a page-by-page basis). That was an important step in phasing out that behavior completely over time. But not all changes get that kind of heads-up.

there is a substantial portion of the web which is unmaintained and will effectively never be updated…It may be useful to look at how long chromium has had the behavior in question to get some idea of the risk that a lot of unmaintained code will depend on it…In general we believe in the principle that the vast majority of websites should continue to function forever.

There’s a lot going on with Chrome right now, but you gotta love seeing the people who work on it making public statements like that — “we believe…that the vast majority of websites should continue to function forever.”

There’s some good stuff in this document that gives you hope that people really do care and work incredibly hard to not break the web! (It’s an ecosystem, after all.)

It’s important for [us] browser engineers to resist the temptation to treat breaking changes in a paternalistic fashion. It’s common to think we know better than web developers, only to find out that we were wrong and didn’t know as much about the real world as we thought we did. Providing at least a temporary developer opt-out is an act of humility and respect for developers which acknowledges that we’ll only succeed in really improving the web for users long-term via healthy collaborations between browser engineers and web developers.

More 👏 acts 👏 of 👏 humility 👏 in tech 👏 please!
react-router-devtools enhances debugging by adding automatic logging for loaders & actions, plus direct links to code origins in console logs.