Apple has a really awesome WWDC 2018 video called Designing Fluid Interfaces, and one of the key takeaways, from presenter Chan Karunamuni, is “Look for delays everywhere. Everything needs to respond instantly.” (6:28) A really great example of this is a scroll view on iOS. If you flick through your contacts and touch your finger to the screen, the scroll view instantly stops and lets you reposition it. This kind of instantaneous behavior is really important in our own views, interactions, and animations. Okay, so I was building a view where I wanted this behavior. Basically, you flick a box from the left side of the screen to the right, and as you release your finger it keeps going. Of note, though, is that you should be able to grab the box while it’s in flight to stop it. Turns out this is trickier than it seems, as UIPanGestureRecognizer has a small delay in startup where it requires you to move your finger before it recognizes the gesture starting…
over a year ago


More from Christian Selig

High quality, low filesize GIFs

While the GIF format is a little on the older side, it’s still a really handy format in 2025 for sharing short clips where an actual video file might have compatibility issues. For instance, when you just want a short little video on your website, a GIF is still so handy versus a video, where some browsers will refuse to autoplay, or will seem to autoplay fine until Low Power Mode is activated, etc. With GIFs it’s just… easy, and sometimes easy is nice. They’re super handy for showing a screen recording of a cool feature in your app, for instance.

What’s not nice is the size of GIFs. They have a reputation of being absolutely enormous from a filesize perspective, and they often are, but that doesn’t have to be the case: you can be smart about your GIF and optimize its size substantially. Over the years I’ve tried lots of little apps that promise to help, to no avail, so I’ve developed a little script to make this easier that I thought might be helpful to share.

Naive approach

Let’s show where GIFs get that bad reputation so we can have a baseline. We’ll use trusty ol’ ffmpeg (in the age of LLMs it is a super handy utility), which, if you don’t have it already, you can install via brew install ffmpeg. It’s a handy (and in my opinion downright essential) tool for doing just about anything with video.

For a video we’ll use this cute video of some kittens I took at our local animal shelter: it’s 4K, 30 FPS, 5 seconds long, and thanks to its H.265/HEVC video encoding it’s only 19.5 MB. Not bad! Let’s just chuck it into ffmpeg and tell it to output a GIF and see how it does.

```shell
ffmpeg -i kitties.mp4 kitties.gif
```

Okay, let that run and… oh no. For your sake I’m not even going to attach the GIF here in case folks are on mobile data, but the resulting file is 409.4 MB. Almost half a gigabyte for a 5 second GIF of kittens. We gotta do better.

Better

We can do better.
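Before we do, a quick back-of-envelope shows why the naive conversion ballooned so badly. A GIF stores each frame as palettized pixels (one byte per pixel with a 256-color palette), so at full 4K resolution and 30 FPS the raw frame data is enormous. A rough sketch, using the kitten video’s specs from above:

```shell
# Back-of-envelope: raw palettized frame data for a 4K, 30 FPS, 5 second GIF.
# One byte per pixel (256-color palette), before any LZW compression.
raw_gb=$(awk 'BEGIN { printf "%.2f", 3840 * 2160 * 30 * 5 / 1e9 }')
echo "~${raw_gb} GB of raw frame data"
```

GIF’s built-in LZW compression is what gets that down to the 409.4 MB we actually saw; it helps, but it’s no match for a modern video codec.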
Let’s throw a bunch of confusing parameters at ffmpeg (that I’ll break down) to make this a bit more manageable.

```shell
ffmpeg -i kitties.mp4 -filter_complex "fps=24,scale=iw*sar:ih,scale=1000:-1,split[a][b];[a]palettegen[p];[b][p]paletteuse=dither=floyd_steinberg" kitties2.gif
```

Okay, lot going on here, let’s break it down.

- fps=24: we’re dropping down to 24 FPS from 30 FPS; many folks upload full YouTube videos at this framerate, so it’s more than acceptable for a GIF.
- scale=iw*sar:ih: sometimes video files have weird situations where the aspect ratio of each pixel isn’t square, which GIFs don’t like, so this is just a correction step so that doesn’t potentially trip us up.
- scale=1000:-1: we don’t need our GIF to be 4K, and I’ve found 1,000 pixels across to be a great middle ground for GIFs. The -1 at the end just means scale the height to the appropriate value rather than us having to do the math ourselves.
- The rest is related to the color palette: we’re telling ffmpeg to scan the entire video to build up an appropriate color palette, and to apply it with Floyd–Steinberg dithering. I find this gives the highest quality output (which is also handy for compressing it more in further steps).

This gives us a dang good looking GIF that clocks in at about 10% of the original file size, at 45.8 MB. (Link to GIF in lieu of embedding directly.) Nice!

Even better

ffmpeg is great, but since it’s geared toward videos it doesn’t do every GIF optimization imaginable. You could stop where we are and be happy, but if you want to shave off a few more megabytes, we can leverage gifsicle, a small command line utility built around optimizing GIFs. We’ll install gifsicle via brew install gifsicle and throw our GIF into it with the following:

```shell
gifsicle -O3 --lossy=65 --gamma=1.2 kitties2.gif -o kitties3.gif
```

So what’s going on here?
- -O3 is essentially gifsicle’s most efficient optimization mode, doing fancy things like delta frames, so changes between frames are stored rather than each frame separately.
- --lossy=65 defines the level of compression; 65 has been a good middle ground for me (200, I believe, is the highest compression level).
- --gamma=1.2 is a bit confusing, but essentially the gamma controls how the lossy parameter reacts to (and thus compresses) colors. 1 will allow it to be quite aggressive with colors, while 2.2 (the default) is much less so. Through trial and error I’ve found 1.2 gives nice compression without much loss in quality.

The resulting GIF is now 23.8 MB, shaving a nice additional 22 MB off, so we’re now at a meagre 5% of our original filesize. That’s a lot closer to the 4K, ~20 MB input, so for a GIF I’ll call that a win. And for something like a simpler screen recording it’ll be even smaller!

Make it easy

Rather than having to remember that command or come back here and copy paste it all the time, add the following to your ~/.zshrc (or create it if you don’t have one already):

```shell
gifify() {
  # Defaults
  local lossy=65 fps=24 width=1000 gamma=1.2

  while [[ $# -gt 0 ]]; do
    case "$1" in
      --lossy) lossy="$2"; shift 2 ;;
      --fps) fps="$2"; shift 2 ;;
      --width) width="$2"; shift 2 ;;
      --gamma) gamma="$2"; shift 2 ;;
      --help|-h)
        echo "Usage: gifify [--lossy N] [--fps N] [--width N] [--gamma VAL] <input video> <output.gif>"
        echo "Defaults: --lossy 65 --fps 24 --width 1000 --gamma 1.2"
        return 0 ;;
      --) shift; break ;;
      --*) echo "Unknown option: $1" >&2; return 2 ;;
      *) break ;;
    esac
  done

  if (( $# < 2 )); then
    echo "Usage: gifify [--lossy N] [--fps N] [--width N] [--gamma VAL] <input video> <output.gif>" >&2
    return 2
  fi

  local in="$1"
  local out="$2"
  local tmp="$(mktemp -t gifify.XXXXXX).gif"
  trap 'rm -f "$tmp"' EXIT

  echo "[gifify] FFmpeg: starting encode → '$in' → temp GIF (fps=${fps}, width=${width})…"
  if ! ffmpeg -hide_banner -loglevel error -nostats -y -i "$in" \
      -filter_complex "fps=${fps},scale=iw*sar:ih,scale=${width}:-1,split[a][b];[a]palettegen[p];[b][p]paletteuse=dither=floyd_steinberg" \
      "$tmp"
  then
    echo "[gifify] FFmpeg failed." >&2
    return 1
  fi

  echo "[gifify] FFmpeg: done. Starting gifsicle (lossy=${lossy}, gamma=${gamma})…"
  if ! gifsicle -O3 --gamma="$gamma" --lossy="$lossy" "$tmp" -o "$out"; then
    echo "[gifify] gifsicle failed." >&2
    return 1
  fi

  local bytes
  bytes=$(stat -f%z "$out" 2>/dev/null || stat -c%s "$out" 2>/dev/null || echo "")
  if [[ -n "$bytes" ]]; then
    local mb
    mb=$(LC_ALL=C printf "%.2f" $(( bytes / 1000000.0 )))
    echo "[gifify] gifsicle: done. Wrote '$out' (${mb} MB)."
  else
    echo "[gifify] gifsicle: done. Wrote '$out'."
  fi
}
```

This will allow you to easily call it as gifify <input-filename.mp4> <output-gifname.gif> and default to the values above, or if you want to tweak them you can use any of the optional parameters. For instance:

```shell
# Using the default values we used above
gifify cats.mp4 cats.gif

# Changing the lossiness and gamma
gifify --lossy 30 --gamma 2.2 cats.mp4 cats.gif
```

Much easier. May your GIFs be beautiful and efficient.
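One last bit of intuition on why the fps and width defaults punch so far above their weight: they shrink the pixel budget multiplicatively. A rough sketch, assuming the 4K kitten video and the default settings (output height of 562 assumed from 1000 × 2160/3840, rounded):

```shell
# Fraction of the original pixel budget left after fps=24 and scale=1000:-1.
# 4K source: 3840x2160 @ 30 FPS; output: ~1000x562 @ 24 FPS.
awk 'BEGIN {
  before = 3840 * 2160 * 30
  after  = 1000 * 562 * 24
  printf "%.1f%% of the original pixels per second\n", 100 * after / before
}'
```

So before the palette tricks even enter the picture, those two settings alone leave only around 5% of the data the naive conversion was chewing through; the rest of the difference in final file size comes down to how well each version compresses.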

2 weeks ago
You should repaste your MacBook (but don't)

My favorite memory of my M1 Pro MacBook Pro was the whole sensation of “holy crap, you never hear the fans in this thing”, which was very novel in 2021. Four years later, this MacBook Pro is still a delight. It’s the longest I’ve ever owned a laptop, and while I’d love to pick up the new M4 goodness, this dang thing still seems to just shrug at basically anything I throw at it: video editing, code compiling, CAD models, the works. (My desire to update is helped, though, by the fact I got the 2TB SSD, 32GB RAM option, and upgrading to those on new MacBooks is still eye-wateringly expensive.)

But my MacBook is starting to show its age in one area: it’s not quiet anymore. If you’re doing anything too intensive, like compiling code for a while or converting something in Handbrake, the age of the fans being quiet is long past. The fans are properly loud. (And despite having two cats, it’s not them! I clean out the fans pretty regularly.)

Enter the thermal paste

Everyone online seems to point toward one thing: the thermal paste on computers tends to dry up over the years. What the heck is thermal paste? Well, components in your computer that generate a lot of heat are normally made to touch something like a copper heatsink that is really good at pulling that heat away. The issue is, when you press these two metal surfaces against each other, even the best machining isn’t perfect, and there are microscopic gaps between them, meaning there’s just air at those spots, and air is a terrible conductor of heat. The solution is to put a little bit of thermal paste (basically a special grey toothpaste gunk that is really good at transferring heat) between them, and it fills in any of those microscopic gaps.

The problem with this solution is that after hundreds and hundreds of days of intense heat, the paste can dry up into something closer to a powder, and it’s not nearly as good at filling in those gaps.

Replacement time

(Image: the logic board!)
MacBook thermal paste isn’t anything crazy (for the most part, see below); custom PC builders use thermal paste all the time, so incredibly performant options are available online. I grabbed a tube of Noctua NT-H2 for about $10 and set to taking apart my MacBook to swap out the aging thermal paste. And thankfully, iFixit has a tremendous, in-depth guide on the disassembly required, so I got to it.

Indeed, that grey thermal paste looked quite old, but above and below it (on the RAM chips) I noticed something that didn’t quite seem like thermal paste; it was far more… grainy, almost?

(Image: spottiness is due to half of it being on the heatsink)

It turns out, ending with my generation of MacBooks (lucky me!), Apple used a very special kind of thermal compound often called “Carbon Black”, which is basically designed to bridge an even thicker gap than traditional thermal paste. I thought about replacing it, but it seems really hard to come across that special thermal compound (and do not do it with normal thermal paste), and my RAM temperatures always seemed fine (65°C is fine… right?), so I just made sure not to touch that. For the regular grey thermal paste, I used some cotton swabs and isopropyl alcohol to remove the dried-up existing paste, then painted on a bit of the new stuff.

Disaster

To get to the underside of the CPU, you basically need to disassemble the entire MacBook. It’s honestly not that hard, but iFixit warned that the fan cables (which also need to be unclipped) are incredibly delicate. And they’re not wrong; seriously, they have the structural integrity of the half-ply toilet paper available at gas stations. So, wouldn’t you know it, I moved the left fan’s cable a bit too hard and it completely tore in half. Gah.

I found a replacement fan online (yeah, you can’t just buy the cable, you need a whole new fan) and in the meantime I just kept an eye on my CPU thermals.
As long as I wasn’t doing anything too intensive it honestly always stayed around 65°C, which was warm, but not terrifying (MacBook Airs completely lack a fan, after all).

Take two

A few days later, the fan arrived, and I basically had to redo the entire disassembly process to get to it. At least I was a lot faster this time. The fan was incredibly easy to swap out (hats off there, Apple!), and I screwed everything back together and began reconnecting all the little connectors. Until I saw it: the tiny Touch ID sensor cable (made of the same half-ply material as the fan cable) was inexplicably torn in half, the top half just hanging out. I didn’t even have to touch this thing, really, and I hadn’t even gotten to the stage of reconnecting it (I was about to!); it comes from underneath the logic board, and I guess just the movement of sliding the logic board back in sheared it in half.

Bah. I looked up whether I could just grab another replacement cable, and sure enough you can… but the Touch ID chip is cryptographically paired to your MacBook, so you’d have to take it into an Apple Store. Estimates seemed to be in the hundreds of dollars, so if anyone has any experience there let me know, but for now I’m just going to live happily without a Touch ID sensor… or the button, because the button also does not work. RIP, little buddy.

(And yeah, I’m 99.9% sure I can’t solder this back together; there’s a bunch of tiny lanes that make up the cable, and you would need experience with proper micro-soldering to fix it.)

Honestly, the disassembly process for my MacBook was surprisingly friendly and not very difficult. I just really wish they beefed up some of the cables even slightly so they weren’t so delicate.

The results

I was going to cackle if I went through all that just to have identical temperatures as before, but I’m very happy to say they actually improved a fair bit.
I ran a Cinebench test before disassembling the MacBook the very first time to establish a baseline:

- Max CPU temperature: 102°C
- Max fan speed: 6,300 RPM
- Cinebench score: 12,252

After the new thermal paste (and the left fan being new):

- Max CPU temperature: 96°C
- Max fan speed: 4,700 RPM
- Cinebench score: 12,316

Now, just looking at those scores you might be like… so? But let me tell you, dropping 1,600 RPM on the fan is a noticeable change; it goes from “Oh my god this is annoyingly loud” to “Oh look, the fans kicked in”. And despite slower fan speeds there was still a decent drop in CPU temperature! And a 0.5% higher Cinebench score!

But where I also really notice it is in idling: just writing this blog post, my CPU was right at 46°C the whole time, where previously my computer idled right around 60°C. The whole computer just feels a bit healthier.

So… should you do it?

Honestly, unless you’re very used to working on small, delicate electronics, probably not. But if you do have that experience and are very careful, or have a local repair shop that can do it for a reasonable fee (and your MacBook is a few years old, so as to warrant it), it’s honestly a really nice tweak that I feel will hopefully at least get me to the M5 generation. I do miss Touch ID, though.
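As a footnote for the numerically inclined, the benchmark deltas above work out like so, sketched quickly in shell:

```shell
# Percentage changes between the two Cinebench runs above.
awk 'BEGIN {
  printf "Fan speed: %.1f%% lower\n", 100 * (6300 - 4700) / 6300
  printf "Cinebench: %.2f%% higher\n", 100 * (12316 - 12252) / 12252
}'
```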

a month ago
A slept-on upscaling tool for macOS

I upload YouTube videos from time to time, and a fun comment I often get is “Whoa, this is in 8K!”. Even better, I’ve had comments from the, like, seven people with 8K TVs that the video looks awesome on their TV. And you guessed it, I don’t record my videos in 8K! I record them in 4K and upscale them to 8K after the fact.

There’s no shortage of AI video upscaling tools today, but they’re of varying quality, and some are great but quite expensive. The legendary Finn Voorhees created a really cool tool though, called fx-upscale, that smartly leverages Apple’s built-in MetalFX framework. For the unfamiliar, MetalFX is an extension of Apple’s Metal graphics library that adds functionality similar to NVIDIA’s DLSS: it intelligently upscales video using machine learning (AI), so rather than just stretching an image, it uses a model to try to infer what the frame would look like at a higher resolution. It’s primarily geared toward video game use, but Finn’s library shows it does an excellent job for video too.

I think this is a really killer utility, and I use it for all my videos. I even have a license for Topaz Video AI, which arguably works better, but takes an order of magnitude longer. For instance, my recent 38 minute, 4K video took about an hour to render to 8K via fx-upscale on my M1 Pro MacBook Pro, but would take over 24 hours with Topaz Video AI.

```shell
# Install with Homebrew
brew install finnvoor/tools/fx-upscale

# Outputs a file named my-video Upscaled.mov
fx-upscale my-video.mov --width 7680 --codec h265
```

Anyway, just wanted to give a tip toward a really cool tool! Finn’s even got a version in the Mac App Store called Unsqueeze (https://apps.apple.com/ca/app/unsqueeze/id6475134617) with an actual GUI that’s even easier to use, but I really like the command line version because you get a bit more control over the output. 8K is kinda overkill for most use cases, so to be clear, you can go from, like, 1080p to 4K as well if you’re so inclined.
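To put those resolutions in perspective, here’s a quick pixel-count sketch (standard 16:9 UHD dimensions assumed):

```shell
# Pixels per frame at each resolution, and the jump a 4K-to-8K upscale makes.
awk 'BEGIN {
  uhd  = 3840 * 2160   # 4K
  full = 7680 * 4320   # 8K
  printf "4K: %.1f MP, 8K: %.1f MP (%.0fx the pixels)\n", uhd / 1e6, full / 1e6, full / uhd
}'
```

That 4x jump in pixels is why the model has real work to do per frame, and part of why a render still takes on the order of an hour.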
I just really like 8K for the future-proofing of it all; in however many years, when 8K TVs are more common, I’ll be able to have some of my videos already able to take advantage of that. And it takes long enough to upscale that I’d be surprised to see TVs or YouTube offering that upscaling natively in a way that looks as good, given the amount of compute currently required.

(Image: obviously very zoomed in to show the difference more easily)

If you ask me, for indie creators, even when 8K displays are more common, the future of recording still probably won’t be native 8K. 4K recording still gives more than enough detail to allow AI to do a compelling upscale to 8K. I think for my next camera I’m going to aim for recording in 6K (so I can still reframe in post), and then continue to output the final result in 4K to be AI upscaled. I’m coming for you, Lumix S1ii.

a month ago
Embedding Godot games in iOS is easy

Recently there have been very exciting developments in the Godot game engine that allow really easy and powerful integration into an existing, normal iOS or Mac app. I couldn’t find a lot of documentation or discussion about this, so I wanted to shine some light on why this is so cool, and how easy it is to do!

What’s Godot?

For the uninitiated, Godot is an engine for building games; other common ones you might know of are Unity and Unreal Engine. It’s risen in popularity a lot over the last couple of years due to its open nature: it’s completely open source, MIT licensed, and worked on in the open. But beyond that, it’s also a really well made tool for building games (both 2D and 3D), with a great UI, beautiful iconography, a ton of tutorials and resources, and, as a bonus, it’s very lightweight. I’ve had a lot of fun playing around with it (considering potentially integrating it into Pixel Pals), and while Unity and Unreal Engine are also phenomenal tools, Godot has felt lightweight and approachable in a really nice way. As an analogy, Godot feels closer to Sketch and Figma, whereas Unity and Unreal feel more like Photoshop/Illustrator or other kinda bulky Adobe products. Even Apple has taken interest in it, contributing a substantial pull request for visionOS support in Godot.

Why use it with iOS?

You’ve always been able to build a game in Godot and export it to run on iOS, but recently, thanks to advancements in the engine and work by amazing folks like Miguel de Icaza, you can now embed a Godot game in an existing, normal SwiftUI or UIKit app just as you would an extra UITextView or ScrollView. Why is this important? Say you want to build a game or experience, but you don’t want it to feel like just another port; you want it to integrate nicely with iOS and feel at home there through use of some native frameworks and UI here and there to anchor the experience (share sheets, local notifications, a simple SwiftUI sheet for adding a friend, etc.).
Historically your options have been very limited or difficult. You no longer have to have “a Godot game” or “an iOS app”; you can have the best of both worlds: a fun game built entirely in Godot, while having your share sheets, Settings screens, your paywall, home screen widgets, onboarding, iCloud sync, etc. all in native Swift code. Dynamically choosing which tool you want for the job. (Again, this was technically possible before and with other engines, but it was much, much more complicated. Unity’s in particular seems to have been last updated during the first Obama presidency.)

And truly, this doesn’t only benefit “game apps”. Heck, if the user is doing something that will take a while to complete (uploading a video, etc.) you could give them a small game to play in the interim. Or, just for some fun, you could embed a little side-scroller easter egg in one of your Settings screens to delight spelunking users. Be creative!

SpriteKit?

A quick aside. It wouldn’t be an article about game dev on iOS without mentioning SpriteKit, Apple’s native 2D game framework (Apple also has SceneKit for 3D). SpriteKit is well done, and actually what I built most of Pixel Pals in. But it has a lot of downsides versus a proper, dedicated game engine:

- Godot has a wealth of tutorials on YouTube and elsewhere, and bustling Discord communities for help, where SpriteKit, being a lot more niche, can be quite hard to find details on.
- The obvious one: SpriteKit only works on Apple platforms, so if you want to port your game to Android or Windows you’re probably not going to have a great time, where Godot is fully cross-platform.
- Godot, being a full-out game engine, has a lot more tools for game development that can be handy: animation tools, sprite sheet editors, controls that make experimenting a lot easier, handy tools for creating shaders, and so much more than I could hope to go over in this article.
If you ever watch a YouTube video of someone building a game in a full engine, the wealth of tools they have for speeding up development is bonkers.

- Godot is updated frequently by a large team of employees and volunteers; SpriteKit, conversely, isn’t exactly one of Apple’s most loved frameworks (I don’t think it’s been mentioned at WWDC in years) and kinda feels like something Apple isn’t interested in putting much more work into. Maybe that’s because it does everything Apple wants and is considered “finished” (if so, I think that would be incorrect; see the previous point for many things it would be helpful for SpriteKit to have), but if you were to encounter a weird bug I’d feel much better about the likelihood of it getting fixed in Godot than in SpriteKit.

I’m a big fan of using the right tool for the job. For iOS apps, most of the time that’s building something incredible in SwiftUI and UIKit. But for building a game, be it small or large, using something purpose-built to be incredible at that seems like the play to me, and Godot feels like a great candidate there.

Setup

Simply add the SwiftGodotKit package to your Xcode project by selecting your project in the sidebar, ensuring your project is selected in the new sidebar, selecting the Package Dependencies tab, clicking the +, then pasting in the GitHub link. After adding it, you will also need to select the target that you added it to in the sidebar, select the Build Settings tab, then find “Other Linker Flags” and add -lc++. Lastly, with that same target, under the General tab, add MetalFX.framework to Frameworks, Libraries, and Embedded Content. (Yeah, you got me, I don’t know why we have to do that.) After that, you should be able to import SwiftGodotKit.

Usage

Now we’re ready to use Godot in our iOS app! What excites me most, and what I want to focus on, is embedding an existing Godot game in your iOS app and communicating back and forth with it from your iOS app.
This way, you can do the majority of the game development in Godot without even opening Xcode, and then sprinkle in delightful iOS integration by communicating between iOS and Godot where needed.

To start, we’ll build a very simple game called IceCreamParlor, where we select from a list of ice cream options in SwiftUI, which then gets passed into Godot. Godot will have a button the user can tap to send a message back to SwiftUI with the total amount of ice cream. This will not be an impressive “game” by any stretch of the imagination, but it should be easy to set up and to understand the concepts, so you can apply them to an actual game.

To accomplish our communication, in essence we’ll be recreating iOS’ NotificationCenter to send messages back and forth between Godot and iOS, and like NotificationCenter, we’ll create a simple singleton to accomplish this. Those messages will be sent via Signals. This is Godot’s system for, well, signaling that an event occurred, and it can be used to signify everything from a button press, to a player taking damage, to a timer ending. Keeping with the NotificationCenter analogy, this would then be the Notification that gets posted (except in Godot it’s used for everything, where in iOS land you really wouldn’t use NotificationCenter for a button press).

And similar to Notification, which has a userInfo field to provide more information about the notification, Godot signals can also take an argument that provides more information. (For example, if the signal was “player took damage”, the argument might be an integer indicating how much damage they took.) Like userInfo, this is optional, however, and you can also fire off a signal with no further information, something like “userUnlockedPro” for when they activate Pro after your SwiftUI paywall.

For our simple example, we’re going to send an “iceCreamSelected” signal from iOS to Godot, and an “iceCreamCountUpdated” signal from Godot to iOS.
The former will have a string argument for which ice cream was selected, and the latter will have an integer argument with the updated count.

Setting up our Godot project

Open Godot.app (available to download from their website) and create a new project; I’ll type in IceCreamParlor, choose the Mobile renderer, then click Create. Godot defaults to a 3D scene, so I’ll switch to 2D at the top, and then in the left sidebar click 2D Scene to create that as our root node. I’ll right-click the sidebar to add a child node, and select Label. We’ll set its text to “Ice cream:”. In the right sidebar, we’ll go to Theme Overrides and increase the font size to 80 to make it more visible, and we’ll also rename it in the left sidebar from Label to IceCreamLabel. We’ll do the same to add a Button to the scene, which we’ll call UpdateButton and set its text to “Update Ice Cream Count”.

If you click the Play button in the top right corner of Godot, it will run and you can click the button, but as of now it doesn’t do anything. We’ll select our root node (Node2D) in the sidebar, right-click, and select “Attach Script”. Leave everything as default, and click Create. This will now present us with an area where we can actually write GDScript, and we can refer to the objects in our scene by prefixing their names with a dollar sign.

Inside our script, we’ll implement the _ready function, which is essentially Godot’s equivalent of viewDidLoad, and inside we’ll connect to the simple signal we discussed earlier. We’ll do this by grabbing a reference to our singleton, referencing the signal we want, then connecting to it by passing the function we want to be called when the signal is received. And of course the function takes a String as a parameter, because our signal includes which ice cream was selected.
```gdscript
extends Node2D

var ice_cream: Array[String] = []

func _ready() -> void:
    var singleton = Engine.get_singleton("GodotSwiftMessenger")
    singleton.ice_cream_selected.connect(_on_ice_cream_selected_signal_received)

func _on_ice_cream_selected_signal_received(new_ice_cream: String) -> void:
    # We received a signal! Probably should do something…
    pass
```

Note that we haven’t actually created the singleton yet, but we will shortly. Also note that normally in Godot you have to declare custom signals like the ones we’re using, but we’re going to declare them in Swift. As long as they’re declared somewhere, Godot is happy!

Let’s also hook up our button by going back to our scene, selecting our button in the canvas, selecting the “Node” tab in the right sidebar, and double-clicking the pressed() option. We can then select that same Node2D script and name the function _on_update_button_pressed to add a function that executes when the button is pressed (fun fact: the button-pressed event is also powered by signals).

```gdscript
func _on_update_button_pressed() -> void:
    pass
```

Setting up our iOS/Swift project

Let’s jump over to Xcode and create a new SwiftUI project there as well, also calling it IceCreamParlor. We’ll start by adding the Swift package for SwiftGodotKit to Swift Package Manager, add -lc++ to our “Other Linker Flags” under “Build Settings”, add MetalFX.framework, then go to ContentView.swift and add import SwiftGodotKit at the top. From here, let’s create a simple SwiftUI view so we can choose from some ice cream options.

```swift
var body: some View {
    HStack {
        Button {
        } label: {
            Text("Chocolate")
        }
        Button {
        } label: {
            Text("Strawberry")
        }
        Button {
        } label: {
            Text("Vanilla")
        }
    }
    .buttonStyle(.bordered)
}
```

We’ll also create a new file in Xcode called GodotSwiftMessenger.swift. This will be where we implement our singleton that is akin to NotificationCenter.
```swift
import SwiftGodot

@Godot
class GodotSwiftMessenger: Object {
    public static let shared = GodotSwiftMessenger()

    @Signal var iceCreamSelected: SignalWithArguments<String>
    @Signal var iceCreamCountUpdated: SignalWithArguments<Int>
}
```

We first import SwiftGodot (minus the Kit), essentially because this part is purely about interfacing with Godot through Godot, and doesn’t care whether or not it’s embedded in an iOS app. For more details on SwiftGodot see its section below. Then, we annotate our class with the @Godot Swift macro, which basically just says “Hey, make Godot aware that this class exists”. The class is a subclass of Object, as everything in Godot needs to inherit from Object; it’s essentially the parent class of everything. Following that is your bog-standard Swift singleton initialization.

Then, with another Swift macro, we annotate each variable we want to act as a signal, which signifies to Godot that it’s a Signal. You can specify its type as either Signal or SignalWithArguments<T>, depending on whether or not the specific signal also sends data alongside it. Our iceCreamSelected signal, for instance, includes a string with more details on which ice cream was selected. Note that we used “ice_cream_selected” in Godot but “iceCreamSelected” in Swift; this is because the snake_case convention is used in Godot, and SwiftGodotKit will automatically map the camelCase Swift convention to it.

Now we need to tell Godot about this singleton we just made. We want Godot to know about it as soon as possible, because if things aren’t hooked up, Godot might emit a signal that we wouldn’t receive in Swift, or vice-versa. So, we’ll hook it up very early in our app’s lifecycle. In SwiftUI, you might do this in the init of your main App struct as I’ll show below, and in UIKit in applicationDidFinishLaunching.
```swift
@main
struct IceCreamParlor: App {
    init() {
        initHookCb = { level in
            guard level == .scene else { return }
            register(type: GodotSwiftMessenger.self)
            Engine.registerSingleton(name: "GodotSwiftMessenger", instance: GodotSwiftMessenger.shared)
        }
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
```

In addition to the boilerplate code Xcode gives us, we’ve added an extra step to the initializer, where we set a callback on initHookCb. This is just a callback that fires as Godot is set up, and it specifies what level of setup has occurred. We want to wait until the .scene level is reached, which means the game is ready to go (you could set things up at an even earlier level if you see that as beneficial). Then, we just tell Godot about our type by calling register, and then we register the singleton itself with the name we want it to be accessible under.

Again, we want to do this early: if Godot were already set up in our app before we set initHookCb, its contents would never fire and thus we wouldn’t register anything. But don’t worry, this hook won’t fire until we first initialize our Godot game in iOS ourselves, so as long as this code is called before then, we’re golden.

Lastly, everything is registered in iOS land, but there’s still nothing that emits or receives signals.
Let’s change that by going to ContentView.swift, and changing our body to the following:

```swift
import SwiftUI
import SwiftGodotKit
import SwiftGodot

struct ContentView: View {
    @State var totalIceCream = 0
    @State var godotApp: GodotApp = GodotApp(packFile: "main.pck")

    var body: some View {
        VStack {
            GodotAppView()
                .environment(\.godotApp, godotApp)

            Text("Total Ice Cream: \(totalIceCream)")

            HStack {
                Button {
                    GodotSwiftMessenger.shared.iceCreamSelected.emit("chocolate")
                } label: {
                    Text("Chocolate")
                }
                Button {
                    GodotSwiftMessenger.shared.iceCreamSelected.emit("strawberry")
                } label: {
                    Text("Strawberry")
                }
                Button {
                    GodotSwiftMessenger.shared.iceCreamSelected.emit("vanilla")
                } label: {
                    Text("Vanilla")
                }
            }
            .buttonStyle(.bordered)
        }
        .onAppear {
            GodotSwiftMessenger.shared.iceCreamCountUpdated.connect { newTotalIceCream in
                totalIceCream = newTotalIceCream
            }
        }
    }
}
```

There’s quite a bit going on here, but let’s break it down, because it’s really quite simple. We have two new state variables. The first is to keep track of the ice cream count. Could we just do this ourselves purely in SwiftUI? Totally, but for fun we’re going to be totally relying on Godot to keep us updated there, and we’ll just reflect that in SwiftUI to show the communication. Secondly, and more importantly, we need to declare a variable for our actual game file so we can embed it. We do this embedding at the top of the VStack by creating a GodotAppView, a handy SwiftUI view we can now leverage, and we do so by just setting its environment variable to the game we just declared. Then, we change our buttons to actually emit the selections via signals, and when the view appears, we make sure we connect to the signal that keeps us updated on the count so we can reflect that in the UI. Note that we don’t also connect to the iceCreamSelected signal, because we don’t care to receive it in SwiftUI; we’re just firing that one off for Godot to handle.
Communicating

Let’s update our GDScript in Godot to take advantage of these changes.

```gdscript
func _on_ice_cream_selected_signal_received(new_ice_cream: String) -> void:
    ice_cream.append(new_ice_cream)
    $IceCreamLabel.text = "Ice creams: " + ", ".join(ice_cream)

func _on_update_button_pressed() -> void:
    var singleton = Engine.get_singleton("GodotSwiftMessenger")
    singleton.ice_cream_count_updated.emit(ice_cream.size())
```

Not too bad! We now receive the signal from SwiftUI and update our internal state in Godot accordingly, as well as the UI, by turning our ice cream list into a comma-separated string. And then when the user taps the update button, we send (emit) that signal back to SwiftUI with the updated count.

Running

To actually see this live, first make sure you have an actual iOS device plugged in. Unfortunately, Godot doesn’t work with the iOS simulator. Secondly, in Godot, select the Project menu bar item, then Export, then click the Add button and select “iOS”. This will bring you to a screen with a bunch of options, but my understanding is that 99% of this only applies if you’re building your app entirely in Godot: you can plug in all the things you’d otherwise plug into Xcode here instead, and Godot will handle them for you. That doesn’t apply to us; we’re going to do all that normally in Xcode anyway, we just want the game files. So ignore all that and select “Export PCK/ZIP…” at the bottom. It’ll ask you where you want to save it (I just keep it in the Godot project directory); make sure “Godot Project Pack (*.pck)” is selected in the dropdown, and then save it as main.pck. That’s our “game” bundled up, as meager as it is! We’ll then drop that into Xcode, making sure to add it to our target, and then we can run it on the device!
Here we’ll see that choosing the ice cream flavor at the bottom in SwiftUI beams it into the Godot game that’s just chilling like a SwiftUI view, and then we can tap the update button in Godot land to beam the new count right back to SwiftUI to be displayed. Not exactly a AAA game, but enough to show the basics of communication 😄 Look at you go! Take this as a leaping-off point for all the cool SwiftUI and Godot interoperability that you can accomplish, be it tapping a Settings icon in Godot to bring up a beautifully designed, native SwiftUI settings screen, or confirming to your game that the user upgraded to the Pro version through your SwiftUI paywall.

Bonus: SwiftGodot (minus the “Kit”)

An additional fun option (that sits at the heart of SwiftGodotKit) is SwiftGodot, which allows you to actually build your entire Godot game with Swift as the programming language if you so choose. Swift for iOS apps, Swift on the server, Swift for game dev. Swift truly is everywhere. For me, I’m enjoying playing around in GDScript, which is Godot’s native programming language, but it’s a really cool option to know about.

Embed size

A fear might be that embedding Godot into your app bloats the binary and results in an enormous app download size. Godot is very lightweight: adding it to your codebase adds a relatively meager (at least by 2025 standards) 30MB to your binary size. That’s a lot larger than SpriteKit’s 0MB, but for all the benefits Godot offers that’s a pretty compelling trade. (30MB was measured by handy blog sponsor, Emerge Tools.)

Tips

Logging

If you log something in Godot/GDScript via print("something"), that will also print to the Xcode console. Handy!

Quickly embedding the pck into iOS

Exporting the pck file from Godot to Xcode is quite a few clicks, so if you’re doing it a lot it would be nice to speed that up. We can use the command line to make this a lot nicer.
Godot.app also has a headless mode you can use by going inside the .app file, then Contents > MacOS > Godot. But typing the full path to that binary is no fun, so let’s symlink the binary to /usr/local/bin.

```shell
sudo ln -s "/Applications/Godot.app/Contents/MacOS/Godot" /usr/local/bin/godot
```

Now we can simply type godot anywhere in the Terminal to open the Godot app, or use godot --headless for some command line goodness. My favorite way to do this is to run something like the following from within your Godot project directory:

```shell
godot --headless --export-pack "iOS" /path/to/xcodeproject/target/main.pck
```

This will handily export the pck and add it to our Xcode project, overwriting any existing pck file, from which point we can simply compile our iOS app.

Wrapping it up

I really think Godot’s new interoperability with iOS is an incredibly exciting avenue for building games on iOS, be it a full-fledged game or a small little easter egg integrated into an existing iOS app, and hats off to all the folks who did the hard work getting it working. Hopefully this serves as an easy way to get things up and running! It might seem like a lot at first glance, but most of the code shown above is just boilerplate to get an example Godot and iOS project up and running; the actual work to embed a game and communicate across them is delightfully simple! (Also, big shout out to Chris Backas and Miguel de Icaza for help getting this tutorial off the ground.)

2 months ago 44 votes
Curing Mac mini M4 fomo with 3D printing

Spoiler: 3D printed! The colored ports really sell the effect

If you’re anything like me, you’ve found the new, tinier Mac mini to be absolutely adorable. But you might also be like me in that you either already have an awesome M1 Mac mini that you have no real reason to replace, or the new Mac mini just isn’t something you totally need. That logic might be sound, but it doesn’t make you want one any less. To help cure this FOMO, I made a cute little 3D printable Mac mini that can sit on your desk and be all cute. But then I had an even better idea: the new Mac mini is powerful, sure, but it can’t hold snacks. Or a plant. Or your phone. Or pens/pencils. So I also made some versions you can print that add some cute utility to your desk in the form of the new Mac mini. They’re free of course! Just chuck ’em into your (or your friend’s) 3D printer. It even has all the little details modeled, like the power button, ports (including rear), and fan holes! They’re pretty easy to print: each model is in separate parts for ease of printing the bottom a different color (black) versus the top, then just put a dab of glue (or just use gravity) to keep them together. If you have a multi-color 3D printer, you can color the ports and power LED to make it look extra cool (or just do it after the fact with paint). Here are the different options for your desk!

Secret item stash

The possibilities for what you can store on your desk are now truly endless. Individually wrapped mints? Key switches? Screws? Paper clips? Rubber bands? Flash drives? Download link: https://makerworld.com/en/models/793456

A very green sorta Mac

The first carbon neutral Mac is cool and all, but what if your Mac mini literally had a plant in it? Every desk needs a cute little plant. Download link: https://makerworld.com/en/models/793464

Phone holder

A phone/tablet holder is an essential item on my desk for debugging things, watching a video, or just keeping an eye on an Uber Eats order.
Before, guests came over and saw my boring phone stand and judged me; now they come over and think I’m exciting and well-traveled. You can even charge your phone/tablet in portrait mode by pushing the cable through a tunnel made through the Ethernet port that then snakes up to the surface. Download link: https://makerworld.com/en/models/793495

Pen holder

The Playdate had the cutest little pen/pencil holder accessory, but it unfortunately never shipped and my desk is sad. This will be a nice stand-in for your beloved pens, pencils, markers, and Apple Pencils. Download link: https://makerworld.com/en/models/793470

A solid model

Or if you just want to stare at it without any frills, you can just print the normal model too! Download link: https://makerworld.com/en/models/793447

Printer recommendation

Whenever I post about 3D printing I understandably get a bunch of “Which 3D printer should I buy??” questions. This isn’t sponsored, but I’ve found over the last few years the answer has been pretty easy: something from Bambu Lab. Their printers are somehow super easy to use, well designed, and reasonably priced. Prusas are great too, but I think Bambu is hard to beat for the price. Don’t get an Ender. So if you’re looking for a printer now, Black Friday deals are aplenty, so it’s pretty much the best time to pick one up. I’d grab something in their A series if you’re on a budget, or the P1S for a bit more if you can swing it (that’s what I use). https://bambulab.com On the other hand, if you just want to print one thing now and again, a lot of local libraries are starting to have 3D printers, so that might be worth looking into! And online services exist too (eg: JLCPCB and PCBWay), but if you do it with any regularity a 3D printer is a really fun thing to pick up. Enjoy!
❤️ Learning 3D modeling over the last year has been a ton of fun, so I love a good excuse to practice, and shout out to Jerrod Hofferth and his amazing 3D printable Mac mini tower (that you should totally download) for the idea to solve my desire with some 3D printing! Also, the models are almost certainly not accurate down to the micrometer as I don’t actually have one; they’re based on Apple’s measurements as well as measuring screenshots. But it should be close! If you have a multi-color 3D printer, the linked models have the colors built in for you, ready to go, but if you want to print in a single color I also made versions available with the top and bottom separate as well as the logo, so you can print them separately in the individual colors and then connect them with a touch of super glue or something.

9 months ago 74 votes

More in technology

Learn how to make a 2D capacitive touch sensor with ElectroBOOM

Mehdi Sadaghdar, better known as ElectroBOOM, made a name for himself with shocking content on YouTube full of explosive antics. But once you get past the meme-worthy shenanigans, he is a genuinely smart guy who provides useful and accessible lessons on many electrical engineering principles. If you like your learning with a dash of over-the-top […] The post Learn how to make a 2D capacitive touch sensor with ElectroBOOM appeared first on Arduino Blog.

3 days ago 5 votes
The 'politsei' problem, or how filtering unwanted content is still an issue in 2025

A long time ago, there was a small Estonian website called “Mängukoobas” (literal translation from Estonian is “game cave”). It started out as a place for people to share various links to browser games, mostly built with Flash or Shockwave. It had a decent moderation system, randomized treasure chests that could appear on any part of the website, and a lot more.1 What it also had was a basic filtering system. As a good chunk of the audience was children (myself included), there was a need to filter out all the naughty Estonian words, such as “kurat”, “türa”, “lits” and many more colorful ones. The filtering was very basic, however, and some took it upon themselves to demonstrate how flawed the system was by intentionally using phrases like “politsei”, which is Estonian for “police”. It would end up being filtered to “po****ei” as it also contained the word “lits”, which translates to “slut”2. Of course, you could easily overcome the filter by using a healthy dose of period characters, leading to many cases of “po.l.i.t.sei” being used. With the ZIRP phenomenon we got a lot of companies wanting to get into the “platform” business, where they bring together buyers and sellers, or service providers and clients. A lot of these platforms rely on transactions taking place only on their platform and nowhere else, so they end up doing their best to prevent the two parties from being in contact off-platform and paying out of band, as that would directly cut into their revenue. As a result, they scan private messages and public content for common patterns, such as e-mails and phone numbers, and block or filter them. As you can predict, this can backfire in a very annoying way. I was looking for a cheap mini PC on a local buy-sell website and stumbled on one decent offer. I looked at the details, was going over the CPU model, and found the following:

CPU: Intel i*-****

Oh. Well, maybe it was an error; I will ask the seller for additional details with a public question.
The response? Hello, the CPU model is Intel i*-****. Damn it. I never ended up buying that machine because I don’t really want to gamble with Intel CPU model numbers, and a few days later it was gone. It’s 2025, I’m nearing my mandatory mid-life crisis, and the Scunthorpe problem is alive and well. fun tangent: the site ended up being like a tiny social network, eventually incorporating things like a cheap rate.ee knock-off where children were allowed to share pictures of themselves. As you can imagine, this was a horrible, horrible idea, as it attracted the exact type of person that would be interested in that type of content. I got lucky by being so poor that I did not have a webcam or a digital camera to make any pictures with, and I remember that fondly because someone on MSN Messenger was very insistent that I take some pictures of myself. Don’t leave children with unmonitored internet access! ↩︎ “slut” is also an actual word in Swedish which translates to “final”. I think. I’m not a Swedish expert, actually. ↩︎
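The kind of naive substring filter described above is easy to sketch. This is a hypothetical reconstruction for illustration only: the banned-word list and the asterisk masking are my assumptions, not the site's actual code.

```python
import re

# A naive word filter: mask every occurrence of a banned word, with no
# regard for word boundaries -- which is exactly how "politsei" gets mangled.
def censor(text: str, bad_words: list[str]) -> str:
    for word in bad_words:
        text = re.sub(re.escape(word), "*" * len(word), text, flags=re.IGNORECASE)
    return text

print(censor("politsei", ["lits"]))      # "lits" hides inside the word for "police"
print(censor("po.l.i.t.sei", ["lits"]))  # a few periods defeat the filter entirely
```

Word-boundary-aware matching (e.g. `\b` in the regex) would fix the "politsei" case, though the period trick shows why blocklists lose the arms race either way.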

5 days ago 11 votes
What Interviews Should I Look For?

Help point me in the right direction.

5 days ago 9 votes
Repairing an HP 5370A Time Interval Counter

Contents

- Introduction
- Inside the HP 5370A
- High Stability Reference Clock with an HP 10811-60111 OCXO
- RIFA Capacitors in the Corcom F2058 Power Entry Module?
- 15V Rail Issues
- Power Supply Architecture
- Fault Isolation - It’s the Reference Frequency Buffer PCB!
- The Reference Frequency Buffer Board
- Fixing the Internal Reference Clock
- Fixing the External Reference Clock
- Future work
- Footnotes

Introduction

I bought an HP 5370A time interval counter at the Silicon Valley Electronics Flea Market for a cheap $40. The 5370A is a pretty popular device among time nuts: it has a precision of 20ps for single-shot time interval measurements, amazing for a device that was released in 1978, and even compared to contemporary time interval counters that’s still decent performance. The 74LS chips in mine have a 1981 date code, which makes the unit a whopping 44 years old. But after I plugged it in and pressed the power button, smoke and a horrible smell came out after a few minutes. I had just purchased myself hours of entertainment!

Inside the HP 5370A

It’s trivial to open the 5370A:

- remove the 4 feet in the back by removing the Phillips screws inside them
- remove a screw to release the top or bottom cover

Once inside, you can see an extremely modular build: the center consists of a motherboard with 10 plug-in PCBs, 4 on the left for an embedded computer that’s based on an MC6800 CPU, 6 on the right for the time acquisition. The top has plug-in PCBs as well, with the power supply on the left and reference clock circuitry on the right. My unit uses the well known HP 10811-60111 high-stability OCXO as the 10MHz clock reference. The bottom doesn’t have plug-in PCBs.
It has PCBs for the trigger logic and the front panel. This kind of modular build probably adds significant cost, but it’s a dream for servicing and tracking down faults. To make things even easier, the vertical PCBs have a plastic ring or levers to pull them out of their slot! There are also plenty of generously sized test pins and some status LEDs.

High Stability Reference Clock with an HP 10811-60111 OCXO

Since the unit has the high stability option, I now have yet another piece of test equipment with an HP 10811-60111. OCXOs are supposed to be powered on at all times: environmental changes tend to stress them out and result in a deviation of their clock speed, which is why there’s a “24 hour warm-up” sticker on top of the case. It can indeed take a while for an OCXO to relax and settle back into its normal behavior, though 24 hours seems a bit excessive. The 5370A has a separate always-on power supply just for the oven of the OCXO to keep the crystal at a constant temperature even when the power switch on the front is not in the ON position. Luckily, the fan is powered off when the front switch is set to stand-by.1 In the image above, from top to bottom, you can see:

- the main power supply control PCB
- the HP 10811-60111 OCXO. To the right of it is the main power relay.
- the OCXO oven power supply
- the reference frequency buffer PCB

These are the items that will play the biggest role during the repair.

RIFA Capacitors in the Corcom F2058 Power Entry Module?

Spoiler: probably not… After plugging in the 5370A the first time, magic smoke came out of it along with a pretty disgusting chemical smell, one that I already knew from some work that I did on my HP 8656A RF signal generator. I unplugged the power, opened up the case, and looked for burnt components but couldn’t find any. After a while, I decided to power the unit back on and… nothing. No smoke, no additional foul smell, but also no display.
One common failure mode of test equipment from way back when is RIFA capacitors that sit right next to the mains power input, before any kind of power switch. Their primary function is to filter out high frequency noise coming from the device and reduce EMI. RIFAs have a well known tendency to crack over time and eventually catch fire. A couple of years ago, I replaced the RIFA capacitors of my HP 3457A, but the general advice is to inspect all old equipment for these gold-colored capacitors. However, no such discrete capacitors could be found. But that doesn’t mean they are not there: like a lot of older HP test equipment, the 5370A uses a Corcom F2058 line power module that has capacitors embedded inside. Below is the schematic of the Corcom F2058 (HP part number 0960-0443). The capacitors are marked in red. You can also see a fuse F1, a transformer and, on the right, a selector that can be used to configure the device for 100V, 115V/120V, 220V and 230V/240V operation. There was a bad smell lingering around the Corcom module, so I removed it to check it out. There are metal clips on the left and right side that you need to push in to get the module out. It takes a bit of wiggling, but it works out eventually. Once removed, however, the Corcom didn’t really have a strong smell at all. I couldn’t find any strong evidence online that these modules have RIFAs inside them, so for now, my conclusion is that they don’t have them and that there’s no need to replace them.

Module replacement

In the unlikely case that you want to replace the Corcom module, you can use this $20 AC Power Entry Module from Mouser. One reason why you might want to do this is that the new module has a built-in power switch. If you use an external 10 MHz clock reference instead of the 10811 OCXO, then there’s really no need to keep the 5370A connected to the mains all the time.
There are two caveats, however:

- While it has the same dimensions as the Corcom F2058, the power terminals are located at the very back, not in an indented space. This is not a problem for the 5370A, which still has enough room for both, but it doesn’t work for most other HP devices that don’t have an oversized case. You can see that in the picture below.
- Unlike the Corcom F2058, the replacement simply feeds through the line, neutral and ground that’s fed into it. You’d have to choose one configuration, 120V in my case, and wire a bunch of wires together to drive the transformer correctly. If you do this wrong, the input voltage to the power regulator will either be too low, and it won’t work, or too high, and you might blow up the power regulation transistors. It’s not super complicated, but you need to know what you’re doing.

15V Rail Issues

After powering the unit back up, it still didn’t work, but thanks to the 4 power rail status LEDs, it was immediately obvious that the +15V power rail had issues. A close-by PCB is the reference frequency buffer PCB. It has a “10 MHz present” status LED that didn’t light up either, suggesting an issue with the 10811 OCXO, but I soon figured out that this status LED relies on the presence of the 15V rail.

Power Supply Architecture

The 5370A was first released in 1978, decades before HP decided to stop including detailed schematics in their service manuals. Until Keysight, the Company Formerly Known as HP, decides to change its name again, you can download the operating and service manual here. If you need a higher quality scan, you can also purchase the manual for $10 from ArtekManuals2. The diagrams below were copied from the Keysight version. The power supply architecture is straightforward: the line transformer has 5 separate windings, 4 for the main power supply and 1 for the always-on OCXO power supply.
A relay is used to disconnect the 4 unregulated DC rails from the power regulators when the front power button is in the stand-by position, but the diode rectification bridge and the gigantic smoothing capacitors are located before the relay.3 For each of the 4 main power rails, a discrete linear voltage regulator is built around a power transistor, an LM307AN opamp, a smaller transistor for over-current protection, and a fuse. The 4 regulators share a 10V voltage reference. The opamps and the voltage reference are powered by a simple +16.2V power rail built out of a resistor and a Zener diode. The power regulators for the +5V and -5.2V rails have a current sense resistor of 0.07 Ohm. The sense resistors for the +15V and -15V rails have a value of 0.4 Ohm. When the voltage across these resistors exceeds the 0.7V base-emitter potential of the bipolar transistors across them, the transistors start to conduct and pull down the base-emitter voltage of the power transistor, thus shutting it off. In the red rectangle of the schematic above, the +15V power transistor is on the right, the current control transistor on the left, and current sense resistor R4 is right next to the +15V label. Using the values of 0.4 Ohm, 0.07 Ohm and 0.7V, we can estimate that the power regulators enter current control (and reduce the output voltage) when the current exceeds 0.7/0.07 = 10A for the +5/-5.2V rails and 0.7/0.4 = 1.75A for the +15/-15V rails. This more or less matches the values of the fuses, which are rated at 7A and 1.5A respectively. Power loss in these high-current linear regulators is significant, and the heat sinks in the back become pretty hot. Some people have installed an external fan to cool it down a bit.

Fault Isolation - It’s the Reference Frequency Buffer PCB!

I measured a voltage of 8V instead of 15V.
I would have preferred it if I had measured no voltage at all, because a lower than expected voltage suggests that the power regulator is in current control instead of voltage control mode. In other words: there’s a short somewhere which results in a current that exceeds what’s expected under normal working conditions. Such a short can be located anywhere. But this is where the modular design of the 5370A shines: you can unplug all the PCBs, check the 15V rail, and if it’s fine, add back PCBs until it’s dead again. And, indeed, with all the PCBs removed, the 15V rail worked fine. I first added the CPU related PCBs, then the time acquisition PCBs, and the 15V stayed healthy. But after plugging in the reference frequency buffer PCB, the 15V LED went off and I measured 8V again. Of all the PCBs, this one is the easiest to understand.

The Reference Frequency Buffer Board

The reference frequency buffer board has the following functionality:

- Convert the internally generated 10MHz frequency to emitter-coupled logic (ECL) signaling. The 5370A came either with the OCXO or with a lower performance crystal oscillator. These cheaper units were usually deployed in labs that already had an external reference clock network.
- Receive an external reference clock of 5 MHz or 10 MHz, multiply by 2 in the case of 5 MHz, and apply a 10 MHz filter. Convert to ECL as well.
- Select between the internal and external clock to create the final reference clock.
- Send the final reference clock out as ECL (time measurement logic), TTL (CPU) and a sine wave (reference-out connector on the back panel).

During PCB swapping, the front-panel display had remained off when all CPU boards were plugged in. Unlike later HP test equipment like the HP 5334A universal counter, the CPU clock of the 5370A is derived from the 10 MHz clock that comes out of this reference frequency buffer PCB4, so if this board is broken, nothing works.
When we zoom down from the block diagram to the schematic, we get this: Leaving aside the debug process for a moment, I thought the 5 MHz/10 MHz to 10 MHz circuit was intriguing. I assumed that it worked by creating some second harmonic and filtering out the base frequency, and that’s kind of how it works. There are 3 LC tanks with an inductance of 1 uH and a capacitance of 250pF, good for a natural resonance frequency of \(f = \frac{1}{2 \pi \sqrt{ L C }}\) = 10.066 MHz. The first 2 LC tanks are each part of a class C amplifier. The 3rd LC tank is an additional filter. The incoming 5 MHz or 10 MHz signal periodically inserts a bit of energy into the LC tank and nudges it to be in sync with it. This circuit deserves a blog post on its own.

Fixing the Internal Reference Clock

When you take a closer look at the schematic, there are 2 points that you can take advantage of:

- The only part on the path from the internal clock input to the various internal outputs that depends on the 15V rail is the ECL to TTL conversion circuit. And that part of the 15V rail is only connected to the 3k Ohm resistor R4.
- Immediately after the connector, 15V first goes through an L/C/R/C circuit.

In the process of debugging, I noticed the following: The arrow points to capacitor C17, which looks suspiciously black. I found the magic smoke generator. This was the plan of attack:

- Replace C17 with a new 10uF capacitor.
- Remove resistor R16 to decouple the internal 15V rail from the external one.
- Disconnect the top side of R4 from the internal 15V and wire it up straight to the connector 15V rail.

It’s an ugly bodge, but after these 3 fixes, I had a nice 10MHz ECL clock signal on the output clock test pin. The 5370A was alive and working fine!

Fixing the External Reference Clock

I usually connect my test equipment to my GT300 frequency standard, so I really wanted to fix that part of the board as well.
This took way longer than it should have… I started by replacing the burnt capacitor with a 10uF electrolytic capacitor and reinstalling R16. That didn’t go well: this time, the resistor went up in smoke. My theory is that, with the shorted capacitor C17 removed, there was still another short, and now the current path had to go through this resistor. Before burning up, this 10 Ohm resistor measured only 4 Ohms. I then removed the board and created a stand-alone setup to debug the board in isolation. With that burnt up R16 removed again, 15V applied to the internal 15V rail and a 10 MHz signal at the external input, the full circuit was working fine. I removed capacitor C16, checked it with an LCR tester, and the values were nicely in spec. Unable to find any real issues, I finally put in a new 10 Ohm resistor, put in a new 10uF capacitor for C16 as well, plugged in the board and… now the external clock input was working fine too?! So the board is fixed now and I can use both the internal and external clock, but I still don’t know why R16 burnt up after the first capacitor was replaced.

Future work

The HP 5370A is working very well now. Once I have another Digikey order going out, I want to replace the 2 electrolytic capacitors that I used for the repair with tantalum ones. I can’t find the link anymore, but on the time-nuts email list, 2 easy modifications were suggested:

- Drill a hole through the case right above the HP 10811-60111 to have access to the frequency adjust screw. An OCXO is supposed to be immune to external temperature variations, but when you’re measuring picoseconds, a difference in ambient temperature can still have a minor impact. With this hole, you can keep the case closed while calibrating the internal oscillator.
- Disconnect the “10 MHz present” status LED on the reference clock buffer PCB. Apparently, this circuit creates some frequency spurs that can introduce some additional jitter on the reference clock.
If you’re really hard core: replace the entire CPU system with a modern CPU board. More than 10 years ago, the HP5370 Processor Replacement Project reverse engineered the entire embedded software stack and created a PCB based on a Beagle board with new firmware. PCBs are not available anymore, but one could easily have a new one made for much cheaper than what it would have cost back then.

Footnotes

1. My HP 8656A RF signal generator has an OCXO as well. But the fan keeps running even when it’s in stand-by mode, and the default fan is very loud too! ↩
2. Don’t expect to be able to cut-and-paste text from the ArtekManuals scans, because they have some obnoxious rights management that prevents this. ↩
3. Each smoothing capacitor has a bleeding resistor in parallel to discharge the capacitors when the power cable is unplugged. But these resistors will leak power even when the unit is switched off. Energy Star regulations clearly weren’t a thing back in 1978. ↩
4. The CPU runs at 1.25 MHz, the 10 MHz divided by 8. ↩
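As a closing sanity check, the two back-of-the-envelope numbers from the repair (the regulator current limits from the sense resistors, and the LC tank resonance on the buffer board) can be reproduced in a few lines of Python. The component values are the ones quoted from the 5370A schematics above; this is just arithmetic, not part of the repair itself.

```python
import math

V_BE = 0.7          # base-emitter turn-on voltage of the sense transistor (V)
R_SENSE_5V = 0.07   # current sense resistor for the +5/-5.2V rails (Ohm)
R_SENSE_15V = 0.4   # current sense resistor for the +15/-15V rails (Ohm)

# The regulator enters current control once I * R_sense exceeds ~0.7V
i_limit_5v = V_BE / R_SENSE_5V     # ~10 A
i_limit_15v = V_BE / R_SENSE_15V   # ~1.75 A

# LC tank resonance on the reference frequency buffer board
L = 1e-6     # 1 uH
C = 250e-12  # 250 pF
f_res = 1 / (2 * math.pi * math.sqrt(L * C))  # ~10.066 MHz

print(f"{i_limit_5v:.2f} A, {i_limit_15v:.2f} A, {f_res / 1e6:.3f} MHz")
```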

6 days ago 28 votes
Practical Computing Interviewed Alan Sugar (1985)

A Quick Look Behind the Scenes at Amstrad.

a week ago 18 votes