We had a massive merge at work last year where two teams diverged for months on a shared codebase. It took three people a full week to resolve everything, and the worst part was stepping on each other's conflict resolutions. Splitting the merge into independent slices that people can work on in parallel would have saved us a lot of pain. The integration branch approach where non-conflicting changes land automatically is a nice touch too.
Yeah, that is a terrible idea. Before you do this, you'd first do a redesign and split things up into packages that can be updated and installed separately.
> We had a massive merge at work last year where two teams diverged for months on a shared codebase.
There is no tool in the world that can save you guys from yourselves. There is a reason why Agile methodologies put such a premium on continuous integration.
A lot of the "free labor for Amazon" framing in this thread misses the core dynamic here. Colin wasn't doing charity work, he was making FreeBSD run on EC2 because Tarsnap literally depends on it. That's probably the healthiest model for open source contribution: you fix the infrastructure your own product sits on, and everyone downstream benefits too. The alternative is waiting for Amazon to care about your niche platform, which could mean waiting forever. It's a different calculus than, say, an indie dev writing a library that AWS wraps into a managed service.
The difference between this and Munich's attempt is that France has been building up gradually. They already run Tchap (Matrix-based) for government messaging, and the gendarmerie switched to Linux years ago with over 70k desktops. Munich tried a big-bang migration without enough internal expertise and caved under political pressure when MS moved their HQ there. Schleswig-Holstein in Germany is taking the same incremental approach now and seeing better results. The pattern is pretty clear: governments that treat it as a multi-year capability build succeed, those that treat it as a licensing swap don't.
If anything, the lesson to learn from the LiMux failure has nothing to do with technology or with project planning and execution, but with politics. If you extort millions from a government as a for-profit business, most of which ends up as pure profit, an “emperor's new clothes” dynamic takes hold: government officials' interests become aligned with yours in pushing the narrative that the taxpayer money you received generated good value for the taxpayer. You also now have those millions as a war chest you can use as bargaining leverage. (In the case of LiMux, Microsoft relocated its corporate HQ to Munich as a quid pro quo for the city abandoning the project, which directly benefited the City of Munich because it now got to tax Microsoft in a way it previously couldn't.) These kinds of strategies game-theoretically dominate any play that's possible through open source.
Elm's type system and architecture are genuinely pleasant to work with, so seeing those ideas ported to a Go compilation target is interesting. You get the safety and expressiveness of Elm but end up with a Go binary you can deploy anywhere. I wonder how the error messages compare, since that was always one of Elm's strongest features.
Writing a C compiler in pure shell is one of those projects that sounds absurd until you think about bootstrapping. If you want to compile C on a system where you literally have nothing but a POSIX shell, this is exactly what you need. The fact that the parser itself is BNF-generated from shell modules makes it even more interesting as a study in how far you can push shell scripting before it breaks. Would love to see this evolve into a proper repo with tests so it can actually serve as a minimal bootstrapping tool.
It's not just a toy or a fun hobby project, there's potential for practical use as a step in bootstrapping an entire software stack from human-verifiable artifacts.
A shell is almost always used to set up the bootstrap environment, so the dependency on a shell is more or less always there anyway.
Beyond that, something special about POSIX shell is its large number of independent implementations, which makes it an ideal starting point for diverse double-compilation (https://arxiv.org/abs/1004.5534). The idea is to bootstrap the toolchain with multiple compilers (shells, in this case) and compare the results to verify that no single shell introduced a trusting-trust attack.
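The comparison step of diverse double-compilation can be sketched in a few lines of POSIX shell. This is a toy illustration, not the real bootstrap: `build.sh`, the output names, and the choice of `dash`/`bash` as the two independent shells are all assumptions, and the "build" just emits a fixed file where a real run would invoke the shell-based C compiler.

```shell
#!/bin/sh
set -eu

# Stand-in "build" that any POSIX shell can run; a real diverse
# double-compilation would run the shell-based compiler here.
cat > build.sh <<'EOF'
printf 'int main(void){return 0;}\n' > "$1"
EOF

# Run the identical build under two independent shell implementations
# (falling back to plain sh if one of them isn't installed).
( dash build.sh out-a ) 2>/dev/null || sh build.sh out-a
( bash build.sh out-b ) 2>/dev/null || sh build.sh out-b

# If either shell had injected a trusting-trust payload, the
# artifacts would diverge bit-for-bit.
if cmp -s out-a out-b; then
    echo "artifacts identical: no shell-introduced divergence"
else
    echo "artifacts differ: investigate before trusting the toolchain"
fi
```

The point of the exercise is that an attacker would have to compromise every independent shell implementation in the same way to stay hidden, which is what makes a language with many unrelated implementations a good root of trust.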
The shell.c ouroboros is really cool. Being able to bootstrap trust through an entirely different language family (shell → C → shell) is a genuine contribution to the trusting-trust problem, not just a technical novelty.
The fzf integration is a really nice touch here. Half the battle with dev tool management isn't installing things, it's remembering what you installed and how six months later. I know everyone's going to recommend Nix (and they already have), but there's something to be said for a solution where the entire logic fits in your head on first read. I've had a similar Makefile-based setup for years and the biggest win is onboarding new team members who can just read the targets and immediately know what's available.
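The "read the targets and immediately know what's available" idea can be sketched as a tiny shell helper. Everything here is hypothetical: `list_targets`, `Makefile.demo`, and its contents are made up for illustration, and fzf is assumed to be installed for the interactive step shown in the final comment.

```shell
#!/bin/sh
# Extract target names: lines that begin with an identifier followed
# by ':' (recipe lines start with a tab, so they don't match).
list_targets() {
    grep -E '^[A-Za-z0-9_.-]+:' "$1" | cut -d: -f1 | sort -u
}

# A made-up Makefile to demonstrate against.
cat > Makefile.demo <<'EOF'
install-go:
	./scripts/go.sh
install-node:
	./scripts/node.sh
clean:
	rm -rf tools/
EOF

list_targets Makefile.demo

# Interactive use, with fzf doing the fuzzy selection:
#   make "$(list_targets Makefile | fzf)"
```

The appeal over heavier tooling is exactly what the comment describes: the whole mechanism is one grep pipeline, so a new team member can understand it on first read.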
100x is a bold claim, but the Zig approach to optimizing hot paths in Bun makes a lot of sense. There is so much low-hanging fruit when you actually dig into how package managers interact with git under the hood. Nice writeup, the before/after benchmarks are convincing.
But then there's this: "When evaluating the complete bun install improvements, it came out speed-wise to about the same as the existing git usage (due to networking being the big bottleneck time-wise despite more cases being slightly faster with ziggit over multiple benchmarks). Except, it's done in 100% zig and those internal improvements pile up as projects consist of more git dependencies. All in all, it seems like a sensible upstream contribution."
Sooo, after burning 10k+ tokens' worth of effort, we find out that it's sensible to use it because the language (Zig) feels good, as opposed to git itself, which now has 20+ years of human eval behind it. That seems. Well. Yeah...
The original target was Bun, since Bun itself is written in Zig, not because of anything specific to the language.
When it became clear that there were benefits to filling in more of git's capabilities (i.e. targeting WASM), I went and filled in more git features.
It's not by any means a universal win across the board, but it does have notable wins, like git operations being 4-10x faster on ARM-based MacBooks than git itself.