Good to know it works for some people! I think it's another issue where they focus too much on MacOS and neglect Windows and Linux releases. I use WSL for Claude Code since the Windows release is far worse and currently unusable due to several neglected issues.
Hoping to see several missing features land in the Linux release soon.
I'm also feeling weak and the pull of getting a Mac is stronger. But I also really don't like the neglect around being cross-platform. It's "cross-platform" except a bunch of crap doesn't work outside MacOS. This applies to Claude Code, Claude Desktop (MacOS and Windows only - no Linux or WSL support), Claude Cowork (MacOS only). OpenAI does the same crap - the new Codex desktop app is MacOS only. And now I'm ranting.
I'm on v2.1.37 and I have it set to auto-update, which it does. I also tend to run `claude update` when I see a new release thread on Twitter, and usually it has already updated itself.
Claude Code CLI 2.1.39, released a few hours ago, fixes the problem. They didn't note it in the changelog, though. Seems like a significant bug fix. ¯\_(ツ)_/¯
I think the closest that this has come is in the form of GitLab, which pretty famously did a ton of the corporate work in the format of a very open Handbook (https://handbook.gitlab.com/)
In the early years, it was extremely, extremely open and comprehensive. I've definitely looked through it when I wasn't sure how to handle something at work.
That's pretty cool. I wonder if it's still deployed and updated religiously. If they wanted to deploy an 'Agent' worker, that source is a goldmine for context.
I have a hard time believing that building their entire business around something like this is the right move for most organizations that aren't already bought into an OpenAI enterprise plan. It ties you to one model provider that has been having trouble keeping up with the other big labs, and it offers tools that look superficially useful but with unclear rigor. If I were an AI-native company just starting out, I wouldn't want to build my business on this unless they make it much more legible and transparent.
I see Pangolin has a Self-Host Community Edition; doesn't that already address digital sovereignty for EU users? I am considering both for a migration from Tailscale. Any suggestions on their differences?
For a Tailscale migration, NetBird is the direct swap. Pangolin won't give you device-to-device connectivity.
On EU sovereignty: NetBird is Germany-based and explicitly positions itself as a European alternative. Self-hosted gives full control with no callbacks to their servers. Pangolin is US/YC-backed, so while self-hosting gives you control of the data plane, the project itself is American.
Also, NetBird has a reverse proxy feature coming this quarter, which would cover the Pangolin use case within the same platform.
I think it's actually already there. It's definitely possible to make these sorts of explainers with something like Claude Code; you just have to spend a fair amount of time making sure it's actually doing what you expect it to do. The biggest danger with something like Claude Code is that you get something that looks functionally correct but whose details are subtly wrong. I wrote a bit about this on my blog for some of the visualizations I've done, and I think it's remarkably easy to iterate on them now.
> Someone like Bartosz Ciechanowski is still operating at a level I can't touch. His work reflects years of accumulated craft and a visual intuition that I don't have.
Seems like you're contradicting yourself a bit here: is it actually possible to make the sort of explainers from TFA, or is the author operating at a level you can't touch?
This is the equivalent of setting up a developer environment for charging a car. Once you have a car that's working, and you know how to connect to the app and charge it, almost all these problems go away. If you're in a place that has a lot of public chargers near your destination that you're already going to, then it's even easier, and it just becomes trivial.
That being said, I don't think I would want to rent a car that didn't have a place to charge it or a very easy-to-use fast charger nearby.
Until NACS and Plug and Charge are ubiquitous, going on a trip in anything other than a Tesla is a gamble: you need the right app on your phone, and you have to hope you can actually reach working chargers.
I think we are still a couple of years away from other manufacturers catching up to Tesla and making road trips practical for most people.
username at gmail if you want to chat