Hacker Newsnew | past | comments | ask | show | jobs | submit | TheFlyingFish's commentslogin

The linked article isn't describing a form of input sanitization, it's a complete separation between trusted and untrusted contexts. The trusted model has no access to untrusted input, and the untrusted model has no access to tools.

Simon Willison has a good explainer on CaMeL: https://simonwillison.net/2025/Apr/11/camel/


That’s still only as good as the ability of the trusted model to delineate instructions from data. The untrusted model will inevitably be compromised so as to pass bad data to the trusted model.

I have significant doubt that a P-LLM (as in the camel paper) operating a programming-language-like instruction set with “really good checks” is sufficient to avoid this issue. If it were, the P-LLM could be replaced with a deterministic tool call.


He's living the hacker dream. Made a billion bucks, then went right back to writing code. People upvote because they wish they were him.


The HN zeitgeist has something of a love/hate relationship with the web, I've noticed. HN in general seems to skew a little older than a lot of online communities, so a lot of HN users were adults back in the early days of the web/Usenet/etc. There's a tendency to view those days with nostalgia, leading a lot of people to feel like the "good old days" of the web were "ruined" by the modern shift into more interactivity, fancier/prettier design, etc. And "web developers" are the ones proximately responsible for the shift, so they get the hate too.

I laugh every time I see someone on HN asserting that the web "shouldn't" be used for anything beyond "documents and lightly interactive content", which is not uncomment. There's some real old-man-yelling-at-clouds energy there.


It basically boils down to: (a) 90s web developers tended not to have computer science backgrounds and weren't aware of fundamentals -> (b) when js frameworks exploded in popularity and diversity in the 00s, there was much wheel reinventing, because those developers (and to a lesser degree framework inventors) were often ignorant of wheels -> (c) there are persistent, fundamental mistakes* in the web ecosystem that could have been fixed at the start if anyone with experience had been asked.

All of those people are now the vibe coders of the 20s, and it's going to end up in the same dumpster fire of 'Who knew it might be a good idea to cryptographically sign and control library packages in a public repository?'

* Note: I'm distinguishing things going sideways despite best intentions and careful planning from YOLO + 'Oops, how could that possibly have happened?' shit


If you're crazy then I am too. 50% odds it was written by a human, 50% bot.


I imagine offloading a lot of the heavy lifting to Vite helps cut down on the code size.


This. Whole thing struck me as basically an advertisement for Vite. 99% of the base functionality is probably already there, written by humans.

"Use our proprietary SaaS and you too can approximate Next.js in 1/100 as much code using a bit of chicken wire and an LLM".

Whole thing sounded too good to be true, and it was.


I once managed to trigger what I think was a race condition in a microwave's beep routine. It was one of the type that does a single long beep rather than individual beeps, and like most it would cut the beep short when you opened the door. But one time, one single time, I managed to open the door PRECISELY as the timer finished, and the beep just didn't stop. I finally closed and opened the door after maybe 30 seconds, and that stopped it.

I was never able to trigger it again, so I have no idea whether it was a race condition or some other random one-in-a-million happenstance, but it makes a fun theory at least.


I've never used a Lisp either, but I get the impression that "forcing you to write the AST" is sort of the secret sauce. That is, if your source code is basically an AST to begin with, then transforming that AST programmatically (i.e. macros) is much more ergonomic. So you do, which means that Lisp ends up operating at a higher level of abstraction than most languages because you can basically create DSL on the fly for whatever you're doing.

That's my impression, at least. Like I said, I've never actually used a Lisp. Maybe I'm put off by the smug superiority of so many Lisp people who presume that using Lisp makes them better at programming, smarter, and probably morally superior to me.


Technically native selects do have a very rudimentary form of filtering: start typing text with the select focused and it will auto-select the first matching option.

E.g. if the select is a list of US states, type "N" and it will jump to Nebraska. Continue into "New" and you'll get New Hampshire, etc.

This is better than nothing (and I personally use it all the time) but not a patch on an actual proper select-with-filtering which, yes, you still need JS to implement properly.


That works if you're dealing with a known set of keys (i.e. what most statically-typed languages would call a struct). It falls down if you need something where the keys are unknowable until runtime, like a lookup table.

I do like dataclasses, though. I find them sneaking into my code more and more as time goes on. Having a declared set of properties is really useful, and it doesn't hurt either that they're syntactically nicer to use.


I haven't used Deno, but I do use Bun purely as a replacement for npm. It does the hard-linking thing that seems to be increasingly common for package managers these days (i.e. it populates your local node_modules with a bunch of hard links to its systemwide cache), which makes it vastly quicker and more disk-efficient than npm for most usage.

Even with a cold cache, `bun install` with a large-ish dependency graph is significantly faster than `npm install` in my experience.

I don't know if Deno does that, but some googling for "deno install performance vs npm install" doesn't turn up much, so I suspect not?

As a runtime, though, I have no opinion. I did test it against Node, but for my use case (build tooling for web projects) it didn't make a noticeable difference, so I decided to stick with Node.


Deno does all that. Hell, yarn does too, or pnpm as the sibling mentioned.


Sure, but pnpm is very slow compared to bun.


Deno does that. It also refrains from keeping a local node_modules at all until/unless you explicitly ask it to for whatever compatibility reason. There are plugins to things like esbuild to use the Deno resolver and not need a node_modules at all (if you aren't also using the Deno-provided bundler for whatever reason such as it disappeared for a couple versions and is still marked "experimental").


pnpm does all that on top of node. Also disables postinstall scripts by default, making the recent security incidents we've seen a non-issue.


As the victim of the larger pre-Shai-Hulud attack, unfortunately the install script validation wouldn't have protected you. Also, if you already have an infected package on the whitelist, a new infection in the install script will still affect you.


I’m not sure why but bun still feels snappier.



Aside from speed, what would the major selling points be on migrating from pnpm to bun?


Are there any popular packages that require postinstall scripts that this hurts?


A whitelist in package.json is only a partial assist


IIRC bun zig code base has a lot of fine optimization too. I think the lead did a conference explaining his work. Or maybe i'm confused.



oh thanks yes, i couldn't find it, i was already lost thinking it was a conference by andrew kelley .. thanks a lot


I decided to stick with Node in general. I don't see any compelling reason to change it.

Faster install and less disk space due to hardlink? Not really all that important to me. Npm comes with a cache too, and I have the disk space. I don't need it to be faster.

With the old-school setup I can easily manually edit something in node_modules to quickly test a change.

No more node_modules? It was a cool idea when yarn 2 initially implemented it, but at the end of the day I prefer things to just work rather than debug what is and isn't broken by the new resolver. At the time my DevOps team also wasn't too excited about me proposing to put the dependencies into git for the zero-install.


Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: