"We should look at how people are using LLMs right now instead of chasing promises of superintelligence."
This. When more computing power and memory became available to software engineers, we saw how that impacted software development. Sure, there were happy stories of a new class of problems being attacked that couldn't be before due to resource limits, but a lot of software simply stopped being frugal and did pretty much what it did before while somehow consuming much more. Extrapolating to the staggering resources being poured into AI solutions, I'd be surprised if most of them aren't simply consumed generating higher-resolution (and longer) videos.
rsync does what it was designed to do, and the lack of scope creep is not a bad thing. There is "fpsync", another tool built on top of rsync (mentioned in one of the comments on the article's page), which covers the parallel-processing use case: https://manpages.debian.org/bullseye/fpart/fpsync.1.en.html
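For the curious, here is a minimal Rust sketch of the core idea fpsync implements, not fpsync itself: partition the work, then run one plain rsync per partition concurrently. The paths and partition names are made up for illustration.

    use std::process::Command;
    use std::thread;

    fn main() {
        // Hypothetical partition names standing in for the
        // file lists that fpart would produce.
        let parts = ["part-0", "part-1", "part-2", "part-3"];

        let handles: Vec<_> = parts
            .iter()
            .map(|p| {
                let src = format!("/src/{p}/");
                let dst = format!("/dst/{p}/");
                // Each job is one ordinary rsync invocation; the
                // parallelism comes purely from running several at once.
                thread::spawn(move || {
                    Command::new("rsync")
                        .arg("-a")
                        .arg(&src)
                        .arg(&dst)
                        .status()
                        .expect("failed to start rsync")
                })
            })
            .collect();

        for h in handles {
            assert!(h.join().unwrap().success());
        }
    }

The real tool additionally builds balanced partitions (via fpart) and schedules the jobs, which is exactly the scope rsync itself rightly stays out of.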
"No stuffing about with swapping physical hardware just because I've temporarily relocated myself."
That's exactly the use case for which carriers offer roaming plans. The bonus is that you (as in your phone number) remain connected and reachable by your contacts, since no other phone number is involved at any point. One shouldn't need to change the SIM unless one is changing phones.
Managing the light source, specifically the 13.5 nm wavelength light generated from superheated tin plasma, is in fact the most challenging part of the machine. Here "managing" includes hitting a correctly sized tin droplet with lasers at the right angles, all the rest of the complicated fluid dynamics needed to get the most out of that precious lighting moment, and, of course, the proper handling of the spark event's after-effects. As opposed to the rest of the machine (like directing the EUV light to the reticle through those mirrors you mention), the light-generation part is dynamic, very easy to get wrong, and very costly to iterate on.
I was thinking about this too. When you read a program, there is an information payload, the metaphorical ball you have to keep your eye on, and you can more or less forget about everything else as soon as it stops being relevant. In the functional paradigm it's like watching a whole bunch of such balls being juggled at once instead (plus the expectation that you admire it), and that's just wasteful of the reader's attention.
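A contrived toy example of what I mean (mine, not from the article): in the imperative version one value is in focus at a time, while in the chained version the whole pipeline stays in the air until the end.

    // Imperative: one ball in the air at a time; each intermediate
    // value can be forgotten as soon as it has been used.
    fn total_imperative(orders: &[(u32, f64)]) -> f64 {
        let mut total = 0.0;
        for &(qty, price) in orders {
            if qty > 0 {
                total += qty as f64 * price;
            }
        }
        total
    }

    // Functional: filter, map and sum all stay "in flight" until the
    // end of the chain, and the reader tracks the pipeline as a whole.
    fn total_functional(orders: &[(u32, f64)]) -> f64 {
        orders
            .iter()
            .filter(|&&(qty, _)| qty > 0)
            .map(|&(qty, price)| qty as f64 * price)
            .sum()
    }

    fn main() {
        let orders = [(2, 9.99), (0, 5.0), (1, 3.5)];
        assert_eq!(total_imperative(&orders), total_functional(&orders));
    }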
Most likely it hasn't happened (yet) because kernel-related work is still being actively developed¹ and (more importantly) because of a shortage of developers willing and able to tackle that kind of challenge.
"putting yourself and your hard work in legal risk"
Like what? I'm genuinely curious what personal risks anyone faces from contributing to ReactOS. I'm also curious what kind of legal risk could threaten the work itself. I mean, even in the unlikely scenario that something is proven illegal and ordered removed from the project, what would prevent that particular expunged part from being re-implemented by some paid contractor (now under legally indisputable circumstances), rendering the initial legal action moot?
"A good example I think has adjusted the behavior of the ecosystem is Rust: it makes certain things much easier than before and slowly the complex bug-mired world of software is improving just a little bit because of it."
From a software design perspective, the only functionality that should go into a compiler is code compilation. Taken to the extreme (as in the Unix philosophy), if the code compiles, the compiler should just build you the binary, or fail silently otherwise. Checking the code and reporting on various aspects of its quality is supposed to be a static analyzer's job. (In reality, pretty much all the compilers we have couple compilation with some amount of lighter code checking beforehand, leaving static analyzers with only the heavier, more exhaustive checks.) What Rust does is demand that its compiler perform even more of what a static analyzer is supposed to do. It's a mishmash of two things (which manage to stay separate for other programming languages, because that makes sense) masquerading as a revolution.
So, even for code in much-blamed languages like C and C++, the "complex bug-mired world of software is improving just a little bit" whenever the expensive static-analyzer class of checks isn't skipped; Rust just happens to make them impossible to skip.
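For a concrete toy example of the kind of check in question, take the classic reallocation-invalidates-pointers bug. The C++ equivalent compiles fine and is left for a static analyzer (or a crash) to catch; in Rust the check is baked into compilation:

    fn main() {
        let mut v = vec![1, 2, 3];
        let first = &v[0]; // shared borrow into v's buffer
        // v.push(4);      // uncommenting this is a compile error: push may
        //                 // reallocate while `first` still points into the
        //                 // old buffer. In C++ the same pattern compiles
        //                 // and is a classic pointer-invalidation bug.
        println!("first = {first}");
    }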
The transition from kilobytes to megabytes is not comparable to the transition from megabytes to gigabytes at all. Back in the kilobyte days, when engineers (still) had to manage bits and resort to all kinds of tricks just to get something working, a lot of software (and software engineering) aspects left much to be desired. Far too much effort went not into putting the business logic together but into overcoming the shortcomings of limited memory (and other computing) resources. Legitimate software requirements had to be butchered like Procrustes' victims so that the software could exist at all. The megabyte era accommodated everything but high-end media software without forcing compromises on its internal build-up. It was the time when things could finally be done properly, no excuses.
Today's disregard for computing-resource consumption is simply the result of those resources getting too cheap to be properly valued, plus a trend of taking their continued increase for granted. There's little to nothing in today's software functionality that couldn't do without gigabyte levels of memory consumption.