I actually started Illinois as a Computer Engineering major and switched to Computer Science because I thought I'd get to use all the cool supercomputers at the Beckman Institute. Those electrical courses were all part of my CS requirements. Illinois CS was big on architecture, having designed Illiac and all of that stuff. Hennessy/Patterson for life.
The supercomputer thing... never happened. And I turned out to have a CE career anyway.
"Was," because he said it a long time ago. He has said a lot of things that sounded kind of far-fetched and paranoid at the time, but which were later demonstrated to be true, so "Stallman was right" is a reappraisal of those past statements.
Lots of people are saying some of those things now, but Stallman was saying them decades ago. It's not differentiating or meaningful to say "A is right" if B-E are saying the same thing at the same time.
It's easy to be right when you live outside the boundaries of reality.
E.g. he won't (didn't?) own a mobile phone, but is okay with borrowing someone else's. He won't use Wi-Fi where he has to log in but would happily borrow someone else's.
It's not being right; it's shifting responsibility in exchange for his own personal convenience.
It's not setting an example if you shift responsibility to someone else.
Setting an example would be just doing without the things he doesn't agree with. Need to make a call but only other people's cell phones are available? Well, you don't make the call. Need wifi but no open networks are available? Well, you don't get wifi. Is this even more inconvenient than the already-inconvenient use of other people's cell phones or wifi logins? Absolutely. But it's actually sticking to your principles.
Live like the Amish in 2026 (though I assume they have phones now).
It's not setting an example. We have a word for it and it's called being a mooch.
The attitude is consistent with that famous video where RMS explains that he has "never installed GNU/Linux" because he could just ask someone else to do it for him, and suggests others should do the same.
For that matter, why own a car if we can borrow someone else's? Especially with license plate readers and traffic cameras everywhere, who wants to be tracked? Let your friend be tracked instead. That is the level of logic here.
First people call him "outside the boundaries of reality," then demand he live like the Amish... Look who is removed from reality now. How would he even work toward the FSF's goals if he followed silly advice like that? Apparently, whatever he does, he can't do right by every naysayer's standards. What many people miss is that even Stallman admits you don't have to go cold turkey, free/libre only; it is already a step in the right direction to do what you can, accepting some inconvenience in exchange. Many people would rather bury their heads in the sand than accept any inconvenience at all.
But if everyone acted like Stallman, then solutions that have gone away, such as public payphones, would come back because there would be demand for them again.
He doesn't give a crap if a random phone record of his appears in a random haystack, and that's kind of the point, isn't it? It's the aggregated, crawlable stores that are the threat.
There may be other issues with Stallman, but that behavior doesn't strike me as particularly inconsistent.
For someone who claims to take a principled stance on these sorts of things, it feels very unprincipled to leverage the risk that others take in e.g. carrying a cell phone.
Consider that there are two components here: one is that Stallman is uncomfortable with the risk of carrying a tracking device (aka cell phone) around with him. The other is that he wants to make it known that people shouldn't carry cell phones because of that tracking; part of his platform is advocating for and against things like this.
If he were merely worried about the risk, and was just out to protect himself, then using someone else's cell phone (which would be at hand regardless of whether or not he used it) would be a perfectly reasonable, pragmatic thing to do. Transferring the risk, as you say.
But using someone else's cell phone is a violation of the principle. How can I take his advocacy seriously if he freely admits that we need cell phones out in the world, otherwise it's even too inconvenient for him to go about his business?
He does leverage the risk that others take, but those others are also the people who collectively build society so as to require taking that risk. It's kind of tit-for-tat in a way.
>How can I take his advocacy seriously
You could just listen to what he has to say and consider whether or not it's true. His personal behaviour at the end of the day has little bearing on that. "He doesn't even do XYZ therefore I won't believe him" feels more like a rationalization one comes up with because one doesn't want to believe him in the first place.
It depends. Many modern microcontrollers are perfectly fine driving LEDs directly off IO pins if the pin specs say the pin is rated for sufficient current (like 20mA). However, older ones like the ESP8266 can only do around 2mA, and the 8051 even less. Or you run into a total power budget issue if you are driving too many pins at once. Also, some IO pins are perfectly fine at sinking current to ground but aren't suited for sourcing current, in which case the LED would be connected to an external supply voltage and the IO pin would simply switch its cathode to ground or not.
> When C code is run in machines capable of failing with gruesome death, its unsafeness may indeed result in gruesome death.
And yet, it never does. It's been powering those types of machines likely longer than you have been alive, and the one exception I can think of where lives were lost, the experts found that the development process was at fault, not the language.
If it was as bad as you make out, we'd have many many many occurrences of this starting in the 80s. We don't.
How was this flamebait? It is an example of how bad programming choices/assumptions/guardrails cost lives, a counterargument to the statement of 'And yet, it never does'. Splitting hairs over whether the language is C or assembly misses the spirit of the argument, as both languages share the linguistic footguns that made this horrible situation happen (but hey, it _was_ the 80s and the choice of languages was limited!). Though, even allowing the "well ackuacally" cop-out argument, it is trivial to find examples of C code causing failures due to out-of-bounds memory usage; these bugs are found constantly (and reported here, on HN!). Now, you would need to argue, "well _none_ of those programs are used in life-saving tech" or "well _none_ of those failures would, could, or did cause injury", to which I call shenanigans. The link drop was meant to do just that.
We need to agree to disagree on this one; the claim that C is fine and does not cause harm due to its multitude of foot-guns, I think, is an egregious and false claim. So don't make false claims and don't post toxic positivity, I guess?
I don't think the prevalence of these articles this time of year is because the authors go on holiday, but instead is because the new year is the perfect time to ponder: "Will this be the year of the Linux desktop?"