Are you sure about that? Can you make the opposite case and steel-man it? Until you can, a stupid LLM will be smarter than you are.
I can write a PR now. I can code now. Probably not as well as you can, but before LLMs I couldn't. I tried for decades to learn to code. My brain is a top-down network: I can see the big picture very quickly, but I cannot maintain the focus to build bottom-up. Now I don't have to. I use LLMs to set the goal, to examine all the corner cases, to define the milestones, to predict the wrong turns, to write a human-readable spec, to break it down into units of test code, to write the blueprints of units of code. I can test them, and debug them, with LLMs. The end result can be sub-optimal, but it runs, it does what I want, it is well documented, and it is maintainable. Before LLMs I couldn't do any of that. And in doing all this, I get better at the bottom-up thing, just by trying.
We are a spectrum of people. Do not assume the world is like you.
Everything is collapsing toward a low-trust default. At the end of this trajectory, we rediscover that the analog world becomes valuable precisely because it can't be infinitely replicated.
Authenticity becomes the foundational currency.
But everyone must master AI tools to stay relevant. The brilliant engineer who refuses AI-generated PRs on principle will get replaced. Every 18-24 months, as capabilities double, the required skills shift. Specialization diminishes. Learning velocity becomes the only durable advantage. These people cannot learn new tricks.
Those who cannot question their assumptions cannot self-correct and will be replaced. The future belongs to the humble, the fluid, and the resilient. 60% of HN users are headed for a very tough time, and I am being very charitable with that assumption.
Large Language Models cannot think, but they can self-correct. You can think, but you clearly have some serious problems self-correcting your wrong assumptions. Who is smarter, then?
Yes, I know that quote is used to make fun of Internet nay-sayers, but I think younger people underestimate just how important fax machines were even as late as the early 2000s (and reportedly, they are still popular in Japan, where handwriting a note and faxing it is still easier than typing the characters).
Basically what you’re saying is “another person made a wrong prediction in the past, so your prediction about something entirely different is wrong too”.
People used that same quote when talking about NFTs, and VR, and 3D TVs, and literally every grift and failed “big idea” ever since.
That quote is not an argument, and using it is a tired technique. The only thing it signals is a blind belief in whatever the quoter thinks is good at the time.
How about Christmas lights? Washing machines? TV and video entertainment? Elevators for the first floor? How about for the second floor? Social media posts, like these?
The demand for technology leads to advancements that meet our needs. As we continue to innovate, we must focus on consuming more energy rather than less.
You are eager to decide what is useful and what is not. Can you predict the future? Can you predict the full impact of technologies? Can you see second-, third-, and fourth-order effects? Likely not. For instance, many may not have anticipated the significant role smartphones play today.
It concerns me when some individuals attempt to control others' resource usage, potentially leading to authoritarian rule driven by fear. Such actions might cause real harm before any noticeable climate change occurs in the near future.
The counterweight elevator is -- by far -- the most energy-efficient way for people to live and be supplied. As in, if you need to supply a few thousand people with food and services, then cramming them into high rises and surrounding those with facilities within walking distance will consume the least energy. Even more efficient than moving around on bikes or trams. I do not have at hand the book where this was calculated in painstaking detail, but I am not sure a human consumes less energy walking up a floor than the elevator does, thanks to said counterweight.
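The counterweight comparison can be sketched with back-of-envelope arithmetic. All the numbers below (passenger mass, storey height, counterweight sizing, efficiencies) are illustrative assumptions, not figures from the book mentioned above:

```python
# Rough energy comparison: counterweighted elevator vs. walking up one floor.
# Every constant here is an assumption for illustration only.

G = 9.81  # gravitational acceleration, m/s^2

def lift_energy_joules(mass_kg: float, height_m: float, efficiency: float) -> float:
    """Input energy needed to raise mass_kg by height_m at the given efficiency."""
    return mass_kg * G * height_m / efficiency

floor_height = 3.0   # m, assumed storey height
passenger = 75.0     # kg, assumed passenger mass

# Counterweights are typically sized to the car mass plus roughly half the
# rated load, so with one passenger the motor only lifts the net imbalance.
net_imbalance = 40.0    # kg, assumed net mass the motor actually moves
motor_efficiency = 0.6  # assumed overall electrical-to-mechanical efficiency

# Human muscle turns food energy into climbing work at roughly 20-25% efficiency.
muscle_efficiency = 0.25

elevator_j = lift_energy_joules(net_imbalance, floor_height, motor_efficiency)
walking_j = lift_energy_joules(passenger, floor_height, muscle_efficiency)

print(f"elevator, per passenger per floor: ~{elevator_j:.0f} J of electricity")
print(f"walking, per person per floor:     ~{walking_j:.0f} J of food energy")
```

Under these assumptions the elevator comes out ahead (roughly 2 kJ of electricity versus roughly 9 kJ of food energy), which is the point being made: the counterweight means the motor never lifts the full mass, while a walker always lifts their own.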
Also, if you need to convince people to live in such circumstances then a little convenience goes rather far so that also needs to be considered.
Modern washing machines are certainly more water-efficient than hand washing, and I wouldn't be surprised, again, if they were more energy-efficient too. Once again: humans consume energy too. Edit: and as someone else noted, we should look at the societal effect. Well, it's quite clear the washing machine is an extremely big plus, as it automates a time-consuming, hard physical task.
So far every order effect of LLMs is terrible, as they are built on the backs of exploited workers and are used to further disenfranchise workers and artists alike.
> As we continue to innovate, we must focus on consuming more energy rather than less.
> The demand for technology leads to advancements that meet our needs
The things you listed are "wants." Perhaps we could say that washing machines have turned into a need, in much the same way crude oil has. What would the world have been had we tamed nuclear power before oil was commercialized?
> You are eager to decide what is useful and what is not
I think GP is eager to decide what is net beneficial, which is a tradeoff between usefulness and cost (monetary, social, environmental.)
I don't personally care that much about Earth. It's a rock in space, and it will continue existing with or without a functioning ecosystem, but I try to be conservative with my actions, so that the people who do care may continue enjoying it.
> The demand for technology leads to advancements that meet our needs.
Or does the desire for technology lead to advancements that meet our wants? Wants versus needs, and desires versus demands, get confusing sometimes.
> It concerns me when some individuals attempt to control others' resource usage,
From a psychological point of view that's understandable. And it does portend an ugly authoritarianism. From a realist standpoint, though, it's inevitable if resources are limited. Right now we seem to be limited by the heat capacity of the planetary ecosphere. To avoid that becoming an open conflict, I think we need to enrich the debate to talk about appetites rather than needs.
> The demand for technology leads to advancements that meet our needs.
A huge majority of tech advancements are driven by supply rather than demand. Capitalism and modern economics push companies to build whatever they can market and sell; they aren't designing new products and tech because consumers already asked for them.