It's simply true that the average person you talk to is going to be ...average. Meanwhile, you could listen to John Carmack on a five-hour podcast, and that warps your perception of what the people around you can offer you.
I think younger people have maybe thrown the baby out with the bathwater, and you need some discernment about whose advice you can value and trust. But I've just been in many situations in my life where I've asked for advice and it's just been total shit.
"Wisdom of the elders" is overrated when society changes so rapidly, and not all the adults you know are the insightful village shaman.
I recall asking my grandfather what it was like to live through the JFK assassination and just receiving something to the effect of "oh yeah, that was crazy and bad, I remember seeing it on the news." Follow-up questions produced no further insight. So you come to the conclusion: why bother with that when you can just read a book about the topic?
Idk, I think you're underestimating the ubiquity of, and resources behind, things like A/B and usability testing nowadays. It's certainly a much more sophisticated way of determining whether people are able to find what they need.
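For what it's worth, the statistics behind a basic A/B test are simple enough to sketch. Below is a minimal two-proportion z-test in plain Python (standard library only); the conversion counts are made-up example numbers, not real data.

```python
# Minimal sketch of checking an A/B test result for significance
# using a two-proportion z-test. All numbers here are hypothetical.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for the difference
    in conversion rates between variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Example: variant B converts 120/1000 visitors vs A's 100/1000.
z, p = two_proportion_z(100, 1000, 120, 1000)
```

With these example counts the difference is suggestive but not significant at the conventional 0.05 level, which is exactly the kind of call a testing pipeline makes for you.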
People have been making some version of this comment for the past three years, and the only thing that has changed is that you keep adding capabilities.
Two years ago people were saying it was purely autocomplete and enhanced Google.
AI bears just continue to eat shit year after year and keep pretending they didn't say that AI would never be capable of what it's currently capable of.
I'll bite. The uses for LLMs I've described are about what I've been using them for since the ChatGPT 3 era. They've absolutely gotten better since then, but I still find them to be very poor replacements for humans, especially in regards to architectural direction. They're very useful assistants, though.
It's actually insane that this sort of thing is tolerated. It's a culture thing and frankly just rude. My org is pretty AI-pilled and this type of behavior would just not fly. I need to be assured I'm talking to a human who is using their brain.
If I paste something from an AI into chat, I always identify it as such by saying something like "my Claude instance says this:". I also don't blindly copy-paste from it; I always read it first and usually edit it for brevity or tone. I feel like this should be the absolute minimum for sending AI content to a person.
Even that is pretty useless because we have no idea what context "your Claude instance" has. All you're doing is dressing up some bullshit to look authoritative.
When I started my PhD I was already really good at typesetting with LaTeX. I started to bring in fully typeset works in progress for my supervisor to read through. These proofs often had fatal flaws. He asked me to stop typesetting until after the work had been verified because it looked too convincingly correct due to being typeset.
That was about 15 years ago but I've never forgotten it. Drafts should look like drafts. Scrappy work and proofs of concept should look as such. Stop fucking with people by making your bullshit, scrappy ideas look legit. Progress is a cooperative effort. It's not about trying to make people say yes.
Can confirm. I saw some fresh-out-of-college colleagues do this in text docs. All nice markup, but the text content was very drafty. I always sent them back to keep the formatting concept-y while they were still tuning the text.
There are people who use AI to solve problems, and then there are people who have completely offloaded all of their thinking to LLMs. I have a manager who, when asked a question, won't think about it even for a moment and will just paste back paragraphs of AI-generated text.
Most big tech CEOs are people who only "succeeded" due to having an unregulated monopoly or picking the right lotto ticket, not due to any innate above-average intelligence. Go look at the hundreds of billions in wasted capital and tell me who benefited from such waste while workers and children suffer from lack of medical care.
You honestly expect this trajectory to continue unabated?
> You honestly expect this trajectory to continue unabated?
Knowing humanity's history, yes. Not sure we're ever going to see a second French Revolution. People are pacified and are not rioting. And they really should. Most of us are kind of privileged. I know people out there who are barely holding on and the recent fuel + food price increases might push them over the edge to actual poverty.
??? The reason for every single layoff in the history of the world is to lower expenses, and that is obvious to everyone. Not actually sure what other kind of explanation would satisfy you.
But why is lowering expenses now necessary? If we believed the press releases, it’s because AI blah blah. I’m suggesting that some legal requirement for being truthful about the reason would be beneficial.
What benefit would it truly provide? Companies would simply say they need to cut costs to maximize shareholder value, which is no different than what happened here.
Well, generally speaking I think it’s a better world if corporations are forced to not lie to people.
Presumably investors and those shorting the company would benefit from more accurate information about a company. So the market as a whole would be healthier and less prone to inflationary claims.
I also don’t think that excuse would really hold up under scrutiny: “we fired 14% of our workforce to maximize shareholder value” isn’t exactly a straightforward answer. Right now the answer seems to be latching onto whatever’s trendy and blaming the layoffs on that.
If there is an expectation that reasons will be investigated, then I think you’d just get more accurate information in the market, tldr.
"Our product is stupid and probably won't sell despite the mountains of bullshit I've spewed over the last couple of years and we need to pivot so ...."
"We took out some huge debt and need to pay it off asap so ...."
"I made a strategic mistake, so ...."
"I'm hoping to get a huge rise in the stock price and make money off it somehow so ...."
I'm just joking but I think the point is that the smug person doing the firing wants to make themselves look good rather than bad and HAS to try to make the company look good to shareholders even if it's not.
And yeah, there's crossover, but they're not one to one. At the same time, if a company takes two people of equal position and fires one while keeping the other, honesty about how it came to that conclusion has value. Was the decision one of seniority? Performance? Geographical relevance? Was it favoritism masked as another reason? The person receiving the pink slip deserves to know the truth, especially where legal questions could arise and a company may say one thing while acting on another.
you think the deepest mysteries of reality and the universe should just reveal themselves because we have a couple thousand smart people working on it for... 10 years?