Feels like a take from an alternate reality. I can’t think of a single great developer I know who was not self-taught. In my experience, if you got the will, drive, attitude, and curiosity, you’ve had them for a while, and any given situation can only slow or accelerate your pace. And if you don’t got them, you don’t got them, and no sage shove from the outside is going to help.
I think you need both intrinsic talent / motivation and external guidance.
Like with research, nobody can take arithmetic and single-handedly discover calculus, number theory, and higher concepts. You're not going to discover good software development paradigms without reading about some of them. You especially can't write anything useful without using high-level libraries and working with others.
At the same time, I think intrinsic motivation significantly helps in learning these concepts. To some random person, learning about "one function, one purpose" could be like learning random historical dates is to me. "Why can't I just copy/paste the code? Why do I need good function names?" These people would have to discipline themselves and power through learning this stuff. But I didn't have to discipline myself, because for some reason I was genuinely interested in writing "clean" code and making my development more efficient. That gave me an advantage. And there are people who love writing code more than I do, so when I get tired and brain-fogged after a couple of hours, they keep writing.
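To make the "one function, one purpose" idea concrete, here's a toy sketch (all names and numbers are invented for illustration) contrasting a do-everything function with small, well-named ones:

```python
# Hypothetical example: one function that computes a subtotal,
# applies tax, and formats a receipt all at once.
def process(order_lines):
    subtotal_amount = sum(qty * price for qty, price in order_lines)
    total = subtotal_amount * 1.08
    return f"Total due: ${total:.2f}"

# Versus small functions, each with one purpose and a descriptive name,
# which can be tested and reused independently.
def subtotal(order_lines):
    return sum(qty * price for qty, price in order_lines)

def apply_tax(amount, rate=0.08):
    return amount * (1 + rate)

def format_receipt(amount):
    return f"Total due: ${amount:.2f}"

print(format_receipt(apply_tax(subtotal([(2, 3.50), (1, 5.00)]))))
```

Both versions print the same receipt; the difference only starts to matter when you need to change the tax rule or reuse the formatting elsewhere.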
I'm not them. It's better to assume we all need help and direction. Just about everyone I've ever worked with has had a beautiful insight or three. The real trick seems to be finding a process that produces good-enough results and isn't so soul-crushing that it extinguishes those rare brilliant insights.
Academic citations work the same way: saying thank you to the people who helped builds their reputation, and is a positive-feedback cycle of growth and joy. Thank you jfoutz for humbly saying that you also need help and direction; me too.
You need to add Gauss, who basically figured out arithmetic on his own, the way he tells it, but you probably need to remove Ramanujan. Yes, he was brilliant and self-taught with the proper material as inspiration, but where he ended up wasn't understandable to other mathematicians, and neither were they to him.
In my career, I've had the pleasure of watching a few developers go from enthusiastic newbies to great developers. I'd describe them as self-taught, but none of them developed the whole field of computer science by themselves. They got there by reading other people's code, by writing bad code and getting shown a better way, by reading books and articles and taking a few early CS classes. Of course, I know a few great developers who are older than me. They never made such mistakes or needed such help that I saw. But they were novices once, and they learned from others who came before them.
Nothing in the article contradicts your statement, IMO. Literally none of the things she mentions are things that are taught in school. She’s more stressing the importance of learning from peers, not some guru from on high.
Yes, though it does give a feel of dismissing people who mostly learned on their own.
In my experience, these two learning sources - your peers at work, and your own research - yield different types of knowledge. To show you what I mean, let's dissect an example from the article:
> Think about the difference between being able to write functions that print out to your terminal versus creating a class with methods that return text to pass to other methods that checks for sanitized inputs, and then passes it to a front-end.
This belongs to the "object level" of working with code: how to write correct and efficient code, how to build the right abstractions, how to make it work in the context of a larger system, etc. This kind of knowledge is very amenable to self-directed learning: studying books, writing code, reading code, playing around, getting a feel for handling complexity, for how thoughts map to code, and code maps to execution.
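The progression the article describes can be sketched roughly like this (the class, method names, and sanitization choice here are all invented for illustration, not taken from the article):

```python
import html

# Beginner stage: a function that prints directly to the terminal.
def greet(name):
    print(f"Hello, {name}!")

# Later stage: a class whose methods return text instead of printing it,
# and sanitize input, so the result can safely be passed on to other
# methods or a front-end.
class Greeter:
    def sanitize(self, raw):
        # Escape HTML special characters so the output is safe to embed
        # in a page.
        return html.escape(raw.strip())

    def greeting(self, raw_name):
        return f"Hello, {self.sanitize(raw_name)}!"

print(Greeter().greeting("  <b>World</b> "))  # Hello, &lt;b&gt;World&lt;/b&gt;!
```

The jump isn't in the syntax; it's in thinking of your code as a component other code will call, rather than as a script whose output a human reads.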
> Now imagine that that class is a function that has to be packaged to work in the cloud. And, on top of that, imagine that the function has to be version-controlled in a repo where 5-6 people are regularly merging code, pass CI/CD, and is part of a system that returns the outputs of some machine learning model with latency constraints.
This belongs to the "meta level" of working with code: how to write code in a business context, how to collaborate with colleagues. These aren't software skills - they're business skills and people skills. Both are learned best through experience on the job.
Point being, the two types of knowledge/experience are somewhat orthogonal, though they reinforce each other. You won't learn how to write good code by just interacting with your peers at work, not unless one of those peers is learning independently and applying their knowledge to raise the craftsmanship level in the shop. But then, working in a team, under time and budget pressure, introduces tradeoffs that affect the way you code, in a way you just can't reproduce on personal projects.
To sell your skills to a business, and for that business to make use of them, you really need both types of learning. Companies understand the need for learning from peers well (perhaps too well, it's becoming a pro-office argument), but I wish they also understood the need for self-directed learning better, and allocated resources accordingly. As things are right now, I feel the progress in our industry mostly rests on people who are either paid to do R&D, have time to do learning off-work, or are just learning independently at work and not telling their boss.
As someone who’s had a grand total of 3 CS courses, which mostly taught things I already knew, I didn’t find it at all dismissive. Remember, it says “no one becomes a good software engineer by themselves” (emphasis added), not “you can’t learn to program well on your own” or “you can’t effectively self-study CS.” There’s a big difference.