Hacker News | new | past | comments | ask | show | jobs | submit | toadpipe's comments

What a crock. The things we know amount to nothing compared to the things we don't, and education doesn't even move the needle on that dial. So what? What matters is what you need to know to accomplish some goal vs. what you know right now, and that's much more tractable. Focusing on the (incorrect) idea that no one else knows anything either so you're just fine the way you are is pabulum designed to boost self-esteem at the expense of actually learning something.


I think you missed the point in the last section. I think a lot of people hesitate to seek out answers out of shame that they do not already know them. I'm not saying to be comfortable in your ignorance. I'm saying: don't be ashamed to ask questions.

And yes, as the rest of the article states, the third category is incredibly (even inconceivably) large. In case I actually need to say it, those pie charts are for illustrative purposes only, i.e. not drawn to scale.


Niceness is not the same thing as determination. Also, trying to flatter someone described as a good judge of character doesn't seem like a good idea to me. /serious


People who give up too early, for any reason.


That's pretty good. Basically, people who are prone to give up when they encounter resistance; people who are the opposite of tough.


Pretty much the entire industry, as well as the academic system, is based on the premise that the vast majority of code can and should be written by programmers borrowing heavily against great black-box systems: great in the sense of being large, even more than in the sense of being high quality or thoughtfully designed. These take the form of languages, compilers, libraries, operating systems, databases, and other applications, written by a small fraction of programmers and mostly built to make sure the rest can't screw things up too badly. Leveraging Ruby on Rails to throw together a barely functional website/app is just one of the more blatant examples of the pattern that pervades the pipeline now producing most of the software in the world.

There seems to be a large demand for software, and the system is good at producing a lot of software.


If you are not interested in learning how to do it properly, then hire someone who is. Failing that, do not attempt to do it in C.


It can increase people's commitment to compassion too. Like Lisp or Forth, it is an amplifier. Actually, placebos are a better analogy: both are imaginary social support that allows a body to commit resources that would otherwise be held in reserve.


Expressiveness in a single line of code is such a stupid metric.

magic()

What's that? It's a function call. What does it do? Anything. Everything. Why does it need a complex grammar? It doesn't. Spend your time thinking about the problem, not the language. What a concept!

Do simple things take many lines of code in C? Yes. If you only have a few data structures and algorithms in your mental toolbox, simple things will take many lines of code in C, or C++, or in any language. Why do people advocate baroque languages by claiming that you cannot do simple things simply except with baroque languages, and then blame people for complicating their programs by using that baroqueness?

Yeah, C++ programmers are feature junkies; the language is designed by a feature junkie and promotes language features as the solution to every problem. If you go against that, you are going against the whole culture.

If PL/I was the fatal disease, C++ is the shambling corpse.


Wrong: C++ promotes library features before language features. Language features are carefully considered before they are included or dropped.

In fact, the entire language is designed with care, which should be obvious to anyone who has read The Design and Evolution of C++ or has followed C++'s evolution.


Yes, really careful... "Oh look, another shiny feature. Let's add it!" ;)

But kidding aside: the constraints imposed on C++ during its design (C backwards compatibility), combined with a desire to have every feature under the sun available, have led to an overly complex beast.

Don't tell me you looked at e.g. C++ lambda functions and thought that was "good design". It gets the job done, but that's the best you can say about it.


If someone could come along and invest the millions of man hours required in making a performance-critical high-level language without the C-baggage

As long as it also had C syntax, and something close to the C memory model, and was recognizably object oriented and/or functional. And came with a lot of libraries. And 3D graphics engines. And physics engines.

Unless some independent game developer does something in a new language that other developers can't easily duplicate with their current ecosystem (not that it couldn't be done in C++), and it catches on, there's no incentive to do anything other than continue to evolve things in the most backwards compatible way. There are always more C++ programmers coming off the assembly lines who want nothing more than to work on games.


This is really tempting, but my guess is that the answer is going to be no. Users want something that just works, and they would probably rather dig through a pile of stuff to find that one thing that just works rather than script it themselves. Elegant solution > pile of features > elegant scripting > source. Or you can do it all, like Excel. Even Word has scripting.

I think ahoyhere has alluded to the reason why the pile of features tends to win, and that is because it is better at leveraging the economies of scale in the shrinkwrap software business. It is so cheap to distribute software that you are much better off building for the mass audience.

Now the hot thing is webapps, which do not scale nearly as well (but avoid most of the junk that comes with shrinkwrap scaling, like having to deal with a strange machine and a crappy OS that deluges the end user in spyware, or pushing updates to users you don't know much about), so there is more incentive to meet the needs of a niche audience. And Apple is having some success sort of splitting the difference (vertical integration from the hardware up, and attempting to exert more discipline on developers to increase quality, all at the expense of distributing software to mass audiences). Microsoft, of course, is still in a pretty dominant position from exploiting this scaling to the max when IBM so graciously made the hardware into a commodity mass market item.

So far, the most successful web app (Google) is the one which has made web app discovery scale like nothing else (every page with useful content on it is a web app - code, data, it's all the same). The brilliance of Google is that it captured a mass end user market with a minimal user interface. They did it with math (maximal leverage of plain text queries plus the structure of the web, and similarly with ads which have simple interfaces at both ends and are backed with sophisticated algorithms), and scaling by imposing unusual amounts of internal discipline on the best developers they can get (and they can get pretty good ones).

All this just to get to a position where they might be able to compete with Microsoft as a platform, and unseat the power of shrinkwrap scaling. I think the only way they can do it is by attracting a lot more developers than Microsoft, and the only way they can do that is by taking a lot of Microsoft's developers away. They might be able to do it, but there are all sorts of challenges.

My guess is that what will happen is that they will draw a lot of developers away from Microsoft, and in the process they will lose a lot of their external cultural influence as far as being able to promote clean interfaces. Web apps will be even more dominated by the everything-in-one-place aggregators (exemplified by Amazon and eBay) than they are now.

The problem with nice UIs is that there just aren't enough good developers to make it scale.


In my opinion, a lot of The Software Problem(tm) can be traced back to this tendency to underestimate the difficulty of using magic black box X. Maybe this is because most programmers never try, themselves, to make a real magic black box X that will be used by random person Y in random situation Z. If they did, they might have a little more respect for the fact that it's really fucking hard, that the interface will inevitably have all sorts of corners, and that the result is never going to be as magic as you would like.

