Hacker News

I'd be interested to hear, from people who have experience with these older systems, why they didn't take off.

Maybe it's that they were developed almost pre-web? I was using the web in 1994 on Windows 3.1 but certainly never heard of Smalltalk. I think it must have been the Microsoft ecosystem and then later the Linux ecosystem that kept "better" tools in the shadows. In 1994 Windows was a fantastic upgrade from DOS. You could have not just multiple windows, but multitasking!

These days it seems like programming platforms like node.js can take off astonishingly quickly because of faster and faster dissemination on the web. It's not just more adoption, but more contribution too.



Most languages were proprietary and expensive through the 80s. When C took off it wiped out (or much reduced) the use of expensive proprietary languages. C was free so a lot of people learned it in university and wanted to use it on the job later because they were used to it. Since it was free it was easy to introduce to companies. It established a base and then when Java came out in 1995, also free but a higher level language than C, it took off like a rocket. Now the idea of expensive proprietary languages seems absurd.

Many tools have been developed that support C-family languages and most programmers depend on those tools. It's a huge job to move out of that and into an image-based language like Smalltalk because everything changes at once. That's a major reason why new frameworks like node.js or Rails can take off quickly now: They fit into most programmers' normal workflow so they are easy to adopt.
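(To illustrate the workflow point: a Node.js module is just an ordinary text file, so diff, grep, and version control work on it out of the box. The file below is a made-up example, not from any real project.)

```javascript
// word_count.js — a hypothetical example module. Because it's plain text,
// you can grep it, diff it, and commit it like any C source file;
// a Smalltalk image gives those tools nothing to point at without an export step.
function countWords(text) {
  // Split on runs of whitespace; filter(Boolean) drops the empty string
  // produced when the input is empty or all-whitespace.
  return text.trim().split(/\s+/).filter(Boolean).length;
}

module.exports = { countWords };

console.log(countWords("the image is the source")); // prints 5
```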

I worked in an image-based language in the 80s and it drove me crazy that I didn't have diff and grep. I was glad to leave that language when I moved on to a job that used a conventional text-file language. I'm sure I'm not the only one who found it awkward.


Execution. Execution. Execution.

Really. Ideas are 1% of the work. Doing the idea is 9%. Doing it well is the other 90%. Different things matter every time, that's what makes it so hard.


And the missing 'other' 100% is getting anyone to take notice!

One can have the most wonderful, state-of-the-art, mature product - that remains largely unknown by the 'majority' because... well, why?? I still can't figure out this last bit...

(Modern Smalltalk is, unfortunately, a perfect example of this.)


I toss the "take notice" into the execution pile. That's why it's 90% of the work. That includes understanding why brands fail regardless of the quality of the good or service. It includes understanding the pains of your userbase, the expectations they have going in, and what you have to do in order for them to converge and spread your technology in an organic manner.

It is the holistic picture that constitutes success. That's the "other 90%" and it's a black art.


This is not useful. Great execution on a bad idea still results in a bad experience. Truly, you need both: great ideas and great execution. You can't just brush ideas off as 1% of the work and expect to win.


Surely. I think I'm a bit bitter from all of the would-be-CEOs that have a wonderful idea and just need the rock-star programmer to um, do the design, engineering, QA, customer feedback loop, product definition, you know, the actual work.

Let's take youtube; "I want to watch a video on demand". Really? What a brilliant 1920s style idea I've never heard before. It was how they pulled it off that matters.

What about an iphone/android? "I want one device that works as a PDA and a phone that does everything I need." Really? Never heard that idea before.

Every now and then someone comes along with a truly innovative and truly brilliant idea; and may God Help Their Soul.

I've personally suffered from being ahead of the curve many times. Or was it bad execution? In 2002, I had this AIM bot that you would send small messages to, and it would post them on a website under your AIM name. You could follow your AIM buddies and see their messages.

Yeah, it's called twitter. I did it on top of AIM in 2002. In 2003 I did another bot that would proxy messages between anonymous users. Yep, you heard of that too, it's called Omegle.

Then in 2003 I did a multiplayer extension to an NES emulator that used DCC IRC connections so that, Fserve-style, you could play well-known games with random people over IRC.

Sound familiar? Not yet. There will be a node.js/socket.io version of this soon by someone, I know it.

So yeah, even with novel ideas, I still think the 1/9/90 rule applies. Within that 90 lies the very important element of timing, along with, of course, target audience.

So I'm not trying to brush off the necessity of true ideas, just trying to put their importance in perspective. Look around: many of the successful things that we use are totally devoid of true innovation (as in, I'm not using the First One). The desk I sit at, the monitor I use, the keyboard I type on. They are just decent executions of old ideas.


The Microsoft dev tools were important for PCs, but they weren't the only game in town. Lots of people were still using Turbo Pascal and later Delphi -- there weren't a lot of tools that could generate Windows binaries yet. Some people were using dBASE, FoxPro, etc. Watcom C targeted 32-bit DOS extenders, which were important for games.

The most important thing was (and still is) that you could easily integrate with the OS and UI framework of your choice. Smalltalk couldn't provide that because it tried to replace all that with its own code -- which the implementor would probably further tweak into unrecognizability.

UNIX of course was its own world at this point, but it also was driven by pragmatics. You aren't going to write your MUD in Lisp if it's going to thrash your swapfile, or if your friend down the hall can't hack on it with you.

It's also questionable whether these tools were "better". The one time I tried to use VisualAge all I could think about was getting back into my comfy text editor. I am hopeful, however, that we can eventually improve on the age-old edit/compile/debug cycle.


Most of this predates PCs and Microsoft dev tools (unless we are talking about Microsoft BASIC) by a good couple years. When Smalltalk was conceived there was no OS or GUI framework to integrate because all the OS did on a personal computer was file IO and start programs.


True enough, and there's an interesting history of Smalltalk on early Apple computers that I wasn't aware of: http://basalgangster.macgui.com/RetroMacComputing/The_Long_V... Makes sense, since you can see the Smalltalk influence today in Objective-C.


One interesting reference on the subject is the final section of Richard Gabriel's Patterns of Software[1].

And just to play devil's advocate: both Smalltalk and Lisp were designed ex novo to do things that had never been done before on machines that didn't yet exist. In contrast, node.js was built on top of pre-existing languages and tools already familiar to its target audience, to solve broadly similar problems in familiar environments.

Not that there's anything wrong with that — but it's hardly a fair comparison.

[1] http://www.dreamsongs.com/Files/PatternsOfSoftware.pdf


VisualAge for Java is another example of a Smalltalk-inspired visual programming environment that never quite caught on. It had a lot of cool advanced IDE features, particularly for its time. It was also slow and kind of crashy, a complex program in its own right.


VisualAge Smalltalk begot VisualAge Java, which then became Eclipse, although some things changed a lot over the transition (more interop, no images). I think IBM in turn got VisualAge from their acquisition of OTI; Dave Thomas has the history, you could ask him if you ever attend OOPSLA.



