
"Continuous Delivery", huh?

Yeah, let's replace one consulting fad (the 4th or 5th I've seen in my career; I entered when "Waterfall" was still in vogue, then XP, then Agile, with some variations of each too) with another.

How about this methodology: http://programming-motherfucker.com/



"Extreme Programming" always made me giggle. I mean, during that era, I was rarely seen without a can of mountain dew by my desk, but still.


Kent Beck, who created much of XP, is quite sarcastic in what he writes at times. I think it's meant to be a tongue-in-cheek sort of title, so giggling at it is probably the right response. XP is not extreme at all; in fact it's quite reasonable.


People tell me that waterfall was never really a thing, that it was only ever a name for how things should not be done, and that nobody ever actually advocated it.

Is this not true? I'm too young to remember what came before agile.


Pure waterfall as it's depicted by Agile consultants was never a thing. Think about it: how could you possibly code an entire app and only then debug it?

However, there was a waterfall-like process that was widely used. After getting the reqs and design in place, you started coding and you mostly coded. You would compile as you went along and made sure that your code worked on the handful of tests you put together as you went along. When the app was mostly coded, you then started more serious testing: writing more elaborate tests and verifying functionality. This step often highlighted buried errors that you then spent time debugging, writing additional tests, and so on.

So, the biggest innovation in Agile was its orientation towards developers testing concurrently with coding. Which, of course, has had many other ramifications.
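To make "testing concurrently with coding" concrete: a minimal sketch of the habit, where the test is written alongside the function rather than after the whole app is done. The function and test names here are illustrative, not from the thread.

```python
# Test-as-you-code: the test lives next to the code it checks,
# and both are written in the same sitting.

def parse_price(text):
    """Parse a price string like '$1,234.50' into a float."""
    return float(text.replace("$", "").replace(",", ""))

def test_parse_price():
    # Written immediately, not deferred to a separate testing phase.
    assert parse_price("$1,234.50") == 1234.50
    assert parse_price("99") == 99.0

if __name__ == "__main__":
    test_parse_price()
    print("ok")
```

Contrast with the waterfall-era flow described above, where serious testing only started once the app was "mostly coded".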

While the old way of doing things might sound clunky by today's standards, when done right it actually worked better than its reputation suggests. Because devs hated spending hours in the debugger, they tended to code very carefully. The concept of "let's code this and run it through some tests to validate it" was unknown. Rather, you coded so that you were pretty much sure that what you were writing would work properly if it compiled without error.


> Pure waterfall as it's depicted by Agile consultants was never a thing.

Yes it was.

> Think about it, how could you possibly code an entire app and only then debug it?

Which is why programmers hate waterfall so much, but you're absolutely wrong to think this isn't exactly what was being attempted. Time and time again, management, in an attempt to cut the cost of programming time, thought the way to do it was to first build specifications for everything: prototype the entire app up front, storyboard every screen, and write mountains of specs, because naturally they think that's how you build things, with detailed blueprints so all the decisions have already been made. They couldn't be more wrong, but it's certainly the most natural way to think if you aren't a programmer and don't know better.

> So, the biggest innovation in Agile was the latter's orientation towards developer testing concurrently with coding. Which, of course, has had many other ramifications.

Agile in general had nothing to do with testing; the big change in Agile was removing the big design up front, the specs and meetings and months spent planning something before it was developed. Some Agile methods like Extreme Programming certainly had testing as a big part of their process, but what differentiated Agile from waterfall was iterative development, where work was done in short one- or two-week cycles and then delivered, whereas waterfall wasted enormous time trying to nail down details that simply ended up being wrong come programming time.

tl;dr: Agile is not about testing; it's about iterative development in short cycles with little planning, and always has been. That's what made it different from waterfall: no "Big Design Up Front".


"Pure waterfall as it's depicted by Agile consultants was never a thing."

This is correct.

The first paper to describe a stepwise model was by Royce in 1970 [1]. The model he is describing is hypothetical and does not use the term Waterfall.

The first use of the word "waterfall" (including the quotation marks) is from 1976 [2] and specifically refers to [1], the hypothetical model. In [2], the writers specifically state that "so few" projects fit this scheme.

If you were programming in the 80s and early 90s, you would know that no one in programming ever referred to a "waterfall" model.

Even Kent Beck's seminal book on XP, written in 2000, describes many failures of software development in those days, but he does not once use the word Waterfall.

So, in summary: the paper that supposedly describes it describes a hypothetical system; the paper that first uses the term mentions how little the model is used; the term wasn't used by people during the era it was supposedly most popular; and the folks who initiated the new generation of software dev don't refer to the model.

I think it's safe to say that it was not a thing. Or, to be more accurate, if it was a thing, it was never the thing it became until the Agile consultants used it as a strawman.

[1] http://www.cs.umd.edu/class/spring2003/cmsc838p/Process/wate...

[2] https://goo.gl/eYLBN7


The term "Waterfall" was not applied, but it absolutely was very common to use the process that it implies.

Many man-years were spent gathering requirements, building lists of features, and interrogating stakeholders.

People knew they would never get a second chance to get what they wanted, so they would throw in every feature they could think of.

Then the developers would build it: more man-years of building. Mostly they tested as they went along, but... different features and areas of the app were often built in isolation, so the most you could say was that you had tested what you had built.

The stakeholders were consulted with screenshots, and sometimes the app itself, but the Big Fear was new features and additional requirements (which would cause massive contractual headaches), so any contact was VERY carefully controlled.

Also, it was assumed every project was massively valuable IP, so detailed designs and features were held very close to the chest... again, any contact was VERY carefully controlled.

Eventually, after months or years of effort, the various parts of the app would be "merged". Omg the clusterfuck that would occur: the compromises, the arguments, the agonizing...

..then more testing..

Once that was completed, the brilliant new creation would be ceremoniously unveiled to the users, to cheers and relief.

Finally, the contractual battles would begin as the paying company realised they had asked for entirely the wrong thing, and the development company tried desperately to cover their ass and make a profit.

The single biggest problem that Agile fixes isn't the testing; it's the disconnect between what the client wanted and what the developers actually deliver.


I was there in the mid-nineties, and it was absolutely a thing in enterprise style IT.

There were also several books on the matter, and also revised waterfall models. The term itself was not always used early on (though by the nineties it was), but the schemes were the same.

Lots of public projects, including some I've been involved in, were designed and managed in such a way.

There were no changing requirements until the thing was delivered, which could take 1-2 full years. The design phase resulted in monstrous 400- or 800-page documents covering every aspect, and to apply for such a software tender you needed to write those in excruciating detail (I did that too, for several projects).

The waterfall model, with the name and all, was also taught at our university (early/mid nineties). XP wasn't even mentioned back then at our level, though we discovered it on our own, reading Fowler, Beck et al in the late nineties.

There's some tinfoil conspiracy theory thrown around that Waterfall was never a thing, etc., and that it was just used to push Agile. I wonder what its adherents did in the 80s and 90s, but surely not programming in large enterprise/public/military software projects.

(That Waterfall cannot work is another thing altogether -- nobody did it 100%, as nobody does Agile 100% today).


People would refer to "The Software Development Life Cycle", which was by all means a waterfall-esque process (look it up on Google and you'll see the classic waterfall diagram used for it). The word itself may have been coined as something to compare Agile against, but the process itself existed, and existed long after Agile had "won" in more traditional companies.

I ran projects that ran through a VERY traditional waterfall process as recently as 2010.


Saying that the term was never used isn't really refuting the point. It doesn't matter whether you coded back in those days. Anyone who looks at (or used) the software being produced back then can see that continuous software delivery wasn't really the norm, probably due at least in part to technical limitations. I'm thinking about your average Microsoft product, but there are plenty of examples even today of poorly-designed waterfall projects (e.g., healthcare.gov for Obamacare).


You're wrong. Look at the responses you're getting; many of us were there in the 90s, and if you think waterfall wasn't the dominant method of development then you just don't know what was going on back then. It doesn't matter whether they called it "waterfall" or not; what matters is that it's simply what they were doing: massive requirements gathering and spec writing, with development treated like construction from a blueprint at the end of a months- or years-long cycle. That's how development was done before Agile came around.

Waterfall still isn't dead and will never die, because it's how people think by default: plan it all out, then build it. That's how buildings are made, and thus it becomes the default methodology of all new managers who don't know what they're doing.


Parent just took a "conspiracy theory" thing that's popular around Waterfall and ran off with it.


This thread makes me feel old.


This is absolutely wrong; I have seen first-hand people seriously advocating (and actually trying to practice) it, even within the past 10 years.

The thing to remember, though, is that the 'pure' version of the process (in which you never return to the previous level, hence the name 'waterfall') was regarded as an ideal, not as something achievable. Nobody expected that there would be zero bugs found in testing that would require more coding work (why test otherwise?), but it was regarded as a kind of moral failing, to be met with a lot of hand-wringing about why we can't be like real engineers, who never make mistakes &c. And the same applied to revisiting the design or requirements phases.

The next level of idealism held that even if avoiding returning to a previous stage was impossible, you could at least limit it to the immediately preceding stage. As Royce pointed out in the first paper to provide a convenient diagram of the established software lifecycle to cite, this never happened in practice either.

Over time the necessity of doubling back became less of a moral failing and was incorporated into the teaching of the process, without any acknowledgement that the entire concept was fatally flawed. It was also succeeded by several iterative but still waterfall-like processes (starting with Royce's paper and eventually followed by Spiral and RUP). Thus there are many folks who still believe that the pre-Agile process (they may not recognise that there was more than one) was just fine, and that the pejorative 'waterfall' was applied to something that never really existed. In a certain sense that's true, but it gives a completely misleading picture of the history.

Agile's contribution was the idea that any or all of the phases could be happening simultaneously - and, more importantly, that was OK. Plenty of people had been ignoring the orthodox methodologies and working that way, of course, but don't believe anyone who tells you that was an orthodox idea before Agile.


It depends.

Much like most "scrum" teams don't really do by-the-book Scrum, it was very rare to do by-the-book waterfall.

If you defined waterfall to mean

- get every requirement written up in absolute detail before you do any design

- finish the design in great detail before you write any code

- if the design process identifies problems with the requirements (missing/ambiguous/etc), then stop the design work and go back to requirements phase

- write all the code before you start testing.

- if during development you find an issue with the design, stop development and go back to design

etc

Then I never saw any project that worked that way.

But it was quite common to have a requirements gathering exercise, write up a document that covered the requirements, get that "signed off", then do a multi-stage design phase (usually we'd start that before requirements were signed off, since the probability of the requirements being rejected in their entirety was pretty low), sign that off, and then move into development to implement that design. And then test the whole thing at the end once it was "done".

If at any stage we found missing requirements, that would get raised as a scope change. If, during development, we found that the design was broken in some way, we'd have a design change (which was usually lightweight and was a couple of emails saying "we can't do X, so we're doing Y, OK?")


Only its opponents call it Waterfall, but I've definitely worked in environments like that back in the old days. We got very little done.


That methodology is awful. And kind of embarrassing.



