
I worked as a civil servant for a decade and a half in various capacities, up until a couple of weeks ago. I'm the last person who would tell you there isn't plenty of fat that could be trimmed. Slicing at random, multiple delayed-resignation opportunities, and threatening cuts to benefits, however, are doing the opposite. Those who are skilled enough and in demand, or, like me, lucky enough to quickly find other employment, are the ones who are going to leave, leaving behind nothing but the fat.


The people who should be fired are the very people who would be best at justifying why they shouldn't be. That's not a coincidence.

"Slicing at random" could actually outperform most other methods, as long as it's truly random. You can weasel your way out of a firing based on vibes or performance reviews - but you can't convince an RNG that its roll was wrong.
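A seeded RNG draw is a minimal sketch of what "truly random" selection means here: the draw is uniform and, if the seed is published, auditable after the fact. The names and seed below are hypothetical.

```ruby
# Uniform random selection with a reproducible, auditable seed.
# Publishing the seed up front means no one can claim the roll
# was rigged after the results are known.
staff = %w[alice bob carol dan erin frank]  # hypothetical roster
rng   = Random.new(20240101)                # published seed
cut   = staff.sample(2, random: rng)        # two distinct, uniform picks
```

`Array#sample(n, random:)` guarantees distinct elements and a uniform distribution, which is exactly the property that makes the outcome impossible to argue with.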


Slicing at random would leave the same ratio of worker types. Random slices have the potential to cause enormous damage by removing critical contributors while also creating exploitable power vacuums. The actual solution is to just continue operating the way NASA has been operating because it's not actually a problem.


Not actually a problem? Have you seen what NASA is doing lately?

SLS. Orion. Gateway. Ambitionless Artemis. JPL's disaster of an MSR proposal. NASA reeks of rot and decay. It's not in a good place, and hasn't been in a long time now.

If you "just continue operating", it's only going to get worse.


SLS is Congress, not NASA. The core mission of NASA is space exploration and their achievements in that domain are unparalleled.

Randomly firing and cutting funding isn't a solution, especially not to a merely perceived problem. If you think the spending is excessive, you have to do the hard work of explaining why, and then the even harder work of fixing it. If that seems too hard, then maybe the spending is fine as it is.


Being able to paste source into an early version of Dreamweaver really made things "click" for me. Image maps, IFrames, and SSI really set you apart. Ah, thems were the days.


Lest we forget that Python is ~4 years older than Java.


I would've imagined these numbers to be much more than they are.


Yup, this is surprising to me too. These are the numbers of a relatively small IT/financial firm. I guess the hundreds of millions of dollars in gaming are taken by a small percentage of companies, like Rockstar or the big mobile games.


Are you misreading the chart? They're not in dollars, they're in units. On slide 28 it mentions they sold 24 million units last year. Multiply by $30-60 to get actual revenue, and it's far in excess of "a relatively small IT/financial firm".
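The arithmetic implied above, using the 24 million units and the $30-60 price range cited in the comment (the price range is the commenter's assumption, not a figure from the deck):

```ruby
# Rough revenue estimate: units sold times an assumed price band.
units = 24_000_000
low   = units * 30   # $720 million at a $30 average price
high  = units * 60   # $1.44 billion at a $60 average price
```

Either end of that band is well beyond "a relatively small IT/financial firm."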


That makes much more sense, thank you. I just looked at the table on the linked page and not the *.pdf of the deck itself.


Thank you, that makes sense.


SEGA earns $2-3 billion in revenue every year.


Keep in mind a lot of that is from Pachinko, see articles like this: https://www.tweaktown.com/news/56271/sega-made-1-billion-pac...


What caused the drop in popularity in RoR? It seemed like ~10-12 years ago RoR was the de-facto startup standard. On any given day there were multiple items on the HN front page having something to do with RoR.


NodeJS and the siren's song of using the same language on the server as the client.

Despite the popularity, node never caught up with rails in terms of features and productivity. I was part of a replatforming from rails to node some 10 years ago. So many things we had to just rewrite because there was no option at the time in node. The team lead that made the decision left half-way through the project. Second worst thing to happen to me in my career, after covid of course.


Well, also that JS is really fast thanks to V8, and TS has leapfrogged Ruby in terms of developer tooling. Personally I think Ruby is a nicer language and JS has a lot of odd quirks, but it became Java: a good, natural choice that can be used for any project.


> leapfrogged Ruby in terms of developer tooling

The only thing I would ever want from js is destructuring assignment. There's rightward assignment in ruby now but it's pretty clunky.
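For anyone who hasn't seen it, Ruby's rightward assignment (3.0+) is the feature being called clunky here; a sketch of the comparison, with hypothetical variable names:

```ruby
# In JS you'd write:  const { name, age } = user
# Ruby's closest equivalent is rightward pattern matching,
# which works but carries noisier syntax:
user = { name: "Ada", age: 36 }

user => { name:, age: }   # binds locals name and age
```

It also raises `NoMatchingPatternKeyError` if a key is missing, where JS destructuring would silently bind `undefined`, which is a sharper but less forgiving behavior.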

> became Java

Best laugh I've had all week.


Java is both significantly faster than JS and significantly less buggy/footgunny. JS is pretty much just a bad language in my mind. It seems specifically designed to create as many bugs as humanly possible. It's a lot like C++ in that way, but at least C++ is actually performant.


Mobile came on the scene, which meant you wanted an API, which split web development into backend and frontend. At the same time, the powerful-but-crusty complexities of the enterprise Java backend world, which Rails stood in opposition to, started to get more lightweight and fashionable answers in the Go/Kafka/gRPC/microservices scene. While Rails offered a very convenient overall development experience, it didn't stack up as well when considered in isolation as either a backend or a frontend technology. Much of the JavaScript integration it has today (Turbo, etc.) came after people had already moved on.


> What caused the drop in popularity in RoR? It seemed like ~10-12 years ago RoR was the de-facto startup standard. On any given day there were multiple items on the HN front page having something to do with RoR

New things are "simple", and old things are inevitably complex, which always attracts the new generation of inexperienced coders (I include myself in this). This continues until all of the complexity of the domain is captured in the "new" thing, and the cycle begins again. Rails is vastly more sophisticated than when I started using it in ~2007, when things like CSRF attack mitigation weren't even built in. So it's a better framework now, but you have to understand a lot more to get started.

Also, from ~2012 until recently, bootcamps have been pumping out new programmers who only know Javascript because it was possible to do a full-stack web app with JS, and the bootcamps would rather not teach another language.


Performance, scalability, or ultimately cost. Remember, 10-15 years ago when Twitter was using Ruby, it was a lot slower. Even without YJIT, CRuby today would still be 2-3x faster than it was then. Tooling wasn't as good, and hardware was a lot more expensive at the time. So when news spread about how RoR can't scale cheaply, people jumped to newer and shinier things like Node.

I would guess running RoR today is 100x cheaper than it was 10 years ago, and it will continue to improve as we get ZJIT or run on top of JRuby.


When Twitter was using Rails it was also using a totally broken architecture that was way too centralised. Their replatforming also involved rearchitecting.


* Node bringing JS to the backend.

* Python won in data science/analytics and AI/machine learning

* Python also seems to be the high-level language used most in academia for non-CS engineering (and CS too)

Rails continues to be relatively popular in early stage companies. Plenty of well known companies started with Rails in the last 10+ years and it continues on as part of their stack.


Around that same time, microservice architecture was the new hotness. Rails apps tend to be monolithic. Now that many people have realized that microservice architecture is often not worth the complexity costs, monolithic apps are back in fashion, and people are rediscovering how great Rails is.



> What caused the drop in popularity in RoR?

Async/await. JavaScript and all other modern languages and frameworks have a great concurrency story. Rails still doesn't (but it's coming next year; it's been coming next year for a decade).


The concurrency story in Ruby is fine. We've been using multi-process Ruby scripts in production for over a decade. Pre-2.7 Ruby had some issues, but it's been solid for years. The async/await programming paradigm is painful by comparison. Sure, there are languages out there that have been designed from the ground up with concurrency in mind, that have an even better concurrency story, but those do not put developer happiness(™) front and center.
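The multi-process pattern mentioned above can be sketched with plain `fork` and pipes; this is an illustrative sketch, not the commenter's actual setup. Each worker runs in its own process, so the GVL never serializes them:

```ruby
# Fork four workers; each computes independently and reports
# its result back to the parent over a pipe.
readers = 4.times.map do |i|
  r, w = IO.pipe
  fork do                    # child process
    r.close
    w.write((i * i).to_s)    # stand-in for real per-process work
    w.close
  end                        # child exits after the block
  w.close                    # parent keeps only the read end
  r
end

# Collect results in order, then reap the children.
results = readers.map { |r| v = r.read.to_i; r.close; v }
Process.waitall
```

No callbacks, no colored functions: the operating system does the scheduling, which is much of what makes this style pleasant compared to async/await.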


the allure of the new and shiny


Adderall, I would presume.


It seems pretty simple: we get paid on the perceived business value of what we provide. Some organizations may take a holistic approach to determine that perception (our contributions to other things, and the advancement of the industry as a whole) or not.


Articles like this terrify me, and recently they are plentiful. I have had a lifelong interest in development and have been a participant in it, doing freelance work at times, always utilizing it in my career when approved, and contributing to open source. I finished out my BS in CS while working full-time, and although my general-IT career has been pretty successful, I have always planned on trying to pivot from the public sector into a proper development team full-time.

Now that the structure of my organization, my benefits, and even the existence of my job are on thin ice (again, public sector), I have been dropping my name in the hat for open positions. My numbers are much better than OP's (landing at least a first round with ~10% of applications), but the closer I get to potential offers with some [great] companies, the more I can't help but worry about the stability, or whether this is the right choice for me and my family. My physiological and safety needs are met (i.e., Maslow's hierarchy), for now, but I have a longing for the rest of the hierarchy.

Is the industry forecast as bad as these outlooks paint it?


In a data conscious world, the complete and utter disregard for PII and lack of competency in design and implementation would result in catastrophic business failure.

They may have "patched" the ability to exploit it in this way, but the plaintext data is still there in that same fragile architecture and still being handled by the same org that made all of these same fundamental mistakes in the first place. Yikes.


> In a data conscious world, the complete and utter disregard for PII and lack of competency in design and implementation would result in catastrophic business failure.

As you are probably well aware, we do not live in that world. Companies like Equifax can suffer breaches exposing the personal information of millions, and their stock still goes up.


Sorry about that. Please fill out this class action postcard, and, if approved, you will receive up to two years of identity protection services provided by Equifax (to be served concurrently with any other court-ordered two years of identity protection services), or, if you have financial damages you can conclusively prove are directly linked to this specific identity disclosure, you may mail your evidence to the provided address for up to $1000 in restitution, pending arbitration.


Maybe they’ll send you duplicate $1000 checks if you claim to be the other people in the leak.


PII data breaches, especially of PHI data, can lead to high financial losses, mostly in the US through litigation. Fines in the EU are low in comparison.

Companies don’t like to talk about this, and they bury these costs deep down in their financial statements. But trust me, they’re quite substantial.


If that's true, then stock prices should reflect that. But that's not what we see after major PII breaches at publicly traded companies.


So you have seen the failure of Apple’s car project in their stock price?


It's worth noting that companies that are too big to fail (as I assume credit bureaus are considered) are great places to park money.


Affecting their bottom-line via litigation, less usage, whatever...

