
I think you have two common misconceptions here.

1) Wages have very little to do with value/productivity in a free market. They are almost entirely determined by supply and demand. Value simply places a ceiling on compensation. Thus, if far more people can perform a "programming" job because of GPT-X, unless the demand for those jobs rises significantly the net result will be wage reduction.

2) There's this weird thinking on HN that since a developer's job involves [insert your % estimate of time spent actually coding/bug fixing] and the rest is figuring out requirements, dealing with weird requests, planning, etc. that means developers can't be replaced. However, I don't see a whole lot of discussion around what the difference is between a developer and a competent business analyst in a GPT-X world. The latter can be had for significantly less money, requires less training, and if the actual programming part is largely automated away or concentrated in the hands of fewer "real" developers, those roles start to look awfully similar.



It's not that more people can do programming with GPT-X around, because the AI will only solve problems that have already been solved thousands of times in the past, in slightly different ways. What GPT-X cannot do is left to real CS people. So instead of coding CRUD systems, we can do real algorithms research again, except fewer people are capable of it.


> competent business analyst

With the danger of invoking the "No True Scotsman" fallacy, I'd say that a competent business analyst is even more difficult to get hold of than a competent programmer.

I've had so few managers that were competent at managing people and projects.


Business analysts aren't usually (project) managers.


The developer can prompt for a solution with specific storage/performance requirements by specifying an algorithm, and with specific scalability requirements by specifying an architecture. Imagine a business analyst prompting for an app, and getting a ruby on rails monolith with O(N^2) performance for the core behavior for a service that is expected to have millions of requests daily.
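The gap between an O(N^2) and an O(N) solution is easy to make concrete. A minimal sketch in Python (the function names and the duplicate-detection task are invented for illustration): the naive pairwise version is the kind of thing a vague prompt might yield, while specifying "single pass with a set" pins down the linear one.

```python
# Two ways to find duplicate IDs in a request log. At millions of
# requests daily, the asymptotic difference dominates.

def duplicates_quadratic(ids):
    """O(N^2): compare every pair -- what a naive prompt might yield."""
    dupes = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if a == b and a not in dupes:
                dupes.append(a)
    return dupes

def duplicates_linear(ids):
    """O(N): one pass with a set -- what a developer would specify."""
    seen, dupes = set(), set()
    for x in ids:
        if x in seen:
            dupes.add(x)
        seen.add(x)
    return sorted(dupes)
```

Both return the same duplicates; only the growth rate differs, which is exactly the kind of requirement a non-technical prompter would never think to state.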


> Imagine a business analyst prompting for an app, and getting a ruby on rails monolith with O(N^2) performance for the core behavior for a service that is expected to have millions of requests daily.

I see this as the main argument against "we will just have tools that allow managers and BAs to do what devs do now". I think folks often forget that there are two sets of requirements for every app: business requirements and technical requirements. Non-technical folks might understand the business requirements very well, and may even be able to write code that satisfies them, but the real value of a dev is squaring those business requirements with the technical ones. A BA might look at a DynamoDB table and say "yeah, let's just run a scan for the items we need", whereas a dev will look at the same problem and say "yeah, we can do that, but it will cause issue A, issue B, and sometimes issue C". Knowing those gotchas is why you have devs in the first place: a dedicated person who knows them and makes sure your organization avoids a footgun in prod.
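The scan-vs-keyed-lookup gotcha can be shown in a self-contained sketch. This is not the DynamoDB API (a dict stands in for a table with a partition key, and all names and numbers are invented); it only illustrates the cost shape the dev is worried about: a scan reads every item, a keyed lookup reads one.

```python
# Toy model of the "let's just run a scan" gotcha. A dict stands in
# for a keyed table; sizes and field names are invented.

table = {f"user#{i}": {"id": i, "plan": "pro" if i % 100 == 0 else "free"}
         for i in range(10_000)}

def scan_for_user(items, user_id):
    """Full scan: reads every item -- cost grows with table size."""
    reads = 0
    for item in items.values():
        reads += 1
        if item["id"] == user_id:
            return item, reads
    return None, reads

def get_user(items, user_id):
    """Keyed lookup: one read regardless of table size."""
    return items.get(f"user#{user_id}"), 1

_, scan_reads = scan_for_user(table, 9_999)
_, key_reads = get_user(table, 9_999)
```

Both calls return the same item, so the BA's plan "works" in a demo; the reads-per-request difference is only visible to someone thinking about the access pattern, which is issue A (cost), issue B (latency), and sometimes issue C (throttling under load).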


The follow-on prompts would be to refactor the existing system to solve the scalability issues. You'd need to be able to feed in the existing codebase for that, though.


1) Economic theory says that marginal product of labor (value) is equal to wages, at least in simple models.

2) With GPT-4 you still need to know how to program. A product manager can't replace you.


1) The real world is not a simple economic model. The wage rate is roughly equivalent to the cost of replacing an employee, not their marginal value. If your argument were true, company profits would tend toward zero as wages rise.

2) I specifically did not say GPT-4. If you think v4 is the peak of what will be possible, given how far we have come in just two years, then I don't know what to tell you. Also, a product manager is not a business analyst.


1) I think it's approximately true in the real world. If you can hire an employee who adds substantially more revenue than their wage costs, you keep hiring until the value created by the marginal employee is close to their wage.
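That hiring logic can be sketched numerically. All figures here are invented for illustration; the only point is the stopping rule: hire while the marginal revenue product of the next worker is at least the wage.

```python
# Toy model: hire until marginal revenue product ~ wage.
# All numbers are invented for illustration.

WAGE = 100_000

def marginal_revenue(n):
    """Revenue added by the n-th employee, with diminishing returns."""
    return 400_000 // n  # 400k, 200k, 133k, 100k, 80k, ...

n = 0
while marginal_revenue(n + 1) >= WAGE:
    n += 1
# The firm stops at n = 4: the 4th worker adds exactly the wage,
# and the 5th would add only 80k.
```

This is the simple-model version of the claim; the disagreement upthread is over how closely real labor markets track it.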

When discussing the economy, it's good to start by understanding the situation through the lens of economic models and then consider to what extent the conclusions apply to the real world.

By your own argument: above you said that xyz is true in a free market, but real-world markets are not free...

2) And I specifically said that I'm assuming no dramatic improvement beyond GPT-4. As for the second misconception I supposedly have: I didn't even make that claim.



