A lot of people will give a list of technical reasons (e.g. native platforms are "bad"), but in my opinion those tend to be considerably exaggerated. Cocoa/AppKit, Win32, etc. aren't sexy or buzzwordy, but they are deeply capable, and in the hands of a knowledgeable developer they can save a lot of time: there's no need to write a bunch of widgets from scratch, or to contend with all the caveats and gotchas of half-baked widgets from third-party libraries.
I think the biggest driver is actually a shift in software licensing models. Web apps are far more compatible with subscriptions and SaaS: unlike one-time purchases, access can be easily revoked or diminished; they can't be pirated; and user data living in some datacenter somewhere makes it a cinch to keep customers locked in. In short, the model strips control from the user while handing more to the developer. That has far greater implications for developer profit margins than the tech stack alone does.