I disagree. ChatGPT reached 100 million MAUs 2 months after launch. It’s one of the fastest-growing consumer applications in history.
Anecdotally, lots of my non-technical friends (and I) are using it for everything from cooking to learning a foreign language.
Lots of my technical friends are using it for side projects on the weekends. I’d say it’s the top new technology all of them are working with or incorporating into their workflows.
I and all of my teammates are using it to help us write SQL and answer basic programming questions.
It’s clearly a way bigger deal than VR right now.
The problem here seems to be that Snap rammed this feature into their product in a really awkward fashion that doesn’t make sense for their users. Hence the backlash.
You're inside a bubble though. The 100M MAU figure is almost certainly misleading, maybe 1/10 of that in reality, and something that's only been out for a few months can easily rollercoaster up and down as people try it once for novelty and then forget it.
I have a good group of friends who keep me grounded - we all went to my average state school alma mater, and none of them are in tech.
Not one of them has brought it up, no one uses it or cares about it, and only two of them even know what it is beyond having seen some headlines.
The playoffs have gotten about 200 texts recently, AI 0. This is closer to the reality on the ground.
I'd venture almost all of the hype is students and kids who are excited to see what mischief it can help them achieve, techies who just like playing with new things, and companies trying to cash in. All of those are non-durable.
“You're inside a bubble though. The 100M MAU figure is almost certainly misleading, maybe 1/10 of that in reality, and something that's only been out for a few months can easily rollercoaster up and down as people try it once for novelty and then forget it.”
What’s your basis for saying this is misleading and for doubting that figure?
Anecdotal friend groups aside, if there were no user traction, they wouldn’t be getting a ten billion dollar investment from MSFT.
Their growth in web traffic is also pretty impressive:
In my personal and professional life I’ve been using it every day and happily pay $20 for premium. It has replaced google for me for a huge variety of queries.
Because they’ve had limits on accounts, combined with trivial registration of new accounts, combined with counting enterprise “accounts” from stuff like Bing or API usage, which is really just the same people using them over and over. That, and startups tend to look at numbers that please them once and never question them. I don’t doubt they had a ton of accounts sign up. But MAU after 2 months isn’t a statistic, it’s a data point, and it comes from the most unreliable source possible.
Incredible growth in traffic is easy to explain - a lot of big companies and the most influential investors invested a ton of money in this. They have connections to all the top media sources. They pushed this stuff absolutely everywhere. It was a massive media blitz. You’re being manipulated. They’re hyping AI, they’re talking about doom and fear, they’re getting front pages everywhere. Everyone wants in - media wants in on the hype, social media users realize they can gain tons of viral views, it’s a giant pit of self-reinforcing hype. That happens with hype cycles. See Pokémon Go, and then what happened a short while later. Or the crypto bubble. Or any number of bubbles. You can’t base your predictions of future success on self-published popularity numbers after a huge and expensive media blitz.
The Bay Area usually commands a premium because of a) the quality of talent and b) the ability to scale out a team.
Quality of talent means not only intelligence and skill but also people who have spent years working on the specific thing you are building (hardware/firmware, AI, at scale codebases or services).
If you assume that timezones matter and that relevant experience working in large orgs is important, the Bay Area premium will continue for the foreseeable future.
Scale means you can hire 100-200 talented ICs within a year who meet the quality-of-talent criteria and have experience working and getting things done in larger orgs; the ability to do so also commands a premium.
Only a few other places in the USA have this scale, i.e. New York and Seattle.
Coinbase, the company I work for, is fully remote and does salary bands by location. Within the USA, Seattle/New York and the Bay Area are all in the same top-tier band. Also, the majority of our USA engineering workforce is still based in these hubs despite being fully remote for nearly 2 years. I don't expect this trend to change any time soon.
More like a) well-funded || revenue-generating firms competing in a limited pool of local talent and b) high salaries have yielded high costs of living, so to hire local, we must pay said premium
Coinbase is fully remote, i.e. no local talent-pool dependency, with no expectation of coming into the office. It can hire from anywhere in the USA, and still it mainly hires people in those locations.
It's not the only remote company where I've noticed this happening.
This is why I still think we are early on with tech stock corrections, i.e. current P/E ratios assume that past earnings are still accurate.
Specifically, apart from rising rates, I would expect this to eventually hit public company earnings in a big way and hence prompt more layoffs in public and private tech.
Last earnings season didn’t see much of an impact. We are a couple of weeks out from earnings; I wonder whether this quarter or the next will be where we see more layoffs and the tech job market generally tighten.
The P/E of most of the big names is still in the 20-25+ range. Way too high, and that's before earnings for the current quarter.
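As a back-of-envelope sketch (hypothetical numbers, not any particular company's): a P/E ratio is just the inverse of the earnings yield, so a 20-25 P/E means a 4-5% earnings yield, which looks less and less attractive as risk-free rates climb toward it.

```python
# Back-of-envelope: earnings yield is the inverse of the P/E ratio.
# At a P/E of 25 a stock "yields" 4% in earnings, so as risk-free
# rates rise toward that level, the equity risk premium shrinks.
def earnings_yield(pe_ratio: float) -> float:
    """Earnings per dollar of price, as a fraction."""
    return 1.0 / pe_ratio

for pe in (20, 25):
    print(f"P/E {pe} -> earnings yield {earnings_yield(pe):.1%}")
# P/E 20 -> earnings yield 5.0%
# P/E 25 -> earnings yield 4.0%
```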
In a downturn, an expensive new phone will be the first thing to get cut from consumer shopping budgets. And indiscriminate marketing spend will be the first thing cut from business spending.
Not so sure about that phone. Most people are getting them on credit/contract, so you'll just see contract terms extended and/or made less generous to compensate for higher interest rates and inflation. People will still replace their phones whenever they can because it makes them feel good.
You can see how rapidly this has been rising since the covid deleveraging. But we're now back to trend and rates/real prices will be higher.
As long as consumers continue to lever up aggressively, earnings will be somewhat supported. But we're about to run into a brick wall with maxed-out credit.
Do you have any idea why the consumer credit total in this chart doubled practically overnight in March 2010? It looks like a definition change, but I couldn’t find an explanation.
Network states are online communities that have collective agency (governance of some kind) and that eventually try to materialize on land in the physical world. A DAO could potentially become a network state, but one could also in theory emerge from a subreddit or some other online community organized around a specific thing.
Balaji has a particular vision for these network states that sees cryptocurrency as an integral part of them. It also presupposes that these network states need a moral imperative to be long-lasting (i.e. a strong purpose, like a religious community, opposition to the FDA, a dietary movement, etc.).
An important point to note is that a network state is not inherently a “right wing” or libertarian idea. In fact Vitalik references another more left leaning author, David de Ugarte, who explores similar ideas from a different perspective in his book Phyles: Economic Democracy in the Twenty First Century.
It’s entirely possible to disagree with many of Balaji’s previous positions and see this as a useful playbook for implementing a network state that aligns with your world views.
A large part of his book seems to be laying out a justification for this vision as well as its theoretical underpinnings, i.e. why this needs to exist and why it would be better than, say, moving to an existing city state.
Apart from that, it’s basically a playbook for how a community could in theory go from a loose collection of individuals on Discord to a mini city with its own regulations and laws.
Vitalik is sympathetic to much of the book but calls out 4 main issues he has with it:
1) The "founder" thing - why do network states need a recognized founder to be so central?
2) What if network states end up only serving the wealthy?
3) "Exit" alone is not sufficient to stabilize global politics. So if exit is everyone's first choice, what happens?
4) What about global negative externalities more generally?
Of these critiques, the ones that resonate with me so far are 2 and 4. I’m only about 25% through his book. In terms of 4, I think this problem already exists today with nation states, and hence I think it’s a little unfair to expect this book to address it.
In terms of 2, I think this book is written for middle-class and wealthy people who can easily move cities or countries, i.e. software engineers and scientists.
A big question for me: assuming network states happen and become widespread, what happens to all the displaced unskilled or semi-skilled global poor? What will their likely relationships with these new network states be?
How do millions of people displaced by wars like those in Syria or Ukraine fit into or impact this network state model? People who are forced to exit, as opposed to having the luxury of choosing to exit. This seems like a bit of a blind spot, even from just a network state game theory perspective.
In general I’m enjoying this book so far and would recommend people read it if they are interested in subjects like charter cities or DAOs.
I treat it as a thought-provoking work that’s not mean-spirited in tone like The Sovereign Individual.
Within my lifetime I expect to see people try and create new charter cities bootstrapped from online communities. I think this book offers a lot of useful advice on how to think about forming these communities.
But Balaji's book isn't really a playbook for how a community could go from a loose collection of individuals on Discord to a mini city! If it were, it would be a far more compelling read; instead, the first 50% could be summed up as "institutions bad, crypto will save us all, media is biased, America is just as bad as China, and India is rising." And the back half, while more interesting, basically rehashes similar examples over and over (the keto community) without many tangible details on going from 0 to 1. It could have been far more interesting if it had talked about things like the Sovereign Military Order of Malta, charter cities, SEZs, and other interim ways for a network state to come to be, but instead it was just repeated ramblings and definitions.
The big missing piece of this article is a sense of at what scale, and why, a startup should decide to invest in a piece of infrastructure like Kubernetes.
The author mentions other things he considers red flags such as using a different language for backend and frontend development with no additional context.
Is the author talking about a startup in the context of one person who just knows JavaScript, working alone on a prototype? Or is he talking about a Series B company with 500k MAUs?
Some additional context would improve the article a lot. I think the author should have had a few people read the article and give feedback before publication.
Nah, I disagree. I work in crypto (Coinbase) and was working in tech in the Bay Area in 2008.
As of today, many crypto companies have money from 2021/early 2022 raises and are still hiring. In 2008 the private tech market reaction to the stock market crash was swift and brutal.
It was really hard to get a job in 2008. Today it feels like there is a big lag between stock pullbacks and jobs drying up. None of my colleagues who were laid off are having trouble getting jobs in crypto, and they have options in other parts of tech if they want them.
To be clear I expect the jobs situation to get worse this year and in 2023.
All this panic about tech layoffs seems a little premature given we haven’t even begun to see the impact of rate hikes on quarterly earnings yet. Many companies still have open reqs and budgets to keep hiring.
I work at a company that just laid off 18% of the workforce (Coinbase) including many people in engineering. The day this happened my email, LinkedIn, Twitter inboxes exploded with companies asking me if I knew of anyone looking for a job or if I myself had been impacted.
I reached out to many of my former colleagues who had been impacted to see if they needed help. All had multiple interviews in flight and were not having trouble finding jobs.
Aside from the many crypto companies, the inbound job leads came from many public and private companies and spanned many industries.
I expect tech layoffs to get worse and the white collar job market to tighten towards the end of this year and 2023.
I expect the main driver for this will be cost reduction at public and private companies in the lead up to earnings or quarterly reports. Main cost for a tech company obviously being labor.
The thing that seems so weird to me about the current state of the economy is that you have lots of "smart people" shouting pretty loudly "A recession is coming! Is going to be bad! We may already be in one!!" I don't remember any recession - not the early 90s recession, not the .com bust, not the Great Recession - having anywhere near so much foreshadowing. Sure, there were people during the .com boom saying "Umm, you know, 'eyeballs' don't pay the bills and at some point you need to actually make money" and during the 00s housing bubble "I'm not even sure lenders are using the 'can you fog a mirror' test anymore when making loans", but it wasn't particularly widespread, and large sections of the economic and policy elite actively downplayed the possibility of recession.
Now though, I see the exact opposite. Tech moguls, central bankers, VCs, etc. etc. have been warning "this is going to be really bad" now for months. But, in actuality, it's not that bad (yet). Yes, inflation is really bad, but unemployment was 3.6% in May. It still feels like many companies are having a very difficult time hiring and keeping workers.
So why the difference? I don't want to go into "conspiracy theory territory", but I do think it's pretty undeniable that there is a marked difference in "warning levels" between the current time and recessions in the past 40 years.
That's because this recession is going to be manufactured. Powell was clear: wages are rising too fast for the Fed's liking, so they are going to crash the ship into the rocks.
> According to a transcript of the presser published by the Wall Street Journal, Powell blamed this inflation crisis, which is global, not on the proxy war in Ukraine [1] and Western sanctions on Russia [2], but rather on U.S. workers supposedly making too much money.
This is basically Russian propaganda. The author and founder of that website has a history of sympathetic coverage of authoritarian regimes (https://en.wikipedia.org/wiki/The_Grayzone). It's also funny that these far-left journalists always cite their own previous articles. Powell mentions the war in Ukraine as contributing to inflation, but since he didn't call it a "proxy war" I guess it doesn't count.
> “Employers are having difficulties filling job openings, and wages are rising at the fastest pace in many years,” Powell complained.
LOL. Definitely sounds like a complaint to me. No editorializing here. That's why Powell said this later in the press conference:
> If you think about it, if you look at the last cycle, we had a very, very—longest expansion cycle in our recorded history, and in the last two, three years, you had the benefits of this tight labor market going to people in the lower quartiles and it was—you know, racial wealth and income—not wealth but income gaps were coming down, wage gaps. So it’s a really great thing. We’d all love to get back to that place, but to get back to anything like that place, you need price stability.
People are obsessed with real estate prices going up because low interest rates result in full employment, which means more people can afford to pay more for a house. The strange part is the argument that raising interest rates is somehow going to make housing more affordable. It doesn't solve the underlying problem of a lack of housing near job centres. An interest rate hike means less employment, and that is ultimately what drives house prices down. It doesn't make housing more affordable; there isn't enough housing to begin with, and now there won't be any jobs.
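To put rough numbers on the affordability point (purely illustrative figures, using the standard fixed-rate amortization formula): the same loan costs dramatically more per month after a rate hike, so prices would have to fall a long way before monthly affordability actually improves.

```python
# Illustrative only: monthly payment on a fixed-rate mortgage using the
# standard amortization formula M = P * r / (1 - (1 + r)**-n), where
# r is the monthly interest rate and n the number of monthly payments.
def monthly_payment(principal: float, annual_rate: float, years: int = 30) -> float:
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# A $500k loan at 3% vs 6%: the payment jumps by roughly 42%.
low = monthly_payment(500_000, 0.03)   # ~$2,108/month
high = monthly_payment(500_000, 0.06)  # ~$2,998/month
print(f"${low:,.0f} -> ${high:,.0f} (+{high / low - 1:.0%})")
```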
If you want to lower home prices you are going to need to stop land speculation and so far public land ownership has been the only way to do that e.g. in Singapore. The alternative is a land value tax.
This orthogonal problem needs orthogonal solutions. Money is about employment and trade, not about housing. Housing is a land allocation problem e.g. zoning.
It’s an old-fashioned sports league lockout. Players’ salaries are too high in the Fed’s opinion. Labor’s cut is approaching 60% [1]. I think the owners would prefer it at 50%, if not lower.
Seems like a weird theory considering the history of that graph over the last 70 years. Not really much variation considering how much the US economy has changed.
I don’t get it? In the grandparent comment you said the number would be pushed lower. But now you’re claiming it will actually be higher. Or maybe that it just won’t change much?
The Fed will want to push it back lower. Due to extremely low unemployment, it has been pushed up; they don’t update it often enough. It will probably show up in Jan 2022, then go down in ’23 and beyond if there’s a recession.
I think you're misremembering. The 2001 and 2007 crashes were very widely foretold. The current recession-risk consensus seems pretty weak to me right now. That's just perception, but the popular joke goes something like "economists have predicted 8 of the last 3 recessions".
The risk of an unexpected recession is much worse than a warning that doesn't come true so warnings are always pessimistic. Risk right now still feels 50/50. The next CPI report after the major Fed action will be watched very closely.
> I think you're misremembering. The 2001 and 2007 crashes were very widely foretold.
Hard disagree, at least by what I said in my comment about what "widely foretold" meant. I mean, the IMDb opening description of The Big Short starts with "When four outsiders saw what the big banks, media and government refused to..." Michael Burry, https://en.wikipedia.org/wiki/Michael_Burry, famously said he wasn't a super genius or anything, and was surprised that so few other folks saw the coming housing collapse like he did.
Again, my point is not that nobody could foresee that the recessions were coming, it's that the institutional "powers that be" - government, large corporations, VCs, etc. - actively downplayed the risk of recession. The exact opposite is happening now.
JP Morgan navigated the 2007 crisis and came out on top. They're saying recession risk is 50/50 and the S&P will end up positive for the year. There were definitely a lot of people in power and especially policy makers who put their own interests ahead of what the data tells them.
Burry is also a shameless self-promoter who has predicted a lot of disasters that haven't happened including WW3 a few years ago. There's like a dozen people who have made careers out of claiming to be the only ones that predicted the 2007 recession. If you keep predicting recessions you're bound to be right. In reality, it's just not possible to predict accurately. Here's Krugman in late 2006 presenting the data and putting the recession risk for 2007 at 2:1
Right now there are several indicators flashing and a lot that aren't. Technical data like P/E ratios, volatility, yield curves are great at predicting things that happened in the past but they just can't be relied on as being infallible.
Burry's famous because he placed a bet on a carefully researched observation connected to the sub-prime crisis before it was remotely known, and he was disbelieved.
He might have made other wrong predictions but to gloss over the above fact is a big omission.
There are the economists and there are people with skin in the game, I think there's a big difference if you're a wealth manager versus a NYT editorialist. Incidentally Krugman seems like a proponent for whatever blue tie is in power (https://www.nytimes.com/2022/01/04/opinion/2021-economic-rec...). That's different than placing bets in the tens of millions.
I'm not sure what that link is supposed to prove. 2021 was a pretty great year. Krugman is usually very sanguine about not ascribing credit policy or presidents and he didn't in that article. He doesn't disclose his holdings because he doesn't give investment advice.
And Burry definitely wasn't a rogue genius. Lots of hedge funds bet on the crash. Goldman Sachs was specifically called out for betting heavily on a crash while they were still selling mortgage assets to customers.
2021 was a fake year for growth, that's the point, because we were relying on cheap and plentiful money subsidized by borrowed time. We'll be paying the price for a long time for the spending and helicopter money that provided demand while supply was still limited. For Krugman, a monetarist, printing money when there were fewer goods and services in the economy didn't register as a problem. Ah, but he recently admitted he was wrong about the party line of "transient inflation" (while every private economist outside of academia and government knew it was not going to be transient and made moves at the end of 2021).
I'll give a slightly different view. Many of us who lived through the early 90s, the dot com bubble, and the great recession are worried about this because recessions happen pretty frequently (every 4-6 years). However, the last big downturn for the US was 2009.
While there was the Covid downturn, in general the economy has been on the upswing for 12 years, and the US Federal Reserve appears to have propped that up ever since the Great Recession.
So, many of us are worried that roosters are coming home to roost.
That doesn't mean it will happen, but our fear is mean-reversion.
>So, many of us are worried that roosters are coming home to roost.
According to Keynes, a recession only happens when the interest rate on financial capital exceeds the maximum yield of physical capital. That implies that if the interest rate is set properly, you would expect recessions to never happen. Economic cycles are equivalent to oscillations in control theory, and those are usually a sign that you are doing something wrong.
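The control-theory analogy can be sketched with a toy proportional controller (purely illustrative, not an economic model): a correction proportional to the error converges smoothly at a modest gain, but overshoots and rings when the gain is cranked too high.

```python
# Toy proportional controller driving an error toward zero. With a
# gain below 1 the error decays monotonically; with a gain above 1
# every correction overshoots the target, so the system oscillates,
# which is the "sign you are doing something wrong" from control theory.
def simulate(gain: float, error: float = 1.0, steps: int = 10) -> list[float]:
    trajectory = [error]
    for _ in range(steps):
        error -= gain * error  # correction proportional to current error
        trajectory.append(error)
    return trajectory

gentle = simulate(0.5)      # 1.0, 0.5, 0.25, ... smooth decay
aggressive = simulate(1.8)  # 1.0, -0.8, 0.64, ... rings around zero
```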
The expectation that when things are too good something bad must eventually happen is completely misguided, and can at best be explained by having a medium of exchange that is incapable of conducting some of the transactions that humans would like to engage in and are currently engaging in.
Barter is unable to represent a lot of useful transactions. Money without credit is unable to represent a lot of useful transactions. Money with credit is unable to represent a lot of transactions. The next stage, money with credit and inflation, is able to represent more transactions than money without inflation, and so on.
This means that if anyone tells you to go back and adopt a system that allows fewer transactions, they mistakenly believe that the transactions you have conducted are somehow sinful/immoral or simply shouldn't have happened, and hence adopting their system will require undoing a lot of transactions, which they consider akin to divine punishment for going against the laws of their preferred money system.
It is particularly common with Austrian economists who insist on going back to gold currency. However, because gold is unable to support a lot of transactions that we today take for granted, all the growth that happened because we abandoned gold shouldn't have happened according to them, and we must now pay for going against the gold standard.
It is complete nonsense. If demurrage currency sits at the apex of representing the most transactions, then it is entirely plausible that this whole economic cycle and crisis business was pointless and inefficient to begin with, and that booms and busts should only ever occur for the reasons described by real business cycle theory, which as it stands explains only a subset of recessions, not most of them, because it is about external economic shocks, e.g. in oil and gas prices.
The expectation is that interest rates will be increased to reduce inflation; however, this increase is also expected to cause a recession. The hope is that this recession would be smaller than the one that might otherwise follow later from high inflation. The theory was the same in the ’70s between Carter and Reagan. This time around, some think that raising rates won’t work in the same way to curb inflation, due to differing theories about its cause. Needless to say, these things are hard to predict.
People knew there were dotcom stock bubbles (dotcom bubble) and real estate bubbles (GFC). People knew during the gas crisis that the supply shock sent inflation skyrocketing (70s). People knew that wildcat banking would cause insolvency and runs (tons of recessions and panics in the 19th century), people knew that unrestricted lending would cause the mother of all long squeezes (Great Depression).
In this case people knew that overly dovish monetary and fiscal policy would overstimulate demand. That leads to inflation, and the Fed’s job is to balance inflation with employment; since the unemployment rates they look at are very low, that means it’s time for them to raise rates to curb inflation by reducing aggregate demand. This has a knock-on effect of cooling asset prices, since credit becomes more expensive and harder to get, so the market deleverages.
One difference is that expectations are set but interest rates/the money “printed” due to overly dovish policy are still in the process of being changed. So stocks are sold off and people are planning for a recession, but layoffs and the actual economic cooling haven’t happened yet.
It’s entirely possible that the Fed is able to cool inflation easily (it could also resolve itself if Russia stops fighting or China stops trying to do zero-COVID) without much of a recession, but raising rates lowers demand, so since rates may go much higher than they are, a recession is likely. In terms of assets, rising rates and a subsequent recession involving lowered rates are already priced in.
I've had 3 folks respond now that "hey, people knew these bubbles were coming", so apologies that my original point wasn't clear.
I wholeheartedly agree that some people saw the crashes coming, and honestly, I don't even think they were that hard to spot. I'm a pretty big fan of Jeremy Grantham, who considers himself a "bubble historian", who points out that while the timing of when bubbles pop is almost impossible to determine, the fact that one sector is in a bubble is not.
But that said, my point is that the institutional powers that be were very quiet, or in many cases actively argued against the possibility of a recession, in both the .com and Great Recession cases. In the present day the exact opposite appears to be happening. The seats of institutional power (central bankers, execs of large corporations, rich VCs, etc.) have been talking about the sky falling for a while now. That is what is really different.
> I don't remember any recession having anywhere near so much foreshadowing.
The 2008 downturn had tons of warnings, everyone I knew who had any visibility into product shipping and future orders was talking about the big slowdown in orders, across all kinds of industries.
Yeah it was the tail end of a boom in Ireland, and I remember being in a hotel during the week and every conference was public sector, zero private companies. That's when I knew things were gonna be bad.
> All this panic about tech layoffs seems a little premature given we haven’t even begun to see the impact of rate hikes on quarterly earnings yet. Many companies still have open reqs and budgets to keep hiring.
I think that's the main point. Tech is going nowhere, it's crucial for the future for most countries around the globe.
Didn’t Coinbase cut its marketing budget? Advertising companies are going to feel that trend immediately, even if the quarterly numbers aren’t out yet.
Navigation, for example. In react-navigation, stack and native-stack may look very similar but have tons of hidden implications.
If you are presenting a screen modally on iOS, for example, you cannot present a Modal component over it. This requires that you understand (or at least research) platform-specific behavior.
Flexibility is what keeps software engineering salaries going up. Over the course of my career I’ve done firmware, desktop software, mobile and web development for a variety of industries. All on the back of a 4 year computer science degree.
Example industries I’ve worked in: pro audio (Avid), consumer hardware (Apple), social networks (Facebook), medical marijuana, education, cryptocurrency (Coinbase).
My work has impacted tens of millions of people and generated massive revenues for the companies I’ve worked for.
Sadly, chemical engineers and pharmacists have very specialized jobs that often require master’s degrees and hence don’t have the same career flexibility.
This friction in career switching, combined with the cost of education in the USA, is a big problem. There are few professions that offer the same bang for your buck as software engineering. We are a very privileged and lucky group of people.
The macro economic setup doesn’t look good for small tech or growth stocks in general. Big tech I’m less worried about.
We have central banks raising interest rates to fight inflation caused by wars and by the massive monetary and fiscal stimulus deployed to fight Covid. These hikes impact the valuation multiples used to value tech companies and unprofitable startups.
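The mechanism in a toy discounted cash flow (all numbers hypothetical): growth companies' cash flows sit far in the future, and a distant cash flow's present value is far more sensitive to the discount rate than a near-term one, which is why rate hikes hit growth multiples hardest.

```python
# Toy DCF illustration: present value of a single $100 cash flow
# arriving `years` from now, discounted at `rate`. The further out
# the cash flow, the harder a rate hike compresses its value today.
def present_value(cash: float, rate: float, years: int) -> float:
    return cash / (1 + rate) ** years

# Raising the discount rate from 2% to 5%:
# a dollar one year out barely notices...
print(f"1y:  {present_value(100, 0.02, 1):.1f} -> {present_value(100, 0.05, 1):.1f}")
# ...while a dollar ten years out loses about a quarter of its value.
print(f"10y: {present_value(100, 0.02, 10):.1f} -> {present_value(100, 0.05, 10):.1f}")
```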
We have a supply chain shock caused by the war in Ukraine, the aftermath of Covid, and China's second lockdown, which could lead to stagflation.
The only silver lining I see is that big tech prints money, is in many areas essential for cost reduction and has excellent balance sheets.
In summary, I’m more worried about the general economy teardown than the tech teardown, especially in the latter half of this year and 2023.