
I think the statement "I'd be much happier with a Studio" is a little hypothetical? Sorry if that's not true, but everywhere I've looked, these don't seem to be ML training chips; people are just hoping they'll handle LLMs well.

You can absolutely build (with real support from the PyTorch folks) a 4x3090 deep learning workstation with 96 GB of VRAM (4 × 24 GB) for roughly $7k. Or, more likely, you'll rent an A100 from AWS for ~$0.15/hr.
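For what it's worth, here's a minimal sketch of what that PyTorch support looks like on a 4x3090 box: one process per GPU with DistributedDataParallel, launched via torchrun. The model and batch here are placeholders, not anything specific to the setup above.

    # run with: torchrun --nproc_per_node=4 train.py
    import os
    import torch
    import torch.distributed as dist
    import torch.nn as nn
    from torch.nn.parallel import DistributedDataParallel as DDP

    def main():
        # torchrun sets LOCAL_RANK for each of the 4 spawned processes
        dist.init_process_group(backend="nccl")
        local_rank = int(os.environ["LOCAL_RANK"])
        torch.cuda.set_device(local_rank)

        model = nn.Linear(4096, 4096).cuda(local_rank)   # placeholder model
        model = DDP(model, device_ids=[local_rank])

        opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
        x = torch.randn(8, 4096, device=local_rank)      # placeholder batch
        loss = model(x).pow(2).mean()
        loss.backward()    # gradients all-reduced across the 4 GPUs
        opt.step()

        dist.destroy_process_group()

    if __name__ == "__main__":
        main()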


