Hacker News

I understand that this is Google's library, that they're promoting TensorFlow's usage and, by extension, the usage of TPUs.

But this can just as easily be done in PyTorch, right? Yes, we can't use TPUs there, but having your model train 50% slower (an exaggeration) is better than spending 150% of the equivalent PyTorch time debugging anything in TensorFlow.

I may be beating a dead horse here, but why doesn't Google just accept that TensorFlow needs to be redesigned for ease of use?

Can anyone point out the benefits of TensorFlow over PyTorch (besides TPUs)? It's been a few years since I've used TF, and I may have missed something.



Have you used TF 2.0? It's definitely better in terms of ease of use, though still not as good as PyTorch, in my opinion. PyTorch is just more... ergonomic, I would say.

However, I don't think the PyTorch ecosystem really matches TensorFlow's yet for production, with TFX and all the other nice-to-haves that Google and others have open-sourced.

As someone who spends a lot of their time doing POCs and research work, I strongly prefer PyTorch. But I have colleagues who mainly productionize language models, and they all seem to like TensorFlow. I don't know if that's just inertia on their part or the result of a considered choice, however.


TFX/TensorFlow Serving doesn't have a PyTorch equivalent. TF has better mobile support. Jax also works on TPUs and, with libraries on top like Haiku, is closer to PyTorch.
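To the Jax point: a minimal sketch of what makes it feel PyTorch-like (assuming jax is installed; the toy loss and shapes are made up for illustration). jax.grad turns a plain Python loss function into a gradient function, which reads a lot like eager-mode autograd:

```python
import jax
import jax.numpy as jnp

# Toy mean-squared-error loss for a linear model y ≈ x @ w.
def loss(w, x, y):
    pred = x @ w
    return jnp.mean((pred - y) ** 2)

# jax.grad differentiates with respect to the first argument (w),
# returning an ordinary function you call like any other.
grad_fn = jax.grad(loss)

w = jnp.zeros(3)        # weights
x = jnp.ones((4, 3))    # batch of 4 inputs
y = jnp.ones(4)         # targets

g = grad_fn(w, x, y)
print(g.shape)  # (3,)
```

The same function runs unchanged on CPU, GPU, or TPU; Haiku just layers a module/parameter abstraction on top.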

But framework wars are like language wars :) There probably aren't many productive arguments that haven't already been made.



