Hacker News
wmf on Dec 22, 2017 | on: Nvidia’s New Policy Limits GeForce Data Center Usa...
The fact that "humps" exist implies that quite a few people are putting GeForces in servers and not talking about it.
https://www.servethehome.com/avert-your-eyes-from-the-server...
Rapzid on Dec 22, 2017
Plenty of people. You can find tons of articles on GeForce 1080 Ti-based deep learning box builds.
jamesblonde on Dec 22, 2017
Here's a figure on the scale-out you can get on the DeepLearning11 server (cost $15k) - it's about 60-80% of the performance of a DGX-1 for 1/10th of the price (for deep ConvNets and image classification, at least).
https://www.oreilly.com/ideas/distributed-tensorflow
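To give a concrete sense of what multi-GPU scale-out on a box like the DeepLearning11 looks like in code, here's a minimal data-parallel training sketch. It assumes TensorFlow's tf.distribute.MirroredStrategy API and a toy ResNet-50 setup with random data; that's an illustrative stand-in, not necessarily the approach benchmarked in the linked article.

    import tensorflow as tf

    # Data-parallel training across all local GPUs: each replica processes a
    # slice of every batch and gradients are averaged across replicas.
    strategy = tf.distribute.MirroredStrategy()
    print("Replicas in sync:", strategy.num_replicas_in_sync)

    # Toy image-classification data (random tensors, purely illustrative).
    images = tf.random.normal([64, 96, 96, 3])
    labels = tf.random.uniform([64], maxval=10, dtype=tf.int32)
    dataset = tf.data.Dataset.from_tensor_slices((images, labels)).batch(16)

    with strategy.scope():
        # Variables created under the scope are mirrored onto every GPU.
        model = tf.keras.applications.ResNet50(weights=None,
                                               input_shape=(96, 96, 3),
                                               classes=10)
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])

    model.fit(dataset, epochs=1)

With synchronous data parallelism like this, per-step throughput grows roughly with the number of GPUs until communication (gradient averaging over PCIe/NVLink) becomes the bottleneck, which is where the "60-80% of a DGX-1" style comparisons come from.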