
Yep, that's a GCN, and probably the most popular GNN is indeed an RGCN :)
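For anyone who hasn't seen one: a GCN layer is just neighborhood averaging with a learned projection, H' = σ(D^-1/2 (A+I) D^-1/2 H W). Here's a minimal numpy sketch (the toy graph, features, and weights are all made up for illustration):

```python
import numpy as np

# One GCN layer (Kipf & Welling style): symmetric-normalized
# adjacency times features times a weight matrix, then ReLU.
def gcn_layer(A, H, W):
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^-1/2
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ H @ W, 0.0)  # ReLU

# toy 4-node path graph, 3-dim input features, 2-dim output
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
W = rng.normal(size=(3, 2))
out = gcn_layer(A, H, W)
print(out.shape)  # (4, 2)
```

An RGCN is the same idea but with a separate weight matrix per edge type, summing the per-relation messages before the nonlinearity.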

The transformer comment is interesting. They're very close, and in general the tricks people use elsewhere are getting translated to GNNs: convolution, attention, ... . But scaling is still happening: the past couple of years have seen folks working at the 1M-1B scale, though not yet LLM scale. Critically, the scaling work is relatively recent -- good GPU implementations, good samplers, etc. -- and on a good trajectory.
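The attention translation is basically GAT: instead of averaging neighbors uniformly like a GCN, each node softmax-weights its neighbors. A single-head sketch in numpy (toy graph and weights are made up; a real implementation would vectorize the edge loop):

```python
import numpy as np

# One single-head GAT-style attention layer: each node's update is an
# attention-weighted sum over its neighbors (plus itself).
def gat_layer(A, H, W, a):
    Z = H @ W                          # projected features, shape (n, d')
    n = A.shape[0]
    A_self = A + np.eye(n)             # include self-loops
    e = np.full((n, n), -np.inf)       # logits; -inf masks non-edges
    for i in range(n):
        for j in range(n):
            if A_self[i, j] > 0:
                s = a @ np.concatenate([Z[i], Z[j]])
                e[i, j] = s if s > 0 else 0.2 * s  # leaky ReLU
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)      # softmax per neighborhood
    return alpha @ Z

# toy 3-node triangle graph
A = np.ones((3, 3)) - np.eye(3)
rng = np.random.default_rng(1)
H = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 2))
a = rng.normal(size=4)                 # attention vector, length 2 * d'
out = gat_layer(A, H, W, a)
print(out.shape)  # (3, 2)
```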

We tracked GNNs for years but stayed away until heterogeneity + scaling started to get realistic for commercial workloads, and that's finally happening. Major credit to folks like DeepMind, Michael Bronstein, early GraphSAGE / Jure, various individual researchers, and now AWS + Nvidia engineers for the practical engineering evolutions here.
