Hacker News

I’m studying neuroscience but am very interested in how AI works. I’ve read up on the old-school material, but phrases like "memory graph" and "energy minimization" are new to me. What modern papers or articles would you recommend for folks who want to learn more?


Someone put this link up on another discussion the other day and I found it really fascinating:

https://bbycroft.net/llm

I believe energy minimization is literal: just look at the size of that thing and imagine the power bill.


For phrases, Google's ML glossary [0] is a good resource, though it doesn't cover certain subsets of AI (and is mostly focused on TensorFlow).

[0] https://developers.google.com/machine-learning/glossary


If you are in neuroscience, I'd recommend looking into neural radiance field (NeRF) rendering as well. I find it fascinating since it's essentially an over-fitted neural network: one network is trained to memorize a single scene.
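To make the "over-fitted network" point concrete, here is a toy sketch of my own (not from the thread, and much simpler than a real NeRF, which maps 3D position and view direction to color and density): a small coordinate network deliberately over-fitted to memorize one tiny 2D "image", i.e. the network acts as a lossy lookup table for a single scene.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target "scene": an 8x8 grayscale image defined on [0,1]^2 coordinates.
n = 8
ys, xs = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
coords = np.stack([xs.ravel(), ys.ravel()], axis=1)           # (64, 2)
target = np.sin(6 * coords[:, 0]) * np.cos(6 * coords[:, 1])  # (64,)

# One-hidden-layer MLP, trained with plain gradient descent on MSE.
hidden = 64
W1 = rng.normal(0, 1.0, (2, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0, 0.1, (hidden, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

_, pred0 = forward(coords)
loss0 = np.mean((pred0 - target) ** 2)   # loss before training

lr = 0.1
for _ in range(2000):
    h, pred = forward(coords)
    err = (pred - target)[:, None] / len(coords)  # dLoss/dpred (up to a constant)
    # Backprop through the two layers.
    gW2 = h.T @ err
    gb2 = err.sum(0)
    dh = err @ W2.T * (1 - h ** 2)                # tanh' = 1 - tanh^2
    gW1 = coords.T @ dh
    gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(coords)
loss = np.mean((pred - target) ** 2)
print(loss0, loss)  # memorization: the training loss falls well below the start
```

There is no train/test split on purpose: unlike most ML, "generalization" here just means interpolating views of the one scene the network has memorized.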


He is most likely referring to some sort of free energy minimization.
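In case the phrase is unfamiliar: the "free energy" usually meant in this context (e.g. in Friston's free-energy principle, which is popular in neuroscience, and in variational inference in ML) is the variational free energy of an approximate posterior $q(z)$ over latent causes $z$ of data $x$:

$$
F[q] \;=\; \mathbb{E}_{q(z)}\big[\ln q(z) - \ln p(x, z)\big]
\;=\; D_{\mathrm{KL}}\big(q(z)\,\|\,p(z \mid x)\big) \;-\; \ln p(x).
$$

Since the KL term is non-negative, $F \ge -\ln p(x)$, so minimizing $F$ over $q$ both tightens a bound on the "surprise" $-\ln p(x)$ and pulls $q(z)$ toward the true posterior $p(z \mid x)$.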



