

In ML, normalization is usually done by subtracting the mean and dividing by the standard deviation - I haven't seen CDF-based normalization in ML (?, it is popular in finance for copulas: https://en.wikipedia.org/wiki/Copula_(statistics) ). It produces more uniform distributions, allowing better description with smaller models, which seems beneficial for generalization (e.g. description with low-degree polynomials in this arXiv).

For which tasks could CDF/EDF normalization be beneficial in ML? Any reasons it seems unknown in ML?

Any other interesting nonstandard normalizations?
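The CDF/EDF normalization mentioned above fits in a few lines - a minimal sketch (my own illustration; the `edf_normalize` name is made up), mapping any sample to approximately uniform ranks in (0, 1) via the probability integral transform:

```python
import numpy as np

def edf_normalize(x):
    """Map each value to its empirical CDF rank in (0, 1).

    The result is ~uniform on (0, 1) regardless of the original
    distribution, unlike mean/std standardization.
    """
    n = len(x)
    ranks = np.argsort(np.argsort(x))  # rank 0 .. n-1 of each value
    return (ranks + 0.5) / n           # shift by 0.5 to avoid exact 0 and 1

rng = np.random.default_rng(0)
sample = rng.lognormal(size=1000)      # heavily skewed input
u = edf_normalize(sample)
print(u.min() > 0, u.max() < 1)        # all values stay strictly inside (0, 1)
```

Double-argsort ranking is the quick trick here; for tie-aware ranking one would reach for scipy.stats.rankdata instead.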



Sure, biological NNs are much more complicated, but action potential propagation can basically travel in both directions, and evolution should optimize for that.

In contrast, current ANNs are focused on unidirectional propagation, and are much worse at training from single samples - to reach the abilities of biological networks, maybe it is worth starting to think about multidirectional propagation?

Neurons containing a joint distribution model can propagate conditional distributions in various directions, and it is not that difficult to represent - maybe something like that is hidden in biological neurons (?)


There are a dozen papers on this methodology (e.g. at the end of https://community.wolfram.com/groups/-/m/t/3017754 ), but not as ANNs.

However, it degenerates to ~KAN if restricted to pairwise dependencies (one can consciously add triplewise and higher), and it gives many new possibilities, like multidirectional propagation of values or probability distributions, with novel additional training approaches such as through tensor decomposition.


When I asked if this has been tested, I meant as an ANN on conventional benchmarks. Sorry if that wasn't clear.

There are a lot of ideas that are clever and seem promising... but fail to perform well on such benchmarks.

Is there a github repo with code available?


Which benchmarks for multidirectional neurons? To compare with which approaches?

Biological neurons are the multidirectional ones, but I don't know how to compare with them?


Can you show the world this can be made to work for, say, a toy benchmark like MNIST classification?

---

To be 100% clear: My question about practical application today is orthogonal to the question about whether this research is worth pursuing!


(Multidirectional) biological neural networks are no longer superior on the MNIST benchmark ... but they still are e.g. in consciousness, or in being able to learn from single examples.

And no, recreating it is not a task a single person can complete.


Alright. I've added your preprint to my reading list, so I can take a closer look at this.


Just represent the joint density for each neuron as a linear combination - then you can inexpensively propagate in both directions, e.g. as E[X|Y,Z] or E[Y,Z|X], by substituting and normalizing ... the formulas turn out quite simple - this could be hidden in the dynamics of (bidirectional) biological NNs ...

And for pairwise distributions it becomes ~KAN, which turned out quite successful ... so we are talking about its extension: adding more possibilities, like triplewise dependencies and multidirectional propagation.
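A minimal numeric sketch of such a joint distribution neuron (my own illustration of the idea, not code from the paper; the degree-2 polynomial basis, the grid integration, and all names are assumptions): coefficients are just averages of products of basis functions over the sample, and the same fitted neuron then propagates conditional expectations in either direction by substituting the known value and normalizing:

```python
import numpy as np

def basis(x):
    """First three orthonormal (shifted Legendre) polynomials on [0, 1]."""
    return np.stack([np.ones_like(x),
                     np.sqrt(3) * (2 * x - 1),
                     np.sqrt(5) * (6 * x**2 - 6 * x + 1)])

def fit(x, y):
    """a[j, k] = mean of f_j(x) * f_k(y): MSE-optimal density coefficients."""
    return basis(x) @ basis(y).T / len(x)

def conditional_mean(a, value, given="y"):
    """E[X | Y=value] (or E[Y | X=value]): substitute, clip, normalize, integrate."""
    grid = np.linspace(0, 1, 1001)
    fv = basis(np.array([value]))[:, 0]           # basis evaluated at the condition
    coeffs = a @ fv if given == "y" else fv @ a   # substitution step
    dens = np.maximum(coeffs @ basis(grid), 0.0)  # truncated density, clipped to >= 0
    dens /= dens.sum()                            # normalization step
    return float(grid @ dens)

# correlated toy sample on [0, 1]^2
rng = np.random.default_rng(1)
x = rng.uniform(size=5000)
y = np.clip(x + rng.normal(scale=0.1, size=5000), 0, 1)
a = fit(x, y)
# one fitted neuron, both propagation directions:
print(conditional_mean(a, 0.8, given="y"))   # E[X | Y=0.8], well above 0.5
print(conditional_mean(a, 0.2, given="x"))   # E[Y | X=0.2], well below 0.5
```

Restricting such coefficient tensors to pairwise slices would give the ~KAN-like special case; the point of the sketch is that nothing about the representation prefers one propagation direction.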


While ANNs are rather trained for unidirectional propagation, action potential propagation in biological neurons is symmetric, e.g. "it is not uncommon for axonal propagation of action potentials to happen in both directions" ( https://journals.aps.org/pre/abstract/10.1103/PhysRevE.92.03... ).

Also, while current ANNs use guessed parametrizations, what is objectively available is the joint distribution - biological neurons should be evolutionarily optimized to exploit it, and it is relatively simple in the approach from this arXiv.

Such joint distribution neurons bring additional training approaches - maybe some of them are used by biological neural networks?


Talk: https://www.youtube.com/watch?v=pv95hvSdA3c

This is an enhancement of standard one-way quantum computers (1WQC), adding the missing operation: a "reversed" version (CPT symmetry analog) of state preparation (e.g. pull/push, negative/positive pressure, stimulated emission/absorption). There are many arguments that it should be possible, but it needs experimental confirmation.

In theory this enhancement allows attacking any NP problem, but it will likely be more difficult than Shor's factorization - so it might be completely impractical (?)

There are many interesting new questions - like the theoretical complexity class for such 2WQC, somewhere between NP and PSPACE?

Different approaches to NP might also appear, so maybe it would be safer to start thinking about cryptography based on PSPACE? https://en.wikipedia.org/wiki/PSPACE-complete


Slides: https://www.dropbox.com/scl/fi/1fymjwd7obq8sdv1e661f/two-way...

Article: https://www.researchgate.net/publication/372677599_Two-way_q...



For more than half a century there have been unsuccessful attempts to unify the Standard Model with gravity ... doesn't that mean at least one of them is inaccurate?

https://en.wikipedia.org/wiki/Quantum_gravity#Nonrenormaliza...


Indeed, baryogenesis is one example of baryon number violation; another is Hawking radiation - baryons forming stars and finally ending as massless radiation.

In both these examples the conditions are completely extreme - which might be the missing factor in human attempts to directly observe it in room-temperature water.

I would rather expect it e.g. in the centers of some neutron stars - especially those thousands of times brighter than possible with standard explanations, like: https://www.space.com/bizare-object-10-times-brighter-than-s...

While electric charge is ultimately conserved due to Gauss's law, there is no Gauss law for baryon number.


Lots of talks about topological defects including hopfions: http://solitonsatwork.net/?display=archive

Topological defect framework for liquid crystal based particle models: https://community.wolfram.com/groups/-/m/t/2856493

