In ML, normalization is usually done by subtracting the mean and dividing by the standard deviation - I haven't seen CDF-based normalization in ML (?; it is popular in finance for copulas: https://en.wikipedia.org/wiki/Copula_(statistics) ). It yields more uniform distributions, allowing better description with smaller models, which seems beneficial for generalization (e.g. description with low-degree polynomials in this arXiv).
For which ML tasks could CDF/EDF normalization be beneficial? Any reason it seems unknown in ML?
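A minimal sketch of the contrast (variable names are my own, not from any reference): z-score normalization preserves the shape of the distribution, while EDF (empirical CDF) normalization is rank-based and maps any continuous feature to an approximately uniform distribution on (0,1):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.lognormal(size=1000)          # heavy-tailed feature

# z-score normalization: standard in ML, still heavy-tailed afterwards
z = (x - x.mean()) / x.std()

# EDF normalization: replace each value by its (shifted) rank fraction
ranks = x.argsort().argsort()          # 0..n-1 rank of each sample
u = (ranks + 0.5) / len(x)             # midpoint convention avoids exact 0 and 1
```

After this transform `u` is uniform on (0,1) by construction, regardless of the input distribution, so a low-degree polynomial model has an easier job describing it.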
Sure, biological NNs are much more complicated, but action potential propagation can basically travel in both directions, and evolution should optimize to exploit that.
In contrast, current ANNs are focused on unidirectional propagation and are much worse at training from single samples - to reach the abilities of biological networks, maybe it is worth starting to think about multidirectional propagation?
Neurons containing a joint distribution model can propagate conditional distributions in various directions, and such a model is not that difficult to represent - maybe something like that could be hidden in biological neurons (?)
However, it degenerates to ~KAN if restricted to pairwise dependencies (one can consciously add triplewise and higher), and it gives many new possibilities, like multidirectional propagation of values or probability distributions, with novel additional training approaches, e.g. through tensor decomposition.
(Multidirectional) biological neural networks are no longer superior in the MNIST benchmark ... but they still are in e.g. consciousness, or the ability to learn from single examples.
And no, recreating it is not a task a single person can complete.
Just represent the joint density for each neuron as a linear combination - then you can inexpensively propagate in both directions, e.g. as E[X|Y,Z] or E[Y,Z|X], by substituting and normalizing ... the formulas turn out quite simple - this could be hidden in the dynamics of (bidirectional) biological NNs ...
And for pairwise distributions it becomes ~KAN, which turned out quite successful ... so we are talking about its extension: adding more possibilities, like triplewise dependencies and multidirectional propagation.
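A rough sketch of the "substitute and normalize" idea, under an assumed degree-2 orthonormal polynomial basis on [0,1] (my own toy choice, not the exact parametrization of the arXiv paper): estimate the coefficient tensor of the joint density as averages of basis products, then get E[X|Y,Z] by substituting the known values and normalizing the resulting 1D density:

```python
import numpy as np

# Orthonormal polynomial basis on [0,1] (rescaled Legendre):
# f0 = 1, f1 = sqrt(3)(2x-1), f2 = sqrt(5)(6x^2 - 6x + 1)
def basis(x):
    x = np.asarray(x, dtype=float)
    return np.stack([np.ones_like(x),
                     np.sqrt(3.0) * (2*x - 1),
                     np.sqrt(5.0) * (6*x**2 - 6*x + 1)], axis=-1)

rng = np.random.default_rng(0)
n = 20000
# toy dependent data in [0,1]^3: x positively correlated with y and z
y = rng.random(n); z = rng.random(n)
x = np.clip(0.5*y + 0.3*z + 0.2*rng.random(n), 0, 1)

# "training": coefficient tensor a[i,j,k] = mean of f_i(x) f_j(y) f_k(z)
fx, fy, fz = basis(x), basis(y), basis(z)
a = np.einsum('ni,nj,nk->ijk', fx, fy, fz) / n

def cond_expectation(y0, z0, grid=np.linspace(0, 1, 201)):
    # substitute: rho(x | y0, z0) is proportional to sum_i c_i f_i(x)
    c = np.einsum('ijk,j,k->i', a, basis(y0), basis(z0))
    rho = np.clip(basis(grid) @ c, 1e-9, None)  # crude calibration: clip negatives
    rho /= rho.mean()                           # normalize to a density on [0,1]
    return (grid * rho).mean()                  # E[X | Y=y0, Z=z0]
```

The same tensor `a` can be reused in the other direction (substitute x, read off y and z), which is the "multidirectional" part; the estimation of `a` by plain averaging is what makes cheap alternative training schemes conceivable.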
While ANNs are rather trained for unidirectional propagation, action potential propagation in biological neurons is symmetric, e.g. "it is not uncommon for axonal propagation of action potentials to happen in both directions" ( https://journals.aps.org/pre/abstract/10.1103/PhysRevE.92.03... ).
Also, while current ANNs use guessed parametrizations, what is objectively available is the joint distribution - biological neurons should be evolutionarily optimized to exploit it, and representing it is relatively simple in the approach from this arXiv.
Such joint distribution neurons bring additional training approaches - maybe some of them are used by biological neural networks?
This is an enhancement of standard one-way quantum computers (1WQC) by adding a missing operation: a "reversed" version (a CPT-symmetry analog) of state preparation (e.g. pull/push, negative/positive pressure, stimulated emission/absorption). There are many arguments that it should be possible, but it needs experimental confirmation.
In theory this enhancement allows attacking any NP problem, but it would likely be more difficult than Shor's factorization - so it might be completely impractical (?)
There are many interesting new questions - like the theoretical complexity class for such 2WQC: somewhere between NP and PSPACE?
Indeed, baryogenesis is one example of baryon number violation; another is Hawking radiation - baryons forming stars, collapsing into black holes, and finally ending as massless radiation.
In both of these examples the conditions are completely extreme - that might be the missing factor in human attempts to observe it directly, in room-temperature water.