Publications
Theory II: Deep learning and optimization. Bulletin of the Polish Academy of Sciences: Technical Sciences 66 (2018).
An Overview of Some Issues in the Theory of Deep Networks. IEEJ Transactions on Electrical and Electronic Engineering 15, 1560–1571 (2020).
Associative Memory as the Core of Intelligence in Technology and Evolution. (2026).
Computational role of eccentricity dependent cortical magnification. (2014).
Deep Learning: Mathematics and Neuroscience. A Sponsored Supplement to Science: Brain-Inspired Intelligent Robotics: The Intersection of Robotics and Neuroscience, 9–12 (2016).
Complexity Control by Gradient Descent in Deep Networks. Nature Communications 11 (2020).
How Deep Sparse Networks Avoid the Curse of Dimensionality: Efficiently Computable Functions are Compositionally Sparse. (2022).
Is Research in Intelligence an Existential Risk? (2014).
Theory I: Deep networks and the curse of dimensionality. Bulletin of the Polish Academy of Sciences: Technical Sciences 66 (2018).