From: mk_thisisit
Although advanced AI is built on neural networks, what is most interesting and distinctive about those networks is how they differ from biology [00:00:00]. An important area for future development is applying neural networks within biological fields [00:00:17].
Imitation vs. Essence
Early AI research focused on understanding the knowledge hidden within networks rather than on where that knowledge came from [00:00:55]. This approach was crucial for the progression from Hopfield networks to advanced AI [00:01:01].
The Hopfield network model, published in 1982 by John J. Hopfield, combined statistical physics with neurobiology and laid the groundwork for modern deep learning [00:01:25]. Hopfield’s work is recognized alongside that of pioneers like Turing and Hinton for fundamental discoveries that enable machine learning with neural networks [00:01:16].
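The model's core idea, memory as attractor dynamics, can be sketched in a few lines. This is a minimal illustrative implementation (the function names and parameters are ours, not from the talk): patterns are stored with a Hebbian rule, and a corrupted pattern is recovered by repeatedly aligning each neuron with its local field.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian storage: sum of outer products, zero self-coupling."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / n

def recall(W, state, sweeps=5):
    """Deterministic asynchronous updates: each neuron in turn aligns
    with its local field until the state settles in an attractor."""
    s = state.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s
```

Storing a single ±1 pattern and flipping one bit, `recall` drives the corrupted state back to the stored memory, which is an attractor of the dynamics.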
The development of artificial neural networks has benefited more from imitating biology's essence than from slavishly copying its every minute detail [00:05:51], [00:06:03]. This approach captures the core of a phenomenon rather than all of its intricacies [00:06:44]. Slavish imitation, by contrast, consumes computing resources while providing little insight [00:07:01]. The goal is something universal, akin to phase-transition systems that aid in understanding biology, physics, and collective systems [00:08:25].
Neuronal Activity and Computation
Before Hopfield networks, descriptions of brain activity emphasized tracking the detailed timing of action potentials in individual neurons [00:09:06]. A significant realization was the need to understand computation within physical systems, and the dynamics of large classical systems with many coupled variables, rather than only the activities of their basic elements [00:09:36], [00:10:05]. The aim is to grasp the dynamics within a multi-dimensional state space, moving beyond a view that limits computation to mere logic [00:10:11].
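The "computation as dynamics" view can be made concrete: when the couplings between variables are symmetric, the system has an energy function that never increases under asynchronous updates, so every trajectory through the multi-dimensional state space flows downhill into an attractor. A minimal numerical sketch (the random couplings here are purely illustrative):

```python
import numpy as np

def energy(W, s):
    # Hopfield-style energy: E = -1/2 * s^T W s (symmetric W, zero diagonal)
    return -0.5 * s @ W @ s

rng = np.random.default_rng(42)
n = 20
A = rng.normal(size=(n, n))
W = (A + A.T) / 2.0            # symmetric couplings between variables
np.fill_diagonal(W, 0)

s = rng.choice(np.array([-1, 1]), size=n)
energies = [energy(W, s)]
for _ in range(200):           # random asynchronous updates
    i = rng.integers(n)
    s[i] = 1 if W[i] @ s >= 0 else -1
    energies.append(energy(W, s))
```

Each update flips a single variable toward its local field, which provably never raises the energy, so the recorded trajectory is monotonically non-increasing.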
Another pivotal moment was the realization that the biological and psychological views of memory are isomorphic to physicists’ claims about spin and magnetism in spin systems [00:10:37]. This led to creating models that coherently combined these similar concepts, demonstrating that the same equations could be derived from different perspectives like biology and engineering [00:10:56].
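The isomorphism can be written down explicitly. The energy of a Hopfield network of binary neurons $s_i = \pm 1$ with symmetric weights $w_{ij}$ has the same form as the Ising spin-glass Hamiltonian over spins $\sigma_i = \pm 1$ with couplings $J_{ij}$:

```latex
E = -\frac{1}{2}\sum_{i \ne j} w_{ij}\, s_i s_j
\qquad \longleftrightarrow \qquad
H = -\sum_{i < j} J_{ij}\, \sigma_i \sigma_j
```

Stored memories correspond to low-energy spin configurations, which is why the same equations emerge whether one starts from neurobiology or from the physics of magnetic systems.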
Evolution of Neural Networks
The field of neural networks has seen gradual progress:
- One Neuron Networks [00:13:53]
- Perceptron Networks (forward information flow) [00:13:59]
- Hopfield Networks (with feedback) [00:14:05]
- Boltzmann Machines (more complex network topologies) [00:14:11]
- Deep Learning (learning in multi-layered networks) [00:14:17]
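The distinction between forward flow and feedback in the list above can be sketched as follows (an illustrative contrast, not code from the talk): a perceptron layer maps input to output in a single pass, while a Hopfield network feeds its output back in as the next input until the state settles.

```python
import numpy as np

def sgn(v):
    return np.where(v >= 0, 1, -1)

def perceptron(W, x):
    # Forward information flow: a single pass from input to output.
    return sgn(W @ x)

def hopfield(W, s, max_iters=50):
    # Feedback: the output becomes the next input, and the state
    # evolves until it stops changing (reaches an attractor).
    for _ in range(max_iters):
        new = sgn(W @ s)
        if np.array_equal(new, s):
            return new
        s = new
    return s
```

The `max_iters` cap is there because fully synchronous updates, unlike the asynchronous ones, can in principle oscillate between two states.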
Learning systems used today do not treat synaptic weights as dynamic variables, unlike biology [00:17:30]. Separating synaptic weight changes from the dynamic activity variables may forgo computational power that evolution achieved [00:17:45]. A future turning point for neural networks could be identifying the true learning algorithm in biology and incorporating its essence into engineering [00:13:34].
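One way to picture the distinction: in deployed deep networks the weights are frozen while activity evolves, whereas treating both as dynamic variables couples an activity equation to a Hebbian-style weight equation. The update rule below is a toy assumption of ours, purely for illustration, not a published algorithm or anything proposed in the talk.

```python
import numpy as np

def hebbian_coupled_step(W, s, eta=0.01, tau=0.1):
    """One step in which activity AND weights are both dynamic variables
    (toy dynamics, assumed for illustration):
    - activity relaxes toward a squashed version of its local field,
    - weights drift by a Hebbian term ("fire together, wire together")
      with a decay term that keeps them bounded.
    """
    s_new = s + tau * (np.tanh(W @ s) - s)          # activity dynamics
    W_new = W + eta * (np.outer(s_new, s_new) - W)  # Hebbian weight dynamics
    np.fill_diagonal(W_new, 0)                      # no self-coupling
    return W_new, s_new
```

Running this step repeatedly keeps both the state and the couplings evolving together, which is the qualitative feature the talk says today's systems lack.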
Consciousness and Collective Behavior
The question of consciousness has puzzled researchers for decades [00:00:22], [00:16:45]. It raises the issue of whether neurons should be viewed as independent systems or as exhibiting collective behavior [00:00:29], [00:16:50]. The speaker suggests that consciousness may be a problem of classical rather than quantum mechanics [00:00:39], [00:21:12].
While some, like Roger Penrose, believe human consciousness arises from quantum phenomena [00:19:57], the speaker asserts that deciding the role of quantum mechanics is a secondary issue [00:22:21]. Until the relationship of consciousness to other factors is understood, it is not possible to definitively state whether it is a matter of classical or quantum physics [00:22:34].