When Isaac Newton formulated his Second Law (F = ma), observation and experimentation were crucial. Drawing on experiments, many influenced by Galileo and others, he concluded that force causes a change in motion, that is, acceleration. This process involved solving both forward and inverse problems through iterative refinement (experimentation and observation), much like the training of neural networks. Neural networks combine forward and inverse iterations in a similar way, and this is the primary reason behind their immense power. It's even possible that a neural network could independently derive Newton's Second Law. That's the true potential of this technology, and it is what we explore in greater detail in our recent publication: Deep Manifold Part 1: Anatomy of Neural Network Manifold.
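
To make the analogy concrete, here is a minimal sketch, under illustrative assumptions about data ranges, architecture, and hyperparameters (it is not the setup from the paper), of a small network rediscovering a = F / m from synthetic force and mass data. Each training step solves the forward problem (predict the acceleration from the current weights) and then the inverse problem (adjust the weights from the prediction error):

```python
import torch
import torch.nn as nn

# Hedged sketch: a tiny network rediscovering a = F / m from synthetic data.
# Data ranges, architecture, and hyperparameters are illustrative assumptions.

torch.manual_seed(0)

# Synthetic observations: forces in [1, 10] N, masses in [1, 5] kg
F = torch.rand(1000, 1) * 9.0 + 1.0
m = torch.rand(1000, 1) * 4.0 + 1.0
X = torch.cat([F, m], dim=1)
a = F / m                              # ground truth from Newton's Second Law

model = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(3000):
    pred = model(X)                    # forward problem: weights -> predicted acceleration
    loss = ((pred - a) ** 2).mean()
    opt.zero_grad()
    loss.backward()                    # inverse problem: prediction error -> weight updates
    opt.step()

# The trained network should approximate Newton's Second Law on this range:
# a 6 N force on a 3 kg mass gives roughly 2 m/s^2.
print(model(torch.tensor([[6.0, 3.0]])).item())
```

The point is not the toy problem but the loop: forward evaluation and inverse correction, repeated until the law implicit in the data is absorbed into the weights.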

Two key factors enable the forward and inverse iteration process in neural networks: infinite degrees of freedom and self-progressing boundary conditions. The infinite degrees of freedom explain how just a few gigabytes of neural network weights can store vast amounts of knowledge and information. The self-progressing boundary conditions are what allow neural networks to be trained on any data type, including mixed types such as language and images. These boundary conditions let the network efficiently ingest large and complex datasets, while its infinite degrees of freedom process them with unparalleled flexibility and capacity.

If a neural network could independently derive Newton's Second Law, would we still need it? The answer is yes, for at least two reasons:

  1. Newton's Second Law can be integrated into a neural network as a neural operator to tackle much more complex problems, such as the analysis of the Vajont Dam failure (a minimal sketch follows below).
  2. Newton's Second Law can be used to verify neural network model results.
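
A neural operator and a physics-informed formulation differ in detail, but a minimal way to show the mechanism behind both points is to embed Newton's Second Law in the training objective as a physics-informed residual. The sketch below, with all constants, architecture, and training settings as illustrative assumptions and a mass on a spring (m·x'' = -k·x) as a deliberately simple stand-in for anything like the Vajont Dam problem, trains a network x(t) to satisfy the law and its initial conditions:

```python
import torch
import torch.nn as nn

# Hedged sketch: Newton's Second Law as a physics-informed residual.
# The network x(t) is trained so that m * x''(t) = -k * x(t) (a mass on a
# spring), with initial conditions x(0) = 1, x'(0) = 0. All constants and
# hyperparameters are illustrative assumptions.

torch.manual_seed(0)
m, k = 1.0, 4.0                        # mass and spring constant
x0, v0 = 1.0, 0.0                      # initial position and velocity

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

t = torch.linspace(0.0, 2.0, 200).reshape(-1, 1).requires_grad_(True)
t0 = torch.zeros(1, 1, requires_grad=True)

for step in range(5000):
    x = net(t)
    # First and second time derivatives via automatic differentiation
    dx = torch.autograd.grad(x, t, torch.ones_like(x), create_graph=True)[0]
    d2x = torch.autograd.grad(dx, t, torch.ones_like(dx), create_graph=True)[0]

    # Newton's Second Law as a residual: m * x'' + k * x should vanish
    physics = ((m * d2x + k * x) ** 2).mean()

    # Initial conditions x(0) = x0 and x'(0) = v0
    x_init = net(t0)
    v_init = torch.autograd.grad(x_init, t0, torch.ones_like(x_init),
                                 create_graph=True)[0]
    initial = ((x_init - x0) ** 2 + (v_init - v0) ** 2).mean()

    loss = physics + initial
    opt.zero_grad()
    loss.backward()
    opt.step()

# Verification: the exact solution is x(t) = cos(2t); the trained network
# should match it closely, and the physics residual above can be re-evaluated
# on any model output as an independent check.
print(net(torch.tensor([[1.0]])).item(), torch.cos(torch.tensor(2.0)).item())
```

In a realistic setting the residual would encode the governing equations of the slope and reservoir rather than a spring, but the mechanism is the same: the law serves both as a constraint during training (point 1) and as an independent check on the trained model's output (point 2).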

If you believe that 'math is a branch of physics,' then we are now seeing the true power source behind neural networks: they are advancing to the point where AI is actually pushing the boundaries of mathematics itself.

Welcome to the AI-driven world.
