
New tools are making neural networks more powerful.




Above: The difference between a multilayer perceptron (MLP), the traditional neural network, and a Kolmogorov-Arnold Network (KAN). The KAN behaves more like a synapse than a conventional network: its layers exchange data more like two neurons in the human brain than like a complex artificial network. That makes a KAN easier to control than an MLP.


In this text, I call the KAN a neural network. The difference between the two architectures is visible in the image above. Neural networks are the tools that drive the most complicated AI applications, and the new Kolmogorov-Arnold Network (KAN) makes those tools more interpretable and more accurate than before. An outside controller can see how the system connects data from multiple sources. The ideal model for how such a network should operate is the grey-box model, where the controller can see both the code and the operation, and how the two interact. That helps with error detection.
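As a rough, minimal sketch of that structural difference (not the reference KAN implementation, and with placeholder edge functions standing in for the learned splines a real KAN uses), the contrast looks roughly like this in Python:

```python
import numpy as np

def mlp_layer(x, W, b):
    """Classic MLP layer: a learned linear combination of the inputs
    (weights W, bias b) passed through a fixed activation function."""
    return np.tanh(W @ x + b)

def kan_layer(x, edge_functions):
    """KAN-style layer: every edge from input i to output j carries its
    own one-dimensional function phi_ij, and the output node simply sums
    the results. Each phi_ij can be plotted and inspected on its own,
    which is where the extra interpretability comes from."""
    out = np.zeros(len(edge_functions))
    for j, functions_into_j in enumerate(edge_functions):
        out[j] = sum(phi(x_i) for phi, x_i in zip(functions_into_j, x))
    return out

# Toy example: 3 inputs, 2 outputs.
x = np.array([0.2, -1.0, 0.5])
print(mlp_layer(x, np.random.randn(2, 3), np.zeros(2)))

# In a real KAN the edge functions are learned splines; these fixed
# functions are only placeholders to show the structure.
edge_functions = [
    [np.sin, np.tanh, lambda t: t ** 2],  # edges into output node 0
    [np.cos, np.abs,  lambda t: t],       # edges into output node 1
]
print(kan_layer(x, edge_functions))
```

In the MLP the learned quantities are the weights and the nonlinearity is fixed; in the KAN the learned quantities are the one-dimensional edge functions themselves, which is why an observer can open them up and read what the network has learned.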

If a powerful system runs for a long time and some error occurs, the grey-box model helps to find the error while the system continues the operation. In regular neural networks, the multilayer perceptrons (MLPs), the computation and learning happen inside a black box. That makes runtime error detection practically impossible: if a researcher wants to track down the error, the entire operation must be rerun.

In a complete system, there can be two mainframes, or two independent entities, that run the same task side by side. If the control system sees differences between their outputs, there is probably some kind of error. That redundant structure imitates the human brain.
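A minimal sketch of that redundancy check, with a hypothetical tolerance value chosen only for illustration:

```python
import numpy as np

def check_redundant_outputs(output_a, output_b, tolerance=0.05):
    """Compare the outputs of two independently running systems.
    A divergence larger than the tolerance is flagged as a possible
    error, so the operator can investigate without halting the run."""
    divergence = float(np.max(np.abs(np.asarray(output_a) - np.asarray(output_b))))
    return divergence <= tolerance, divergence

agree, divergence = check_redundant_outputs([0.91, 0.09], [0.90, 0.10])
print("systems agree" if agree else "possible error", f"(divergence {divergence:.3f})")
```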

Decentralized neural networks are hard to destroy with physical attacks such as bombs. Computer viruses, however, can destroy the AI-based tools that control and connect the system into one entity. That is why technologies like quantum networks are needed to secure these tools. Securing the process is not the same as hiding it.




The idea behind neural networks is this: a neural network is a computer made of multiple sub-computers, and the system operates like the human brain. The network can have a mainframe, and a shared purpose or mission combines the parts into one entity.

However, each part can also work toward the goal independently. Some of those computers search for data, and others combine it. In such networks, multiple computers can crawl through web pages at extremely high speed and search data from extremely large data masses.

The neural network can combine information from mass memories with information from sensors. Machine learning means that the system can build new memory structures in which it connects observations to reactions.

Knowledge of how humans think makes it possible to create systems that imitate human thinking. That approach is called biomimetics: an artificial system imitating nature. Superficially, the human thinking process is well known; the mind recycles and combines information from different sources together with memories. But deep knowledge about that process is still missing.

The new neural networks are combinations of microchips and individual computers that operate as one entity. The problem is that those networks make their decisions in a black box. Neural networks of the multilayer perceptron (MLP) type do not expose the process by which they create their solutions.

"An April 2024 study introduced an alternative neural network design, called a Kolmogorov-Arnold network (KAN), that is more transparent yet can also do almost everything a regular neural network can for a certain class of problems. It’s based on a mathematical idea from the mid-20th century that has been rediscovered and reconfigured for deployment in the deep learning era. " (QuantaMagazine, Novel Architecture Makes Neural Networks More Understandable)

A neural network that makes complicated decisions must, or at least should, expose its operations. If we put the process into a glass box, we can see when it makes an error. The grey-box model of a neural network lets us follow how the code travels, how the system draws its conclusions, and how it interconnects networks. The point is this: the system must be open to the controllers who observe its functions, but it must be protected against unauthorized observation.


https://arxiv.org/html/2406.02496v1


https://blog.paperspace.com/kolmogorov-arnold-networks-kan-revolutionizing-deep-learning/


https://www.dailydoseofds.com/a-beginner-friendly-introduction-to-kolmogorov-arnold-networks-kan/


https://ihopenet.org/2019/05/05/knowledge-action-network-kan-on-emergent-risks-and-extreme-events/


https://www.nitcoinc.com/blog/four-outstanding-deep-learning-applications-in-business/


https://scitechdaily.com/ai-decodes-the-enigmatic-secrets-of-human-thought/


https://www.quantamagazine.org/novel-architecture-makes-neural-networks-more-understandable-20240911/


https://en.wikipedia.org/wiki/Biomimetics


