Science

The Breakthrough Algorithm Overcoming ‘Catastrophic Forgetting’ in Neural Networks

October 11, 2024

Researchers at the California Institute of Technology (Caltech) have developed a novel algorithm, called the Functionally Invariant Path (FIP) algorithm, that enables neural networks to continuously learn new tasks without ‘forgetting’ previous knowledge. This breakthrough, inspired by the flexibility of the human brain, has wide-ranging applications from e-commerce recommendations to self-driving car technology. The algorithm utilizes differential geometry to modify neural networks without losing previously encoded information, addressing the longstanding challenge of ‘catastrophic forgetting’ in artificial intelligence.

[Figure] Differential geometric framework for constructing FIPs in weight space. (a) Conventional training on a task finds a single trained network (wt); the FIP strategy instead discovers a submanifold of iso-performance networks (w1, w2, …, wN), enabling an efficient search for networks with adversarial robustness, sparse networks with high task performance, and networks that learn multiple tasks without forgetting. (b) A trained CNN with weight configuration wt maps an input image x to a ten-element output vector f(x, wt); perturbing the weights by dw yields a new network whose output f(x, wt + dw) is altered for the same input. (c) The FIP algorithm identifies weight perturbations θ* that minimize the distance moved in output space while maximizing alignment with the gradient of a secondary objective function, ∇wL; varying L(x, w) solves distinct machine learning challenges. (d) A path sampling algorithm defines FIPs, γ(t), through iterative identification of ϵ-norm perturbations θ*(t) in weight space. Credit: Nature Machine Intelligence (2024). DOI: 10.1038/s42256-024-00902-x

Overcoming the Limitations of Traditional Neural Networks

Neural networks have become a cornerstone of modern artificial intelligence, excelling at tasks such as image recognition and natural language processing. However, these models often struggle with a phenomenon known as ‘catastrophic forgetting’. When a neural network is trained on a new task, it can successfully learn the new assignment but at the cost of forgetting how to complete the original task.

This limitation poses significant challenges for applications that require continuous learning, such as self-driving cars. Traditionally, these models would need to be fully reprogrammed to acquire new capabilities, which is both time-consuming and resource-intensive. In contrast, the human brain is remarkably flexible, allowing us to easily learn new skills without losing our previous knowledge and abilities.

The Inspiration from Neuroscience and Biological Flexibility

Inspired by this biological flexibility, the Caltech researchers set out to develop an algorithm that could overcome the limitations of traditional neural networks. They were particularly intrigued by the work of Carlos Lois, a Research Professor of Biology at Caltech, who studies how birds rewire their brains to relearn their song after a brain injury.

This observation, along with the understanding that humans can forge new neural connections to relearn everyday functions after a stroke, provided the inspiration for the Caltech team to create the Functionally Invariant Path (FIP) algorithm. The algorithm, developed by Matt Thomson, Assistant Professor of Computational Biology, and former graduate student Guru Raghavan, Ph.D., utilizes a mathematical technique called differential geometry to allow neural networks to be modified without losing previously encoded information.
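In a one-output linear toy model, the core idea can be illustrated as a null-space projection: remove the component of the objective gradient that would move the network's output, then take a fixed-size (ϵ-norm) step along what remains. The sketch below is an illustrative simplification under that assumption, not the FIP implementation from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting: a linear "network" f(x, w) = w @ x with a single output.
w = rng.normal(size=5)
x = rng.normal(size=5)   # input whose output must not be disturbed
g = rng.normal(size=5)   # gradient of a secondary objective, ∇_w L

# A raw gradient step would change the output by (g @ x). The FIP-style
# move first projects g onto the output-preserving subspace (directions
# orthogonal to x), then rescales to a fixed step size eps — the ϵ-norm
# perturbation described in the paper's figure.
eps = 0.05
theta = g - (g @ x) / (x @ x) * x
theta = eps * theta / np.linalg.norm(theta)

out_before = w @ x
out_after = (w + theta) @ x   # unchanged up to floating point
print(out_before, out_after)
```

Chaining many such small steps traces a path through weight space along which the network's behavior on the protected task stays constant while a secondary objective improves.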

Empowering Continuous Learning in Artificial Neural Networks

The FIP algorithm represents a significant advancement in the field of machine learning, as it enables neural networks to be continuously updated with new data without having to start from scratch. This capability has far-reaching implications, from improving recommendations on online stores to fine-tuning self-driving cars.

In 2022, with guidance from Julie Schoenfeld, Caltech Entrepreneur In Residence, Raghavan and Thomson founded a company called Yurts to further develop the FIP algorithm and deploy machine learning systems at scale to address a wide range of problems. The algorithm’s versatility and ability to overcome the challenge of ‘catastrophic forgetting’ have the potential to revolutionize how artificial intelligence systems are designed and deployed, paving the way for more flexible and adaptable technologies.

jeffbinu
