Enhancing Continual Learning in Neural Networks with Lightweight Beneficial Perturbation Networks

Vanaparthi, Kiranmai and Manikandan, A (2026) Enhancing Continual Learning in Neural Networks with Lightweight Beneficial Perturbation Networks. Journal of Internet Services and Information Security, 16 (1). pp. 253-276. ISSN 2182-2069


Abstract

Artificial intelligence systems face a constant stream of new data, and adaptive algorithms are necessary to sustain knowledge across a broad spectrum of tasks. This paper evaluates the performance of several techniques for incremental object recognition on the CIFAR-100 dataset. We present new Beneficial Perturbation Network (BPN) variants, namely BD+EWC, PSP, and BD+PSP, aimed at improving flexibility and resilience in continual learning. We compare the algorithms on accuracy, performance, and computational cost per task. Results show that BD+PSP achieves the highest accuracy at 90.65%, followed by PSP at 90.01% and BD+EWC at 89.95%. To promote safe data processing, we incorporate encryption-based processes and privacy-preserving algorithms into the proposed models so that sensitive data is protected during the learning process. In addition, cost analysis shows that BD+EWC has the lowest computational overhead per task, at 4,039 bytes, while PSP and BD+PSP require 10,897 bytes and 11,456 bytes, respectively. The paper demonstrates the potential of safe and efficient continual learning systems in real-world contexts, particularly where both accuracy and data confidentiality are required. Moreover, we analyze the performance patterns of the different variants on incremental CIFAR-100 tasks; the most effective strategies, including BD+PSP, show higher retention of previously learned information across successive tasks. Compared with current methods, the proposed algorithms deliver substantial advances in object recognition, achieving results superior to existing approaches. The LW-BPN-EPIE-Net architecture is optimized, and experimental results show that it supports accurate estimation of BPM and functions well in continual learning systems, maintaining an accuracy of 98.6% on CIFAR-100, outperforming existing models, with efficient processing of only 28 seconds. We further evaluate the validity of current empirical Beneficial Perturbation Networks and processes to understand the complexity of the issues encountered by detection departments. Our research goal is therefore to develop a lightweight yet comprehensive framework for the problem of continual learning in neural networks, and our observations provide practical guidance for creating adaptive algorithms capable of handling varied and changing tasks in practice.
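The BD+EWC variant named above builds on Elastic Weight Consolidation (EWC), which mitigates catastrophic forgetting by penalizing changes to parameters that were important for earlier tasks. The following is a minimal illustrative sketch of the standard EWC quadratic penalty only; it is not the paper's implementation, and the function name, `lam` value, and Fisher values are hypothetical.

```python
import numpy as np

def ewc_penalty(theta, theta_old, fisher, lam=0.4):
    """Standard EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta_old_i)^2.

    theta     -- current model parameters (flattened)
    theta_old -- parameters saved after the previous task
    fisher    -- diagonal Fisher information, estimating per-parameter importance
    lam       -- regularization strength (hypothetical default)
    """
    theta = np.asarray(theta, dtype=float)
    theta_old = np.asarray(theta_old, dtype=float)
    fisher = np.asarray(fisher, dtype=float)
    # Parameters with high Fisher importance are strongly anchored to their
    # old values; unimportant parameters remain free to adapt to the new task.
    return 0.5 * lam * np.sum(fisher * (theta - theta_old) ** 2)
```

During training on a new task, this penalty would be added to the task loss, so gradient updates trade off new-task accuracy against drift on weights that earlier tasks relied on.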

Item Type: Article
Subjects: Computer Science Engineering > Machine Learning
Domains: Computer Science Engineering
Depositing User: User 1 1
Date Deposited: 06 Mar 2026 10:59
Last Modified: 13 Mar 2026 05:52
URI: https://ir.vistas.ac.in/id/eprint/13072
