Title: Transfer Learning for Image Classification Using Hebbian Plasticity Principles
Abstract: Transfer learning is a deep learning technique that has proved to be of great importance. However, most standard transfer learning algorithms simply repeat the same learning procedure to fine-tune the weights on the target domain. If we examine how the human brain learns a new, complex concept from a simpler, basic one, we find that it differs from mere repetition of the same learning method on a different dataset. In this article, we introduce a novel transfer learning algorithm referred to as HTL (Hebbian transfer learning), which uses synaptic plasticity. Hebbian theory, introduced by Donald Hebb, explains "associative learning", in which the simultaneous activation of brain cells strengthens the synaptic connections between the individual cells. This behaviour makes Hebbian learning a viable candidate for discriminative learning, i.e. for finding the specific features needed for object recognition or image classification. It allows the connection weights of a learned model to adapt to the task dataset via numerical methods that implement plasticity principles. Learning to discriminate between instances of different classes, over a variable number of classes within the dataset space defined by the task at hand, is a results-oriented approach to the classification problem. Extensive experiments verify that HTL, using synaptic plasticity in heterogeneous transfer learning tasks, outperforms standard state-of-the-art transfer learning methods on cross-domain image classification.
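The associative principle the abstract describes, that coincident pre- and post-synaptic activity strengthens a connection, can be sketched as a minimal Hebbian weight update. This is only an illustration of the classic rule Δw = η · y xᵀ; the paper's actual HTL update for fine-tuning is not specified in the abstract, and the function name and learning rate below are assumptions.

```python
import numpy as np

def hebbian_update(w, x, y, lr=0.01):
    """Classic Hebbian rule (illustrative, not the paper's exact HTL rule):
    weights grow where presynaptic activity x and postsynaptic
    activity y coincide: delta_w = lr * outer(y, x)."""
    return w + lr * np.outer(y, x)

# Toy usage: 3 presynaptic inputs, 2 postsynaptic units.
w = np.zeros((2, 3))                # initial connection weights
x = np.array([1.0, 0.0, 1.0])       # presynaptic activations
y = np.array([1.0, 0.5])            # assumed postsynaptic activations
w = hebbian_update(w, x, y)
print(w)
# Only the weights linking co-active pre/post pairs have grown;
# connections through the silent input (x[1] == 0) stay at zero.
```

In a transfer setting, an update of this shape would let the pretrained weights drift toward features that fire together on the target-domain images, rather than re-running gradient descent with the source-domain recipe.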
Publication Year: 2019
Publication Date: 2019-12-06
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 6