| Article ID | Journal | Published Year | Pages | File Type |
|---|---|---|---|---|
| 380498 | Engineering Applications of Artificial Intelligence | 2015 | 7 | |
Abstract
Restricted Boltzmann machines and deep belief networks have been shown to perform effectively in many applications, such as supervised and unsupervised learning, dimensionality reduction, and feature learning. Implementing networks that use contrastive divergence as the learning algorithm on neuromorphic hardware can be beneficial for real-time hardware interfacing, power-efficient operation, and scalability. Neuromorphic hardware that uses memristors as synapses is one of the most promising approaches to achieving these goals. This paper presents a restricted Boltzmann machine that uses a two-memristor model to emulate synaptic weights and achieves learning using contrastive divergence.
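To make the ideas in the abstract concrete, below is a minimal software sketch of an RBM trained with one-step contrastive divergence (CD-1), in which each synaptic weight is represented as the difference of two non-negative "conductance" matrices, loosely mirroring the two-memristor synapse idea. This is an illustrative assumption-based sketch, not the paper's hardware implementation; the class name, update split, and all hyperparameters are hypothetical.

```python
# Illustrative sketch only: software RBM with CD-1 and a differential
# (two-conductance) weight representation. Not the paper's memristor hardware.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TwoMemristorRBM:
    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.lr = lr
        # Differential synapse: effective weight w = g_pos - g_neg, both >= 0.
        self.g_pos = rng.uniform(0.0, 0.01, size=(n_visible, n_hidden))
        self.g_neg = rng.uniform(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_vis = np.zeros(n_visible)
        self.b_hid = np.zeros(n_hidden)

    @property
    def w(self):
        return self.g_pos - self.g_neg

    def sample_hidden(self, v):
        p_h = sigmoid(v @ self.w + self.b_hid)
        return p_h, (rng.random(p_h.shape) < p_h).astype(float)

    def sample_visible(self, h):
        p_v = sigmoid(h @ self.w.T + self.b_vis)
        return p_v, (rng.random(p_v.shape) < p_v).astype(float)

    def cd1_step(self, v0):
        # Positive phase: clamp the data and sample the hidden units.
        p_h0, h0 = self.sample_hidden(v0)
        # Negative phase: one Gibbs step to get a reconstruction.
        p_v1, _ = self.sample_visible(h0)
        p_h1, _ = self.sample_hidden(p_v1)
        # CD-1 gradient estimate: <v h>_data - <v h>_reconstruction.
        dw = (v0.T @ p_h0 - p_v1.T @ p_h1) / v0.shape[0]
        # Positive updates potentiate g_pos, negative updates potentiate g_neg,
        # so both matrices stay non-negative while w changes by lr * dw.
        self.g_pos += self.lr * np.maximum(dw, 0.0)
        self.g_neg += self.lr * np.maximum(-dw, 0.0)
        self.b_vis += self.lr * (v0 - p_v1).mean(axis=0)
        self.b_hid += self.lr * (p_h0 - p_h1).mean(axis=0)
        return np.mean((v0 - p_v1) ** 2)  # reconstruction error

# Toy usage on random binary data.
if __name__ == "__main__":
    data = (rng.random((64, 16)) < 0.5).astype(float)
    rbm = TwoMemristorRBM(n_visible=16, n_hidden=8)
    for epoch in range(20):
        err = rbm.cd1_step(data)
    print("final reconstruction error:", err)
```

The differential split is one common way to encode signed weights with devices whose conductance is inherently non-negative; in hardware the update rule would additionally be constrained by memristor dynamics, which this sketch does not model.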
Related Topics
Physical Sciences and Engineering > Computer Science > Artificial Intelligence
Authors
Ahmad Muqeem Sheri, Aasim Rafique, Witold Pedrycz, Moongu Jeon