Article code: 4973636
Journal code: 1451680
Publication year: 2018
English article: 14-page PDF
Full text: free download
English title of the ISI article
Restricted Boltzmann machines for vector representation of speech in speaker recognition
Keywords
Restricted Boltzmann machine; Deep learning; Variable rectified linear unit (VReLU); Speaker recognition; GMM-RBM vector; i-vector
Related subjects
Engineering and Basic Sciences; Computer Engineering; Signal Processing
English abstract


- An efficient low dimensional vector representation of speech based on GMM and RBM, referred to as GMM-RBM vectors, is proposed.
- A Universal RBM (URBM) is trained to learn the total speaker and session variability among background GMM supervectors.
- A variant of Rectified Linear Units (ReLU), referred to as variable ReLU (VReLU), is proposed to train the URBM efficiently.
- URBM is then used to transform unseen supervectors to the proposed GMM-RBM vectors.

Over the last few years, i-vectors have been the state-of-the-art technique in speaker recognition. Recent advances in Deep Learning (DL) technology have improved the quality of i-vectors, but the DL techniques in use are computationally expensive and need phonetically labeled background data. The aim of this work is to develop an efficient alternative vector representation of speech by keeping the computational cost as low as possible and avoiding phonetic labels, which are not always accessible. The proposed vectors are based on both Gaussian Mixture Models (GMM) and Restricted Boltzmann Machines (RBM) and are referred to as GMM-RBM vectors. The role of the RBM is to learn the total speaker and session variability among background GMM supervectors. This RBM, referred to as the Universal RBM (URBM), is then used to transform unseen supervectors into the proposed low dimensional vectors. The use of different activation functions for training the URBM and different transformation functions for extracting the proposed vectors is investigated. Finally, a variant of Rectified Linear Units (ReLU), referred to as variable ReLU (VReLU), is proposed. Experiments on the core test condition 5 of NIST SRE 2010 show that results comparable to those of conventional i-vectors are achieved with a clearly lower computational load in the vector extraction process.
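The extraction step described above amounts to a single forward pass through the trained URBM's hidden layer: the hidden activations of a high dimensional GMM supervector serve as its low dimensional GMM-RBM vector. A minimal sketch of that idea is shown below; the layer sizes, random weights, and the `extract_gmm_rbm_vector` helper are illustrative assumptions, not the paper's actual configuration (in practice the weights would be learned with Contrastive Divergence on background supervectors, and the paper studies several transformation functions, not only ReLU).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only: a supervector built by
# concatenating the means of 512 Gaussians over 20-dimensional features,
# projected down to a 400-dimensional vector. The paper's sizes may differ.
supervector_dim = 512 * 20
vector_dim = 400

# Stand-in URBM parameters; a trained URBM would supply these.
W = rng.normal(scale=0.01, size=(supervector_dim, vector_dim))
b = np.zeros(vector_dim)

def relu(x):
    # One possible transformation function; the paper also considers others.
    return np.maximum(0.0, x)

def extract_gmm_rbm_vector(supervector, transform=relu):
    """Hidden-layer forward pass: the activations are the GMM-RBM vector."""
    return transform(supervector @ W + b)

sv = rng.normal(size=supervector_dim)  # stand-in for a real GMM supervector
v = extract_gmm_rbm_vector(sv)
print(v.shape)  # (400,)
```

The low computational load mentioned in the abstract follows from this structure: unlike i-vector extraction, which requires solving a per-utterance linear system, the sketch above is a single matrix-vector product plus an elementwise nonlinearity.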

Publisher
Database: Elsevier - ScienceDirect
Journal: Computer Speech & Language - Volume 47, January 2018, Pages 16-29