• Different mappings are considered within the selective style transfer framework.
• Extending the linear style transfer mapping, a nonlinear style transfer mapping is proposed.
• A new source point set is suggested that uses the neutral-expression vectors of a new person.
A key assumption in many learning methods for intelligent systems is that the test and training data follow the same probability distribution. This identical-distribution assumption does not hold for person-independent facial expression recognition, because the appearance of an expression can vary significantly between people. Domain adaptation methods have therefore been proposed to bring the performance of a person-independent system closer to that of a person-dependent one. The mismatch between the training data and a new subject's data differs from individual to individual. Selective style transfer mapping (SSTM) is an instance-transfer method that does not require re-training the classifier and is classifier-independent. It is proposed to increase the generalization ability of action unit detection by selecting the type of style transfer mapping (linear or nonlinear) for each person. We also propose a rapid SSTM that uses the neutral vectors of a particular person (a small amount of data) to improve action unit detection. Rapid SSTM is also the first style transfer method within an SSTM framework that does not require fully labelled target data. The F1 score of selective style transfer mapping for action unit detection on the UNBC-McMaster database is 81.15, a significant (P < 0.05) improvement over style transfer mapping, which scores 60.30. The results also show that our approach performs action unit detection effectively and generalizes better to subjects in another database: the training database was CK+, while the test database was UNBC-McMaster.
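The linear style transfer mapping referenced in the abstract can be sketched as a regularized affine fit that pulls a new person's feature vectors (source points) toward corresponding points in the training distribution (target points), with the regularizer biasing the map toward the identity so that limited adaptation data cannot distort it too far. This is a minimal illustrative sketch under those assumptions, not the paper's exact formulation; the function names (`fit_linear_stm`, `apply_stm`) and the `beta` regularization weight are hypothetical.

```python
import numpy as np

def fit_linear_stm(S, T, beta=1.0):
    """Fit an affine map x -> A x + b that pulls source points S (n x d)
    toward target points T (n x d), ridge-regularized toward the
    identity map (A close to I, b close to 0)."""
    n, d = S.shape
    S_aug = np.hstack([S, np.ones((n, 1))])        # append bias column
    W0 = np.vstack([np.eye(d), np.zeros((1, d))])  # identity map as prior
    # Closed-form ridge solution: argmin ||S_aug W - T||^2 + beta ||W - W0||^2
    G = S_aug.T @ S_aug + beta * np.eye(d + 1)
    W = np.linalg.solve(G, S_aug.T @ T + beta * W0)
    A, b = W[:d].T, W[d]
    return A, b

def apply_stm(A, b, X):
    """Transfer a batch of feature vectors X (n x d) into the target style."""
    return X @ A.T + b
```

With a large `beta` the map stays near the identity (little adaptation); with a small `beta` it fits the source-target pairs closely, which is the trade-off a selective scheme could tune per person.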
Journal: Expert Systems with Applications - Volume 56, 1 September 2016, Pages 282–290