# Journal:

Neurocomputing, IF 4.072

# Authors:

Changsheng Lu, Chaochen Gu, Kaijie Wu, Siyu Xia, Haotian Wang, Xinping Guan

# Abstract:

Transfer neural networks have been successfully applied in many domain adaptation tasks. Most current transfer networks, however, essentially optimize a single distance metric between the source domain and the target domain, while few studies integrate multiple distances for training transfer networks. In this paper, we propose a transfer neural network architecture equipped with a hybrid domain adaptation method, which incorporates the advantages of different domain-discrepancy representations. In our architecture, Maximum Mean Discrepancy (MMD) and $\mathcal{H}$-distance based domain adaptations are combined for distribution alignment and domain confusion. Through extensive experiments, we find that the proposed method achieves compelling transfer performance across datasets whose domain discrepancies range from small to large. In particular, the proposed method can be used promisingly to predict the viewpoint of a 3D-printed workpiece even when trained without labels for real images. Visualizations of the features and adapted distributions learned by our transfer network highlight that the proposed approach effectively learns features shared between the two domains and handles a wide range of transfer tasks.
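As background for the MMD term mentioned above, the following is a minimal NumPy sketch of the standard (biased) empirical squared-MMD estimator with an RBF kernel; the function names, the fixed kernel bandwidth `gamma`, and the toy Gaussian "domains" are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    # Pairwise RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)
    sq_dists = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def mmd2(x, y, gamma=1.0):
    # Biased empirical estimate of squared MMD between sample sets x and y:
    # mean k(x, x') + mean k(y, y') - 2 * mean k(x, y)
    return (rbf_kernel(x, x, gamma).mean()
            + rbf_kernel(y, y, gamma).mean()
            - 2.0 * rbf_kernel(x, y, gamma).mean())

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(200, 2))       # toy "source domain"
tgt_near = rng.normal(0.0, 1.0, size=(200, 2))  # same distribution
tgt_far = rng.normal(3.0, 1.0, size=(200, 2))   # shifted distribution

print(mmd2(src, tgt_near))  # close to 0 for matching distributions
print(mmd2(src, tgt_far))   # clearly larger under distribution shift
```

In an adaptation network, a differentiable version of this quantity (e.g. in an autodiff framework) is added to the task loss so that minimizing it pulls source and target feature distributions together.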