Regularized semipaired kernel CCA for domain adaptation

Siamak Mehrkanoon*, Johan A.K. Suykens

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review


Domain adaptation learning is one of the fundamental research topics in pattern recognition and machine learning. This paper introduces a regularized semipaired kernel canonical correlation analysis formulation for learning a latent space for the domain adaptation problem. The optimization problem is formulated in the primal-dual least squares support vector machine setting, where side information can be readily incorporated through regularization terms. The proposed model learns a joint representation of the data set across different domains by solving a generalized eigenvalue problem or a linear system of equations in the dual. The approach is naturally equipped with the out-of-sample extension property, which plays an important role in model selection. Furthermore, the Nyström approximation technique is used to keep the eigendecomposition computationally feasible despite the large size of the matrices involved. The learned latent space of the source domain is fed to a multiclass semisupervised kernel spectral clustering model that can learn from both labeled and unlabeled data points of the source domain in order to classify the data instances of the target domain. Experimental results are given to illustrate the effectiveness of the proposed approaches on synthetic and real-life data sets.
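The dual formulation described above reduces to a generalized eigenvalue problem over kernel matrices, with Nyström landmarks used to tame the cost. A minimal sketch of this style of computation, assuming a plain regularized kernel CCA (an RBF kernel, regularization constant `eps`, and uniform landmark sampling are illustrative choices here, not the paper's exact semipaired formulation), might look like:

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def center_kernel(K):
    # Double-center a kernel matrix: Kc = H K H with H = I - (1/n) 11^T.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_cca(Kx, Ky, eps=1e-3):
    """Regularized kernel CCA in the dual as a generalized eigenproblem:

        [0      Kx Ky] [a]          [Kx Kx + eps I   0            ] [a]
        [Ky Kx  0    ] [b] = lambda [0               Ky Ky + eps I] [b]

    Returns eigenvalues (canonical correlations, shrunk slightly by eps)
    and dual coefficient vectors, sorted by decreasing correlation.
    """
    n = Kx.shape[0]
    Z = np.zeros((n, n))
    I = np.eye(n)
    A = np.block([[Z, Kx @ Ky], [Ky @ Kx, Z]])
    B = np.block([[Kx @ Kx + eps * I, Z], [Z, Ky @ Ky + eps * I]])
    # Reduce A v = lam B v to a standard symmetric problem via Cholesky of B.
    L = np.linalg.cholesky(B)
    Linv = np.linalg.inv(L)
    lams, U = np.linalg.eigh(Linv @ A @ Linv.T)
    order = np.argsort(lams)[::-1]
    return lams[order], (Linv.T @ U)[:, order]

def nystrom(kernel_fn, X, m, rng):
    # Nystrom approximation K ~= C W^+ C^T built from m random landmarks.
    idx = rng.choice(len(X), size=m, replace=False)
    C = kernel_fn(X, X[idx])   # n x m cross-kernel block
    W = C[idx, :]              # m x m landmark block
    return C @ np.linalg.pinv(W) @ C.T
```

For two identical views the leading eigenvalue approaches 1 (shrunk slightly by `eps`), and the Nyström approximation is exact whenever the kernel matrix's rank does not exceed the number of landmarks.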

Original language: English
Pages (from-to): 3199-3213
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Issue number: 7
Publication status: Published - Jul 2018


  • Domain adaptation
  • kernel canonical correlation analysis (KCCA)
  • Nyström approximation
  • semisupervised learning


