
Constrained ERM Learning of Canonical Correlation Analysis: A Least Squares Perspective.

Neural Computation 2017 October
Canonical correlation analysis (CCA) is a useful tool for detecting latent relationships between two sets of multivariate variables. Theoretical analyses of CCA typically rely on a regularization technique to establish consistency. This letter addresses the consistency of CCA from a least squares perspective. We construct a constrained empirical risk minimization (ERM) framework for CCA and solve it with a two-stage randomized Kaczmarz method: the first stage removes the noise, and the second stage computes the canonical weight vectors. We establish rigorous theoretical consistency for this scheme and extend the statistical consistency results to its kernel version. Moreover, experiments on both synthetic and real-world data sets demonstrate the effectiveness and efficiency of the proposed algorithms.
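The abstract does not reproduce the letter's two-stage construction, but the solver family it builds on is the randomized Kaczmarz iteration for a linear system Ax = b, which repeatedly projects the iterate onto the hyperplane of one randomly chosen equation. A minimal sketch of that base iteration (standard Strohmer–Vershynin row sampling; this is an illustration of the solver primitive, not the authors' two-stage CCA algorithm):

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=5000, seed=0):
    """Randomized Kaczmarz for a consistent linear system Ax = b.

    Each step samples row i with probability proportional to its
    squared norm, then projects the current iterate onto the
    hyperplane {x : a_i . x = b_i}.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.sum(A ** 2, axis=1)      # squared row norms
    probs = row_norms / row_norms.sum()     # sampling distribution
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # Orthogonal projection onto the i-th hyperplane.
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Toy demo: recover x_true from a consistent overdetermined system.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 5))
x_true = rng.standard_normal(5)
x_hat = randomized_kaczmarz(A, A @ x_true)
```

For a consistent system the iterate converges linearly in expectation, at a rate governed by the scaled condition number of A; the letter's contribution is to embed such an iteration in a constrained ERM formulation of CCA, with a separate noise-removal stage.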

