Constrained Principal Component Analysis and Related Techniques
Chapman and Hall/CRC – 2013 – 251 pages
In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data.
Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) brings these two approaches together in a unified framework.
The book begins with four concrete examples of CPCA that provide readers with a basic understanding of the technique and its applications. It gives a detailed account of two key mathematical ideas in CPCA: projection and singular value decomposition. The author then describes the basic data requirements, models, and analytical tools for CPCA and their immediate extensions. He also introduces techniques that are special cases of or closely related to CPCA and discusses several topics relevant to practical uses of CPCA. The book concludes with a technique that imposes different constraints on different dimensions (DCDD), along with its analytical extensions. MATLAB® programs for CPCA and DCDD as well as data to create the book’s examples are available on the author’s website.
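To give a flavor of the projection-and-SVD machinery the book builds on, the following is a minimal NumPy sketch (not taken from the book or its MATLAB programs) of the basic CPCA idea: project the data matrix onto the space spanned by external constraints, then apply SVD to the projected part. The variable names and the toy data are illustrative assumptions; the book's own formulation is more general (it allows column-side constraints and metric matrices as well).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: Z is an n x p data matrix; G holds k external
# row-side constraint variables (both made up for illustration).
n, p, k = 20, 5, 3
Z = rng.standard_normal((n, p))
G = rng.standard_normal((n, k))

# Orthogonal projector onto the column space of G.
P_G = G @ np.linalg.solve(G.T @ G, G.T)

# External analysis: decompose Z into the part explained by the
# constraints and the residual part.
Z_explained = P_G @ Z
Z_residual = Z - Z_explained

# Internal analysis: SVD of the explained part yields the
# constrained principal components.
U, s, Vt = np.linalg.svd(Z_explained, full_matrices=False)

# Sanity checks: the projector is idempotent, and the explained
# part has rank at most k (the number of constraint variables).
assert np.allclose(P_G @ P_G, P_G)
assert np.sum(s > 1e-10) <= k
```

The same decomposition-then-SVD pattern, with suitable choices of constraint and metric matrices, is what specializes CPCA to the many techniques listed in the contents below.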
"Takane is a renowned worker in this field and he draws upon his vast array of papers in the area to write an extremely informative review of the subject. … To bring together all the topics contained in the book, and to present them in such a clear and concise manner, is a great achievement. The author’s aim of promoting principal component analysis as more than just an exploratory data analysis tool has been achieved. … For those interested in the topic, the book is worth purchasing. It would be a useful addition to any library. I wholeheartedly recommend this book."
—Journal of Applied Statistics, 2014
"Coverage of principal component analysis (PCA) in books ranges from chapters to entire texts. … None of these books nor this line of research, however, describes PCA from the perspective of this text. …Though emphasizing algebra, Takane provides numerous examples that illustrate the methodology."
—Robert A. Stine, Journal of the American Statistical Association, September 2014, Vol. 109
Analysis of Mezzich’s Data
Analysis of Food and Cancer Data
Analysis of Greenacre’s Data
Analysis of Tocher’s Data
A Summary of the Analyses in This Chapter
Singular Value Decomposition (SVD)
Constrained Principal Component Analysis (CPCA)
Special Cases and Related Methods
Pre- and Postprocessings
Redundancy Analysis (RA)
Canonical Correlation Analysis (CANO)
Canonical Discriminant Analysis (CDA)
Multidimensional Scaling (MDS)
Correspondence Analysis (CA)
Nonsymmetric CA (NSCA)
Multiple-Set CANO (GCANO)
Multiple Correspondence Analysis (MCA)
Vector Preference Models
Growth Curve Models (GCM)
Extended Growth Curve Models (ExGCM)
Seemingly Unrelated Regression (SUR)
Multilevel RA (MLRA)
Weighted Low Rank Approximations (WLRA)
Orthogonal Procrustes Rotation
PCA of Image Data Matrices
Related Topics of Interest
Determining the Value of δ
Different Constraints on Different Dimensions (DCDD)
Model and Algorithm
Graphical Display of Oblique Components
Extended Redundancy Analysis (ERA)
Generalized Structured Component Analysis (GSCA)
Yoshio Takane is an emeritus professor at McGill University and an adjunct professor at the University of Victoria. He is a former president of the Psychometric Society and a recipient of a Career Award from the Behaviormetric Society of Japan and a Special Award from the Japanese Psychological Association. His recent interests include regularization techniques for multivariate data analysis, acceleration methods for iterative model fitting, the development of structural equation models for analyzing brain connectivity, and various kinds of singular value decompositions. He earned his DL from the University of Tokyo and PhD from the University of North Carolina at Chapel Hill.