Independent Component Analysis




Lars Kai Hansen
Department of Mathematical Modeling
Building 321
Technical University of Denmark
DK-2800 Lyngby, DENMARK
email: lkhansen@imm.dtu.dk
http://eivind.imm.dtu.dk

Independent Component Analysis (ICA) is a new signal processing research field concerned with unsupervised learning of independent effects in multivariate signals. There are a number of comprehensive Web presentations of ICA research:
Terry Sejnowski's Computational Neuroscience Lab
Dr. Shun-ichi Amari, RIKEN Brain Science Institute.

Erkki Oja is Professor of Computer Science and head of the Laboratory of Computer and Information Science, Helsinki University of Technology.

International Workshop on Independent Component Analysis and Blind Signal Separation (ICA 2000)
19-22 June 2000, Helsinki, Finland.

J.-F. Cardoso, organizer of:

International Workshop on Independent Component Analysis and Blind Separation of Signals (ICA '99)
January 1999, Aussois, France.



My interest in ICA was stimulated during a San Diego visit (Oct. 1997 - Mar. 1998) and discussions with the Sejnowski group:

Noisy ICA

Motivated by the success of principal component analysis (PCA) in image processing, e.g., in functional neuroimaging, my aim is to use ICA to discover generalizable "events" in short image sequences. I am analyzing a model that extends PCA in a simple way: it decomposes the image sequences into a mixture of a few independent components and additive white noise. The model is likelihood based (\cite{Belouchrani,Olshausen,Pearlmutter,MacKay,Moulines}). The likelihood formulation is attractive for several reasons. First, it allows a principled discussion of the priors that are inevitably implicit in any separation scheme. Second, the likelihood approach allows direct adaptation of the plethora of powerful schemes for parameter optimization, regularization, and evaluation developed for supervised learning algorithms. Finally, for the case of linear mixtures without noise, the likelihood approach is equivalent to another popular approach based on information maximization (\cite{Amari,Bell,Lee}).
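
To make the noise-free equivalence concrete, here is a minimal sketch of the maximum-likelihood gradient for a linear mixture, which coincides with the natural-gradient infomax rule (\cite{Amari,Bell}). The logistic source prior, learning rate, and toy data are illustrative assumptions, not the noisy model analyzed above.

    import numpy as np

    # Sketch: maximum-likelihood ICA for a noise-free mixture x = A s.
    # The update below is the natural-gradient form of the infomax rule;
    # step size and iteration count are illustrative assumptions.
    rng = np.random.default_rng(0)

    N = 5000
    S = rng.laplace(size=(2, N))     # independent super-Gaussian sources
    A = rng.normal(size=(2, 2))      # unknown mixing matrix
    X = A @ S                        # observed mixtures

    W = np.eye(2)                    # unmixing estimate
    lr = 0.01
    for _ in range(200):
        U = W @ X                    # current source estimates
        # score function for a logistic prior: phi(u) = 1 - 2*sigmoid(u)
        phi = 1.0 - 2.0 / (1.0 + np.exp(-U))
        # natural gradient of L(W) = log|det W| + sum_t log p(W x_t)
        W += lr * (np.eye(2) + phi @ U.T / N) @ W

    # W A should approach a scaled permutation if separation succeeded.
    print(W @ A)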

The optimal number of independent components is an important issue when dealing with high-dimensional noisy data. We face the classical bias-variance dilemma \cite{Geman}: if the number of components is too small we fail to capture the variability of the image set, while a model with too many components overfits the training set. The estimated generalization error, i.e., the performance of an algorithm on test data, has been used to optimize both supervised and unsupervised adaptive systems, e.g., PCA and clustering (\cite{Hansen96}). Preliminary results indicate that generalization may also be used to select the optimal dimensionality of ICA algorithms (\cite{Hansen98a}).
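
The selection scheme is easily sketched for the PCA case mentioned above: fit a K-component model on a training split and score it on held-out data. The synthetic data and the use of plain PCA reconstruction error (rather than the noisy-ICA test likelihood of \cite{Hansen98a}) are assumptions for illustration.

    import numpy as np

    # Sketch: model-order selection by estimated generalization error,
    # in the spirit of Hansen & Larsen (1996). PCA stands in for the
    # ICA model; data generator and noise level are assumptions.
    rng = np.random.default_rng(1)

    D, K_true, N = 10, 3, 400
    A = rng.normal(size=(D, K_true))
    X = A @ rng.laplace(size=(K_true, N)) + 0.5 * rng.normal(size=(D, N))

    X_train, X_test = X[:, :200], X[:, 200:]

    mean = X_train.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(X_train - mean, full_matrices=False)

    for K in range(1, 7):
        P = U[:, :K]                             # K-dimensional subspace
        R = X_test - mean
        err = np.mean((R - P @ (P.T @ R)) ** 2)  # held-out reconstruction error
        print(f"K={K}: test error {err:.4f}")
    # The test error should drop sharply up to K=3 and flatten beyond the
    # true number of components (likelihood-based scores would rise).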

ICA by Delayed Correlation

This paper addresses the use of blind source separation techniques for the analysis of short image sequences (multi-dimensional signals). We suggest a modification of Molgedey and Schuster's approach that allows for an arbitrary number of source signals. The viability of the new algorithm is illustrated on a toy image-sequence separation problem (\cite{Hansen98b}).
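
For reference, a minimal sketch of the underlying Molgedey-Schuster idea: jointly diagonalize the zero-lag covariance and a symmetrized lagged covariance via a generalized eigenvalue problem. The toy signals and the lag are assumptions, and the modification for an arbitrary number of sources in \cite{Hansen98b} is not reproduced here.

    import numpy as np
    from scipy.linalg import eig

    # Sketch: delayed-correlation separation (Molgedey-Schuster style).
    rng = np.random.default_rng(2)

    # Two sources with distinct autocorrelation structure.
    t = np.arange(2000)
    S = np.vstack([np.sin(0.05 * t), np.sign(np.sin(0.017 * t))])
    A = rng.normal(size=(2, 2))
    X = A @ S

    tau = 5
    Xc = X - X.mean(axis=1, keepdims=True)
    C0 = Xc @ Xc.T / Xc.shape[1]
    Ct = Xc[:, :-tau] @ Xc[:, tau:].T / (Xc.shape[1] - tau)
    Ct = 0.5 * (Ct + Ct.T)           # symmetrize the lagged covariance

    # Generalized eigenvectors of (C_tau, C0) give the unmixing directions.
    vals, V = eig(Ct, C0)
    W = np.real(V).T
    S_hat = W @ Xc

    # Recovered rows should match the true sources up to permutation/scale.
    for i in range(2):
        c = [abs(np.corrcoef(S_hat[i], S[j])[0, 1]) for j in range(2)]
        print(f"estimated source {i}: correlations {np.round(c, 2)}")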

Selected references

A. Bell and T.J. Sejnowski:
An Information-Maximization Approach to Blind Separation and Blind Deconvolution.
Neural Computation 7, 1129-1159 (1995).

A. Belouchrani and J.-F. Cardoso:
Maximum likelihood source separation by the expectation-maximization technique: deterministic and stochastic implementation.
In Proc. NOLTA, 49-53 (1995).

S. Geman, E. Bienenstock, and R. Doursat:
Neural Networks and the Bias/Variance Dilemma.
Neural Computation 4, 1-58 (1992).

L.K. Hansen and J. Larsen:
Unsupervised Learning and Generalization.
In Proceedings of the IEEE International Conference on Neural Networks 1996, Washington DC, vol. 1, 25-30 (1996).

L.K. Hansen:
Separation of noisy mixtures.
Unpublished manuscript (1998a).

L.K. Hansen and J. Larsen:
Source Separation in Short Image Sequences using Delayed Correlation.
In Proceedings of the IEEE Nordic Signal Processing Symposium, Denmark, 253-256 (1998b).

T.-W. Lee, M. Girolami, A.J. Bell and T.J. Sejnowski:
A Unifying Information-Theoretic Framework for Independent Component Analysis.
International Journal on Mathematical and Computer Modeling, in press (1998).

D. MacKay:
Maximum Likelihood and Covariant Algorithms for Independent Components Analysis.
Draft 3.7 (1996).

E. Moulines, J.-F. Cardoso, E. Gassiat:
Maximum likelihood for blind separation and deconvolution of noisy signals using mixture models.
In Proc. ICASSP'97, Munich, vol. 5, 3617-3620 (1997).

B.A. Olshausen:
Learning linear, sparse, factorial codes.
A.I. Memo 1580, Massachusetts Institute of Technology (1996).

B.A. Pearlmutter and L.C. Parra:
A context-sensitive generalization of ICA.
In Proc. 1996 International Conference on Neural Information Processing, Hong Kong (1996).
