MATLAB codes used in "Classifying Matrices with a Spectral Regularization" (ICML 2007)
What it does
It solves a classification problem over symmetric matrices with dual spectral norm (trace norm) regularization using a simple interior-point method. It has been successfully applied to the single-trial EEG classification problem in the context of brain-computer interfacing (BCI); in that case the input matrix is a short-time covariance estimate or its matrix logarithm (Tomioka & Aihara, 2007; see also [Slides]).
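The two input representations mentioned above can be sketched in a few lines of NumPy (not the toolbox's own code; the channel count matches the reduced montage used by the scripts, while the segment length is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-trial EEG segment: 49 channels x 300 time samples.
X = rng.standard_normal((49, 300))

# Short-time covariance estimate: symmetric and positive semidefinite.
Xc = X - X.mean(axis=1, keepdims=True)   # remove the per-channel mean
C = Xc @ Xc.T / Xc.shape[1]

# logm variant: matrix logarithm via the eigendecomposition of C
# (C is symmetric positive definite here, so the log is real and symmetric).
w, V = np.linalg.eigh(C)
L = V @ np.diag(np.log(w)) @ V.T
```

Either `C` or its logarithm `L` then plays the role of the input matrix X in the classifier below.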
Given training examples (X1,y1), (X2,y2),..., (Xn,yn), where each Xi is a real matrix and yi is a binary label in {-1,+1}, it solves the regularized empirical risk minimization problem

  minimize over (W, b):  sum_{i=1}^n loss( yi (<Xi, W> + b) ) + lambda * ||W||_*

where W is a real matrix of the same size as the inputs, b is a bias term, loss(z) = log(1 + exp(-z)) is the logistic loss, <Xi, W> denotes the matrix inner product, and ||W||_* denotes the sum of the singular values of W.
The above norm (the sum of singular values) is called the dual spectral norm, trace norm, Ky-Fan r-norm, or nuclear norm.
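The objective above is straightforward to write down numerically. A minimal NumPy sketch (the function names are mine, not the toolbox's):

```python
import numpy as np

def trace_norm(W):
    # Dual spectral norm (trace norm): the sum of the singular values of W.
    return np.linalg.svd(W, compute_uv=False).sum()

def lrds_objective(W, b, Xs, ys, lam):
    # Regularized empirical risk with the logistic loss:
    #   sum_i log(1 + exp(-y_i (<X_i, W> + b))) + lam * ||W||_*
    # where <X_i, W> is the matrix (Frobenius) inner product.
    margins = np.array([y * (np.sum(X * W) + b) for X, y in zip(Xs, ys)])
    return np.logaddexp(0.0, -margins).sum() + lam * trace_norm(W)
```

For example, at W = 0 and b = 0 every margin is zero and the objective reduces to n * log(2).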
BCI competition III dataset IVa.
(a) inputs are covariance matrices (symmetric and positive semidefinite)
(b) inputs are the matrix logarithms of the covariance matrices (symmetric only)
You can compare the above results with the results of the competition. Note that the task in the competition was to cope with a small training set, and many participants used the test data to adaptively update their classifiers, which we have not done here for the sake of simplicity.
How to use it
Note that the number of electrodes is reduced to 49 (see channels.eps) to make the computation fast.
- Download lrds.zip (file size: 3,992KB).
- Download the datasets: BCI competition III dataset IVa. Also download the true labels and save them with the expected variable name.
- Change the first two path lines of BcicompIIIiva.m to point to where you saved the datasets, as follows:
4: file = '<your path>data_set_IVa_%s.mat';
5: file_t = '<your path>data_set_IVa_%s_truth.mat';
(%s is where the name of each subject is substituted.)
- Run the script BcicompIIIiva.m.
- You can compare the results with the file performance_acc.eps provided in the downloaded folder.
- You can try the logm option by setting opt.logm to one, as follows:
13: opt.logm = 1;
The results are saved in separate files.
- You can also try another implementation based on the CVX toolbox, developed by Michael Grant, Stephen Boyd, and Yinyu Ye (tested with version 1.1, build 520, July 29, 2007). Call lrds_cvx.m instead of lrds_dual.m, as follows:
63: [W, bias] = lrds_cvx(Xtr, Ytr, lambda(ii));
You can skip the reduction of the electrodes to 49 channels (see channels.eps), but we have seen no significant difference in the performance.
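The download ships two solvers, the interior-point code in lrds_dual.m and the CVX-based lrds_cvx.m. Purely as an illustration of the underlying optimization problem (this is not the authors' algorithm), trace-norm regularized logistic regression can also be approached with proximal gradient descent, where the proximal operator of the trace norm soft-thresholds the singular values. A rough NumPy sketch with hypothetical names and an arbitrary step size:

```python
import numpy as np

def svt(W, tau):
    # Proximal operator of tau * ||.||_*: soft-threshold the singular values.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def fit_lrds_prox(Xs, ys, lam, step=0.1, iters=500):
    # Proximal gradient iteration for
    #   min_{W,b} sum_i log(1 + exp(-y_i (<X_i, W> + b))) + lam * ||W||_*.
    d = Xs[0].shape[0]
    W, b = np.zeros((d, d)), 0.0
    for _ in range(iters):
        gW, gb = np.zeros_like(W), 0.0
        for X, y in zip(Xs, ys):
            p = 1.0 / (1.0 + np.exp(-(np.sum(X * W) + b)))  # P(y = +1 | X)
            r = p - (y + 1) / 2.0  # derivative of the logistic loss w.r.t. the score
            gW += r * X
            gb += r
        W = svt(W - step * gW, step * lam)  # gradient step, then trace-norm prox
        b -= step * gb
    return W, b
```

On a toy separable problem (e.g. Xs = [I, -I] with labels [+1, -1]) a few hundred iterations yield a W whose inner product with each input has the correct sign.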
List of files
Here is a list of the files, generated using m2html. You can also take a look inside the files.
- BcicompIIIiva.m - main script file that applies the method to the BCI data
- apply_lrds.m - applies the classifier
- covariance.m - calculates the covariance between channels for each sample
- cutoutTrials.m - cuts out trials from a continuous EEG recording
- getfieldarray.m - gets a field from a struct array
- logmatrix.m - takes logm of each trial
- loss_0_1.m - zero-one loss
- lrds_cvx.m - logistic regression with dual spectral regularization (CVX-based)
- lrds_dual.m - logistic regression with dual spectral regularization (interior-point)
- matmultcv.m - multiplies a matrix with each trial covariance
- setDefaults.m - sets default options
- train_lrds.m - wrapper function that produces the classifier struct
- whiten.m - performs whitening
Generated on Sat 26-Apr-2008 15:48:22 by m2html © 2003
- Please cite the following paper if you publish something based on the software provided above:
"Classifying Matrices with a Spectral Regularization",
Ryota Tomioka and Kazuyuki Aihara,
Proc. of the 24th Annual International Conference on Machine Learning (ICML 2007), pp. 895-902, ACM Press, Oregon, USA, June 2007. [BibTex]
- Please keep the copyright information in the files, and provide a link to this page if you publicly release software based on the codes provided above.
- Please contact me if you want to use the software commercially.
- Please give me feedback, good or bad! :-)
Below is a list of relevant prior studies, which I was not aware of when the ICML paper was written.
- Maryam Fazel, Haitham Hindi, & Stephen P. Boyd, "A Rank Minimization Heuristic with Application to Minimum Order System Approximation". Proc. American Control Conference, 2001. [web]
- Nathan Srebro, Jason Rennie, & Tommi Jaakkola, "Maximum Margin Matrix Factorizations". Advances in NIPS 17, 2005. [web]
- Jacob Abernethy, Francis Bach, Theodoros Evgeniou, & Jean-Philippe Vert, "Low-rank matrix factorization with attributes". Technical Report, 2006. [web]
- Andreas Argyriou, Theodoros Evgeniou, & Massimiliano Pontil, "Multi-Task Feature Learning". Advances in NIPS 19, 2007. [web]
- Yonatan Amit, Michael Fink, Nathan Srebro, & Shimon Ullman, "Uncovering Shared Structures in Multiclass Classification". Proc. ICML 2007.
- Benjamin Recht, Maryam Fazel, & Pablo A. Parrilo, "Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization". Technical Report, 2007.
- Ming Yuan, Ali Ekici, Zhaosong Lu, & Renato Monteiro, "Dimension reduction and coefficient estimation in multivariate linear regression". J. Roy. Stat. Soc. B, 69(3), 329-346, 2007. [web]
- Andreas Argyriou, Charles A. Micchelli, Massimiliano Pontil, & Yiming Ying, "A Spectral Regularization Framework for Multi-Task Structure Learning". Advances in NIPS 20, 2008.
- Jacob Abernethy, Francis Bach, Theodoros Evgeniou, & Jean-Philippe Vert, "A New Approach to Collaborative Filtering: Operator Estimation with Spectral Regularization". Technical Report, 2008.
I am grateful to Klaus-Robert Müller and my colleagues at the Intelligent Data Analysis Group, Fraunhofer FIRST and Technische Universität Berlin, for their support in developing this software. I thank Jeremy Hill at the Max Planck Institute for Biological Cybernetics for suggesting the logm variant to me.