

Feature Extraction Improvements using an LMI approach and Riemannian Geometry Tools: an application to BCI

Cleison Silva, Rafael Mendes, and Alexandre Trofino

Department of Automation and Systems Engineering, UFSC, Brazil

IEEE-MSC, Buenos Aires, Argentina


September 19-22, 2016


Outline

1 Introduction

2 Feature Extraction

3 Classification using Riemann Distance

4 Main Results

5 Numerical Example

6 Conclusion


Introduction

• Signal Acquisition
  • EEG signals acquired with scalp sensors;
  • BCI paradigm using Motor Imagery.
• Signal Processing steps
  • Filtering: select the frequency bands of interest;
  • Feature Extraction: focus on the patterns related to the mental tasks;
  • Classification: label the patterns with the correct user intention.


How does a BCI system work? - A brief review


Signal Decomposition using a Sinusoidal Basis

The optimal estimate of the raw EEG signal Z using the sinusoidal basis B is

X̂ = KB    (1)

in the least-squares (LS) sense:

K* = arg min_K ‖Z − KB‖_F²    (2)

which yields the solution

K* = Z Bᵀ (B Bᵀ)⁻¹    (3)

K* is a compact representation of Z in the sinusoidal basis B.
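A minimal NumPy sketch of the LS projection (2)-(3) (not the authors' code; the signal Z and basis B below are random placeholders used only to fix the dimensions):

```python
import numpy as np

def ls_projection(Z, B):
    """Least-squares coefficients K* = Z B^T (B B^T)^{-1}, eq. (3)."""
    # Solve (B B^T) K^T = B Z^T instead of forming the explicit inverse.
    return np.linalg.solve(B @ B.T, B @ Z.T).T

# Hypothetical sizes: n channels, q samples, 2m basis functions.
rng = np.random.default_rng(0)
Z = rng.standard_normal((22, 500))   # raw EEG segment (placeholder data)
B = rng.standard_normal((90, 500))   # placeholder rows standing in for the sinusoidal basis
K = ls_projection(Z, B)              # 22 x 90 compact representation
X_hat = K @ B                        # filtered/reconstructed signal, eq. (1)
```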


Signal Decomposition using a Sinusoidal Basis

Figure: a) Raw signal; b) Sinusoidal basis; c) Projected and filtered signals; d) Coefficients.


Feature Extraction based on correlation matrix


Consider a linear transformation G applied to X to highlight the features of interest:

Y = XG ∈ ℝ^{n×q}    (4)

The correlation matrix of Y is

cor{Y} ≜ (1/q) Y Yᵀ    (5)

Substituting (4) into (5),

cor{Y} = (1/q) X G Gᵀ Xᵀ    (6)

Defining H = (1/q) G Gᵀ ≥ 0, we obtain

cor{Y} = X H Xᵀ    (7)
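A small NumPy check (random placeholder data and dimensions) of the identity chain (4)-(7):

```python
import numpy as np

rng = np.random.default_rng(1)
n, q = 22, 500                          # hypothetical: channels x samples
X = rng.standard_normal((n, q))         # decomposed signal
G = rng.standard_normal((q, q))         # linear transformation to be designed

Y = X @ G                               # eq. (4)
cor_Y = (Y @ Y.T) / q                   # eq. (5)
H = (G @ G.T) / q                       # H = (1/q) G G^T >= 0
assert np.allclose(cor_Y, X @ H @ X.T)  # eq. (7): cor{Y} = X H X^T
```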


Feature Extraction based on correlation matrix

In (7) the correlation matrix is linear in H, but the dimension of H, q × q, is too large; replacing X by X̂ = KB we obtain

cor{X̂G} = (1/q) K B G Gᵀ Bᵀ Kᵀ = K H₀ Kᵀ    (8)

where

H₀ = (1/q) B G Gᵀ Bᵀ ∈ ℝ^{2m×2m}    (9)

Note that H₀ has a much smaller dimension than H, since m ≪ q.
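The same kind of check for the reduced parameterization (8)-(9), again with random placeholder data; the unknown shrinks from the q × q matrix H to the 2m × 2m matrix H₀:

```python
import numpy as np

rng = np.random.default_rng(2)
n, q, two_m = 22, 500, 90                # hypothetical sizes
K = rng.standard_normal((n, two_m))      # LS coefficients from eq. (3)
B = rng.standard_normal((two_m, q))      # placeholder sinusoidal basis rows
G = rng.standard_normal((q, q))          # transformation to be designed

X_hat = K @ B                            # eq. (1)
H0 = (B @ G @ G.T @ B.T) / q             # eq. (9): 2m x 2m instead of q x q
assert np.allclose(K @ H0 @ K.T,         # eq. (8)
                   (X_hat @ G) @ (X_hat @ G).T / q)
```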


Using LMIs to solve the Feature Extraction Problem

Since the correlation matrix is linear in H₀, the problem of determining H₀ can be formulated as

min_{H₀} J(H₀)   subject to: H₀ ∈ M    (10)

where J(H₀) is an objective function and M is the constraint set of the problem.
Note that correlation matrices must be at least positive semi-definite, hence H₀ ≥ 0.
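A minimal CVXPY sketch of the template (10): only the positive semi-definiteness of H₀ is kept from the paper, while the objective (the trace of H₀) and the extra normalization are placeholders for the actual J(H₀) and constraint set M introduced in the later slides:

```python
import cvxpy as cp

two_m = 90                                  # dimension of H0 (2m x 2m)
H0 = cp.Variable((two_m, two_m), PSD=True)  # H0 >= 0: positive semi-definite

# Placeholder objective and constraint set, purely for illustration: any convex
# J(H0) together with LMI constraints keeps (10) a semidefinite program.
J = cp.trace(H0)
M = [cp.trace(H0) >= 1]                     # hypothetical normalization constraint

prob = cp.Problem(cp.Minimize(J), M)
prob.solve(solver=cp.SCS)
print(prob.status, prob.value)              # expected: optimal, value close to 1.0
```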


Classification using Riemann Distance

• The cone of positive definite matrices is not a Euclidean space.
• The shortest path between two elements of the manifold is called a geodesic.

The length of the geodesic between Φ and Σ is

δ_R²(Φ, Σ) = ‖log(Φ⁻¹Σ)‖_F² = ∑_{i=1}^{n} log²(λᵢ)    (11)

where the λᵢ are the generalized eigenvalues of Φ⁻¹Σ. This length is called the Riemann distance.
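A minimal NumPy/SciPy sketch of (11) (not the authors' implementation); scipy.linalg.eigh solves the generalized eigenvalue problem Σv = λΦv, whose eigenvalues are those of Φ⁻¹Σ:

```python
import numpy as np
from scipy.linalg import eigh

def riemann_distance_sq(Phi, Sigma):
    """Squared Riemann distance, eq. (11): sum of log^2 of the
    generalized eigenvalues of (Sigma, Phi), i.e. of Phi^{-1} Sigma."""
    lam = eigh(Sigma, Phi, eigvals_only=True)
    return np.sum(np.log(lam) ** 2)

# Tiny usage example with two SPD matrices built from random factors.
rng = np.random.default_rng(3)
A, C = rng.standard_normal((2, 4, 4))
Phi = A @ A.T + 4 * np.eye(4)
Sigma = C @ C.T + 4 * np.eye(4)
print(riemann_distance_sq(Phi, Sigma))
```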


Classification using Riemann Distance


The classification is based on the shortest Riemann distance from a sample correlation matrix to the mean correlation matrices of the classes.

Figure: Geodesic path between two elements in the manifold.
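A sketch of the resulting minimum-distance rule, reusing the riemann_distance_sq helper from the earlier sketch (Ca, Cb are the class mean correlation matrices and Ci a sample correlation matrix, all assumed SPD):

```python
def classify(Ci, Ca, Cb):
    """Assign Ci to the class whose mean correlation matrix is closest
    in the Riemann distance (uses riemann_distance_sq defined earlier)."""
    return "A" if riemann_distance_sq(Ci, Ca) <= riemann_distance_sq(Ci, Cb) else "B"
```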


Main Results

Consider positive definite matrices Φ and Σ, and the following matrix inequalities:

αΣ − Φ > 0 ⇔ αI > ΦΣ⁻¹    (12)
αΦ − Σ > 0 ⇔ αI > ΣΦ⁻¹    (13)

• α ∈ ℝ₊* is an upper bound for the eigenvalues of ΦΣ⁻¹ and ΣΦ⁻¹;
• Note that α → 1 implies Σ ≈ Φ.


Minimizing α

ΦΣ⁻¹ = V Λ Vᵀ,    ΣΦ⁻¹ = V Λ⁻¹ Vᵀ

With the eigenvalues of ΦΣ⁻¹ sorted in decreasing order of magnitude, λ₁ > λ₂ > · · · > λₙ,

diag(Λ) = (λ₁, λ₂, …, λₙ),    diag(Λ⁻¹) = (λₙ⁻¹, λₙ₋₁⁻¹, …, λ₁⁻¹)

Minimizing α is therefore equivalent to minimizing λmax(ΦΣ⁻¹) and λmax(ΣΦ⁻¹) and, simultaneously, maximizing λmin(ΦΣ⁻¹) and λmin(ΣΦ⁻¹).
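A small NumPy/SciPy check (random SPD matrices, purely illustrative) of the relation behind this slide: the eigenvalues of ΣΦ⁻¹ are the reciprocals of those of ΦΣ⁻¹, so any α satisfying (12)-(13) must exceed both λmax(ΦΣ⁻¹) and 1/λmin(ΦΣ⁻¹):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(4)
A, C = rng.standard_normal((2, 5, 5))
Phi = A @ A.T + np.eye(5)                        # random SPD matrices
Sigma = C @ C.T + np.eye(5)

lam = eigh(Phi, Sigma, eigvals_only=True)        # eigenvalues of Phi Sigma^{-1} (ascending)
lam_inv = eigh(Sigma, Phi, eigvals_only=True)    # eigenvalues of Sigma Phi^{-1} (ascending)
assert np.allclose(lam_inv, np.sort(1.0 / lam))  # the two spectra are reciprocals

alpha_bound = max(lam.max(), lam_inv.max())      # any feasible alpha must exceed this
print(alpha_bound, lam.max(), 1.0 / lam.min())
```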

Riemann Distance as a function of the eigenvalues


Figure: Contribution of the eigenvalues to the Riemann distance between Φ and Σ — plot of f(λ) = log²(λ), with λmin, 1, and λmax marked on the λ axis.



Main Results

We begin by defining the mean correlation matrices of the two classes, Ca and Cb:

Ca = (1/na) ∑_{i=1}^{na} cor{Ŷᵢ}   and   Cb = (1/nb) ∑_{i=1}^{nb} cor{Ŷᵢ}    (14)

Note that the mean correlation matrices are linear functions of H₀ as well, and Ci = cor{Ŷᵢ} denotes a sample correlation matrix.
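An illustrative computation of the class means in (14), assuming the sample correlation matrices are parameterized as Kᵢ H₀ Kᵢᵀ; the sizes and data below are placeholders:

```python
import numpy as np

def class_mean_correlation(Ks, H0):
    """Arithmetic mean of the sample correlation matrices K_i H0 K_i^T
    over the trials of one class, eq. (14); note it is linear in H0."""
    return np.mean([K @ H0 @ K.T for K in Ks], axis=0)

# Hypothetical usage: 72 trials of one class, 22 channels, 2m = 90.
rng = np.random.default_rng(5)
Ks_class_a = rng.standard_normal((72, 22, 90))
H0 = np.eye(90)                                  # e.g. the unoptimized choice H0 = I
Ca = class_mean_correlation(Ks_class_a, H0)      # 22 x 22 mean correlation matrix
```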


Main Results

τ Ca − Ci > 0 ⇔ τ I > Ci Ca⁻¹,   Ci ∈ A    (15)
τ Ci − Ca > 0 ⇔ τ I > Ca Ci⁻¹,   Ci ∈ A    (16)

and

τ Cb − Ci > 0 ⇔ τ I > Ci Cb⁻¹,   Ci ∈ B    (17)
τ Ci − Cb > 0 ⇔ τ I > Cb Ci⁻¹,   Ci ∈ B    (18)

where τ is a common upper bound over all na + nb elements.
When no objective function is considered and the problem is simply to minimize τ, the underlying problem is quasi-convex and can be written as

min_{H₀} τ   subject to: H₀ ≥ 0, (15), (16), (17), (18)    (19)
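A sketch (not the authors' code) of how (19) could be solved with CVXPY by bisection on τ: for a fixed τ the constraints (15)-(18) are LMIs in H₀, so feasibility reduces to a semidefinite program. The strict inequalities are handled with a small slack εI, a trace normalization rules out H₀ = 0, and the solver choice (SCS) and data layout (Ks, labels) are assumptions:

```python
import cvxpy as cp
import numpy as np

def sym(X):
    # The expressions below are symmetric by construction; explicit
    # symmetrization just makes that obvious to the modeling layer.
    return 0.5 * (X + X.T)

def feasible(tau, Ks, labels, eps=1e-6):
    """Check whether the LMIs (15)-(18) admit some H0 >= 0 for this fixed tau.
    Ks: list of coefficient matrices K_i (n x 2m); labels: list of 'A'/'B'."""
    n, two_m = Ks[0].shape
    H0 = cp.Variable((two_m, two_m), PSD=True)
    Cs = [K @ H0 @ K.T for K in Ks]              # sample correlations, linear in H0
    Ca = sum(C for C, y in zip(Cs, labels) if y == "A") / labels.count("A")
    Cb = sum(C for C, y in zip(Cs, labels) if y == "B") / labels.count("B")
    I = np.eye(n)
    cons = [cp.trace(H0) == 1]                   # normalization, rules out H0 = 0
    for C, y in zip(Cs, labels):
        M = Ca if y == "A" else Cb
        cons += [sym(tau * M - C) >> eps * I,    # (15)/(17) with a small slack
                 sym(tau * C - M) >> eps * I]    # (16)/(18) with a small slack
    prob = cp.Problem(cp.Minimize(0), cons)
    prob.solve(solver=cp.SCS)
    return prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)

def minimize_tau(Ks, labels, lo=1.0, hi=100.0, iters=20):
    """Bisection on tau: feasibility is monotone in tau, so (19) is quasi-convex."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if feasible(mid, Ks, labels) else (mid, hi)
    return hi
```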


Numerical Example

• Dataset from BCI Competition IV, dataset IIa;
• Includes EEG signals from 9 subjects;
• Four different mental tasks: left hand (LH), right hand (RH), feet (FE) and tongue (TO);
• 22 EEG channels, a time window of 2 seconds, and a sampling rate of 250 Hz;
• 4 classes with 72 trials per class and subject;
• Frequency band from 8 to 30 Hz;
• Each trial is represented by a correlation matrix:
  X H Xᵀ, with X ∈ ℝ^{22×500} (filtered signal), or
  K H₀ Kᵀ, with K ∈ ℝ^{22×90} (approximation in the sinusoidal basis; see the sketch below).
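The slides do not detail the basis construction, but the stated sizes (500 samples per window, K ∈ ℝ^{22×90}, 8-30 Hz band) are consistent with sine/cosine pairs at 0.5 Hz spacing; the construction below is therefore a hypothesis, not taken from the paper:

```python
import numpy as np

# Hypothetical sinusoidal basis consistent with the stated sizes: a 2 s window
# at 250 Hz (q = 500 samples) and K in R^{22 x 90} suggest 45 frequencies
# (8 to 30 Hz in 0.5 Hz steps), each contributing a sine and a cosine row.
fs, T = 250, 2.0
t = np.arange(int(fs * T)) / fs                   # 500 time samples
freqs = np.arange(8.0, 30.0 + 0.5, 0.5)           # 45 frequencies
B = np.vstack([np.vstack([np.sin(2 * np.pi * f * t),
                          np.cos(2 * np.pi * f * t)]) for f in freqs])
print(B.shape)                                    # (90, 500)
```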


Results and Discussion

Table: Evaluation results of the proposed method (classification accuracy in %).

Subject       1      2      3      4      5      6      7      8      9   Mean
H₀ opt    92.96  60.54  87.32  70.14  66.45  73.23  79.17  93.06  93.06  79.54
Reference 93.75  63.19  94.44  75.00  63.19  71.53  72.92  96.55  91.67  80.23

• The classification results exhibit significant variation depending on the subject;
• Subjects 5 and 7 have poor classification results, but the proposed optimization achieved higher classification accuracy for them.


Concluding Remarks

Despite the difficulties in numerically solving the problem as it stands, the proposed method has important contributions worth highlighting.
• Information about the spectrum of the EEG signals is readily available in the columns of the matrix K in the correlation parameterization K H₀ Kᵀ.
• A considerable reduction in the size of the problem to be solved is obtained with this parameterization.
• The possibility of using a different parameterization H₀ for each class, and of optimizing H₀ to reduce the distances among elements within a class while keeping the mean correlation matrices of the classes apart, is an important feature that needs further research.


Thanks for listening


Emails:
• Cleison Silva - cleison@ufpa.br
• Rafael Mendes - rmendes@gmail.com
• Prof. Alexandre Trofino Neto - alexandre.trofino@ufsc.br

This work was supported in part by the Brazilian agencies CNPq and CAPES, and by the Brazilian universities UFSC and UFPA.
