
New approaches to complex brain activity analysis.

Kyselgov E.N.

STC REMET. National Aerospace University KHAI named after N.E. Zhukovski.

1. Introduction

The Scientific Technical Center of Radio Electronic and Medical Equipment Technologies proposes a new, modern system for the analysis of brain electrical activity. The NeuroCom system is designed for the Windows 98/Me and Windows 2000 operating systems and runs on Intel and AMD platforms. The kernel of NeuroCom is a software module that performs data registration, storage, processing, analysis, and description. A professional development team designed this software product using the latest software technologies in the medical diagnostics field. NeuroCom can therefore be used both for quick analysis and for complex scientific investigations. Many features are designed to simplify the work:

  • Customizable interface of the program,
  • Special education mode,
  • An assistant similar to the Microsoft Office assistant,
  • Day notes,
  • Tool tips,
  • Speech comments,
  • Voice commands,
  • and many others.

You can tune almost all parts of the program, from colors to calculation constants.

Many parts of the program are well known and traditional for medical-diagnostic EEG systems, but many are new. For example, NeuroCom has the traditional subsystems:

  • Database, for storing visit data, with failure tolerance and network processing;
  • Registration subsystem;
  • Data processing subsystem (filters, spectra, correlation analysis);
  • Analysis subsystem.

All these parts were improved and extended according to modern requirements and processing specifications.

The fundamentally new subsystems in NeuroCom are:

  • Template and report subsystem,
  • Norms subsystem,
  • Independent component analysis (ICA),
  • Tomograph subsystem.

The next parts of the article briefly cover the general subsystems and examine the most interesting new solutions in detail.

2. Database

The database is the first thing you see when the program runs. It is the place where all information about patients and visits is stored. It is a simple, reliable, software-environment-independent module, but the power of this module hides behind its seeming simplicity. It has many features created for user convenience. For example, you can attach any number of databases to the program. Any kind of visit information can be stored in a database: electroencephalogram (EEG) data, event-related potential (ERP) data, rheograph (RHG) data, electrocardiogram (ECG) data, and so on. The size of the registration data is unlimited. Any database from the database list can be located on a shared resource in a local network, an intranet, or the Internet; thanks to this networking capability, a multi-user architecture is available. A doctor's assistant can record visit data and store it in the network database, while at the same time the doctor can process this visit, reading it from the database.

The open architecture of the database and visit files enables you to write your own programs for data analysis.

3. Hardware and Registration subsystem

Hardware

The potential difference on the scalp surface usually has a small amplitude that does not exceed 100 µV in normal recordings. This distinguishing feature requires a bio-potential amplifier with a high gain and a low level of input noise.

The amplifier must have a high input resistance to minimize the influence of the skin-electrode resistance on the EEG signals.

As a rule, with infrequent exceptions, a bio-potential amplifier works in an environment with a high level of line noise (50 Hz, 60 Hz). The line noise appears at the amplifier inputs as a common-mode voltage. Successful noise rejection requires amplifiers with a high common-mode rejection ratio, and an extra-high input impedance with respect to the in-phase component, to prevent transformation of the common-mode signal into a differential signal.

As for the spectral structure of the EEG signal, the necessary amplifier bandwidth lies in the 0.5-50 Hz range (with the exception of some ERP signals, for which the necessary bandwidth lies in the 10-2000 Hz range).

STC REMET produces bio-potential amplifiers that satisfy these requirements. Amplifiers are produced with 16 channels (for general examinations) and 19 channels (for registration of EEG signals in the International 10-20 scheme). Amplifiers are supplied with additional channels for registration of ERP, EOG, and so on.

Registration subsystem

Usually, in NeuroCom the EEG signals are recorded in a monopolar derivation (MPAo) relative to the reference electrode Ao, which is obtained by averaging electrodes A1 and A2 (located on the ear lobes). The recorded EEG signals can be re-montaged by the program to any derivation from the list:

  • Monopolar derivations relative to the reference electrode A1;
  • Monopolar derivations relative to the reference electrode A2;
  • Monopolar derivations with two reference electrodes A1, A2;
  • Monopolar derivations relative to the common average reference electrode (CAR);
  • Bipolar sagittal derivations with long and short distance between electrodes;
  • Bipolar frontal derivations with long and short distance between electrodes;
  • Source derivation;
  • Deep source derivation.

Besides the schemes mentioned above, you can create your own schemes.
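All the listed derivations are linear recombinations of the recorded monopolar channels, so re-montaging reduces to simple matrix arithmetic. As a minimal sketch (NeuroCom's internal implementation is not public; the function name and array layout here are assumptions), the common average reference derivation can be computed like this:

```python
import numpy as np

def remontage_car(eeg):
    """Re-reference monopolar EEG to the common average reference (CAR).

    eeg: array of shape (n_channels, n_samples), recorded against a
    single common reference electrode. Subtracting the instantaneous
    mean across channels yields the CAR derivation.
    """
    return eeg - eeg.mean(axis=0, keepdims=True)

# Example: 3 channels, 4 samples
x = np.array([[1.0, 2.0, 3.0, 4.0],
              [0.0, 0.0, 0.0, 0.0],
              [2.0, 4.0, 6.0, 8.0]])
car = remontage_car(x)
# After CAR re-referencing, each column (time instant) sums to zero.
print(np.allclose(car.sum(axis=0), 0.0))  # True
```

Bipolar and source derivations work the same way: each output channel is a fixed weighted sum of input channels, so any montage can be expressed as one matrix multiplied into the data.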

Before recording EEG data, you have to select the type of visit that interests you. This so-called visit template contains all information about the future visit: the type and kind of probes, the assumed recording length, the norms that will be applied to the visit data, stimulation information, etc. If you want, you can modify any visit template and store it under a new name.

4. Data processing and Analysis subsystem

At present, there are no fully automated systems for EEG signal processing or description, because until now all attempts at fully formalizing EEG data analysis have failed. This situation exists in all medical-diagnostic fields that deal with a complex subject of inquiry, especially the human brain.

In building the new system, we did not set out to replace the doctor, but to help him by reducing routine work and giving him new possibilities in analysis and processing. With new mathematical approaches, we tried to let the doctor look at EEG data from another angle. We created a tool that extends research capability for such a complex subject. In NeuroCom, traditional research methods adjoin a fundamentally new approach. We have extended most of the traditional methods. For example, NeuroCom has a rich set of filter features. You can tune the filter characteristics both for all channels simultaneously and separately for each channel, and your filters can be tuned to filter any spectral range you select.

Besides this, there are correlation analysis, spectral views, and high-precision mapping extended by a rich set of new map types for all possible derivations, independent component analysis, and so on.
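The idea of filtering a user-selected spectral range can be illustrated with a minimal frequency-domain band-pass. This is only an illustration under assumed parameters (sampling rate, band edges); a clinical system would use properly designed IIR/FIR filters to control ringing and phase distortion:

```python
import numpy as np

def fft_bandpass(signal, fs, low, high):
    """Keep only spectral components between low and high Hz.

    A minimal frequency-domain band-pass: transform, zero out the
    stop-band bins, transform back.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

fs = 250.0                      # assumed EEG sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
# 10 Hz "alpha" component plus 50 Hz line noise
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
clean = fft_bandpass(sig, fs, 0.5, 30.0)
# The 50 Hz line-noise component is removed; the 10 Hz component remains.
```

Applying such a filter per channel, with per-channel band edges, gives the behavior described above for the filter subsystem.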

The fundamentally new approach for EEG data analysis is Independent Component Analysis (ICA). ICA (Lee, Bell, Sejnowski) refers to a family of related algorithms that exploit independence to perform blind source separation. It is an effective method for removing artifacts and separating sources of brain signals from EEG recordings.

Independent Component Analysis (ICA)

Independent Component Analysis was originally proposed to solve the blind source separation problem: to recover M source signals, s = {s1(t), …, sM(t)}^T (e.g., different voice, music, or noise sources) after they are linearly mixed by multiplying by A, an unknown matrix, x = {x1(t), …, xN(t)}^T = A×s (where N is the number of observed signals), while assuming as little as possible about the nature of A or the component signals. Specifically, one tries to recover a version, u = W×x, of the original sources, s, identical save for scaling and permutation, by finding a square matrix, W, specifying spatial filters that linearly invert the mixing process. The key assumption used in ICA to solve this problem is that the time courses of activation of the sources (or in other cases the spatial weights) are as statistically independent as possible. Most ICA is performed using information-theoretic unsupervised learning algorithms. Despite its relatively short history, ICA is rapidly becoming a standard technique in multivariate analysis.
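The mixing model x = A×s and the unmixing u = W×x can be made concrete with a toy example. This sketch does not run an ICA algorithm; it only shows that when W inverts the (normally unknown) mixing matrix, the sources are recovered exactly, which is the solution ICA searches for blindly:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10000
# Two statistically independent, non-Gaussian sources
s = np.vstack([rng.laplace(size=n),          # super-Gaussian
               rng.uniform(-1, 1, size=n)])  # sub-Gaussian
A = np.array([[1.0, 0.5],                    # "unknown" mixing matrix
              [0.3, 1.0]])
x = A @ s                                    # observed mixtures

# ICA searches for W such that u = W @ x is independent; here we
# cheat and use the true inverse just to show such a W exists.
W = np.linalg.inv(A)
u = W @ x
print(np.allclose(u, s))  # True: sources recovered
```

In practice ICA can only recover the sources up to scaling and permutation, since any rescaled or reordered W×x is equally independent.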

Mathematically, the ICA problem is as follows: We are given a collection of N-dimensional random vectors, x (sound pressure levels at N microphones, N-pixel patches of a larger image, outputs of N scalp electrodes recording brain potentials, or nearly any other kind of multi-dimensional signal). Typically there are diffuse and complex patterns of correlation between the elements of the vectors. ICA, like Principal Component Analysis (PCA), is a method to remove those correlations by multiplying the data by a matrix as follows:

u = W×x        (1)

(Here, we imagine the data is zero-mean; see below for preprocessing details.) But while PCA only uses second-order statistics (the data covariance matrix), ICA uses statistics of all orders and pursues a more ambitious objective. While PCA simply decorrelates the outputs (using an orthogonal matrix W), ICA attempts to make the outputs statistically independent, while placing no constraints on the matrix W. Statistical independence means the joint probability density function (p.d.f.) of the output factorizes:

p(u) = p1(u1)×p2(u2)×…×pN(uN)      (2)

(as shown by Nadal and Parga, this is applicable only in the low-noise case), while decorrelation means only that <u×u^T>, the covariance matrix of u, is diagonal (here <·> denotes averaging).
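The difference between decorrelation and independence can be shown numerically. The sketch below (an illustration, not part of NeuroCom) applies PCA whitening, which makes the covariance of the outputs exactly the identity; yet the outputs are still rotated mixtures of the sources, so they are decorrelated without being independent:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two independent uniform sources, linearly mixed
s = np.vstack([rng.uniform(-1, 1, 5000),
               rng.uniform(-1, 1, 5000)])
A = np.array([[2.0, 1.0], [1.0, 1.0]])
x = A @ s

# PCA whitening: rotate and scale so the covariance <u u^T> is identity.
cov = np.cov(x)
vals, vecs = np.linalg.eigh(cov)
W_pca = np.diag(vals ** -0.5) @ vecs.T
u = W_pca @ x

print(np.allclose(np.cov(u), np.eye(2), atol=1e-6))  # True: decorrelated
# But u is not independent: an unknown rotation remains, and only
# higher-order statistics (as used by ICA) can determine it.
```

This is why PCA alone cannot separate EEG sources: second-order statistics fix W only up to a rotation, and ICA resolves that remaining rotation using higher-order statistics.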

Another way to think of the transform in (1) is as

x = W^-1×u       (3)

Here, x is considered the linear superposition of basis functions (columns of W^-1), each of which is activated by an independent component, ui. We call the rows of W filters because they extract the independent components. In orthogonal transforms such as PCA, the Fourier transform, and many wavelet transforms, the basis functions and filters are the same (because W^T = W^-1), but in ICA they are different. The ICA problem was introduced by Herault and Jutten. The results of their algorithm were poorly understood and led to Comon's 1994 paper defining the problem, and to his solution using fourth-order statistics. In parallel to blind source separation studies, unsupervised learning rules based on information theory were proposed by Linsker. The goal was to maximize the mutual information between the inputs and outputs of a neural network. Roth and Baram and Bell and Sejnowski independently derived stochastic gradient rules for this maximization and applied them, respectively, to forecasting, time series analysis, and blind source separation. Bell and Sejnowski put the blind source separation problem into an information-theoretic framework and demonstrated the separation and deconvolution of mixed sources.

Other algorithms for performing ICA have been proposed from different viewpoints, including maximum likelihood estimation (MLE) approaches to ICA by Pham, Parra, and Gaeta and Lacoume. These methods are very close to the Infomax approach, so this algorithm may be called Infomax/MLE ICA.

The original Infomax learning rule for blind source separation by Bell and Sejnowski was suitable for super-Gaussian sources. Girolami and Fyfe derived, by choosing negentropy as a projection pursuit index, a learning rule able to blindly separate mixtures of sub- and super-Gaussian source distributions. Lee, Girolami, and Sejnowski showed that this learning rule is an extension of the infomax principle that satisfies a general stability criterion and preserves the simple architecture of Bell and Sejnowski. When optimization uses the natural gradient, or equivalently the relative gradient, the learning rule gives superior convergence. Simulations and results on real-world physiological data show the power of the proposed methods.

For the linear mixing and unmixing model, ICA uses the following assumptions:

  • The number of sources is less than or equal to the number of sensors (M ≤ N);
  • The sources s(t) are at each time instant mutually independent;
  • At most one source is normally distributed;
  • No sensor noise or only low additive noise signals are permitted.

NeuroCom implements the extended infomax version of ICA. This version can perform blind source separation of mixed sub- and super-Gaussian sources. The matrix W is the result of ICA processing; with this matrix NeuroCom calculates the EEG sources u (equation 1), and with the inverse matrix W^-1 NeuroCom calculates the signal composition x from the source set u (equation 3). The matrix W is found in an iterative process by maximizing the joint entropy H using the natural gradient, as follows:

ΔW ∝ [I − φ(u)×u^T]×W,       (4)

where φ(u) is the gradient vector of the log likelihood, called the score function.

An elegant way of parameterizing the learning rule (4) to separate mixed sub- and super-Gaussian sources has been proposed by Girolami and Fyfe: choosing negentropy as a projection pursuit index results in a simple form for φi(ui):

φi(ui) = ui + tanh(ui) for super-Gaussian sources,
φi(ui) = ui − tanh(ui) for sub-Gaussian sources,      (5)

giving:

ΔW ∝ [I − K×tanh(u)×u^T − u×u^T]×W,

where K is a diagonal matrix with elements sign(k4(ui)) and k4(ui) is the kurtosis of the source estimate ui.
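The iteration described above can be sketched in a few lines of numpy. This is a bare-bones illustration of the extended-infomax update (not NeuroCom's actual implementation; learning rate, iteration count, and the assumption of pre-whitened input are choices made here for the example):

```python
import numpy as np

def extended_infomax(x, lr=0.01, n_iter=200):
    """Sketch of extended-infomax ICA via the natural-gradient update.

    x: (n_channels, n_samples) zero-mean, whitened data.
    Returns an unmixing matrix W such that u = W @ x.
    Production implementations add whitening, step-size annealing,
    and convergence checks.
    """
    n, n_samples = x.shape
    W = np.eye(n)
    for _ in range(n_iter):
        u = W @ x
        # K: +1 for super-Gaussian estimates, -1 for sub-Gaussian
        kurt = np.mean(u ** 4, axis=1) - 3 * np.mean(u ** 2, axis=1) ** 2
        K = np.diag(np.sign(kurt))
        # Natural-gradient extended-infomax update (the rule above)
        grad = (np.eye(n)
                - K @ np.tanh(u) @ u.T / n_samples
                - u @ u.T / n_samples)
        W = W + lr * grad @ W
    return W
```

Because K flips the nonlinearity per component according to the sign of the kurtosis estimate, the same loop handles mixtures of sub- and super-Gaussian sources, which is the point of the extended rule.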

Using Independent Component Analysis in NeuroCom

A special processing window is provided for ICA analysis. In it you can analyze any part of a recorded probe, applying a rich filter set to reject noise and filter out insignificant frequency ranges. By means of the properties window you can tune the ICA algorithm for the best performance and calculation quality. As shown in the figure, you can select the probe range that interests you (left window) and calculate the decomposition (the ICA processor will find the ICA matrix W and calculate the source signals, which you will see in the right ICA window). For example, in the left window of the figure we see a probe with a strong artifact in all channels. This artifact is most significant in the frontal electrodes; from this observation we can assume that it is an eye artifact. After performing the ICA decomposition, we see the source set in the right window. The first source from the top of the window (ICA#1) is our artifact component (in the maps window we can see the dipole and total power maps of this component, which give information about its localization; in the tomograph window we can see its position in the brain). The other sources in the window are signals of brain activity (with the exception of ICA#14, which is an electrode artifact, as is clear from the tomograph window).

To eliminate the eye artifact from the signal, we have to construct the composition from all sources except ICA#1. The second figure shows the result of such a composition. As we can see, the probe no longer contains the eye artifact ICA#1 (the ruler is set at the place where the artifact maximum was located). If you see several artifact components in the observed time range, you can construct the composition without these components and in the same way suppress all artifacts. If there are artifacts in more than one time range, all of them can be suppressed step by step. If an artifact is observed during the whole probe, it can be suppressed in one pass (by selecting the whole probe as the source for the ICA decomposition).
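The composition-without-a-component step amounts to decomposing, zeroing the rejected sources, and applying the inverse matrix (equations 1 and 3). A minimal sketch, with hypothetical function and variable names (NeuroCom's API is not public):

```python
import numpy as np

def remove_components(x, W, reject):
    """Recompose multi-channel data without the rejected ICA components.

    x: (n_channels, n_samples) recorded data.
    W: (n, n) unmixing matrix from ICA, so u = W @ x are the sources.
    reject: list of source indices to drop (e.g. an eye-blink component).
    """
    u = W @ x                      # decompose into sources (eq. 1)
    u[reject, :] = 0.0             # zero out artifact components
    return np.linalg.inv(W) @ u    # recompose: x = W^-1 @ u (eq. 3)

# Toy example: two sources mixed into two channels.
rng = np.random.default_rng(2)
s = rng.normal(size=(2, 100))
A = np.array([[1.0, 0.8], [0.2, 1.0]])
x = A @ s
W = np.linalg.inv(A)               # pretend ICA found the true unmixing
cleaned = remove_components(x, W, reject=[0])
# The channels now carry only source 1's contribution.
print(np.allclose(cleaned, A[:, [1]] @ s[[1], :]))  # True
```

Suppressing an artifact in one time range only, as described above, just means applying this recomposition to that slice of the recording.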

Artifact suppression is only one application of the ICA approach. ICA can also be used for dipole source localization, research on brain activity, as an auxiliary method in event-related potential research, etc.

Tomograph subsystem

The tomograph subsystem was created to solve the task of brain activity localization. Localization of brain activity is calculated under the assumption that the activity of a group of neurons can be modeled by a dipole whose current distribution on the scalp is equivalent to the current distribution produced by the group of neurons. The input for the tomograph subsystem is the ICA matrix W, which consists of the weights of each source relative to the scalp channels. The tomograph subsystem calculates the position of a dipole in the brain from the known electrode positions and source weights (from the matrix W^T). If some ICA component is an electrode artifact, the ICA weights for this component have a distinctive distribution that can be detected by the program; the dipole for such a component is moved from the brain volume to the skin surface. The tomograph subsystem permits localizing the position of each independent source of the ICA decomposition.

The figures show an example of dipole localization. Each dipole is presented in two views: front view and top view. The dipole center (its position in the brain) is presented by a colored sphere. For easy work with the tomograph, a so-called active dipole is introduced. The active dipole is the one for which the a and b coordinates, the relative distance from the brain center, the dipole or total power map, and the position on the tomographic slices are shown. If the active dipole is an electrode artifact, a corresponding notification is shown at the top of the window. You can change the active dipole with the mouse; in this case, the brain model is rotated in both views so that you can see the dipole position in the best way. There is also a rotation mode in which the brain model keeps rotating until you stop it. In this mode you can completely inspect the location and relative position of all dipoles.

5. Conclusion

In summary, we have examined the NeuroCom program. We hope that this hardware-software complex will be useful for many investigations. The NeuroCom program is not frozen in its final shape and will be constantly improved in accordance with modern requirements.

6. References

  1. A.J. Bell and T.J. Sejnowski, "An information-maximization approach to blind separation and blind deconvolution," Neural Computation 7, pp. 1129-1159 (1995).
  2. T.-W. Lee, M. Girolami, A.J. Bell and T.J. Sejnowski, "A unifying information-theoretic framework for independent component analysis," Computers & Mathematics with Applications 39, pp. 1-21 (2000).
