libpyhat.transform.dim_reductions package

Submodules

libpyhat.transform.dim_reductions.jade module

class libpyhat.transform.dim_reductions.jade.JADE(n_components=4)[source]

Bases: object

fit(X, corrdata=None)[source]
transform(X)[source]
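This class presumably wraps jadeR (documented below) in a fit/transform interface. A minimal usage sketch follows; the data shape and the assumption that rows are samples (scikit-learn style) are not confirmed by the signatures above, so check the source for the expected orientation:

    import numpy as np
    from libpyhat.transform.dim_reductions.jade import JADE

    # Illustrative data: 100 observations (rows) of 50 variables (columns).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 50))

    # Assumed sklearn-style usage based on the fit/transform signatures above.
    jade = JADE(n_components=4)
    jade.fit(X)
    scores = jade.transform(X)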
libpyhat.transform.dim_reductions.jade.jadeR(X, m=None, verbose=True)[source]

Blind separation of real signals with JADE.

jadeR implements JADE, an Independent Component Analysis (ICA) algorithm developed by Jean-Francois Cardoso. See http://www.tsi.enst.fr/~cardoso/guidesepsou.html and the papers cited at the end of the source file.

Translated into NumPy from the original Matlab Version 1.8 (May 2005) by Gabriel Beckers, http://gbeckers.nl/pages/numpy_scripts/jadeR.py

Parameters:

X – an nxT data matrix (n sensors, T samples). May be a numpy array or matrix.

m – output matrix B has size mxn so that only m sources are extracted. This is done by restricting the operation of jadeR to the m first principal components. Defaults to None, in which case m=n.

verbose – print info on progress. Default is True.

Returns:

An m*n matrix B (NumPy matrix type), such that Y=B*X are separated sources extracted from the n*T data matrix X. If m is omitted, B is a square n*n matrix (as many sources as sensors). The rows of B are ordered such that the columns of pinv(B) are in order of decreasing norm; this has the effect that the most energetically significant components appear first in the rows of Y=B*X.

Quick notes (more at the end of this file):

o This code is for REAL-valued signals. A MATLAB implementation of JADE for both real and complex signals is also available from http://sig.enst.fr/~cardoso/stuff.html

o This algorithm differs from the first released implementations of JADE in that it has been optimized to deal more efficiently 1) with real signals (as opposed to complex) and 2) with the case when the ICA model does not necessarily hold.

o There is a practical limit to the number of independent components that can be extracted with this implementation. Note that the first step of JADE amounts to a PCA with dimensionality reduction from n to m (which defaults to n). In practice m cannot be very large (more than 40, 50, 60… depending on available memory).

o See more notes, references, and revision history at the end of this file, and more material on the web at http://sig.enst.fr/~cardoso/stuff.html

o For more info on the NumPy translation, see the end of this file.

o This code is supposed to do a good job! Please report any problems relating to the NumPy code to gabriel@gbeckers.nl

Copyright original Matlab code: Jean-Francois Cardoso <cardoso@sig.enst.fr>

Copyright NumPy translation: Gabriel Beckers <gabriel@gbeckers.nl>
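Grounded in the conventions stated above (nxT input, NumPy matrix return, Y=B*X), here is a minimal sketch of calling jadeR directly; the synthetic sources and mixing matrix are illustrative only:

    import numpy as np
    from libpyhat.transform.dim_reductions.jade import jadeR

    # Two synthetic sources mixed into three sensors (n=3, T=1000).
    t = np.linspace(0, 10, 1000)
    S = np.vstack([np.sin(2 * t), np.sign(np.cos(3 * t))])  # 2 x 1000 sources
    A = np.array([[1.0, 0.5],
                  [0.3, 1.0],
                  [0.7, 0.2]])                              # 3 x 2 mixing matrix
    X = A @ S                                               # 3 x 1000 observations

    # Request m=2 sources; per the docstring, B is a 2 x 3 NumPy matrix.
    B = jadeR(X, m=2, verbose=False)
    Y = np.asarray(B @ X)  # separated sources, 2 x 1000, most energetic first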

libpyhat.transform.dim_reductions.lfda module

Translated from https://github.com/cran/lfda/blob/master/R/lfda.R

Local Fisher Discriminant Analysis for Supervised Dimensionality Reduction

Performs local Fisher discriminant analysis (LFDA) on the given data.

LFDA is a method for linear dimensionality reduction that maximizes between-class scatter and minimizes within-class scatter while at the same time maintaining the local structure of the data, so that multimodal data can be embedded appropriately. Its limitation is that it only looks for linear boundaries between clusters; in that case, a non-linear version called kernel LFDA can be used instead. Three metric types can be used if needed.

Parameters:

x – n x d matrix of original samples, where n is the number of samples.

y – length-n vector of class labels.

r – dimensionality of the reduced space (default: d).

metric – type of metric in the embedding space (no default): 'weighted' for weighted eigenvectors, 'orthonormalized' for orthonormalized eigenvectors, 'plain' for raw eigenvectors.

knn – parameter used in the local scaling method (default: 5).

Returns:

T – d x r transformation matrix (Z = x * T)

Z – n x r matrix of dimensionality-reduced samples

Keywords: lfda, local Fisher discriminant transformation, mahalanobis metric

Author: Yuan Tang. See klfda for the kernelized variant of LFDA (kernel LFDA).

References:

Sugiyama, M. (2007). Dimensionality reduction of multimodal labeled data by local Fisher discriminant analysis. Journal of Machine Learning Research, vol. 8, 1027–1061.

Sugiyama, M. (2006). Local Fisher discriminant analysis for supervised dimensionality reduction. In W. W. Cohen and A. Moore (Eds.), Proceedings of the 23rd International Conference on Machine Learning (ICML 2006), 905–912.

class libpyhat.transform.dim_reductions.lfda.LFDA(r=None, metric='plain', knn=5)[source]

Bases: object

fit(x, y)[source]
transform(newdata=None)[source]
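A minimal usage sketch for the class, assuming the fit/transform interface above follows the module docstring's conventions (x is n x d, y holds class labels, Z = x * T); the synthetic data is illustrative only:

    import numpy as np
    from libpyhat.transform.dim_reductions.lfda import LFDA

    # Illustrative labeled data: 60 samples, 10 features, 3 classes.
    rng = np.random.default_rng(1)
    x = rng.standard_normal((60, 10))
    y = np.repeat([0, 1, 2], 20)

    # Assumed usage from the signatures above: fit learns the d x r
    # transformation matrix, transform projects data into the reduced space.
    lfda = LFDA(r=2, metric='plain', knn=5)
    lfda.fit(x, y)
    z = lfda.transform(x)  # expected shape: (60, 2)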
libpyhat.transform.dim_reductions.lfda.getAffinity(distance2, knn, nc)[source]

libpyhat.transform.dim_reductions.mnf module

class libpyhat.transform.dim_reductions.mnf.MNF(n_components=4)[source]

Bases: object

fit_transform(data)[source]

Description: Minimum Noise Fraction (MNF) wrapper for the pysptools implementation.

Rationale: Removes noise while preserving information.
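A minimal usage sketch, assuming pysptools is installed. The make_3d helper listed below suggests the wrapper reshapes 2-D input into the image cube pysptools expects, but the exact expected layout is an assumption; check the source:

    import numpy as np
    from libpyhat.transform.dim_reductions.mnf import MNF

    # Illustrative 2-D data: 100 spectra (rows) with 64 bands (columns).
    # Whether fit_transform expects 2-D spectra or a 3-D image cube is an
    # assumption; the module's make_3d helper hints that 2-D input is
    # reshaped internally before the pysptools MNF is applied.
    rng = np.random.default_rng(2)
    data = rng.standard_normal((100, 64))

    mnf = MNF(n_components=4)
    reduced = mnf.fit_transform(data)  # noise-ordered components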

libpyhat.transform.dim_reductions.mnf.make_3d(data)[source]
libpyhat.transform.dim_reductions.mnf.mnf(M, n_components)[source]

Module contents