Dimension reduction (excerpts from the LJK evaluation report)

In most high-dimensional problems, samples are sparse and classical estimators are therefore unreliable: this is the so-called curse of dimensionality. Under the assumption that high-dimensional phenomena lie near subspaces of lower dimension, these problems can be tackled with dimension reduction strategies. The SAM and Mistis teams both worked on the estimation of effective dimension reduction subspaces, with hyperspectral imaging and pattern recognition as the main applications. Compressed sensing techniques have also been developed through various extensions of the Dantzig selector and of the Lasso method.
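The working assumption above, namely that high-dimensional data concentrate near a low-dimensional subspace, can be illustrated with a minimal synthetic sketch (all numbers here are hypothetical, chosen only for illustration): PCA on noisy data generated from a 3-dimensional subspace of a 50-dimensional ambient space recovers the effective dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: n points in ambient dimension D = 50 that actually
# lie near a d = 3 dimensional linear subspace, plus small isotropic noise.
n, D, d = 200, 50, 3
basis = np.linalg.qr(rng.standard_normal((D, d)))[0]   # orthonormal subspace basis
latent = rng.standard_normal((n, d))
X = latent @ basis.T + 0.01 * rng.standard_normal((n, D))

# PCA (via SVD of the centered data) recovers the effective dimension:
Xc = X - X.mean(axis=0)
s = np.linalg.svd(Xc, compute_uv=False)
explained = (s**2) / (s**2).sum()

# The first d components carry almost all the variance.
print(explained[:d].sum())
```

With this noise level the three leading components explain essentially all of the variance, which is exactly the regime in which dimension reduction strategies are effective.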

– High dimensionality and nonparametric statistics: high-dimensional supervised and unsupervised clustering.
Stéphane Girard proposed Gaussian models for high-dimensional data based on a parsimonious parametrisation of the covariance matrix. The model has been applied in supervised, unsupervised and semi-supervised classification contexts [hal-inria-00071243v1, hal-00022183v1, hal-00325263v1]. The associated R software (HDDA/HDDC package) is described in [hal-00541203v3]. The approach has been adapted to the classification of spectroscopy data [hal-00459947v2], where the observations are curves (joint work with J. Jacques from INRIA team MODAL in Lille). A dimension selection technique has also been proposed [hal-00440372v3] in collaboration with G. Celeux from INRIA team SELECT in Orsay. Recently, the approach was extended to non-Gaussian distributions through the use of kernel techniques [hal-00687304v1].
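The flavour of such parsimonious Gaussian models can be sketched as follows. This is an illustrative re-implementation of the general idea, not the published HDDA/HDDC algorithm: each class covariance is parametrised by d leading eigenpairs (the class-specific subspace) plus a single pooled noise variance for the remaining directions, and classification uses the resulting Gaussian log-density.

```python
import numpy as np

def fit_parsimonious_gaussian(X, d):
    """Fit a simplified 'signal subspace + isotropic noise' Gaussian to X.

    Minimal sketch in the spirit of the HDDA/HDDC parsimonious models:
    d leading eigenpairs of the class covariance, plus one noise level b
    shared by all remaining directions.
    """
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    vals, vecs = vals[::-1], vecs[:, ::-1]   # decreasing eigenvalue order
    a = vals[:d]                             # signal variances
    b = vals[d:].mean()                      # pooled noise variance
    return mu, vecs[:, :d], a, b

def log_density(x, mu, W, a, b):
    """Gaussian log-density (up to a constant) under the parsimonious model."""
    D, d = x.size, W.shape[1]
    z = x - mu
    proj = W.T @ z
    # Quadratic form split into signal subspace and orthogonal complement.
    quad = np.sum(proj**2 / a) + (z @ z - proj @ proj) / b
    logdet = np.sum(np.log(a)) + (D - d) * np.log(b)
    return -0.5 * (quad + logdet)

# Two well-separated synthetic classes in dimension 20, intrinsic dimension 2.
rng = np.random.default_rng(1)
X0 = rng.standard_normal((100, 20))
X1 = rng.standard_normal((100, 20)) + 3.0
params = [fit_parsimonious_gaussian(X, 2) for X in (X0, X1)]
x = X1[0]
pred = int(np.argmax([log_density(x, *p) for p in params]))
```

The point of the parametrisation is that only d eigenpairs per class are estimated instead of a full covariance matrix, which keeps the model identifiable when the ambient dimension is large relative to the sample size.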

– High-dimensional regression.

While regression has been extensively studied, situations where the input variable is high-dimensional, and where the response variable may not be fully observed, still challenge the current state of the art. The Mistis team addressed this point along two lines.
  • On the one hand, Stéphane Girard extended the standard Sliced Inverse Regression method to make it tractable in real-data problems, namely with very high-dimensional inputs [inria-00180458v3] or multivariate outputs [hal-00714981v3], and for data streams [hal-00688609v3]. Both contributions were joint work with J. Saracco from INRIA team CQFD in Bordeaux. An application to the estimation of dominant parameters for leakage variability in micro-electronics [hal-00846806v1] was carried out in collaboration with STMicroelectronics in the context of the PhD thesis of S. Joshi. In the context of the PhD of A. Chiancone (started Oct. 2013), application to hyperspectral image analysis is being investigated.
  • On the other hand, starting from standard mixtures of linear regressions, Florence Forbes proposed a novel mixture of locally-linear regression models that unifies regression and dimensionality reduction in a common framework [hal-00863468v3]. The approach compares favourably to a number of existing regression techniques and was applied to the retrieval of Mars surface physical properties from hyperspectral images [hal-00863468v3] and to sound source separation [hal-00960796v1]. The model has been implemented in a Matlab toolbox (GLLiM) available at https://team.inria.fr/perception/gllim_toolbox/.
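For readers unfamiliar with the starting point of the first line of work, here is a minimal sketch of the classical Sliced Inverse Regression procedure of Li (1991), on which the extensions above build (the toy single-index model and all parameters are illustrative, not taken from the cited papers):

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Textbook Sliced Inverse Regression (Li, 1991): whiten X, average the
    whitened predictors within slices of the sorted response, and take the
    leading eigenvectors of the between-slice covariance."""
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    inv_sqrt = vecs @ np.diag(vals**-0.5) @ vecs.T   # Sigma^{-1/2}
    Z = (X - mu) @ inv_sqrt
    order = np.argsort(y)
    M = np.zeros((p, p))
    for sl in np.array_split(order, n_slices):
        m = Z[sl].mean(axis=0)
        M += (len(sl) / n) * np.outer(m, m)          # weighted slice means
    evals, evecs = np.linalg.eigh(M)
    B = inv_sqrt @ evecs[:, ::-1][:, :n_dirs]        # back to original scale
    return B / np.linalg.norm(B, axis=0)

# Toy single-index model: y depends on X only through beta' X.
rng = np.random.default_rng(2)
X = rng.standard_normal((500, 5))
beta = np.array([1.0, 2.0, 0.0, 0.0, 0.0]) / np.sqrt(5.0)
y = (X @ beta) ** 3 + 0.1 * rng.standard_normal(500)
b_hat = sir_directions(X, y)[:, 0]
# b_hat aligns with beta up to sign.
```

The regularised, streaming and multivariate-output versions cited above address exactly the regimes where this basic recipe breaks down: when the covariance matrix cannot be reliably inverted or when the data arrive sequentially.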

– Dimensionality reduction for functional models.

A. Antoniadis, P. Fryzlewicz (London School of Economics) and F. Letué adapted the Candès-Tao “Dantzig selector” algorithm to the framework of the nonparametric Cox model and applied it to microarray gene expression problems [hal-00568233v1, hal-00853895v1, hal-00379716v1].
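In its original linear-model form, the Dantzig selector is a linear program: minimise the L1 norm of the coefficients subject to a sup-norm bound on the correlation of the residual with the design. A minimal sketch (for the classical linear model, not the nonparametric Cox adaptation cited above; the toy data are hypothetical):

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, lam):
    """Solve  min ||beta||_1  s.t.  ||X'(y - X beta)||_inf <= lam
    as a linear program, writing beta = u - v with u, v >= 0."""
    n, p = X.shape
    G = X.T @ X
    b = X.T @ y
    # Constraints:  G(u - v) - b <= lam   and   b - G(u - v) <= lam.
    A_ub = np.block([[G, -G], [-G, G]])
    b_ub = np.concatenate([lam + b, lam - b])
    c = np.ones(2 * p)                      # minimises sum(u) + sum(v) = ||beta||_1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    z = res.x
    return z[:p] - z[p:]

# Noiseless sparse recovery on a small synthetic problem.
rng = np.random.default_rng(3)
X = rng.standard_normal((50, 10))
beta_true = np.zeros(10)
beta_true[0], beta_true[3] = 2.0, -1.0
y = X @ beta_true
beta_hat = dantzig_selector(X, y, lam=0.1)
```

The LP structure is what made the Cox-model adaptation attractive: the selector inherits off-the-shelf convex solvers rather than requiring a bespoke optimisation routine.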

M. Giacofci worked on dimensionality reduction in wavelet-based mixed effects models for genomics data [tel-00987441v1].

S. Lambert-Lacroix (TIMC-Grenoble) and L. Zwald combined Huber’s criterion with Lasso to create a Least Absolute Deviation regression technique that is resistant to heavy-tailed errors and outliers [hal-00661864v1].
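The combination of a robust loss with an L1 penalty can be sketched with a simple proximal-gradient (ISTA) iteration: a gradient step on the Huber loss followed by soft thresholding. This is an illustrative re-implementation of the general idea, not the published Lambert-Lacroix & Zwald algorithm, and all tuning constants below are hypothetical.

```python
import numpy as np

def huber_lasso(X, y, lam, delta=1.0, n_iter=1000):
    """Proximal-gradient sketch of Huber-loss regression with an L1 penalty.
    The Huber loss is quadratic for residuals below delta and linear beyond,
    which bounds the influence of outliers."""
    n, p = X.shape
    step = 1.0 / np.linalg.norm(X, 2) ** 2      # 1 / Lipschitz constant
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        psi = np.clip(r, -delta, delta)         # Huber influence function
        grad = -X.T @ psi                       # gradient of the smooth part
        z = beta - step * grad
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return beta

# Sparse regression with gross outliers in the response.
rng = np.random.default_rng(5)
X = rng.standard_normal((200, 5))
beta_true = np.array([2.0, 0.0, 0.0, -1.0, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(200)
y[:10] += 20.0                                  # ten gross outliers
beta_hat = huber_lasso(X, y, lam=8.0)
```

Because the influence function is clipped, the ten corrupted observations contribute a bounded term to the gradient, so the active coefficients are recovered where an ordinary least-squares Lasso would be badly biased.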

With Z. Harchaoui (LEAR team, LJK), F. Enikeeva studied the problem of detecting abrupt multidimensional changes of mean in sequences of Gaussian vectors. Their decision procedure provides optimal performance for both highly and moderately sparse changes, and it is rate-optimal in the minimax sense [hal-00933185v1].
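The basic object in this problem can be sketched with a max-type scan statistic: for each candidate change point, compare the means before and after, normalised so the statistic is centred under the no-change hypothesis. This is an illustrative single-change scan, not the minimax procedure of the cited paper, and the planted change below is hypothetical.

```python
import numpy as np

def scan_change_point(X):
    """Scan statistic for one change of mean in a sequence of Gaussian
    vectors. Under no change, w * ||m1 - m2||^2 is chi-square with p degrees
    of freedom, so (.- p) / sqrt(2p) is approximately standard normal."""
    n, p = X.shape
    best_t, best_stat = None, -np.inf
    for t in range(1, n):
        m1, m2 = X[:t].mean(axis=0), X[t:].mean(axis=0)
        w = t * (n - t) / n                      # variance normalisation
        stat = (w * np.sum((m1 - m2) ** 2) - p) / np.sqrt(2 * p)
        if stat > best_stat:
            best_t, best_stat = t, stat
    return best_t, best_stat

# Sequence of 100 Gaussian vectors in dimension 20, with a sparse mean shift
# (first 5 coordinates) planted at time 60.
rng = np.random.default_rng(6)
X = rng.standard_normal((100, 20))
X[60:, :5] += 1.0
t_hat, stat = scan_change_point(X)
```

The sparse-versus-dense distinction studied in [hal-00933185v1] enters precisely here: when the shift touches only a few coordinates, summing over all p of them dilutes the signal, and the optimal procedure adapts the statistic to the unknown sparsity.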

In the context of dimensionality reduction for nonparametric estimation, A. Iouditski, with V. Spokoiny and co-workers [hal-00978264v1, hal-00981927v1], aimed to estimate the “Effective Dimension Reduction” (EDR) subspace: the subspace of the observation universe that is “charged” by the function or features of interest. The current state of this research, the most recent algorithms for dimensionality reduction by direct estimation of the EDR subspace using semidefinite programming, and an application of these techniques to genomic data are described in [hal-00381120v1].

– Certified modelling for compressed sensing.

With A. Nemirovski and F. Kilinç-Karzan, A. Iouditski studied the computational tractability of sensor synthesis for compressed sensing [hal-00978268v1, hal-00978266v1, hal-00976864v1, hal-00372141v2, hal-00450983v2, hal-00981868v1, hal-00981904v1, hal-00981896v1, hal-00981921v1, hal-00981926v1, hal-00981929v1, hal-00981928v1, hal-00981940v1]. One statistical outcome of this research is a family of new optimal methods for sparse recovery based on L1 minimisation. These are closely related to the classical Lasso and the Dantzig selector, but they outperform these “conventional” techniques by exploiting a computational analysis of the sensing matrix.
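The simplest computable certificate for a sensing matrix, and a useful point of comparison for the much tighter LP-based conditions developed in the cited works, is the classical mutual coherence (this sketch shows only that classical baseline, not the authors' certificates):

```python
import numpy as np

def mutual_coherence(A):
    """Largest absolute inner product between distinct normalised columns.
    Efficiently computable but conservative: the Donoho-Elad condition
    guarantees exact L1 recovery of s-sparse signals when s < (1 + 1/mu)/2."""
    cols = A / np.linalg.norm(A, axis=0)
    G = np.abs(cols.T @ cols)
    np.fill_diagonal(G, 0.0)
    return G.max()

# Coherence of a random Gaussian sensing matrix with 64 measurements
# and 128 unknowns, and the sparsity level it certifies.
rng = np.random.default_rng(4)
A = rng.standard_normal((64, 128))
mu = mutual_coherence(A)
s_max = int((1 + 1 / mu) / 2)
```

The gap between what coherence certifies and what such matrices actually achieve is exactly what motivates the computational analysis of the sensing matrix mentioned above: verifiable LP-based conditions certify recovery at substantially higher sparsity levels than the coherence bound.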

– Adaptive nonparametric estimation.

C. Dion developed an adaptive nonparametric deconvolution technique based on the recent Goldenshluger-Lepski selection method [hal-01023300v3].
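The spirit of such selection methods can be conveyed with a simplified Lepski-type bandwidth rule for kernel density estimation: accept the largest bandwidth whose estimate is consistent, up to a variance-driven tolerance, with every smaller bandwidth. This is a deliberately simplified sketch, not the Goldenshluger-Lepski device of the cited work (which applies to deconvolution), and the tuning constant kappa is purely illustrative.

```python
import numpy as np

def kde(x_grid, data, h):
    """Gaussian kernel density estimate evaluated on x_grid."""
    u = (x_grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def lepski_bandwidth(data, x_grid, bandwidths, kappa=0.3):
    """Simplified Lepski-type selection: largest h whose estimate agrees with
    all smaller-bandwidth estimates within the sum of their variance proxies."""
    n = len(data)
    hs = np.sort(bandwidths)[::-1]                      # largest (smoothest) first
    V = lambda h: kappa * np.sqrt(np.log(n) / (n * h))  # variance proxy
    estimates = {h: kde(x_grid, data, h) for h in hs}
    for i, h in enumerate(hs):
        ok = all(
            np.max(np.abs(estimates[h] - estimates[h2])) <= V(h) + V(h2)
            for h2 in hs[i + 1:]
        )
        if ok:
            return h
    return hs[-1]

# Bandwidth selection on a standard normal sample (hypothetical grid).
rng = np.random.default_rng(7)
data = rng.standard_normal(500)
x_grid = np.linspace(-3.0, 3.0, 121)
bandwidths = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 1.0])
h_sel = lepski_bandwidth(data, x_grid, bandwidths)
```

The comparison structure is the key idea: bias only reveals itself when a smoother estimate disagrees with a rougher one by more than sampling noise can explain, so no knowledge of the unknown smoothness is needed.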

With A. Nemirovski (Georgia Institute of Technology), A. Iouditski worked on minimax and adaptive affine estimators, showing that, under rather general assumptions and up to an absolute constant factor, the best affine estimator is minimax-optimal among all estimators [hal-00976658v1, hal-00981924v1]. The resulting estimators have been successfully applied to dynamical system identification [hal-00853893v1], testing and detection applications [hal-00978374v1, hal-00978362v1, hal-00981883v1], and distribution recovery from noisy observations [hal-00976668v1]. A. Iouditski and A. Nemirovski also studied the problem of denoising signals of unknown structure [hal-00318084v1, hal-00365531v1]. This relies on the adaptive reconstruction of an optimal linear filter from the observations (which thus depends on the unknown signal itself).
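The textbook one-dimensional case illustrates why affine rules are a natural class to optimise over (this is the classical shrinkage example, not the general Iouditski-Nemirovski construction): for y ~ N(theta, sigma^2) with |theta| <= tau, the worst-case risk of the affine rule a*y is a^2 sigma^2 + (1-a)^2 tau^2, minimised in closed form.

```python
def minimax_affine_mean(y, sigma, tau):
    """Minimax affine estimate of theta from y ~ N(theta, sigma^2), given the
    prior bound |theta| <= tau. The optimal shrinkage coefficient is
    a = tau^2 / (tau^2 + sigma^2)."""
    a = tau**2 / (tau**2 + sigma**2)
    return a * y

# Worst-case risk of the minimax affine rule vs. the naive estimate y itself,
# for sigma = tau = 1.
sigma, tau = 1.0, 1.0
a = tau**2 / (tau**2 + sigma**2)
affine_risk = a**2 * sigma**2 + (1 - a) ** 2 * tau**2   # worst case over |theta| <= tau
naive_risk = sigma**2
```

Here the affine rule halves the worst-case risk of the naive estimator; the result cited above says that in far more general settings, such affine optimisation loses at most an absolute constant factor against arbitrary (nonlinear) estimators.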
