Codes

MATLAB Codes

  1. Code and Data set
    • Paper 1 « Classification using Partial Least Squares with Penalized Logistic Regression », with S. Lambert-Lacroix
    • Paper 2 « Partial Least Squares for classification and feature selection in Microarray gene expression data »
    • Codes developed by G. Fort
  2. Code
    • Paper « A convergence theorem for Variational EM-like algorithms: Application to Image Segmentation », with F. Forbes
    • Codes developed by G. Fort
  3. Code
    • Paper « On adaptive stratification », with B. Jourdain, P. Etoré and E. Moulines
    • Codes developed by G. Fort
  4. PMC (Population Monte Carlo) applied to cosmology
    • Paper « Estimation of cosmological parameters using adaptive importance sampling », with D. Wraith, M. Kilbinger, K. Benabed, O. Cappé, J.F. Cardoso, S. Prunet and C.P. Robert
    • Codes developed by M. Kilbinger and K. Benabed (G. Fort contributed to preliminary versions of the codes, developed in Matlab).
  5. Codes
    • Paper 1 « Convergence of a particle-based approximation of the Block-Online Expectation Maximization algorithm », with S. Le Corff
    • Paper 2 « Online Expectation Maximization based algorithms for inference in Hidden Markov Models », with S. Le Corff
    • Paper 3 « New Online-EM algorithms for general Hidden Markov Models. Application to the SLAM », with S. Le Corff and E. Moulines
    • Codes developed by S. Le Corff — during his PhD under my co-supervision.
  6. Codes
    • Paper « Adaptive Equi-Energy sampler: convergence and illustration », with A. Schreck and E. Moulines
    • Codes developed by A. Schreck — during her PhD under my co-supervision
  7. Codes
    • Paper « Adaptive Metropolis with Online Relabeling », with R. Bardenet, O. Cappé and B. Kégl
    • Codes developed by R. Bardenet
  8. Codes
    • Paper « Shrinkage-Thresholding Metropolis-Adjusted Langevin Algorithm », with A. Schreck, S. Le Corff and E. Moulines.
    • Codes developed by A. Schreck — during her PhD under my co-supervision
  9. Codes on github
    • Paper « Fast Incremental Expectation Maximization for non-convex finite-sum optimization: non-asymptotic convergence bounds », with E. Moulines and P. Gach
    • Codes developed by G. Fort
  10. Codes (TBA)
    • Paper 1 « The Perturbed Prox-Preconditioned SPIDER algorithm for EM-based large scale learning », with E. Moulines
    • Paper 2 « The Perturbed Prox-Preconditioned SPIDER algorithm: non-asymptotic convergence bounds », with E. Moulines.
    • Paper 3 — TBA
    • Codes developed by G. Fort
  11. Codes on github
    • Paper 1- « Temporal Evolution of the Covid19 pandemic reproduction number: Estimations from Proximal optimization to Monte Carlo sampling », by P. Abry, G. Fort, B. Pascal and N. Pustelnik. Accepted for publication in EMBC 2022 proceedings.
    • Paper 2- « Credibility Interval Design for Covid19 Reproduction Number from nonsmooth Langevin-type Monte Carlo sampling », by H. Artigas, B. Pascal, G. Fort, P. Abry and N. Pustelnik. Accepted for publication in EUSIPCO 2022 proceedings.
    • Paper 3- « Estimation et Intervalles de crédibilité pour le taux de reproduction de la Covid19 par échantillonnage Monte Carlo Langevin Proximal » (in French), by P. Abry, G. Fort, B. Pascal and N. Pustelnik. Accepted for publication in GRETSI 2022 proceedings.
    • Paper 4- « Credibility intervals for the reproduction number of the Covid-19 pandemic using Proximal Langevin samplers », by P. Abry, G. Fort, B. Pascal and N. Pustelnik. Accepted for publication in EUSIPCO 2023 proceedings.
    • Paper 5- « Covid19 Reproduction Number: Credibility Intervals by Blockwise Proximal Monte Carlo samplers », by G. Fort, B. Pascal, P. Abry and N. Pustelnik. Accepted for publication in IEEE Transactions on Signal Processing.
    • Codes developed by G. Fort (some of them jointly with H. Artigas).