Journal Papers

The list of journal papers (published and submitted) is given below.

  1. A. Dieuleveut, G. Fort, E. Moulines and H.-T. Wai. Stochastic Approximation beyond Gradient for Signal Processing and Machine Learning. IEEE Trans. Signal Processing, 71:3117-3148, 2023. HAL-03979922.
  2. G. Fort and E. Moulines. Stochastic Variable Metric Proximal Gradient with variance reduction for non-convex composite optimization. Submitted in September 2022, revised in December 2022. Accepted for publication in Statistics and Computing, March 2023. HAL-03781216.
  3. G. Fort, B. Pascal, P. Abry and N. Pustelnik. Covid19 Reproduction Number: Credibility Intervals by Blockwise Proximal Monte Carlo Samplers. IEEE Trans. Signal Processing, 71:888-900, 2023. HAL-03611079.
  4. G. Fort, E. Moulines and P. Gach. Fast Incremental Expectation Maximization for non-convex finite-sum optimization: non-asymptotic convergence bounds, with a Supplementary material. Matlab codes on Github. Revised in December 2020 under the title « Fast Incremental Expectation Maximization for finite-sum optimization: asymptotic convergence ». Accepted in Statistics and Computing, May 2021.
  5. S. Crepey, G. Fort, E. Gobet and U. Stazhynski. Uncertainty quantification for Stochastic Approximation limits using Chaos Expansion. SIAM/ASA Journal on Uncertainty Quantification, 8(3):1061-1089, 2020. HAL-01629952.
  6. D. Barrera, S. Crepey, B. Diallo, G. Fort, E. Gobet and U. Stazhynski. Stochastic Approximation Schemes for Economic Capital and Risk Margin Computations. ESAIM Proc. (CEMRACS 2017), 65:182-218, 2019.
  7. G. Fort, B. Jourdain, T. Lelièvre and G. Stoltz. Convergence and Efficiency of Adaptive Importance Sampling techniques with partial biasing. Journal of Statistical Physics, 171(2):220-268, 2018.
  8. G. Fort, E. Ollier and A. Leclerc-Samson. Stochastic Proximal Gradient Algorithms for Penalized Mixed Models. Statistics and Computing, 29(2):231-253, 2019. arXiv:1704.08891.
  9. G. Fort, E. Gobet and E. Moulines. MCMC design-based non-parametric regression for rare event. Application to nested risk computation. Monte Carlo Methods and Applications, 23(1):21–42, 2017.
  10. G. Morral, P. Bianchi and G. Fort. Success and Failure of Adaptation-Diffusion Algorithms for Consensus in Multi-Agent Networks. IEEE Trans. Signal Processing, 65(11):2798-2813, 2017.
  11. Y. Atchadé, G. Fort and E. Moulines. On perturbed proximal gradient algorithms. Submitted in February 2014 under the title « On stochastic proximal gradient algorithms ». JMLR, 18(10):1-33, 2017. arXiv:1402.2365 [math.ST].
  12. H. Braham, S. Ben Jemaa, G. Fort, E. Moulines and B. Sayrac. Spatial prediction under location uncertainty in cellular networks. IEEE Trans. Wireless Communications, 15(11):7633-7643, 2016. arXiv:1510.03638.
  13. H. Braham, S. Ben Jemaa, G. Fort, E. Moulines and B. Sayrac. Fixed Rank Kriging for Cellular Coverage Analysis. IEEE Trans. Vehicular Technology, 66(5):4212-4222, 2016. arXiv:1505.07062.
  14. A. Schreck, G. Fort, E. Moulines and M. Vihola. Convergence of Markovian Stochastic Approximation with discontinuous dynamics. Submitted in March 2014. SIAM J. Control Optim., 54(2):866-893, 2016. arXiv:1403.6803 [math.ST].
  15. G. Fort, B. Jourdain, T. Lelièvre and G. Stoltz. Self-Healing Umbrella Sampling: convergence and efficiency. Submitted in October 2014, revised in April 2015, accepted in November 2015. Statistics and Computing, 27(1):147-168, 2017. arXiv:1410.2109 [math.PR].
  16. A. Schreck, G. Fort, S. Le Corff and E. Moulines. A shrinkage-thresholding Metropolis adjusted Langevin algorithm for Bayesian variable selection. IEEE J. of Selected Topics in Signal Processing, 10(2):366-375, 2016. arXiv:1312.5658 [math.ST].
  17. A. Durmus, G. Fort and E. Moulines. Subgeometric rates of convergence in Wasserstein distance for Markov chains. Ann. Inst. Henri Poincaré, 52(4):1799-1822, 2016. arXiv:1402.4577 [math.PR].
  18. G. Fort. Central Limit Theorems for Stochastic Approximation with Controlled Markov Chain Dynamics. ESAIM: Probability and Statistics, 19:60-80, 2015. arXiv:1309.311C [math.PR].
  19. C. Andrieu, G. Fort and M. Vihola. Quantitative convergence rates for sub-geometric Markov chains. Journal of Applied Probability, 52(2):391-404, 2015. arXiv:1309.0622 [math.PR].
  20. G. Fort, B. Jourdain, E. Kuhn, T. Lelièvre and G. Stoltz. Convergence of the Wang-Landau algorithm. Math. Comp., 84:2297-2327, 2015. arXiv:1207.6880 [math.PR].
  21. G. Fort, B. Jourdain, E. Kuhn, T. Lelièvre and G. Stoltz. Efficiency of the Wang-Landau algorithm. Appl. Math. Res. Express, 2014(2):275-311, 2014. arXiv:1310.6550.
  22. R. Bardenet, O. Cappé, G. Fort and B. Kégl. Adaptive MCMC with Online Relabeling. Bernoulli, 21(3):1304-1340, 2015 (accepted for publication in 2013). arXiv:1210.2601 [stat.CO].
  23. P. Bianchi, G. Fort and W. Hachem. Performance of a Distributed Stochastic Approximation Algorithm. IEEE Trans. on Information Theory, 59(11):7405-7418, 2013.
  24. S. Le Corff and G. Fort. Online Expectation Maximization-based algorithms for inference in Hidden Markov Models. Electronic Journal of Statistics, 7:763-792, 2013. arXiv:1108.3968 [math.ST]. Supplement paper: arXiv:1108.4130 [math.ST].
  25. G. Fort, E. Moulines, P. Priouret and P. Vandekerkhove. A Central Limit Theorem for Adaptive and Interacting Markov Chains. Bernoulli, 20(2):457-485, 2014. arXiv:1107.2574. Supplement paper.
  26. A. Schreck, G. Fort and E. Moulines. Adaptive Equi-energy sampler: convergence and illustration. ACM Transactions on Modeling and Computer Simulation (TOMACS), 23(1):Article 5, 27 pages, 2013.
  27. S. Le Corff and G. Fort. Convergence of a particle-based approximation of the Block online Expectation Maximization algorithm. ACM Transactions on Modeling and Computer Simulation (TOMACS), 23(1):Article 2, 22 pages, 2013.
  28. G. Fort, E. Moulines, P. Priouret and P. Vandekerkhove. A simple variance inequality for U-statistics of a Markov chain with Applications. Statistics & Probability Letters, 82(6):1193-1201, 2012.
  29. G. Fort, E. Moulines and P. Priouret. Convergence of adaptive and interacting Markov chain Monte Carlo algorithms. Ann. Statist., 39(6):3262-3289, 2011. [Supplementary material]
  30. Y. Atchadé and G. Fort. Limit theorems for some adaptive MCMC algorithms with subgeometric kernels, part II. Bernoulli 18(3):975-1001, 2012.
  31. M. Kilbinger, D. Wraith, C.P. Robert, K. Benabed, O. Cappé, J.-F. Cardoso, G. Fort, S. Prunet and F.R. Bouchet. Bayesian model comparison in cosmology with Population Monte Carlo. MNRAS, 405(4):2381-2390, 2010. arXiv:0912.1614 [astro-ph.CO].
  32. P. Etoré, G. Fort, B. Jourdain and E. Moulines. On adaptive stratification. Annals of Operations Research, 189(1):127-154, 2011. arXiv:0809.1135 [math.PR].
  33. Y. Atchadé and G. Fort. Limit theorems for some adaptive MCMC algorithms with subgeometric kernels. Bernoulli, 16(1):116-154, 2010. arXiv:0807.2952 [math.PR].
  34. S. Connor and G. Fort. State-dependent Foster-Lyapunov criteria for subgeometric convergence of Markov chains. Stoch. Process. Appl., 119:4176-4193, 2009. arXiv:0901.2453 [math.PR].
  35. D. Wraith, M. Kilbinger, K. Benabed, O. Cappé, J.-F. Cardoso, G. Fort, S. Prunet and C.P. Robert. Estimation of cosmological parameters using adaptive importance sampling. Phys. Rev. D, 80(2), 2009.
  36. R. Douc, G. Fort, E. Moulines and P. Priouret. Forgetting of the initial distribution for Hidden Markov Models. Stoch. Process. Appl., 119(4):1235-1256, 2009. arXiv:math.ST/0703836.
  37. R. Douc, G. Fort and A. Guillin. Subgeometric rates of convergence of f-ergodic strong Markov processes. Stoch. Process. Appl., 119(3):897-923, 2009. arXiv:math.ST/0605791.
  38. G. Fort, S. Meyn, E. Moulines and P. Priouret. The ODE method for the stability of skip-free Markov Chains with applications to MCMC. Ann. Appl. Probab., 18(2):664-707, 2008.
  39. F. Forbes and G. Fort. A convergence theorem for Variational EM-like algorithms: application to image segmentation. IEEE Trans. on Image Processing, 16(3):824-837, 2007.
  40. G. Fort, S. Lambert-Lacroix and J. Peyre. Réduction de dimension dans les modèles généralisés : application à la classification de données issues de biopuces [Dimension reduction in generalized models: application to the classification of microarray data]. Journal de la SFDS, 146(1-2):117-152, 2005. Matlab code and Data set. Erratum on the research report TR0471.
  41. G. Fort and S. Lambert-Lacroix. Classification using Partial Least Squares with Penalized Logistic Regression. Bioinformatics, 21(7):1104-1111, 2005. Matlab codes and Data set.
  42. G. Fort and G.O. Roberts. Subgeometric ergodicity of strong Markov processes. Ann. Appl. Probab., 15(2):1565-1589, 2005.
  43. R. Douc, G. Fort, E. Moulines and P. Soulier. Practical drift conditions for subgeometric rates of convergence. Ann. Appl. Probab. 14(3):1353-1377, 2004.
  44. G. Fort, E. Moulines, G.O. Roberts and J.S. Rosenthal. On the geometric ergodicity of hybrid samplers. J. Appl. Probab., 40(1):123-146, 2003.
  45. G. Fort and E. Moulines. Polynomial ergodicity of Markov transition kernels. Stoch. Process. Appl., 103(1):57-99, 2003.
  46. G. Fort and E. Moulines. Convergence of the Monte Carlo EM for curved exponential families. Ann. Stat., 31(4):1220-1259, 2003.
  47. G. Fort and E. Moulines. V-subgeometric ergodicity for a Hastings-Metropolis algorithm. Stat. Probab. Lett. 49(4):401-410, 2000. 

 

MathSciNet: here

Google Scholar: here
