{"id":641,"date":"2021-04-13T20:52:55","date_gmt":"2021-04-13T20:52:55","guid":{"rendered":"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/?page_id=641"},"modified":"2022-02-06T19:27:02","modified_gmt":"2022-02-06T19:27:02","slug":"documents","status":"publish","type":"page","link":"https:\/\/quentin-duchemin.alwaysdata.net\/wiki\/index.php\/documents\/","title":{"rendered":"Documents"},"content":{"rendered":"\n<p>Here I share some works done during my studies.<\/p>\n\n\n\n<p><strong>Some works done during my research Master (MVA)<\/strong><\/p>\n\n\n\n<p><a href=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/KERNEL_DM.pdf\" data-type=\"URL\" data-id=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/KERNEL_DM.pdf\">Project<\/a> for the course on Kernel Methods with the <a href=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/hw.pdf\" data-type=\"URL\" data-id=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/hw.pdf\">questions<\/a>.<\/p>\n\n\n\n<p><a href=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/maths_for_imaging.pdf\" data-type=\"URL\" data-id=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/maths_for_imaging.pdf\">Project<\/a> on a unifying framework for representer theorems.<\/p>\n\n\n\n<p>Homeworks (<a href=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/MVA_DM1_duchemin_oreistein.pdf\" data-type=\"URL\" data-id=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/MVA_DM1_duchemin_oreistein.pdf\">1<\/a>, <a href=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/MVA_DM2_DUCHEMIN_OREISTEIN.pdf\" data-type=\"URL\" data-id=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/MVA_DM2_DUCHEMIN_OREISTEIN.pdf\">2<\/a> and <a href=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/MVA_DM3_DUCHEMIN_OREISTEIN.pdf\">3<\/a>) related to the course &#8220;Probabilistic Graphical Models&#8221;<\/p>\n\n\n\n<p><a href=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/poster-malap.pdf\" data-type=\"URL\" data-id=\"http:\/\/quentin-duchemin.alwaysdata.net\/wiki\/wp-content\/uploads\/2021\/04\/poster-malap.pdf\">Poster<\/a> done for a Machine-Learning project on classification by diffusion. <\/p>\n\n\n\n<p><strong>A list of good references that I read during my master and my PhD with some comments <\/strong><\/p>\n\n\n\n<ul><li>General references on the theoretical aspects of Machine Learning and data-science <br>&#8211; A large list of nice Theoretical Computer Science references are available <a href=\"https:\/\/github.com\/mostafatouny\/awesome-theoretical-computer-science\" data-type=\"URL\" data-id=\"https:\/\/github.com\/mostafatouny\/awesome-theoretical-computer-science\">here<\/a>. <br>&#8211; I recommand also the book recently written by Francis Bach &#8220;Learning from first Principles&#8221;<\/li><li>Markov chains<br>&#8211; Meyn &amp; Tweedie&#8217;s book: A good reference for theoretical aspects on (discrete time) Markov chains on general state space<br>&#8211; Levin&#8217;s book on mixing time for Markov chains. This book deals mainly with Markov chain with finite state space and introduces important notions related to mixing like the spectral gap of a transition kernel or the Cheeger constant. 
<br>– For a friendlier version of the previous book, I recommend the lectures of Justin Salez (from a course at Sorbonne University).
<br>– For hidden Markov models, one can read the book written by Eric Moulines on the subject.</li>
<li>Concentration of measure
<br>– P. Massart’s book is the main reference on the subject.
<br>– I also recommend the lecture notes by Anna Ben-Hamou for a more synthetic introduction to this topic.</li>
<li>Convex Optimization
<br>– S. Bubeck’s book is a nice reference to understand the most important algorithms in convex optimization.
<br>– Boyd’s book is one of the most famous references in the field of convex optimization.</li>
<li>General books on Optimization
<br>– Numerical Optimization by Nocedal and Wright.
<br>– Convex Analysis and Monotone Operator Theory in Hilbert Spaces by Bauschke and Combettes is a great reference to understand ADMM, proximal algorithms or the Douglas-Rachford algorithm.</li>
<li>High-dimensional statistics
<br>– The book by R. Vershynin introduces important concepts of high-dimensional statistics, with a specific focus on geometry, concentration of measure and their interactions. It presents in particular sub-Gaussian and sub-exponential random variables, concentration inequalities for quadratic forms (the Hanson-Wright inequality, recalled after this list), and the Slepian, Gordon, Sudakov and Dudley inequalities (and much more).
<br>– To focus specifically on random matrices, there are the Saint-Flour lecture notes of Alice Guionnet.
<br>– Wainwright, High-Dimensional Statistics: A Non-Asymptotic Viewpoint. This book covers a larger number of topics than the previous one, with chapters on RKHS or PCA. It also spends several chapters presenting theoretical results for sparse linear models in high dimensions. The author works in the most general framework possible, which makes some sections difficult to grasp.
<br>– Giraud, Introduction to High-Dimensional Statistics.
<br>– Giné, Mathematical Foundations of Infinite-Dimensional Statistical Models.</li>
<li>Nonparametric statistics
<br>– Nonparametric inference, Tsybakov.
<br>– The lecture notes written by John Duchi were also really helpful for me.</li>
<li>Sparsity and compressed sensing
<br>– Statistical Learning with Sparsity, Hastie: one of the few books that present some “recent” and important topics such as post-selection inference and the polyhedral lemma.
<br>– The books by Sara van de Geer are widely used in the community (but are technically difficult).
<br>– Claire Boyer’s lecture notes on compressed sensing.</li>
<li>Multiple testing procedures
<br>– The lecture notes by Emmanuel Candès offer a really nice historical overview of the subject. Particular attention is given to the work done by Pr. Candès himself (such as the knockoff filter).
<br>– The recent “Handbook of Multiple Comparisons”, edited by Cui, Dickhaus et al., presents advanced research topics.</li>
<li>Probability theory
<br>– The books by Billingsley are often used as references for master courses on probability theory and limit theorems.
<br>– Giné, Decoupling: From Dependence to Independence, a book introducing decoupling methods.</li>
<li>Stochastic Calculus
<br>– Marc Yor’s book on Continuous Martingales and Brownian Motion describes in detail a variety of techniques used by probabilists in the field of random processes.
This is not a book for beginners on the subject; it is intended more for doctoral students or researchers in the field.
<br>– Jean-François Le Gall’s lecture notes are a great reference for people who do not work specifically on stochastic calculus.</li>
<li>Optimal Transport
<br>– A nice reference, which was the subject of a reading group during my master MVA with Professor Vianney Perchet, is the book written by Santambrogio: Optimal Transport for Applied Mathematicians.
<br>– For a more computational viewpoint (and maybe an easier-to-grasp presentation of the concepts from a theoretical point of view), I recommend the book by Gabriel Peyré and Marco Cuturi, “Computational Optimal Transport”.
<br>– The lecture notes by Lénaïc Chizat give an impressively well-structured summary of the two previous references.</li>
</ul>
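<p>As a quick reminder of the spectral-gap bound on mixing times mentioned in the Markov chains item above (in the form I recall from Levin and Peres’s book, so to be double-checked there): for an irreducible, aperiodic, reversible chain on a finite state space with stationary distribution \(\pi\) and absolute spectral gap \(\gamma\),</p>

\[ t_{\mathrm{mix}}(\varepsilon) \;\le\; \frac{1}{\gamma}\,\log\!\left(\frac{1}{\varepsilon\,\pi_{\min}}\right), \qquad \pi_{\min} = \min_x \pi(x). \]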
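<p>Similarly, here is the Hanson-Wright inequality mentioned in the high-dimensional statistics item, in the form I recall from Vershynin’s book (the constant \(c>0\) is universal): if \(X=(X_1,\dots,X_n)\) has independent, mean-zero coordinates with sub-Gaussian norms bounded by \(K\), and \(A\) is a fixed \(n \times n\) matrix, then for all \(t \ge 0\),</p>

\[ \mathbb{P}\Big( \big| X^\top A X - \mathbb{E}\,X^\top A X \big| \ge t \Big) \;\le\; 2\exp\!\left( -c\,\min\!\left( \frac{t^2}{K^4 \|A\|_F^2}, \; \frac{t}{K^2 \|A\|} \right) \right), \]

<p>where \(\|A\|_F\) and \(\|A\|\) denote the Frobenius and operator norms.</p>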