CSD Seminar webpage

by Simon Coste

Welcome to the CSD seminar webpage!

 

This seminar invites researchers from all domains related to Data Science, ranging from physics and mathematics to linguistics and cognitive science.

Talks usually take place on Thursdays from 2pm to 3pm in the center's conference room, but because of the many workshops and other seminars, they are sometimes moved to other days of the week.

Each talk lasts about 45 minutes and is followed by questions and discussion.

The CSD seminar team

 

List of seminars:

  • 2024/01/25, Francis Bach, TBA
  • 2023/12/14, Mathieu Desbrun, Blue Noise Sampling
  • 2023/11/30, Pierre-Alexandre Mattei, Are ensembles getting better all the time?
  • 2023/11/14, Antoine Maillard, Fitting ellipsoids to random points
  • 2023/10/17, Andrea Montanari, Sampling via diffusion processes: rigorous guarantees, hardness, disorder chaos
  • 2023/06/22, Soledad Villar, Machine learning and invariant theory
  • 2023/06/08, Antonio Silveti-Falls, Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems
  • 2023/05/11, Francois Lanusse, Deep Generative Models for Hybrid Physical/Data-Driven Bayesian Inference
  • 2023/04/27, Pierre Ablin, Training neural networks with orthogonal weights
  • 2023/04/13, Arthur Mensch, Flamingo: a Visual Language Model for Few-Shot Learning
  • 2023/03/16, Stéphanie Allassonnière, Data-Augmentation in High Dimensional Low Sample Size Setting Using a Geometry-Based Variational Autoencoder
  • 2023/03/02, James Thornton, Diffusion / Field based Reconstruction and Generation for 3D Shapes
  • 2023/02/16, Gaël Varoquaux, Embeddings to learn on messy relational data
  • 2023/01/19, Hervé Jégou, Learning image representations with coarse, instance-level and image-level supervision
  • 2022/10/20, Bruno Loureiro, Phase diagram of Stochastic Gradient Descent in high-dimensional two-layer neural networks
  • 2022/10/13, Raja Giryes, Sampling based analysis of neural network generalization and extrapolation
  • 2022/06/16, Francis Bach, The quest for adaptivity
  • 2022/06/09, Rachel Bawden, Low-resource MT: few-shot learning and historical language normalisation
  • 2022/06/02, Thomas Moreau, Convolutional Sparse Coding for Electromagnetic Brain Signals
  • 2022/04/21, Marylou Gabrié, Enhancing Sampling with Learning
  • 2022/03/24, Charles Martin, A Semi-Empirical Model for the Generalization Capacity of Deep Neural Networks
  • 2022/02/03, Marc Lelarge, Exploiting Graph Invariants in Deep Learning
  • 2022/01/27, Sebastian Goldt, The interplay of data structure and learning dynamics in simple neural networks
  • 2021/12/16, Dieuwke Hupkes, On locality, globality, consistency and compositionality in neural machine translation
  • 2021/12/09, François Charton, Deep learning for symbolic maths
  • 2021/11/26, Mathieu Wyart, What makes data learnable by deep learning
  • 2021/11/18, Rémi Monasson, Restricted Boltzmann Machines revisited: from sampling to design