A MILES paper on LLMs accepted at ACL 2024

The paper Exploiting Precision and Recall to assess the quality and diversity of LLMs, by Florian Le Bronnec, Alexandre Vérine, Benjamin Negrevergne, Yann Chevaleyre, and Alexandre Allauzen, has been accepted to the 62nd Annual Meeting of the Association for Computational Linguistics (ACL 2024)!

This paper adapts the precision and recall metrics used in image generation to the context of text generation, with experiments on large language models (LLMs) such as Llama 2 and Mistral. Congrats to all, and especially to Florian Le Bronnec and Alexandre Vérine for leading the effort!
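For readers unfamiliar with these metrics, the support-based notion from the image-generation literature (Sajjadi et al., 2018; Kynkäänniemi et al., 2019), which this line of work builds on, can be sketched as follows; the exact estimator used in the paper may differ. Writing P for the distribution of human-written text and Q for the distribution of model generations,

\[
\mathrm{Precision}(Q, P) = \mathbb{E}_{x \sim Q}\big[\mathbf{1}\{x \in \mathrm{supp}(P)\}\big],
\qquad
\mathrm{Recall}(Q, P) = \mathbb{E}_{x \sim P}\big[\mathbf{1}\{x \in \mathrm{supp}(Q)\}\big].
\]

Precision captures quality (the fraction of generations that fall within the support of real text), while recall captures diversity (the fraction of real text that the model is able to produce); in practice both are estimated in an embedding space.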


Best paper award @ EACL 2024

Congratulations to Florian Le Bronnec, PhD student and MILES member, on receiving the best paper award at the 18th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2024)!

This paper is joint work with Alexandre Allauzen, as well as colleagues from Sorbonne Université, Criteo, and Singapore.

Check out their work here: LOCOST: State-Space Models for Long Document Abstractive Summarization

MILES @ NeurIPS 2023

MILES members are attending the Neural Information Processing Systems (NeurIPS) conference in New Orleans, LA, USA! Be sure to check their posters:


MILES @ ICLR 2023

Two papers co-authored by MILES members have been accepted to the 2023 edition of ICLR (International Conference on Learning Representations):

Congratulations to all!

MILES at NeurIPS 2022

Six papers featuring members of MILES (and very recent alumni) were accepted at NeurIPS 2022, one of the flagship conferences in machine learning! Below are the papers (with MILES authors in bold).

Congratulations to all, and kudos to all authors who submitted to NeurIPS this year!


MILES Seminar: Makoto Yamada

MILES has the great pleasure of hosting Dr. Makoto Yamada on September 22 for a guest seminar!

Title: Approximating 1-Wasserstein Distance with Trees

Abstract: The Wasserstein distance, which measures the discrepancy between distributions, is effective in a wide range of natural language processing (NLP) and computer vision (CV) applications. One of the challenges in estimating the Wasserstein distance is that it is computationally expensive and does not scale well to many distribution-comparison tasks. In this talk, I propose a learning-based approach to approximate the 1-Wasserstein distance with trees. I then demonstrate that the proposed approach can accurately approximate the original 1-Wasserstein distance on NLP tasks. (https://arxiv.org/abs/2206.12116)
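For context, a standard identity from the optimal-transport literature explains why trees help; this is background, not necessarily the exact formulation of the talk, and the notation \Gamma_e is ours. When the ground metric is a tree metric with edge weights w_e, the 1-Wasserstein distance between two distributions \mu and \nu supported on the nodes of the tree T has the closed form

\[
W_1^{T}(\mu, \nu) \;=\; \sum_{e \in E(T)} w_e \,\big|\, \mu(\Gamma_e) - \nu(\Gamma_e) \,\big|,
\]

where \Gamma_e denotes the set of nodes lying below edge e (the subtree cut off by removing e). This sum can be evaluated in time linear in the number of tree nodes, instead of solving a costly optimal-transport problem; the learning-based part then amounts to choosing the tree structure and edge weights so that W_1^T matches the original 1-Wasserstein distance as closely as possible.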

Speaker bio: Makoto Yamada received his Ph.D. degree in statistical science from The Graduate University for Advanced Studies (SOKENDAI, The Institute of Statistical Mathematics), Tokyo, in 2010. He is currently a team leader at RIKEN AIP, an associate professor at Kyoto University, and a transitional associate professor at the Okinawa Institute of Science and Technology (OIST). His research interests include machine learning and its applications to biology, natural language processing, and computer vision. He has published more than 50 research papers in premier conferences and journals such as NeurIPS, AISTATS, ICML, AAAI, IJCAI, and TPAMI, and won the WSDM 2016 Best Paper Award.

The seminar will take place on September 22 at 3pm in room A707 of Université Paris Dauphine-PSL.

MILES seminars are back!

As part of its weekly group meeting, MILES is glad to announce three seminars in the upcoming weeks!


  • September 30, 2021: In-person seminar by Ruben Ohana (ENS & LightOn)
    Training neural networks with Direct Feedback Alignment: theory and applications in adversarial robustness
  • October 7, 2021: In-person seminar by Laurent Daudet (LightOn)
    Optical random projections: scaling up randomized algorithms for scientific computing and machine learning
  • October 14, 2021: Online seminar by Lindon Roberts (ANU, Australia)
    Inexact Derivative-Free Optimization for Bilevel Learning

Want to give a seminar for our group and beyond? Please send an email to clement.royer@dauphine.psl.eu.