Gilles Louppe Profile
Gilles Louppe

@glouppe

8,890 Followers · 2,284 Following · 232 Media · 5,836 Statuses

On a quest to accelerate Science with #AI. Professor of AI and deep learning @UniversiteLiege. Previously @CERN, @nyuniversity.

Liege, Belgium
Joined September 2009
Pinned Tweet
@glouppe
Gilles Louppe
1 month
Very excited to finally share the excellent work of my PhD students @RochmanOmer and @SachaLewin on NeuralMPM, a neural emulation framework for particle-based simulations!
3
24
116
@glouppe
Gilles Louppe
4 years
OpenAI's CLIP is confused by Belgian art 🤷‍♂️
61
941
5K
@glouppe
Gilles Louppe
6 years
In Belgium, scientific papers that originate from public funds can now all be made public in open access, regardless of any contract with publishers. It is written in the law. And retroactive.
@PaulThir
Paul Thirion
6 years
There it is: the Belgian federal law authorizing #OAGreen was published today in the Moniteur belge (art. 29, page 68691). ALL scientific articles by Belgian researchers can now be deposited in #OA in institutional repositories (#RI), with a maximum embargo of 6/12 months. Whatever contract was signed!
4
62
77
29
988
2K
@glouppe
Gilles Louppe
2 years
Excited to kick off the second term with my students! Time to teach Deep Learning again! This year, I have decided to say goodbye to RNNs and GANs and cover GNNs and score-based diffusion models instead 🤖💡
19
210
1K
@glouppe
Gilles Louppe
2 years
Bayesian nugget of the day: Metropolis-Hastings. 1) Draw a candidate θ from a proposal q(θ|θₜ), 2) Accept θ if u < π(θ)/π(θₜ) q(θₜ|θ)/q(θ|θₜ) for u~U[0,1], 3) Repeat. Beautifully, this results in a Markov chain whose stationary distribution coincides with the target π!
10
115
635
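A minimal sketch of the recipe in the tweet above, assuming a standard-normal target π and a symmetric Gaussian random-walk proposal (so the q ratio cancels); names and values are illustrative only.

```python
import numpy as np

def log_pi(theta):
    """Unnormalized log-density of the target, here a standard normal."""
    return -0.5 * theta ** 2

def metropolis_hastings(n_steps=10_000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    theta = 0.0
    chain = []
    for _ in range(n_steps):
        candidate = theta + step * rng.normal()        # 1) draw a candidate from q(.|theta_t)
        log_alpha = log_pi(candidate) - log_pi(theta)  # symmetric q: the q ratio cancels
        if np.log(rng.uniform()) < log_alpha:          # 2) accept if u < pi ratio
            theta = candidate
        chain.append(theta)                            # 3) repeat
    return np.array(chain)

samples = metropolis_hastings()
print(samples.mean(), samples.std())  # both close to the target's 0 and 1
```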
@glouppe
Gilles Louppe
3 years
Finally done with teaching my Deep Learning class for this term! This year's version includes new lectures on autodiff and transformers. All (787) slides can be found at 🧑‍🎓
6
95
450
@glouppe
Gilles Louppe
6 years
Scientific and societal impact of research comes rarely from where we expect. Open source software is one such example: today, Scikit-Learn just passed 10000 citations! Congratulations to all contributors for all the progress they triggered!
5
123
443
@glouppe
Gilles Louppe
6 years
Remember the Galton board? When nails are positioned such that the probability of bouncing left is always the same, the resulting distribution is a binomial, for which the likelihood p(x) of bin x is known analytically.
3
99
392
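A small illustration of the point above, simulating the bounces explicitly and comparing against the analytical binomial likelihood (board size and ball count are made up).

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)
n_rows, p_left, n_balls = 12, 0.5, 200_000

# Each ball bounces right with probability 1 - p_left at every row of nails;
# its bin index is the number of right bounces.
bins = (rng.random((n_balls, n_rows)) > p_left).sum(axis=1)
empirical = np.bincount(bins, minlength=n_rows + 1) / n_balls

# Analytical likelihood p(x) of landing in bin x: Binomial(n_rows, 1 - p_left).
analytical = binom.pmf(np.arange(n_rows + 1), n_rows, 1 - p_left)
print(np.abs(empirical - analytical).max())  # small for a large number of balls
```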
@glouppe
Gilles Louppe
3 years
Today I used @OpenAI Davinci model (GPT-3) for an actual Turing test in my first lecture of Introduction to AI. It turned out great! Davinci passed the test! 🤖
3
27
311
@glouppe
Gilles Louppe
7 years
Tomorrow I'll actually teach my first ever lecture on deep learning, rebuilding all the basic constructs, starting from LDA up to deep rectified networks. Lecture slides available at
10
78
304
@glouppe
Gilles Louppe
4 years
When your colleagues and friends send a welcome card to celebrate your newborn using the #NeurIPS2020 paper template ❤️ Thanks all! See you soon :)
12
12
291
@glouppe
Gilles Louppe
7 years
It's out! Our latest paper with @KyleCranmer : Adversarial Variational Optimization of Non-Differentiable Simulators
6
91
252
@glouppe
Gilles Louppe
2 months
Watching PhD students lose their last crumbs of illusion about science when faced with stubborn, unreasonable, or completely absent reviewers is heartbreaking. We've got to do better as a community. #NeurIPS2024
6
24
262
@glouppe
Gilles Louppe
2 years
Started training my baby hand-made 40M GPT model on the Harry Potter books 🤓 GPUs go brrrr! 🚀 (@PyTorch's nn.DataParallel makes multi-GPU support so easy <3)
9
19
235
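A hedged sketch of the multi-GPU pattern mentioned in the tweet; the model and shapes below are placeholders, not the 40M GPT. (Newer PyTorch code usually prefers DistributedDataParallel, but the one-liner below is the convenience the tweet praises.)

```python
import torch
import torch.nn as nn

# Placeholder model standing in for the GPT; any nn.Module works the same way.
model = nn.Sequential(nn.Embedding(256, 64), nn.Flatten(), nn.Linear(16 * 64, 256))

if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)   # replicate the module, split each batch across GPUs

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

tokens = torch.randint(0, 256, (32, 16), device=device)  # dummy batch of token ids
logits = model(tokens)               # the forward pass is unchanged by the wrapper
```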
@glouppe
Gilles Louppe
5 years
New paper by my student @WehenkelAntoine : Unconstrained Monotonic Neural Networks! In this work, we show how to make use of free-form neural networks to model arbitrary monotonic functions and derive a new autoregressive flow.
2
67
225
@glouppe
Gilles Louppe
6 years
Brilliant step-by-step review of optimization methods for large-scale machine learning, by Léon Bottou et al. Must read if you want to understand why SGD is actually a good optimization algorithm in ML!
1
74
228
@glouppe
Gilles Louppe
7 years
The Kalman Filter, as implemented in assembler in the Apollo Guidance Computer! Those little things you discover when preparing your next lecture on temporal models...
2
76
222
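For reference, a tiny one-dimensional version of the filter the Apollo code implemented, assuming a random-walk state model with made-up noise values.

```python
import numpy as np

def kalman_1d(observations, q=1e-3, r=0.1):
    x, p = 0.0, 1.0              # state mean and variance
    estimates = []
    for z in observations:
        p = p + q                # predict step (random-walk dynamics, process noise q)
        k = p / (p + r)          # Kalman gain (observation noise r)
        x = x + k * (z - x)      # update with the innovation z - x
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
signal = np.linspace(0.0, 1.0, 100)
print(kalman_1d(signal + 0.3 * rng.normal(size=100))[-1])  # tracks the noisy ramp
```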
@glouppe
Gilles Louppe
4 years
Who knew? Changing the reset gate in @kchonyc 's GRU so that the gate can take values in ]0,2[ instead of ]0,1[ makes the cell bistable, hence enabling long-lasting memories! Very nice work by my colleagues @vecovennicolas , @DamienERNST1 and Guillaume Drion!
@DamienERNST1
Damien ERNST
4 years
We have taken inspiration from biological neuron bistability to endow Recurrent Neural Nets (RNNs) with long-lasting memory at the cellular level. Excellent performance on time series that require very long memory. #DeepLearning #ArtificialIntelligence
2
51
228
2
53
221
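A heavily hedged sketch of the gate-range change described above: a generic GRU-style cell whose reset gate is squashed into ]0,2[ rather than ]0,1[. This only illustrates the idea, not the exact bistable cell proposed by @vecovennicolas and colleagues.

```python
import torch
import torch.nn as nn

class WideResetGRUCell(nn.Module):
    """GRU-style cell whose reset gate takes values in ]0, 2[ instead of ]0, 1[."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.lin_z = nn.Linear(input_size + hidden_size, hidden_size)
        self.lin_r = nn.Linear(input_size + hidden_size, hidden_size)
        self.lin_h = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x, h):
        xh = torch.cat([x, h], dim=-1)
        z = torch.sigmoid(self.lin_z(xh))         # update gate, in ]0, 1[
        r = 2.0 * torch.sigmoid(self.lin_r(xh))   # reset gate, widened to ]0, 2[
        h_candidate = torch.tanh(self.lin_h(torch.cat([x, r * h], dim=-1)))
        return (1.0 - z) * h + z * h_candidate

cell = WideResetGRUCell(input_size=8, hidden_size=16)
h = torch.zeros(4, 16)
for _ in range(10):                               # unroll over a dummy sequence
    h = cell(torch.randn(4, 8), h)
print(h.shape)  # torch.Size([4, 16])
```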
@glouppe
Gilles Louppe
2 years
The more I talk about simulation-based inference, the more I realize that the concept of an intractable likelihood is completely foreign in some fields. A short thread 🧵
3
39
220
@glouppe
Gilles Louppe
1 year
Had quite some fun this morning teaching a new lecture on #GNNs ! 🧑‍🏫 Key takeaway: GNNs are built on the blueprint of stacking shared, local, and permutation invariant functions. They generalize classic architectures like convnets & transformers to graph-structured data 🤯🌐
3
40
208
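A minimal sketch of that blueprint, assuming a toy dense adjacency matrix: a single message-passing layer whose weights are shared across nodes and whose sum aggregation is invariant to the ordering of neighbours.

```python
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.message = nn.Linear(in_dim, out_dim)            # shared across all edges
        self.update = nn.Linear(in_dim + out_dim, out_dim)   # shared across all nodes

    def forward(self, h, adj):
        # Summing over neighbours is permutation invariant w.r.t. their ordering.
        aggregated = adj @ self.message(h)
        return torch.relu(self.update(torch.cat([h, aggregated], dim=-1)))

adj = torch.tensor([[0., 1., 1.],
                    [1., 0., 0.],
                    [1., 0., 0.]])        # a toy 3-node graph
h = torch.randn(3, 8)                     # node features
print(MessagePassingLayer(8, 16)(h, adj).shape)  # torch.Size([3, 16])
```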
@glouppe
Gilles Louppe
4 years
📢 My research group has an open position for a PhD candidate in deep-learning for simulation-based inference. Details available at PM for further details 🤖🔭
1
70
185
@glouppe
Gilles Louppe
4 years
You say Normalizing Flows? We see Bayesian Networks! Quite excited to announce two papers @WehenkelAntoine and I have just released on arXiv! + Thread below 👇
@WehenkelAntoine
Antoine Wehenkel
4 years
I am VERY excited to share not 1 but 2 papers hot from arXiv, joint work with @glouppe! 🥳 We show strong connections between Bayesian networks and normalizing flows: 1) using Bayesian networks to better understand NFs, and 2) enriching the BN framework with the NF machinery.
4
17
97
1
36
176
@glouppe
Gilles Louppe
2 years
Bayesian recipe of the day 🧑‍🍳: Take a simple latent variable model where the latents are Gaussian and the observed variables are linear Gaussian. Fit the linear projection parameters by maximum likelihood estimation. Then, posterior inference gives you (Probabilistic) PCA! 🤯
3
16
172
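A hedged sketch of that recipe (in the sense of Tipping & Bishop's probabilistic PCA), with toy data: fit W and sigma² by maximum likelihood via the sample covariance eigendecomposition, then compute the Gaussian posterior over the latents.

```python
import numpy as np

rng = np.random.default_rng(0)
Z_true = rng.normal(size=(500, 2))                        # latent z ~ N(0, I)
X = Z_true @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(500, 5))
k = 2                                                     # latent dimension

# Maximum likelihood estimates (closed form, up to a rotation of the latents).
mu = X.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(X - mu, rowvar=False))
evals, evecs = evals[::-1], evecs[:, ::-1]                # sort in decreasing order
sigma2 = evals[k:].mean()                                 # ML noise variance
W = evecs[:, :k] * np.sqrt(evals[:k] - sigma2)            # ML linear projection

# Posterior inference: p(z | x) = N(M^-1 W^T (x - mu), sigma2 M^-1), with M = W^T W + sigma2 I.
M = W.T @ W + sigma2 * np.eye(k)
Z_posterior_mean = (X - mu) @ W @ np.linalg.inv(M)
print(Z_posterior_mean.shape)  # (500, 2): the (probabilistic) PCA projections
```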
@glouppe
Gilles Louppe
4 years
Finally found some time for hacking around with JAX! It feels almost as if I was back in 2011, writing Numpy code for sklearn, except now it is fully differentiable and runs on GPUs 🤓
2
8
166
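The kind of thing the tweet alludes to, as a tiny illustrative example (not taken from the tweet): NumPy-style code that JAX differentiates end to end.

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    return jnp.mean((x @ w - y) ** 2)   # ordinary NumPy-style expression

x, y = jnp.ones((8, 3)), jnp.zeros(8)
w = jnp.array([1.0, -2.0, 0.5])
print(jax.grad(loss)(w, x, y))          # exact gradient w.r.t. w, runs on GPU if present
```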
@glouppe
Gilles Louppe
1 year
The materials of my lectures on i) deep generative models (VAEs, diffusion models, flows) and ii) simulation-based inference can now all be found at
@glouppe
Gilles Louppe
1 year
On my way to SFO ✈️ for a week at the 51st @SLAClab summer institute Ping me for ☕ around SLAC or Stanford
0
2
22
1
39
163
@glouppe
Gilles Louppe
8 years
Bayesian optimisation for tuning scikit-learn hyper-parameters. Feedback welcome before the 1st release of `skopt`!
10
56
145
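A hedged sketch of the idea, using today's skopt API (which may differ from the pre-release version the tweet refers to): Bayesian optimisation of a single scikit-learn hyper-parameter via gp_minimize.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from skopt import gp_minimize

X, y = load_digits(return_X_y=True)

def objective(params):
    (log_C,) = params
    model = SVC(C=10.0 ** log_C, gamma="scale")
    return -np.mean(cross_val_score(model, X, y, cv=3))   # minimise negative accuracy

result = gp_minimize(objective, dimensions=[(-3.0, 3.0)], n_calls=15, random_state=0)
print(result.x, -result.fun)   # best log10(C) and its cross-validated accuracy
```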
@glouppe
Gilles Louppe
10 years
A manual for tree huggers: My PhD thesis "Understanding Random Forests: From Theory to Practice" http://t.co/i0JlGOB11W
9
62
128
@glouppe
Gilles Louppe
1 year
Diffusion models are not only good at generating pretty pictures, they can also solve (very) high-dimensional inference problems! Really proud of this last piece of work led by @FrancoisRozet on Bayesian data assimilation 🌍🤖 (We scale to 128x64x64-dimensional posteriors!)
@FrancoisRozet
François Rozet
1 year
📢 Tired of inaccurate weather forecasts? Of course you are, and that's why @glouppe and I are thrilled to announce Score-based Data Assimilation, a novel trajectory inference method powered by score-based generative models. 🧵
5
29
109
1
25
128
@glouppe
Gilles Louppe
6 years
Today is the first lecture of my new Deep Learning course at @UniversiteLiege . We will start easy with some reminders on the fundamentals of machine learning. Materials will be posted every week at Feedback is welcome!
3
23
121
@glouppe
Gilles Louppe
8 years
MinPy, or how to run your Numpy code on a GPU by changing a single import line. Also curious to try its autograd.
1
55
119
@glouppe
Gilles Louppe
5 years
Do you know of anyone who reproduced the double-U generalization curve of over-parameterized networks? Looking for a friend :-)
9
31
118
@glouppe
Gilles Louppe
11 months
The list of all 191 papers accepted at the #ML4PS2023 NeurIPS workshop is now available at Congratulations to all authors and huge thanks to all reviewers! This year, we reached an unprecedented number of 3.9 reviews per paper on average!
1
17
114
@glouppe
Gilles Louppe
4 years
When code you wrote appears on television 🤓 CC: @scikit_learn @ogrisel @GaelVaroquaux @amuellerml
@zhipan01
zhi
4 years
THIS KDRAMA JUST IMPORTED SKLEARN
3
24
152
1
5
110
@glouppe
Gilles Louppe
6 years
Phew! I am now halfway through my deep learning lectures. Today we went over recurrent networks, starting from Elman networks up to Differentiable Neural Computers. Slides available at
1
23
107
@glouppe
Gilles Louppe
6 years
At the Isaac Newton Institute for mathematics for a few days. Where you find blackboards in the bathroom. You cannot be too cautious in case of emergency.
3
6
107
@glouppe
Gilles Louppe
4 years
Lazy tweet: I am looking for applications of automatic differentiation beyond neural nets. Any exciting example I could show to my students? Thinking about things like differentiating through simulations or gradient-based hyper-parameter optimization
42
13
107
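One possible answer to the question above, sketched in JAX: differentiating through a small hand-written simulator (Euler integration of a damped oscillator) with respect to a physical parameter. Values are purely illustrative.

```python
import jax
import jax.numpy as jnp

def simulate(damping, dt=0.01, n_steps=500):
    """Euler-integrate a damped harmonic oscillator and return the final position."""
    def step(state, _):
        x, v = state
        a = -x - damping * v
        return (x + dt * v, v + dt * a), None
    (x_final, _), _ = jax.lax.scan(step, (1.0, 0.0), None, length=n_steps)
    return x_final

# Gradient of the simulator's output with respect to the damping coefficient.
print(jax.grad(simulate)(0.3))
```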
@glouppe
Gilles Louppe
2 years
In previous work, we showed that algorithms for simulation-based inference can all produce underdispersed posteriors, which can make them unreliable for scientific inquiry. Today, we are happy to announce a first algorithm for reliable simulation-based inference! 🧵
@StatMLPapers
Stat.ML Papers
2 years
Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation. (arXiv:2208.13624v1)
0
5
23
2
20
102
@glouppe
Gilles Louppe
2 years
On behalf of the organizers, I am happy to announce that the Call for Papers is now out! Deadline: September 22. ✍️ #ML4PS2022
@KyleCranmer
Kyle Cranmer
2 years
Great news, our Machine Learning and the Physical Sciences workshop at #NeurIPS2022 was accepted! This will be the fifth edition of this fantastic workshop series. More details to follow. #ML4PS2022
7
85
522
1
23
97
@glouppe
Gilles Louppe
8 years
Recommendation letter for John Nash. Short and precise.
1
53
89
@glouppe
Gilles Louppe
6 years
This year, I am really satisfied with the second version of my Introduction to AI lectures. Tomorrow will already be the last lecture! All (700+!) slides can be downloaded at
0
27
90
@glouppe
Gilles Louppe
4 years
Happy to announce our latest work led by @ArnaudDelaunoy for fast posterior inference on gravitational wave data using likelihood-to-evidence ratio estimation (AALR / SNRE), reducing inference time from days to minutes or less! 🔭🤖
4
18
91
@glouppe
Gilles Louppe
6 years
Digging into old papers to prepare a deep learning talk for a general audience. I never realized the neocognitron paper also came with an attention model!
2
28
88
@glouppe
Gilles Louppe
4 years
Look what I found in the office! Yesterday, I gave my AI class on top of the very first two volumes of the #NeurIPS proceedings (1987, 1988). Stand on the shoulders of giants they say... 🤖
3
1
88
@glouppe
Gilles Louppe
5 years
I am always speechless at the precision with which physicists write simulators...
@Rainmaker1973
Massimo
5 years
The first-ever image of a black hole (M87 or Virgo A, left) is here compared with a simulation (middle) and the simulation blurred to the expected resolution of the telescope (right)
14
907
2K
0
15
86
@glouppe
Gilles Louppe
5 years
Tomorrow is already my last Deep Learning lecture of the year! The full deck of (600+) slides can be downloaded at Quite happy with the final result, but relieved at the same time! Now let's hope for the best regarding the outcome of the exam.
1
22
82
@glouppe
Gilles Louppe
7 months
📢 Open PhD position in my group, on simulation-based inference and generative models for particle physics, on a joint project with Fabio Maltoni and Tilman Plehn
0
35
76
@glouppe
Gilles Louppe
2 years
Big news! 🚨 Our workshop got accepted at ICML 2023! Join us to discuss how to combine scientific ⚗️ and ML 🤖 models into hybrid models 🧌. Let's bridge the gap between fields! 🤝 Stay tuned for more details 📻 #ICML2023
0
9
79
@glouppe
Gilles Louppe
1 year
The Machine Learning and the Physical Sciences workshop will return for a 6th edition at #NeurIPS2023 ! More info coming soon! 📢🔭⚗️🧲
3
11
78
@glouppe
Gilles Louppe
1 month
Can we run reliable simulation-based inference on a budget, with as few as O(10-100) simulations? Yes, we can! Check out @ArnaudDelaunoy's latest paper on Bayesian neural networks for low-budget simulation-based inference 🤓
2
15
76
@glouppe
Gilles Louppe
4 years
What if you want to do posterior inference but don't have a prior to start with? In this new work led by @VandegarM , we investigate how the empirical Bayesian can use neural density estimators to estimate a source distribution over uncorrupted samples from noise-corrupted ones.
1
21
76
@glouppe
Gilles Louppe
7 years
Open PhD position @UniversiteLiege in exoplanet imaging with deep learning. Exciting opportunity to develop ML algorithms for the detection of new planets!
1
40
74
@glouppe
Gilles Louppe
7 months
Very proud to sign my first paper with @ATLASexperiment on "Deep Generative Models for Fast Photon Shower Simulation in ATLAS"! This project actually began in 2017, way before generative AI was cool. It is a relief to see it finally published 🤖
3
9
74
@glouppe
Gilles Louppe
2 years
"The secret to modelling is not being perfect" Warm congratulations to my student @WehenkelAntoine for defending his PhD thesis. Exciting work ahead at @apple with @jh_jacobsen #proudadvisor
8
4
73
@glouppe
Gilles Louppe
2 years
Using neural score estimation for reverse-diffusing a Gaussian into a Bayesian posterior!👇
@glouppe
Gilles Louppe
2 years
New in💡lampe v0.7 by @FrancoisRozet : score-based diffusion models (sub-VP SDEs) for simulation-based inference! Check it out at
0
14
56
0
9
74
@glouppe
Gilles Louppe
2 years
Congratulations to @joeri_hermans , my first student to defend his PhD thesis! before/after 🎊🍻 #proudadvisor
2
2
73
@glouppe
Gilles Louppe
6 years
Accepted for publication at #AISTATS2019 ! 📜Can't wait to discover Japan!
@glouppe
Gilles Louppe
7 years
It's out! Our latest paper with @KyleCranmer : Adversarial Variational Optimization of Non-Differentiable Simulators
6
91
252
0
14
73
@glouppe
Gilles Louppe
8 years
Want a scatter matrix of the 2D partial dependence of your objective function? @betatim made it for you in `skopt`!
3
24
71
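A hedged sketch of that kind of plot with the present-day skopt API (skopt.plots.plot_objective); the exact helper name may have differed at the time of the tweet.

```python
import matplotlib.pyplot as plt
from skopt import gp_minimize
from skopt.plots import plot_objective

def objective(params):
    x, y = params
    return (x - 0.3) ** 2 + (y + 0.5) ** 2      # toy 2-d objective

result = gp_minimize(objective, [(-2.0, 2.0), (-2.0, 2.0)], n_calls=25, random_state=0)
plot_objective(result)                          # pairwise partial-dependence matrix
plt.show()
```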
@glouppe
Gilles Louppe
6 years
In our latest paper with Johann Brehmer, @KyleCranmer and @juanpavez , we compare this extraction process to 'mining gold' from implicit models and show how it can dramatically improve data efficiency of likelihood-free inference algorithms.
1
17
69
@glouppe
Gilles Louppe
8 years
The paper behind TensorFlow Fold: Deep Learning with Dynamic Computation Graphs by Norvig et al.
1
27
70
@glouppe
Gilles Louppe
3 years
Simulation-based inference outside of science? Yes! Here is our attempt for robotic multi-fingered grasping 🤖 It took some effort to bridge the sim-to-real gap, but adding nuisance parameters was then natural and worked very smoothly.
4
12
67
@glouppe
Gilles Louppe
2 years
Less talking, more action: tomorrow in my deep learning class, I'll be building and training a mini GPT from scratch👨‍💻! It won't rival the largest LLMs 🦙but expect some magical Harry Potter-esque babble✨🪄
4
6
68
@glouppe
Gilles Louppe
6 years
Quite satisfying to reach the point in my AI class where students have all the pieces for building an approximately intelligent agent in partially observable stochastic environments, combining algorithms for reasoning over time, handling uncertainty, and making decisions.
2
13
67
@glouppe
Gilles Louppe
6 years
TIL when preparing my second deep learning lecture that the operators' manual for Rosenblatt's Mark I Perceptron machine was a classified document. It was declassified only in 1977. The manual can now be found at
0
20
63
@glouppe
Gilles Louppe
2 years
Mission complete! It took us several hours (almost 4!), but we built and trained a fully-working GPT language model on Harry Potter! The code implemented in class can be found at
@glouppe
Gilles Louppe
2 years
Less talking, more action: tomorrow in my deep learning class, I'll be building and training a mini GPT from scratch👨‍💻! It won't rival the largest LLMs 🦙but expect some magical Harry Potter-esque babble✨🪄
4
6
68
4
12
65
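This is not the code written in class, just a hedged sketch of the kind of building block a from-scratch mini GPT stacks: a single causal self-attention head in PyTorch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttentionHead(nn.Module):
    def __init__(self, embed_dim, head_dim):
        super().__init__()
        self.qkv = nn.Linear(embed_dim, 3 * head_dim, bias=False)
        self.scale = head_dim ** -0.5

    def forward(self, x):                                  # x: (batch, seq, embed_dim)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        scores = (q @ k.transpose(-2, -1)) * self.scale
        mask = torch.triu(torch.ones_like(scores, dtype=torch.bool), diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))   # causal mask: no peeking ahead
        return F.softmax(scores, dim=-1) @ v

head = CausalSelfAttentionHead(embed_dim=64, head_dim=32)
print(head(torch.randn(2, 16, 64)).shape)  # torch.Size([2, 16, 32])
```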
@glouppe
Gilles Louppe
5 years
This is the model from Report 13 of the Imperial College COVID-19 response team. It fits the death data jointly from 11 European countries to estimate the reproduction number and the effect of lockdowns. Such a remarkable piece of @mcmc_stan
4
24
66
@glouppe
Gilles Louppe
6 years
Lecture 2 of my Deep Learning class this morning: from logistic regression to deep rectified networks. Slides at
0
19
65
@glouppe
Gilles Louppe
7 years
Looking for a PhD studentship? Want to develop AI and deep learning for furthering scientific discovery? Let's get in touch! More details at I'll be at #NIPS2017 if you want to chat.
2
35
64
@glouppe
Gilles Louppe
9 years
Our latest paper: Likelihood-free inference with your favorite ML model! @KyleCranmer @juanpavez
5
29
62
@glouppe
Gilles Louppe
7 years
Inspired by , today I tried to give my students some intuition for the effectiveness of deep learning by folding a sheet of paper in front of them. Not sure if it clicked, or if I just lost them...
5
9
61
@glouppe
Gilles Louppe
7 years
By the way, I cannot but recommend @francoisfleuret 's amazingly thorough deep learning course It's been a strong inspiration for those slides.
2
16
61
@glouppe
Gilles Louppe
3 years
The call for papers to our Machine Learning and the Physical Sciences workshop #ML4PS2021 #NeurIPS2021 is now out! The deadline is set a month from now, on September 18.
@KyleCranmer
Kyle Cranmer
3 years
Our Machine Learning for Physical Sciences workshop was accepted for #NeurIPS2021 ! Excellent work by @atilimgunes @BasicScienceSav @glouppe @iamstarnord @BPNachman @carrasqu & Emine Kucukbenli in putting the proposal together. Updates to follow #ML4PS2021
5
63
306
0
23
55
@glouppe
Gilles Louppe
4 years
Now accepted at #aistats2021 ! Congratulations to my incoming PhD student @VandegarM for his first paper! Excited to see what's next :) (with @WehenkelAntoine and @Michael_A_Kagan )
@glouppe
Gilles Louppe
4 years
What if you want to do posterior inference but don't have a prior to start with? In this new work led by @VandegarM , we investigate how the empirical Bayesian can use neural density estimators to estimate a source distribution over uncorrupted samples from noise-corrupted ones.
1
21
76
2
7
59
@glouppe
Gilles Louppe
7 years
That is, I'll start as an Assistant Professor in AI in Fall 2017 :)
10
3
58
@glouppe
Gilles Louppe
2 years
New in💡lampe v0.7 by @FrancoisRozet : score-based diffusion models (sub-VP SDEs) for simulation-based inference! Check it out at
0
14
56
@glouppe
Gilles Louppe
7 years
Dear AI students, study your Bayes' rule. Can't count how often it appears in #NIPS2017 talks
1
9
56
@glouppe
Gilles Louppe
6 years
Excited to have this finally out! In this work, we show how a neural network can hijack the random number calls of a particle physics simulator to perform amortized posterior inference. Praise to @atilimgunes and @lukasheinrich_ !
@StatMLPapers
Stat.ML Papers
6 years
Efficient Probabilistic Inference in the Quest for Physics Beyond the Standard Model. (arXiv:1807.07706v1 [cs.LG])
0
3
4
0
13
53
@glouppe
Gilles Louppe
4 years
Exciting news for #AI research in Belgium! Fine print: I will soon be looking for new PhD students to work on AI for science! Prospective applicants are welcome to reach out 🤖🧑‍🎓
@LaLibreEco
La Libre Eco
4 years
The Walloon government will grant 32 million euros to the "Trail Institute" for artificial intelligence projects
0
1
0
2
8
53
@glouppe
Gilles Louppe
4 years
In this first part, I illustrate the change of variables theorem, together with its application to parameter estimation for affine transformations. A purely artificial example, I admit, but it's a nice exercise for getting used to the JAX API. Stay tuned for Part 2!
0
9
54
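A sketch in the same spirit (not the author's notebook): the change of variables formula for an affine map x = a·z + b with z ~ N(0, 1), and maximum-likelihood estimation of (a, b) by gradient ascent with JAX.

```python
import jax
import jax.numpy as jnp

def log_prob(params, x):
    a, b = params
    z = (x - b) / a                                   # inverse of the affine map
    log_base = -0.5 * (z ** 2 + jnp.log(2.0 * jnp.pi))
    return jnp.mean(log_base - jnp.log(jnp.abs(a)))   # change of variables: -log |dx/dz|

x = 2.5 * jax.random.normal(jax.random.PRNGKey(0), (1_000,)) + 1.0  # data from a=2.5, b=1.0

params = jnp.array([1.0, 0.0])
grad_fn = jax.jit(jax.grad(log_prob))
for _ in range(1_000):
    params = params + 0.05 * grad_fn(params, x)       # gradient ascent on the log-likelihood
print(params)                                         # approaches [2.5, 1.0]
```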
@glouppe
Gilles Louppe
4 years
2nd paper of the day with friends from @GRAPPAinstitute , incl. @C_Weniger , @NilanjanBanik and @gfbertone ! We haven't solved protein folding, but I am quite excited about this study in which we show the full potential of simulation-based inference for grand scientific problems💫🤖
@gfbertone
Gianfranco Bertone
4 years
New paper today w/ @joeri_hermans , Nil Banik, @C_Weniger and @glouppe : a likelihood-free Bayesian inference pipeline to infer properties of dark matter from observations of stellar streams
2
4
42
2
13
54
@glouppe
Gilles Louppe
8 years
Thrilled to announce our latest paper with @KyleCranmer : Learning to pivot with adversarial nets
3
20
53
@glouppe
Gilles Louppe
6 years
Really impressed by the expressiveness of the latent space of #StyleGAN ! I can map myself into latent space and interpolate to my office neighbor @DamienERNST1
3
9
52
@glouppe
Gilles Louppe
7 years
Brilliant conversation between @EtienneKlein and @ylecun : can intelligence be artificial? (in French) Feels refreshing to see a debate of this quality among all the over-hyped AI coverage
@ylecun
Yann LeCun
7 years
Podcast of the programme "La Pensée Scientifique", hosted by Étienne Klein on France Culture, in which we...
1
35
57
0
20
53
@glouppe
Gilles Louppe
2 years
All 188 papers and posters accepted to the #ML4PS2022 workshop are now online! Take the time to have a look before arriving at NeurIPS next week!
1
5
51
@glouppe
Gilles Louppe
5 years
The third edition of my "Introduction to AI" class -- a modernized summary of GOFAI -- is now finished! Now is time to relax during the Christmas break 😴 Slides are all available at .
1
20
52
@glouppe
Gilles Louppe
5 years
Hot from your arXiv: a (much) revised version of from my student @joeri_hermans , where we show how to effectively plug approximate likelihood ratios to obtain likelihood-free variants of MCMC algorithms.
2
19
50
@glouppe
Gilles Louppe
3 years
Meh, rejected from #AISTATS2022 because the "paper focuses too much on applications in astronomy"?! An argument that was not even raised in any of the original reviews... Alright then, we will get a ticket for the next lottery.🎲
@glouppe
Gilles Louppe
3 years
New paper📢"Averting A Crisis In Simulation-Based Inference" -- a huge team effort led by my students @joeri_hermans and @ArnaudDelaunoy
5
10
79
2
1
51
@glouppe
Gilles Louppe
6 years
First paper of the year, by my student @joeri_hermans ! In , we adapt MCMC algorithms to the likelihood-free scenario by making use of a parameterized likelihood ratio model learned by supervised learning.
2
16
50
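A hedged toy sketch of the likelihood-ratio trick underlying this line of work (not the paper's code): train a classifier to separate jointly drawn (θ, x) pairs from pairs with x shuffled, so that d/(1−d) approximates the likelihood-to-evidence ratio p(x|θ)/p(x). The simulator below is a made-up Gaussian toy.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
theta = rng.normal(size=10_000)
x = theta + rng.normal(size=10_000)                       # simulate x ~ p(x | theta)

joint = np.column_stack([theta, x])                       # label 1: dependent pairs
marginal = np.column_stack([theta, rng.permutation(x)])   # label 0: x decoupled from theta
features = np.vstack([joint, marginal])
labels = np.concatenate([np.ones(10_000), np.zeros(10_000)])

clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=300).fit(features, labels)

def log_ratio(theta_val, x_obs):
    d = clf.predict_proba([[theta_val, x_obs]])[0, 1]
    return np.log(d) - np.log(1.0 - d)                    # approx. log p(x|theta) - log p(x)

print(log_ratio(0.0, 0.1), log_ratio(3.0, 0.1))           # higher for the plausible theta
```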
@glouppe
Gilles Louppe
5 years
👇 Proud of my student who reimplemented almost entirely from scratch a full deep learning pipeline for voice embedding, cloning and synthesis. Let him know what you think!
@CorentinJemine
Corentin Jemine
5 years
The finished product of my master's thesis: a real-time voice cloning toolbox. Clone a voice from only 5 seconds of audio and use it for text-to-speech! #DeepLearning #AI
2
12
41
1
7
51