Frank Hutter Profile
Frank Hutter

@FrankRHutter

5,139 Followers · 48 Following · 68 Media · 468 Statuses

Professor of #machinelearning at the University of Freiburg, 3x ERC Grant holder, EurAI & ELLIS fellow. All opinions are my own.

Freiburg im Breisgau, Germany
Joined July 2018
Pinned Tweet
@FrankRHutter
Frank Hutter
2 years
I’m thrilled to announce that I’ve been awarded an #ERC Consolidator Grant. 2 million Euros to work on what I call „Deep Learning 2.0“. I hope that the next generation of DL can achieve Trustworthy AI by design. Details about DL 2.0 in this blog post: 1/3
24
43
522
@FrankRHutter
Frank Hutter
2 years
This may revolutionize data science: we introduce TabPFN, a new tabular data classification method that takes 1 second & yields SOTA performance (better than hyperparameter-optimized gradient boosting in 1h). Current limits: up to 1k data points, 100 features, 10 classes. 🧵1/6
Tweet media one
113
778
4K
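A minimal sketch of what that looks like in practice, assuming the open-source tabpfn package and its scikit-learn-style TabPFNClassifier interface (the exact constructor arguments may differ from what is shown here):

```python
# Minimal sketch, assuming the open-source `tabpfn` package exposes a
# scikit-learn-style TabPFNClassifier (exact constructor args may differ).
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

from tabpfn import TabPFNClassifier  # assumed import path

X, y = load_breast_cancer(return_X_y=True)           # 569 samples, 30 features, 2 classes
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = TabPFNClassifier()       # no per-dataset hyperparameter tuning needed
clf.fit(X_train, y_train)      # "fit" essentially just stores the data as context
y_pred = clf.predict(X_test)   # prediction = forward pass(es) of a fixed network
print("accuracy:", accuracy_score(y_test, y_pred))
```

The dataset chosen here stays well within the stated limits (up to 1k data points, 100 features, 10 classes).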
@FrankRHutter
Frank Hutter
3 years
Hooray, we got 5/5 submissions accepted at #ICLR2022 . Congratulations to all my wonderful students and collaborators. A brief overview of our papers below:
6
12
387
@FrankRHutter
Frank Hutter
9 months
I like this new paper on DL vs trees at the #NeurIPS 2023 datasets and benchmark track: . One of their findings: #TabPFN performs well across 98 datasets: it ties for 1st place with CatBoost in predictive performance but is 200x faster to train. This is
Tweet media one
6
70
321
@FrankRHutter
Frank Hutter
4 years
I like this post and would advise all PhD students to follow it! In a dynamic field like machine learning, life always keeps on getting busier (even once you have a tenured position and brilliant PhD students to keep up with ;-) and you need to schedule breaks to recharge!
@PhdExhausted
Not so Exhausted Anymore, Ph.D.
4 years
I'll relax once I submit this paper. I'll relax once I submit my thesis. I'll relax once I defend my thesis. I'll relax after I get a post-doc. Damn! I guess I'll have to relax after I get a permanent position. No. The future always gets busier. Schedule proper breaks NOW. 💛
81
2K
11K
4
30
277
@FrankRHutter
Frank Hutter
2 years
Please see our blog post for details & code. Also, we just got an oral at the #NeurIPS table representation learning workshop 🎉Joint work with my outstanding students @noahholl , @SamuelMullr and @KEggensperger . 6/6
18
30
271
@FrankRHutter
Frank Hutter
2 years
TabPFN is radically different from previous ML methods. It is meta-learned to approximate Bayesian inference with a prior based on principles of causality and simplicity. Here‘s a qualitative comparison to some sklearn classifiers, showing very smooth uncertainty estimates. 2/6
Tweet media one
5
21
270
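The referenced figure is not reproduced here; a rough way to explore the same qualitative comparison yourself is sketched below, again assuming the tabpfn package. The "roughness" number is only an illustrative heuristic for how smoothly predicted probabilities vary over the input space, not the paper's methodology.

```python
# Illustrative sketch (not the paper's evaluation protocol): compare how
# smoothly a few classifiers' predicted probabilities vary over a 2-D grid.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from tabpfn import TabPFNClassifier  # assumed scikit-learn-style interface

X, y = make_moons(n_samples=100, noise=0.3, random_state=0)
xx, yy = np.meshgrid(np.linspace(-2, 3, 60), np.linspace(-1.5, 2, 60))
grid = np.c_[xx.ravel(), yy.ravel()]

for clf in (RandomForestClassifier(random_state=0),
            MLPClassifier(max_iter=2000, random_state=0),
            TabPFNClassifier()):
    clf.fit(X, y)
    p = clf.predict_proba(grid)[:, 1].reshape(xx.shape)
    # Crude smoothness proxy: average jump in probability between neighbouring cells.
    roughness = np.abs(np.diff(p, axis=0)).mean() + np.abs(np.diff(p, axis=1)).mean()
    print(f"{type(clf).__name__:>22s}  roughness ≈ {roughness:.4f}")
```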
@FrankRHutter
Frank Hutter
2 years
Thanks for the many comments on my tweet on TabPFNs: Here's an update on how this may revolutionize data science. TL;DR: learning classification algorithms rather than programming them; real-time AutoML; democratizing ML to tackle *small data*! 🧵1/9
Tweet media one
@FrankRHutter
Frank Hutter
2 years
This may revolutionize data science: we introduce TabPFN, a new tabular data classification method that takes 1 second & yields SOTA performance (better than hyperparameter-optimized gradient boosting in 1h). Current limits: up to 1k data points, 100 features, 10 classes. 🧵1/6
Tweet media one
113
778
4K
3
49
257
@FrankRHutter
Frank Hutter
3 years
Exciting news: following a series of 8 AutoML workshops at ICML, we will hold the 1st International Conference on AutoML in 2022. Co-located with ICML, 3 days, currently planned in person. Website: Submission deadline: Feb 24, 2022 #AutoML #AutoML_Conf 1/
3
79
254
@FrankRHutter
Frank Hutter
2 years
If you'd like to play with TabPFNs yourself, here is a direct link to the Colab with a scikit-learn like interface:
5
13
206
@FrankRHutter
Frank Hutter
3 years
6 accepted papers at NeurIPS. Amazing! 4 at the main track and 2 more at the benchmarks and datasets track. Huge congratulations to my bright and hard-working students & collaborators, they couldn't deserve it more!! Previews of early versions in this thread, more to follow soon.
3
9
208
@FrankRHutter
Frank Hutter
3 years
Deep Learning, and in particular even simple MLPs, can outperform gradient boosting for tabular data if you use the right „cocktail“ of different modern DL regularization methods. Go #AutoML ! We tried different GBDT implementations, hyperparameter spaces, etc., with similar results.
@_akhaliq
AK
3 years
Regularization is all you Need: Simple Neural Nets can Excel on Tabular Data pdf: abs: well-regularized plain MLPs significantly outperform sota specialized neural network architectures
Tweet media one
6
88
485
6
40
185
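The exact regularization cocktail and its search space are in the paper; as a loose illustration of the ingredients (not the paper's recipe), here is a plain PyTorch tabular MLP combining dropout, decoupled weight decay, label smoothing, and cosine learning-rate annealing:

```python
# Loose illustration of a "regularization cocktail" for a tabular MLP
# (not the paper's exact recipe): dropout + decoupled weight decay +
# label smoothing + cosine learning-rate annealing, tuned jointly in practice.
import torch
import torch.nn as nn

def make_mlp(n_features: int, n_classes: int, width: int = 256, p_drop: float = 0.2):
    return nn.Sequential(
        nn.Linear(n_features, width), nn.ReLU(), nn.Dropout(p_drop),
        nn.Linear(width, width), nn.ReLU(), nn.Dropout(p_drop),
        nn.Linear(width, n_classes),
    )

model = make_mlp(n_features=30, n_classes=2)
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)          # label smoothing
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3,
                              weight_decay=1e-4)              # decoupled weight decay
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)

# One illustrative training step on random data standing in for a tabular batch.
x, y = torch.randn(64, 30), torch.randint(0, 2, (64,))
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()
scheduler.step()
print("loss:", float(loss))
```

The tweet's claim is about tuning such knobs jointly (the right "cocktail"), not about any single ingredient.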
@FrankRHutter
Frank Hutter
3 years
We got Transformers to do basically perfect Bayesian inference with any prior we can sample from. Super cheap (in a single forward prop) & more accurate than MCMC or VI. This could be really disruptive. , with @SamuelMullr , @noahholl , Sebastian, Josif
8
38
178
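A rough, self-contained sketch of the underlying idea (prior-data fitted networks), not the authors' implementation: meta-train a transformer on many synthetic datasets drawn from a prior so that, given a dataset in-context, it predicts held-out labels in a single forward pass. The real setup differs in several ways (e.g., the richer prior and the attention masking between context and query points).

```python
# Rough sketch of the PFN idea (not the authors' code): meta-train a transformer
# on synthetic tasks sampled from a prior to predict held-out labels in-context.
import torch
import torch.nn as nn

D_X, D_MODEL, N_CTX, N_QRY = 2, 64, 20, 5   # toy sizes

def sample_task(batch=32):
    """Prior over binary-classification tasks: random linear decision rules."""
    w = torch.randn(batch, D_X, 1)
    x = torch.randn(batch, N_CTX + N_QRY, D_X)
    y = (x @ w).squeeze(-1).gt(0).long()
    return x, y

embed_xy = nn.Linear(D_X + 1, D_MODEL)       # embeds (x, observed y) context pairs
embed_x = nn.Linear(D_X, D_MODEL)            # embeds query x (label unknown)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(D_MODEL, nhead=4, batch_first=True), num_layers=2)
head = nn.Linear(D_MODEL, 2)                 # per-token class logits
params = (list(embed_xy.parameters()) + list(embed_x.parameters())
          + list(encoder.parameters()) + list(head.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)

for step in range(200):                      # meta-training over synthetic tasks
    x, y = sample_task()
    ctx = embed_xy(torch.cat([x[:, :N_CTX], y[:, :N_CTX, None].float()], -1))
    qry = embed_x(x[:, N_CTX:])
    h = encoder(torch.cat([ctx, qry], dim=1))    # simplification: no attention mask
    loss = nn.functional.cross_entropy(
        head(h[:, N_CTX:]).reshape(-1, 2), y[:, N_CTX:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
print("final meta-training loss:", float(loss))
```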
@FrankRHutter
Frank Hutter
1 year
The #ICML2023 rebuttal deadline was Sunday. That’s sad from a diversity & inclusion point of view; there goes the weekend with the family. For , we have “no deadlines on weekends” in our internal guidelines. I hope @icmlconf will follow suit in the future.
1
24
164
@FrankRHutter
Frank Hutter
2 years
TabPFN happens to be a transformer, but this is not your usual trees vs nets battle. Given a new data set, there is no costly gradient-based training. Rather, it’s a single forward pass of a fixed network: you feed in (Xtrain, ytrain, Xtest); the network outputs p(y_test). 3/6
3
6
124
@FrankRHutter
Frank Hutter
11 months
🎉 I’m delighted that my amazing team got 6 #AutoML papers accepted to #NeurIPS2023 , including an oral on AutoML for Trustworthy AI! Other topics include #NAS , #BO , practical #HPO , LC extrapolation and #LLMs for data science. Congratulations to all my coauthors! Details below🧵1/
Tweet media one
8
7
123
@FrankRHutter
Frank Hutter
2 years
Time for an update on experiments with our learned TabPFN: we performed a much more comprehensive analysis on 179 datasets, confirming the patterns from our previous evaluation on 30 datasets and looking in much more detail. A long thread with lots of plots🧵1/17
2
18
114
@FrankRHutter
Frank Hutter
2 years
Imagine the possibilities! Portable real-time ML with a single forward pass of a medium-sized neural net (25M parameters). Go, #GreenAutoML ! Please share widely; there are endless possibilities for improvements & extensions and we'd love to tackle them with your help. 5/6
1
5
109
@FrankRHutter
Frank Hutter
2 years
TabPFN is fully learned: We only specified the task (strong predictions in a single forward pass, for millions of synthetic datasets) but not *how* it should be solved. Still, TabPFN outperforms decades worth of manually-created algorithms. A big step up in learning to learn. 4/6
2
5
109
@FrankRHutter
Frank Hutter
1 year
🏆 Honored to receive the #KDD2023 Test of Time Award for our Auto-WEKA paper from 2013, together with Chris Thornton, @HolgerHoos and @k_leyton_brown . A decade later, it's humbling to see the lasting impact of our work in the ever-evolving #AutoML landscape. I'm very grateful
Tweet media one
Tweet media two
9
8
94
@FrankRHutter
Frank Hutter
9 months
#ICLR2024 released reviews to authors Friday evening at 7pm European time. Have a fun weekend, poor grad students 😕 And even worse, poor parents! Not a good move, #ICLR ; this was so avoidable by conscious scheduling! Let’s hope #ICML2024 and #NeurIPS2024 will avoid this.
3
9
92
@FrankRHutter
Frank Hutter
3 years
This is a good example of interaction effects between hyperparameters. The original weight decay value was optimal before adding various other regularization methods; but, after adding these, decreasing weight decay from 1e-4 to 4e-5 gave the single most important improvement!
Tweet media one
@OriolVinyalsML
Oriol Vinyals
3 years
The Deep Learning Devil is in the Details. I love this work from @IrwanBello and collaborators in which they show how training "tricks" improve ~3% absolute accuracy on ImageNet, progress equivalent to years of developments and research! Paper:
Tweet media one
16
227
1K
1
19
86
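A cheap way to probe such interaction effects in your own setup is a small 2-D grid over the two knobs in question. The sketch below is purely illustrative and unrelated to the ImageNet experiments above; it uses scikit-learn's MLPClassifier, where alpha plays the role of weight decay.

```python
# Illustrative sketch: probe the interaction between weight decay and a second
# training knob with a tiny 2-D grid (scikit-learn's MLP; `alpha` is the L2 penalty).
import itertools
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
weight_decays = [1e-5, 4e-5, 1e-4, 1e-3]
batch_sizes = [32, 128]            # second knob; any other regularizer works too

for wd, bs in itertools.product(weight_decays, batch_sizes):
    clf = MLPClassifier(hidden_layer_sizes=(64,), alpha=wd, batch_size=bs,
                        max_iter=300, random_state=0)
    score = cross_val_score(clf, X, y, cv=3).mean()
    print(f"alpha={wd:.0e}  batch_size={bs:<4d}  cv accuracy={score:.3f}")
```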
@FrankRHutter
Frank Hutter
4 years
If you’d like to do a PhD in #AutoML or #MetaLearning in my group at the #ELLIS unit Freiburg, or in another ELLIS unit co-supervised by me, please apply to the ELLIS PhD program and indicate your interest in working with me: The deadline is December 1st!
0
18
75
@FrankRHutter
Frank Hutter
3 years
At #AutoML_Conf , we care about #reproducibility . Today, we're introducing the novel concept of reproducibility reviewers, who assess the degree to which results can be reproduced with the shared code. Full story: Sign up as a reviewer:
2
27
68
@FrankRHutter
Frank Hutter
3 years
Congrats to the TorchVision team, who just trained the best-ever ResNet using our TrivialAugment () and CosineAnnealing () methods (amongst several other recent DL training tricks):
0
9
62
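Both ingredients ship with standard PyTorch/torchvision; a minimal sketch of wiring them into a training setup follows (not the TorchVision team's full recipe, which combines several more tricks, and with illustrative hyperparameter values):

```python
# Minimal sketch (not torchvision's full training recipe): TrivialAugment as the
# data-augmentation policy plus cosine-annealed SGD for a ResNet.
import torch
import torchvision
from torchvision import transforms

train_tf = transforms.Compose([          # would be passed to the ImageNet Dataset
    transforms.RandomResizedCrop(176),
    transforms.RandomHorizontalFlip(),
    transforms.TrivialAugmentWide(),     # parameter-free augmentation policy
    transforms.ToTensor(),
])

model = torchvision.models.resnet50()
optimizer = torch.optim.SGD(model.parameters(), lr=0.5, momentum=0.9,
                            weight_decay=2e-5)
epochs = 100
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs)

# Training-loop skeleton: step the scheduler once per epoch.
# for epoch in range(epochs):
#     train_one_epoch(model, optimizer, ...)   # hypothetical helper
#     scheduler.step()
```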
@FrankRHutter
Frank Hutter
2 years
I'm happy to announce the first set of sponsors for #AutoML_Conf (more to come): @abacusai , @DeepMind , @Bosch_AI , @Amazon , @Meta , inria, 4Paradigm, @Microsoft , Samsung AI Cambridge. It's gonna be big, don't miss it! Abstract submission deadline in < 24h.
1
9
60
@FrankRHutter
Frank Hutter
1 year
Here is all the info for #AutoML23 in one place. Please share widely -- it's going to be great!🚀 Call for Participation: #AutoML conference 2023, Potsdam/Berlin, September 12-15, 2023. Some reasons to join the conference: - 5 great on-site keynote
Tweet media one
0
31
60
@FrankRHutter
Frank Hutter
3 years
Our #metalearning workshop at #NeurIPS2021 got accepted! Amazing speakers: Carlo Ciliberto, @rosemary_ke , @Luke_Metz , @MihaelaVDS , @Eleni30fillou ,Ying Wei! Tentative submission deadline: Sept 17. Co-organizers: @FerreiraFabioDE , @ermgrant , @schwarzjn_ , @joavanschoren , @HuaxiuYaoML
0
10
56
@FrankRHutter
Frank Hutter
2 years
My team and I are at NeurIPS, presenting a total of 17 (!) papers: 4 in the main proceedings and 13 at workshops. I'm very much looking forward to meeting old friends in person again, and also to making new ones! Here is a quick overview of our papers (short thread of 6 posts):
1
4
54
@FrankRHutter
Frank Hutter
1 year
Do you like #LLMs and #AutoML ? Then you might like the Auto-Survey competition at #AutoML2023 (), which just launched officially. It's a very novel competition where LLM agents write survey papers and LLM agents judge them (final judges are human). IMO
1
14
57
@FrankRHutter
Frank Hutter
3 years
Surprising turn in #HPO and #NAS : we now have DEHB, an evolutionary successor of BOHB that's up to 1000x faster than random search for optimizing 6 neural-network hyperparameters and up to 32x faster than BOHB on NAS-Bench-201! #IJCAI21 paper: Code:
Tweet media one
Tweet media two
Tweet media three
Tweet media four
1
7
56
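DEHB is open source; rather than guessing its exact API, here is a self-contained toy sketch of the underlying idea only: differential evolution run at increasing fidelities, with successive-halving-style promotion of the best configurations (not the authors' implementation).

```python
# Toy sketch of the DEHB idea (not the authors' implementation): differential
# evolution at increasing fidelities, promoting the best configurations.
import numpy as np

rng = np.random.default_rng(0)

def toy_objective(config, budget):
    """Stand-in validation error; higher budget means less evaluation noise."""
    lr, wd = config
    return (lr - 0.3) ** 2 + (wd - 0.7) ** 2 + rng.normal(scale=0.05 / budget)

def de_round(pop, scores, budget, f=0.5, cr=0.9):
    """One round of DE mutation / crossover / greedy selection."""
    n, d = pop.shape
    for i in range(n):
        a, b, c = pop[rng.choice(n, size=3, replace=False)]
        mutant = np.clip(a + f * (b - c), 0.0, 1.0)
        trial = np.where(rng.random(d) < cr, mutant, pop[i])
        trial_score = toy_objective(trial, budget)
        if trial_score < scores[i]:
            pop[i], scores[i] = trial, trial_score
    return pop, scores

pop = rng.random((12, 2))                        # 12 random configs in [0, 1]^2
for budget in (1, 3, 9):                         # low -> high fidelity rungs
    scores = np.array([toy_objective(c, budget) for c in pop])
    pop, scores = de_round(pop, scores, budget)
    keep = np.argsort(scores)[: max(4, len(pop) // 3)]   # promote the best
    pop = pop[keep]
print("best configuration found:", pop[0])
```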
@FrankRHutter
Frank Hutter
2 years
Congratulations to my fantastic MSc student Shuhei Watanabe, for winning the multi-objective hyperparameter optimization competition for transformers at #AutoML_Conf !
Tweet media one
1
5
50
@FrankRHutter
Frank Hutter
3 years
We extend Bayesian optimization (BO) with user priors over promising regions of the search space. This expert-guided BO is up to 12x faster than vanilla BO and allows DL experts with little compute to embrace BO. , with Carl, Danny, @LindauerMarius , @luiginardi
2
8
49
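A rough sketch of the core idea as described in the tweet, not the paper's exact formulation: weight a standard acquisition function by the user's prior over good regions, and let that influence decay as observations accumulate.

```python
# Rough sketch of prior-guided BO (not the paper's exact formulation): weight a
# standard acquisition function by the user's prior, annealed with n observations.
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best):
    z = (best - mu) / sigma                     # minimization convention
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

def prior_weighted_ei(mu, sigma, best, prior_density, n, beta=10.0):
    """EI times the prior, annealed so the data eventually dominates."""
    return expected_improvement(mu, sigma, best) * prior_density ** (beta / n)

# Example: the expert believes the best learning rate is near 1e-2.
log_lr = np.linspace(-5.0, 0.0, 200)                      # log10(learning rate)
prior = norm.pdf(log_lr, loc=-2.0, scale=0.5)             # expert's belief
mu = np.zeros_like(log_lr)                                # stand-in GP posterior mean
sigma = np.ones_like(log_lr)                              # stand-in GP posterior std
acq = prior_weighted_ei(mu, sigma, best=0.0, prior_density=prior, n=5)
print("next candidate: lr ≈ 10 **", round(log_lr[np.argmax(acq)], 2))
```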
@FrankRHutter
Frank Hutter
1 year
#GPT meets #AutoML : in an effort to integrate user knowledge into AutoML, our new tool CAAFE uses LLMs to generate semantically meaningful features for tabular data (and also explains them). Towards an AI assistant for human data scientists🚀 Paper Demo
Tweet media one
2
9
48
@FrankRHutter
Frank Hutter
8 months
@zicokolter just did a live demo of jailbreaking the live GPT-3.5 to write a tutorial on hotwiring a car at the #NeurIPS alignment workshop.
Tweet media one
1
2
49
@FrankRHutter
Frank Hutter
3 years
Fantastic news, I’m honoured and humbled to receive the AIJ Prominent Paper Award 2021! Super happy to receive this award with this amazing set of coauthors, @HolgerHoos , @k_leyton_brown & Lin Xu, with whom I started my research journey and who shaped me so much! Thanks!
@HolgerHoos
Holger Hoos
3 years
Very proud and humbled to have won, together with @k_leyton_brown and our former students, @FrankRHutter and Lin Xu, the AIJ Prominent Paper Award 2021 (as just announced at @IJCAIconf ), for our work on using #MachineLearning for predicting running times of #AI algorithms.
Tweet media one
12
8
77
1
0
48
@FrankRHutter
Frank Hutter
8 months
I'm at #NeurIPS2023 with my amazing team, excited to be presenting 6 papers at the main track and 2 at workshops, as well as a keynote in the table representation learning workshop. Here's all the info in one tweet, ordered by day. Please come by and chat with us 🙂 Tuesday
0
7
48
@FrankRHutter
Frank Hutter
2 years
Excited to present ELIZA, the Konrad Zuse School of Excellence in Learning and Intelligent Systems, a new AI Lighthouse comprising 7 German ELLIS Units, to attract the brightest MSc & PhD students to Germany. 1/2
Tweet media one
Tweet media two
Tweet media three
1
3
45
@FrankRHutter
Frank Hutter
2 years
Many DL papers for tabular data didn’t deliver, so I understand the skeptics. But I’m a champion for reproducibility and hope to convince you that way. All code is available at with demos. Here is a Colab comparing to baselines: 9/9
4
4
45
@FrankRHutter
Frank Hutter
3 years
For everyone submitting #NAS papers to #NeurIPS2021 : I'd encourage using the NAS best practices checklist () to ensure your work is reproducible. Here is a .tex file you can include in your appendix next to the NeurIPS checklist:
Tweet media one
0
7
45
@FrankRHutter
Frank Hutter
3 years
#AutoML_Conf 2022 is now open for submissions! Abstract submission deadline: February 24th. Full paper submission deadline: March 3rd. Please see the call for papers and the author guidelines:
1
18
41
@FrankRHutter
Frank Hutter
1 year
I have 2 new #AutoML PhD openings in our new Collaborative Research Center (Sonderforschungsbereich) „Small Data“, a major collaboration between the Freiburg university clinic and AI folks, with ~30 PhD students. 1st application deadline: June 18 1/4
2
16
43
@FrankRHutter
Frank Hutter
2 years
Announcing the late-breaking workshop track at #AutoML_Conf : paper submission by April 20, notification May 18 These short workshop papers are compatible with submission of full versions to a conference, e.g. #NeurIPS . Looking forward to great papers! 1/2
2
22
41
@FrankRHutter
Frank Hutter
5 years
I'm thrilled to announce that our proposal to run the first #NAS workshop at #ICLR2020 was accepted. We're looking forward to building a vibrant, open, collaborative, diverse and welcoming community around neural architecture search! Prepare to submit your work in late January!
2
6
42
@FrankRHutter
Frank Hutter
2 years
Congratulations, Dr @AndreBiedenkapp , on finishing your outstanding thesis on #DAC with summa cum laude, the best possible grade! It’s been a huge pleasure to supervise you over the last years, and to have you as a positive force & integral part of the lab. Thanks, Dr. Andre!
@LindauerMarius
Marius Lindauer
2 years
Congratulations to Dr André @AndreBiedenkapp . It was my greatest pleasure to co-supervise you for the last years. You showed an outstanding performance!
Tweet media one
Tweet media two
Tweet media three
2
2
40
1
4
40
@FrankRHutter
Frank Hutter
10 months
At the AI Safety Summit in London many risks will be discussed, but the biggest risk for Europe is different: the LLM ship has sailed without us, and we are on our way to an all-encompassing dependency on American companies. Europe needs to wake up and invest heavily in AI. We
2
7
42
@FrankRHutter
Frank Hutter
10 months
Good morning Schluchsee! What a beautiful location for an #AutoML research retreat with my amazing group. The perfect setup for planning the next big thing in AutoML 🚀
Tweet media one
Tweet media two
2
0
41
@FrankRHutter
Frank Hutter
4 years
Excited to introduce the first large-scale efficient-to-evaluate NAS benchmark, NAS-Bench-301: Exactly for the DARTS space (10^18 architectures)! Enabled by a surrogate model with lower estimation error than the noise in a single SGD run in NAS-Bench-101.
1
9
41
@FrankRHutter
Frank Hutter
1 year
I recently had the privilege to work with fairness-aware ML folks to explore the question Can fairness be automated? Spoiler alert: No it can’t. #AutoML needs to be extremely cautious in this domain but still has an important role to play. This was an amazing project that left
3
14
42
@FrankRHutter
Frank Hutter
3 years
We #metalearn proxy MDPs called "synthetic environments" for RL: given an RL environment E, you can train a neural net to replace E's original MDP & train policies for the original E more efficiently! , with @FerreiraFabioDE , Thomas N., @AndreasSalinger
1
6
41
@FrankRHutter
Frank Hutter
9 months
@predict_addict Dear Valeriy, this might be hard for you to accept: an independent evaluation on 98 datasets at the #NeurIPS2023 datasets and benchmark track () showed TabPFN to tie for 1st place with CatBoost in predictive performance but to be 200x faster to train. This
Tweet media one
3
4
40
@FrankRHutter
Frank Hutter
2 years
📢To avoid following CHI and CVPR in becoming #COVID super-spreader events, we decided together with all organizers and senior area chairs that #AutoML_Conf will mandate vaccinations, daily antigen self-tests, and masks inside. Hoping that our attendees safely make it through the preceding #ICML2022 ...
3
2
39
@FrankRHutter
Frank Hutter
3 years
More exciting news about #AutoML_Conf : we will have a journal track: , chaired by @HolgerHoos . In-scope open access papers recently published at top-tier journals will get the opportunity to present at the conference exactly like regular papers. 1/
1
13
37
@FrankRHutter
Frank Hutter
2 years
Congratulations to my students @noahholl , @SamuelMullr & @KEggensperger for winning the best paper award at the #NeurIPS2022 @TrlWorkshop with TabPFN! And many thanks to the organizers for a super-stimulating workshop, rightfully overflowing with attendees!
@noahholl
Noah Hollmann
2 years
So excited that our work on TabPFN received the best paper award at the NeurIPS Table Representation Learning Workshop on Friday, #neurips #TrlWorkshop 🥳🥳 This workshop has been so exciting, merging perspectives for semantic and more classical tabular data learning!
Tweet media one
Tweet media two
2
1
27
1
3
36
@FrankRHutter
Frank Hutter
3 years
We attracted great keynote speakers for AutoML-Conf, covering areas like NAS, meta-learning, AGI, AutoML & fairness, AutoML systems, and AutoML with dirty data! @AnimaAnandkumar @jeffclune @chelseabfinn @timnitGebru @JulieJosseStat @smolix 2/
Tweet media one
1
6
33
@FrankRHutter
Frank Hutter
8 months
This #NeurIPS2023 paper presents a really simple and effective idea: use LLMs to understand the context of a tabular data set and write code to engineer new features for it. Moving #AutoML more towards automated data science! Congratulations, @noahholl and @SamuelMullr !
@noahholl
Noah Hollmann
8 months
Excited to present "LLMs for Automated Data Science: Introducing CAAFE for Context-Aware Automated Feature Engineering" at #NeurIPS2023 ! We had a simple idea: Can we use the domain knowledge of LLMs to automate the feature engineering process? with @SamuelMullr @FrankRHutter
1
5
30
0
2
34
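A highly simplified sketch of that loop; ask_llm below is a hypothetical stand-in for an actual LLM call, and the real system additionally validates each generated feature before keeping it:

```python
# Highly simplified sketch of LLM-based feature engineering (ask_llm is a
# hypothetical stand-in for an LLM API call; the real system also
# cross-validates every generated feature before keeping it).
import pandas as pd

def ask_llm(prompt: str) -> str:
    """Hypothetical helper that returns pandas code suggested by an LLM."""
    raise NotImplementedError("plug in your favourite LLM client here")

def llm_feature_engineering(df: pd.DataFrame, dataset_description: str) -> pd.DataFrame:
    prompt = (
        f"Dataset description: {dataset_description}\n"
        f"Columns: {list(df.columns)}\n"
        "Write pandas code that adds semantically meaningful new columns to a "
        "DataFrame called df, and briefly explain each new feature in comments."
    )
    generated_code = ask_llm(prompt)
    exec(generated_code, {"pd": pd, "df": df})   # caution: sandbox generated code
    return df
```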
@FrankRHutter
Frank Hutter
5 years
Our Springer book on Automated Machine Learning has finally been published :-) Edited together with @larskotthoff and @joavanschoren . Get the free PDF or the hard cover: Also see the book website: #AutoML
Tweet media one
0
14
28
@FrankRHutter
Frank Hutter
10 months
Together with 3 Turing Award winners ( @geoffreyhinton , Yoshua Bengio, Andrew Yao), Nobel laureate @kahneman_daniel , Stuart Russell and 20 other AI & policy leaders, we just released a commentary that recommends actionable measures towards making AI safer:
1
7
34
@FrankRHutter
Frank Hutter
2 years
3 years and 2000 papers later, it was about time for an update of our NAS survey from 2019 with Thomas Elsken and Jan Hendrik Metzen!
@crwhite_ml
Colin White
2 years
In the past two years, there have been more than 1000 papers on neural architecture search🤯. What are the key insights? Introducing our new survey! with Mahmoud, @RheaSukthanker , Robin, Thomas, @ZelaArber , @debadeepta , @FrankRHutter #AutoML #NAS
Tweet media one
2
28
134
0
8
30
@FrankRHutter
Frank Hutter
4 years
Great to have the next generation of Auto-sklearn: Lots of improvements, including multi-fidelity optimization for fast progress on large datasets and theoretically-grounded meta-learning across datasets to find a strong set of configurations to start from
@__mfeurer__
Matthias Feurer
4 years
Excited to announce Auto-sklearn 2.0! Introduces a lot of improvements for large datasets, portfolio-based warmstarting, and meta-learning to set its own hyperparameters. Blog: with @KEggensperger , Stefan Falkner, @LindauerMarius and @FrankRHutter
2
15
34
0
9
33
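For context, the basic auto-sklearn interface looks roughly like this (a minimal sketch; the 2.0 improvements described above mostly change how the search is warmstarted and scheduled under the hood):

```python
# Minimal usage sketch of auto-sklearn's scikit-learn-style interface
# (constructor options abbreviated to the time budgets).
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

import autosklearn.classification

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

automl = autosklearn.classification.AutoSklearnClassifier(
    time_left_for_this_task=300,   # total search budget in seconds
    per_run_time_limit=30,         # budget per candidate pipeline
)
automl.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, automl.predict(X_test)))
```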
@FrankRHutter
Frank Hutter
3 years
I’ll give an #AutoML talk at #ODSCEurope on June 9th, directly followed by a hands-on workshop on Auto-sklearn by @KEggensperger and @__mfeurer__
@_odsc
Open Data Science
3 years
Tweet media one
0
1
4
0
10
29
@FrankRHutter
Frank Hutter
3 years
For everyone submitting #NAS papers to #ICLR2021 : I'd encourage using the NAS best practices checklist () for your reproducibility statement. Here is a .tex snippet you can fill out and include:
Tweet media one
0
7
32
@FrankRHutter
Frank Hutter
2 years
Really looking forward to #AutoML_Conf : In addition to 6 great keynote speakers, we'll have 5 tutorials, panels on AutoML & fairness and AutoML & large pretrained models, 4 poster sessions and lots of social/networking time, including a banquet on a yacht! 🧵 1/
2
4
28
@FrankRHutter
Frank Hutter
2 years
Our #NeurIPS 2022 workshop on #metalearning () will be hybrid. We also strongly encourage submissions from authors who do not plan to attend NeurIPS in person. There will be a virtual poster session. 4 pages + unlimited refs & appendix. Deadline Oct 1 AOE
0
6
29
@FrankRHutter
Frank Hutter
11 months
In our #NeurIPS2023 oral „Rethinking Bias Mitigation: Fairer Architectures Make for Fairer Face Recognition“, we use #AutoML for Trustworthy AI! More precisely, we use multi-objective joint #NAS & #HPO for fairness mitigation in face recognition. The main conclusion:
Tweet media one
1
6
32
@FrankRHutter
Frank Hutter
3 years
While I'm thrilled for the authors of accepted papers, I feel for the authors of the many rejected papers. Don't let this discourage you! Rejections happen to all of us; I once had 6/6 rejections, and there is a lot of noise in reviewing. What helps is using criticism well & perseverance.
0
0
27
@FrankRHutter
Frank Hutter
2 years
I'm happy to announce 4 competitions affiliated with #AutoML_Conf2022 : Zero-cost NAS (), multi-objective HPO for transformers (), DAC4AutoML (), and meta-learning from learning curves ()
0
13
30
@FrankRHutter
Frank Hutter
2 years
I’m not saying that our particular trained TabPFN revolutionizes data science. It totally doesn’t. But learning classification algorithms rather than programming them is pretty wild, and I do believe this is where the future of ML lies. AI-GAs @jeffclune 6/9
2
3
29
@FrankRHutter
Frank Hutter
2 years
What an amazing year this was. Besides the great advances in AI, we also have a home conference for #AutoML now! Thanks to all who helped make this happen @automl_conf ! And thanks to my awesome students for a great year. Let's take time to recharge batteries now🙂Happy holidays!
0
0
30
@FrankRHutter
Frank Hutter
3 years
We show that NAS-Bench-101 and -201 are not enough and release a unified #NAS benchmark suite with 25 benchmarks. A unified interface allows reliable benchmarking of NAS methods. , with @y_mehtu , @crwhite_ml , @ZelaArber , Arjun, Guri, Shakiba, Mahmoud & Kaicheng
1
3
28
@FrankRHutter
Frank Hutter
4 years
Had a great time at the #AutoML workshop at #ICML2020 ! Thanks to all our speakers, the participants, our sponsor @GoogleAI , slideslive, and my co-organizers @KEggensperger @__mfeurer__ @LindauerMarius @joavanschoren @charlesweill :-) All papers at
Tweet media one
0
5
29
@FrankRHutter
Frank Hutter
3 years
Congrats to everyone who survived the #NeurIPS deadline! Time to relax and recharge your batteries now :-)
0
0
24
@FrankRHutter
Frank Hutter
2 years
This week my fantastic PhD student @KEggensperger defended her PhD on #AutoML , #HPO and #benchmarking , with summa cum laude (perfect grade). It's been a privilege to be your supervisor, Katha! Katha is on the market now; be sure to make her an offer that's as awesome as she is!
0
0
26
@FrankRHutter
Frank Hutter
3 years
Our #AutoML MOOC on the AI Campus has over 1000 attendees as of today :-) It’s entirely free, including the certificates. I’m very happy to help shape the next generation of #AutoML experts :-) With @LindauerMarius and @BBischl #KICampus
0
6
25
@FrankRHutter
Frank Hutter
2 years
Excited for the dinner cruise of #AutoML_Conf 💥
Tweet media one
0
1
24
@FrankRHutter
Frank Hutter
1 year
As a diversity & outreach measure, I'm thrilled that the first 100 online student tickets for #AutoML23 are free: Please register while tickets last & spread the word! We also embrace hybrid participation this year: even the poster sessions are hybrid!
3
10
26
@FrankRHutter
Frank Hutter
3 years
Congratulations to my team, @SimonSchrodi , @crwhite_ml , Ekrem Öztürk & Danny Stoll for winning 2nd place in the first #NAS competition! Especially impressive given that, unlike the 1st- and 3rd-place teams, we refrained from using data augmentation, in the spirit of the rules.
2
2
26
@FrankRHutter
Frank Hutter
3 years
While I’m thrilled with the 100% acceptance rate of our #ICLR2022 submissions, I feel for the many authors who were less lucky. I’ve been there, too, many times. Peer review is quite noisy, but what pays off is perseverance and the ability to accept & incorporate feedback.
1
0
23
@FrankRHutter
Frank Hutter
3 years
By popular demand we extended the deadline for the #MetaLearning workshop at #NeurIPS2021 to September 29th. This is the day after the #ICLR2022 abstract deadline, so you can submit as a milestone for polishing up your ICLR submissions.
0
9
26
@FrankRHutter
Frank Hutter
5 years
4/4 papers accepted at #ICLR2020 , including an oral and a spotlight! What a nice Christmas present to my hard-working students :-) They deserve it! Thanks to the PCs, ACs, and reviewers for all their hard work with the reviewing process!
2
1
24
@FrankRHutter
Frank Hutter
2 years
DL 2.0 puts the domain expert back into the picture to specify (multiple) objective functions that matter in a particular application. Then multi-objective AutoML (optionally guided by DL experts) finds Pareto-optimal architectures, optimizers, SSL pipelines, hyperparams, etc 2/3
1
1
24
@FrankRHutter
Frank Hutter
4 years
Great news, our #ICLR2020 workshop on #nas got accepted! Deadline will be late February; please consider the NAS best practice checklist from our JMLR paper: . Organizing together with Aaron Klein, Jan Hendrik Metzen, Liam Li, Jovita Lukasik, and @ZelaArber
1
2
25
@FrankRHutter
Frank Hutter
5 years
We're starting a series of blog posts about our 7 neural architecture search (NAS) papers from 2019: First up: my PhD student Thomas Elsken about our ICLR 2019 paper on truly multiobjective NAS (experiments with up to 5 objectives):
0
6
24
@FrankRHutter
Frank Hutter
3 years
Would your company like to connect to the #AutoML research community and/or recruit in this field? If so, you might be interested in the call for sponsors for #AutoML_Conf2022 :
0
7
23
@FrankRHutter
Frank Hutter
2 years
Another way in which this may revolutionize data science is by comparison to AutoML systems. I’m the senior author of Auto-sklearn, a very successful 7-year project with 79 contributors. TabPFN was learned in 20h and in 1s ties what Auto-sklearn achieves in 5-60 minutes. 5/9
Tweet media one
1
1
25
@FrankRHutter
Frank Hutter
2 years
On day 2 of #Automl_Conf , @AnimaAnandkumar from @Caltech and @nvidia will speak about the Trinity of Explainable AI: Calibrated, Verifiable, and User-friendly AI
Tweet media one
1
1
23
@FrankRHutter
Frank Hutter
2 years
#AutoML_Conf update: registration is now open (), along with the call for applications for a small number of travel awards for those in need of financial assistance, . The tentative conference schedule is here:
0
13
23
@FrankRHutter
Frank Hutter
3 years
I'm very excited that my former PhD supervisor @HolgerHoos just received an Alexander von Humboldt (AvH) Professorship. This is Germany's highest-valued research prize (5M Euro). The press release focuses quite a bit on #AutoML :
1
1
23
@FrankRHutter
Frank Hutter
4 years
We have multiple positions open at #AutoML Freiburg, for both outstanding postdocs and outstanding research engineers. We focus on AutoML methods development, learning to learn, #NAS , #AutoPytorch , #Autosklearn , @open_ml , etc. Apply by March 15th; details:
0
8
22
@FrankRHutter
Frank Hutter
4 years
The result of this paper is still astonishing to me: given an RL environment E, you can train a (cheap) neural network "surrogate MDP" to replace E's original MDP to train a strong policy for the original E much faster! Go #AutoML & #MetaLearning :-)
@AutoML_org
AutoML.org
4 years
Paper + Oral accepted to #AAAI2021 MetaLearn Workshop: We meta-learn proxies for teaching agents Gym tasks, allowing for robust & up to 60% faster training + transferability to unseen agents Joint work by Thomas Nierhoff, @FerreiraFabioDE , @FrankRHutter
0
4
14
0
1
22
@FrankRHutter
Frank Hutter
5 years
New post in our NAS series: It's on NAS-Bench-101, a tabular NAS benchmark we created with Google to enable reproducibility and scientific experimentation in NAS. We also compared RL and Bayesian Optimization (BO) on NAS for the first time; BO was better!
Tweet media one
0
7
21
@FrankRHutter
Frank Hutter
2 years
📢I'm excited to share details about the six keynotes at #AutoML_Conf (right after #ICML ) in Baltimore, on topics including #metalearning , #AutoML systems, #explainability , #missingdata , #AGI , and #fairness . Don't miss it: 🧵
1
4
20
@FrankRHutter
Frank Hutter
2 years
Last call for submissions to the late-breaking workshop track of #AutoML_Conf . The deadline is in 28h (April 20th, AOE): Submitting to this workshop is a great milestone for AutoML papers in preparation for #NeurIPS !
1
6
20
@FrankRHutter
Frank Hutter
10 months
If you’d like to do a #PhD in #AutoML in my group, please apply through the #ELLISPhD program 👇
@ELLISforEurope
ELLIS
11 months
The portal is open: Our #ELLISPhD Program is now accepting applications! Apply by November 15 to work with leading #AI labs across Europe and choose your advisors among 200 top #machinelearning researchers! #JoinELLISforEurope #PhD #PhDProgram #ML
6
175
417
1
6
20
@FrankRHutter
Frank Hutter
3 years
The call for competitions at #AutoML_Conf2022 is out: We also welcome previous competitions reopening for submissions! Fast-track proposal deadline: Feb 19. Competition start: March 15. Winner announcement: at the conference July 25-27
1
11
21
@FrankRHutter
Frank Hutter
5 years
Are you writing a paper on neural architecture search (NAS)? Reviewing one? See our best practices for scientific research on NAS: , arXiv paper: , blog post: #automl #reproducibility
Tweet media one
0
6
19
@FrankRHutter
Frank Hutter
3 years
#AutoML helped the model-based RL approach PETS to find policies that break the MuJoCo simulator for half-cheetah with a new rolling gait! We all know that RL is sensitive to its hyperparameters, but this is a new high ;-)
@natolambert
Nathan Lambert
3 years
Ever wonder what the limits of current Deep RL algorithms are with better hyperparameter tuning? The answer (with model-based RL): way better than we thought. It breaks the simulator. Blog & paper link: 1/⬇️ presented at @aistats_conf 2021
14
63
333
0
3
21
@FrankRHutter
Frank Hutter
1 year
Congratulations @jabergT , @dyamins and @neurobongo on this runner-up ICML Test of Time Award. This early #AutoML work predates mainstream neural architecture search by at least 4 years!
@icmlconf
ICML Conference
1 year
"Making a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures" by @jabergT , @dyamins , @neurobongo
0
2
24
0
2
21
@FrankRHutter
Frank Hutter
4 years
Efficient #AutoML for Off-Policy RL, doing joint architecture and hyperparameter search at almost no overhead by sharing experiences from a population of agents. Similar to PBT, but much faster due to using a shared replay buffer, and also optimizing architectures on the fly.
@jkhfranke
Jörg Franke
4 years
🥳Our new paper on Sample-Efficient Automated Deep RL was accepted @iclr_conf ! We train a population of RL agents while simultaneously optimizing HPs and architecture sample-efficiently. () w/ @GregorKoe @AndreBiedenkapp @FrankRHutter @AutoML_org
2
5
23
0
3
19