Oier Mees Profile
Oier Mees

@oier_mees

1,663
Followers
265
Following
61
Media
189
Statuses

Robot Foundation Models. Postdoc at @berkeley_ai w/ Sergey Levine. PhD @UniFreiburg. Prev. intern @NVIDIAAI.

Joined October 2022
Pinned Tweet
@oier_mees
Oier Mees
10 months
Thrilled to announce Octo 🐙, an open-source robot foundation model! Octo is a state-of-the-art generalist robot policy based on a transformer + diffusion architecture. Most importantly, you can finetune Octo *today* with flexible observation and action spaces on your own robot setup!
8
29
208
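For readers who want to try the model announced above, a minimal sketch of loading a pretrained Octo checkpoint and sampling a language-conditioned action follows; the module path, checkpoint name, observation keys, and method signatures are assumptions based on the public octo repository and should be checked against its documentation.

```python
# Minimal sketch (assumed API; verify against the octo repo docs): load a
# pretrained Octo checkpoint and sample an action for a language-specified task.
import jax
import numpy as np
from octo.model.octo_model import OctoModel  # assumed module path

model = OctoModel.load_pretrained("hf://rail-berkeley/octo-base")  # assumed checkpoint name

# Dummy single-frame observation; real setups plug in their own camera keys and
# history window, which is the "flexible observation space" the tweet mentions.
observation = {
    "image_primary": np.zeros((1, 1, 256, 256, 3), dtype=np.uint8),  # (batch, window, H, W, 3)
    "timestep_pad_mask": np.ones((1, 1), dtype=bool),
}
task = model.create_tasks(texts=["pick up the spoon"])
actions = model.sample_actions(observation, task, rng=jax.random.PRNGKey(0))
print(actions.shape)  # batched action chunk predicted by the diffusion head
```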
@oier_mees
Oier Mees
1 year
Excited to start my Postdoc with Prof. @svlevine at @UCBerkeley ! I look forward to collaborating with all the amazing researchers at @berkeley_ai (BAIR)!
2
1
123
@oier_mees
Oier Mees
1 year
Today I had the pleasure of presenting my 🤖 robot learning research at @ToyotaResearch in Los Altos and learning about their ongoing efforts on building and scaling 📈 AI models for robotics! Thanks for the invitation!
2
2
112
@oier_mees
Oier Mees
1 year
Excited that AVLMaps, our take on 🤖 following multimodal prompts from audio 🔊, vision 👀 and language 🗣️ with foundation models to solve zero-shot spatial goal navigation, will be presented at ISER 2023! w/ @huang_chenguang @andyzeng_ @wolfram_burgard :
0
15
100
@oier_mees
Oier Mees
1 year
Honored to receive the IEEE Robotics and Automation Letters Best Paper Award for our CALVIN paper and to be a runner-up for the IEEE International Conference on Robotics and Automation Best Paper Award in Robot Learning for HULC2! w/ @wolfram_burgard @UniFreiburg @utn_nuremberg
5
3
70
@oier_mees
Oier Mees
11 months
Check out the workshop recordings 📹📺 if you missed the 2nd edition of our Language and Robotics Workshop at #CoRL2023!
@oier_mees
Oier Mees
1 year
Join us for the 2nd edition of the #LangRob workshop at #CoRL2023 in vibrant Atlanta! Get ready for an unforgettable day with an all-star ensemble of speakers and two spicy panels that will ignite your passion for language and robotics! 🔥🤖 P.S. Guess who wrote this tweet 😉
0
11
67
0
9
64
@oier_mees
Oier Mees
1 year
It was an honor to give a Google Tech Talk to present my latest research on 🤖 robot learning at @GoogleDeepMind in Mountain View! It was fun and very inspirational to spend the day talking to many smart people about tackling the next research challenges for scaling 🤖 learning!
1
0
63
@oier_mees
Oier Mees
1 year
Join us for the 2nd edition of the #LangRob workshop at #CoRL2023 in vibrant Atlanta! Get ready for an unforgettable day with an all-star ensemble of speakers and two spicy panels that will ignite your passion for language and robotics! 🔥🤖 P.S. Guess who wrote this tweet 😉
0
11
67
@oier_mees
Oier Mees
2 years
Excited to share our latest collaboration with @GoogleAI on following multimodal prompts from audio, vision and language to solve zero-shot spatial navigation tasks in the real world!
@andyzeng_
Andy Zeng
2 years
Can robots 🤖 navigate to sounds 🔊 they've heard? With audio-language 🔊✏️ foundation models, excited that we can now ask our helper robots to "go to where you heard coughing". Audio-Visual-Language Maps w/ @huang_chenguang @oier_mees @wolfram_burgard :
1
47
207
0
10
58
@oier_mees
Oier Mees
2 years
Great way to start the year with 2/2 papers accepted at #ICRA23 on language-based skill learning and navigation! VLMaps: HULC++: Thanks to my wonderful collaborators @andyzengtweets, @huang_chenguang, @wolfram_burgard and Jessica!
0
10
59
@oier_mees
Oier Mees
2 years
PhD thesis submitted! Under the supervision of @wolfram_burgard. Big shout-out to my family, friends, and colleagues for the incredible encouragement and support. This milestone wouldn't have been possible without you! 🙌🎓 #PhD #ThesisSubmitted
5
1
47
@oier_mees
Oier Mees
2 years
Unlabeled ♾ data is the key ingredient of today's ML, but how can we leverage unstructured, unlabeled, offline datasets in robot learning? Introducing TACO-RL, a robotic model that consumes uncurated data to learn multi-tier, long-horizon policies! 📜
1
10
35
@oier_mees
Oier Mees
1 year
Congrats to my student @ErickRoseteBeas who has won the Best Master's Thesis Award from the Association of German Engineers (VDI)! 🙌 His thesis "Skill-Chaining Latent Behaviors with Offline RL", co-supervised with @GabrielKalweit, led to our TACO-RL #CoRL2022 paper!
@oier_mees
Oier Mees
2 years
Unlabeled ♾ data is the key ingredient of today's ML, but how can we leverage unstructured, unlabeled, offline datasets in robot learning? Introducing TACO-RL, a robotic model that consumes uncurated data to learn multi-tier, long-horizon policies! 📜
1
10
35
0
4
30
@oier_mees
Oier Mees
11 months
Honored to have been invited to serve as an Associate Editor of Robot Learning for @ieeeras Robotics and Automation Letters journal (RA-L)! Looking forward to working with the team of @AleksandraFaust 🙂
0
1
30
@oier_mees
Oier Mees
1 year
Habemus PhD! 🎓 Thanks to @wolfram_burgard , Dieter Fox, Joschka Boedecker, Matthias Teschner and Armin Biere for being on my committee. Thanks to all friends, family and collaborators for being part of this incredible journey. Summa cum laude!
2
0
26
@oier_mees
Oier Mees
1 year
VLMaps was an amazing collaboration between @GoogleAI @UniFreiburg @utn_nuremberg with @andyzengtweets @huang_chenguang and @wolfram_burgard. Check out the project for zero-shot, language-enabled spatial robot navigation! #ICRA2023
0
6
24
@oier_mees
Oier Mees
1 year
Honored to be featured in the newspaper!
0
0
22
@oier_mees
Oier Mees
1 year
I am thrilled to receive the AI Newcomer 2023 award from the German Federal Ministry of Education and Research and the German Informatics Society! Thanks to my collaborators, all the people who supported us with their votes, the jury and the KI-Camp organizers @informatikradar
@informatikfest
KI-Camp 2023
1 year
Oier Mees is AI Newcomer in the category Technical and Engineering Sciences - congratulations! 🎉 @oier_mees's research aims to enable robots to perform a wide range of everyday tasks in human-centered environments. More information here:
0
0
0
1
0
20
@oier_mees
Oier Mees
1 year
@svlevine I wanted to highlight that, despite the authors being humble about this, they have significantly outperformed SOTA on the challenging CALVIN zero-shot benchmark ()! 🙌 @kvablack @mitsuhiko_nm @HomerWalke Pranav Atreya @aviral_kumar2 @chelseabfinn @svlevine
1
2
18
@oier_mees
Oier Mees
1 year
Congrats to my co-authors and all the other awardees and finalists! Check out the two papers: CALVIN: HULC2: #robotics #learning #ICRA2023 #RAL #robotlearning
0
3
16
@oier_mees
Oier Mees
11 months
Come to the all-star ✨ #LangRob panel with @animesh_garg @DorsaSadigh @Ken_Goldberg @DhruvBatraDB and Abhinav Gupta moderated by the fantastic @ybisk ! Room Sequoia 2 or stream
@oier_mees
Oier Mees
1 year
Join us for the 2nd edition of the #LangRob workshop at #CoRL2023 in vibrant Atlanta! Get ready for an unforgettable day with an all-star ensemble of speakers and two spicy panels that will ignite your passion for language and robotics! 🔥🤖 P.S. Guess who wrote this tweet 😉
0
11
67
0
0
16
@oier_mees
Oier Mees
1 year
Excited to have been part of this 🌎-scale collaboration! Groundbreaking: 173 researchers, 34 labs, 22 embodiments and 1M+ trajectories. Hope this will become our ImageNet moment! Proud to have been the first 🇪🇺 lab to provide evals in my Freiburg setup before joining Berkeley.
@QuanVng
Quan Vuong
1 year
RT-X: generalist AI models lead to 50% improvement over RT-1 and 3x improvement over RT-2, our previous best models. 🔥🥳🧵 Project website:
7
143
619
0
1
15
@oier_mees
Oier Mees
2 years
Our workshop on Language and Robotics #LangRob is on at #CoRL2022! 🔥 Very interesting talks and panel discussions on the future of language, LLMs and robotics, plus an amazing live demo by @andyzengtweets! Kudos to co-organizers @xiao_ted, @shahdhruv_, @mohito1905 and many more!
0
1
14
@oier_mees
Oier Mees
1 year
If you want to know how to use the Open X-Embodiment (aka RT-X) dataset, check out the deep dive by the fantastic @KarlPertsch who led the herculean effort of collecting and standardizing the individual datasets together with the no less amazing @QuanVng and @xiao_ted !
@KarlPertsch
Karl Pertsch
1 year
Very excited to release the Open X-Embodiment Dataset today — the largest robot dataset to date with 1M+ trajectories! Robotics needs more data & this is a big step! There’s lots to unpack here, so let’s do a deep dive into the dataset! 🧵1/15
8
90
450
0
1
15
@oier_mees
Oier Mees
1 year
Super honored to have been invited to join the @ELLISforEurope Society, the Pan-European AI Laboratory! I feel incredibly grateful for the endorsement of amazing scholars including @chelseabfinn , @m_wulfmeier , @RCalandra @DanicaKragic and Jens Kober!
0
0
13
@oier_mees
Oier Mees
2 years
Great panel between researchers pro and contra language+robotics! Come check it out at the #LangRob workshop @ #CoRL2022 ! @LerrelPinto @andyzengtweets @DorsaSadigh @ybisk
1
1
14
@oier_mees
Oier Mees
1 year
Some very exciting new work to share tomorrow... 🤖 Stay tuned! 👀 👀
0
0
11
@oier_mees
Oier Mees
2 years
I am in Auckland this week for #CoRL2022 ! Ping me if you are interested in talking about natural language + robot learning 😊
0
0
11
@oier_mees
Oier Mees
2 years
@LerrelPinto Great work @LerrelPinto ! Love seeing more people work on leveraging play data. We present our take on it in our upcoming CoRL paper:
1
2
9
@oier_mees
Oier Mees
2 years
On my way to #IROS2022 in Kyoto 🇯🇵 to present our RA-L papers on language-conditioned policy learning, CALVIN and HULC! 👇 Ping me if you are interested in chatting about robot skill learning and/or language grounding!
0
0
7
@oier_mees
Oier Mees
2 years
Thrilled to see our work on connecting foundation models <-> maps (via language) featured on the Google AI Blog!
@GoogleAI
Google AI
2 years
VLMaps is a map representation that fuses pre-trained visual-language embeddings into a 3D reconstruction of an environment, enabling robots to index landmarks and generate open-vocabulary maps for path planning. Learn more and copy the code →
21
148
585
0
0
7
@oier_mees
Oier Mees
1 year
We are hosting the 2nd workshop on Language and Robot Learning at #CoRL2023 ! We've got an incredible panel of speakers, and we are excited to discuss "Language as Grounding" together! Be sure to submit your papers!
@xiao_ted
Ted Xiao
1 year
Announcing the 2nd Workshop on Language and Robot Learning at #CoRL2023 on November 6th, 2023! This year's theme is "Language as Grounding". Featuring a great speaker lineup and two panels! Website: CfP: Deadline: October 1
2
13
76
0
0
7
@oier_mees
Oier Mees
1 year
Pretty cool results! Looking forward to your talk at our #LangRob workshop at #CoRL2023 @Ed__Johns !
@Ed__Johns
Edward Johns
1 year
So you say that LLMs can't be used for lower-level robot control, because they aren't directly trained on robotics data? Actually, we found that often, they can! "Language Models as Zero-Shot Trajectory Generators". See: . I'll explain more below ... 🧵
9
50
277
1
0
7
@oier_mees
Oier Mees
11 months
We are collecting tough questions to make our panelists sweat! 🔥 Ask a question to your favorite panelist here!
@oier_mees
Oier Mees
1 year
Join us for the 2nd edition of the #LangRob workshop at #CoRL2023 in vibrant Atlanta! Get ready for an unforgettable day with an all-star ensemble of speakers and two spicy panels that will ignite your passion for language and robotics! 🔥🤖 P.S. Guess who wrote this tweet 😉
0
11
67
0
0
6
@oier_mees
Oier Mees
1 year
Well deserved! 🙌 We have great stuff in the works... 👀
@smithlaura1028
Laura Smith
1 year
Excited and deeply grateful to share that I've been selected as a recipient of the 2023 Google PhD Fellowship! Thank you @GoogleAI for graciously supporting my research in developing intelligent, self-improving real-world systems 🤖
24
13
372
0
0
6
@oier_mees
Oier Mees
1 year
It was an honor to receive the AI Newcomer 2023 award from the German Federal Ministry of Education and Research and the German Informatics Society! Congratulations to Noémie Jaquier (KIT) who also won the award! 🙌 @informatikradar @BMBF_Bund @informatikfest 📷 © BMBF/C. Czybik
0
1
6
@oier_mees
Oier Mees
2 years
Impressive work by @chichengcc @SongShuran and team! Love how diffusion policies capture multimodal distributions and the thorough evaluation!
@chichengcc
Cheng Chi
2 years
What if the form of visuomotor policy has been the bottleneck for robotic manipulation all along? Diffusion Policy achieves a 46.9% improvement vs prior SOTA on 11 tasks from 4 benchmarks + 4 real world tasks! (1/7) website : paper:
9
99
538
0
0
5
@oier_mees
Oier Mees
1 year
Our Dibya has made a neat visual browser of the large-scale RT-X robot dataset! Surprised to see the sim data with 32x32 images in it 😅
@its_dibya
Dibya Ghosh
1 year
Got a chance to dig through the big robot X-embodiment dataset released last week, and hacked together a little website for others to look through the data. Check it out! There's some pretty random and diverse robot data in there
0
37
173
0
0
5
@oier_mees
Oier Mees
1 year
Let's talk about Visual Language Maps for Robot Navigation here! #ICRA2023 @andyzengtweets @Chenguang @wolfram_burgard
1
0
5
@oier_mees
Oier Mees
11 months
LangRob @ #CoRL2023 is starting in 10 minutes! Join us in Sequoia 2 or online:
@oier_mees
Oier Mees
1 year
Join us for the 2nd edition of the #LangRob workshop at #CoRL2023 in vibrant Atlanta! Get ready for an unforgettable day with an all-star ensemble of speakers and two spicy panels that will ignite your passion for language and robotics! 🔥🤖 P.S. Guess who wrote this tweet 😉
0
11
67
0
0
4
@oier_mees
Oier Mees
2 years
Well done @andyzengtweets and @jackyliang42 ! 👏 Great to see the demo come a long way since my last visit! 🙂
@GoogleAI
Google AI
2 years
From our demo floor at AI@, check out Code as Policies at work. This helper robot is able to compute and execute a task given via natural language. Read more →
13
66
253
0
0
3
@oier_mees
Oier Mees
2 years
The emergent ability of LLMs to evaluate their own answers has ♾️ potential. Could LLMs improve themselves like this? 🤔
@ericjang11
Eric Jang
2 years
Instead of finding the perfect prompt for an LLM (let's think step by step), you can ask LLMs to critique their outputs and immediately fix their own mistakes. Here's a fun example:
61
223
1K
0
1
3
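A tiny sketch of the critique-and-revise loop described in the quoted tweet is below; call_llm is a hypothetical placeholder for whatever model API you use, and the prompts are illustrative only.

```python
# Illustrative sketch of the self-critique loop from the quoted tweet.
# call_llm() is a hypothetical placeholder, not a real client library.
def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in your LLM client here")

def answer_with_self_critique(question: str, rounds: int = 2) -> str:
    """Draft an answer, ask the model to critique it, and revise until the critique is empty."""
    answer = call_llm(f"Question: {question}\nAnswer:")
    for _ in range(rounds):
        critique = call_llm(
            f"Question: {question}\nAnswer: {answer}\n"
            "List any mistakes in this answer, or reply 'none'."
        )
        if critique.strip().lower().startswith("none"):
            break
        answer = call_llm(
            f"Question: {question}\nAnswer: {answer}\nCritique: {critique}\n"
            "Rewrite the answer, fixing the mistakes:"
        )
    return answer
```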
@oier_mees
Oier Mees
1 year
@svlevine @kvablack @mitsuhiko_nm @HomerWalke @aviral_kumar2 @chelseabfinn Would be amazing if today's equally impressive concurrent Video Language Planning () work would also consider evaluating on the public zero-shot CALVIN benchmark 🙂 @du_yilun @mengjiao_yang @brian_ichter @andyzeng_ @peteflorence
0
0
3
@oier_mees
Oier Mees
2 years
Excited to be nominated for the 2023 AI Newcomer award by the German Federal Ministry of Education and Research and @informatikradar ! 😊 If you have time, please support me by voting via #KINewcomer #KICamp23 @UniFreiburg
0
2
3
@oier_mees
Oier Mees
2 years
Excited to co-organize #RSSPioneers2023 ! Deadline for applications is this Friday Feb 10 AoE, good luck everybody! 😊
@RSSPioneers
RSS Pioneers
2 years
Applications for the #RSSPioneers2023 workshop (to be held at #RSS2023 ) are open! Senior PhD students and early-career researchers in robotics are encouraged to apply. Find more details on how to apply here:
1
6
7
0
0
3
@oier_mees
Oier Mees
11 months
@contactrika Congrats Rika! 😊
0
0
1
@oier_mees
Oier Mees
2 years
This summarizes GPT-4's 98-page technical report perfectly: we trained stuff and got results. It's sad that AI is increasingly becoming closed-source research.
0
0
2
@oier_mees
Oier Mees
2 years
We learn a single multi-task visuomotor policy for 25 tasks that outperforms state-of-the-art baselines with an order-of-magnitude improvement, using a 9-hour-long continuous demonstration and only image observations. We have released our data, code and models!
0
0
3
@oier_mees
Oier Mees
1 year
@xiao_ted You are too kind @xiao_ted ! All this work stands on the shoulders of many of your and your team's ( @coreylynch , Pierre Sermanet, ...) advancements!
0
0
3
@oier_mees
Oier Mees
1 year
@SurajNair_1 @ToyotaResearch The pleasure was mine! Really looking forward to the cool stuff you guys are working on at TRI!
0
0
2
@oier_mees
Oier Mees
2 years
A key ingredient for our offline RL policy is the goal sampling strategy. We find that incorporating hard negative examples from scenes that contain similar proprioceptive states enables the Q-function to learn to focus on the scene's under-actuated parts (e.g., objects).
1
0
2
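To make the goal-sampling idea above concrete, here is a small illustrative sketch (not the paper's implementation): hard negative goals are frames whose proprioceptive state is close to the anchor's but which come from a different trajectory, so the Q-function has to attend to the scene rather than the arm.

```python
# Illustrative sketch (not TACO-RL's released code): sample a hard negative goal
# frame with similar proprioception but from a different trajectory, so value
# learning must focus on the under-actuated scene (objects) rather than the robot.
import numpy as np

def sample_hard_negative_goal(
    proprio: np.ndarray,        # (N, P) proprioceptive states of all dataset frames
    traj_ids: np.ndarray,       # (N,) trajectory id of each frame
    anchor_idx: int,            # frame we are sampling a negative goal for
    k: int = 16,                # size of the candidate pool
    rng: np.random.Generator | None = None,
) -> int:
    """Return the index of a frame with a similar robot state from another trajectory."""
    rng = rng or np.random.default_rng()
    dists = np.linalg.norm(proprio - proprio[anchor_idx], axis=-1)
    dists[traj_ids == traj_ids[anchor_idx]] = np.inf   # exclude the anchor's own trajectory
    candidates = np.argsort(dists)[:k]                 # k most similar proprioceptive states
    return int(rng.choice(candidates))
```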
@oier_mees
Oier Mees
2 years
TACO-RL makes sense of this data by combining the strengths of imitation learning and offline RL. Our hierarchical approach combines a low-level policy that learns latent skills via imitation and a high-level policy learned from offline RL for skill-chaining the latent behaviors.
1
0
2
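A conceptual sketch of that two-level decomposition is below (an illustrative interface, not the released code): a high-level policy trained with offline RL picks a latent skill given the current observation and goal, and the imitation-learned low-level policy decodes that latent into actions for a fixed horizon before re-planning.

```python
# Conceptual sketch of the hierarchy described above (not the released TACO-RL code):
# a high-level policy chooses latent skills, a low-level policy decodes them into actions.
from typing import Callable
import numpy as np

Obs = np.ndarray
Latent = np.ndarray
Action = np.ndarray

def rollout_hierarchy(
    high_level: Callable[[Obs, Obs], Latent],    # offline-RL policy: (obs, goal) -> latent skill
    low_level: Callable[[Obs, Latent], Action],  # imitation-learned skill decoder
    env_step: Callable[[Action], Obs],           # environment transition (hypothetical)
    obs: Obs,
    goal: Obs,
    skill_horizon: int = 16,
    num_skills: int = 8,
) -> Obs:
    """Chain latent skills: re-plan at the high level every `skill_horizon` low-level steps."""
    for _ in range(num_skills):
        z = high_level(obs, goal)                # pick the next latent behavior
        for _ in range(skill_horizon):
            obs = env_step(low_level(obs, z))    # execute it with the low-level policy
    return obs
```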
@oier_mees
Oier Mees
1 year
Due to popular demand, we are extending the #CoRL2023 LangRob workshop deadline by one week to *October 8, 23:59 AOE*! LangRob remains non-archival, and welcomes recent submissions to ICRA/ICLR.
@oier_mees
Oier Mees
1 year
We are hosting the 2nd workshop on Language and Robot Learning at #CoRL2023 ! We've got an incredible panel of speakers, and we are excited to discuss "Language as Grounding" together! Be sure to submit your papers!
0
0
7
0
0
2
@oier_mees
Oier Mees
1 year
Can't wait to share it!
@svlevine
Sergey Levine
1 year
So far, there have been some remarkable large-scale robotic learning results, datasets, and milestones this year. But we have something pretty big coming out tomorrow. So big that we needed a globe to visualize its scale😉
24
48
772
0
0
2
@oier_mees
Oier Mees
1 year
@xiao_ted @GoogleDeepMind Thanks a lot Ted! It's always a great pleasure to talk and brainstorm with you!
0
0
1
@oier_mees
Oier Mees
1 year
@coreylynch Very impressive @coreylynch ! Would love to teach the robot some everyday tasks in our lab if you are looking for academic collaborators!
0
0
1
@oier_mees
Oier Mees
2 years
Had a blast presenting TACO-RL, our take on scaling robot learning to highly diverse, unstructured, unlabeled, offline data to learn task-agnostic, long-horizon policies at #CoRL2022 !
0
0
2
@oier_mees
Oier Mees
11 months
@chris_j_paxton Congrats Chris! 🎉
0
0
2
@oier_mees
Oier Mees
1 year
@Ken_Goldberg Congrats on the big ICRA push!
0
0
1
@oier_mees
Oier Mees
2 years
0
0
1
@oier_mees
Oier Mees
2 years
@ChukwuemekaOme4 @LerrelPinto That's a great question! A good starting point might be CALVIN, which contains several hours of multimodal play data for 4 simulated tabletop environments.
1
0
1
@oier_mees
Oier Mees
2 years
@shahdhruv_ @wolfram_burgard Thanks! But the defense is missing, so no Dr yet 😉
0
0
1
@oier_mees
Oier Mees
1 year
@GeorgiaChal @wolfram_burgard Thanks a lot Georgia! See you hopefully at ICRA 🙂
1
0
1
@oier_mees
Oier Mees
2 years
@hausman_k Amazing work @hausman_k @xiao_ted et al.! Since you mention the project took so long, what were the biggest challenges you had to solve to get it to work?
1
0
1
@oier_mees
Oier Mees
1 year
@coreylynch @wolfram_burgard @UniFreiburg @utn_nuremberg Thanks Corey! A lot of this research has been possible thanks to the pioneering work from you 😊
0
0
1
@oier_mees
Oier Mees
2 years
Based on the success of tapping into large-scale unlabeled data in other fields, our goal was to build a model that absorbs uncurated robot data. It should discover skills from highly diverse, multimodal data, collected by asking users to teleop without any tasks in mind.
1
0
1
@oier_mees
Oier Mees
1 year
@jackyliang42 @wolfram_burgard Thanks Jacky 😊 likewise!
0
0
0
@oier_mees
Oier Mees
2 years
0
0
1
@oier_mees
Oier Mees
2 years
@chris_j_paxton @mshaheerahmed What? 😮 That's a shame, I really liked CLIP-Fields!
0
0
1
@oier_mees
Oier Mees
2 years
@chris_j_paxton Thanks Chris! Looking forward to more research in this direction 😊
0
0
1