AI Curriculum
⚪MIT 6.S191: Introduction to Deep Learning
⚪CS231n: CNNs for Visual Recognition, Stanford | Spring 2019
⚪CS224n: NLP with Deep Learning, Stanford | Winter 2019
⚪CS285: Deep Reinforcement Learning, UC Berkeley | Fall 2019
Interactive Tools for ML, DL and Math
- CNN Explainer
- Play with GANs in the Browser
- ConvNet Playground
- Distill: Exploring Neural Networks with Activation Atlases
- A visual introduction to Machine Learning
- Interactive Deep Learning Playground
...
Super excited to share that I joined
@huggingface
🤗 working on open source and open science efforts and most importantly, supporting the AI community building the future! 😊🚀
We completed Part I Mathematical Foundations of the MML book! 🎉 On Sunday we'll do a remote recap session – let me know if you'd like to join!
Linear Algebra
Analytic Geometry
Matrix Decompositions
Vector Calculus
Probability & Distributions
Optimization
Interactive tools are great. We have collected a bunch more incl.
⚫ CNN Explainer
⚫ GANs in the Browser
⚫ ConvNet Playground
⚫ Activation Atlases
⚫ Initializing neural networks
⚫ Embedding Projector
⚫ OpenAI Microscope
⚫ and more
If you’ve never tried it, this is the single best explanatory tool for neural networks. An essential demo for any deep learning course!
I still notice improvements in my intuition just by tinkering with it.
From
@dsmilkov
@shancarter
.
Dimitris prepared a Colab notebook with implementation blocks in Keras for
- AlexNet
- VGG
- GoogLeNet/Inception
- MobileNet
- ShuffleNet
- ResNet
- DenseNet
and a cheat sheet containing architecture details and summaries of the original papers.
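A cheat sheet of architecture details comes down to kernel sizes, strides and padding, and the one formula that ties them together is the output-size rule. A minimal pure-Python sketch of that rule (illustrative, not taken from the notebook):

```python
def conv_out(size, kernel, stride=1, padding=0):
    """Spatial output size of a convolution or pooling layer:
    floor((size - kernel + 2*padding) / stride) + 1."""
    return (size - kernel + 2 * padding) // stride + 1

# AlexNet's first conv: 11x11 kernel, stride 4, on a 227x227 input -> 55x55.
alexnet_conv1 = conv_out(227, kernel=11, stride=4)

# VGG-style block: 3x3 convs with padding 1 preserve the spatial size,
# and a 2x2 max-pool with stride 2 halves it (224 -> 224 -> 112).
vgg_conv = conv_out(224, kernel=3, stride=1, padding=1)
vgg_pool = conv_out(vgg_conv, kernel=2, stride=2)
```

Working through each layer of a paper's architecture table with this rule is a quick sanity check that your implementation matches the original.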
5 years from now, universities will still be relevant, but there will be an army of young people assembling their own individual education systems composed of free, top-quality online resources offered by leading experts. A generation of high-tech niche experts.
If you're looking for an intro to applied ML, this is very well structured. Could also be something for
@__MLT__
study sessions 😊
🤖 Applied Machine Learning (Cornell CS5785, Fall 2020)
Lecture series:
GitHub:
.
@Stanford
continues to amaze me. CS106A Code in Place – free coding education.
The course teaches the fundamentals of programming with Python. They specifically encourage humanists and social scientists to join. No background in programming needed. 🙌
This is so cool. + open source.
@netflix
👏
"Polynote: a new, polyglot notebook with first-class Scala support, Apache Spark integration, multi-language interoperability including Scala, Python, and SQL, as-you-type autocomplete, and more." 🤯
Super excited to be at DeepCon on June 8! We'll look at
AlexNet
VGG
Inception
MobileNet
ShuffleNet
ResNet
DenseNet
Xception
U-Net
SqueezeNet
YOLO
RefineNet
The workshop will be recorded, and you can find our code on GitHub (Part I: ConvNets.ipynb)
@dkatsios
I'm collecting interactive ML/DL/Math tools. :) Things like:
- GANs in the browser
- ConvNet Playground
- Activation Atlas
- Embedding Projector
- Initializing Neural Networks
and more. Please add to the list if you know more great resources.
@__MLT__
Updated list of open lectures from top Universities includes now:
👉 DS-GA 1008: Deep Learning (with PyTorch), NYU | Spring 2020
👉 CS285: Deep Reinforcement Learning, UC Berkeley | Fall 2020
See all:
Pretty cool:
@DeepMind
released the '21 DeepMind x
UCL Reinforcement Learning Lecture Series – comprehensive intro to modern RL. Anyone up for study sessions
@__MLT__
? 📚
I received an offer for a Computer Science degree on Coursera for $19,200. I'd rather take 6 months off, move to some nice places, e.g. Barcelona, the South of France, Belgrade, .. (living expenses: $2,500/month + flights), take some online courses, and start building things.
Our new-ish, neural, pure Python stanfordnlp package provides grammatical analyses of sentences in over 50 human languages! Version 0.2.0 brought sensibly small model sizes and an improved lemmatizer. Try it out: pip install stanfordnlp
I'm happy to announce that I joined
@CausalyAI
as a Computational Linguist working on natural language understanding for biomedical literature! 🎉 We're helping scientists and research teams get evidence from millions of publications in Biomedicine fast. 🙌
Already more than 10k GitHub Stars! AutoGen enables building next-gen LLM applications based on multi-agent conversations with minimal effort. It simplifies the orchestration, automation, and optimization of a complex LLM workflow. 🔥
Some personal news: I'm happy to share that I moved to New York! 🎉 Grateful that my O-1 Visa was approved and can't wait to see what this new chapter in my life will bring. I'm still working on AI content safety for LLMs / Azure OpenAI and I still love it. And related to that, I
Yesterday someone (ML, CS PhD, Stanford) said he would not hire a person who is online educated in Machine Learning. Who here agrees and who thinks differently?
It's been 5 years since I started my machine learning journey, deeply grateful for the incredible opportunities to work on NLP/DL projects with brilliant leaders and wonderful teams. Can't wait to see what the next 5 years will bring! 💙
We receive hundreds of messages on how to get started with Deep Learning. You need Theory, Math & Code, but there's a data overload of online resources, random and scattered. It's difficult and time-consuming to understand and curate which resources are the best for your purpose.
Funny thing just happened on the train.
Him: Excuse me, are you a ML Engineer?
Me: 🧐
Him: Is that you? (Shows me my LinkedIn on his phone)
Me: 🧐🧐
Him: I sent you a message, I’m <name>.
Me: Have we met before?
Him: No, you never replied. I’m a recruiter.
WOooops. Weird. 😆
3 pretty cool open source annotation tools for Computer Vision and NLP tasks:
⚫ Make Sense
⚫ Doccano
⚫ INCEpTION
Bonus points for Make Sense: Runs in your browser without installation, and without storing your images! 👏👏👏
I'm looking for resources/tutorials/blog posts on OOP and SWE best practices for ML and Data Science projects in production. Can anyone help?
Something like
@phoxicle
's slides and code for "OO Design Patterns For ML" at MLT Women in Machine Learning:
Woke up this morning with the idea of taking a week off this summer to only do ML math, with a tutor, visualizations, exercises, and the goal of creating something, like a blog post or a video or a notebook. Would anyone else be up for that?
Starting on Sunday, we'll be coding up some of the most fundamental Deep Learning architectures
@__MLT__
, following
A great opportunity to dive into Deep Learning if you're new to the field!
APAC
@pierre_wuethri
&
@mrityunjay_99
This repo contains examples and best practices for building NLP systems, provided as Jupyter notebooks and utility functions, incl. state-of-the-art methods and common scenarios that are popular among researchers and practitioners working on text/language.
A great way to approach complexity is through visual and interactive exploration. Added "Interpretability" and the "Language Interpretability Tool" to the list of interactive tools in machine learning and math. 🔥
LIT:
More tools:
How do you read research papers and implement the networks?
practice. practice. practice.
@dkatsios
will show us a few examples today. And we added some more that you can try on your own. :) Find the cheat sheet and the notebook here: ConvNets
Complete Lecture Series:
Stanford CS224N | NLP with Deep Learning | Winter 2019
We're very lucky that we can be wherever we want to be and learn from the most brilliant people in the world - for free.
NLP community: Interpreting text models with Captum – an open source, extensible library for model interpretability built on PyTorch. Sentiment Analysis and interpreting BERT Models in the tutorials.
What's the most important implementation information in a Deep Learning paper and how do we code it up?
@dkatsios
teaches you exactly that with a series on CNN Architectures.
Notebooks
Videos
MLT
@__MLT__
Thank you for 6.3k 🌟 You'll find all things machine learning in our
@__MLT__
GitHub repos – learning resources, papers with annotations, a collection of interactive tools and our open source projects. Big thanks to all contributors! 🤖
A
@__MLT__
team has been translating material for Stanford's CS 230 Deep Learning course from English into Japanese. We're excited to share our work and hope that it'll be useful for Japanese speakers studying ML/DL.
The deep learning cheat sheets are now available in Japanese.
Loved our first ML math sessions. 130 people from all over the world joined. Looking forward to more!
How to read Mathematics
● Don't Miss the Big Picture
● Don’t be a Passive Reader
● Don’t Read Too Fast
● Make the Idea Your Own
● Know Thyself
I'm kicking off a series of 11 fully remote Machine Learning Math Reading Sessions tonight at 7pm, Tokyo time. We'll go through “Mathematics For Machine Learning” by Marc Peter Deisenroth, A Aldo Faisal, and Cheng Soon Ong, to be published by Cambridge University Press.
I thought that self-isolation wouldn't be a big problem for me, I work a lot, and I'm an introvert. It's day 5 and nope, that's not for me. I need to walk in order to think and be productive and creative, I miss cycling, and yes, people too. This sucks. I need some virtual hugs.
All lectures on probability theory by
@fabinger
are now online! Along with those, Michal also provides quizzes and homework! 🔥 Let us know if you'd like to see more math content!
🤖 See all
🧠 Lecture 1
What is
#AlphaZero
computing when it learns to play chess, and what can we learn about chess from it? Our team investigates the emergence of human concepts in AlphaZero and the evolution of its play through training. 1/2
We're a community of open education and open source contributors in ML and have held more than 200 technical AI workshops, talks and study sessions with thousands of participants. It's wonderful to see us grow, way beyond Tokyo.
@__MLT__
💗
Working with a team of Full Stack ML Engineers on something in prod.
Me: How can I learn more about Software Engineering?
Them: We'll assign some tasks to you.
Me: Yeah. Totally. Awesome. Cool. Thanks. 😎
Real me: 😱
"During 2018 I achieved a Kaggle Master badge (...). Very often I found myself re-using most of the old pipelines over and over again. At some point it crystallized into this repository."
PyTorch extensions for fast R&D prototyping and Kaggle farming
Super excited to kick off 3 days of talks on JAX/Flax, TPUs, Transformers and more during our
@GoogleAI
&
@huggingface
JAX/Flax community week! What a speaker lineup! 😍
See you there! 🤗
In 2016 I met a Japanese department head (部長) in Frankfurt at a seminar. When I told him that I was about to move to Tokyo he said "You're a foreigner, you're young, and you're a woman. Nobody is going to take you seriously."
I spent the best 4 years of my life in Japan. 💙
Rules of Machine Learning: Best Practices for ML Engineering
⚫ understand whether the time is right for building an ML system
⚫ deploy your first pipeline
⚫ launch and iterate, while adding new features, evaluate
⚫ what to do when you reach a plateau
Celebrating 3 years
@__MLT__
!
- From 2 to 6,000 members
- 150 workshops & talks
- OSS & AI for Social Good
- ML Research
- University collaborations
- Fully community-driven
- Nonprofit status
- Tokyo | London
<3
Hello Math friends! :) We're super excited to host a series of 4 lectures on probability theory at
@__MLT__
starting next week, created and led by
@fabinger
(The University of Tokyo,
@acalonia_x
)! 🎉
Lecture 1-4 in the thread 👇
Netflix open sourced the DS Framework Metaflow.
"What is the hardest thing for you as a data scientist at Netflix?" (..) getting the first version to production took surprisingly long – mainly because of reasons related to software engineering.
Attention is all you need. 🧠🤖
👨💻 Today & June 13: Theory and implementation of "Attention Mechanisms" (PyTorch) in our Dive into Deep Learning sessions
👩🏿💻 On June 13 we'll read and discuss "Attention Is All You Need" (2017) in our MLT __init__ session
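Both sessions revolve around one formula: softmax(QKᵀ / √d_k) V. The sessions themselves use PyTorch; as a framework-free illustration of the same computation (all names here are my own, not from the session notebooks), a sketch in plain Python:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    Q, K, V are lists of row vectors (lists of floats)."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

A zero query attends uniformly, so its output is the mean of the values; a query aligned with one key puts most of its weight on that key's value. Tinkering with small Q/K/V by hand like this is a good warm-up before reading the paper.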
It feels a bit intimidating to write about, but work on attacks can lead to good insights for mitigation. I plan to write about mitigation work separately later.
Also want to thank all the researchers who shared disclosure reports w/ us so far. 🙏🙏🙏
One of the best ways to learn is to pick up a book. In April we'll start a new series of 6 study sessions based on . Join us if you're new to DL and would like to code along with us in PyTorch. Sessions will be announced here 👇
Our Machine Learning meetups are free. We're holding 30-40 events per year, with 60 attendees each on average. So we'd have to cover thousands of dollars to keep our open education efforts free. The alternative is that our community pays Meetup? Bye,
@Meetup
. 👋🏽
WOW. Meetup is now charging attendees a mandatory $2 to attend events. If you want it to be free, you have to cover the $2 for every person that comes to your event.
As someone who ran meetups for > 4 years, this sucks.
A couple of years ago I created this repo with interactive tools for math, deep learning and interpretability. Are there good tools for LLMs out there that I could add to this list?
Super excited to kick off weekly study sessions at
@__MLT__
where we'll go through
@fchollet
's "Deep Learning with Python" and notebooks, led by the amazing
@dkatsios
! 🙌 Best time to get into Deep Learning! Join us for the first session here
The nonprofit "Open Syllabus Project" collects and analyzes millions of syllabi across fields (using ML and other methods) to support educational research and novel teaching and learning applications. The results are made freely available.
#OpenEducation
"If you can spend Monday coding, Tuesday designing a workshop, Wednesday writing a blog post and Thursday crafting ML memes, this job opening is for you!"
Come join us! 🤗
Amazing, congrats to the team! This sounds extremely time consuming and stressful. I’m not sure if those guys have a full time job, but if so, I will immediately stop complaining about having too much work. 👏🏼👏🏼
Wanted to visit my mom today, told her it’s been the longest week and I need to take my eyes off the screen. When I arrived she prepared a space in the garden for me to read and sleep and be for myself. It’s good to be home.
Neuroscience and AI. I'm thrilled to have been accepted to the Neuro-inspired Computation Course at the University of Tokyo, with lectures on Brain Architecture, Brain Dynamics, Machine Learning and Reinforcement Learning.
Had a very important meeting today and screwed up. I was not myself, trying too hard, very nervous and all over the place. Note to self: Don't try to impress people. There's nothing to prove, to anyone. Be yourself. That's good enough.