William Falcon ⚡️

@_willfalcon

14,239 Followers · 463 Following · 395 Media · 2,322 Statuses

CEO @LightningAI. Creator, PyTorch Lightning ⚡, Lightning Studio. Ex-AI PhD student w/ @kchonyc @ylecun | @AIatMeta, @Columbia, @GoldmanSachs, @NSF, @usnavy | 🇻🇪

New York City
Joined August 2014
Pinned Tweet
@_willfalcon
William Falcon ⚡️
3 months
Most people don't know that Lightning Studios offer:
- free persistent storage
- free persistent environments
- unlimited background execution
- VSCode, PyCharm (any IDE) integration
Set up your Studio environment once and reuse it again any time 🤯🤯
[image]
44
167
2K
@_willfalcon
William Falcon ⚡️
1 year
GPT-4 paper: Let me save you the trouble:
[image]
60
400
3K
@_willfalcon
William Falcon ⚡️
6 years
When you explain your poster and don’t notice Geoffrey Hinton standing behind you #NeurIPS2018
[image]
24
316
2K
@_willfalcon
William Falcon ⚡️
1 year
everyone in AI research before 2020
[image]
24
146
2K
@_willfalcon
William Falcon ⚡️
1 year
I know this is fake because now that @Microsoft is in the mix, GPT numbers will be: - GPT-3 - GPT-4 - GPT-XP - GPT-2000 - GPT-7 - GPT-9 - GPT-10
[image]
92
77
1K
@_willfalcon
William Falcon ⚡️
7 months
did I get this right?
[image]
30
120
1K
@_willfalcon
William Falcon ⚡️
5 months
In this new 90-minute lecture, I show how to pretrain a 3B LLM from scratch. No edits. No detail skipped. Companies want you to believe pretraining models is super hard and costly. With the right tools, it's not.
- We start by tuning the model on a cheap A10G.
- Then we scale
[image]
11
168
1K
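The debug-small-then-scale workflow the lecture describes maps naturally onto PyTorch Lightning: the model code stays fixed and only the Trainer changes. A minimal sketch, assuming a stand-in module (TinyLM is illustrative, not the lecture's 3B model):

```python
import torch
import torch.nn as nn
import lightning as L

class TinyLM(L.LightningModule):
    # Stand-in for the 3B model in the lecture (illustrative only)
    def __init__(self, vocab_size=50_000, d_model=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def training_step(self, batch, batch_idx):
        x, y = batch
        logits = self.lm_head(self.embed(x))
        return nn.functional.cross_entropy(
            logits.view(-1, logits.size(-1)), y.view(-1)
        )

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=3e-4)

# Step 1: shake out bugs on a single cheap GPU (e.g., one A10G)
debug_trainer = L.Trainer(accelerator="gpu", devices=1, max_steps=100)

# Step 2: same module, no code changes, scaled across nodes
scale_trainer = L.Trainer(accelerator="gpu", devices=8, num_nodes=4, strategy="ddp")
# either trainer runs via trainer.fit(TinyLM(), train_dataloaders=...)
```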
@_willfalcon
William Falcon ⚡️
3 months
At this point, if you're still "GPU poor" you just haven't started using Lightning Studios...
[image]
37
63
953
@_willfalcon
William Falcon ⚡️
4 years
Excited to announce the release of bolts!
- linear + logistic regression on TPUs/GPUs
- self-supervised learning
- RL
- GANs
- GPT
- callbacks library
- datamodules library
All powered by lightning and rigorously tested. Don't waste more time implementing your own baselines...
@PyTorch
PyTorch
4 years
Bolts is a new Deep Learning research and production toolbox from PyTorch Lightning. Iterate faster with pre-trained models, components, callbacks, and data sets, all modular, tested, and optimized for GPUs/TPUs. Simply subclass, override, and train.
1
202
692
14
142
719
@_willfalcon
William Falcon ⚡️
3 months
Excited to announce our new compiler - Thunder! (built in collaboration with NVIDIA) 🤯🤯 Thunder is a source-to-source compiler for PyTorch. It speeds up PyTorch models. As an example, it speeds up Llama 2 7B by 40%.
[images]
21
83
700
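Thunder's public examples show usage as a one-line wrap around an existing module; the exact API may have shifted since launch, so treat this as a sketch:

```python
import torch
import thunder  # pip install lightning-thunder

model = torch.nn.Sequential(
    torch.nn.Linear(2048, 4096),
    torch.nn.GELU(),
    torch.nn.Linear(4096, 64),
)

# thunder.jit traces the module and returns a compiled version
# with the same call signature as the original
compiled = thunder.jit(model)
out = compiled(torch.randn(64, 2048))
```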
@_willfalcon
William Falcon ⚡️
4 years
Excited to announce the launch of GPT-42.
- Half the size of GPT-3 (100 billion parameters)
- runs on 775 watts a day (2000 calories)
- can do one-shot learning
- multi-modal
- does NOT require 9,000 GPUs
- 300,000 years worth of evolution research!
Inference API coming soon!
21
92
693
@_willfalcon
William Falcon ⚡️
1 year
LLM? old news. New hotness will be LVMs (Large Vision Models). This time, looks like @GoogleAI is finally staying ahead of @OpenAI . How well does it work? a 🧵-> 1/n
[image]
16
87
688
@_willfalcon
William Falcon ⚡️
4 years
Excited to release our latest paper (with @kchonyc ) which establishes a conceptual framework for characterizing contrastive learning methods (SimCLR, BYOL, CPC, AMDIM, Swav). (work done at @facebookai ) Btw.. this was the motivation for @PyTorchLightnin
6
167
658
@_willfalcon
William Falcon ⚡️
1 year
We sped up the best opensource LLM (Falcon 40B) by 20% simply by porting it over to use Lightning. We'll keep making it faster over the next few weeks. Stay tuned! Available now in Lit-parrot:
13
108
628
@_willfalcon
William Falcon ⚡️
2 years
It's PhD application season 😑. PhDs will tell you not to get one. Non-PhDs will tell you to do it. Job postings tell you to get one. What to do? a 🧵 on:
- industry vs academia
- schools vs advisor
- minority applicants and how not to get intimidated
- age for PhD
1/9
20
97
636
@_willfalcon
William Falcon ⚡️
1 year
AI should be fully open source and part of the collective knowledge! Excited to announce a fully open-source (Apache 2.0), high-performance implementation of llama Join our discord ( @LightningAI ) to build AI
32
112
625
@_willfalcon
William Falcon ⚡️
1 year
lol… i started my PhD at 31 🙄
27
9
582
@_willfalcon
William Falcon ⚡️
1 year
are RNNs now “classic” ML? 😂
57
23
563
@_willfalcon
William Falcon ⚡️
7 months
Don’t forget, Google created transformers… OpenAI scaled it. Don’t be shocked when Google wins here 😉
26
28
563
@_willfalcon
William Falcon ⚡️
4 years
Code for @OpenAI's very deep VAE was released. Cool trick mentioned: "Gradient Skipping - Skip an update when a grad norm is above a threshold (if the KL blows up)" VAE twitter, what other tricks do you use normally? Code: Paper:
[images]
10
78
529
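The quoted trick is easy to replicate in plain PyTorch: measure the global gradient norm after backward and skip the optimizer step when it spikes. A minimal sketch (the threshold is arbitrary and model-dependent):

```python
import torch

GRAD_NORM_THRESHOLD = 100.0  # arbitrary; tune per model

def guarded_step(model, optimizer, loss):
    optimizer.zero_grad()
    loss.backward()
    # With an infinite max_norm, clip_grad_norm_ does no clipping and
    # simply returns the total gradient norm
    total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), float("inf"))
    if total_norm < GRAD_NORM_THRESHOLD:
        optimizer.step()  # normal update
    # else: skip the update entirely (e.g., when the KL blows up)
```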
@_willfalcon
William Falcon ⚡️
3 years
We scaled up @karpathy 's minGPT to 45B parameters on 8 A100s using @PyTorchLightnin with the @MSFTResearch Deepspeed integration. The best part... no changes to your @PyTorchLightnin code...
[image]
3
67
510
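At the time this shipped as a Lightning plugin; in current Lightning the DeepSpeed integration is a single Trainer argument. A sketch against Lightning 2.x:

```python
import lightning as L

# ZeRO stage 3 shards parameters, gradients, and optimizer state across
# GPUs, which is what lets a model far larger than one GPU's memory
# train on 8 A100s; the LightningModule itself needs no changes.
trainer = L.Trainer(
    accelerator="gpu",
    devices=8,
    strategy="deepspeed_stage_3",
    precision="16-mixed",
)
# trainer.fit(model, dataloader)  # same call as single-GPU training
```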
@_willfalcon
William Falcon ⚡️
5 months
Round 2... I use CodeLlama 70B and Mixtral MoE to write code to finetune a model on 16 GPUs (multi-node) 🤯🤯 Video has zero edits. This is a realistic iterative development workflow. TL;DR: Both are good. Mixtral MoE is super fast and writes clean code. More below 👇🏻
11
64
471
@_willfalcon
William Falcon ⚡️
4 years
Neural networks get a bad reputation for being black boxes. In this tutorial, I’ll show you how to use backpropagation to change the input so as to classify it as whatever you would like. Had fun making this with @alfcnz
3
96
456
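The core trick, backprop to the input rather than the weights, fits in a few lines of PyTorch. A minimal sketch (the target class, step size, and starting image are arbitrary):

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

model = resnet18(weights="IMAGENET1K_V1").eval()
for p in model.parameters():
    p.requires_grad_(False)  # freeze the weights; only the input is optimized

x = torch.rand(1, 3, 224, 224, requires_grad=True)  # or start from a real image
target = torch.tensor([207])  # arbitrary ImageNet class

optimizer = torch.optim.Adam([x], lr=0.01)
for _ in range(100):
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x), target)
    loss.backward()   # gradients flow to the input tensor
    optimizer.step()  # nudge the input toward the chosen class
```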
@_willfalcon
William Falcon ⚡️
5 months
Best investment zuck made was starting Facebook AI ( @AIatMeta ). even after this move, I'm pretty sure wall street still doesn't know what llama is...
[image]
18
37
410
@_willfalcon
William Falcon ⚡️
3 years
@PyTorchLightnin has become a favorite for AI researchers and industry experts... but we thought a high-level framework was missing for common use cases (lightning is mid-level). Excited to introduce Flash! Blog: Repository:
[images]
5
101
397
@_willfalcon
William Falcon ⚡️
4 years
Super cool! Excited to announce our partnership with @facebookai to help standardize research and production code with @PyTorchLightnin . If you're a company not using lightning, look into it! It's great for research AND production :)
5
44
392
@_willfalcon
William Falcon ⚡️
4 years
I feel like @PyTorch dataloaders are underrated... i remember early days messing with hdf5, csv, numpy arrays, etc... data processing used to take forever. Dataloaders are brilliant.
12
24
393
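For anyone who missed the hdf5/CSV era: the Dataset/DataLoader pair reduces that plumbing to two small pieces, with batching, shuffling, and multi-process loading handled for you. A minimal sketch with synthetic data:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class ToyDataset(Dataset):
    # Wrap any indexable source (files, arrays, a DB) behind two methods
    def __init__(self, n=1000):
        self.x = torch.randn(n, 16)
        self.y = torch.randint(0, 2, (n,))

    def __len__(self):
        return len(self.x)

    def __getitem__(self, i):
        return self.x[i], self.y[i]

loader = DataLoader(ToyDataset(), batch_size=32, shuffle=True, num_workers=4)
for xb, yb in loader:
    pass  # training step goes here
```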
@_willfalcon
William Falcon ⚡️
4 years
i still can't believe corporate america thinks there are huge advantages to @TensorFlow over @PyTorch... let me put it bluntly
[image]
15
50
385
@_willfalcon
William Falcon ⚡️
1 year
The PyTorch @LightningAI 2.0 API is so stable that we got the code on our NYC HQ door!
[image]
13
23
369
@_willfalcon
William Falcon ⚡️
6 months
2024 AI predictions ⚡️
1. 1B models will outperform 70B models.
2. Models will be deployed on CPUs for almost free. Not API services.
3. Data quality will yield the next 10x boost in performance.
4. A combination of open source models will beat the best private models.
19
50
351
@_willfalcon
William Falcon ⚡️
4 years
Here's a deep dive into VAEs for color images (not MNIST for a change!), and a matching implementation in @PyTorchLightnin.
- ELBO, KL and reconstruction intuition
- Implementation
Colab: Github: Blog:
6
58
328
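The colab and blog links above are elided; the loss the bullets refer to has a standard closed form when the posterior is Gaussian. A minimal sketch:

```python
import torch
import torch.nn.functional as F

def negative_elbo(x, x_hat, mu, logvar):
    # Reconstruction term (use BCE instead for [0, 1] pixel values)
    recon = F.mse_loss(x_hat, x, reduction="sum")
    # Closed-form KL between N(mu, sigma^2) and the N(0, I) prior
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```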
@_willfalcon
William Falcon ⚡️
1 year
Yesterday, AI became about corporate self-interest. A divorce from the broad AI research field that made these companies even possible. PyTorch Lightning and @LightningAI will not sell out; we commit to continuing to give back to the AI community and opensource.
@sharongoldman
Sharon Goldman
1 year
Since OpenAI’s surprise release of its GPT-4 model yesterday, there has been a raft of online criticism about the accompanying 'technical report.' Thanks to @_willfalcon for sharing his thoughts in this new Q&A:
2
8
46
10
32
319
@_willfalcon
William Falcon ⚡️
7 months
This week in AI summarized
[image]
5
29
322
@_willfalcon
William Falcon ⚡️
4 years
Excited to release @PyTorchLightnin 1.1.0. Built in collaboration with @facebookai, you can now train any NLP Transformer model, SwAV (self-supervised), or speech or image model on GPU with 60% less memory! With ZERO code changes! Read more here:
[image]
3
48
316
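The "ZERO code changes" claim maps to a single Trainer argument. The flag's spelling has changed across Lightning versions (a sharded-DDP plugin then, FSDP now), so this sketch targets current Lightning:

```python
import lightning as L

# Shard gradients and optimizer state across GPUs instead of replicating
# them on every device; model code stays untouched.
trainer = L.Trainer(
    accelerator="gpu",
    devices=4,
    strategy="fsdp",  # modern successor to the 1.1-era sharded plugin
    precision="16-mixed",
)
```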
@_willfalcon
William Falcon ⚡️
4 months
Casually access remote GPUs from the comfort of your local VSCode with Lightning Studios 🔥🔥
18
31
287
@_willfalcon
William Falcon ⚡️
1 year
Don’t work on LLMs without learning a few key principles! Today we release our next set of lectures on LLMs and NLP!
- working with embeddings
- limitations of RNNs
- self-attention principles
- how LLMs work
5
99
299
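As a taste of the self-attention material, scaled dot-product attention is only a few lines. A minimal single-head sketch:

```python
import math
import torch

def self_attention(x, w_q, w_k, w_v):
    # x: (seq_len, d_model); each token builds a query, key, and value
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    weights = torch.softmax(scores, dim=-1)  # each token attends to all tokens
    return weights @ v

x = torch.randn(8, 64)  # 8 tokens, 64-dim embeddings
w_q, w_k, w_v = (torch.randn(64, 64) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)  # (8, 64)
```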
@_willfalcon
William Falcon ⚡️
3 years
Huge kudos to the @MSFTResearch team! GPT-3 sized models on a single GPU! Use deepspeed with lightning:
1
61
301
@_willfalcon
William Falcon ⚡️
4 years
In 2015 I attended @NeurIPSConf for the first time. That was the year i built my first neural network (in Theano). After that conference i was so convinced about the future of deep learning that I quit my job at @GoldmanSachs and went all in on DL! How did you get started?
20
6
291
@_willfalcon
William Falcon ⚡️
13 days
Excited to launch "Open in Studio" 🤯🤯 - the Open in Colab alternative 🙂
✅ Open Github repos
✅ Open to specific notebooks/files
✅ Persistent storage
✅ Persistent environment
✅ Live code together
✅ T4, L4, A100, H100 and more instances
✅ 22 free GPU hours per month
[image]
14
39
291
@_willfalcon
William Falcon ⚡️
4 years
If you're working on a new ML product/startup and need funding, please reach out! I'm especially interested in supporting fellow latinxs (and any underrepresented group). I can't wait to see what you build next and to help get you there! @black_in_ai @_LXAI @wimlds
6
66
264
@_willfalcon
William Falcon ⚡️
2 months
Glad we’re inspiring more companies to give free GPUs 🔥 - always lead from the front. All users get 22 free GPU hours on Lightning Studios; our free tier has even more!
[image]
8
25
247
@_willfalcon
William Falcon ⚡️
1 year
Which opensource LLM should you use? We tested all of them 🤯🤯
[image]
16
53
246
@_willfalcon
William Falcon ⚡️
2 months
[image]
5
14
243
@_willfalcon
William Falcon ⚡️
4 years
Excited to announce our Series A funding of $18.6M! If you thought Lightning gave you superpowers, wait till you try our new platform @_gridai! With no code changes, spin up hundreds of models across thousands of GPUs on the cloud of your choice!
8
24
228
@_willfalcon
William Falcon ⚡️
2 months
Finetuning does not have to be expensive! We put together a guide that discusses trade-offs between compute and time requirements. Would love to hear others' experiences in the responses 👇🏼
[image]
1
49
229
@_willfalcon
William Falcon ⚡️
3 years
Excited to launch torch-metrics - @PyTorch metrics optimized for distributed training at scale. We thought the lightning metrics could be useful to all @PyTorch users beyond just @PyTorchLightnin :)
[images]
1
38
213
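The point of the library is that metric state syncs across processes for you: update() accumulates locally, compute() reduces across all workers. A sketch against a recent torchmetrics release (the task argument arrived in later versions):

```python
import torch
from torchmetrics import Accuracy

acc = Accuracy(task="multiclass", num_classes=10)

for _ in range(5):
    preds = torch.randn(32, 10).softmax(dim=-1)
    target = torch.randint(0, 10, (32,))
    acc.update(preds, target)  # accumulates state (per process under DDP)

print(acc.compute())  # reduces across processes before returning
```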
@_willfalcon
William Falcon ⚡️
4 months
Highly recommend this video on writing optimized cuda kernels by @marksaroufim from the @PyTorch team. Perf checklist:
- coalesced global memory access
- maximize occupancy
- memory or compute bound
- minimize control divergence
... + 4 other items
[image]
3
29
202
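The video is the real resource; as a rough Python-level illustration of the first checklist item, compare reducing over a contiguous tensor versus a strided (transposed) view of the same data. The size of the gap varies by GPU and op:

```python
import torch
from torch.utils.benchmark import Timer

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(8192, 8192, device=device)
b = a.t()  # same storage, non-contiguous strides

for name, t in [("contiguous", a), ("strided", b)]:
    timer = Timer(stmt="t.sum(dim=1)", globals={"t": t})
    print(name, timer.timeit(100))
```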
@_willfalcon
William Falcon ⚡️
8 months
4-bit precision is all you need...... ...to stay under the reporting guidelines the white house imposed on AI models.
[image]
6
15
204
@_willfalcon
William Falcon ⚡️
4 years
In this video, I convert a VAE from the @PyTorch repo into Lightning in under 45 minutes. As is obvious from the video, it's a faithful attempt to replicate the experience of doing this for an unseen project.
2
28
199
@_willfalcon
William Falcon ⚡️
4 years
It's official, @PyTorchLightnin has hit 100k downloads in just 5 months! 45k just in the last 2 months. If you are still coding @PyTorch for-loops by hand or doing your own distributed training, you should REALLY check out lightning.
[image]
0
37
200
@_willfalcon
William Falcon ⚡️
3 years
Implement a transformer in @PyTorch and @PyTorchLightnin by @full_stack_dl. Colab: Original tweet with a great summary:
@full_stack_dl
The Full Stack
3 years
🛠️Tooling Tuesday🛠️ Today, we share a @GoogleColab notebook implementing a Transformer with @PyTorch, trained using @PyTorchLightnin. We show both encoder and decoder, train with teacher forcing, and implement greedy decoding for inference. 👇1/N
2
84
377
0
32
195
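The notebook itself sits behind the (elided) Colab link; the greedy decoding it mentions is the simplest seq2seq inference loop. A sketch assuming a model called as model(src, tgt) that returns (batch, tgt_len, vocab) logits:

```python
import torch

@torch.no_grad()
def greedy_decode(model, src, bos_id, eos_id, max_len=50):
    # Generate one token at a time, always taking the argmax
    ys = torch.tensor([[bos_id]])
    for _ in range(max_len):
        logits = model(src, ys)           # (1, len(ys), vocab)
        next_id = logits[0, -1].argmax()  # most likely next token
        ys = torch.cat([ys, next_id.view(1, 1)], dim=1)
        if next_id.item() == eos_id:
            break
    return ys
```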
@_willfalcon
William Falcon ⚡️
4 years
Anyone want to run their PyTorch code on TPUs?? Now you can with Lightning... Best part is - no need to change your code!!! it’s 100% hardware agnostic.
@LightningAI
Lightning AI ⚡️
4 years
You can now run @PyTorch code on TPUs and GPUs with Lightning... wait for it... WITHOUT CODE CHANGES!!! Retweet if this is useful to you!
15
247
754
5
29
197
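The spelling of the flag has changed over the years (tpu_cores then, accelerator now); under current Lightning the hardware swap is one argument:

```python
import lightning as L

# Same LightningModule, different hardware: only the Trainer changes
gpu_trainer = L.Trainer(accelerator="gpu", devices=1)
tpu_trainer = L.Trainer(accelerator="tpu", devices=8)
```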
@_willfalcon
William Falcon ⚡️
4 months
Who are the best ML engineers you know? we are hiring in NYC and SF. 🚀🚀 Lightning is only 45 people 🤯🤯 (vs the 200+ at most AI startups).
- We are flat, focused and lean.
- No meetings or noise.
- Ship or bust
Every leader ships - my GH below
[image]
13
18
195
@_willfalcon
William Falcon ⚡️
4 years
CPC, AMDIM, SimCLR: BatchNorm in contrastive learning is needed to keep networks from "cheating" and thus collapsing. The ablations below: turns out BatchNorm IS what keeps BYOL from collapsing. sigh... smh
3
34
192
@_willfalcon
William Falcon ⚡️
4 years
Excited about our transfer learning tutorial with @PyTorchLightnin . Our resnet50 pretrained using unsupervised learning achieved double the performance of a torchvision pretrained resnet50. Our model was trained using swav (from @facebookai , implemented in Lightning). 1/n
@alfcnz
Alfredo Canziani
4 years
Fall semester week ⑩ practicum with @_willfalcon! 🎉 Learn about:
• Supervised and self-supervised transfer learning and fine-tuning;
• Formatting your code with @PyTorch @PyTorchLightnin;
• Access latest paper re-implementation.
[images]
5
37
288
1
17
187
@_willfalcon
William Falcon ⚡️
1 year
Lit-llama can now be finetuned with LLaMA-Adapter. LLaMA-Adapter is a technique to finetune LLMs that only tweaks 1.2 million parameters. Within 1 hour, "LLaMA becomes an instruction-following model!" LLaMA-Adapter paper - (nice
[image]
4
43
181
@_willfalcon
William Falcon ⚡️
5 years
It's official! @PyTorchLightnin is part of the @PyTorch ecosystem now - Keras-like ML library for researchers. Check it out if you haven't yet! @MILAMontreal @AmiiThinks @berkeley_ai @stanfordnlp @MIT_CSAIL @DataScienceCtrl @DeepLearningHub @hmason
[image]
7
40
176
@_willfalcon
William Falcon ⚡️
3 months
Lightning is focused, small and lean 😊😊 I grew up in special operations (SEAL training) and operate our company with the same mindset. Incredibly proud of our small, FOCUSED, team for building a product that's already redefining AI development. (Also, shout out to our fellow
[image]
6
11
181
@_willfalcon
William Falcon ⚡️
5 months
we finally launched Lightning Studio - Google Colab and Sagemaker done “the Lightning way” ⚡⚡ Took us ~3 years to develop the “iPhone” experience for AI tooling. This video shows a very simple way to save ~60% on cloud cost by first debugging on CPU,
9
27
178
@_willfalcon
William Falcon ⚡️
4 years
Amazing to cross 10k @github stars (559 days) for @PyTorchLightnin . Whoever is out there coding the next cool thing, keep at it! But i'm now a small part of this equation. Lightning is now a true community-driven project, so congrats to the contributors and community!
[image]
3
9
171
@_willfalcon
William Falcon ⚡️
3 months
Thunder is the number 5 trending repo on Github! 🤯🤯 Thunder speeds up PyTorch models up to 40% (it's early still 😅😅). Contribute and support it so we can make PyTorch models even faster!
[image]
2
19
168
@_willfalcon
William Falcon ⚡️
1 year
go to a park. go for a swim. don’t think about LLMs and ChatGPT for one weekend. detox.
15
18
168
@_willfalcon
William Falcon ⚡️
4 months
excited to present at GTC this year! we’ll show how to prep data, pretrain, finetune and deploy a 7B param LLM. 9 am in room 230C
[image]
4
8
165
@_willfalcon
William Falcon ⚡️
4 years
My approach to AI research has been to work on fundamental science with less focus on chasing new SOTAs. But it seems like the community over-indexes on work with a 1% improvement to SOTA instead? For senior researchers, how do you balance this? What about early career?
4
4
153
@_willfalcon
William Falcon ⚡️
1 year
We're kicking off our Large scale Infinite Training initiative (LIT-training) - LLMs for all ⚡⚡⚡ Donate your spare compute capacity to train Lit-LLaMa fully open-source and transparently. AI is the new electricity, it cannot be owned by gatekeepers!
[image]
10
33
146
@_willfalcon
William Falcon ⚡️
3 years
Excited to finally release @gridai_ ! Run any code on the cloud at scale from your laptop with ZERO code changes. Over the years we've been passionate about helping the AI community scale up with @PyTorchLightnin . It's time to bring that to your ML infra!
3
21
143
@_willfalcon
William Falcon ⚡️
3 years
Still haven’t checked out @PyTorchLightnin ? If you’re still using keras, tensorflow, or simply haven’t refactored your pytorch code, doesn’t hurt to see what it’s like in the lightning world!
1
17
134
@_willfalcon
William Falcon ⚡️
3 years
🚀 New feature alert 🔥⚡️ Today we take a big step to increase the flexibility of @PyTorchLightning with LightningLite: get the benefits of Lightning's scaling without necessarily needing the full framework. Convert your @PyTorch code in seconds.
1
26
132
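LightningLite was later renamed Fabric, but the shape of the conversion is unchanged: hand the model, optimizer, and dataloaders to setup and replace loss.backward() with fabric.backward(). A sketch against current Fabric:

```python
import torch
from lightning.fabric import Fabric

fabric = Fabric(accelerator="auto", devices="auto")
fabric.launch()

model = torch.nn.Linear(32, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
model, optimizer = fabric.setup(model, optimizer)  # device placement + wrapping

x = torch.randn(8, 32, device=fabric.device)
y = torch.randint(0, 2, (8,), device=fabric.device)
loss = torch.nn.functional.cross_entropy(model(x), y)
fabric.backward(loss)  # replaces loss.backward(); handles scaling/sharding
optimizer.step()
```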
@_willfalcon
William Falcon ⚡️
4 years
Amazing job to the core team, everyone who made this release possible, and the 300+ contributors! More to come about 1.0.0 but in the meantime, enjoy!
@LightningAI
Lightning AI ⚡️
4 years
Lightning 1.0.0 is here! We also launched our new homepage - the home for everything Lightning, from blogs to tutorials to docs! We're also on producthunt today!
[image]
7
94
437
2
14
130
@_willfalcon
William Falcon ⚡️
4 years
For all our friends who use @PyTorchLightnin with @huggingface . Enable a single flag in the Lightning Trainer to train models with 2x the parameters, 30% faster. The result of a great collaboration between @facebookai + @PyTorchLightnin
[image]
0
14
130
@_willfalcon
William Falcon ⚡️
2 years
The best use of AI (imho) is to augment humans, not replace them. The focus of Muse () is to help artists get inspiration... We also made it free and open source so it can run on your laptop without needing the cloud
3
22
128
@_willfalcon
William Falcon ⚡️
5 years
Talk I gave explaining strategies to supercharge your PyTorch code (16-bit, multi+single node parallelization, dataloaders, accumulated grads), and the @PyTorchLightnin flags to turn these features on. @MILAMontreal @AmiiThinks @NYUDataScience @stanfordnlp
2
25
124
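The features in the talk are all Trainer one-liners; spellings below follow Lightning 2.x (older releases used precision=16):

```python
import lightning as L

trainer = L.Trainer(
    precision="16-mixed",       # 16-bit mixed precision
    accumulate_grad_batches=8,  # accumulated gradients
    accelerator="gpu",
    devices=4,                  # single-node multi-GPU
    num_nodes=2,                # multi-node parallelization
    strategy="ddp",
)
```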
@_willfalcon
William Falcon ⚡️
3 months
Weekend treat - L4 GPUs are now available on Studios 🔥🔥
- $0.88 per hour
- 17 free hours per month 🤯
- choose 1 or 4 L4 GPUs ⚡⚡⚡⚡
- 1/2 the price of A10G 🤑
- 24 GB RAM
- free persistent storage included
[image]
4
19
122
@_willfalcon
William Falcon ⚡️
4 years
Had you told me in 2012 when I was in the middle of going through @us_navyseals training that I would have written a deep learning framework used by thousands just a few years later, I would have said “what is coding???” 1/n
1
6
119
@_willfalcon
William Falcon ⚡️
5 years
Humbled to hear so many people are interested in joining the first @PyTorchLightnin maintainers' group. I'm looking to build a diverse team of 5 peeps to scale the explosive growth! 2.5k github stars, 46k downloads in a few months. For those interested:
1
26
119
@_willfalcon
William Falcon ⚡️
2 months
100,000 users in < 5 months 🤯🤯🤯 next stop: 1 million users! Lightning AI Studio adoption by top AI labs and enterprises has been truly humbling... The 100,000th user gets free Pro tier for 3 months 👇🏼👇🏼 Congrats to Suhas Nandiraju at Crosby Health, the 100,000th user! who
8
12
113
@_willfalcon
William Falcon ⚡️
2 months
Congrats to the @AIatMeta team on Llama 3 🤯🤯 It is by FAR the best open source model i've played with! Run your personal llama 3 in a Lightning Studio now... let me know what you think about the model!
[image]
3
20
112
@_willfalcon
William Falcon ⚡️
4 years
(yes, it's the human brain... please don't put this on techcrunch lol)
0
1
107
@_willfalcon
William Falcon ⚡️
1 year
⚡ PyTorch @LightningAI 2.0 is out! 🤯 Fireside chat at (12 ET):
- Why AI has to stay OpenSource ⚡⚡
- History of PyTorch @LightningAI
- New features in 2.0 to help you with foundation models
- How we introduced our final, stable API
👉
[image]
3
16
108
@_willfalcon
William Falcon ⚡️
2 months
Being GPU poor is a choice nowadays 😂😂
[image]
9
7
107
@_willfalcon
William Falcon ⚡️
3 years
1/3 As a researcher from a non-traditional background, I take attribution very seriously and embrace constructive scientific discussion that accelerates learning and advances progress. Flash was developed to serve the evolving needs of PyTorch Lightning users.
4
12
103
@_willfalcon
William Falcon ⚡️
5 months
A new open source model drops every day… but it feels like people just tweet what they hear instead of what they experience?? 🤔 My gut says no one is using them in reality… just talking about potentially using them… what am I missing?
33
9
103
@_willfalcon
William Falcon ⚡️
3 years
Awesome community tutorial for finetuning BERT for sentiment analysis on TPUs with @PyTorchLightnin (1 million datapoint dataset) by tanaymehta
1
13
103
@_willfalcon
William Falcon ⚡️
2 months
... Pretraining on 32 H100s on Lightning Studios 🤯🤯! A single Studio can scale from interactive development to multi-node training. Full video walkthrough to pretrain a 3B param LLM on Studios.
[image]
2
16
104
@_willfalcon
William Falcon ⚡️
2 years
This is a MAJOR milestone for the AI community!! to say i’m super excited is an understatement. thanks for your leadership here @soumithchintala and support from @MetaAI @ylecun @schrep. Purest commitment to opensource i’ve seen to date 😍
@PyTorch
PyTorch
2 years
Thank you @_willfalcon and @LightningAI for your support of the #PyTorchFoundation . We’re ready to continue building with our community.
[image]
0
12
87
0
13
99
@_willfalcon
William Falcon ⚡️
28 days
Live now, casually connect your Cursor IDE to @LightningAI Studio for remote development on GPUs 🤯🤯
✅ 22 GPU hours free per month
✅ Unlimited storage
✅ Auto sleep to save costs
✅ Persistent environment
✅ Persistent filesystem
5
18
100
@_willfalcon
William Falcon ⚡️
4 years
ODE twitter, would love to know some of your use cases for ODEs! For those not familiar with neural ODEs, here’s a walk-through (by @MichaelPoli6) for implementing neural ODE models with @Diffeq_ml powered by @PyTorchLightnin.
0
20
99
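The linked walk-through uses the torchdyn stack; for a self-contained taste, here's the same idea with torchdiffeq instead (a different library, chosen only because its API is compact): a network parameterizes dy/dt and a differentiable solver integrates it.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq

class VectorField(nn.Module):
    # The network defines the dynamics dy/dt = f(t, y)
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))

    def forward(self, t, y):
        return self.net(y)

y0 = torch.randn(16, 2)              # batch of initial states
t = torch.linspace(0.0, 1.0, 10)     # integration time points
traj = odeint(VectorField(), y0, t)  # (10, 16, 2), fully differentiable
```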
@_willfalcon
William Falcon ⚡️
5 years
I just open-sourced a research library I built for myself as I work on my PhD. It's like Keras but with more control for research purposes. Would love contributors if anyone's interested in polishing it up! @NYUDataScience @PyTorch #AI #DataScience
0
27
97
@_willfalcon
William Falcon ⚡️
4 years
Love this implementation of pix2pix, style transfer, deepdream and cycleGAN in @PyTorchLightnin .
[images]
0
12
97
@_willfalcon
William Falcon ⚡️
3 years
The goal of Flash is to make building baselines in PyTorch trivial! This is a great example!
@neribr
Bruno Neri
3 years
Training a @PyTorch ConvNet with the new @PyTorchLightnin Flash framework #deeplearning #pytorch #convnet
[image]
2
6
24
4
17
94
@_willfalcon
William Falcon ⚡️
4 years
Training with multiple GPUs has many approaches. I cover Data Parallel, Distributed Data Parallel, and the new, more efficient sharded training, which works with any distributed data parallel mode. So far we are seeing 60% savings in memory by enabling sharded.
0
9
92
@_willfalcon
William Falcon ⚡️
2 years
if you are in a PhD, you WILL get whatever job you are interested in at the end of it. But spend that time learning new things, growing intellectually and engaging in scientific discourse. Don't rush it! It's about the journey, not the destination 3/9
4
3
93