Alec Helbling

@alec_helbling

2,604
Followers
1,995
Following
123
Media
679
Statuses

Interested in ML, visualization, generative modeling, and open source. CS PhD Student @GeorgiaTech . NSF Fellow. Prev Intern @IBMResearch , @Microsoft , @NASAJPL .

us-east-1
Joined December 2017
Pinned Tweet
@alec_helbling
Alec Helbling
2 years
Been working on a tool for visualizing neural networks using python code. Here is a visualization of a convolutional neural network. It is built on top of the @manim_community library.
26
301
2K
@alec_helbling
Alec Helbling
1 year
I added the ability to visualize activation functions to my tool for animating neural network architectures with python. It is built on top of the @manim_community library. Code:
27
269
2K
@alec_helbling
Alec Helbling
1 year
I added a guide to the repository for my project ManimML, which is a tool for visualizing neural network architectures and algorithms using only python code. It is built on top of @manim_community which was started by @3blue1brown . Link:
16
285
2K
@alec_helbling
Alec Helbling
1 year
I added the ability to visualize a convolutional layer with padding to ManimML, my tool for animating neural network architectures using only python code. It is built on top of the @manim_community library. Code:
6
139
961
@alec_helbling
Alec Helbling
3 months
Langevin Monte Carlo allows you to draw samples from a probability distribution using its log gradient ∇ log p(x). By performing a sort of gradient ascent with noise you can navigate around the distribution. Langevin MC is closely related to modern diffusion models.
5
94
690
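The update rule described can be sketched with just the standard library for a 1D standard normal target, where ∇ log p(x) = −x (the target, step size, and chain length here are illustrative choices, not from the tweet):

```python
import math
import random

def langevin_sample(grad_log_p, x0, step=0.01, n_steps=5000, seed=0):
    """Unadjusted Langevin dynamics: gradient ascent on log p plus Gaussian noise."""
    rng = random.Random(seed)
    x = x0
    for _ in range(n_steps):
        # x_{t+1} = x_t + step * grad_log_p(x_t) + sqrt(2 * step) * noise
        x = x + step * grad_log_p(x) + math.sqrt(2 * step) * rng.gauss(0, 1)
    return x

# Target: standard normal, so grad log p(x) = -x.
samples = [langevin_sample(lambda x: -x, x0=5.0, seed=s) for s in range(200)]
```

Each chain climbs toward high-density regions while the injected noise keeps it exploring, so the final states are approximately N(0, 1) samples.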
@alec_helbling
Alec Helbling
2 years
One-by-one convolution is a useful way to change the number of filters in a convolutional neural network. The visualization can be created with just a few lines of pytorch-like python code using a library called ManimML () built on top of @manim_community
3
76
500
@alec_helbling
Alec Helbling
2 years
I am making a python tool for intuitively visualizing machine learning algorithms. You can do things like animate the forward pass of a neural network described with a pytorch-like syntax. It is built on top of the Manim library @manim_community
7
44
403
@alec_helbling
Alec Helbling
8 months
where is my turing award
Tweet media one
4
6
365
@alec_helbling
Alec Helbling
1 year
When sampling with MCMC it is often necessary to have a burn-in or warm-up period before the chain converges on the underlying distribution. This visualization was made entirely in python using @manim_community Link to code:
8
53
359
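A minimal random-walk Metropolis sketch shows why the warm-up samples get discarded: a chain started far from the mode spends its first steps just traveling toward the distribution. The target, starting point, and burn-in length below are illustrative assumptions:

```python
import math
import random

def metropolis_chain(log_p, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis: propose x + N(0, step); accept with prob min(1, p(x')/p(x))."""
    rng = random.Random(seed)
    x, chain = x0, []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0, step)
        if math.log(rng.random()) < log_p(proposal) - log_p(x):
            x = proposal
        chain.append(x)
    return chain

log_p = lambda x: -0.5 * x * x          # standard normal target, up to a constant
chain = metropolis_chain(log_p, x0=10.0, n_steps=20000)
kept = chain[2000:]                     # discard the burn-in / warm-up samples
```

Averaging over the early samples would bias estimates toward the arbitrary starting point; after burn-in the chain wanders around the target distribution.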
@alec_helbling
Alec Helbling
3 years
This is a visualization of a decision tree classifier I made using @manim_community . Decision trees are one of the most intuitive and underappreciated machine learning techniques. This is a part of a video I am making for the upcoming @3blue1brown competition.
8
51
321
@alec_helbling
Alec Helbling
1 year
Here is a cool animation of a neural network residual block. It was made entirely using python as a part of my ManimML project. ManimML is built on top of @manim_community . Link to project:
6
59
317
@alec_helbling
Alec Helbling
2 months
Introducing ClickDiffusion! We developed a system for precise image manipulation and generation that combines natural language instructions with visual feedback provided by the user through a direct manipulation interface.
5
61
314
@alec_helbling
Alec Helbling
2 years
Here is an animation of neural network dropout, a now classic regularization technique. Individual units and their weights are removed randomly during training to prevent overfitting. I made it using a tool I am building on top of @manim_community with a simple python API.
1
45
287
@alec_helbling
Alec Helbling
3 months
The vector field of a diffusion model has some interesting properties. Early on samples are mostly noise, so the model's best guess is to predict the mean. Later on the model has more information and the vector field starts to reveal the local structure of the distribution.
7
35
291
@alec_helbling
Alec Helbling
3 months
Implementing a Diffusion Model on a simple 2D dataset is a fun exercise. Here is a visualization of the paths followed when sampling from a Denoising Diffusion Probabilistic Model (DDPM) trained on a simple spiral dataset.
3
33
254
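A 1D sketch of DDPM sampling, using a toy N(0, 1) data distribution for which the optimal noise prediction has a closed form (it stands in for the trained network; the schedule length and beta range below are arbitrary choices):

```python
import math
import random

T = 100
betas = [1e-3 + (0.05 - 1e-3) * t / (T - 1) for t in range(T)]   # linear noise schedule
alphas = [1 - b for b in betas]
alpha_bars, prod = [], 1.0
for a in alphas:
    prod *= a
    alpha_bars.append(prod)

def eps_hat(x, t):
    # For N(0, 1) data the optimal noise prediction is sqrt(1 - alpha_bar_t) * x;
    # this plays the role of a trained denoising network.
    return math.sqrt(1 - alpha_bars[t]) * x

def ddpm_sample(rng):
    x = rng.gauss(0, 1)                     # x_T: pure noise
    for t in reversed(range(T)):
        z = rng.gauss(0, 1) if t > 0 else 0.0
        # DDPM reverse update: remove predicted noise, rescale, re-inject noise
        x = (x - betas[t] / math.sqrt(1 - alpha_bars[t]) * eps_hat(x, t)) \
            / math.sqrt(alphas[t]) + math.sqrt(betas[t]) * z
    return x

rng = random.Random(0)
samples = [ddpm_sample(rng) for _ in range(500)]
```

Running the reverse chain from pure noise recovers samples that match the data distribution; on a 2D spiral the intermediate states trace out exactly the kind of paths the visualization shows.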
@alec_helbling
Alec Helbling
1 year
I found this cool blog post by @jeffrey_heer where he makes interactive visualizations of Barnes-Hut approximations of N-body forces. It can be used to simulate the trajectories of stars in a galaxy as well as force layouts for graphs. Link:
3
43
221
@alec_helbling
Alec Helbling
1 year
I've been wanting to make some visualizations of diffusion models. I'm starting off with a basic visualization of Markov Chain Monte Carlo. This was made completely with python code as a part of the ManimML project. @manim_community Link to project:
5
31
219
@alec_helbling
Alec Helbling
3 months
Introducing DinoDiffusion 🦖 Here is a visualization of points being sampled from a dinosaur shaped 2D distribution using a diffusion model.
5
18
198
@alec_helbling
Alec Helbling
4 months
Why do we use log everywhere in optimization? It turns products into sums, which are easier to differentiate. And because log is monotone it doesn't move the location of the extrema, so the solutions stay the same. I think this is under-emphasized in optimization courses.
2
16
182
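A quick sanity check of the monotonicity point: maximizing a Bernoulli likelihood and its log over the same grid picks out the same parameter (the coin-flip data here are made up for illustration):

```python
import math

# Toy coin-flip data: 6 heads, 2 tails.
data = [1, 1, 0, 1, 1, 0, 1, 1]

def likelihood(p):
    # Product of per-observation probabilities: underflows at scale.
    return math.prod(p if x else 1 - p for x in data)

def log_likelihood(p):
    # Sum of logs: numerically stable and easier to differentiate.
    return sum(math.log(p) if x else math.log(1 - p) for x in data)

grid = [i / 1000 for i in range(1, 1000)]
best_raw = max(grid, key=likelihood)
best_log = max(grid, key=log_likelihood)
```

Both objectives peak at the MLE p = 6/8 = 0.75, since log only rescales heights, never reorders them.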
@alec_helbling
Alec Helbling
1 year
@8teAPi This narrative reads like that of Watson and Crick rushing to publish the structure of DNA before Linus Pauling could beat them to it, while snubbing Rosalind Franklin by excluding her from the paper.
1
6
103
@alec_helbling
Alec Helbling
1 year
I added a Max Pooling layer to ManimML, my tool for animating neural network architectures with python. It is built on top of the @manim_community library. Code:
1
15
104
@alec_helbling
Alec Helbling
1 year
I made an animation of Fukushima's Neocognitron, a predecessor to modern convolutional neural networks. The animation was made entirely in python using ManimML, the library I'm building. ManimML is built on top of @manim_community . Code:
2
16
97
@alec_helbling
Alec Helbling
2 years
This is a part of a bigger project called ManimML for visualizing neural networks and other ML algorithms using python code. (This feature is still a work in progress though).
1
10
91
@alec_helbling
Alec Helbling
1 year
I added the ability to make residual connections between layers to my library ManimML. This is something similar to what you would see in the classic ResNet architecture. This animation was generated using only python code @manim_community Link:
3
10
84
@alec_helbling
Alec Helbling
3 months
I found out the other day that I won the NSF Fellowship! Now I can continue to work on the intersection of generative models, visualization, and interactive machine learning. A special thanks to my advisor @PoloChau for all of his support.
9
1
55
@alec_helbling
Alec Helbling
2 years
Traditional Autoencoders suffer from the issue of learning latent representations with “holes”, discontinuities in latent space that don’t decode to quality outputs. Variational Autoencoders solve this by decoding vectors sampled from continuous distributions. @manim_community
4
6
54
@alec_helbling
Alec Helbling
2 years
The same features in an image can occur in many different places. Convolutional networks exploit this symmetry by sharing parameters across many locations in an image, drastically improving the parameter efficiency of the model. This property is called translation equivariance.
2
11
54
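The parameter savings from sharing are easy to make concrete. A hypothetical layer mapping a 32x32 single-channel image to 16 same-size feature maps (sizes chosen purely for illustration):

```python
# Parameter counts for one layer mapping a 32x32 single-channel image
# to 16 feature maps of the same spatial size.
h = w = 32
fully_connected = (h * w) * (16 * h * w)  # every input pixel connects to every output unit
convolutional = 16 * (3 * 3 * 1) + 16     # 16 shared 3x3 kernels plus biases
ratio = fully_connected / convolutional
```

The shared 3x3 kernels give roughly a hundred-thousand-fold reduction, and the same kernel responding at every location is exactly the translation-equivariance property.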
@alec_helbling
Alec Helbling
1 year
Text-to-image generation models like Stable Diffusion often have interpretable cross-attention maps. The maps corresponding to a token (e.g. "astronaut") often highlight the image region corresponding to the token's concept. Code for visualization:
1
15
53
@alec_helbling
Alec Helbling
10 months
I'll be presenting ManimML at @ieeevis next month in Australia. ManimML is a python library for automatically generating animations of machine learning architectures like neural networks from just code. Code:
2
8
48
@alec_helbling
Alec Helbling
1 year
This is a fun graphic showing the benefit of momentum in gradient based optimization.
@DSaience
Sairam Sundaresan
2 years
Trying out something new :) Poorly drawn machine learning #1 - How momentum helps gradient descent
Tweet media one
21
100
945
2
4
46
@alec_helbling
Alec Helbling
1 year
I've been digging into @mbostock 's blog post (although calling it a blog post is an understatement) on Visualizing Algorithms. "Visualization leverages the human visual system to augment human intellect." Link:
0
8
45
@alec_helbling
Alec Helbling
1 year
I just found the website of @BCiechanowski where he has these amazing interactive blogs explaining various science and engineering concepts. Here is one he did explaining in detail how sound works with really fun interactive visualizations. Link:
1
6
45
@alec_helbling
Alec Helbling
3 months
Excited that this summer I will be working on diffusion models as a research intern at Adobe on the Firefly team. If you are in the Bay Area feel free to hit me up!
2
0
44
@alec_helbling
Alec Helbling
1 year
I added a light mode to ManimML, my library for visualizing machine learning architectures using python code. Here is a very simple convolutional neural network. It seems to give the visualizations more of a professional feel than dark mode. Code:
1
1
44
@alec_helbling
Alec Helbling
2 years
A simple clustering algorithm. 1. Form a graph from a set of points by creating an edge between pairs of points within an epsilon radius. 2. Then label points according to their connected component (the connected subgraph). Visualization made with @manim_community .
2
3
39
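The two steps can be sketched in plain Python (the example points and epsilon below are made up):

```python
from itertools import combinations
from math import dist

def epsilon_cluster(points, eps):
    """Label points by connected component of the epsilon-neighborhood graph."""
    n = len(points)
    adj = [[] for _ in range(n)]
    for i, j in combinations(range(n), 2):
        if dist(points[i], points[j]) <= eps:   # step 1: build the graph
            adj[i].append(j)
            adj[j].append(i)
    labels = [None] * n
    cluster = 0
    for start in range(n):                      # step 2: flood-fill components
        if labels[start] is not None:
            continue
        stack = [start]
        while stack:
            node = stack.pop()
            if labels[node] is None:
                labels[node] = cluster
                stack.extend(adj[node])
        cluster += 1
    return labels

points = [(0, 0), (0.5, 0), (0.4, 0.3), (5, 5), (5.2, 4.9)]
labels = epsilon_cluster(points, eps=1.0)
```

Points within epsilon of each other chain together into one component, so the two tight groups above get two distinct labels.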
@alec_helbling
Alec Helbling
3 months
Dug up some old visualizations of decision tree classifiers. Considering revisiting this topic.
1
8
41
@alec_helbling
Alec Helbling
1 year
Diffusion models learn to take images with noise repeatedly added to them and then iteratively denoise them until they resemble real images. This visualization was made in Python using ManimML, a library built on @manim_community . Link to code:
0
4
39
@alec_helbling
Alec Helbling
3 years
I made a video about Decision Tree Classifiers using @manim_community . I hope these visualizations help people understand this interesting ML algorithm intuitively. It is inspired by many of @3blue1brown 's videos.
Tweet media one
0
3
36
@alec_helbling
Alec Helbling
3 years
Here is another Decision Tree Classifier visualization made from @manim_community . We want to split data into two subsets at the location that maximizes information gain. This maximizes the class homogeneity of each of the separated regions.
1
7
32
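A minimal sketch of picking the split that maximizes information gain on a 1D feature (toy data; entropy measured in bits):

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def best_split(xs, ys):
    """Choose the threshold on a 1D feature that maximizes information gain."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    xs = [xs[i] for i in order]
    ys = [ys[i] for i in order]
    base = entropy(ys)
    best_gain, best_threshold = 0.0, None
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        weighted = (len(left) * entropy(left) + len(right) * entropy(right)) / len(ys)
        gain = base - weighted
        if gain > best_gain:
            best_gain, best_threshold = gain, (xs[i - 1] + xs[i]) / 2
    return best_threshold, best_gain

xs = [1.0, 2.0, 3.0, 8.0, 9.0, 10.0]
ys = ["a", "a", "a", "b", "b", "b"]
threshold, gain = best_split(xs, ys)
```

Here the classes separate perfectly, so the split between 3 and 8 recovers the full base entropy of 1 bit as gain, i.e. each region becomes completely homogeneous.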
@alec_helbling
Alec Helbling
2 years
Here is an animated visualization of a Generative Adversarial Network. These models learn to render realistic seeming images through an adversarial training process in which a Generator network tries to outsmart a Discriminator network. @manim_community
1
4
36
@alec_helbling
Alec Helbling
1 year
Easing functions help make animations feel less blunt by letting you adjust the rate of change of an animation. Here is a cool cheat sheet on different easing functions that can be used as a part of animations. Link:
0
7
36
@alec_helbling
Alec Helbling
10 months
@_jasonwei This metric would likely undervalue those who work with more junior researchers and students, who are likely to produce less highly cited preliminary work as a part of their learning process. All metrics are imperfect though.
1
0
35
@alec_helbling
Alec Helbling
1 year
Here is a cool project called git-sim from @initcommit . It visualizes the commit-graph of a git repository! Code:
1
9
33
@alec_helbling
Alec Helbling
11 months
Excited to be starting my PhD at Georgia Tech next week. I’ll be working with @PoloChau at the intersection of ML and Visualization. I’ve heard a lot of buzz about these so called “publications” and “grants”. They seem important. Anyone care to explain the significance?
Tweet media one
5
0
33
@alec_helbling
Alec Helbling
9 months
I'm excited to have won the Best Poster Award at IEEE VIS ( @ieeevis ) in Melbourne last week for ManimML. Made the loooonnngg flight more worth it. Thanks to @PoloChau for his incredibly helpful support on the submission.
Tweet media one
1
2
32
@alec_helbling
Alec Helbling
2 years
Visualization of a 2D convolutional layer. Made as a part of my project ManimML, an extension to the @manim_community library. Repo:
0
6
31
@alec_helbling
Alec Helbling
3 months
Abdul Fatir ( @solitarypenman ) has a great blog post that goes much more in depth on this topic.
0
5
31
@alec_helbling
Alec Helbling
2 years
@txhf But your genetics do give you a pretty good foundation model to conduct transfer learning on. Learning within your life is just fine tuning 🤣.
1
2
29
@alec_helbling
Alec Helbling
2 years
VAEs learn continuous latent representations, which can be sampled from to produce realistic generated images at any location in latent space. Here are some interpolations of the MNIST handwritten digit dataset. @manim_community
1
4
27
@alec_helbling
Alec Helbling
28 days
in an effort to cater to a gen z audience I'm experimenting with different ways of promoting my research what do you think?
3
2
26
@alec_helbling
Alec Helbling
3 years
I have been making math visualizations with the @manim_community python library. Using a great tool like this really brings into question why we are still using static whitepapers to communicate research. Here is a visualization of information entropy that I made with Manim.
1
3
23
@alec_helbling
Alec Helbling
1 year
This is part of a bigger project called ManimML. The goal is to allow researchers to visualize various ML architectures using Pytorch-like python code. The specific example can be found here:
3
2
21
@alec_helbling
Alec Helbling
2 years
Variational Autoencoders can learn “disentangled” representations, where each of the latent attributes correspond to meaningfully different features. This is enforced by the isotropic Gaussian prior, which encourages independence between latent units. @manim_community
1
4
20
@alec_helbling
Alec Helbling
1 year
I tried out one of those models for improving the aesthetic of an image on an old profile picture. It worked great.
Tweet media one
2
1
20
@alec_helbling
Alec Helbling
2 years
I am making a python tool for intuitively visualizing machine learning algorithms. This is a visualization of the forward pass of a Variational Autoencoder generated with a pytorch-like syntax. It is built on top of the Manim library @manim_community .
1
2
20
@alec_helbling
Alec Helbling
2 years
@WenhuChen The LinkedIn hive mind has infiltrated Twitter.
0
1
19
@alec_helbling
Alec Helbling
7 months
I am at #NeurIPS2023 this week presenting some work at the Workshop on Creativity and Design. We developed a method enabling diffusion models to compose multiple reference objects w/o fine-tuning. If you would like to risk getting food poisoning on some oysters, reach out!
Tweet media one
2
3
18
@alec_helbling
Alec Helbling
1 year
Structure and Interpretation of Computer Programs definitely has some of the most intuitive visualizations of programming concepts out of any book I have read. Here is the tree-recursive process generated when computing the Fibonacci sequence. Link:
Tweet media one
0
3
18
@alec_helbling
Alec Helbling
10 months
This blog post from Max Slater gives a very thorough introduction to representing functions as vectors. It has awesome interactive visualizations as well.
0
2
16
@alec_helbling
Alec Helbling
2 years
Check out our paper at the ICLR DGM4HSD workshop this Friday. We developed a system that lets users control the content of images synthesized with VAEs by answering queries of the form, “do you prefer image A or image B?” Paper:
1
5
17
@alec_helbling
Alec Helbling
1 year
1
0
15
@alec_helbling
Alec Helbling
2 years
This is a part of an ongoing project that I call Manim Machine Learning. I want to make primitive assets so that anyone can quickly make intuitive visualizations of complex machine learning topics.
0
3
16
@alec_helbling
Alec Helbling
2 years
This is a part of a larger project called ManimML. The focus of ManimML is to create a tool for visualizing neural network architectures and other machine learning systems directly using python.
0
2
16
@alec_helbling
Alec Helbling
1 year
I tested the Segment Anything Model (SAM) on the task of referring image segmentation. The goal is to segment an image given a text prompt. I made a simple baseline and tested it on RefCOCO. Here are some interesting results. Code:
Tweet media one
2
2
16
@alec_helbling
Alec Helbling
8 months
I hacked together a system for doing progressive rendering for diffusion models and latent consistency models. It allows a user to see a coarse version of an image composition very quickly (>1 frame per second) before the full scheduler can run. Code:
1
5
16
@alec_helbling
Alec Helbling
2 months
I’ll be at ICLR in Vienna this week. Hit me up!
3
0
14
@alec_helbling
Alec Helbling
3 months
ok which one should I get
Tweet media one
Tweet media two
8
1
14
@alec_helbling
Alec Helbling
3 months
you wouldn’t last an hour in the asylum where they raised me
Tweet media one
1
3
15
@alec_helbling
Alec Helbling
2 years
This is a part of an ongoing project that I call Manim Machine Learning (ManimML). I want to make primitive assets so that anyone can quickly make intuitive visualizations of complex machine learning topics.
1
1
11
@alec_helbling
Alec Helbling
4 months
A fun fact about convexity. The intersection of convex sets is convex. If you find the overlapping regions of a bunch of half-spaces then it will be a polytope. Any point along the line between a pair of points in this set will also be in this set.
1
0
11
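A small numeric check of the claim, using four half-spaces a·x ≤ b whose intersection is the unit square (the specific points on the segment are arbitrary):

```python
# Four half-spaces a.x <= b whose intersection is the unit square, a convex polytope.
halfspaces = [((1, 0), 1), ((-1, 0), 1), ((0, 1), 1), ((0, -1), 1)]

def feasible(pt):
    return all(a[0] * pt[0] + a[1] * pt[1] <= b for a, b in halfspaces)

def convex_combo(p, q, t):
    return (p[0] * (1 - t) + q[0] * t, p[1] * (1 - t) + q[1] * t)

p, q = (-0.9, 0.5), (1.0, -1.0)       # two points inside the intersection
segment = [convex_combo(p, q, t / 10) for t in range(11)]
```

Every point sampled along the segment between two feasible points stays feasible, since each half-space is convex and convexity survives intersection.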
@alec_helbling
Alec Helbling
3 years
@colinraffel Also I like this one from the MAML paper. It is very brief but gives a good high level intuition of the major concept. ()
Tweet media one
1
0
12
@alec_helbling
Alec Helbling
1 year
It seems like the success of LLMs came from an efficient network architecture (the transformer) combined with SSL on huge datasets of text. It seems logical to attempt to replicate this success on other modalities (e.g. user interface data, video) that can be collected at immense scale.
1
1
9
@alec_helbling
Alec Helbling
1 year
This is a part of a bigger project called ManimML for visualizing neural networks and other ML algorithms using python code. (This feature is still a work in progress though).
0
1
11
@alec_helbling
Alec Helbling
2 years
The unit circle can be intuitively connected to Sine and Cosine waves. A Sine wave can be thought of as a function of the height of a point as it is rotated around the unit circle at a constant rate. This was always my favorite way to think about the unit circle.
0
0
12
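The picture in words can be checked numerically: rotate a point around the unit circle in small constant steps and record its height, which traces out the sine of the accumulated angle (step count chosen arbitrarily):

```python
import math

# Rotate the point (1, 0) around the origin in constant angular steps and
# record its height; the trace matches sin of the accumulated angle.
steps = 360
dtheta = 2 * math.pi / steps
cos_d, sin_d = math.cos(dtheta), math.sin(dtheta)
x, y = 1.0, 0.0
heights = []
for _ in range(steps):
    x, y = x * cos_d - y * sin_d, x * sin_d + y * cos_d  # one rotation-matrix step
    heights.append(y)
```

After a quarter turn the height peaks at 1, and after a half turn it returns to 0, exactly the shape of one sine period.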
@alec_helbling
Alec Helbling
1 year
tensor.detach().cpu().numpy()
0
0
11
@alec_helbling
Alec Helbling
3 years
@colinraffel I like this one from Graph Attention Networks ()
Tweet media one
2
0
11
@alec_helbling
Alec Helbling
1 year
Is anyone working on integrating context from more than just code into a GitHub Copilot-type model? I'm talking about things like clipboard data, which files are open, and the contents of the terminal.
2
1
9
@alec_helbling
Alec Helbling
2 years
This is a really interesting paper from @andrewgwils and @Pavel_Izmailov that provides a more rigorous perspective on inductive bias and generalization. Inductive bias is a concept that seems to have many different definitions in the research community.
Tweet media one
1
2
10
@alec_helbling
Alec Helbling
8 months
Using only a restaurant, tell us where you went to college
Tweet media one
@garrytan
Garry Tan
8 months
Using only a restaurant, tell us where you went to college
Tweet media one
162
5
482
2
0
9
@alec_helbling
Alec Helbling
1 year
It's now mandatory to have at least one of these radar charts in every new paper or it will be desk rejected.
Tweet media one
0
2
9
@alec_helbling
Alec Helbling
3 years
@fchollet This seems to be a reflection of the general human tendency to hero worship individuals instead of groups for collective accomplishments. For example, the praise of Neil Armstrong as opposed to the general NASA community.
1
0
9
@alec_helbling
Alec Helbling
3 months
Cool effect of interpolating between points distributed like different digits. Wait until the end ...
0
1
9
@alec_helbling
Alec Helbling
2 years
@dpkingma @manim_community Here is another one I made of decision tree classifiers.
@alec_helbling
Alec Helbling
3 years
This is a visualization of a decision tree classifier I made using @manim_community . Decision trees are one of the most intuitive and underappreciated machine learning techniques. This is a part of a video I am making for the upcoming @3blue1brown competition.
8
51
321
0
1
9
@alec_helbling
Alec Helbling
2 years
@stanislavfort Wow. Looks like you can prompt ChatGPT to do some sort of chain of thought reasoning in order to get better results at reasoning tasks.
2
0
8
@alec_helbling
Alec Helbling
1 year
This is an awesome introduction to Geometric Deep Learning and Graph Neural Networks.
@chaitjo
Chaitanya K. Joshi
1 year
❓New to Geometric GNNs, GDL, PyTorch Geometric, etc.? Want to understand how theory/equations connect to real code? Try this practical notebook before diving into this exciting area! **Geometric GNNs 101**
Tweet media one
7
168
808
1
0
8
@alec_helbling
Alec Helbling
8 months
I wish Gödel would’ve completed that theorem of his. I’d sure like to see the rest of it.
0
0
8
@alec_helbling
Alec Helbling
6 months
One of my greatest frustrations with learning math is that it lacks the instant feedback loop of a programming environment.
2
0
8
@alec_helbling
Alec Helbling
2 years
With text-guided diffusion has anyone addressed the problem of fixing the identity of a face while allowing the rest of the context to be changed? It seems like generating pictures of the same person or thing in many different contexts would be very useful. #stablediffusion
3
0
6
@alec_helbling
Alec Helbling
4 months
@justinliang1020 Using monitors instead of looking down at a laptop fixed pain in my traps and neck.
1
0
1
@alec_helbling
Alec Helbling
2 years
I've been tinkering with a code generated animation of the forward pass of a convolutional neural network. I use a Pytorch style syntax for constructing a neural network layer by layer.
1
0
7