What is the integrative logic of proprioception? Sensory information is collected by muscle spindles distributed throughout the body, but how is that information integrated? We took a computational approach to this problem...
@marius10p
thanks for the idea - seems like nothing is camouflaged to
#DeepLabCut
- Trained with !! 14 frames !! for 8min on GPU (6,000 iterations). Frame-by-frame prediction. Paper:
@sofiabiologista
Thanks to
@snsf_ch
for funding my research project titled: "A theory-driven approach to understanding the neural circuits of proprioception".
I'm looking for a postdoc, PhD student and research engineer. Please RT.
Do you want to learn how to track poses, analyze what animals are doing, identify your favorite bear or link behavioral to neural data?
Join the hands-on, 5 day Cajal course from 6 to 10 November!
@Cajal_Training
Thank you for this recognition. Receiving this award with Mackenzie means a lot to me. We are also grateful to all past and present collaborators as well as our group members for making this possible. Thanks for the fantastic journey to
@NeuroVenki
,
@MatthiasBethge
,
@DulacLab
, ..
🍾Deeply humbled, honored 🙏🏼, and so, so happy to share this with my best friend & life partner
@TrackingPlumes
🥰.
Also a huge thanks to our mentors and labs - it’s been a privilege to work with you🙏🏼
One major idea we are putting forward in this work is that proprioception can be understood normatively as having to solve action-recognition from receptor inputs! These "actions" can then be used for perception, balance, motor control and learning...
✨
#tweetprint
: Introducing our ANN model(s) of the proprioceptive system!
@OpenSimSU
+spindles+ANNs+action-recognition task.
We find task-training really matters for ANNs + trained units resemble S1 neurons
preprint: project page:
Very excited to introduce the beta version of the
@DeepLabCut
Model Zoo! We envision that, over time, training will no longer be necessary for common experiments. For now, during stay-at-home orders, the dog and cat models might be particularly useful.
Did someone say dogs? 🐶
✨ Introducing the DeepLabCut Model Zoo ✨
A growing collection of trained models for you to immediately use - no training required, no installation required (see below)! 🐕🐵🐈🐕🧍🐭
#openscience
Implicit shape representations, such as Signed Distance Fields (SDFs), are great for 3D reconstruction. But do they help 3D pose estimation? I'm sharing our recent work HOISDF, which has just been accepted by
#CVPR2024
work led by PhD student Haozhe Qi!!
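For anyone new to implicit shape representations: an SDF just maps each 3D point to its signed distance from a surface (negative inside, zero on the surface, positive outside). A minimal sketch for a sphere — purely an illustration of the concept, not the HOISDF model:

```python
import math

def sphere_sdf(point, center=(0.0, 0.0, 0.0), radius=1.0):
    # Signed distance from `point` to the sphere surface:
    # negative inside, zero on the surface, positive outside.
    return math.dist(point, center) - radius

inside = sphere_sdf((0.0, 0.0, 0.5))   # -0.5 (inside)
surface = sphere_sdf((1.0, 0.0, 0.0))  # 0.0 (on the surface)
outside = sphere_sdf((0.0, 2.0, 0.0))  # 1.0 (outside)
```

The zero-level set of this function is the shape itself, which is why SDFs are such a convenient target for 3D reconstruction networks.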
The Schools of Computer and Communication Sciences (IC) and Life Sciences (SV) at EPFL invite applications for a faculty position at the intersection of computational and life sciences.
@epflSV
@ICepfl
Please RT.
My lab will be working on quantifying behavior and computational models of motor control! I’ll be recruiting soon... please reach out to me if you are interested!
Greatly enjoyed contributing a tiny part to this exciting cancer study of Profs.
@EPFL_altug_lab
and Camilla Jandus. I joined the project after discussions at the
@epflSV
retreat in Jan 2020 and learned a lot!
We had great fun participating & writing the paper with all the winning teams and organizers -- check it out! I'm quite excited about what one can learn about biological motor control with the new simulators! Our team:
@chiappa_alberto
@pablo_tano8
@nisheet0
@pouget_alex
MyoChallenge 22 paper is out 🔥
Have a look at the smart solutions 🧠 of last year's winners
Lots of good ideas in it for this year's manipulation challenge😈
➡️
We are looking for a few more teaching assistants --> so please apply!
So far, we have local hubs in Athens, Bochum, Buenos Aires, Geneva, Kigali, London, Munich, Okinawa, and Oxford... you can pick any of those or a virtual hub, when you apply as a student!
👋
@amathislab
&
@mwmathislab
are looking to jointly hire a research software engineer at EPFL - one of the nicest places to work, very multicultural, super team oriented, work hard/play hard with Mont Blanc & Lake Léman in your backyard type of vibes 🏔️🧗♀️🎿⛵️Ad coming soon! 👀
Second day on the job & excited to participate in the symposium “Emerging technologies for chemosensory research” at
#ISOT2020
with
@DulacLab
,
@ShohamLab
& Florin Albeanu (live from my chalet)
Check out this exciting work by PhD student
@GabeffValentin
co-advised with
@devistuia
! This is the full version of the article that was selected as an oral at CV4animals
#cvpr2023
.
Trails of hominid footprints, “motion” captured by Pliocene deposits at Laetoli that date to 3.66 million years ago, firmly established that early hominoids achieved an upright, bipedal, and free-striding gait (Leakey & Hay, 1979). Beyond fossilized locomotion, behavior can [...]
🥳
#teamDLC
is delighted to present our Primer on Motion Capture with Deep Learning in
@NeuroCellPress
!
📑 want to know more about neural networks, optimization, data augmentation?
📈 pitfalls to avoid?
Paper:
Complete w/code:
Neuro jobs at
@epfl_en
-- The Ecole polytechnique fédérale de Lausanne (EPFL) invites applications for two faculty positions at the Tenure Track Assistant Professor level in the Brain Mind Institute of the School of Life Sciences.
🤩 Exceptional registration opened to attend the
#FENS
#SummerSchool
2022 on "Artificial and natural computations for sensory perception: what is the link?", taking place on 22-28 May in Bertinoro, Italy. Apply before 18 March.
@BathellierL
@AToliasLab
"On the inference speed and video-compression robustness of
#DeepLabCut
" - up to 1,000 FPS, testing different GPUs, and showing that DLC is robust to video compression! - save time & data space. New work from
@TrackingPlumes
& Rick Warren.
@biorxivpreprint
Really love the technical solution behind this GUI, which is actually more of an interactive plot -- pure
@matplotlib
mastery by
@jessy_lauer
! So happy to have him on the team. Thanks
@cziscience
for the support!
#maDeepLabCut
: multi-animal support + high perf. pose estimation. We will highlight updates in the coming days, but here is 1:
This NEW tracking GUI built w/
@matplotlib
allows editing errors or swaps & dealing with occlusions:
📹
📹
Join the Cajal hands-on neuroscience course Quantitative Approaches to Behaviour, in which I will be taking part with
@DeepLabCut
and friends!
19 July - 8 August 2020 at the
Champalimaud.
Apply here:
Super proud of my first two PhD students
@LStoffl
, who finished his first study on using transformers for pose estimation and
@a_marinvargas
, who wrote a review on ML for motor neuro!!!
We developed an end-to-end trainable multi-instance pose estimation model with transformers - - great work by PhD student
@LStoffl
and Master's student
@vmaxmc2
!!!
Thus, transfer learning - using networks pretrained on ImageNet - boosts robustness & performance for pose estimation, and is really important for achieving robust results on lab-sized datasets! This work was led by:
@TrackingPlumes
Preprint: 5/
Thanks to my wonderful team, collaborators and the DLC community! 2022 was great fun!! Happy holidays!!! --> Next, I'll be off the grid for a while!
#ski
🎁 The
#DeepLabCut
2022 Year in Review 🗓
- what we achieved, what you loved, where we are going in 2023!
- THANK YOU to the amazing community
Read on our blog now⬇️
(and a new DLC release will be out before we head off for a holiday break 2.3💜✅🙏)
A real treat to present together with the
@DeepLabCut
developers
@TrackingPlumes
and
@TrackingActions
in a symposium at the International Winter Neuroscience Conference
#IWNC
in Sölden, Austria. For both Alexander and me our first science presentation in our motherland!! 😯😊🥳🇦🇹
DLC 2.1 includes dynamic auto-cropping. This dramatically improves inference speed for open-field setups (like for the rat below) where the animal is much smaller than the frame and no prior tracking is necessary.
#DeepLabCut
🚨 ODEFormer is on Arxiv!
We show that Transformers can recover the differential equations governing dynamical systems from noisy & irregularly sampled trajectories.
Very fun collaboration with
@SorenBecker
,
@TrackingPlumes
,
@pschwllr
&
@k__niki
!
🧵⤵️
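To make the task concrete: ODEFormer uses a trained transformer for this, but the classical baseline it improves on is instructive. For a linear ODE you can recover the governing coefficient by regressing finite-difference derivatives against the state — a toy sketch (not the paper's method):

```python
import math

# Sample a trajectory of dx/dt = a*x, with true a = -0.5.
h = 0.01
ts = [i * h for i in range(201)]
xs = [math.exp(-0.5 * t) for t in ts]

# Central finite differences approximate dx/dt at interior points.
dxdt = [(xs[i + 1] - xs[i - 1]) / (2 * h) for i in range(1, len(xs) - 1)]
mid = xs[1:-1]

# Least-squares fit of dx/dt = a*x recovers the coefficient.
a_hat = sum(d * x for d, x in zip(dxdt, mid)) / sum(x * x for x in mid)
# a_hat ≈ -0.5
```

This works for a clean linear system on a regular grid; the hard part ODEFormer tackles is symbolic recovery of nonlinear equations from noisy, irregularly sampled trajectories, where naive finite differences break down.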
This is the second time we organize this course. Last year we had 93 students from across the globe!
Here are the logos of the hubs we had then, from London, Nairobi, Munich, Buenos Aires and Warsaw. This year we hope to cast an even wider net!! Join us.
🥳
@akaijsa
&
@pranavm42
et al introduce task-driven modelling of the proprioceptive system, now out in
@eLife
!
Our work combines deep learning & biomechanics to test theories of💪sensory representations
We show that HOISDF achieves state-of-the-art results on the widely used hand-object pose estimation benchmarks (DexYCB and HO3Dv2). Check out our project page for more details. This work was done with Chen Zhao and Mathieu Salzmann (EPFL).
3D arm movements were analyzed using multiple cameras and the awesome
@DeepLabCut
Monkeys optimized their behavior (although differently) across trials despite not being restrained. I.e. we can do conventional analyses as in “constrained” experiments but within a large env 7/n
Special thanks to
@slimanjbensmaia
with whom we had deep discussions during his visit to EPFL in 2021 and of course
@TrackingActions
for discussions throughout.
📣Part 1 of our quest to better understand the brain was
@DeepLabCut
.
🔥🦓Now Part 2: Introducing
#CEBRA
to jointly model neural dynamics & behavior with self-supervised learning. Hypothesis- or data-driven, highly consistent, decodable neural latents
🧵👇
The grids are a torus, and continuous attractors are real!! Neuropixels makes a difference, as do outstanding postdocs and collaborators!
#Gridcell
#KiloNeurons
@KISNeuro
#CAN
#EOSSmtg
@cziscience
kick off day 1! Beyond excited to see all the incredible tools, open source enthusiasm, and focus on inclusion and community (And so fun to have
@DeepLabCut
w/
@matplotlib
on the meeting cover)
Character recognition from muscle spindle input is just one example of this objective, but I think it has many advantages and, as we show, is sufficient for the emergence of "interesting" coding properties!
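The flavor of "action recognition from receptor inputs" can be conveyed with a toy example — a nearest-template classifier on simulated receptor-like traces. This is only an illustration of the task formulation (the paper uses biomechanically simulated spindle signals and trained ANNs, not this):

```python
import math
import random

random.seed(0)

def template(kind, n=50):
    # Two toy "movements" producing different receptor-like traces.
    if kind == "circle":
        return [math.sin(2 * math.pi * i / n) for i in range(n)]
    return [i / n for i in range(n)]  # "reach": a ramp

def classify(trace, templates):
    # Nearest template by mean-removed correlation.
    ms = sum(trace) / len(trace)
    def score(t):
        mt = sum(t) / len(t)
        return sum((a - ms) * (b - mt) for a, b in zip(trace, t))
    return max(templates, key=lambda k: score(templates[k]))

templates = {"circle": template("circle"), "reach": template("reach")}
noisy = [v + random.gauss(0, 0.2) for v in template("circle")]
label = classify(noisy, templates)  # recovers "circle" despite the noise
```

The normative claim is that solving this kind of recognition problem from receptor inputs — at scale, with real biomechanics — is enough to shape the coding properties observed in the proprioceptive pathway.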
Dynamic cropping runs just like standard analyze_videos, but with the dynamic flag set: deeplabcut.analyze_videos(path_config_file, [video], dynamic=(True, .1, 30)). See:
@kendmil
@AndreaBanino
@RaiaHadsell
Why do I bring (c) up: Whatever the motivations for "citing/not citing" are: forgetting, purposeful omission, not being able to cite something because it didn't yet exist... IMHO authors should give other authors the benefit of the doubt... 4/
#EBBS2021
fully hybrid format seems to be working!
We have
@EdvardMoser
, Nobel Prize winner presenting his work and an engaging discussion animated by Pierre Lavenex and
@TrackingPlumes
.
@EBBS_Science
Meeting taking place right now at Forum Rolex
@EPFL_en
!
Thanks everyone!
The self-attention mechanism can be viewed as the update rule of a Hopfield network with continuous states.
Deep learning models can take advantage of Hopfield networks as a powerful concept comprising pooling, memory, and attention.
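The correspondence fits in a few lines: one step of the modern (continuous) Hopfield update is a softmax-weighted read-out of the stored patterns, which is exactly the attention operation with the patterns as keys and values. A minimal pure-Python sketch of the update rule:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def hopfield_update(patterns, query, beta=8.0):
    # One step of the continuous Hopfield update:
    # xi_new = X^T softmax(beta * X xi)
    # i.e. attention with stored patterns as keys and values.
    weights = softmax([beta * sum(p_i * q_i for p_i, q_i in zip(p, query))
                       for p in patterns])
    dim = len(query)
    return [sum(w * p[i] for w, p in zip(weights, patterns)) for i in range(dim)]

# Store two patterns; a noisy query retrieves the nearest one in one step.
stored = [[1.0, 0.0, 1.0, 0.0], [0.0, 1.0, 0.0, 1.0]]
noisy = [0.9, 0.1, 0.8, 0.0]
retrieved = hopfield_update(stored, noisy)  # ≈ [1, 0, 1, 0]
```

With a large enough beta (inverse temperature), retrieval converges to a stored pattern in a single step — the "one-step retrieval" property that makes the attention/Hopfield link so appealing.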
Very happy to share our latest work! Frustrated with expensive and inflexible rodent behavior testing systems, we compared
@DeepLabCut
head-to-head with 2 leading commercial systems. But first, please turn up the volume to 11 for this trailer 🙂
#Rammstein
#MachineLearning
We seek candidates who use machine learning and/or other original computational approaches to solve challenges in life sciences. The appointment could be either at the assistant professor or tenured level.
@EPFL_en