🎉WE ARE OPEN!
Seiryu's new Live2D Vtuber rigging service is now available! Check out our new website and a showreel of our work here
Links and details in the comments below:
[Live2D Protips]
Eyes are a delicate matter for face tracking, so expect a lot of jitter from them. I've seen plenty of 2D Vtubers with this 'lazy eyes' look.
You can try to compensate for the jitter on the rigging side by adding some extra 'buffer' keys to the eye-open state.
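For anyone scripting their own tracking input, the same idea can also be done in software as a deadzone plus smoothing. This is a minimal illustrative sketch under made-up names, not any real tracking app's API:

```javascript
// Deadzone + exponential smoothing over a noisy eye-open value,
// where 0 = closed and 1 = open. All names here are illustrative.
function makeEyeSmoother(deadzone = 0.03, smoothing = 0.5) {
  let last = 1; // assume the eye starts open
  return function smooth(raw) {
    // Tiny fluctuations around the previous value are ignored --
    // the software-side equivalent of the 'buffer' keys.
    if (Math.abs(raw - last) < deadzone) return last;
    // Larger changes (a real blink) ease toward the new value.
    last = last + (raw - last) * smoothing;
    return last;
  };
}
```

The deadzone swallows the jitter; the smoothing factor keeps real blinks from snapping.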
Here's the breakdown of the transition effect I made in Live2D earlier! As you may guess, it's just layered parameter animation and mask magic.
I believe the params can be simplified, but I ain't changing what works for now lol
Here's an oversimplified note I wrote on one way to prepare hand layers for finger tracking. You'll want to break the hands down to each of their knuckles and keep everything as separate as you can. Ofc this is not a definitive guide; different hands may require different methods.
Some notable tracking differences I found while testing the new
#VtubeStudio
RTX tracker compared to iOS and OSF (OpenSeeFace)
This is not a definitive comparison, just take it as something to look out for if you are calibrating between trackers!
Did you know
@AiriGwynevere
has a nose?
Here's one way to make a nose transition in Live2D: I used a skin-colored layer with soft edges to hide the nose.
Then decide the angle at which to reveal the nose. I find revealing the nose tip first, then the bridge, looks more natural.
I used to avoid the Glue Tool; I always found it finicky to work with, but once I spent more time with it I could see how it can be useful in cases beyond connecting joints like most guides show 🔍
A little example on
#Live2D
Hoodie rigging with glue
A whole mental image of a Live2D rigger when they see a flat image be like
These are the PSD notes I wrote for
@YuikaiChan
's jacket. I want it to be fully functional, so the least I can do is make sure the layer cutting is just 👌right👌
[Live2D Protips]
By default, params in Live2D move in a linear trajectory, which can make parts that are supposed to bend in an arc (e.g. hair) look... stiff. This is where 'Extended Interpolation' comes to the rescue!
Live2D Protips
#4
:
Improve your model's hair physics by adding just one more parameter ;)
By only using warp deformers or mesh deformation, you can make smooth hair physics with fewer parameters and deformers than skinning!
[Live2D Protips]
Did you know you can play animation scenes RIGHT in the Physics viewer?
You won't need to switch to your Vtuber app just to test idle motion. You can also create custom breathing parameters to link your physics input to! Here's how.
#Live2D
ngl it feels like a crime that the Vitamins plugin for VTube Studio is so underrated 😭
Vitamins makes it so easy to add conditional triggers and interactivity to your Live2D model with the power of JavaScript!
I want to use this 🧵to showcase cool stuff people made with Vitamins 👇
【Quick Vitamins /
#Live2DShowcase
】
Added a few small, simple Vitamins params to my model!
▷ Chuu: puckering, then animating over time
▷ Zipper: a head nod triggers the zipper falling
▷ Buns: hair size changes based on mouth movement
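As an illustration of what a conditional trigger like the zipper one boils down to, here's a minimal sketch in plain JavaScript. The parameter names and the update loop are my own assumptions for the sketch, not Vitamins' actual scripting API:

```javascript
// "Head nod drops the zipper" as a threshold trigger with re-arming.
// angleY and the zipper value are illustrative, not real Vitamins params.
function makeNodTrigger(threshold = -15) {
  let armed = true; // prevents re-firing while the head is still down
  let zipper = 0;   // 0 = zipped up, 1 = fully fallen
  return {
    update(angleY) {
      if (armed && angleY < threshold) {
        zipper = 1;   // nod detected: drop the zipper
        armed = false;
      } else if (!armed && angleY > 0) {
        armed = true; // head back up: re-arm the trigger
      }
      return zipper;
    },
  };
}
```

The `armed` flag is what makes it a trigger rather than a direct mapping: it fires once per nod instead of flickering with every frame of tracking noise.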
Additive blend mode is cool, even more so when I have proper art prepared to make shiny effects like this. I mean, look at how
@kinKaikii
prepared the layers there, it's very well thought out!
#Live2D
I tried recording VTube Studio tracking data so I can test directly in the Live2D Physics Editor window. The result is quite nice; being able to edit physics on the fly is really convenient.
*bottom left is the original motion recorded in VTS
I wish
@VTubeStudio
had a pen tilt rotation input; that'd make a drawing-hand rig much better. Or maybe someone is interested in making a VTS plugin for it? ( ´・ω・`)
#Live2D
If you got my name card, it has a little me on it ✨
This is a Live2D AR scene running in a phone browser, no app install needed!
It's not a video overlay but an actual Live2D model, so developing more interactivity should be possible! Having your Vtuber model live on your prints seems cool 👀
Here's the Live2D "hologram" stand at my CF17 booth yesterday!
Another fun way to showcase your Vtuber model, and it's an effective crowd stopper too -- I saw people get drawn to it!
It's simple to build and you can make it yourself, details below 👇
(model:
@NovellusDea
)
setting aside things and trying out
#Live2D
5.1a's 3D shape features.
Needless to say, you'll still need the fundamentals to make it work as you want, but I think I can use the 3D expression feature in my future rigs :D
I'll try to explain Live2D Physics Effectiveness since it's related to the new
#VTubeStudio
"Legacy Physics" option.
The Effectiveness setting determines the ratio of the physics output; e.g. 50% effectiveness halves the output. But there's more to it than that, as detailed below.
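The ratio part is simple enough to write out as a sketch:

```javascript
// The ratio behavior of Effectiveness: the physics output is scaled
// by the effectiveness percentage before being applied to the param.
function applyEffectiveness(physicsOutput, effectivenessPercent) {
  return physicsOutput * (effectivenessPercent / 100);
}

applyEffectiveness(30, 50); // -> 15: 50% effectiveness halves a 30° swing
```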
A real preview of finger tracking for 2D Vtubers, finally? 👀
@PavoStudio
's Live2DViewerEX just dropped hand tracking in their latest update. It works with a generic webcam, and no special equipment is required!
Working on Live2D hair physics most of the time is like:
"The hair looks bad but I don't know why"
*flips the numbers*
"The hair looks great but I don't know how"
[Live2D Protips]
Your client wants their model with readable ArtMesh IDs and you can't imagine going through 500+ artmeshes to rename them manually? Me neither!
Here's how you can use the power of a spreadsheet to quickly rename all the artmeshes in your project.
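The spreadsheet part is essentially a fill-down formula that turns each raw layer name into a clean, readable ID. As a sketch of that step (the naming scheme here is just an example, not the one from the attached guide):

```javascript
// Turn a raw layer name into a readable ArtMesh ID with a consistent
// prefix, like a spreadsheet fill-down formula would.
function toArtMeshId(layerName, index) {
  const slug = layerName
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "_") // spaces and symbols become underscores
    .replace(/^_+|_+$/g, "");    // trim stray underscores at the ends
  return `ArtMesh_${slug}_${index}`;
}

["Front Hair L", "Front Hair R"].map(toArtMeshId);
// -> ["ArtMesh_front_hair_l_0", "ArtMesh_front_hair_r_1"]
```

Generate the whole column at once, then paste the result back over the IDs in the Editor.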
A little PSA: if you are selling or giving away your Live2D project file, please be aware that you are also giving access to the whole character PSD, even if you don't include it in your package.
Anyone can export the whole character PSD with just a few clicks from the Editor.
2D VTubers can now count! ✌
Yes, this all works with only a webcam! The tracking is not perfect in most cases, but gesture triggers can be 'handy' to save you from hotkeys!
Live2D riggers, brace yourselves, you're gonna be getting more hands requests from now on ✨
#Live2D
#VTubeStudio
yooo, a Live2D update that actually slaps
ever wished you could use Path Deform on a warp deformer?? now you can! along with other transformation tools like distort, perspective, etc.
Hey guys, y'all can now make high-definition real-time model showcases with
#VTubeStudio
and OBS, no green screen and all that jazz
Denchi wanted me to test the new beta feature for multi-instance iPhone tracking, so here it is! Working out nicely ✨
PSA: Procreate PSD files are no good for Live2D. For some reason, Procreate exports PSD layers at the size of the whole canvas instead of the layer content. This may affect riggers' work in Live2D, see below:
Alter Yui
@YuikaiChan
wants to have custom assets that she can put between her fingers, so I gave her the power to hold any custom PNG.
No plugins needed, just
#VTubeStudio
!
If you are using a 4K/1440p monitor with the recent
#Live2D
5.0 beta, you may want to turn on a simple Windows setting to improve the Editor's performance 🤔
I confirmed reports that 5.0 seems sluggish in HiDPI environments, and changing this setting literally gives a 2x boost.
You know what I love about VTube Studio? I can ask "hey, ain't this thing cool to have" and denchi's just like "say no more"
Motion recording is now in the VTS beta! No more need for third-party tools, plus it comes with a nifty timeline for analyzing motions
Not sure who this advice is for, but it's worth considering anw: sometimes I lower the upper eyelid when the eye is looking down, just like how it works in real life. This whole eye and brow movement is controlled by the pupil alone.
#Live2D
Looking back, my
#Live2DJourney
started almost a decade ago.
I want to close this year by thanking everyone: my clients, followers, and the Live2D/Vtuber community, for supporting my career!
The latest update of
#VtubeStudio
is gonna bring your neon-themed Vtuber to a whole new level!
Here's how to mark your model's glowing parts to be ignored by VTS reactive lighting 💡
#Live2D
[Live2D Protips]
This is my workflow for doing head turns.
After I'm done with the up and down views, I copy the side view as the base to work on the side up/down views, then I reflect them. I haven't had to synthesize corners since.
[Live2D Note]
When you are doing atlas work, don't leave a mesh extending beyond the texture atlas boundary. You might run into a texture 'warping' issue that is only visible at runtime, not in the Editor.
[PSA for
#Live2D
5.0 users]
Don't apply Skinning to meshes with blendshapes, because it can break your project ⚠
Multiple cases have been confirmed where people can't delete their blendshapes or, in the worst case, can't export the model at all (I can't export this demo either)
The new Google MediaPipe webcam tracker in the
#VTubeStudio
Beta is promising!
This is RTX-tracker quality that is going to be available to non-RTX users 🤯
Eye tracking is spot on, and iOS inputs like Mouth X are going to be supported as well
It's been 6 months since
#Live2D
blendshapes were introduced, how much have they affected your workflow?
What's changed for me is doing the brow forms: rig the brow to raise/lower once, then the blendshape takes care of the other forms! It isn't much, but it feels more efficient.
There are two ways to drive Body Z (body lean sideways):
- FaceAngleZ (head tilt, default setting)
- FacePosX (head position)
Each input has different characteristics and may give a different vibe to the model. Which one do you use?
#VtubeStudio
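For illustration, both options boil down to a linear mapping from the tracking input onto the Body Z range. The ranges below are assumptions for this sketch, not VTube Studio defaults:

```javascript
// Two ways to drive Body Z, as linear mappings. Assumed ranges:
// FaceAngleZ in degrees (-30..30), FacePosX normalized (-1..1),
// BodyZ normalized (-10..10).
function bodyZFromAngleZ(faceAngleZ) {
  return (faceAngleZ / 30) * 10; // head tilt drives the lean
}

function bodyZFromPosX(facePosX) {
  return facePosX * 10; // sideways head position drives the lean
}
```

The difference in "vibe" comes from the gesture itself: tilt-driven leaning follows the head angle, while position-driven leaning follows how far the streamer physically moves in frame.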
Here's an idea for Live2D youtubers:
a Live2D tutorial series for the Live2D Free version.
Working with the free version teaches you good Live2D fundamentals and sets a lower scope for beginners.
Forget the "meta" and gimmicks; learning to rig a basic model should be encouraged more.
I'm always fascinated by the things Live2D physics can do. All of a sudden I can just make her bust a move without doing any manual keyframing 💃
Model:
@Tiiruki_
Artist:
@Dat_YunYang
#Live2D
I've been using Photoshop for ages, yet I've been missing out on Layer Comps? 🤯
This would've saved me a lot of time reviewing Live2D PSDs with lots of toggles!! It saves your layer states as convenient presets, so no more hiding layers back and forth
The magic is in the details✨
When I saw the flow of the hair, I thought I could add some volumetric look to it. The rotation is done at the mesh level, stacked with the good ol' skinning
#Live2D
[Live2D Protips]
If you use 'Skinning by deform path', you can control the number of generated rotation deformers by laying out the deform path points correctly.
Just keep in mind that each path point gives you 2 rotations, and the last point gives you 1 extra rotation.
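That rule gives a quick formula for planning your path points:

```javascript
// The rule above as a formula: every deform path point yields 2 rotation
// deformers, and the last point yields 1 more.
function rotationDeformers(pathPoints) {
  return pathPoints * 2 + 1;
}

rotationDeformers(3); // -> 7 rotation deformers from a 3-point path
```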
I was wishing for a model debugger panel in VTS because I needed to inspect my model, e.g. to find out how high I should set my physics input based on the VTS tracker.
Thankfully, the new VTS API makes it possible for anyone to add such a function!
#VtubeStudio
#Live2D
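For the curious, such an inspector plugin talks to VTS over a WebSocket with JSON messages. Here's a minimal sketch of the request that lists the model's Live2D parameters; double-check the exact field names against the official VTS API docs before relying on them:

```javascript
// Build a VTube Studio API request listing the current model's Live2D
// params. Message shape follows the public VTS API (JSON over WebSocket,
// default ws://localhost:8001); verify field names in the official docs.
function buildParamListRequest(requestID = "inspect-1") {
  return JSON.stringify({
    apiName: "VTubeStudioPublicAPI",
    apiVersion: "1.0",
    requestID,
    messageType: "Live2DParameterListRequest",
  });
}

// Usage (browser, or Node with a WebSocket library):
// const ws = new WebSocket("ws://localhost:8001");
// ws.onopen = () => ws.send(buildParamListRequest());
```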
Live2D Protips
#1
!
This is a redone quick tutorial for making a perfect mirrored copy of objects in Live2D. Sometimes you don't really want to rig two objects that are completely symmetrical, so here are some time-saving tips.
(character courtesy of @/FancyJasper)
From the ongoing
#Live2D_Alive2021
event, there's a teaser of the upcoming 4.2 update, but what stands out is that we're finally getting blendshape parameters and symmetrical mesh editing!!!!
[Live2D Note]
I tested 3 tracking apps on how they handle expressions (exp3.json) with tracked parameters like Angle XY.
Animaze tries to blend in the expression, VTS simply locks the model to your exp, and Prpr only works for a split second, taking tracker input as priority.
[Live2D Note]
When you use the Breathing parameter as a physics input, there might be a slight difference between what you see in the Physics Viewer vs VTube Studio. VTS Auto-Breathing is significantly slower, so depending on your setup it might result in slower or jittery physics output.
[Live2D Note]
Blend Shape (not to be confused with the 3D 'blendshape') is one Live2D function that rarely gets talked about.
It's a substitute for the Paste Shape function: it blends the original shape and the pasted shape, akin to previewing a state between two keyforms without making keyforms.
Welcome! I am an illustrator and Live2D animator. My specialty is making Facerig 2D avatars for streamers and virtual performers. You can see some of my published works below:
Check my website for commission info! (link in my profile)
These overlapping booba gave me the revelation that "Group by Draw Order" makes draw order only affect meshes within that group 🤯
I'm just 6 years late in realizing that
My first impressions of Nizima Live, the official Live2D Vtuber app. I heard this app is Japan-exclusive so some folks can't download it; I went through the hassle so you don't have to.
All this time I've been merging animation scenes by importing motion3 files or copy-pasting keys... until I learned there's an Import Scenes function hidden under this weird drag-n-drop mechanism?!
Come on
@Live2D
how hard is it to add an "Import scenes from another can3" menu?? 🤯
well I'm late to this, but I just want to take the opportunity to say:
artists, please separate your booba layers. believe me, it will be worth it
#いいおっぱいの日
#Live2D
Posting here as a note to myself:
You can batch edit multiple Live2D parameters (rename, change values, enable Repeat/Blendshape) from the Parameter Settings window
A little demo of using physics for dynamic tail movement. It came out of a little experimentation that turned out to look nice; no need to animate it manually, at least!
Model:
@Chandravtuber
Art:
@kinKaikii
#Live2D
#VTubeStudio
Got a full gamepad-button
#Live2D
rig working!
Nyarupad might be my most favorite VTS plugin out there. It's lightweight and has no config screen; fire up the app and it just works, like magic. I hope there are more VTS plugins like it.
Live2D Vtuber face-tracker showdown! With Facerig, Animaze (Beta), and Prprlive (with the GameAnimoji DLC), featuring @/_PixieWillow's avatar for testing.
I ran these three at the same time, with default calibration, the same hardware, and the same lighting conditions.
Trivia time!
Live2D's first release was a fully vector-based program.
It even had a pen tool so you could draw right in the Editor; PSD support wasn't available back then.
The ArtPath tool may be a leftover from this.
I'd like to ask from a new perspective: how do we convince more game devs to use Live2D?
The Live2D career path shouldn't always revolve around Vtubers. I imagine some types of games, like RPGs or VNs, would totally benefit from having Live2D; some even make it their selling point.
295 voted "I'm ok", 2236 voted "I'm struggling": 88% of Vtuber model creators do not have a sustainable career in 2024. What are the root problems? How do you think things should be? "Raise prices" sounds like a straightforward answer, but does it actually help the 88%?
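For reference, the 88% figure checks out from the raw votes:

```javascript
// Checking the quoted poll numbers.
const ok = 295;
const struggling = 2236;
const pct = Math.round((struggling / (ok + struggling)) * 100);
// pct -> 88, matching the "88%" in the post
```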