Does anyone use Topaz Labs?
I see so many credits on ai videos, but I don't think I've seen many list them in their workflow.
Seems like a great product though.
I started with a single prompt to test Luma's text-to-video capabilities… then I kept going.
Proud to present:
GRIZZLY CITY
What I used to make it:
@LumaLabsAI
@midjourney
V6
@elevenlabs
and FCX to put it all together.
Out of the 36 shots that make up this trailer:
21 are
I shot this photo of 50 Cent years ago for a magazine.
Today I threw it into
@runwayml
without a prompt to see what happens.
I think we got a new G-Unit member.
Yesterday I was hired by a global Ad agency to create a 30-60 second ai film for a pitch to a potential client they're going after.
Last night I finished the script and this morning I finalized the shot list.
Now it's time to start generating the images.
@EHuanglu
Imagine if Nikon came out in the 90s and said "we are sticking to film and we will never get into digital photography because real photography involves film."
I just did a comparison using my own traditional photography (non-ai) with Kling and
@runwayml
Gen3 (image>video)
The results are quite different.
Runway keeps the details on the product and interpreted the prompt much better.
More examples below.
Some ai short films are just way too long. There, I said it.
As creators we get emotionally attached to every shot and end up dragging out those beautiful cinematic shots.
The issue is that most people's attention spans are short and cinematic beauty shots are only going to go
@dustinhollywood
respectfully can't agree here bro. I've used them all and imho MJ has the best range, quality, and ease of use (for what I like to do, which is photo based). I feel its training is superior on that front to any other. But to each their own.
This is a polaroid I shot of Gerard and Mikey Way of My Chemical Romance after running it through
@runwayml
with no prompt.
Original Polaroid in the comments.
I was invited to show my AI film at Snow Birds, a KDR group Art Show in Miami.
Excited and grateful to be showing among so many dope af artists.
Being peppered all night with questions about ai, from people who had no idea about ai, was also fun!
Will share FLOUR in my
We will see ai take over advertising and commercials before we see it take over Hollywood.
So whatโs going on behind the scenes irl with AI?
The current state of the Ad industry…
All year, Ad agencies have been hiring consultants and actively searching for AI artists to
At
@Tektite_AI_Film
,
@HaukeHilberg
and I have dedicated countless hours to fine-tuning and experimenting with our workflows. The immense effort of the past months has culminated in this important project:
Introducing โThe Revivedโ at the
#Olympic2024
for
@ukraine_ua
together
Gen AI + Real World Solutions for the W!
So I've been hired by another agency to create 3 highly specific tennis-themed campaign photos for a major alcohol brand.
I got the first image pretty quickly, except for the female model's back arm and hand holding the racket.
I spent all
@JeanPaulBande
@indeed
@runwayml
@midjourney
@elevenlabsio
From start to finish, 2 nights (so about 12hrs)
What took the longest was the empty chair rotating and the falling paper scene, because I was being really picky about what I wanted them to look like.
Here's another Kling and
@runwayml
example with the prompt I used.
[a static tripod shot of a hand squeezing a lemon over a plate of tacos resting on a picnic table. A small amount of Lemon juice droplets squirt and fall from the lemon and onto the tacos. Camera stays still.]
@MarcAndre_V
@indeed
@runwayml
@midjourney
@elevenlabsio
You've completely missed the point. Filming this Ad would have required an entire production: a crew of people and a giant budget.
I created this sitting in a chair by myself.
Imo that's pretty special.
Hope y'all are having a good day!
Today I wanted to test Hyperspeed.
Here's the prompt:
Continuous hyperspeed FPV footage transitioning from a teenager's bedroom to a house party filled with rowdy teenagers, then transitioning to a backyard party filled with hundreds of epic party
@bradcumbers
We've entered the world of one-man production companies.
What used to take pre-production, casting, shooting, post-production and more can now be done from a chair in an office.
It's the craziest sht ever tbh.
Tested
@LumaLabsAI
key frames.
For my end frame, I took a slice of the street view and used Firefly to expand the image to get a clean landscape image of the scene in front of the couple.
I was hoping to get the camera pushing in past our couple and through the windshield to
Once my images are all generated, I load them all up on Runway (unlimited plan) and start running them feverishly. I don't watch them as they finish though. I wait.
I wait until I have about 20-30 generations, then like an excited little kid opening presents on Xmas, I watch
@Pizza_Later
tbh for anything online or social I make, I almost always aim for a minute (if possible). It's a starting goal I set to keep me tight.
60% of the time it works every time.
One of the main issues I have with all the AI short films and videos I've been seeing is the audio.
Don't get me wrong, the videos are great, but when the audio is off or not balanced within the film, it's all I can focus on.
Also,
@elevenlabsio
is an amazing game
If you think about it, our journey to AI started the moment we invented a way to capture life on film.
Recording our lives through to the digital age, and eventually loading all of that into computers to train them.
@CharaspowerAI
Is the model outputting the actual products we see here? Or are you training a model on specific products?
I've used
@letz_ai
to train specific products and it worked well.
Loving this Luma key frame transition.
I took two photos and dropped them into
@LumaLabsAI
without a prompt and with enhance prompt turned off, to let it do its thing.
Then I kept extending the scene over and over for one long continuous shot.
Only bad thing is that every generation
@kimmonismus
Solid visuals. The fact that a commercial like this was made in 24hrs with the tools we have now is exciting af. And it's just getting better. Soon we'll have more control, believable acting, higher resolution and more. Gonna be wild!
@nftlisa
I produce content for brands. I mostly use MJ+Runway+Magnific to help build image libraries for these brands, but I've also trained a model on a specific product to be able to generate photos for a client every month with ease.