Apr 5, 2023
A lot has been written about AI over the past few weeks, and there is little doubt it will impact every part of life. But what about the impact on the creative industry? This could be the most significant thing to hit our industry since the launch of the Apple Mac and the birth of the internet – perhaps even bigger than both combined. We chatted with friend and director Paul Dixon to get his take.
Paul: I’m seeing the fake images – the Trump arrest photos, some Boris ones, and Macron amid the Paris riots. Suddenly photorealistic people are possible. They used to be not quite right – the hands were weird or whatever – but now they are really convincing.
What interests me, especially in 3D, is making CG not feel like CG. I had loads of visuals in my head that I wanted to make, and now those ideas are kind of ruined. So for the first time I kind of wish AI wasn’t here. Can I now build some of the 3D stuff I was thinking about previously in Midjourney? The answer is yes, in about 4 seconds!
Midjourney is just stills at the moment – I’m sure they are working on an animation/video app with the same simple word prompts, but for now it doesn’t exist, so I’m using Stable Diffusion – specifically Disco Diffusion 5.2. It’s a wrestle with code, word prompts, crashes, and running out of credits, but it can create some really beautiful work.
I have been experimenting – I like to throw myself in and see what happens. Because of the way my brain works, I’m getting bored of Midjourney, so I started creating objects and using Luma Labs to capture scenes on the phone and turn them into a lit 3D scene. It’s crazy! It works out the lighting and stitches the images together with AI to form a really good scene. A few companies have started using this – McDonald’s just did an ad with it.
I’m uploading footage I have shot and using AI to augment it. I use notebooks to have the AI affect the image. Google assigns you a different GPU each time it processes, so it’s pot luck as to the speed, but in the notebook you can add parameters and word prompts to change the code and affect the look. There are hundreds of settings you can change, and it’s a bit of a trial-and-error process, but the AI learns where the edge of the subject is and you can add effects.
This type of work would take forever to do traditionally and days to render out, but with Google Colab it takes seconds to adjust things and just a few hours to render using their cloud GPUs.
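The trial-and-error pass over notebook settings that Paul describes can be sketched as a simple parameter sweep. This is an illustrative plain-Python stand-in: the setting names (strength, guidance_scale, seed) are typical diffusion-notebook parameters, not the exact controls of the notebook he used.

```python
from itertools import product

# Word prompt from the interview; the setting names below are
# illustrative stand-ins for the "hundreds of settings" a
# diffusion notebook exposes, not Paul's exact notebook.
PROMPT = "ferrous fluid, swirls, liquid"

def setting_grid(strengths, guidance_scales, seeds):
    """Yield one settings dict per trial-and-error combination."""
    for strength, guidance, seed in product(strengths, guidance_scales, seeds):
        yield {
            "prompt": PROMPT,
            "strength": strength,        # how far from the source frame to drift
            "guidance_scale": guidance,  # how hard to follow the word prompt
            "seed": seed,                # fixes the noise for repeatable renders
        }

runs = list(setting_grid([0.4, 0.6], [7.5, 12.0], [0, 1, 2]))
print(len(runs))  # 12 candidate renders to eyeball and narrow down
```

Sweeping a small grid like this and eyeballing the results is one way to tame the trial-and-error process: each dict is one render attempt, and the seed keeps a promising look reproducible.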
With the Jujitsu footage I shot, rather than fighting the image, I wanted it to flow – Jujitsu is all about flow, the power of one to another. What’s interesting is that if you were doing this with traditional FX, you wouldn’t think of this. It throws up stuff you just wouldn’t have thought of.
I used these prompts for the footage – ferrous fluid, swirls, liquid – and this is the effect you get, and I can save the notebook for future reference. The next steps are to change the camera coordinates, go back in and play with masking, drop the footage back over the top, mask bits out, and get rid of the stray stuff.
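Saving a notebook "for future reference" amounts to snapshotting the prompt and parameters that produced a look. A minimal sketch, assuming a plain JSON file and illustrative parameter names (not the notebook's actual fields):

```python
import json

# Hypothetical snapshot of the settings behind the "ferrous fluid" look;
# the parameter names are illustrative, not the notebook's actual fields.
settings = {
    "prompt": "ferrous fluid, swirls, liquid",
    "strength": 0.55,
    "guidance_scale": 9.0,
    "seed": 1234,
}

def save_settings(settings, path):
    """Write a settings snapshot so the look can be reproduced later."""
    with open(path, "w") as f:
        json.dump(settings, f, indent=2)

def load_settings(path):
    """Reload a saved snapshot to re-run or tweak a previous render."""
    with open(path) as f:
        return json.load(f)

save_settings(settings, "ferrous_fluid.json")
assert load_settings("ferrous_fluid.json") == settings
```

Keeping the seed alongside the prompt is the part that matters: without it, re-running the same words can produce a completely different image.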
I think the people who are writing these notebooks can tailor the code to exactly what they want. I was doing a thing for Bare Knuckles (BK) and wanted a logo. The AI initially couldn’t recognise the letters ‘B’ and ‘K’, but now it does, so that’s interesting from a brand identity POV.
There is obviously always a long catch-up time before this becomes the industry-norm way of working, but in terms of concepting frames for something like a gaming job – I have already seen a company that offers exactly that, done through AI alone.
Day by day it rockets; we can’t keep up with it, and even tech leaders are calling for a pause. Good ideas will rise to the top, and knowing what looks good and what doesn’t will be key. But no doubt it’s going to devastate the industry completely and totally change everything! Moore’s law and its exponential curve spring to mind (especially this week) – Gordon Moore was a genius, only late on some predictions by a decade or so, and you can see how it can run away.
See more of Paul’s work here