Arvid Tappert - Generative AI Experiments

ANIMATED SHORT


The Odd Birds Kingdom: The Royal Wedding




Join the Odd Birds as they celebrate their quirky traditions in "The Odd Birds Kingdom: The Royal Wedding".

This short is based on my Odd Birds AI model, trained on my work. I use it to explore new workflows for creating unique Gen AI animations with traditional animation tools.

This time I'm going all in on consistent AI animation and stop-motion aesthetics, combining all my previous workflows.


Process: the 'Odd Birds' AI model serves as the base for texturing my drawings with Krita AI Diffusion, while Adobe Character Animator handles the animation.


Clips and experiments



CHARACTER DEVELOPMENT


Using my Odd Birds LoRA and Adobe Character Animator to bring the birds to life


Stable Video 3D > high-res fix in Krita AI

CREATING THE ODD BIRDS AI MODEL



For me, personalizing the AI output is becoming key, so I created and trained the “Odd Birds” AI model using my own images as a base.

For the “technical stuff” in the video, check out this tutorial by Olivio Sarikas on how to create your own LoRA models based on your work: LORA: Install Guide and Super-High Quality Training

Goal: To create a unique AI model that I can use for further experiments, ensuring consistency in style and animation.

Process:
1. Created images in Blender.
2. Trained the "Odd Birds" LoRA model with Kohya_ss.
3. Prompted and generated images in ComfyUI (see the sketch below).
4. Added movement using AnimateDiff and Runway.
5. Enhanced resolution with Topaz.
6. Completed with compositing in After Effects.
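
To make steps 2–3 concrete: a minimal sketch of generating with the trained LoRA, written in Python with the diffusers library rather than the ComfyUI graph I actually use; the base model, file path, trigger word, and prompt are placeholder assumptions.

```python
import torch
from diffusers import StableDiffusionPipeline

# Base model the LoRA was trained on (assumed here to be SD 1.5).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Load the custom "Odd Birds" LoRA on top of the base model.
pipe.load_lora_weights("loras/odd_birds.safetensors")  # placeholder path

# The trigger word should match the token used during training.
image = pipe("oddbirds, a bird at a royal wedding, stop-motion style").images[0]
image.save("odd_bird.png")
```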

LIVE AI PAINTING





This time, I'm experimenting with live AI painting in Krita: I draw basic shapes, and the AI Diffusion tool textures these drawings using the model I trained on my own images.

Goal: Learn the tool and achieve control over the results before beginning to draw and texture each frame for the animations.

AI and CLAY




Developing generative AI models trained only on textures opens up a multitude of possibilities for texturing drawings and animations. This workflow provides a lot of control over the output, allowing for the adjustment and mixing of textures/models with fine control in the Krita AI app.

Goal: To get even more control over the texturing process. My plan is to create more models and expand the texture library with additions like wool, cotton, fabric, etc., and develop an "AI shader editor" inside Krita.

Process:
Step 1: Render clay textures from Blender
Step 2: Train AI clay models in Kohya_ss
Step 3: Add the clay models to the Krita AI app
Step 4: Adjust and mix the clay textures with fine control (see the sketch below)
Step 5: Draw and create claymation
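
A minimal sketch of what that mixing could look like in code, assuming each texture (clay, wool, ...) is trained as its own LoRA; in practice I blend them with sliders in Krita AI Diffusion, and the adapter names, file paths, and weights below are placeholders.

```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Register each texture LoRA under its own adapter name
# (multi-adapter support requires the peft package).
pipe.load_lora_weights("loras/clay.safetensors", adapter_name="clay")
pipe.load_lora_weights("loras/wool.safetensors", adapter_name="wool")

# Blend the textures like a simple shader mix: 70% clay, 30% wool.
pipe.set_adapters(["clay", "wool"], adapter_weights=[0.7, 0.3])

image = pipe("a small bird character, textured").images[0]
image.save("clay_wool_bird.png")
```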

DRAWING FRAME BY FRAME




Experimenting with animation by manually drawing each frame of flowers growing in the app Krita. The 'Odd Birds' model serves as the basis for the texturing.

Goal: Achieve a consistent look, fine control, and a stop-motion aesthetic.
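
As a rough illustration of the texturing pass, here is a minimal frame-by-frame img2img loop with the diffusers library standing in for the interactive Krita AI Diffusion step; the file paths, prompt, strength, and seed are assumptions. Reusing the same seed at low strength is what keeps the texture consistent across frames.

```python
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from diffusers.utils import load_image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("loras/odd_birds.safetensors")  # placeholder path

for i in range(24):  # one textured image per hand-drawn frame
    drawing = load_image(f"frames/drawing_{i:03d}.png")
    textured = pipe(
        "oddbirds, flowers growing, stop-motion style",
        image=drawing,
        strength=0.45,  # low strength: keep the drawing, add texture
        generator=torch.Generator("cuda").manual_seed(42),  # same seed each frame
    ).images[0]
    textured.save(f"frames/textured_{i:03d}.png")
```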

WALK CYCLES




I wanted to see if the method also works for character animation while keeping the consistency, by drawing each frame of the character animation in the app Krita. This method merges traditional animation techniques with AI, creating a texture and style for the animation based on my work.

Goal: Currently, the animations serve as a quick proof of concept. However, I'm convinced that with smoother animation and more detail in the drawings, the result could be really cool. This technique requires more effort than other Gen AI video tools but offers significantly more control.

THE ERASE WORKFLOW





I developed a simple method for creating organic animations with Krita AI Diffusion: working 'backwards', erasing frame by frame and letting the AI model fill in the gaps.
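
A minimal sketch of the idea, with a diffusers inpainting pipeline standing in for Krita AI Diffusion's fill; the model, prompt, and the growing circular mask are placeholder assumptions. Each iteration erases a slightly larger region and lets the model repaint it, and playing the frames in reverse reads as organic growth.

```python
import torch
from diffusers import StableDiffusionInpaintPipeline
from diffusers.utils import load_image
from PIL import Image, ImageDraw

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
).to("cuda")

frame = load_image("frames/final_frame.png").resize((512, 512))

for i in range(12):
    # Erase a slightly larger circular region each frame (placeholder mask logic).
    mask = Image.new("L", frame.size, 0)
    r = 40 + i * 20
    ImageDraw.Draw(mask).ellipse((256 - r, 256 - r, 256 + r, 256 + r), fill=255)

    # Let the model fill the erased region, then feed the result forward.
    frame = pipe(
        "stop-motion clay scene",
        image=frame,
        mask_image=mask,
        generator=torch.Generator("cuda").manual_seed(42),
    ).images[0]
    frame.save(f"frames/erase_{i:03d}.png")
```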