Inpainting & outpainting: edit parts, not everything
Inpainting lets you mask a problem area and have the AI fix only that part. Outpainting extends the canvas beyond the original frame.
When to use this in your workflow
- Fixing specific areas of a good generation: Product shot almost perfect but the background has an artifact? Mask it, describe what should be there, regenerate only that region.
- Removing unwanted objects: Text, watermark artifacts, extra figures — mask them and prompt “clean background” or describe the replacement.
- Replacing backgrounds: Keep your subject, mask everything else, describe the new environment. Place products in different settings without regenerating from scratch.
- Extending the frame (outpainting): Need more headroom for the title-safe area? Outpainting generates new content matching the existing style beyond the original borders. Useful for reframing horizontal → vertical or adding space for graphics.
- When full regeneration is too risky: Once you have a generation that’s 90% right, full regeneration is a gamble. Inpainting lets you keep what works and fix what doesn’t.
Common inpainting subjects:
- Hands and fingers (the most common AI artifact)
- Faces that need subtle corrections
- Text or watermark artifacts
- Distracting background elements
- Objects that need swapping (different product, different prop)
How it works in modelBridge
Filter by the “Inpainting” category in model search. These models accept two inputs: your source image and a mask.
The mask workflow:
- Select your source image from the timeline or project bin
- Create a mask — a black-and-white image where white = area to regenerate, black = area to keep
- Upload the mask in the model’s interface
- Write a prompt describing what should appear in the masked area
- Generate — the AI fills only the white areas
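The mask format described above is simple enough to build programmatically. A minimal sketch using NumPy (the function name and rectangular region are illustrative; any image editor produces an equivalent black-and-white image):

```python
import numpy as np

def make_rect_mask(height, width, box):
    """Build a black-and-white inpainting mask.

    White (255) marks the region to regenerate; black (0) marks
    pixels to keep. `box` is (top, left, bottom, right) in pixels.
    """
    mask = np.zeros((height, width), dtype=np.uint8)  # all black = keep everything
    top, left, bottom, right = box
    mask[top:bottom, left:right] = 255                # white rectangle = regenerate
    return mask

# Example: 512x512 frame, fix an artifact in the bottom-right corner
mask = make_rect_mask(512, 512, box=(400, 400, 512, 512))
```

Save this array as a PNG and upload it wherever the model's interface asks for a mask.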
For outpainting, the mask extends beyond the original image borders. Some models have dedicated outpainting modes that handle canvas extension automatically.
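For models without a dedicated outpainting mode, the canvas extension can be done by hand: pad the image, then build a mask that is black over the original pixels and white over the new border. A minimal NumPy sketch (the function name is illustrative):

```python
import numpy as np

def outpaint_canvas(image, pad):
    """Extend the canvas by `pad` pixels on every side and build the
    matching outpainting mask: black (0) = keep original pixels,
    white (255) = new border area for the model to generate."""
    h, w, c = image.shape
    canvas = np.zeros((h + 2 * pad, w + 2 * pad, c), dtype=image.dtype)
    canvas[pad:pad + h, pad:pad + w] = image               # original frame, centred
    mask = np.full((h + 2 * pad, w + 2 * pad), 255, np.uint8)  # white = generate
    mask[pad:pad + h, pad:pad + w] = 0                     # black = keep
    return canvas, mask
```

Dedicated outpainting modes perform this padding step for you; this only shows what they are doing under the hood.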
Mask creation: You can create masks in Premiere’s graphics tools, Photoshop, or any image editor. Soft mask edges blend better than hard lines.
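The soft-edge advice can also be applied after the fact: blur a hard 0/255 mask so the boundary ramps gradually instead of cutting. A sketch using a separable box blur in NumPy (the function name and radius are illustrative; a Gaussian feather in Photoshop achieves the same thing):

```python
import numpy as np

def feather(mask, radius=8):
    """Soften a hard-edged 0/255 mask with a separable box blur so the
    inpainted region blends into its surroundings instead of seaming."""
    m = mask.astype(np.float32)
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    # Blur rows, then columns (separable box filter).
    m = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, m)
    m = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, m)
    return m.astype(np.uint8)
```

After feathering, the mask contains grey values near the boundary, which most inpainting models treat as a partial blend between kept and generated pixels.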
Recommended settings
- Denoising strength 0.3–0.5: Blends new content seamlessly with the existing image. This is the sweet spot for inpainting.
- Denoising strength 0.7+: Generates more dramatically different content — can look obviously pasted in.
- Start at 0.4 and adjust by 0.1 until the blend looks natural.
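The start-at-0.4 heuristic above amounts to a short schedule of strengths to try, clipped to the recommended range. A sketch in plain Python (the function name and range bounds are illustrative, not part of modelBridge):

```python
def strength_schedule(start=0.4, step=0.1, lo=0.3, hi=0.5):
    """Order of denoising strengths to try: the starting value first,
    then alternating -step/+step, keeping only values inside the
    recommended [lo, hi] sweet spot."""
    tried, out = set(), []
    for delta in (0, -step, +step, -2 * step, +2 * step):
        v = round(start + delta, 2)
        if lo <= v <= hi and v not in tried:
            tried.add(v)
            out.append(v)
    return out

# strength_schedule() → [0.4, 0.3, 0.5]
```

In practice you stop at the first strength where the blend looks natural rather than exhausting the list.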
Common mistakes
- Denoising strength too high. Above 0.6, the inpainted area starts looking visually disconnected from the rest of the image.
- Hard mask edges. Sharp boundaries between mask and non-mask areas create visible seams. Use soft/feathered edges.
- Vague prompts for the masked area. “Something nice” won’t work. Describe specifically what should fill the space.
- Using inpainting when you should regenerate. If more than 40% of the image needs fixing, a full regeneration is usually faster and better.
Quick answers
Do I need Photoshop to create masks? No. A simple black-and-white image works. Premiere’s graphics tools, Photoshop, or any image editor will do. White = change, black = keep.
Can I inpaint video, not just images? Some models support video inpainting, but most work on single frames. For video, you’d typically inpaint key frames and use other techniques for consistency.
What if the inpainted area doesn’t blend well? Lower your denoising strength to 0.3–0.5. Also make sure your mask has soft edges rather than hard lines.
Is inpainting cheaper than full regeneration? Usually about the same cost per generation, but you need fewer attempts because you’re only fixing one area. Net result: you spend less.
What’s the difference between inpainting and image-to-image? Image-to-image transforms the entire image. Inpainting only changes the masked area. Use inpainting when most of the image is already right.
In real projects
Wrong area in a generated clip: A 6-second video looks perfect except for one artifact in the corner. Instead of regenerating the full clip, select the problem area with an inpainting model in modelBridge. Fix only that region — the rest of the clip stays untouched.
Product shot, wrong background: The product looks right but the background is wrong. Mask the background, inpaint with a new environment description. Product stays exactly as generated.
Extend the frame: Your shot is composed too tightly for the cut you need. Use outpainting to extend the frame edges — more sky, wider environment, extra room for motion. The original content stays intact.