From moodboard to locked shot

TL;DR: Define what the shot needs to do. Scout fast with a cheap model. Generate short variations. Pick 2–3 candidates. Fix specific problems locally. Lock when it meets its purpose — not when it’s perfect.


Answer three questions before generating anything:

  • What does this shot need to do? (Hero moment, B-roll, texture, transition)
  • What’s the minimum acceptable quality for this use?
  • How many generations am I willing to spend?

No criteria = no way to stop. That’s how a 20-minute pass becomes a 4-hour one.
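Those three answers can live in a tiny brief that doubles as a stopping rule. A minimal sketch, not a modelBridge feature; the field names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class ShotBrief:
    purpose: str           # hero moment, B-roll, texture, transition
    min_quality: str       # minimum acceptable quality for this use
    max_generations: int   # how many generations you're willing to spend
    spent: int = 0         # generations used so far

    def can_generate(self) -> bool:
        """False once the budget is exhausted: the stopping rule."""
        return self.spent < self.max_generations

brief = ShotBrief(purpose="B-roll", min_quality="background-safe", max_generations=8)
brief.spent = 8
print(brief.can_generate())  # False: stop, re-scope, or consciously re-budget
```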


Your first prompt should be short and incomplete. Its job is to give you something to react to — not to be the final answer.

Include: subject, action, emotional register, one visual reference. Leave out: background details, multiple focal points, technical parameters.

“Aerial city at dusk, warm amber light, slow push in, cinematic” beats a paragraph describing the same thing.

If you’re stuck on the prompt, use the ✨ Enhance button next to the prompt field in modelBridge — it knows which model and category you’re using and tailors the enhancement accordingly. About $0.01 per use; requires an active license.


Your first pass is scouting. Don’t start with your most expensive model.

In modelBridge’s Browse tab, filter by generation category. Look for:

  • A lower cost badge above the Generate button
  • A shorter time estimate on the Generate button tooltip — appears after 3+ uses of that model
  • A model you’ve used before — learned constraints and time estimates already in place

Generate 3–5 seconds, not full duration. You’re evaluating direction, not quality.


Run 3–4 short variations before deciding anything. Change one thing between each — the seed, one prompt phrase, or a motion descriptor.
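The one-variable discipline is easy to enforce in code. A sketch that builds a seed sweep as argument dicts while holding everything else constant; field names like `seed` and `duration_seconds` are illustrative, not any specific model's schema:

```python
import copy

def seed_variations(base_args: dict, seeds: list[int]) -> list[dict]:
    """One run per seed, everything else held constant, so any difference
    between results is attributable to the seed alone."""
    runs = []
    for seed in seeds:
        args = copy.deepcopy(base_args)
        args["seed"] = seed
        runs.append(args)
    return runs

base = {
    "prompt": "Aerial city at dusk, warm amber light, slow push in, cinematic",
    "duration_seconds": 4,  # scout at 3-5 s, not full duration
}
for run in seed_variations(base, [11, 12, 13, 14]):
    print(run["seed"], run["duration_seconds"])
```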

Use Dual Mode to test two model families simultaneously. Both run in parallel, compare results in split preview before anything touches your timeline. Without modelBridge: two browser tabs, two downloads, two manual imports.

Start a new generation while the first runs. Switch models in the panel and click Generate — both process in the Active Generations Panel while you keep editing. Note: background generations don’t survive a Premiere restart.

In the preview panel, evaluate for composition, motion direction, and vibe. Not detail, not anatomy — those come later.


Make the call now, not after three more rounds.

Keep: Composition is salvageable. Motion direction is right. Closer to the brief than anything else.

Drop: Fundamental direction is wrong. Or it’s “almost interesting” in a way you can’t articulate — the most expensive category to keep iterating on.

Use Save to Project Bin in modelBridge to keep candidates accessible without touching the timeline.


Two minutes of diagnosis prevents hours of aimless iteration.

Be specific: “Motion too fast in the first 2 seconds” — not “it doesn’t feel right.”

Then check what your model actually exposes. Scroll the modelBridge panel. Click ⓘ on any field — 726 hand-written parameter explanations with recommended starting values. If the control doesn’t exist, your levers are prompt, source material, and model choice.


When one element is wrong, fix that element — not the whole shot.

In modelBridge:

  • Wrong area in frame → search “inpaint” in Browse, use an inpainting model to fix that region only, preview before importing
  • Needs more detail → search “upscal” in Browse, run through an upscaling model first

In Premiere:

  • Sharpness → Effects panel → Video Effects → Blur & Sharpen → Sharpen or Unsharp Mask
  • Style inconsistency → Window → Lumetri Color, apply same preset across all AI clips
  • Text problems → generate background plate only in modelBridge, add real copy in Window → Essential Graphics
  • Grain to unify shots → File → New → Adjustment Layer, add Video Effects → Noise & Grain → Add Grain on top track

Once a model rejects a media constraint — image too small, wrong aspect ratio, video too long — modelBridge’s self-learning validation system catches it before the API call on every future attempt. You never pay for the same constraint error twice.
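One way such a system can work is a per-model cache of learned constraints, checked before any paid call. A sketch under that assumption; the constraint names and structure are illustrative, not modelBridge internals:

```python
# Learned constraints per model, populated from past API rejections.
learned: dict[str, dict] = {}

def record_rejection(model: str, constraint: str, value) -> None:
    """Cache a constraint the API rejected, e.g. a maximum video length."""
    learned.setdefault(model, {})[constraint] = value

def precheck(model: str, media: dict) -> list[str]:
    """Return violations of learned constraints, before paying for the call."""
    errors = []
    limits = learned.get(model, {})
    if "max_duration" in limits and media.get("duration", 0) > limits["max_duration"]:
        errors.append(f"video too long (max {limits['max_duration']} s)")
    if "min_width" in limits and media.get("width", 0) < limits["min_width"]:
        errors.append(f"image too small (min width {limits['min_width']} px)")
    return errors

record_rejection("some-video-model", "max_duration", 10)
print(precheck("some-video-model", {"duration": 14, "width": 1920}))
# ['video too long (max 10 s)']
```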


You’re improving: Each generation is measurably closer to your criteria. You can say what changed and why it’s better.

You’re in AI thrash: Generations are different but not better. You’ve lost track of what the shot needs to do.

Rule: if three consecutive generations don’t move the shot forward — stop. Change something significant before generating again.
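The rule mechanizes cleanly if you score each generation against your criteria. A sketch with a hypothetical 0–10 score, not a modelBridge feature:

```python
def in_thrash(scores: list[float], window: int = 3) -> bool:
    """True when the last `window` generations failed to beat the best
    score that preceded them: different, but not better."""
    if len(scores) <= window:
        return False
    best_before = max(scores[:-window])
    return max(scores[-window:]) <= best_before

print(in_thrash([4, 6, 6, 5, 6]))  # True: three runs, none beat the earlier 6
print(in_thrash([4, 6, 6, 5, 7]))  # False: the last run moved the shot forward
```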


When you’ve chosen your candidate in the preview, click Import to Timeline. modelBridge routes automatically:

  • Source clip was selected → replaces it at exact timecode, track, and scale. Without modelBridge: delete source, import file, drag to exact timecode, scale manually.
  • No source clip selected → inserts at playhead on the best available track.
  • Audio or TTS result → routes to the first available audio track automatically.
  • First + End Frame → replaces both adjacent source clips as a single spanning clip.

After import, mark the shot locked in your project structure. Done — until a client revision changes the brief.


The cost badge above Generate updates in real time as you adjust parameters — duration, resolution, audio toggle. After generation, it flashes green with the actual fal.ai charge.
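The badge math is plain duration-times-rate arithmetic. A sketch with made-up rates; real pricing is per model and comes from fal.ai:

```python
def estimate_cost(duration_s: float, resolution: str, audio: bool,
                  rates: dict) -> float:
    """Recompute the badge value whenever a parameter changes."""
    cost = duration_s * rates["per_second"][resolution]
    if audio:
        cost += rates["audio_flat"]
    return round(cost, 4)

# Illustrative rates only; actual pricing is per model, from fal.ai.
rates = {"per_second": {"720p": 0.05, "1080p": 0.10}, "audio_flat": 0.02}
print(estimate_cost(5, "720p", False, rates))  # 0.25
print(estimate_cost(5, "1080p", True, rates))  # 0.52
```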

Tag generations to a client or project using the picker in the Generate tab. Every generation logs automatically to that client’s cost history. Export an HTML report from the Billing tab when the project wraps — KPIs, per-client breakdown, currency conversion included.


Agency pitch, 24-hour turnaround: Scout with a fast model at 3 seconds. Generate 3 seed variations. Use Dual Mode to test one alternative. Choose the winner. Refine once if needed. Lock. Total: 20 minutes of generation time.

Social campaign, 6 clips needed: Lock the first clip fully. Save the model, seed, and core prompt. Apply to clips 2–6 with minimal variation. Run as background generations while you keep editing.
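The save-and-reapply step can be sketched as a frozen recipe with one varying slot per clip. The model id, seed, and prompts below are placeholders:

```python
import copy

locked = {  # everything that made clip 1 work, frozen
    "model": "example/video-model",
    "seed": 1234,
    "prompt": "Handheld street scene, warm amber light, shallow focus, {subject}",
}

subjects = ["cafe exterior", "bike courier", "market stall",
            "neon crosswalk", "rain on awning"]  # clips 2-6

runs = []
for subject in subjects:
    run = copy.deepcopy(locked)
    run["prompt"] = run["prompt"].format(subject=subject)
    runs.append(run)

print(len(runs), runs[0]["prompt"])
```

Only the subject changes; model, seed, and the core prompt stay fixed, which is what keeps the six clips reading as one campaign.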

Documentary, archival fill: First clip establishes the look. Every subsequent clip reuses the same model, grain descriptors, and seed range. Consistency is a decision made at the start, not the end.


Common failure modes — what to do when a specific element isn’t working.

Building a signature look — how to use seeds, LoRAs, and prompt consistency to keep a visual identity across a full project.