Runway
Runway Aleph is V2V-only — it operates on existing footage and preserves the original camera and timing exactly. While its primary use is style transfer and reference-driven re-render, Aleph can also be used for extension by pairing the source clip with a continuation reference image and a directional prompt. The output picks up the source's motion vector while honoring the new reference. For an editor who needs a hero shot lengthened with a specific look (a brand seasonal pivot mid-clip), Aleph is the most controllable extend option.
On Martini's canvas, drop the source clip into a video reference node. Then place a separate image reference (photo, painting, or storyboard frame) that defines the look the extension should land on. Aleph uses the image reference to bias the new footage's style while preserving the source's motion timing.
Aleph takes the source as a timing anchor. Trim the source to the segment immediately before the extension target, then route it into the Aleph node along with the look reference. Aleph generates a continuation that picks up from the source's end frame while shifting toward the reference image's style.
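The trim step above is drag-and-drop on Martini's canvas; the sketch below only illustrates the timing arithmetic behind it. The function name and signature are hypothetical, not part of any Martini or Runway API.

```python
# Hypothetical sketch: picking the trim window that anchors an Aleph
# extension. All names here are illustrative; Martini's canvas does
# this visually rather than through code.

def trim_window(source_duration: float, anchor_len: float) -> tuple:
    """Return (in_point, out_point) in seconds for the segment
    immediately before the extension target, i.e. the source's tail."""
    if source_duration <= 0 or anchor_len <= 0:
        raise ValueError("durations must be positive")
    in_point = max(0.0, source_duration - anchor_len)
    return (in_point, source_duration)

# A 5.0 s source clip anchored on its last 2.0 s:
print(trim_window(5.0, 2.0))  # -> (3.0, 5.0)
```

If the requested anchor is longer than the clip, the window simply clamps to the whole source, which matches how a trim handle behaves in an editor.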
Aleph reads prompts as direction-giving rather than scene-defining, for example: "Continue the camera move forward, transition the scene from afternoon golden hour into early evening blue hour, character's wardrobe shifts to darker tones." Aleph respects the source motion exactly and applies the directional shifts gradually across the extension.
Aleph is the right pick when the extension is not just "more of the same" but introduces a creative shift: time-of-day transition, palette change, mood pivot. Pixverse Extend cannot do this — it only preserves. Wan can attempt it but gives looser control. For the seasonal mid-clip pivot or an emotional shift in a narrative extension, Aleph delivers the tightest creative output.
On the canvas, route the source and the Aleph extension into the sequence builder. Because Aleph applies directional shifts gradually, the cut between source and extension is invisible — the look transition happens across several frames inside the extension. No cross-fade needed.
When the extension does not land cleanly, swap the look reference image rather than rewriting the prompt. Aleph reads visual references more strongly than language. If the seasonal pivot is supposed to feel cooler, swap to a cooler-toned reference image; if warmer, swap to a warmer one. Each reference swap is a single Aleph re-render.
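The reference-swap loop described above can be sketched as a simple iteration: one Aleph re-render per candidate look image, stopping when the editor accepts a result. Everything here is a stand-in; Martini exposes this as a canvas action, not an API, and the render and accept callbacks are placeholders for the manual steps.

```python
# Hypothetical sketch of iterating on the look reference image rather
# than the prompt. Each swap costs exactly one re-render; the render
# and accept callables stand in for the actual canvas workflow.

from typing import Callable, List, Optional

def iterate_look(references: List[str],
                 render: Callable[[str], str],
                 accept: Callable[[str], bool]) -> Optional[str]:
    """Re-render the extension once per candidate reference and return
    the first result that lands, or None if none are accepted."""
    for ref in references:
        result = render(ref)   # one Aleph re-render per swap
        if accept(result):
            return result
    return None

# Pivot feels too warm -> walk toward cooler-toned references:
refs = ["golden_hour.jpg", "overcast.jpg", "blue_hour.jpg"]
chosen = iterate_look(refs,
                      render=lambda r: f"extension_from_{r}",
                      accept=lambda out: "blue" in out)
print(chosen)  # -> extension_from_blue_hour.jpg
```

Ordering the candidate references along the axis you are correcting (warm to cool here) keeps each swap a deliberate step rather than a random retry.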
Time-of-day pivot extension. Aleph applies the shift gradually across the extension, not as a hard cut.
Continue the camera move forward, transition the scene from afternoon golden hour into early evening blue hour, character wardrobe shifts to darker tones
Seasonal pivot. Useful for brand campaigns crossing seasonal boundaries in a single shot.
Continue forward dolly, the foliage shifts from autumn orange to winter snow, character's breath becomes visible
Lighting pivot for product hero shots. Aleph respects the rotation timing while changing the look.
Continue the slow turntable rotation, the lighting shifts from clean studio white to dramatic side rim, the product reveal intensifies
Aleph is V2V-only — feed it a source clip plus a look reference image, never text alone.
Use Aleph when the extension needs a creative pivot (time, palette, mood); use Pixverse Extend for pure preservation.
Iterate on the look reference image, not the prompt — Aleph reads visual references more strongly than language.
Aleph preserves original camera and timing exactly; the directional pivot happens across the extension gradually.
Stitching is invisible because the look transition is inside the extension — no cross-fade needed.
Runway Aleph outputs at the source's resolution and timing; because the model is built for V2V transformation, motion fidelity is exact. Render times run 90-180 seconds per extension. Aleph's strength here is creative-pivot extensions, which Pixverse Extend cannot attempt (it preserves the existing look only) and Wan handles only loosely. For a hero shot that needs lengthening with a seasonal or palette pivot, Aleph offers the cleanest creative control. For pure preservation, switch to Pixverse Extend; for budget, switch to Wan.
Connect Runway Aleph with other AI models on Martini's infinite canvas. No GPU required — start free.
ByteDance
Pixverse Extend is the specialist for one job and one job only: take a video clip and seamlessly continue it. It does not generate from text, it does not animate a still image, and it does not edit content — its single purpose is preserving original motion, style, and lighting in the continuation. For an editor who inherits a 5-second AI clip that needs to be 12 seconds, Pixverse drops onto the canvas as a downstream V2V node and extends the footage without re-prompting from scratch.
Alibaba
Wan 2.6 extends clips by chaining its general I2V mode off the source clip's last frame — a budget-friendly approach when Pixverse Extend is overkill or you want creative control over the extension. Drop the source into a frame-extraction tool node, pipe the last frame into Wan 2.6 with a continuation prompt, and the model generates new footage that picks up where the original left off. For high-volume production where credits matter and the extension can absorb a small style shift, Wan is the practical pick.
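Outside the canvas, the last-frame extraction that the frame-extraction node performs is commonly done with ffmpeg's `-sseof` option, which seeks from the end of the input. The sketch below only builds the command; filenames are placeholders, and actually running it requires ffmpeg on your PATH.

```python
# Building the ffmpeg command equivalent of the frame-extraction node:
# seek a fraction of a second before the end of the clip (-sseof) and
# write a single high-quality frame to hand off to Wan 2.6's I2V mode.

def last_frame_cmd(src: str, out: str) -> list:
    return [
        "ffmpeg",
        "-sseof", "-0.1",   # seek 0.1 s before the end of the input
        "-i", src,
        "-frames:v", "1",   # emit exactly one video frame
        "-q:v", "1",        # highest JPEG quality
        out,
    ]

cmd = last_frame_cmd("source_clip.mp4", "last_frame.jpg")
print(" ".join(cmd))
```

To execute it, pass the list to `subprocess.run(cmd, check=True)`; keeping the command as a list avoids shell-quoting issues with filenames.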