Source: Hyperframes GitHub repo, Hyperframes prompting guide

Hyperframes is HeyGen’s open-source video rendering framework (Apache 2.0, 2.6k stars) in which video compositions are written as plain HTML with data-* attributes and rendered to MP4 via Puppeteer + FFmpeg. It’s the composition layer — motion graphics, captions, transitions, TTS, and layout — designed to be authored by an AI agent. The framework ships installable skills (npx skills add heygen-com/hyperframes) that teach Claude Code, Cursor, Codex, and Gemini CLI the composition rules, so the agent gets HTML, timelines, and GSAP easing right on the first prompt.
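
A minimal sketch of the authoring model, assuming timings are in seconds: the element names, copy, and asset paths below are invented, while the data-start / data-duration / data-track-index attributes, the class="clip" marker for timed elements, and the muted-video convention come from the repo's documented rules.

    <!-- illustrative only: structure and values invented, attribute names per the docs -->
    <div class="scene">
      <!-- title on screen from 0s for 3s, track 0 -->
      <h1 class="clip" data-start="0" data-duration="3" data-track-index="0">Launch Day</h1>
      <!-- lower third enters at 3s for 4s, track 1 -->
      <div class="clip lower-third" data-start="3" data-duration="4" data-track-index="1">Ship video from a prompt</div>
      <!-- background video must be muted; narration goes in a separate <audio> element -->
      <video class="clip" data-start="0" data-duration="7" data-track-index="2" src="bg.mp4" muted></video>
      <audio data-start="0" data-duration="7" src="voiceover.wav"></audio>
    </div>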

Key Takeaways

  • Tagline: “Write HTML. Render video. Built for agents.” Compositions are plain HTML with data-start / data-duration / data-track-index attributes — no proprietary DSL, no React components.
  • Three slash commands installed by one command. npx skills add heygen-com/hyperframes registers /hyperframes (composition), /hyperframes-cli (init/lint/preview/render/transcribe/tts), and /gsap (animation API) into Claude Code / Cursor / Codex / Gemini CLI.
  • Non-interactive CLI by design. npx hyperframes init | preview | render is agent-friendly — no prompts, no TTY interaction, deterministic outputs (identical inputs → identical outputs) for automated pipelines.
  • Frame Adapter pattern. Supports multiple animation runtimes side-by-side: GSAP, Lottie, CSS animations, Three.js. One composition can mix them.
  • Six-package monorepo. hyperframes (CLI), @hyperframes/core (types/parsers/runtime), @hyperframes/engine (Puppeteer + FFmpeg capture), @hyperframes/producer (full pipeline), @hyperframes/studio (browser editor), @hyperframes/player (embeddable web component).
  • Two prompt shapes for agents. Cold start — describe duration, aspect ratio, mood, key elements from scratch. Warm start — feed a URL / PDF / CSV / transcript and ask for synthesis (e.g., “turn this changelog into a 30s release announcement”).
  • Natural-language easing vocabulary. smooth, snappy, bouncy, springy, dramatic, dreamy map to GSAP eases (power2.out, power4.out, back.out, elastic.out, expo.out, sine.inOut). Timing shorthand: fast 0.2s / medium 0.4s / slow 0.6s / cinematic 1–2s.
  • Caption tone presets. Hype, Corporate, Tutorial, Storytelling, Social — each defines typography, animation, and size ranges. Per-word styling is supported (“make brand names larger with accent color”).
  • Local Kokoro TTS, no API key. Voices bucketed by content type (product demo / tutorial / marketing). Requestable in natural language: “British male voice at 1.1× speed.”
  • Seven technical rules the skill enforces automatically. Timelines registered on window.__timelines; videos muted (audio in separate <audio> elements); no Math.random(); synchronous timeline construction (no async/await in GSAP setup); timed elements carry class="clip"; entrance animations on every scene; scene-to-scene transitions. (A sketch of these conventions follows this list.)
  • 50+ prebuilt blocks. Shader transitions, social overlays, data visualizations, and cinematic effects — added via npx hyperframes add, browsable at hyperframes.heygen.com/catalog.
  • Node.js ≥ 22 + FFmpeg. Those are the only runtime dependencies.
  • Apache 2.0. Forkable, self-hostable, commercial use permitted.
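
As a sketch of how those enforced conventions land in a composition: the selectors, copy, and timings below are invented, GSAP is assumed to be loaded globally, and the registration shape on window.__timelines is assumed to be an array (the guide only says to register timelines there).

    <script>
      // Synchronous construction: no async/await and no Math.random(), so identical
      // inputs produce identical frames.
      const tl = gsap.timeline();

      // Entrance animation for the scene; power2.out is one of the eases the guide names.
      tl.from(".title", { opacity: 0, y: 40, duration: 0.4, ease: "power2.out" });

      // Lower third enters at the 3s mark with a faster ease and "fast" 0.2s timing.
      tl.from(".lower-third", { x: -120, opacity: 0, duration: 0.2, ease: "power4.out" }, 3);

      // Register the timeline where the skill's rules say the engine expects to find it.
      window.__timelines = window.__timelines || [];
      window.__timelines.push(tl);
    </script>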

Why It Matters

  • Closes the agent-authoring gap for motion graphics. Generating motion graphics from prompts has historically required hand-rolling prompt patterns against a React-based framework. Hyperframes ships that as an official skill with enforced conventions — the agent gets the rules before it writes the first line.
  • HTML is a more portable target than React. Every team member can read a Hyperframes composition without understanding component state or JSX. Lowers the review bar for non-engineering collaborators.
  • Deterministic rendering is a scheduled-agent-friendly contract. Same HTML + assets → same MP4. That’s what you need to wire video rendering into a scheduled agent: no flaky replays, no “why is this frame different.” (A determinism check sketch follows this list.)
  • Non-interactive CLI fits the Claude Code / subagent model. A subagent can run npx hyperframes render without a human in the loop — contrast with tools that require clicking through a UI.
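
A sketch of how that contract can be checked, not a Hyperframes feature: render the same composition twice with the documented render command and compare hashes. Assumes the repo's claim of byte-identical output holds on your hardware.

    // determinism-check.js (Node.js) — illustrative only
    const { execSync } = require("node:child_process");
    const { createHash } = require("node:crypto");
    const { readFileSync } = require("node:fs");

    const hash = (path) => createHash("sha256").update(readFileSync(path)).digest("hex");

    // Render the same composition twice.
    execSync("npx hyperframes render --output a.mp4", { stdio: "inherit" });
    execSync("npx hyperframes render --output b.mp4", { stdio: "inherit" });

    console.log(hash("a.mp4") === hash("b.mp4") ? "deterministic" : "outputs differ");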

Implementation

Tool/Service: HeyGen Hyperframes (open source, github.com/heygen-com/hyperframes, Apache 2.0)

Setup (agent workflow, recommended):

  1. In the project directory, run:

    npx skills add heygen-com/hyperframes

    This installs the /hyperframes, /hyperframes-cli, and /gsap skills into Claude Code.

  2. Scaffold a new video project:

    npx hyperframes init my-video
    cd my-video

  3. Open the project in Claude Code. Prompt with the slash command explicitly — e.g.

    /hyperframes Create a 10-second product intro, 9:16 vertical, warm grain analog mood, fade-in title over a dark background, lower third at 0:03 with tagline, Kokoro af_heart voiceover reading the script.

  4. Preview live in a browser while iterating:

    npx hyperframes preview

  5. Iterate with small targeted prompts — “make the title 2x bigger,” “swap to dark mode,” “move the captions up.” Avoid re-specifying from scratch.

  6. Final render:

    npx hyperframes render --output final.mp4

Setup (manual, no agent): Steps 2, 4, 6 work standalone — but you write the HTML by hand. The agent path is the intended one.

Cost:

  • Hyperframes itself: free (Apache 2.0). ^[inferred — pricing not on the GitHub or prompting-guide pages]
  • Rendering compute: your local machine (Puppeteer + FFmpeg). ^[inferred]
  • Kokoro TTS: free, runs locally, no API key.

Integration notes:

  • Node.js ≥ 22 and FFmpeg are required. Expect a one-time install bump on older dev boxes.
  • Works with Claude Code, Cursor, and Gemini CLI per the repo; Codex support is noted in the prompting guide.
  • Deterministic rendering means compositions can be version-controlled and re-rendered from any commit — useful for asset refresh cycles.
  • The @hyperframes/player web component is the path for embedding a live-previewable composition in a browser-based review tool (see the sketch after this list). ^[inferred]
  • Plain-HTML output should be much easier for non-engineering reviewers to comment on line-by-line than Remotion’s React tree.
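
A minimal embedding sketch under that assumption; the tag name, attributes, and script path below are hypothetical, since the player package's actual API was not inventoried during this ingest.

    <!-- hypothetical: element name, attribute, and bundle path are guesses, not the verified @hyperframes/player API -->
    <script type="module" src="/node_modules/@hyperframes/player/dist/index.js"></script>
    <hyperframes-player src="./composition.html"></hyperframes-player>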

Prompting Patterns Worth Stealing

Applicable to any motion-graphics workflow even if you don’t adopt Hyperframes:

  • Natural-language easing vocabulary (smooth, snappy, bouncy, etc.) is a clean abstraction for briefing any motion tool; it works for Remotion and After Effects agents too (see the mapping sketch after this list).
  • Caption tone presets (Hype / Corporate / Tutorial / Storytelling / Social) are a useful axis for any content-tone rubric.
  • Energy-based transition picker (Calm / Medium / High) is a more intuitive control than naming specific transition types. Rules out slow crossfades that read as generic when you want “high energy.”
  • Audio-reactive intensity guidance — 3–6% for text, 10–30% for backgrounds — is concrete and testable. Worth cribbing for any audio-driven motion work.
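
One way to carry that vocabulary beyond Hyperframes is a plain lookup table. The word-to-ease pairs below come from pairing the guide's two lists in the order given (an assumption about the exact pairing); the table itself is a sketch, not something the framework ships.

    // Natural-language easing vocabulary -> GSAP ease strings (pairing assumed from the guide's ordering).
    const EASE_VOCAB = {
      smooth:   "power2.out",
      snappy:   "power4.out",
      bouncy:   "back.out",
      springy:  "elastic.out",
      dramatic: "expo.out",
      dreamy:   "sine.inOut",
    };

    // Timing shorthand in seconds; "cinematic" is given as a 1-2s range, midpoint used here.
    const TIMING = { fast: 0.2, medium: 0.4, slow: 0.6, cinematic: 1.5 };

    // Resolve a brief like "snappy, fast" into concrete GSAP values.
    gsap.to(".title", { y: 0, ease: EASE_VOCAB.snappy, duration: TIMING.fast });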

Update — May 2026 (V2 capabilities, per Jay / Robo Nuggets walkthrough)

Source: Claude + Hyperframes V2 automates video editing like never before — NEW Skill (Jay, Robo Nuggets, May 6 2026).

Hyperframes V2 ships four capability extensions the original prompting guide didn’t cover:

  • Subtitles — automatic subtitle layer with style presets matching the existing caption-tone vocabulary.
  • Floating card animations — composable card overlays with entrance + dwell + exit timings (similar to broadcast lower-thirds).
  • Smooth animations + motion graphics — applied automatically; the agent gets motion graphics as the default rather than as an opt-in element.
  • 3D assets — visually striking 3D elements composable into 2D HTML compositions.

Hyperframes runs across coding agents. Jay confirms the skill works equally well across Claude Code, Hermes Agent, and Open Claude — same SKILL.md, same primitives, swap the runtime. This validates the cross-runtime SKILL.md design pattern (one SKILL.md serving multiple agent runtimes), here applied by a third party.

Adoption signals. The HeyGen team has been “actively shipping a lot of updates”: Hyperframes for Hermes, the hyperframes.dev community website, and all of HeyGen’s own promotional videos now being made with Hyperframes. Robo Nuggets ships its own Hyperframes helper skill that pre-loads installation and best-use-case context for any agent.

Use-case framing from Jay: podcast intros, promotional announcement videos one-shot from a prompt, branded explainer assets — all via talking to the agent with no code review needed.

Try It

  1. In a throwaway project directory: npx skills add heygen-com/hyperframes and npx hyperframes init smoke-test. Confirm the three slash commands appear in Claude Code. Render a 5s greeting from a cold-start prompt.
  2. Try the warm-start pattern on any source: feed a blog post, ask for a 45s pitch video.
  3. Read the prompting guide in full once — the caption/tone and easing tables are the highest-value cribs even outside Hyperframes itself.

Open Questions

  • Is hyperframes.heygen.com a hosted service, or just marketing + docs? The repo implies local rendering only; a hosted tier would change the cost model.
  • Does the /hyperframes skill play nicely alongside other Claude Code skill ecosystems (design skills, wiki skills)? No known conflicts, but an install-time test is warranted.
  • What’s the render performance vs Remotion on the same hardware for a ~30s composition? Switching decision may hinge on this.
  • Catalog contents: what prebuilt blocks exist by default, and at what vertical coverage? Not inventoried during this ingest.