Source: “I Tested Higgsfield’s Official MCP In Claude” YouTube tutorial, ~10-min, youtube.com/watch?v=MMGVGA2DYro, fetched 2026-05-03. Auto-captions normalized for “Cowork” (caption: “Coowwork”), “Claude Code” (caption: “ClaudeCode” / “Cloud Code”), “Olipop” (caption: “Allipop”), “Sonnet” (caption: “Sonet”). Creator name not in transcript.

A practical operator tutorial for using Higgsfield’s official MCP from Claude Desktop to generate a 50-image Instagram ad campaign from a single product photo, with research → strategy → generation → delivery as the four phases. Sister tutorial to Robo Nuggets’ brand-launch walkthrough (same MCP, different use case — Robo Nuggets does brand book + storyboard + landing page; this one does ad-campaign at scale) and to Mike Futia’s DTC ad-agency workflow (Claude Code rather than Desktop, but same Higgsfield MCP backend). The most reusable lesson is the save-the-workflow-as-a-Skill move at the end: after one successful 50-ad run, /skill-creator packages the entire 4-phase workflow into a reusable /ad-creator skill that takes only a product image as input.

Key Takeaways

  • MCP setup is one-screen on Claude Desktop. Settings → Connectors → Add custom connector → name “Higgsfield” + Remote MCP Server URL higgsfield.ai/mcp → Connect → OAuth handshake (allows Claude to access identity, basic profile, email). Verification appears in chat after redirect. Distinct from the Robo Nuggets setup which used Antigravity IDE; this one uses Claude Desktop’s native Connectors UI directly.
  • Four-phase ad campaign workflow. Research → Strategy → Generation → Delivery. Single prompt sets all four phases at once and gates between them with approval steps. Author’s prompt explicitly instructed: (1) Playwright-driven Instagram research, (2) 50-ad matrix across three dimensions, (3) batch generation with progress shown, (4) organized delivery with AB-test recommendations + copy variations. Chain feeds Claude through each step with human approval gates at the matrix and pre-flight credit check.
  • Research uses Playwright MCP for live Meta Ads Library scraping. Author asked Claude to scrape Instagram ads matching the product category, identify 2026 trends, the top-5 creative angles proven to convert, and color compositions. Claude searched 260 Olipop ads (used as the demo product) from the Meta Ads Library and summarized the findings into a brief before moving to phase 2. Operator gotcha (line 188–212 in source): Claude initially reported “Playwright MCP not installed in this project” despite the user having it installed locally — they had to clone microsoft/playwright-mcp into the project folder, restart Claude Desktop, and re-prompt. Source’s lesson: always verify Claude’s tool-availability claims, because Claude Desktop only loads MCP tool schemas at session startup. If you accept the wrong assertion at face value, Claude silently falls back to web fetch + WebSearch and the research quality plummets.
  • Strategy phase = 50-ad matrix across 3 dimensions. Five creative angles × five settings/backgrounds per angle × two aspect ratios (1:1 + 9:16) + a mood mix (bright/airy, moody, premium, vibrant, playful). Matrix presented as a table for human approval before any generation tokens are spent. The matrix doubles as the audit log — source treats this approval gate as the most important defensive step before burning credits.
  • Generation phase routes to two specific Higgsfield models. Nano Banana 2 for product-accurate compositions (the actual product image must look like the actual product). Soul 2 (Higgsfield’s human-model model) for any ad calling for humans holding/using the product. Generated in batches with progress shown per batch. Each ad gets a short caption + three hashtag suggestions baked in.
  • Pre-flight credit check is mandatory. Before firing the 50-ad generation, Claude lists workspaces (Higgsfield supports multiple workspaces via the list_workspaces tool), then confirms the credit balance with the user. Author’s run consumed ~949 credits for the full 50. Operator implication: the MCP still exposes no per-call credit costs (the same gap flagged in the Robo Nuggets walkthrough), so the pre-flight list_workspaces + balance check is the manual workaround.
  • Generations land in the Higgsfield Community tab, not the main Image gallery. Operator gotcha (line 388–407 in source): the author was confused why the Image gallery only showed 8 generations after running 50. Generations from MCP land in Community tab → My Generations, not the main Image tab. The main Image gallery shows only generations from the Higgsfield desktop UI itself. Important to know if you’re trying to retrieve, organize, or share the output later.
  • Delivery phase = local download via Claude. Rather than manually downloading 50 images from the Higgsfield Community tab, prompt: “Download all 50 generations in separated batch folders into this project folder.” Claude wrote them as output/batch-1/ through batch-5/ with 1x1/ and 9x16/ subfolders inside each. Distinct from the Robo Nuggets brand-launch tutorial where outputs stayed in Higgsfield because the demo ended at the storyboard phase.
  • The decisive move — save as a /ad-creator skill via /skill-creator. After one successful run, the entire 4-phase workflow + every prompt detail + every approval gate gets packaged: /skill-creator → “Save this workflow as a skill that I can run anytime, name it ad-creator.” Refresh Claude Desktop → /ad-creator is now a single-step command that takes only a product image as input. This is the operator-track use of the multi-step-workflow-to-skill primitive — a one-time interactive build pays for itself the moment you have a second product to advertise. Pairs with the Skill Creator pattern (the meta-skill that builds every other skill).
  • Workflow can be extended further but author chose not to. Author explicitly noted: “You can also set up an automation to connect Claude to your ads manager, whatever platform that you’re using. I’m not going to go over that today because I’m not super familiar with pushing ads.” Natural next step: pair /ad-creator with Meta Ads CLI for a fully automated upload + schedule pipeline.
  • Spec-ad ethics flag. Author used Olipop as a hypothetical product; Claude raised IP concerns and asked for confirmation it was a spec-ad exercise. Operator note: when generating ads for a brand you don’t represent, label them “spec ad” in deliverables and don’t run them as live ads. Aligns with banned AI patterns for WEO Marketly internal use.
  • Settings used for the run. Project folder mounted (so Claude can write outputs to disk). Edit permissions: bypass-permissions to avoid waiting on per-step approvals (author’s stated preference for workflow setups). Model: Opus 4.7 on extra-high effort. Aligns with Opus 4.7 best practices (xhigh default for Pro/Max post-W15).
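The delivery-phase folder layout described above can be sketched in a few lines. The batch-1 through batch-5 / 1x1 / 9x16 names follow the tutorial's convention; the function name and default batch count are illustrative, not from the source:

```python
from pathlib import Path

def make_delivery_tree(root: str = "output", batches: int = 5) -> list[Path]:
    """Create the batch-N/{1x1,9x16} layout used by the delivery phase."""
    created = []
    for b in range(1, batches + 1):
        for ratio in ("1x1", "9x16"):
            d = Path(root) / f"batch-{b}" / ratio
            d.mkdir(parents=True, exist_ok=True)
            created.append(d)
    return created

# 5 batches x 2 aspect-ratio subfolders = 10 directories
dirs = make_delivery_tree("output")
print(len(dirs))  # 10
```

With 50 images split evenly, each of the ten leaf folders ends up holding five ads of one aspect ratio.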

The 50-ad matrix (3 dimensions)

| Dimension | Values | Why |
| --- | --- | --- |
| Creative angle | 5 (e.g., product hero, lifestyle, social proof, problem-solution, novelty) | Coverage breadth — different angles appeal to different audience segments |
| Setting / background | 5 per angle (e.g., farmer’s market, kitchen counter, beach, bathroom mirror selfie, office desk) | Visual variety prevents creative fatigue when AB-testing |
| Aspect ratio | 2 (1:1 square + 9:16 vertical/Reels) | Instagram feed (1:1) + Stories/Reels (9:16) — covers both placements |
| Mood overlay | 5, mixed across the 25 (1:1) and 25 (9:16) variants | Bright/airy, moody, premium, vibrant, playful — aesthetic diversification within constant product |

Result: 5 angles × 5 settings × 2 aspect ratios = 50 distinct ads; the five moods are overlaid across the set rather than multiplied in. Approved as a table before any generation runs.
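As a sanity check on the arithmetic, the matrix can be enumerated programmatically. The angle, setting, and mood names are the examples from the table; the round-robin mood assignment is an assumption (the source only says moods are mixed across the variants):

```python
from itertools import product

angles = ["product hero", "lifestyle", "social proof", "problem-solution", "novelty"]
settings = ["farmer's market", "kitchen counter", "beach",
            "bathroom mirror selfie", "office desk"]
ratios = ["1:1", "9:16"]
moods = ["bright/airy", "moody", "premium", "vibrant", "playful"]

# 5 angles x 5 settings x 2 ratios = 50 ads. Moods are overlaid round-robin,
# not multiplied in (that would give 250 ads).
matrix = [
    {"angle": a, "setting": s, "ratio": r, "mood": moods[i % len(moods)]}
    for i, (a, s, r) in enumerate(product(angles, settings, ratios))
]
print(len(matrix))  # 50
```

Each (angle, setting) pair yields exactly one 1:1 and one 9:16 variant, matching the 25 + 25 split in the table.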

Implementation

Tool/Service: Higgsfield MCP (official, via higgsfield.ai/mcp) + Claude Desktop. Optional: Playwright MCP (microsoft/playwright-mcp) for the research phase. Optional: project folder mounting for local downloads.

Setup:

Claude Desktop → ☰ → Settings → Connectors → Add custom connector
  Name: Higgsfield
  Remote MCP Server URL: higgsfield.ai/mcp
  → Connect → OAuth → Allow → verify "Connected to Higgsfield" in chat

For Playwright MCP (if research phase fails):

Drop into Claude chat: "clone github.com/microsoft/playwright-mcp and install Playwright"
→ Restart Claude Desktop (MCP tool schemas only load at session start)
→ Resume previous chat
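If cloning the repo via chat fails or feels heavyweight, Playwright MCP can also be registered as a local server in Claude Desktop's claude_desktop_config.json (following the microsoft/playwright-mcp README). A minimal sketch — this still requires the restart, since tool schemas only load at session start:

```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```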

Cost:

  • Higgsfield subscription required (auth ties to subscription tier, not pay-as-you-go credits).
  • Per-call credit cost not exposed in MCP yet (operator gotcha — workaround: maintain side-table from Higgsfield pricing page, or do the pre-flight list_workspaces + balance check).
  • Author’s full run used ~949 credits for 50 images (Nano Banana 2 + Soul 2 mix).
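A minimal sketch of the pre-flight credit math, derived from the ~949-credit observed run above. The per-image rate is an extrapolation and the 20% safety margin is an assumption, not Higgsfield pricing:

```python
def preflight_ok(balance: int, planned_images: int,
                 observed_credits: int = 949, observed_images: int = 50,
                 margin: float = 1.2) -> bool:
    """Estimate cost from one observed run and check the balance covers it."""
    per_image = observed_credits / observed_images   # ~19 credits/image
    estimate = planned_images * per_image * margin   # 20% buffer (assumption)
    return balance >= estimate

print(preflight_ok(1200, 50))  # 949 * 1.2 = 1138.8 -> True
print(preflight_ok(1000, 50))  # -> False
```

In practice the balance would come from the list_workspaces call; this only formalizes the "confirm before burning credits" gate.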

Integration notes:

  • Mount a project folder before generating so the Delivery phase can write outputs locally. Otherwise generations live only in the Higgsfield Community tab.
  • Use bypass-permissions for the Edits permission level if you don’t want per-step approvals. Don’t combine with destructive operations.
  • Model: Opus 4.7 extra-high for the orchestration session (Claude is doing 4 chained phases with research, strategy, batched generation, and delivery — heavy reasoning load).
  • Generations appear in Higgsfield Community tab → My Generations, not the main Image gallery. Save the URL bookmark for direct access.
  • Save as /ad-creator skill via /skill-creator after one successful run. Future runs: drop a product image + /ad-creator → reuse the entire workflow.
  • Reference image vs product image: the saved skill prompts for an image at the start; pass the product photo to use it as a reference for every generated ad.
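The source does not show the file /skill-creator generates. Purely as a hypothetical sketch, using the Agent Skills SKILL.md convention (YAML frontmatter plus instructions), the saved /ad-creator skill might look roughly like:

```markdown
---
name: ad-creator
description: Generate a 50-image Instagram ad campaign from one product photo via Higgsfield MCP.
---

When invoked with a product image, run the four phases in order:
1. Research: use Playwright MCP to scrape the Meta Ads Library for the
   product category; summarize trends and top creative angles into a brief.
2. Strategy: build the 5 angles x 5 settings x 2 aspect-ratio matrix with
   mood overlays; present it as a table and wait for approval.
3. Generation: run the pre-flight list_workspaces + credit check, then
   batch-generate (Nano Banana 2 for product shots, Soul 2 for human
   models), adding a caption and three hashtags per ad.
4. Delivery: download all generations into output/batch-N/{1x1,9x16}.
```

Every name and step here is reconstructed from the tutorial's workflow description, not copied from an actual generated skill file.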

Try It

  • Run a small-scale variant first: 10-ad matrix (2 angles × 1 setting × 5 moods × 1 aspect ratio) on your own real product to validate quality before committing to 50.
  • Verify Playwright MCP availability in your Claude Desktop project before starting; do not skim Claude’s tool-availability claims (the source’s most-emphasized warning).
  • Pair /ad-creator with Meta Ads CLI for end-to-end generate-and-upload. The /ad-creator skill produces files; Meta Ads CLI consumes them via campaign/adset/ad/creative endpoints.
  • Compare with the Robo Nuggets brand-launch walkthrough before deciding which use case is closer to a real client need: brand-launch (one-off, deeper, multimedia) vs ad-campaign-at-scale (repeatable, shallower per-ad, single medium).
  • For WEO Marketly operator track: run this as the first hands-on Higgsfield MCP exercise after the Intermediate Course — it’s tighter than the brand-launch demo and converts faster to client-deliverable templates.

Open Questions

  • Author identity not in transcript. Tone consistent with a Claude Code creator-style YouTube channel; signs off “feel free to follow… see you guys in the next one.” No name shown. Worth identifying via channel page if this becomes a load-bearing reference.
  • Source claims /ad-creator works on its own without re-prompting. The full workflow is supposed to run autonomously after the skill is saved. Worth a re-run to verify approval gates are preserved (or whether they get baked-in as auto-approve when saved).
  • /skill-creator MCP behavior with multi-tool chains. Source shows /skill-creator correctly captures Playwright + Higgsfield MCP tool calls into the saved skill. Open question whether this works for arbitrary chains across 3+ MCPs.
  • Does the saved /ad-creator skill bake in the bypass-permissions setting or does it inherit the user’s session preferences? Source doesn’t say.
  • Aspect-ratio coverage. 1:1 + 9:16 covers feed and Stories/Reels but not 4:5 (portrait feed-optimized). Adding 4:5 would push to 75 ads — open whether the Higgsfield model surface supports the third aspect natively.