Source: ai-research/cowork-apify-scraping-recipe-notion-mcp-2026-05-02.md (Notion AI Recipe at notion.so/jonathonc/How-to-Scrape-Any-Website-with-Claude-Cowork-dc325679607683f1a2df016e3071fa43, fetched via Notion MCP 2026-05-02 with toggles fully expanded; March 16 2026; 5-minute recipe; lives inside an “AI Recipe Vault” → “AI Recipes” Notion database alongside the Cowork Getting Started walkthrough; British spelling confirms UK-authored origin)

A no-code Cowork recipe: connect the Apify connector to Claude Cowork, then chain Cowork prompts to scrape local-business directories, do “vibe prospecting” on individuals at those companies, scrape LinkedIn job descriptions, and synthesize a PR-ready research report on AI-skill demand. This version captures the full step-by-step prompts (the Notion toggle internals): the earlier Tavily-only fetch had only the outline; the Notion MCP refetch added the prompts themselves.

Key Takeaways

  • The combo: Cowork + Apify connector. Claude Cowork is the orchestration layer; Apify is the scraping engine. The user prompts Cowork (“scrape local plumbing businesses in South West of England with contact details”); Cowork calls Apify actors via the connector; results come back into Claude as structured data ready for follow-up prompts. No code, no Apify-actor selection by hand — Cowork picks the right actor.
  • Five concrete deliverables in one ~5-minute session. Apify connected to Cowork, a scraped local-business list with contact info + websites, a vibe-prospecting deeper-dive on specific people at those companies, a LinkedIn job-description scrape, and a synthesized AI-skills-demand research report.
  • Prerequisites are real. Claude Pro with Cowork access, Claude Desktop installed, an Apify account (free tier starts the workflow), and the Apify API key from Apify account settings. Apify free credits run out fast on production-scale scrapes — the recipe explicitly flags credit monitoring as a top tip.
  • “Vibe prospecting” is the distinctive frame. Not just scraping a list — once you have companies, prompt Cowork to research specific individuals at those companies (titles, public profiles, recent posts) and qualify them by fit. Sits between bulk scraping and 1:1 manual research.
  • PR-ready research report is the killer-app step. Step 6 reframes the LinkedIn scrape as a data-journalism asset — “how many jobs actively require AI skills, by region/sector/seniority” — the kind of output that goes into an agency PR pitch or a thought-leadership LinkedIn post. Reusable as a quarterly recurring report.
  • British spelling throughout (personalised, optimisation, “South West of England” example) suggests this is a UK-authored AI Recipe template duplicated into the workspace, not WEO-internal documentation. Treat as a curated reference rather than agency-tested SOP.

The Six Step Prompts (Verbatim)

Re-fetched via Notion MCP after the initial Tavily-only fetch missed the toggle internals. Use these as a reusable Cowork starter set.

Step 1 — Connect Apify to Claude Cowork. UI walkthrough: Cowork tab → + button → Connectors → search Apify → paste Apify API key → confirm connection.

Step 2 — Quick Tour of Apify. 2-minute browse of store.apify.com to see the available “actors” (scrapers): Google Maps businesses / websites + landing pages / social media / e-commerce product listings / job boards / and more. Cowork calls actors directly via the connector — no manual install.
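For readers who outgrow the no-code path, the connector’s actor call can be approximated directly against the Apify API. A minimal sketch: the actor ID below is Apify’s public Google Maps scraper, but the input field names are assumptions based on its typical schema; verify against the actor’s input tab before spending credits.

```python
# Sketch of the Cowork -> Apify hop, if you drop down to code.
# Field names below are assumptions; check the actor's input schema.

def build_run_input(query: str, location: str, max_results: int) -> dict:
    """Map a natural-language scrape request onto an Apify actor input dict."""
    return {
        "searchStringsArray": [query],            # what to search for
        "locationQuery": location,                # where to search
        "maxCrawledPlacesPerSearch": max_results, # keep small to save credits
    }

run_input = build_run_input("solar panel installation", "South West of England", 20)

# With the official apify-client package (pip install apify-client):
# from apify_client import ApifyClient
# client = ApifyClient("<APIFY_API_TOKEN>")
# run = client.actor("compass/crawler-google-places").call(run_input=run_input)
# items = list(client.dataset(run["defaultDatasetId"]).iterate_items())
```

The commented-out call is left inert so the sketch runs without an API token; the connector does the equivalent orchestration behind the scenes.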

Step 3 — Scrape Local Business Data.

Use Apify to scrape 20 solar panel installation companies in the South West of England. (e.g., Bristol, Bath, Exeter, Plymouth, Gloucester, Cheltenham). For each firm, collect the following details: business name, website, contact person (if available), email address, phone number, and location Put everything into a clean spreadsheet.

Claude picks the Google Maps actor automatically.
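Step 3’s “clean spreadsheet” ask boils down to normalising ragged scraper output into a fixed column set. A minimal sketch, assuming hypothetical field names (real actor output keys vary per actor):

```python
import csv
import io

# Hypothetical column names mirroring the step 3 prompt; adjust to actor output.
FIELDS = ["business_name", "website", "contact_person", "email", "phone", "location"]

def to_clean_rows(raw_items):
    """Coerce messy scraper output into a uniform row shape, blank for missing."""
    return [{f: (item.get(f) or "") for f in FIELDS} for item in raw_items]

def write_csv(rows):
    """Serialise normalised rows as CSV text with a fixed header."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Asking Cowork for “a clean spreadsheet” does the same job conversationally; the sketch just shows why a fixed column list matters for the later enrichment steps.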

Step 4 — Vibe Prospecting.

Take the list of solar panel companies we just scraped. For each one, use Vibe Prospecting to find the key decision makers — founders, directors, or managing directors. Get their names, job titles, LinkedIn profiles if available, and any other useful contact information. Add this to the spreadsheet alongside the company data.

Cross-references company websites + LinkedIn to enrich the spreadsheet from “list of companies” to “person you need to talk to.”
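Structurally, the enrichment step is a join of person records onto company records. A sketch of that merge, assuming both datasets share a `business_name` key (an assumption; in practice, matching on website domain is more robust than exact name matching):

```python
def merge_prospects(companies, prospects):
    """Attach decision-maker fields to each company row, matched by name.

    `companies` and `prospects` are lists of dicts. The shared
    'business_name' key is an assumption; normalise names or join on
    website domain for real data.
    """
    by_company = {p["business_name"]: p for p in prospects}
    merged = []
    for c in companies:
        p = by_company.get(c["business_name"], {})
        merged.append({
            **c,
            "contact_name": p.get("name", ""),
            "contact_title": p.get("title", ""),
            "linkedin_url": p.get("linkedin_url", ""),
        })
    return merged
```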

Step 5 — LinkedIn Scrape (Live Test).

Use Apify to scrape LinkedIn for 100 administratvie job listings in the UK. I want roles across different industries. Pull the job title and full job description for each listing. Get 50 job lisitngs.

(The source contains the typos “administratvie” and “lisitngs”, plus an internal 100-vs-50 inconsistency; flagged here, fix locally before running.)

Step 6 — Build the AI Skills Research Report.

Analyse all the job descriptions we scraped from LinkedIn. I want to know:

1. How many mention AI skills as a requirement (ChatGPT, Claude, prompt engineering, AI tools, etc.)
2. Break it down by industry/sector
3. Break it down by seniority level
4. What specific AI skills are mentioned most often
5. Any patterns or trends worth highlighting

Turn this into a short, punchy research report that could be used for a press release. Include the key stats, a summary of findings, and 2-3 quotable takeaways. Frame it around the headline: "X% of UK job listings now require AI skills."

Output: a PR-ready report usable for a press release, YouTube video, or LinkedIn post.
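Under the hood, the step 6 analysis is keyword counting over job descriptions with a couple of group-bys. A minimal sketch, using an assumed keyword list drawn from the prompt’s own examples:

```python
import re
from collections import Counter

# Assumed keyword list, taken from the step 6 prompt's examples.
AI_TERMS = ["chatgpt", "claude", "prompt engineering", "ai tools", "copilot"]

def analyse_listings(listings):
    """listings: [{'title': ..., 'industry': ..., 'description': ...}, ...]"""
    pattern = re.compile("|".join(re.escape(t) for t in AI_TERMS), re.IGNORECASE)
    mentions = 0                  # listings with at least one AI-skill hit
    by_industry = Counter()       # breakdown for finding 2
    term_counts = Counter()       # breakdown for finding 4
    for job in listings:
        hits = pattern.findall(job["description"])
        if hits:
            mentions += 1
            by_industry[job["industry"]] += 1
            term_counts.update(h.lower() for h in hits)
    pct = round(100 * mentions / len(listings)) if listings else 0
    return {
        "headline_pct": pct,      # the "X% of UK job listings" stat
        "by_industry": dict(by_industry),
        "top_terms": term_counts.most_common(5),
    }
```

Cowork’s analysis will be fuzzier and richer than a regex match (it can catch paraphrases like “experience with generative AI”), but the sketch makes the headline stat auditable.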

Optimisation Tips (Verbatim)

  • Start with a small scrape first — test with 10-20 results before going big. Make sure the data format is what you need
  • Be specific with locations — “South West of England” works better than just “UK” for local business scraping
  • Expect imperfection — web scraping is inherently messy. Some results will be incomplete or duplicated. That’s normal
  • Watch your Apify credits — large scrapes burn through free tier credits quickly. Monitor your usage in the Apify dashboard
  • Save your data as you go — ask Claude to export to a spreadsheet after each step so you don’t lose progress
  • Combine with other connectors — once you’ve got prospect data, use the Gmail connector to draft personalised outreach emails
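The “expect imperfection” tip can be partly automated: a small dedupe pass before export catches the most common mess. A sketch, assuming `website` and `phone` column names (rename to match your actual headers):

```python
def dedupe(rows, keys=("website", "phone")):
    """Drop rows whose normalised (website, phone) signature was already seen.

    Column names are assumptions; pass the keys that identify a business
    in your actual export.
    """
    seen = set()
    out = []
    for row in rows:
        sig = tuple(str(row.get(k, "")).strip().lower() for k in keys)
        if sig in seen:
            continue  # duplicate of an earlier row
        seen.add(sig)
        out.append(row)
    return out
```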

Forward Applications

The page lists five extensions of the workflow:

  • Personalised outreach — feed prospect data into Claude + Gmail connector to draft tailored emails (the Cowork-native pairing)
  • Competitor analysis — scrape competitor websites, pricing pages, product features. Direct overlap with Clawdbot’s 8-channel intel surface but with Cowork’s chat-first orchestration instead of Clawdbot’s structured pipeline
  • Content research — scrape industry forums, Reddit threads, review sites for content ideas. Pairs with the MEGA PROMPT CHEST Forensic Psychology Analysis prompt (Reddit text → Customer DNA Profile + tripwires)
  • Market research — scrape job boards in your niche for hiring trends. The recipe’s own step 6 is one instance of this
  • PR campaigns — recurring quarterly run of the AI-skills-demand report as a data story. Natural fit for a Claude Cowork Routine / scheduled task

Caveats

  • Apify cost is the constraint. Free tier covers exploration; production scrapes burn credits. Worth pricing the recurring-report step at expected volume before committing to it as a quarterly deliverable.
  • LinkedIn scraping legality varies. LinkedIn’s terms prohibit automated scraping; the recipe’s “warts and all” framing acknowledges this. Use for prospecting research, not bulk data resale; consult WEO AI governance before any client work that depends on it.
  • Source provenance is curated, not authoritative. The page lives in an “AI Recipe Vault” Notion database in jonathonc’s workspace (UK-authored AI-Recipe template format), not as a Claude.com or Anthropic-published document. Treat findings as starting points, not best-practice claims.
  • Source has typos in step 5. The verbatim prompt contains “administratvie” and “lisitngs”, plus an internal 100-vs-50 contradiction. Fix before running.

Try It

  1. First-pass dry run — sign up for an Apify free account, connect to Cowork, and run step 3 against a small test query (10-20 results) in your local market. Confirm the data shape matches what you’d want for outreach.
  2. Vibe-prospecting iteration — pick three companies from step 3 output and prompt Cowork to research specific individuals (founders, marketing leads). See how Cowork reasons over public profiles + recent posts to qualify fit.
  3. AI-skills report at WEO scale — adapt the step-6 prompt to scrape “AI marketer / AI strategist” listings across Portland / dental-marketing / mid-size-agency niches; produce a quarterly trend report. Pairs with the 2026 business-demand field data article as a methodology cross-check.

Open Questions

  • Author attribution. Page lives inside a Notion “AI Recipe Vault” → “AI Recipes” database in jonathonc’s workspace; British spelling + AI-Recipe template format suggest an external UK-authored origin. The companion Getting Started Cowork recipe cites three YouTube videos that may identify the creator (youtu.be/ZeWfksNXlbU, youtube.com/watch?v=O8Li2AIc_Ps, youtube.com/watch?v=1GZYNyuL6Lg) — backtrace if the recipe becomes load-bearing for client work.
  • Apify-credit pricing for the WEO use case. What does a quarterly LinkedIn-scrape + AI-skills-report cost in Apify credits at WEO Marketly’s expected volume? Worth pricing before committing to the recurring deliverable.
  • More AI Recipes in the parent vault. The parent database (collection://b5025679-6076-8201-a321-877a8516d309) likely contains other Cowork recipes worth ingesting. Flag for follow-up notion-search against the workspace.