
Building a Real-Time Job Dashboard with Claude + Apify

April 2026 · 10 min read · Srikanth Badavath


Job searching is painful. You open LinkedIn, set filters, and manually scan through hundreds of postings — only to find most were posted weeks ago and already have 500+ applicants. What if Claude could do all of that for you, scrape the results, and hand you a clean interactive dashboard in one shot?

This post walks through exactly that: using Claude's Apify connector to scrape LinkedIn jobs in real time and render them as a fully filterable, sortable HTML dashboard — with the complete prompt you can use right now.

What Is the Apify Connector in Claude?

Claude has a built-in Apify MCP (Model Context Protocol) connector that lets it call Apify actors directly — no coding required. Apify is a web scraping platform with thousands of pre-built actors (scrapers) for LinkedIn, Google, Amazon, and more.

When you give Claude access to the Apify connector, it can:

  1. Call any public Apify actor directly from a chat
  2. Pass structured input (here, a LinkedIn jobs search URL) to the actor
  3. Receive the scraped results back as JSON
  4. Transform that JSON into whatever output format you specify

Setup required: Go to Claude.ai → Settings → Integrations → Apify and connect your Apify account. You'll need a free Apify account and an API token. The LinkedIn jobs actor costs approximately $0.50–$1.00 per 100 results on Apify's pay-per-result model.

The Full System Architecture

flowchart LR
    User(["You\n(Claude chat)"])
    Prompt["Prompt Claude\nwith role + level\n+ location"]
    Claude["Claude\n(with Apify connector)"]
    Apify["Apify Platform\ncurious_coder/\nlinkedin-jobs-scraper"]
    LinkedIn["LinkedIn\nJobs Search\n(100+ results)"]
    JSON["Structured JSON\njob listings"]
    HTML["Interactive\nHTML Dashboard\nfilter · sort · search"]
    You2(["You\nopen in browser"])
    User --> Prompt
    Prompt --> Claude
    Claude -->|"calls actor\nwith search URL"| Apify
    Apify -->|"scrapes"| LinkedIn
    LinkedIn -->|"returns results"| JSON
    JSON -->|"back to Claude"| Claude
    Claude -->|"renders"| HTML
    HTML --> You2

How Claude Prompting Works Here

The key insight is that Claude isn't just fetching data — it's acting as a workflow orchestrator. The prompt gives Claude three things:

  1. Configuration variables — role, level, location (parameterized so you can reuse the prompt)
  2. Tool invocation instructions — which Apify actor to call, what URL to pass, how many results to fetch
  3. Output specification — exactly what the HTML dashboard should look like, column by column

This is a pattern called structured tool-use prompting. You're not asking Claude to write code you then run — you're asking Claude to call a live tool and process real data in a single turn.
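Under the hood, the connector's "tool use" boils down to a single REST request against Apify's run-sync endpoint. A minimal sketch of that request — the `startUrl` and `count` input field names follow the sequence diagram later in the post and are assumptions, not the actor's verified input schema:

```javascript
// Sketch of the one REST call behind the connector's actor invocation.
// Apify's run-sync-get-dataset-items endpoint runs an actor and returns
// its dataset items in the response body.
// NOTE: the input field names (startUrl, count) are assumptions.
function buildActorRun(actorId, token, searchUrl, maxResults) {
  return {
    // Apify's API addresses actors as "user~actor-name", not "user/actor-name"
    endpoint: `https://api.apify.com/v2/acts/${actorId.replace('/', '~')}` +
              `/run-sync-get-dataset-items?token=${token}`,
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ startUrl: searchUrl, count: maxResults }),
  };
}
```

Claude assembles the equivalent of this request for you; the point is that the whole scraping step is one POST call, not a browser-automation script.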

Understanding the LinkedIn Search URL Parameters

The Apify actor takes a LinkedIn jobs search URL. Understanding the URL parameters lets you customize precisely:

https://www.linkedin.com/jobs/search/?keywords=Software+Engineer
  &f_E=2          ← seniority level filter
  &f_TPR=r86400   ← time posted: last 24 hours (86400 seconds)
  &f_JT=F         ← job type: Full-time (optional)
  &f_WT=2         ← work type: Remote (optional, 1=On-site, 2=Remote, 3=Hybrid)
  &position=1
  &pageNum=0
Parameter       Value         Meaning
f_E=1           Internship    Internship level
f_E=2           Entry level   0–2 years experience
f_E=3           Associate     Early career
f_E=3%2C4       Mid-Senior    Mid + Senior combined
f_E=5           Director      Director level
f_TPR=r3600     Last hour     Very fresh postings
f_TPR=r86400    Last 24h      Daily fresh postings
f_TPR=r604800   Last week     Weekly postings
f_WT=2          Remote        Remote jobs only
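These parameters compose mechanically, so the URL the prompt hardcodes can also be built programmatically. A small sketch (the function and option names are illustrative):

```javascript
// Illustrative builder for the LinkedIn jobs search URL, using the
// parameters from the table above.
function buildSearchUrl({ keywords, level = '2', postedWithin = 'r86400', workType }) {
  const params = new URLSearchParams({
    keywords,              // spaces become "+" automatically
    f_E: level,            // seniority: pass '3,4' to get f_E=3%2C4
    f_TPR: postedWithin,   // time posted, e.g. r86400 = last 24 hours
    position: '1',
    pageNum: '0',
  });
  if (workType) params.set('f_WT', workType); // 1=On-site, 2=Remote, 3=Hybrid
  return `https://www.linkedin.com/jobs/search/?${params}`;
}
```

For example, `buildSearchUrl({ keywords: 'Software Engineer', workType: '2' })` reproduces the entry-level, last-24-hours, remote-only search used throughout this post.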

The Complete Prompt

Copy this entire prompt into Claude (with the Apify connector enabled). Change the 3 config variables at the top:

-------------------------------------------------------------
CONFIGURE THESE BEFORE RUNNING:
  ROLE     = Software Engineer
  LEVEL    = Entry level
  LOCATION = United States
-------------------------------------------------------------

Using the Apify connector, search LinkedIn for [ROLE] jobs posted
in the last 24 hours at the [LEVEL] seniority level in [LOCATION].

Use the actor: curious_coder/linkedin-jobs-scraper

Pass this search URL to the actor:
https://www.linkedin.com/jobs/search/?keywords=[ROLE]&f_E=2&f_TPR=r86400&position=1&pageNum=0

(Level filter reference:
  Entry level   = f_E=2
  Associate     = f_E=3
  Mid + Senior  = f_E=3%2C4
  Mid-Senior    = f_E=4
  Internship    = f_E=1)

Fetch at least 100 results.

Then present ALL results as a styled, interactive HTML file with
these exact columns:
  # | Job Title (linked) | Company | Location | Type | Level | Salary | Applicants | Posted | Link

Requirements for the HTML output:
- Dark background (#0e0f31), monospace font (JetBrains Mono or Fira Code)
- Clean grid/table layout with subtle row hover highlight
- Color-coded employment type tags:
    green  = Full-time
    amber  = Contract
    blue   = Part-time
    purple = Internship
- Flag any posting with 200+ applicants with a fire emoji
- Search bar at the top: filters rows live by title, company, or location
- Dropdown filters for: Employment Type, Seniority Level, Remote/On-site
- Clickable column headers to sort ascending/descending (show arrow indicator)
- Each job title links directly to the LinkedIn posting URL
- "View →" button in the last column also links to the posting
- Footer shows: total job count + "Fetched on [current date and time]"
- Export button: downloads the table as a CSV file

Do not summarize. Return the complete, self-contained HTML file
as your output. All CSS and JavaScript must be inline.

What Claude Actually Does Step by Step

sequenceDiagram
    participant You
    participant Claude
    participant Apify
    participant LinkedIn
    You->>Claude: Paste prompt with ROLE=Data Scientist
    Claude->>Claude: Parse config variables
    Claude->>Claude: Build LinkedIn search URL
    Claude->>Apify: Call curious_coder/linkedin-jobs-scraper
    Note over Claude,Apify: actor input: {startUrl: "linkedin.com/jobs/search/?..."}
    Apify->>LinkedIn: Scrape search results pages
    LinkedIn-->>Apify: Raw HTML job listings
    Apify-->>Claude: JSON array of 100+ job objects
    Claude->>Claude: Parse JSON, extract columns
    Claude->>Claude: Generate HTML with inline CSS+JS
    Claude-->>You: Complete HTML file (copy → save → open in browser)
    You->>You: Interactive dashboard in browser

What the JSON Output Looks Like

The Apify actor returns structured JSON per job. Here's what each object contains:

{
  "id": "3876543210",
  "title": "Software Engineer, Machine Learning",
  "company": "Meta",
  "location": "Menlo Park, CA",
  "employmentType": "Full-time",
  "seniorityLevel": "Entry level",
  "salary": "$130,000/yr - $180,000/yr",
  "applicantsCount": 347,
  "postedAt": "2026-04-03T08:14:00Z",
  "jobUrl": "https://www.linkedin.com/jobs/view/3876543210/",
  "description": "We are looking for a Software Engineer..."
}

Claude receives an array of 100+ of these objects, then generates the entire dashboard HTML in one shot — no post-processing needed.
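Before rendering, Claude effectively normalizes that array into table rows. A rough sketch of that step, using the field names from the sample object above (newest first, numbered, with the 200+ applicant flag):

```javascript
// Normalize the actor's JSON array into dashboard rows:
// sorted newest first, ranked, with the "hot" flag for 200+ applicants.
function toRows(jobs) {
  const rows = jobs.map(j => ({
    title: j.title,
    company: j.company,
    location: j.location,
    applicants: j.applicantsCount,
    hot: j.applicantsCount >= 200,   // rendered as 🔥 in the dashboard
    postedAt: j.postedAt,
    url: j.jobUrl,
  }));
  rows.sort((a, b) => new Date(b.postedAt) - new Date(a.postedAt));
  rows.forEach((r, i) => { r.rank = i + 1; });
  return rows;
}
```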

The Dashboard HTML Structure Claude Generates

The prompt instructs Claude to build a self-contained HTML file. Here's the skeleton of what it produces:

<!DOCTYPE html>
<html>
<head>
  <style>
    /* Dark theme */
    body { background: #0e0f31; color: #e0e0e0; font-family: 'JetBrains Mono', monospace; }
    .search-bar { width: 100%; padding: 1rem; background: #1a1a2e; border: 1px solid #333; color: #fff; }
    .filters { display: flex; gap: 1rem; margin: 1rem 0; }
    table { width: 100%; border-collapse: collapse; }
    th { cursor: pointer; background: #1877F2; color: #fff; padding: 0.8rem; }
    th:hover { background: #0f46a2; }
    th.asc::after  { content: ' ↑'; }
    th.desc::after { content: ' ↓'; }
    tr:hover td { background: rgba(255,255,255,0.05); }
    .tag-full    { background: #0d6e4e22; color: #00c2a0; border: 1px solid #00c2a0; }
    .tag-contract{ background: #b4530922; color: #f39c12; border: 1px solid #f39c12; }
    .tag-intern  { background: #42017722; color: #a855f7; border: 1px solid #a855f7; }
    .hot { font-size: 1.2rem; }  /* 🔥 for 200+ applicants */
    .view-btn { background: #1877F2; color: #fff; padding: 0.3rem 0.8rem; border-radius: 4px; }
  </style>
</head>
<body>
  <h1>Software Engineer Jobs — Entry Level — United States</h1>
  <input class="search-bar" placeholder="Search by title, company, location..."
         oninput="filterTable(this.value)">
  <div class="filters">
    <select onchange="filterType(this.value)">...</select>
    <select onchange="filterLevel(this.value)">...</select>
    <button onclick="exportCSV()">Export CSV</button>
  </div>
  <table id="jobsTable">
    <thead>
      <tr>
        <th onclick="sortTable(0)">#</th>
        <th onclick="sortTable(1)">Job Title</th>
        <th onclick="sortTable(2)">Company</th>
        <th onclick="sortTable(3)">Location</th>
        <th onclick="sortTable(4)">Type</th>
        <th onclick="sortTable(5)">Level</th>
        <th onclick="sortTable(6)">Salary</th>
        <th onclick="sortTable(7)">Applicants</th>
        <th onclick="sortTable(8)">Posted</th>
        <th>Link</th>
      </tr>
    </thead>
    <tbody id="tableBody">
      <!-- Claude populates 100+ rows here -->
    </tbody>
  </table>
  <footer>Total: 127 jobs | Fetched on Apr 3, 2026 at 3:45 PM</footer>
  <script>
    // Sort, filter, and export logic — all inline
    function sortTable(col) { ... }
    function filterTable(query) { ... }
    function exportCSV() { ... }
  </script>
</body>
</html>
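The placeholder functions in that `<script>` block carry all the interactivity. Their core logic can be sketched DOM-free against plain row arrays — these are illustrative stand-ins, not the exact code Claude emits:

```javascript
// DOM-free sketches of the dashboard's three placeholder functions.

// Live search: keep rows whose title, company, or location match the query.
function filterRows(rows, query) {
  const q = query.toLowerCase();
  return rows.filter(r =>
    [r.title, r.company, r.location].some(v => v.toLowerCase().includes(q)));
}

// Column sort: returns a new array, ascending or descending by key.
function sortRows(rows, key, ascending = true) {
  const dir = ascending ? 1 : -1;
  return [...rows].sort((a, b) =>
    a[key] > b[key] ? dir : a[key] < b[key] ? -dir : 0);
}

// CSV export: RFC 4180-style quoting (double any embedded quotes).
function toCSV(rows, columns) {
  const esc = v => `"${String(v).replace(/"/g, '""')}"`;
  return [columns.map(esc).join(','),
          ...rows.map(r => columns.map(c => esc(r[c])).join(','))].join('\n');
}
```

In the real file these feed `tableBody` row visibility, header click handlers, and a Blob download link respectively.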

Prompt Variations for Different Use Cases

The base prompt is configurable. Here are the key variations:

Use Case                    Change This in the Prompt
Remote jobs only            Add &f_WT=2 to the search URL
Last week instead of 24h    Change f_TPR=r86400 to f_TPR=r604800
Multiple roles              Run the prompt twice with different ROLE values
Specific city               Add &location=San+Francisco%2C+CA to the URL
Exclude certain companies   Add a post-processing instruction: "Remove rows where Company is [X]"
Sort by least applicants    Add: "Default sort by Applicants ascending (lowest competition first)"
Add AI match score          Add: "Add a 'Match %' column — score each job 0-100 based on [your skills]"

Advanced: Ask Claude to Score Jobs Against Your Resume

This is where it gets powerful. After getting the job data, you can ask Claude to rank them:

After fetching the jobs, add a "Match %" column to the table.
Score each job 0-100 based on how well it matches this candidate profile:

Skills: Python, PyTorch, SQL, AWS, distributed systems, React
Experience: 2 years data science, 1 year ML engineering
Education: MS Computer Science (Virginia Tech)
Preferences: Remote preferred, min $120k, ML/AI roles weighted higher

Color-code the Match % column:
  90-100% → green background
  70-89%  → yellow background
  below 70% → no highlight

Sort the table by Match % descending by default.

Claude will score each job based on the description text returned by the scraper and color-code the dashboard accordingly — effectively acting as a recruiter filtering the list for you.
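Claude's scoring is judgment-based rather than formulaic, but a crude mechanical baseline makes the idea concrete. A hypothetical keyword-overlap scorer (not what Claude actually computes):

```javascript
// Illustrative baseline: percentage of listed skills that appear
// verbatim in the job description. Claude's real scoring also weighs
// seniority, preferences, and context, which this ignores.
function matchScore(description, skills) {
  const text = description.toLowerCase();
  const hits = skills.filter(s => text.includes(s.toLowerCase())).length;
  return Math.round((hits / skills.length) * 100);
}
```

Even this naive version is enough to push irrelevant postings to the bottom of the table; Claude's judgment-based scoring does considerably better on paraphrased requirements.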

Why This Approach Works

What makes this pattern powerful isn't the scraping — it's the prompt chain:

  1. Tool use — Claude calls Apify, which calls LinkedIn. No browser automation, no Selenium, no API key negotiations.
  2. Structured output — the JSON → HTML transformation is specified precisely in the prompt. Claude doesn't decide the format; you do.
  3. Single-shot execution — one prompt, one response, one usable artifact. Compare this to writing a Python scraper + pandas + a Plotly dashboard from scratch.
  4. Composability — you can chain follow-up prompts: "Now filter to only remote jobs and re-render" or "Add a column for Glassdoor rating".

Cost reality check: The curious_coder/linkedin-jobs-scraper actor charges approximately $0.008 per result on Apify. 100 results = ~$0.80. Claude API calls are additional if you're using the API directly. For job searching, this is a bargain compared to LinkedIn Premium ($40+/month).

Limitations to Know

- Scraped data is a snapshot: applicant counts and posting lists go stale within hours, so re-run the prompt when you need fresh results.
- LinkedIn doesn't display salary or applicant counts on every posting, and the scraper can only return what the page exposes, so expect gaps in those columns.
- Very large result sets can exceed a single Claude response; you may need to ask it to continue or split the dashboard across runs.
- Scraping LinkedIn is subject to LinkedIn's and Apify's terms of service; use the actor responsibly.

Full Workflow Summary

flowchart TD
    A["1. Open Claude.ai\nEnable Apify connector\nin Settings → Integrations"]
    B["2. Configure the prompt\nSet ROLE, LEVEL, LOCATION\nat the top of the prompt"]
    C["3. Paste and run\nClaude calls Apify\n~60-90 seconds to complete"]
    D["4. Get HTML output\nCopy the entire HTML\nfrom Claude's response"]
    E["5. Save as .html file\nOpen in your browser\nFull interactive dashboard"]
    F["6. Optional follow-ups\nFilter by remote\nScore against your resume\nExport CSV"]
    A --> B --> C --> D --> E --> F

Takeaways