# AI Agents

If you are operating Hooksbase from a coding agent, terminal automation, or LLM-driven workflow, start with the CLI. It is the most stable public surface for non-browser automation because it gives you predictable JSON output, profile management, and a narrower operational contract than scraping the dashboard.

**Available through**

Agents should prefer deterministic terminal and HTTP surfaces, mutate at most once per task, and stop after re-reading the target state.

| Surface | Status | Notes |
| --- | --- | --- |
| [CLI](/docs/cli.md) | Preferred | Use profiles, environment variables, `--json` output, `--limit`/`--cursor` paging, and explicit mutations. |
| [Public API](/docs/api-reference.md) | Available | Use raw HTTP only for documented public route families that the CLI does not wrap. |
| [Dashboard](/docs/dashboard.md) | Not applicable | Agents should not scrape or automate the browser UI. Use dashboard docs only to understand human workflows. |
| [TypeScript SDK](/docs/sdks.md) | Available | Use the SDK when the agent is embedded in a Node.js or TypeScript runtime. |

## Why agents should start with the CLI

The CLI is the default control plane for external agents because it already solves three problems that agents otherwise have to reinvent:

- credential resolution through profiles, environment variables, or explicit flags
- stable machine-readable output through `--json`
- common operator actions such as webhook listing, delivery inspection, replay, DLQ export, schedule lifecycle, and ingest send

Use the TypeScript SDK when you are embedding Hooksbase inside your own Node.js or TypeScript runtime. Use the CLI when the agent is orchestrating shell commands, scripts, or local terminal workflows.

## Auth model

- **Project API key** (preferred): Give the agent a project admin or write key depending on the actions it needs. Admin keys unlock `projects get` and admin-only Public API routes.
- **CLI profile**: Store durable credentials with `hooksbase auth login` when the host environment should remember them across runs.
- **Ephemeral runs**: For one-shot automation, pass `--base-url` and `--api-key` directly or provide `HOOKSBASE_BASE_URL`, `HOOKSBASE_API_KEY`, and optionally `HOOKSBASE_PROFILE`.
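The resolution order implied above (explicit flags first, then environment variables, then a stored profile) can be sketched as follows. The environment variable names come from this section; the resolver function and the profile-store shape are illustrative, not part of the CLI:

```python
import os


def resolve_credentials(api_key_flag=None, base_url_flag=None, profile=None):
    """Resolve credentials in precedence order: explicit flags, then
    HOOKSBASE_* environment variables, then a stored CLI profile."""
    profile = profile or {}
    api_key = api_key_flag or os.environ.get("HOOKSBASE_API_KEY") or profile.get("api_key")
    base_url = base_url_flag or os.environ.get("HOOKSBASE_BASE_URL") or profile.get("base_url")
    if not api_key:
        raise RuntimeError(
            "no API key: pass --api-key, set HOOKSBASE_API_KEY, "
            "or store one with `hooksbase auth login`"
        )
    return {"api_key": api_key, "base_url": base_url}
```

Failing loudly when no key resolves keeps the agent from silently falling through to an unauthenticated run.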

## Recommended agent loop

For most agent workflows, use this sequence:

1. Validate credentials and discover the current profile state.
2. Read the current project and webhook inventory.
3. Inspect deliveries, replay jobs, or DLQ rows one page at a time.
4. Execute one explicit mutation such as replay, re-drive, schedule create, or ingest send.
5. Re-read the mutated surface and stop.

That pattern keeps the agent grounded in server truth and avoids wide, unbounded crawls.

**Minimal agent-safe loop**

```bash
hooksbase auth profiles --validate --json
hooksbase projects get --json
hooksbase webhooks list --limit 20 --json
hooksbase deliveries list --limit 20 --json
```
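The same five-step sequence can be driven from a non-shell agent harness. A minimal Python sketch, assuming the `hooksbase` binary is on `PATH` and using only commands shown in this document; the `run` parameter is injectable so the loop can be exercised without a live project:

```python
import json
import subprocess


def cli_json(*args: str):
    """Invoke the CLI with --json and parse stdout; assumes hooksbase is on PATH."""
    out = subprocess.run(
        ["hooksbase", *args, "--json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)


def replay_once(delivery_id: str, run=cli_json):
    """One bounded agent run: validate, read, inspect, mutate once, re-read, stop."""
    run("auth", "profiles", "--validate")         # 1. validate credentials
    run("projects", "get")                        # 2. read project state
    run("deliveries", "get", delivery_id)         # 3. inspect the target
    run("deliveries", "replay", delivery_id)      # 4. one explicit mutation
    return run("deliveries", "get", delivery_id)  # 5. re-read, then stop
```

Returning the final re-read makes the server state, not the agent's local bookkeeping, the source of truth for whether the mutation worked.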

## Core command patterns

Use these patterns as the defaults for terminal agents:

**Inventory and inspection**

```bash
hooksbase projects get --json
hooksbase templates list --json
hooksbase webhooks get wh_123 --json
hooksbase deliveries get del_123 --json
hooksbase dlq get dlq_123 --json
hooksbase usage show --from 1740000000000 --to 1740086400000 --json
```

**Explicit mutations**

```bash
hooksbase deliveries replay del_123 --json
hooksbase dlq redrive dlq_123 --json
hooksbase schedules create wh_123 --file ./schedule.json --json
hooksbase webhooks rotate-signing-secret wh_123 --json
hooksbase ingest send --ingest-url https://hooks.hooksbase.com/v1/ingest/hook_123 --ingest-secret whsec_... --file ./payload.json --json
hooksbase ingest send --webhook-name orders --ingest-secret whsec_... --file ./payload.json --json
```

Operational guidance:

- prefer `--json` for every agent-invoked command
- use `--limit` and `--cursor` instead of assuming full collection reads
- use the CLI for explicit remote state changes, not inferred local bookkeeping
- use idempotency keys on ingest sends whenever the agent may retry
- prefer `--webhook-name` when the agent already has project API key access and should resolve the current ingest URL automatically
- stop after one mutation unless the user explicitly asks for a broader batch
- re-read the same webhook, delivery, schedule, DLQ entry, or operation job before deciding whether the mutation worked
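The `--limit`/`--cursor` guidance above amounts to bounded paging. A sketch of that pattern; the `items` and `next_cursor` field names are assumptions about the JSON shape, not documented output, so map them to whatever the real responses use:

```python
def page_through(list_page, limit=20, max_pages=5):
    """Yield items one bounded page at a time instead of crawling a whole
    collection. `list_page(limit, cursor)` wraps any list command that
    supports --limit/--cursor paging and returns one parsed JSON page."""
    cursor = None
    for _ in range(max_pages):  # hard cap so the agent never crawls unbounded
        page = list_page(limit=limit, cursor=cursor)
        yield from page.get("items", [])
        cursor = page.get("next_cursor")
        if cursor is None:
            break
```

The explicit `max_pages` cap enforces the "one page at a time" rule even when a collection keeps returning continuation cursors.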

## When to fall back to raw HTTP

The CLI does not currently wrap every customer-facing public route.

Use raw HTTP when the agent needs:

- HTTP pack catalog reads
- destinations and routing
- payload transforms
- metrics or backlog
- email allowlist management
- event drain management
- operator alerting
- file retrieval helpers
- bulk replay, bulk DLQ recovery, or bulk-operation polling
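When falling back to raw HTTP, keep the same contract the CLI gives you: an explicit base URL, the project API key, and JSON responses. A standard-library sketch of building such a request; the Bearer auth scheme and the example path are assumptions, so confirm the documented header and route family against the API reference before use:

```python
from urllib.request import Request


def build_api_request(base_url: str, api_key: str, path: str) -> Request:
    """Build an authenticated request for a documented public route.
    The Authorization scheme shown here is an assumption; check the
    API reference for the real one."""
    return Request(
        base_url.rstrip("/") + path,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Accept": "application/json",
        },
    )
```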

Use the dashboard when the agent request touches browser-only product workflows:

- project creation or deletion
- team membership and invitations
- billing checkout or provider management
- onboarding wizard state and validation

That separation is intentional. Raw HTTP and the CLI are public automation surfaces; the dashboard is a product UI for signed-in humans.

## Common mistakes

- Scraping the dashboard instead of using the CLI or raw HTTP.
- Running list commands without `--json` and then trying to parse table output.
- Giving the agent a write key and expecting admin-only project reads to work.
- Letting the agent perform broad destructive actions without first re-reading the target resource.
- Falling back to undocumented browser routes for dashboard-only workflows.
