Custom connectors let you push data from sources Adapter doesn’t natively integrate with — internal tools, legacy systems, anything you can hit with HTTPS. Instead of Adapter pulling on a schedule (the OAuth model), you push events whenever they happen, authenticated by an API key.

When to use one

Reach for a custom connector when:
  • The source isn’t on the supported providers list.
  • You already have an export pipeline and want Adapter to receive events directly.
  • You want to mix first-party data (emails, pages, etc.) with synthetic events Adapter doesn’t otherwise see.
If your source is already supported (Gmail, Slack, Drive, Notion, Linear, …), use the OAuth connector instead — it’s faster to set up and stays in sync automatically.

Creating a connector

1. Open the Console

Go to Connectors in app.adapter.com and switch to the Custom tab. Click Create connector.

2. Name it

Give it a display name and a source slug (auto-derived from the name, lowercase + hyphens). The slug is immutable after creation — it shows up in the event source (custom:your-slug) and the storage path. Pick something descriptive.

3. Pick accepted kinds

Select which first-party event types this connector may emit (email, page, message, calendar, issue, or generic custom). Selecting none means all kinds are accepted; if you do select kinds, events of any other kind are rejected with a 400.

4. Save your credentials

The console shows the client_id and a pk_live_… API key once. Copy both immediately — the key won't be shown again.
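The "lowercase + hyphens" slug derivation in step 2 can be approximated in the shell — a hypothetical sketch, not the console's exact algorithm; the name and slug values here are illustrative:

```shell
# Rough approximation of slug derivation: lowercase everything, then
# collapse runs of non-alphanumeric characters into single hyphens.
name="Internal CRM Export"
slug=$(printf '%s' "$name" | tr '[:upper:]' '[:lower:]' | tr -cs 'a-z0-9' '-')
echo "custom:$slug"
```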

Pushing data

Each connector exposes four ingest endpoints under /v1/custom-connectors/{client_id}. All require Authorization: Bearer pk_live_…. Use the typed path when your data fits one of Adapter’s first-party shapes (email, page, etc.). Adapter applies the same processing as native connectors — entity extraction, relationship resolution, the works.
curl -X POST "https://api.adapter.com/v1/custom-connectors/$CLIENT_ID/ingest/typed" \
  -H "Authorization: Bearer $PK_LIVE_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "items": [
      {
        "kind": "email",
        "payload": {
          "sender": "alice@acme.com",
          "subject": "Q2 review",
          "body_snippet": "Numbers look good."
        }
      }
    ]
  }'
The response echoes an external_ids list — useful if you want to look up the resulting document later.
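If you capture that response, the ids can be pulled out for scripting — a minimal sketch, assuming the response is a JSON object with an external_ids array (as described above) and that jq is installed; the sample value stands in for real curl output:

```shell
# Stand-in for the curl response; shape assumed from the external_ids note above.
response='{"external_ids": ["doc_123", "doc_124"]}'

# Print one id per line for later lookup.
printf '%s\n' "$response" | jq -r '.external_ids[]'
```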

Plain text and arbitrary JSON

When your data doesn’t fit a typed shape — free-text notes, log lines, custom records from a tool that doesn’t have an Adapter integration — use the generic /ingest endpoint. It stores the data field verbatim as a StandardCustom event with event_type: "adapter.data.custom". Free text:
curl -X POST "https://api.adapter.com/v1/custom-connectors/$CLIENT_ID/ingest" \
  -H "Authorization: Bearer $PK_LIVE_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "items": [
      { "data": { "text": "Quick note from the standup: we shipped the migration." } }
    ]
  }'
The data field accepts any JSON value — it’s stored as-is. There’s no required shape, but using a stable key (e.g. text, body, content) makes downstream queries easier. Custom record with structure:
curl -X POST "https://api.adapter.com/v1/custom-connectors/$CLIENT_ID/ingest" \
  -H "Authorization: Bearer $PK_LIVE_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "items": [
      {
        "external_id": "ticket-9182",
        "data": {
          "title": "Customer reported login loop",
          "severity": "high",
          "tags": ["auth", "regression"],
          "reported_by": "casey@acme.com"
        }
      }
    ]
  }'
Batch: post multiple items in a single request — Adapter accepts up to a few hundred per call.
curl -X POST "https://api.adapter.com/v1/custom-connectors/$CLIENT_ID/ingest" \
  -H "Authorization: Bearer $PK_LIVE_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "items": [
      { "external_id": "log-001", "data": { "text": "user signed in" } },
      { "external_id": "log-002", "data": { "text": "password reset requested" } },
      { "external_id": "log-003", "data": { "text": "two-factor enabled" } }
    ]
  }'
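A body like the batch above can be assembled from a newline-delimited file — a rough sketch that performs no JSON string escaping, so it assumes each line is plain text with no quotes or backslashes (build_batch is an illustrative helper, not part of the API):

```shell
# Build an /ingest batch body from lines on stdin; external_ids are
# derived from the line number. No JSON escaping: plain-text lines only.
build_batch() {
  n=0
  printf '{"items": ['
  while IFS= read -r line; do
    [ "$n" -gt 0 ] && printf ', '
    n=$((n + 1))
    printf '{"external_id": "log-%03d", "data": {"text": "%s"}}' "$n" "$line"
  done
  printf ']}'
}

printf 'user signed in\npassword reset requested\n' | build_batch
```

For large files, pipe the result straight into curl with -d @- so the body is read from stdin.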
LLM enrichment does not run on /ingest (JSON-only). If you have a long blob of text and want Adapter to extract summary/key-facts from it, save it to a file and post via /ingest/upload — the enricher picks up text/plain, PDFs, images, and Office docs.

Binary uploads

For files (PDFs, images, Office docs), use the multipart upload endpoint:
curl -X POST "https://api.adapter.com/v1/custom-connectors/$CLIENT_ID/ingest/typed/upload" \
  -H "Authorization: Bearer $PK_LIVE_KEY" \
  -F "file=@/path/to/contract.pdf" \
  -F "kind=page" \
  -F 'payload={"title":"Q2 contract"}'
Adapter stores the bytes and runs LLM enrichment automatically — supported types include PDFs, images (PNG/JPEG/GIF/WebP), and Office files (.docx/.xlsx/.pptx). The resulting document carries an enriched_content block with summary, extracted text, and key facts. Use /ingest/upload (without /typed/) for binaries that don’t map to a typed kind.

Typed event shapes

The payload you send to /ingest/typed is validated against the corresponding type schema. Fields like source, user_id, container_id, and event_type are filled in automatically from the connector — you only supply the fields below. external_id is optional everywhere; Adapter generates one if you omit it. timestamp (ISO 8601) is also optional and defaults to ingest time.

email

| Field | Type | Required | Notes |
| --- | --- | --- | --- |
| sender | string | | Display name or address. |
| subject | string | | |
| body_snippet | string | | First few hundred chars of the body. Larger content goes through enrichment if attached as a binary. |

page

For documents — Notion pages, Confluence pages, internal wikis, PDFs uploaded as kind=page.
| Field | Type | Required | Notes |
| --- | --- | --- | --- |
| title | string | | |
| content_snippet | string | | |
| url | string | | Canonical link back to the source. |
| parent_id | string | | Parent page/folder identifier. |
| parent_type | string | | e.g. "folder", "page". |
| created_by | string | | |
| last_edited_by | string | | |
| last_edited_at | datetime | | ISO 8601. |

message

For chat / Slack-like messages.
| Field | Type | Required | Notes |
| --- | --- | --- | --- |
| channel_id | string | | |
| sender | string | | |
| text | string | | |
| is_direct | boolean | | true for DMs, false for channels. |
| thread_ts | string | | Thread parent timestamp, if replying. |

calendar

| Field | Type | Required | Notes |
| --- | --- | --- | --- |
| title | string | | |
| start_time | datetime | | Use for timed events. |
| end_time | datetime | | |
| start_date | date | | Use for all-day events. |
| end_date | date | | |
Provide either the _time pair (timed) or the _date pair (all-day), not both.
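As a concrete illustration of the two valid shapes posted to /ingest/typed (titles and dates are made up):

```json
{
  "items": [
    {
      "kind": "calendar",
      "payload": {
        "title": "Design sync",
        "start_time": "2025-03-03T10:00:00Z",
        "end_time": "2025-03-03T10:30:00Z"
      }
    },
    {
      "kind": "calendar",
      "payload": {
        "title": "Offsite",
        "start_date": "2025-03-04",
        "end_date": "2025-03-05"
      }
    }
  ]
}
```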

issue

For tickets — Linear, Jira, GitHub issues, internal tracker rows.
| Field | Type | Required | Notes |
| --- | --- | --- | --- |
| title | string | | |
| number | integer | | Display number (e.g. ENG-1234). |
| description | string | | |
| status | string | | Free-form (e.g. "open", "in-progress"). |
| assignee | string | | |
| priority | integer | | |
| labels | array | | List of label strings. |
| url | string | | |
| action | string | | One of "create", "update", "remove". |

custom

Escape hatch for data that doesn’t fit any typed shape. The whole data payload is stored verbatim.
| Field | Type | Required | Notes |
| --- | --- | --- | --- |
| data | object | | Any JSON. |
You can also send custom via the simpler /ingest endpoint (no kind field required) — see Generic events above.

Looking at ingested data

Each connector card in the console has a View Data button that expands to show document counts per resource. Counts update as data lands; if they ever drift, the recovery path is to re-ingest or contact support.

Tips

  • external_id is optional — Adapter generates one if you omit it. Provide your own when you want idempotent re-ingest (re-posting the same external_id overwrites the document).
  • metadata on the connector itself is attached to every event under raw._connector.metadata. Useful for tagging the source environment (e.g. {"env": "staging"}).
  • API key scope: pk_live_… keys are tenant-scoped — one key works against any connector in your container. Rotate by creating a new key and revoking the old.
  • Deactivation stops new ingest but retains existing data.

What’s next

API reference

Full schema for the ingest endpoints.

Evidence types

The shape of typed first-party events.