# Connectors & Tool Providers

A connector is a Rust crate under `backend/sources/agentcy-source-*` that knows how to talk to one external system. Every connector can play two roles:
- Ingestion source — batch read data and write it into the knowledge graph.
- Tool provider — expose live methods the agent can call during chat (e.g. `aws.s3_list_objects`, `github.list_pulls`).
Most connectors implement both. The full catalog is at Connectors Overview.
## The two traits

```rust
// Ingestion: bulk ETL into the graph
#[async_trait]
pub trait IngestionSource {
    async fn validate_config(&self, config: &Value) -> Result<()>;
    async fn ingest(&self, cx: IngestContext) -> Result<IngestStats>;
}

// Live tools: what the agent can call mid-conversation
#[async_trait]
pub trait ConnectorToolProvider {
    fn tools(&self) -> &[ToolSpec];
    async fn execute_tool(&self, call: ToolCall) -> Result<ToolResult>;
}
```

`validate_config` runs at source creation and before every ingestion run — bad credentials fail fast with a typed error (`ConfigError::Credentials`, `ConfigError::Scope`) that the UI renders directly.
`ingest` yields node/edge batches over a stream. `agentcy-ingest::PipelineRunner` handles batching, transactions, retries, and event fan-out.
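A minimal, self-contained sketch of that fail-fast idea, with the async trait and `serde_json::Value` swapped for a synchronous function over a plain string map (the required keys and the exact error-variant shapes here are assumptions):

```rust
use std::collections::HashMap;

// Typed config errors, echoing the ConfigError variants mentioned above
// (payload shapes are assumed for this sketch).
#[derive(Debug, PartialEq)]
enum ConfigError {
    MissingField(String),
    Credentials(String),
}

// Simplified, synchronous stand-in for IngestionSource::validate_config:
// check required fields and reject unsupported auth kinds up front.
fn validate_config(config: &HashMap<String, String>) -> Result<(), ConfigError> {
    for key in ["auth_kind", "role_arn"] {
        if !config.contains_key(key) {
            return Err(ConfigError::MissingField(key.to_string()));
        }
    }
    if config.get("auth_kind").map(String::as_str) != Some("assume_role") {
        return Err(ConfigError::Credentials("unsupported auth kind".to_string()));
    }
    Ok(())
}

fn main() {
    let mut cfg = HashMap::new();
    cfg.insert("auth_kind".to_string(), "assume_role".to_string());
    cfg.insert("role_arn".to_string(), "arn:aws:iam::1234:role/agentcy".to_string());
    assert!(validate_config(&cfg).is_ok());
    assert!(validate_config(&HashMap::new()).is_err());
}
```

Because the error is a typed enum rather than a bare string, the UI can render each variant distinctly, which is what makes the fail-fast behavior useful in practice.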
## The registry

At boot, `agentcy-api::state::AppState` builds a `SourceRegistry` by instantiating every compiled-in connector:
```rust
let registry = SourceRegistry::new()
    .register(GitHubSource::new(cfg.clone()))
    .register(AwsSource::new(cfg.clone()))
    .register(KubernetesSource::new(cfg.clone()))
    // …
    ;
```

Which connectors compile in is controlled by Cargo features:
| Feature | Connectors |
|---|---|
| (always) | csv, json |
| `connectors-dev` | github, kubernetes, openapi, remote-exec, mcp, ciab, os, git, jenkins, figma, remotion |
| `connectors-cloud` | aws, gcp, vercel, supabase, localstack, firecrawl, websearch, elevenlabs, runway, slack, google-workspace, readai, hubspot, grafana |
| `connectors-data` | sql, mongodb, powerbi |
Disable what you don't need to shrink build time and binary size.
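To see how a feature flag keeps a connector out of the binary entirely, here is a toy sketch (the real `SourceRegistry` API differs; the struct shape here is assumed, only the feature name comes from the table above):

```rust
// Toy registry illustrating compile-time gating via Cargo features.
struct SourceRegistry {
    names: Vec<&'static str>,
}

impl SourceRegistry {
    fn new() -> Self {
        Self { names: vec!["csv", "json"] } // always compiled in
    }
    fn register(mut self, name: &'static str) -> Self {
        self.names.push(name);
        self
    }
}

fn build_registry() -> SourceRegistry {
    #[allow(unused_mut)]
    let mut registry = SourceRegistry::new();
    // Only compiled when the feature is enabled; otherwise this call
    // (and the connector crate behind it) never reaches the binary.
    #[cfg(feature = "connectors-cloud")]
    {
        registry = registry.register("aws");
    }
    registry
}

fn main() {
    // Built without connectors-cloud, only the always-on sources remain.
    assert!(build_registry().names.contains(&"csv"));
}
```

Gating at the `register` call site (rather than at runtime) is what lets the linker drop the whole connector crate and its dependencies.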
## Per-source configuration

Each connector declares a JSON schema for its config. The UI renders a form from it; the API validates against it. Example:
```json
{
  "name": "aws-prod",
  "connector": "aws",
  "realm": "infrastructure",
  "config": {
    "auth": { "kind": "assume_role", "role_arn": "arn:aws:iam::1234:role/agentcy" },
    "regions": ["us-east-1", "eu-west-2"],
    "services": ["ec2", "s3", "iam", "cloudwatch"]
  }
}
```

Secrets are never returned from `GET /sources/:id` — only a masked form. Rotation is atomic (`PATCH /sources/:id` with a new `config.auth`).
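The masking step can be sketched as follows (the exact policy shown, keep only the last four characters, is an assumption; the source only says a masked form is returned):

```rust
// Hypothetical masking helper for GET /sources/:id responses: hide all
// but the last four characters of a secret. ASCII secrets assumed, so
// byte-index slicing is safe here.
fn mask_secret(secret: &str) -> String {
    let visible = secret.len().min(4);
    let start = secret.len() - visible;
    format!("{}{}", "•".repeat(start), &secret[start..])
}

fn main() {
    let masked = mask_secret("arn-secret-token-9f3a");
    assert!(masked.ends_with("9f3a"));
    assert!(!masked.contains("secret"));
}
```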
## How tools reach the agent

Dumping every tool from every connector into the LLM prompt would be wasteful (and wrong — you'd leak `aws.*` tools to a chat about design). Instead the agent sees just four catalog meta-tools:
```text
list_connectors()                              # what sources are enabled here
search_connector_tools(query)                  # semantic search over ToolSpec descriptions
execute_connector_tool(connector, tool, args)  # call it
request_connector_access(connector)            # ask an admin to enable it
```

The catalog is realm- and policy-aware: a chat in realm `crm` sees only `connector_enabled AND realm_match` tools; policies can further filter.
See Agent Loop and How-To: Tool Calling.
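The search step can be sketched with plain keyword matching (the real catalog does semantic search over `ToolSpec` descriptions; the struct shape and field names here are assumed):

```rust
// Assumed minimal shape of a tool spec for this sketch.
struct ToolSpec {
    name: &'static str,
    description: &'static str,
}

// Toy stand-in for search_connector_tools: case-insensitive substring
// match over descriptions, returning matching tool names.
fn search_connector_tools(query: &str, tools: &[ToolSpec]) -> Vec<&'static str> {
    let q = query.to_lowercase();
    tools
        .iter()
        .filter(|t| t.description.to_lowercase().contains(&q))
        .map(|t| t.name)
        .collect()
}

fn main() {
    let tools = [
        ToolSpec { name: "aws.s3_list_objects", description: "List objects in an S3 bucket" },
        ToolSpec { name: "github.list_pulls", description: "List pull requests in a repository" },
    ];
    assert_eq!(search_connector_tools("bucket", &tools), ["aws.s3_list_objects"]);
    assert!(search_connector_tools("figma", &tools).is_empty());
}
```

The point of the indirection is visible even in the toy version: the LLM prompt only ever carries four meta-tools, and the (potentially large) tool catalog stays server-side.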
## Auth patterns across connectors
| Pattern | Examples |
|---|---|
| Static API key | Vercel, Supabase service key, HubSpot, Firecrawl, OpenAI |
| Personal access token | GitHub PAT, Jenkins |
| OAuth | GitHub OAuth, Google Workspace, Read.ai, Slack |
| GitHub App | github (preferred for orgs) |
| AssumeRole | AWS |
| Service account | GCP (JSON key), Kubernetes (in-cluster), Supabase (service role) |
| mTLS / kubeconfig | Kubernetes |
OAuth callbacks land on public routes at `/api/v1/sources/oauth/:provider` and are associated with the initiating source by state token. See `backend/crates/agentcy-api/src/routes/sources.rs::public_router`.
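The state-token association can be sketched as a single-use lookup (storage shape and names here are assumptions; only the state-token mechanism comes from the text above):

```rust
use std::collections::HashMap;

// Hypothetical pending-OAuth store: state token -> initiating source id.
struct OauthStates {
    pending: HashMap<String, u64>,
}

impl OauthStates {
    fn new() -> Self {
        Self { pending: HashMap::new() }
    }
    // Called when the OAuth flow starts for a source.
    fn begin(&mut self, state: &str, source_id: u64) {
        self.pending.insert(state.to_string(), source_id);
    }
    // Called on the public callback route: consume the token exactly once.
    fn complete(&mut self, state: &str) -> Option<u64> {
        self.pending.remove(state)
    }
}

// Demonstrate that a token resolves once and is then invalid (replay-safe).
fn roundtrip(state: &str, id: u64) -> (Option<u64>, Option<u64>) {
    let mut states = OauthStates::new();
    states.begin(state, id);
    (states.complete(state), states.complete(state))
}

fn main() {
    assert_eq!(roundtrip("tok123", 42), (Some(42), None));
}
```

Consuming the token on first use is the usual guard against callback replay; an unknown or reused token maps to no source and the callback is rejected.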
## Rate limits and back-pressure

Most connectors define a rate limiter per auth principal. `agentcy-ingest` respects these via a token bucket; per-user tool calls from chat respect them too. When the limit is hit:
- Ingestion: pauses and resumes after the cooldown.
- Tool call: returns `error.code = "rate_limited"` with `retry_after_ms`; the agent loop surfaces it to the LLM so it can retry or back off.
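A minimal token-bucket sketch of that behavior (the struct shape is an assumption; the real limiter in `agentcy-ingest` is keyed per auth principal):

```rust
use std::time::Instant;

struct TokenBucket {
    capacity: f64,
    tokens: f64,
    rate: f64, // tokens refilled per second
    last: Instant,
}

impl TokenBucket {
    fn new(capacity: f64, rate: f64) -> Self {
        Self { capacity, tokens: capacity, rate, last: Instant::now() }
    }

    /// Try to consume one token; on failure, return a cooldown in ms,
    /// mirroring the `retry_after_ms` hint that tool calls surface.
    fn try_acquire(&mut self) -> Result<(), u64> {
        let now = Instant::now();
        let refill = now.duration_since(self.last).as_secs_f64() * self.rate;
        self.tokens = (self.tokens + refill).min(self.capacity);
        self.last = now;
        if self.tokens >= 1.0 {
            self.tokens -= 1.0;
            Ok(())
        } else {
            Err((((1.0 - self.tokens) / self.rate) * 1000.0) as u64)
        }
    }
}

/// Drain a fresh bucket: how many calls succeed before limiting kicks in.
fn burst_size(capacity: f64, rate: f64) -> u32 {
    let mut bucket = TokenBucket::new(capacity, rate);
    let mut n = 0;
    while bucket.try_acquire().is_ok() {
        n += 1;
        if n > 1_000 { break; } // safety bound for the sketch
    }
    n
}

fn main() {
    // A bucket with capacity 2 allows a burst of 2, then rate-limits.
    assert_eq!(burst_size(2.0, 0.001), 2);
}
```

The `Err` payload is what makes the two behaviors above possible: ingestion can sleep for exactly the cooldown, and a tool call can pass the same number through as `retry_after_ms`.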
## Writing a custom connector

Full guide: Custom Connectors. Short version:
- `cargo new --lib backend/sources/agentcy-source-mything`.
- Depend on `agentcy-core`, `agentcy-ingest`, `async-trait`.
- Implement `IngestionSource` and/or `ConnectorToolProvider`.
- Add to the workspace `Cargo.toml` and register in `state.rs`.
- Ship a config schema (`config_schema()`) and a human-readable `display_name()`.
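As a heavily simplified, synchronous skeleton (the real traits are async and live in `agentcy-core`; everything here besides the `mything` naming is condensed for illustration):

```rust
// Condensed stand-in for the provider trait, with display_name folded in.
trait ConnectorToolProvider {
    fn tools(&self) -> Vec<&'static str>;
    fn display_name(&self) -> &'static str;
}

// The new connector's entry point (would live in agentcy-source-mything).
struct MyThingSource;

impl ConnectorToolProvider for MyThingSource {
    fn tools(&self) -> Vec<&'static str> {
        // Tool names are namespaced by connector, e.g. "mything.*".
        vec!["mything.list_items"]
    }
    fn display_name(&self) -> &'static str {
        "My Thing"
    }
}

fn main() {
    let src = MyThingSource;
    assert_eq!(src.display_name(), "My Thing");
    assert_eq!(src.tools(), vec!["mything.list_items"]);
}
```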
## Next
- Connectors Overview — the 26 built-in sources with a capability matrix.
- Custom Connectors — write your own.
- Agent Loop — how tools get called during chat.