
Connectors & Tool Providers

A connector is a Rust crate under backend/sources/agentcy-source-* that knows how to talk to one external system. Every connector can play two roles:

  1. Ingestion source — batch read data and write it into the knowledge graph.
  2. Tool provider — expose live methods the agent can call during chat (e.g. aws.s3_list_objects, github.list_pulls).

Most connectors implement both. The full catalog is at Connectors Overview.

The two traits

```rust
// Ingestion: bulk ETL into the graph
#[async_trait]
pub trait IngestionSource {
    async fn validate_config(&self, config: &Value) -> Result<()>;
    async fn ingest(&self, cx: IngestContext) -> Result<IngestStats>;
}

// Live tools: what the agent can call mid-conversation
#[async_trait]
pub trait ConnectorToolProvider {
    fn tools(&self) -> &[ToolSpec];
    async fn execute_tool(&self, call: ToolCall) -> Result<ToolResult>;
}
```

validate_config runs at source creation and before every ingestion run — bad credentials fail fast with a typed error (ConfigError::Credentials, ConfigError::Scope) that the UI renders directly.
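As a rough illustration, the typed errors could be shaped like the sketch below. This is an assumption: the real `ConfigError` lives in the backend crates and its exact variants and messages are not shown in this doc.

```rust
use std::fmt;

// Hypothetical sketch of the typed config errors the UI renders.
// Variant names come from this doc; payloads and messages are invented.
#[derive(Debug)]
pub enum ConfigError {
    /// Credentials were rejected by the upstream system.
    Credentials(String),
    /// Credentials are valid but lack a required scope or permission.
    Scope(String),
}

impl fmt::Display for ConfigError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            ConfigError::Credentials(msg) => write!(f, "invalid credentials: {msg}"),
            ConfigError::Scope(msg) => write!(f, "missing scope: {msg}"),
        }
    }
}

fn main() {
    let err = ConfigError::Scope("s3:ListBucket".into());
    println!("{err}");
}
```

A dedicated enum (rather than a stringly-typed error) is what lets the UI render each failure mode distinctly.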

ingest yields node/edge batches over a stream. agentcy-ingest::PipelineRunner handles batching, transactions, retries, and event fan-out.
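The batching half of that can be sketched with a simplified, synchronous chunker. The real `agentcy-ingest::PipelineRunner` is async and also wraps each batch in a transaction with retries and event fan-out; none of that is shown here.

```rust
// Simplified, synchronous sketch of how a pipeline runner might chunk
// ingested items into fixed-size batches before committing each one.
fn batch<T>(items: impl Iterator<Item = T>, size: usize) -> Vec<Vec<T>> {
    let mut batches: Vec<Vec<T>> = Vec::new();
    let mut current: Vec<T> = Vec::with_capacity(size);
    for item in items {
        current.push(item);
        if current.len() == size {
            batches.push(std::mem::take(&mut current));
        }
    }
    if !current.is_empty() {
        batches.push(current); // final partial batch
    }
    batches
}

fn main() {
    let batches = batch(0..7, 3);
    assert_eq!(batches, vec![vec![0, 1, 2], vec![3, 4, 5], vec![6]]);
}
```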

The registry

At boot, agentcy-api::state::AppState builds a SourceRegistry by instantiating every compiled-in connector:

```rust
let registry = SourceRegistry::new()
    .register(GitHubSource::new(cfg.clone()))
    .register(AwsSource::new(cfg.clone()))
    .register(KubernetesSource::new(cfg.clone()))
    // …
    ;
```
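Internally, a registry like this is essentially a name-to-instance map. The sketch below is an assumption about its shape (the real `SourceRegistry` in agentcy-api holds richer trait objects and config), but it shows why the chained `register` style works.

```rust
use std::collections::HashMap;

// Minimal sketch of a connector registry: a map from connector name to
// instance. Trait and struct bodies here are illustrative only.
trait Connector {
    fn name(&self) -> &'static str;
}

#[derive(Default)]
struct SourceRegistry {
    connectors: HashMap<&'static str, Box<dyn Connector>>,
}

impl SourceRegistry {
    // Consumes and returns self, which is what enables the builder-style
    // chaining shown above.
    fn register(mut self, c: impl Connector + 'static) -> Self {
        self.connectors.insert(c.name(), Box::new(c));
        self
    }
    fn get(&self, name: &str) -> Option<&dyn Connector> {
        self.connectors.get(name).map(|b| b.as_ref())
    }
}

struct GitHubSource;
impl Connector for GitHubSource {
    fn name(&self) -> &'static str { "github" }
}

fn main() {
    let registry = SourceRegistry::default().register(GitHubSource);
    assert!(registry.get("github").is_some());
    assert!(registry.get("aws").is_none());
}
```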

Which connectors compile in is controlled by Cargo features:

| Feature | Connectors |
| --- | --- |
| (always) | csv, json |
| connectors-dev | github, kubernetes, openapi, remote-exec, mcp, ciab, os, git, jenkins, figma, remotion |
| connectors-cloud | aws, gcp, vercel, supabase, localstack, firecrawl, websearch, elevenlabs, runway, slack, google-workspace, readai, hubspot, grafana |
| connectors-data | sql, mongodb, powerbi |

Disable what you don't need to shrink build time and binary size.
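Assuming the features hang off a single crate that downstream builds depend on (an assumption; this doc does not show the Cargo layout), trimming to just the dev and data sets might look like:

```toml
[dependencies]
# Keep only the dev and data connector sets; the cloud set is compiled out.
# Whether csv/json ride along as default features is an assumption here.
agentcy-api = { path = "backend/crates/agentcy-api", default-features = false, features = [
    "connectors-dev",
    "connectors-data",
] }
```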

Per-source configuration

Each connector declares a JSON schema for its config. The UI renders a form from it; the API validates against it. Example:

```json
{
  "name": "aws-prod",
  "connector": "aws",
  "realm": "infrastructure",
  "config": {
    "auth": { "kind": "assume_role", "role_arn": "arn:aws:iam::1234:role/agentcy" },
    "regions": ["us-east-1", "eu-west-2"],
    "services": ["ec2", "s3", "iam", "cloudwatch"]
  }
}
```

Secrets are never returned from GET /sources/:id — only a masked form. Rotation is atomic (PATCH /sources/:id with new config.auth).
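The doc does not specify the masking rule, so here is one plausible sketch (assumption: keep the last four characters visible, applicable to ASCII tokens):

```rust
// Hypothetical masking applied before a source config is returned from
// GET /sources/:id: hide everything but the last four characters.
// Byte slicing is fine here because API tokens are ASCII.
fn mask_secret(secret: &str) -> String {
    let visible = 4.min(secret.len());
    let hidden = secret.len() - visible;
    let tail = &secret[secret.len() - visible..];
    format!("{}{}", "*".repeat(hidden), tail)
}

fn main() {
    assert_eq!(mask_secret("sk_live_abcd1234"), "************1234");
}
```

Keeping a short visible tail lets operators confirm which credential is loaded without ever exposing the full value.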

How tools reach the agent

Dumping every tool from every connector into the LLM prompt would be wasteful (and wrong — you'd leak aws.* tools to a chat about design). Instead the agent sees just four catalog meta-tools:

```
list_connectors()                                     # what sources are enabled here
search_connector_tools(query)                         # semantic search over ToolSpec descriptions
execute_connector_tool(connector, tool, args)         # call it
request_connector_access(connector)                   # ask an admin to enable it
```

The catalog is realm- and policy-aware: a chat in realm crm sees only connector_enabled AND realm_match tools; policies can further filter.
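search_connector_tools is described as semantic search; as a stand-in, the sketch below ranks ToolSpec descriptions by plain keyword overlap. The real implementation presumably uses embeddings, so treat this purely as an illustration of the catalog lookup shape.

```rust
// Toy ranking of tool specs by keyword overlap with the query.
struct ToolSpec {
    name: &'static str,
    description: &'static str,
}

fn search(tools: &[ToolSpec], query: &str) -> Vec<&'static str> {
    let terms: Vec<String> = query.split_whitespace().map(|t| t.to_lowercase()).collect();
    let mut scored: Vec<(usize, &'static str)> = tools
        .iter()
        .map(|t| {
            let desc = t.description.to_lowercase();
            // Score = number of query terms that appear in the description.
            let score = terms.iter().filter(|term| desc.contains(term.as_str())).count();
            (score, t.name)
        })
        .filter(|(score, _)| *score > 0)
        .collect();
    scored.sort_by(|a, b| b.0.cmp(&a.0)); // best match first
    scored.into_iter().map(|(_, name)| name).collect()
}

fn main() {
    let tools = [
        ToolSpec { name: "aws.s3_list_objects", description: "List objects in an S3 bucket" },
        ToolSpec { name: "github.list_pulls", description: "List open pull requests" },
    ];
    assert_eq!(search(&tools, "s3 bucket"), vec!["aws.s3_list_objects"]);
}
```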

See Agent Loop and How-To: Tool Calling.

Auth patterns across connectors

| Pattern | Examples |
| --- | --- |
| Static API key | Vercel, Supabase service key, HubSpot, Firecrawl, OpenAI |
| Personal access token | GitHub PAT, Jenkins |
| OAuth | GitHub OAuth, Google Workspace, Read.ai, Slack |
| GitHub App | github (preferred for orgs) |
| AssumeRole | AWS |
| Service account | GCP (JSON key), Kubernetes (in-cluster), Supabase (service role) |
| mTLS / kubeconfig | Kubernetes |

OAuth callbacks land on public routes at /api/v1/sources/oauth/:provider and associate with the initiating source by state token. See backend/crates/agentcy-api/src/routes/sources.rs::public_router.
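The state-token association can be sketched as a one-shot lookup table. This is an assumption about the mechanism (the real store lives behind sources.rs and is presumably persistent), but it shows the two halves of the flow.

```rust
use std::collections::HashMap;

// Sketch of associating an OAuth callback with its initiating source via
// the `state` token: record state -> source_id when the flow starts, then
// consume the entry when the provider redirects back.
#[derive(Default)]
struct OauthStateStore {
    pending: HashMap<String, u64>, // state token -> source id
}

impl OauthStateStore {
    fn begin(&mut self, state: &str, source_id: u64) {
        self.pending.insert(state.to_string(), source_id);
    }
    /// One-shot lookup: the token is removed so it cannot be replayed.
    fn complete(&mut self, state: &str) -> Option<u64> {
        self.pending.remove(state)
    }
}

fn main() {
    let mut store = OauthStateStore::default();
    store.begin("abc123", 42);
    assert_eq!(store.complete("abc123"), Some(42));
    assert_eq!(store.complete("abc123"), None); // replay fails
}
```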

Rate limits and back-pressure

Most connectors define a rate limiter per auth principal. agentcy-ingest enforces these limits with a token bucket, and per-user tool calls from chat are subject to them as well. When a limit is hit:

  • Ingestion: pauses and resumes after the cooldown.
  • Tool call: returns error.code = "rate_limited" with retry_after_ms; the agent loop surfaces it to the LLM so it can retry or back off.
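A token bucket with a retry_after hint can be sketched as follows. To keep the example deterministic, elapsed time is passed in explicitly rather than read from a clock; a real limiter would use Instant.

```rust
// Deterministic token-bucket sketch with a retry_after_ms hint on failure.
struct TokenBucket {
    capacity: f64,
    tokens: f64,
    refill_per_sec: f64,
}

impl TokenBucket {
    fn new(capacity: f64, refill_per_sec: f64) -> Self {
        Self { capacity, tokens: capacity, refill_per_sec }
    }

    /// Try to take one token after `elapsed_secs` of refill.
    /// On failure, returns the suggested retry_after in milliseconds.
    fn try_acquire(&mut self, elapsed_secs: f64) -> Result<(), u64> {
        self.tokens = (self.tokens + elapsed_secs * self.refill_per_sec).min(self.capacity);
        if self.tokens >= 1.0 {
            self.tokens -= 1.0;
            Ok(())
        } else {
            let missing = 1.0 - self.tokens;
            Err((missing / self.refill_per_sec * 1000.0).ceil() as u64)
        }
    }
}

fn main() {
    let mut bucket = TokenBucket::new(2.0, 1.0); // burst of 2, 1 token/sec
    assert!(bucket.try_acquire(0.0).is_ok());
    assert!(bucket.try_acquire(0.0).is_ok());
    assert_eq!(bucket.try_acquire(0.0), Err(1000)); // empty: retry in 1s
    assert!(bucket.try_acquire(1.0).is_ok());       // one token refilled
}
```

The Err payload maps directly onto the retry_after_ms field the agent loop surfaces to the LLM.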

Writing a custom connector

Full guide: Custom Connectors. Short version:

  1. cargo new --lib backend/sources/agentcy-source-mything.
  2. Depend on agentcy-core, agentcy-ingest, async-trait.
  3. Implement IngestionSource and/or ConnectorToolProvider.
  4. Add to the workspace Cargo.toml and register in state.rs.
  5. Ship a config schema (config_schema()) and a human-readable display_name().
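A skeleton for step 3 onward might look like the sketch below, with the traits collapsed to a synchronous form so it is self-contained (the real ones use async_trait, as shown earlier). MyThingSource and the schema literal are illustrative only.

```rust
// Skeleton for a hypothetical agentcy-source-mything crate, simplified
// to synchronous traits. Names and the schema string are invented.
struct MyThingSource;

trait ConnectorMeta {
    fn display_name(&self) -> &'static str;
    fn config_schema(&self) -> &'static str; // JSON schema as a string
}

impl ConnectorMeta for MyThingSource {
    fn display_name(&self) -> &'static str {
        "My Thing"
    }
    fn config_schema(&self) -> &'static str {
        // The UI renders its config form from this schema.
        r#"{"type":"object","required":["api_key"],"properties":{"api_key":{"type":"string"}}}"#
    }
}

fn main() {
    let source = MyThingSource;
    assert_eq!(source.display_name(), "My Thing");
    assert!(source.config_schema().contains("api_key"));
}
```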


Built by AgentcyLabs. For in-house deployment or Agentcy Cloud (PaaS) access, visit agentcylabs.com.