For Security, IT Ops, and AI teams

Secure Company Document Access for OpenClaw

Your team's agents get way more useful when they can search your real company docs. ShaleLight keeps that access secure, private, and grounded in your team's actual knowledge.

Open source. On-prem ready. Built for private docs.

Why teams choose ShaleLight

Useful AI answers, with private company context.

01

Agent-Ready Knowledge Layer

Connect your docs once so OpenClaw can answer with the context your team actually uses.

  • Index internal files and docs fast
  • Give OpenClaw shared company context

02

Secure Company Document Access

Keep private docs private while still letting OpenClaw search what it needs.

  • Keep data inside trusted boundaries
  • Built for restricted and air-gapped networks

03

Source-Cited Answers

Give users verifiable responses with citations back to the original source documents.

  • Reduce hallucination risk in workflows
  • Make answers auditable by default

04

Role-Aware Access Control

Respect permission boundaries with access controls suited for security-sensitive environments.

  • Align with least-privilege practices
  • Support teams with strict permission requirements

How Search Works

Local pipeline. Company docs. Fast, grounded answers.

Ingest → Embed → Retrieve → Rank → Answer + Cite

Local Embedding Model

Builds vectors on your hardware so search matches meaning, not just keywords.
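As a minimal sketch of what "matching meaning" looks like under the hood, semantic search typically compares a query embedding to chunk embeddings with cosine similarity (the toy vectors below are illustrative, not real model output):

```python
import math

def cosine(a, b):
    """Cosine similarity: 1.0 for identical directions, 0.0 for orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Chunks whose embeddings point in a similar direction to the query
# embedding rank higher, regardless of exact keyword overlap.
```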

Local LLM

Writes answers from retrieved internal context on local infrastructure.

Local Entity Extraction

Maps names, terms, and aliases so queries match real company language.
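One common way to make queries match "real company language" is an alias table that normalizes shorthand before retrieval. A minimal sketch, with purely illustrative alias entries:

```python
# Illustrative alias map; a real deployment would build this from
# extracted entities and their observed variants.
ALIASES = {"k8s": "kubernetes", "pg": "postgresql", "infra": "infrastructure"}

def normalize_terms(query):
    """Rewrite known aliases to their canonical form before search."""
    return " ".join(ALIASES.get(w.lower(), w.lower()) for w in query.split())
```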

Hybrid Retrieval

Runs vector and lexical search together to balance recall and precision.
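A minimal sketch of how vector and lexical results can be blended: normalize each system's scores to a common range, then take a weighted sum per document (the weighting scheme and `alpha` are illustrative, not necessarily what ShaleLight uses):

```python
def hybrid_scores(vector_hits, lexical_hits, alpha=0.5):
    """Blend min-max-normalized vector and lexical scores per doc id."""
    def norm(hits):
        if not hits:
            return {}
        hi, lo = max(hits.values()), min(hits.values())
        span = (hi - lo) or 1.0
        return {d: (s - lo) / span for d, s in hits.items()}

    v, l = norm(vector_hits), norm(lexical_hits)
    docs = set(vector_hits) | set(lexical_hits)
    # alpha weights semantic recall against exact-term precision.
    return {d: alpha * v.get(d, 0.0) + (1 - alpha) * l.get(d, 0.0)
            for d in docs}
```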

Rank Fusion

Fuses scoring signals so top hits stay strong across query styles.
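One widely used fusion technique that fits this description is Reciprocal Rank Fusion (RRF), which combines ranked lists using only positions, so raw scores from different retrievers never need to be comparable. A sketch (the source does not specify RRF; this is one standard approach):

```python
def rrf(rankings, k=60):
    """Reciprocal Rank Fusion: each list contributes 1/(k + rank) per doc."""
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    # Documents ranked highly by multiple retrievers rise to the top.
    return sorted(scores, key=scores.get, reverse=True)
```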

Chunking + Metadata

Indexes chunked content with source metadata for cleaner answers and citations.
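A minimal sketch of overlapping character-window chunking with per-chunk source metadata (the window and overlap sizes are illustrative):

```python
def chunk(text, source, size=400, overlap=50):
    """Split text into overlapping chunks, each tagged with its source."""
    chunks, start = [], 0
    while start < len(text):
        end = min(start + size, len(text))
        chunks.append({"text": text[start:end], "source": source,
                       "start": start, "end": end})
        if end == len(text):
            break
        start = end - overlap  # overlap keeps context intact at boundaries
    return chunks
```

Carrying `source`, `start`, and `end` on every chunk is what lets an answer cite the exact document and span it came from.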

Local PostgreSQL + pgvector

Stores chunks and embeddings in local PostgreSQL with pgvector for fast semantic retrieval.
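A sketch of what such a schema and nearest-neighbor query might look like with pgvector. The table name, column names, and 768-dimension assumption are illustrative, not ShaleLight's actual schema; `<=>` is pgvector's cosine-distance operator:

```python
# Illustrative DDL for a chunk store with a pgvector HNSW index.
DDL = """
CREATE EXTENSION IF NOT EXISTS vector;
CREATE TABLE IF NOT EXISTS chunks (
    id        bigserial PRIMARY KEY,
    source    text NOT NULL,
    content   text NOT NULL,
    embedding vector(768)
);
CREATE INDEX IF NOT EXISTS chunks_embedding_idx
    ON chunks USING hnsw (embedding vector_cosine_ops);
"""

def nearest_chunks_sql(limit=5):
    """ORDER BY cosine distance returns the semantically closest chunks."""
    return ("SELECT source, content FROM chunks "
            "ORDER BY embedding <=> %(query_vec)s "
            f"LIMIT {limit}")
```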

Incremental Ingestion

Uses file hashes to skip unchanged files and reprocess only what is new or modified.

Citation-Grounded Output

Returns citations with every answer so users can trace each claim back to its source.

Deploy where your data already lives.

Turn OpenClaw into a company-aware teammate.

Recommended hardware: Mac Studio (M3 Ultra) with 96GB+ RAM for local LLM workloads.

Explore the Repository