The ABI Library is the core implementation of ABI's concepts, designed to build a unified AI system. It provides the fundamental building blocks for connecting, processing, and utilizing data across different AI components.
Integrations provide standardized connections to third-party services and data sources (a sketch follows this list). They handle:
- Authentication and authorization
- API communication
- Data format standardization
- Error handling and retries
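To make this concrete, here is a minimal sketch of what an integration might look like. The class name, endpoint, environment variable, and retry policy are hypothetical illustrations, not ABI's actual API; the point is to show authentication, API communication, JSON standardization, and retry handling in one small unit.

```python
import os
import time

import requests


class GitHubIntegration:
    """Hypothetical integration: authenticated access to the GitHub REST API
    with basic retry handling. Illustrative only, not ABI's actual class."""

    BASE_URL = "https://api.github.com"

    def __init__(self, token: str, max_retries: int = 3):
        self.session = requests.Session()
        # Authentication: attach the token to every request.
        self.session.headers.update({"Authorization": f"Bearer {token}"})
        self.max_retries = max_retries

    def get(self, path: str, **params):
        """API communication with simple exponential-backoff retries."""
        for attempt in range(self.max_retries):
            response = self.session.get(f"{self.BASE_URL}{path}", params=params)
            if response.status_code < 500:
                response.raise_for_status()  # surface auth/client errors
                # Data format standardization: always return parsed JSON.
                return response.json()
            time.sleep(2 ** attempt)  # back off before retrying a server error
        raise RuntimeError(f"GET {path} failed after {self.max_retries} attempts")


integration = GitHubIntegration(token=os.environ["GITHUB_TOKEN"])
rate_limit = integration.get("/rate_limit")  # any GET endpoint works
```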
Pipelines are responsible for ingesting data and transforming it into the ontological layer (sketched in code after the list). They:
- Utilize integrations to fetch data
- Transform raw data into semantic representations
- Maintain data consistency and quality
- Map external data models to ABI's ontology
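In that spirit, a pipeline might use an integration like the hypothetical one above to pull raw records and emit semantic triples. The sketch below uses rdflib with a placeholder namespace; the endpoint, field names, and ontology terms are assumptions for illustration, not ABI's actual ontology.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

ABI = Namespace("http://example.org/abi/")  # placeholder, not ABI's real namespace


def run_pipeline(integration) -> Graph:
    """Hypothetical pipeline: fetch raw data through an integration and map it
    onto semantic triples. Assumes the integration returns a list of records
    shaped like {"id": ..., "name": ...}."""
    graph = Graph()
    graph.bind("abi", ABI)
    for record in integration.get("/users"):  # utilize the integration
        subject = ABI[f"person/{record['id']}"]
        graph.add((subject, RDF.type, ABI.Person))  # map onto the ontology
        graph.add((subject, ABI.name, Literal(record["name"])))
    return graph
```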
Workflows leverage the ontological layer to implement business logic and provide data to consumers (see the sketch below the list). They can be used by:
- Large Language Models (LLMs)
- Remote APIs and services
- Other automated processes
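Continuing the sketch, a workflow could be a plain function over the graph that returns simple structures ready for an LLM tool call, a REST endpoint, or a scheduled job. The SPARQL query matches the hypothetical ontology terms from the pipeline sketch above.

```python
from rdflib import Graph


def list_people(graph: Graph) -> list[dict]:
    """Hypothetical workflow: business logic over the ontological layer,
    returned as plain dictionaries for LLMs, APIs, or other processes."""
    results = graph.query("""
        PREFIX abi: <http://example.org/abi/>
        SELECT ?name WHERE { ?person a abi:Person ; abi:name ?name . }
    """)
    return [{"name": str(row.name)} for row in results]
```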
Services form the foundational layer of ABI, implementing the Hexagonal Architecture (Ports & Adapters) pattern to provide flexible and system-agnostic interfaces. This architectural approach allows ABI to seamlessly integrate with existing systems while maintaining clean separation of concerns.
Each service defines a primary port (an interface) that specifies its capabilities, while multiple secondary adapters implement this interface for different backend systems (illustrated after the list). This means you can:
- Easily swap implementations without changing business logic
- Add new integrations by implementing new adapters
- Test components in isolation using mock adapters
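In code, a primary port is just an abstract interface, and adapters are interchangeable implementations. The sketch below anticipates the Secret Service example that follows; the class names are illustrative, not ABI's actual classes.

```python
from abc import ABC, abstractmethod


class SecretPort(ABC):
    """Hypothetical primary port: what any secret backend must be able to do."""

    @abstractmethod
    def get_secret(self, key: str) -> str:
        """Return the secret value stored under `key`."""


class MockSecretAdapter(SecretPort):
    """In-memory adapter, useful for testing components in isolation."""

    def __init__(self, secrets: dict[str, str]):
        self._secrets = secrets

    def get_secret(self, key: str) -> str:
        return self._secrets[key]
```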
For example, the Secret Service could connect to various backend systems through different adapters, two of which are sketched after this list:
- HashiCorp Vault
- AWS Secrets Manager
- Azure Key Vault
- Environment Variables
- Local File System
- Google Cloud Secret Manager
- Kubernetes Secrets
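Continuing the hypothetical `SecretPort` from the sketch above, two of these backends might look like the following; swapping one for the other changes a single line, while all business logic keeps talking to the same interface.

```python
import os


class EnvVarSecretAdapter(SecretPort):
    """Secondary adapter backed by environment variables (illustrative)."""

    def get_secret(self, key: str) -> str:
        return os.environ[key]


class LocalFileSecretAdapter(SecretPort):
    """Secondary adapter reading KEY=VALUE lines from a local file
    (illustrative; a real adapter would add encryption and error handling)."""

    def __init__(self, path: str):
        with open(path) as f:
            self._secrets = dict(
                line.strip().split("=", 1) for line in f if "=" in line
            )

    def get_secret(self, key: str) -> str:
        return self._secrets[key]


# Swapping backends is a one-line change:
secrets: SecretPort = EnvVarSecretAdapter()
token = secrets.get_secret("GITHUB_TOKEN")  # key name assumed for illustration
```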
This modular approach ensures that ABI can be deployed in any environment while maintaining consistent interfaces and behavior across different infrastructure choices.