Administration
LLMs
Coming Soon
Babbage Insight supports the LLM service of your choice:
- **Third-Party Providers**: Support for prominent hosted LLM providers such as Anthropic and OpenAI.
- **Your Hosted LLMs**: For example, we can route all requests through AWS Bedrock in your own cloud account.
- **On-Premise LLMs**: We can provide Docker images of open-weight LLMs (e.g. Llama 3) served by a popular runtime such as Ollama; see the sketch after this list.
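
As a rough illustration of what provider flexibility looks like in practice (this is not the Babbage Insight API, just a hedged sketch): both hosted providers and a local Ollama instance expose an OpenAI-compatible chat endpoint, so the same client code can be pointed at either backend by changing the base URL and model name.

```python
# Hypothetical sketch only: it assumes an OpenAI-compatible endpoint,
# which hosted providers and a local Ollama server both expose.
from openai import OpenAI

# Hosted provider: use the provider's default endpoint and your API key.
hosted = OpenAI(api_key="sk-...")  # placeholder key

# On-premise: Ollama serves an OpenAI-compatible API on localhost:11434/v1.
# The model must already be pulled, e.g. with `ollama pull llama3`.
local = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = local.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Summarise last quarter's incidents."}],
)
print(response.choices[0].message.content)
```

The same pattern extends to routing through a gateway such as AWS Bedrock, where the request stays inside your own cloud account rather than going to a third-party endpoint.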