
Collmbo


A Slack bot that lets you chat with 100+ LLMs via LiteLLM. Pronounced the same as "Colombo".

Quick Start

Collmbo supports multiple LLMs, but let's begin with OpenAI's gpt-4o model for a quick setup.

1. Create a Slack App

Create a Slack app and obtain the required tokens:

  • App-level token (xapp-1-...)
  • Bot token (xoxb-...)
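The app-level token implies the bot connects over Socket Mode rather than a public URL. As a rough sketch, a Slack app manifest for this kind of bot might look like the following — the exact scopes and events here are assumptions, so check the project's documentation for the authoritative list:

```yaml
# Hypothetical Slack app manifest sketch; scopes/events are assumptions.
display_information:
  name: Collmbo
features:
  bot_user:
    display_name: Collmbo
oauth_config:
  scopes:
    bot:
      - app_mentions:read   # receive @Collmbo mentions
      - chat:write          # post replies
      - im:history          # read DMs
settings:
  event_subscriptions:
    bot_events:
      - app_mention
      - message.im
  socket_mode_enabled: true  # required for the xapp-1-... app-level token
```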

2. Create a .env File

Save your credentials in a .env file:

SLACK_APP_TOKEN=xapp-1-...
SLACK_BOT_TOKEN=xoxb-...
LITELLM_MODEL=gpt-4o
OPENAI_API_KEY=sk-...

3. Run Collmbo Container

Start the bot using Docker:

docker run -it --env-file .env ghcr.io/iwamot/collmbo:latest

Note

To pin a versioned release, specify a tag like x.x.x instead of latest. See the list of available tags for details.
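If you prefer Docker Compose, the same invocation can be sketched as a minimal compose file (service name and restart policy are choices, not project requirements):

```yaml
# docker-compose.yml — minimal sketch, assuming the .env file from step 2
services:
  collmbo:
    image: ghcr.io/iwamot/collmbo:latest
    env_file: .env
    restart: unless-stopped   # restart automatically if the bot crashes
```

Then start it with `docker compose up -d`.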

4. Say Hello!

Mention the bot in Slack and start chatting:

@Collmbo hello!

Collmbo should respond in channels, threads, and DMs.

Want to Use a Different LLM?

First, pick your favorite LLM from LiteLLM supported providers.

To use it, update the relevant environment variables in your .env file and restart the container.

Here are some examples:

Gemini - Google AI Studio (Gemini 2.0 Flash)

SLACK_APP_TOKEN=xapp-1-...
SLACK_BOT_TOKEN=xoxb-...
LITELLM_MODEL=gemini/gemini-2.0-flash-001
GEMINI_API_KEY=...

Azure OpenAI (gpt-4o)

SLACK_APP_TOKEN=xapp-1-...
SLACK_BOT_TOKEN=xoxb-...
LITELLM_MODEL=azure/<your_deployment_name>

# Specify the model type to grab details like max input tokens
LITELLM_MODEL_TYPE=azure/gpt-4o

AZURE_API_KEY=...
AZURE_API_BASE=...
AZURE_API_VERSION=...

Amazon Bedrock (Claude Sonnet 4)

SLACK_APP_TOKEN=...
SLACK_BOT_TOKEN=...
LITELLM_MODEL=bedrock/us.anthropic.claude-sonnet-4-20250514-v1:0

# You can specify a Bedrock region if it's different from your default AWS region
AWS_REGION_NAME=us-west-2

# You can use your access key for authentication, but IAM roles are recommended
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...

Deployment

Collmbo serves no inbound endpoints; it only needs outbound internet access to reach Slack and your LLM provider, so it can run in almost any environment.
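For example, on a plain VM you could keep the container running with a systemd unit along these lines — the paths, container name, and unit name here are illustrative assumptions:

```ini
# /etc/systemd/system/collmbo.service — hedged sketch, paths are assumptions
[Unit]
Description=Collmbo Slack bot
After=network-online.target
Wants=network-online.target

[Service]
ExecStart=/usr/bin/docker run --rm --name collmbo --env-file /opt/collmbo/.env ghcr.io/iwamot/collmbo:latest
ExecStop=/usr/bin/docker stop collmbo
Restart=always

[Install]
WantedBy=multi-user.target
```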

Features

Configuration

Collmbo runs with default settings, but you can customize its behavior by setting optional environment variables.

Contributing

Contributions are welcome! Feel free to open an issue or submit a pull request.

Before opening a PR, please run:

./validate.sh

This helps maintain code quality.

Related Projects

License

The code in this repository is licensed under the MIT License.

The Collmbo icon (assets/icon.png) is licensed under CC BY-NC-SA 4.0. For example, you may use it as a Slack profile icon.
