
MatrixGPT

Chatbots for Matrix.

This bot supports OpenAI, Anthropic, and locally hosted models that expose an OpenAI-compatible endpoint. A single bot can serve multiple models, each behind its own trigger, such as !c4 for GPT-4 and !ca for Anthropic.

Vision (image input) is supported for OpenAI and Anthropic models.


Install

  1. Install requirements:
    sudo apt install libolm-dev gcc python3-dev
    pip install -r requirements.txt
    
  2. Copy config.sample.yaml to config.yaml and fill it out with the bot's Matrix authentication and your OpenAI and/or Anthropic API keys (a rough sketch of the layout follows this list).
  3. Start the bot with python3 main.py
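
config.sample.yaml is the authoritative reference for what goes in config.yaml. As a rough, hypothetical sketch (every key name below is illustrative, not taken from the sample file), the filled-in config covers the bot's Matrix login, your API keys, and the trigger-to-model mapping:

    # Hypothetical layout -- follow config.sample.yaml for the real key names
    matrix:
      homeserver: https://matrix.example.org
      username: "@matrixgpt:example.org"
      password: "bot-account-password"
    openai_api_key: sk-...
    anthropic_api_key: sk-ant-...
    triggers:
      "!c4": gpt-4
      "!ca": claude-3-opus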

Pantalaimon is required for the bot to be able to talk in encrypted rooms.
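
Pantalaimon's own documentation covers setup in detail. As a minimal sketch (hostnames and ports are placeholders), a pantalaimon.conf that proxies your homeserver on a local port might look like the following; the bot's homeserver setting would then point at http://127.0.0.1:8009 instead of the real homeserver:

    [local-matrix]
    # Real homeserver that Pantalaimon proxies and decrypts for
    Homeserver = https://matrix.example.org
    ListenAddress = 127.0.0.1
    ListenPort = 8009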

I included a sample systemd service (matrixgpt.service).
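
The bundled matrixgpt.service is the reference; a unit for a bot like this generally looks something like the sketch below (user and paths are placeholders):

    [Unit]
    Description=MatrixGPT Matrix bot
    After=network-online.target

    [Service]
    User=matrixgpt
    WorkingDirectory=/opt/MatrixGPT
    ExecStart=/usr/bin/python3 main.py
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target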

Use

First, invite your bot to a room. Then you can start a chat by prefixing your message with your trigger (for example, !c hello!). The bot will create a thread when it replies. You don't need to use the trigger in the thread.

Use !matrixgpt to view the bot's help. The bot also responds to !bots.


  • Don't try to use two bots in the same thread.
  • You can DM a bot for a private chat.
  • The bot will move its read marker whenever a message is sent in the room.

The bot can give helpful reactions:

  • 🚫 means permission denied (not allowed to chat with the bot).
  • 🕒 means the API timed out.
  • ❌ means the bot encountered an exception.
  • 🔐 means there was a decryption failure.

TODO

  • DALL-E bot
  • Fix the typing indicator being removed when two responses are being generated at the same time