Click on the icon to start the chatbot. The bot is based on Llama, which was fine-tuned for the ABAP domain.
Training such a model requires an enormous amount of GPU compute.
Even running the trained model in the cloud can cost $1,000 per month.
To ensure acceptable response times, the model has to stay in VRAM at all times.
This LLM runs on a cheap video card attached to commodity hardware, which is connected
via mobile internet without even a public IP address.
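The page does not say how the box is reached despite having no public IP; one common pattern for this is a reverse SSH tunnel through a small relay host that does have a public address. A minimal sketch, assuming a hypothetical relay at relay.example.com and the chatbot listening locally on port 8000 (both names and ports are assumptions, not from the source):

```shell
# Hypothetical sketch: make port 8080 on the relay host forward to the
# chatbot's local port 8000 on the GPU box behind mobile NAT.
# -N: no remote command, tunnel only; -R: reverse port forwarding.
ssh -N -R 8080:localhost:8000 tunnel@relay.example.com

# For an unattended setup, autossh re-establishes the tunnel when the
# mobile connection drops (-M 0 disables autossh's own monitor port
# in favour of SSH's keepalives):
autossh -M 0 -N -R 8080:localhost:8000 tunnel@relay.example.com
```

With this in place, clients talk to relay.example.com:8080 and never need to reach the GPU box directly; the box only makes outbound connections, which mobile carriers allow.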
If you are interested in how to train and run such a chatbot in your domain / company, drop
a mail to

