Guide to the Praison Labs Code interface for interacting with your codebase using AI, including file management, model configuration, and advanced features
Praison Labs Code helps you interact with your entire codebase using the power of AI.
| Interface | Description | URL |
|---|---|---|
| UI | Multi Agents such as CrewAI or AG2 | https://docs.praisonlabs.com/ui/ui |
| Chat | Chat with 100+ LLMs, single AI Agent | https://docs.praisonlabs.com/ui/chat |
| Code | Chat with entire Codebase, single AI Agent | https://docs.praisonlabs.com/ui/code |
You will be asked for a username and password the first time you open the interface. The default username and password are both `admin`.
Set the model name to `gpt-4o-mini` in the settings.
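If you are using the default OpenAI-backed model, export your API key before launching; a minimal sketch, assuming the standard `OPENAI_API_KEY` environment variable (the key value is a placeholder):

```bash
# Standard OpenAI environment variable; replace with your real key
export OPENAI_API_KEY=xxxxxxxxx

# Launch the code interface
praisonai code
```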
To use a Gemini model instead:

1. Export your Gemini API key: `export GEMINI_API_KEY=xxxxxxxxx`
2. Start the code interface: `praisonai code`
3. Set the model name to `gemini/gemini-1.5-flash` in the settings
To exclude files from the context, use any of the following (`.praisonignore` is preferred):

- A `.praisonignore` file in the root folder of the project (see the sketch below)
- A `settings.yaml` file in the root folder of the project
- A `.env` file in the root folder of the project
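A minimal `.praisonignore` sketch; the entries are illustrative and assume one gitignore-style pattern per line:

```
.git
node_modules
dist
*.log
```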
To include additional files, add them to a `.praisoninclude` file in the root folder of the project. These are added to the original context (the files in the folder, minus anything excluded by `.gitignore` and `.praisonignore`).
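A `.praisoninclude` sketch; the paths are illustrative, assuming one file or folder per line:

```
docs/architecture.md
scripts/
```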
To send only specific files as context, add them to a `.praisoncontext` file in the root folder of the project; only the listed files are included in the context.
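A `.praisoncontext` sketch with illustrative paths; only the listed files would be sent as context:

```
src/main.py
src/utils/helpers.py
```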
Note: by default, the maximum number of tokens is set to 900,000.
The local database is stored at `~/.praison/database.sqlite`.
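To reset the stored data, you can delete this file; a fresh database should be created the next time the app starts:

```bash
rm ~/.praison/database.sqlite
```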
Praison Labs Code now includes internet search powered by Crawl4AI and Tavily, letting you retrieve up-to-date information and code snippets during your coding sessions.
To use this feature, make your Tavily API key available before starting a session; Crawl4AI itself does not require a key.
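A minimal setup sketch, assuming the conventional `TAVILY_API_KEY` environment variable is picked up at startup (check your settings if your deployment uses a different name):

```bash
# Tavily's standard environment variable; replace with your real key
export TAVILY_API_KEY=xxxxxxxxx

# Start the code interface as usual
praisonai code
```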
While primarily designed for code interactions, Praison Labs Code also supports Vision Language Model capabilities. This feature can be particularly useful when dealing with visual aspects of programming, such as UI design, data visualization, or understanding code structure through diagrams.
To use this feature, select a vision-capable model in the settings.
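For example, a multimodal model such as `gpt-4o` or `gemini/gemini-1.5-flash` (illustrative choices; any vision-capable model your API key covers will do) can be set as the model name in the settings, after which images attached to a message should be passed to the model along with your prompt.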
These new features significantly expand the capabilities of Praison Labs Code, allowing for more comprehensive and up-to-date coding assistance.
To facilitate local development with live reload, you can use Docker. Follow the steps below:
Create a `Dockerfile.dev`:
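A sketch of such a file, assuming the `praisonai[code]` package and a Python 3.11 base image; the exposed port is a placeholder, so match it to the port the app reports on startup:

```dockerfile
# Development image for running Praison Labs Code against this project
FROM python:3.11-slim

WORKDIR /app

# Install PraisonAI with the code extra
RUN pip install --no-cache-dir "praisonai[code]"

# Copy the project so the agent can read the codebase
COPY . .

# Placeholder port - align with the port printed in the startup logs
EXPOSE 8080

CMD ["praisonai", "code"]
```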
Create a `docker-compose.yml`:
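A matching compose sketch; the service name, port mapping, and mounted paths are assumptions to adapt to your project:

```yaml
services:
  praisonai-code:
    build:
      context: .
      dockerfile: Dockerfile.dev
    ports:
      - "8080:8080"     # match the EXPOSE port above
    volumes:
      - .:/app          # mount the source so changes are picked up (live reload)
    env_file:
      - .env            # API keys such as OPENAI_API_KEY or GEMINI_API_KEY
```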
Run Docker Compose:
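For example (use `docker-compose up --build` instead if you are on the older standalone Compose binary):

```bash
docker compose up --build
```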
This setup will allow you to develop locally with live reload, making it easier to test and iterate on your code.