Non-OpenAI LLM endpoint support

I saw that your .env already supports an OpenAI key, per the docs:

https://docs.opnform.com/configuration/environment-variables

Other Environment Variables

OPEN_AI_API_KEY: API key for accessing OpenAI services.
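
For reference, that key is set with a single .env line along these lines (the value is a placeholder, not a real key):

OPEN_AI_API_KEY=sk-your-key-here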

I use Ollama and vLLM as inference engines on my local network, and both offer OpenAI-compatible APIs.

Please allow admins to configure, via .env, an FQDN URL and/or IP address pointing to any other OpenAI-compatible endpoint.
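
To illustrate what "OpenAI-compatible" means here, below is a minimal sketch using the official openai Python SDK pointed at a self-hosted engine. It is not OpnForm code; the endpoint URL, port, model name, and the OPEN_AI_API_BASE_URL variable name are assumptions for illustration only.

# Minimal sketch, not OpnForm code: assumes the official "openai" Python SDK (v1+)
# and Ollama's OpenAI-compatible endpoint; the URL, port, and model name are
# illustrative and depend on the local setup (vLLM typically serves /v1 as well).
import os
from openai import OpenAI

client = OpenAI(
    # Local engines usually accept any non-empty key.
    api_key=os.environ.get("OPEN_AI_API_KEY", "ollama"),
    # Hypothetical override this request asks for, e.g. OPEN_AI_API_BASE_URL in .env.
    base_url=os.environ.get("OPEN_AI_API_BASE_URL", "http://localhost:11434/v1"),
)

response = client.chat.completions.create(
    model="llama3",  # whichever model the local engine has loaded
    messages=[{"role": "user", "content": "Hello from a self-hosted endpoint"}],
)
print(response.choices[0].message.content)

The same pattern is what the request boils down to on the server side: keep OPEN_AI_API_KEY as-is and let an optional base-URL setting redirect the existing OpenAI calls.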


Status: Rejected
Board: 💡 Feature Request
Date: About 2 months ago
Author: sdf
