OpenRouter LLM still asks for OpenAI key

I’m exploring ways of using Answerly without OpenAI, for example by using Mistral 7B instead. The thing is, when I select OpenRouter as the LLM it asks me for two API keys, one of them being OpenAI. My question is: can we use these free models on their own to both train our chatbot and answer customers? If so, could you please tell us how to do it? This would make for an amazing and more sustainable solution for all of us.

Hey @Santofer,

The OpenAI API key is still required because OpenRouter doesn’t provide some of the additional API functionality we rely on, such as embeddings.

However, embeddings are a very cost-effective resource. We are even considering removing them from our users’ costs altogether.
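
For anyone curious, here’s a minimal sketch of why the two keys end up split this way: chat completions can be routed through OpenRouter (to reach models like Mistral 7B), while embeddings still have to be generated against OpenAI directly, since OpenRouter doesn’t offer an embeddings endpoint. This is just an illustration of the general pattern using the `openai` Python client with placeholder keys, not Answerly’s actual implementation.

```python
# Illustrative sketch only: chat goes through OpenRouter,
# embeddings still go through OpenAI.
from openai import OpenAI

# Chat completions via OpenRouter (e.g. Mistral 7B)
chat_client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="OPENROUTER_API_KEY",  # placeholder
)
reply = chat_client.chat.completions.create(
    model="mistralai/mistral-7b-instruct",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(reply.choices[0].message.content)

# Embeddings via OpenAI -- the part OpenRouter can't replace
embed_client = OpenAI(api_key="OPENAI_API_KEY")  # placeholder
vectors = embed_client.embeddings.create(
    model="text-embedding-3-small",
    input="Text from your knowledge base",
)
print(len(vectors.data[0].embedding))
```

The embeddings are what the “training” step relies on (indexing your documents so the bot can retrieve them later), which is why switching the chat provider alone doesn’t remove the need for that second key.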