How to Connect LlamaIndex with Private LLM API Deployments
by Peng Qian


When your enterprise doesn’t use public models like OpenAI


LlamaIndex is a great choice for building a RAG pipeline. Most of the tutorials you'll find, however, assume you have an OpenAI API key.

However, you might face these situations:

  • Your company can only use privately deployed models due to compliance.
  • You’re using a model fine-tuned by your data science team.
  • Legal restrictions prevent your company’s private data from leaving the country.
  • Other reasons that require using privately deployed models.

In these cases, you can't rely on OpenAI or other cloud providers' LLM services when building enterprise AI applications.

This leads to a frustrating first step: How do I connect my LlamaIndex code to my company’s private API service?

To save you time, if you just need the solution, install this integration package:

pip install -U llama-index-llms-openai-like

This will solve your problem.
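Once it's installed, connecting is straightforward. Here's a minimal sketch, assuming your private deployment exposes an OpenAI-compatible endpoint; the model name, base URL, and API key below are placeholders you'd replace with your own values:

from llama_index.llms.openai_like import OpenAILike

# Point LlamaIndex at a private, OpenAI-compatible endpoint.
# Model name, api_base, and api_key are placeholders for your deployment.
llm = OpenAILike(
    model="my-private-model",
    api_base="https://llm.my-company.internal/v1",
    api_key="your-internal-token",
    is_chat_model=True,  # set True if the endpoint serves chat completions
)

# Quick sanity check against the private endpoint.
print(llm.complete("Hello, who are you?"))

The resulting llm object behaves like any other LlamaIndex LLM, so you can pass it into your query engines or set it as the default model for the rest of your pipeline.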

If you want to understand why, let’s continue.
