How to write a Fast API server for your chat bot | by Bryson Meiling | Oct, 2024


Is it your brain or the AI that is keeping the world going? It's you; don't let the hype get to you…

Chatbots are all the rage right now, and you need something to show your boss for your company's next AI product, along with the design decisions behind it. Model companies like OpenAI and Anthropic are generating lots of hype, infrastructure and chip manufacturers like AWS, Microsoft, and NVIDIA are making money hand over fist, and energy providers across the world are struggling to keep up with demand.

You can make money too! So the next time someone asks you to make a chatbot, use this as a cheat sheet!

To be specific, I'll show you the basics of how to make a back-end server in Python that calls OpenAI for an LLM response. I'll show you how to build your chatbot with either a simple HTTP POST, polling, WebSockets, or server-sent events.
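Before diving in, here is a minimal sketch of the simplest of those options: a single blocking HTTP POST endpoint. It assumes the openai Python SDK (v1+) with an OPENAI_API_KEY set in your environment; the /chat route, the request model, and the gpt-4o-mini model name are illustrative choices of mine, not anything the stack requires.

```python
# A minimal sketch of the "simple HTTP POST" approach.
# Assumes: pip install fastapi uvicorn openai, and OPENAI_API_KEY in the environment.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # picks up OPENAI_API_KEY automatically


class ChatRequest(BaseModel):
    message: str


@app.post("/chat")
def chat(req: ChatRequest) -> dict:
    # One blocking round trip: send the user's message, wait for the full reply.
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder: any chat-capable model works here
        messages=[{"role": "user", "content": req.message}],
    )
    return {"reply": completion.choices[0].message.content}
```

Run it with `uvicorn main:app --reload` and POST `{"message": "hi"}` to `/chat`. The catch is that the client sits and waits for the entire completion, which is exactly what the other three approaches are meant to improve on (a sketch of the server-sent events variant follows the assumptions list below).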

Use this link if you don't have Medium access: https://medium.com/@BryMei/how-to-write-a-fast-api-server-for-your-chat-bot-c30b25f9549d?sk=4b5280e054d0f9679f31a17e68bea120

Assumptions:

  1. You are using Python, since most of the machine learning ecosystem is written in Python
  2. You are using FastAPI for the back-end server
  3. You are using OpenAI for the…
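As a preview of the server-sent events option, here is a hedged sketch of a streaming endpoint under the same assumptions. The route, the query parameter, and the model name are again illustrative; the essential pieces are FastAPI's StreamingResponse and the stream=True flag on the OpenAI call.

```python
# A sketch of the server-sent events (SSE) variant: forward tokens to the
# browser as OpenAI produces them, instead of waiting for the full reply.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from openai import OpenAI

app = FastAPI()
client = OpenAI()


@app.get("/chat/stream")
def chat_stream(message: str):
    def event_source():
        stream = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user", "content": message}],
            stream=True,  # yields chunks (token deltas) as they arrive
        )
        for chunk in stream:
            if chunk.choices and chunk.choices[0].delta.content:
                # Each SSE frame is a "data: ..." line followed by a blank line.
                yield f"data: {chunk.choices[0].delta.content}\n\n"
        yield "data: [DONE]\n\n"  # conventional end-of-stream marker

    return StreamingResponse(event_source(), media_type="text/event-stream")
```

On the front end, an EventSource pointed at /chat/stream receives each data: frame as a message event, so the reply can be rendered token by token instead of all at once.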
