Chat bots are all the rage right now, and you need something to show your boss for your company’s next AI product, design decisions included. Model companies like OpenAI and Anthropic are generating lots of hype, infrastructure and chip companies like AWS, Microsoft, and NVIDIA are making money hand over fist, and energy providers around the world are struggling to keep up with demand.
You can make money too! So the next time someone asks you to make a chat bot, use this as a cheat sheet!
To be specific, I’ll show you the basics of how to build a back-end server in Python that calls OpenAI for an LLM response. I’ll show you how to wire up your chatbot with a simple HTTP POST, polling, WebSockets, or server-sent events.
Use this link if you don’t have Medium access: https://medium.com/@BryMei/how-to-write-a-fast-api-server-for-your-chat-bot-c30b25f9549d?sk=4b5280e054d0f9679f31a17e68bea120
Assumptions:
- You are using Python, since most machine learning libraries are written in Python
- You are using FastAPI for the back-end server
- You are using OpenAI for the LLM