Quick and Easy API Integration to ChatGPT and other Large Language Model providers.
We’ve recently been doing some internal process improvement work and, just like every man and his dog at the moment, we identified a need to enhance some of our internal automation by integrating with a Large Language Model (LLM) provider such as ChatGPT.
Fortunately, if you’re a Pythonista, this process couldn’t be much simpler – thanks to a new generic AI integration framework called Marvin, from the folks who brought you Prefect Workflow Orchestration: https://www.prefect.io/marvin
Whilst OpenAI provide a relatively straightforward Python API client for ChatGPT, we weren’t sure that OpenAI was the right fit for our needs. For instance, the ChatGPT services from Azure might be more enterprise-ready from a security and compliance perspective. We’ve also heard great things about Anthropic’s Claude LLM.
The beauty of the Marvin framework is that it provides a simple layer of abstraction in front of the LLM provider’s API. This allows the developer to focus on a single codebase, secure in the knowledge that LLM providers can be snapped in and out of their code as needs dictate. Whilst abstracting complexity is nearly always a sound strategy for solving computing problems, the rate of change when working with LLMs makes it doubly so.
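To make the abstraction idea concrete, here is a minimal sketch of the pattern in plain Python. Note this is not Marvin’s actual API – the class and function names below are our own invention for illustration. Application code is written once against a small interface, and concrete providers can be snapped in behind it:

```python
from typing import Protocol


class LLMProvider(Protocol):
    """The minimal interface our application code depends on."""
    def complete(self, prompt: str) -> str: ...


class OpenAIProvider:
    """Stand-in for a client that would call OpenAI's API."""
    def complete(self, prompt: str) -> str:
        return f"[openai] response to: {prompt}"


class AzureOpenAIProvider:
    """Stand-in for a client that would call a private Azure-hosted instance."""
    def complete(self, prompt: str) -> str:
        return f"[azure] response to: {prompt}"


def summarise(provider: LLMProvider, text: str) -> str:
    # Application logic is written once, against the interface,
    # with no knowledge of which provider sits behind it.
    return provider.complete(f"Summarise: {text}")


# Swapping providers is then a one-line change at the call site:
print(summarise(OpenAIProvider(), "quarterly report"))
print(summarise(AzureOpenAIProvider(), "quarterly report"))
```

In a real integration the stand-in classes would wrap the respective provider SDKs; the point is that only the object handed to `summarise` changes when you switch providers.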
In practice, after a hot cup of coffee, we had our Python automation code securely hooked up to a private ChatGPT instance on Azure with less than a morning’s effort. We were able to switch between provider APIs with a couple of lines of configuration – fantastic! Marvin is definitely one to watch.
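As a rough illustration of what “a couple of lines of configuration” can look like, here is a hedged sketch in which the active provider is chosen by name at startup. Again, the names here (`LLM_PROVIDER`, the registry dict, the stub classes) are our own and not Marvin’s configuration keys:

```python
import os


class OpenAIProvider:
    def complete(self, prompt: str) -> str:
        return f"[openai] {prompt}"


class AzureOpenAIProvider:
    def complete(self, prompt: str) -> str:
        return f"[azure] {prompt}"


# Hypothetical registry: the active provider is selected by name.
PROVIDERS = {"openai": OpenAIProvider, "azure": AzureOpenAIProvider}

# The only configuration to change when switching providers:
provider = PROVIDERS[os.environ.get("LLM_PROVIDER", "azure")]()

print(provider.complete("Hello"))
```

The rest of the codebase never names a provider directly, so moving between OpenAI and Azure is an environment change rather than a code change.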
In my next article in this series, I will go into further detail on our compliance assessment of Azure ChatGPT and why we opted for this service when integrating LLMs into our internal processes.