Azure OpenAI | LangChain SDK with MSI

Image from https://www.pexels.com/@tara-winstead/

In this blog, we show how to set up an Azure Function App that calls Azure OpenAI Service with Managed Service Identity (MSI), using both the openai and LangChain Python SDKs.

Prerequisites

  1. Create an Azure Function App and enable its system-assigned Managed Service Identity.
  2. Create an Azure OpenAI Service resource and assign the Cognitive Services OpenAI User role to the function app's managed identity.
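Step 2 can be done in the portal or with the Azure CLI. A sketch, assuming placeholder names (my-func-app, my-rg, sample) that you would replace with your own:

```
# Get the function app's system-assigned identity (principal ID)
principalId=$(az functionapp identity show \
  --name my-func-app --resource-group my-rg \
  --query principalId -o tsv)

# Assign the "Cognitive Services OpenAI User" role on the OpenAI resource
az role assignment create \
  --assignee "$principalId" \
  --role "Cognitive Services OpenAI User" \
  --scope $(az cognitiveservices account show \
      --name sample --resource-group my-rg --query id -o tsv)
```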
These are the function app settings the code relies on; below is an example:
        "OPENAI_API_TYPE": "azure_ad",
        "OPENAI_API_BASE": "https://sample.openai.azure.com/",
        "OPENAI_API_VERSION": "2023-03-15-preview",
        "COMPLETIONS_MODEL": "gpt-35-turbo"
For local development, your local.settings.json looks like this:
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "",
        "FUNCTIONS_WORKER_RUNTIME": "python",
        "OPENAI_API_TYPE": "azure_ad",
        "OPENAI_API_BASE": "https://sample.openai.azure.com/",
        "OPENAI_API_VERSION": "2023-03-15-preview",
        "COMPLETIONS_MODEL": "gpt-35-turbo"
    }
}

Python Dependencies

python = "~3.10.0"
azure-functions = "^1.15.0"
openai = "^0.27.8"
azure-identity = "^1.13.0"
langchain = "^0.0.203"
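The constraints above are in Poetry's pyproject.toml syntax; if you use pip instead, a roughly equivalent requirements.txt would be:

```
azure-functions~=1.15
openai~=0.27.8
azure-identity~=1.13
langchain==0.0.203
```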

Since I am showing code for both openai and langchain, I have both as dependencies.

MSI with OpenAI SDK

import json
import openai
import os

import azure.functions as func

from azure.identity import DefaultAzureCredential


async def main(req: func.HttpRequest) -> func.HttpResponse:
    question = req.params.get("q")
    if not question:
        question = "Write a short poem"

    default_credential = DefaultAzureCredential()
    # Request an AAD access token for the Cognitive Services scope
    token = default_credential.get_token("https://cognitiveservices.azure.com/.default")
    deployment_name = os.getenv("COMPLETIONS_MODEL")
    api_type = os.getenv("OPENAI_API_TYPE")

    openai.api_key = token.token
    openai.api_type = api_type
    openai.api_base = os.getenv("OPENAI_API_BASE")
    openai.api_version = os.getenv("OPENAI_API_VERSION")

    response = openai.Completion.create(
        engine=deployment_name,
        prompt=question,
        temperature=0.7,
        max_tokens=300,
    )

    return func.HttpResponse(
        body=json.dumps(response.choices[0].text),
        mimetype="application/json",
    )

A few things to call out:
  1. api_type is azure_ad instead of the usual azure.
  2. We get a bearer token for the https://cognitiveservices.azure.com scope with DefaultAzureCredential and set it as openai.api_key.
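One caveat: the token returned by get_token is short-lived (typically about an hour), so a warm function host that reuses a stale token will eventually start getting 401s. A minimal caching sketch, assuming one credential per worker (TokenCache and its parameter names are my own, not part of any SDK):

```python
import time


class TokenCache:
    """Cache an AAD token and refresh it shortly before it expires."""

    def __init__(self, credential, scope, skew=300):
        self._credential = credential  # e.g. DefaultAzureCredential()
        self._scope = scope            # e.g. "https://cognitiveservices.azure.com/.default"
        self._skew = skew              # refresh this many seconds before expiry
        self._token = None

    def get(self):
        now = time.time()
        if self._token is None or self._token.expires_on - self._skew <= now:
            # AccessToken carries .token (the bearer string) and .expires_on
            self._token = self._credential.get_token(self._scope)
        return self._token.token
```

You would then call cache.get() per request instead of get_token, and assign the result to openai.api_key.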

MSI with LangChain

We can do the same with LangChain:
import json
import os

import azure.functions as func

from azure.identity import DefaultAzureCredential

from langchain import LLMChain
from langchain.chat_models import AzureChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
)


async def main(req: func.HttpRequest) -> func.HttpResponse:
    question = req.params.get("q")
    if not question:
        question = "Write a short poem"

    default_credential = DefaultAzureCredential()
    # Request an AAD access token for the Cognitive Services scope
    token = default_credential.get_token("https://cognitiveservices.azure.com/.default")
    deployment_name = os.getenv("COMPLETIONS_MODEL")
    api_type = os.getenv("OPENAI_API_TYPE")

    chat_prompts = ChatPromptTemplate.from_messages(
        [
            HumanMessagePromptTemplate.from_template(template="{question}"),
        ]
    )

    # openai_api_base and openai_api_version are read from the
    # OPENAI_API_BASE and OPENAI_API_VERSION app settings.
    llm = AzureChatOpenAI(
        deployment_name=deployment_name,
        openai_api_key=token.token,  # AAD bearer token used as the API key
        openai_api_type=api_type,
        temperature=0.7,
    )
    chain = LLMChain(llm=llm, prompt=chat_prompts)
    response = await chain.agenerate([{"question": question}])

    return func.HttpResponse(
        body=json.dumps(response.generations[0][0].text),
        mimetype="application/json",
    )
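To try either function locally, run the Functions host with Azure Functions Core Tools and hit the endpoint. The route name here is an assumption; it depends on your function folder and function.json:

```
func start
curl "http://localhost:7071/api/ask?q=Write%20a%20haiku"
```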





