Azure AI Foundry LangChain Python module 0.1.0

 

https://www.pexels.com/photo/green-christmas-tree-with-red-and-gold-baubles-10424949/

In this blog, we try out the newly released Azure AI Foundry LangChain Python module. We create an Azure OpenAI model to translate "Merry Christmas" into Japanese, and then have an Azure Phi-3 model translate the Japanese text back into English.

Source code can be found here.

A few things to note:

  1. The Azure OpenAI endpoint (AZURE_OPENAI_ENDPOINT) needs to be in a specific format. See this.
  2. Creating a model for other generative AI models is different from creating an OpenAI model (see the sketch after this list):
    1. Non-OpenAI model (code)
    2. OpenAI model (code), which is slightly more complicated. Even using DefaultAzureCredential needs an extra function call.
  3. We can enable logging by setting up a logger (code).
  4. We write a custom parser to print the token usage (code).
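To give a rough idea of what items 2–4 involve, the az_langchain helpers could look something like the sketch below. This is only an illustration of the pattern, not the repository's actual code: the imports are the langchain-azure-ai, langchain-openai and azure-identity packages, while the environment variable names, API version, and other parameter details are assumptions on my part.

# Rough sketch only: the real az_langchain modules may differ.
import logging
import os

from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
from langchain_core.output_parsers import BaseGenerationOutputParser
from langchain_core.outputs import ChatGeneration, Generation
from langchain_openai import AzureChatOpenAI


def get_phi3_model():
    # Non-OpenAI models go through AzureAIChatCompletionsModel; the endpoint
    # and key variable names here are made up for the example.
    return AzureAIChatCompletionsModel(
        endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
        credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
    )


def get_openai_model():
    # The OpenAI model is slightly more involved: DefaultAzureCredential needs
    # the extra get_bearer_token_provider() call to become a token callable.
    token_provider = get_bearer_token_provider(
        DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
    )
    return AzureChatOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # must follow the expected format
        azure_deployment=os.environ["AZURE_OPENAI_DEPLOYMENT"],
        api_version="2024-06-01",
        azure_ad_token_provider=token_provider,
    )


def enable_logging():
    # Turn on verbose logging from the underlying Azure SDK clients.
    azure_logger = logging.getLogger("azure")
    azure_logger.setLevel(logging.DEBUG)
    azure_logger.addHandler(logging.StreamHandler())


class OutputParser(BaseGenerationOutputParser[str]):
    # Custom parser: print the token usage, then hand back just the text.
    def parse_result(self, result: list[Generation], *, partial: bool = False) -> str:
        generation = result[0]
        if isinstance(generation, ChatGeneration):
            print(generation.message.usage_metadata)
            return generation.message.content
        return generation.text
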

The main source code looks like this.

from dotenv import load_dotenv
from langchain_core.runnables import RunnableParallel, RunnablePassthrough

from az_langchain.azure_openai_model import get_model as get_openai_model
from az_langchain.azure_phi3_model import get_model as get_phi3_model
from az_langchain.logger import enable_logging
from az_langchain.output_parser import OutputParser
from az_langchain.templates import get_back2english_template, get_translate_template

load_dotenv()

enable_logging()
phi3_model = get_phi3_model()
openai_model = get_openai_model()

translate_template = get_translate_template()
back2english_template = get_back2english_template()

parser = OutputParser()

translate_chain = translate_template | openai_model | parser
retranslate_chain = back2english_template | phi3_model | parser

chain = translate_chain | RunnableParallel(
    translate=RunnablePassthrough(),
    back2english=RunnablePassthrough() | retranslate_chain,
)
response = chain.invoke({"language": "japanese", "text": "Merry Christmas"})
print(response)


We created two chains: the first performs the translation with the OpenAI model, and the second takes the output of the first chain and uses the Phi-3 model to translate the Japanese text back to English. RunnableParallel returns both results, so the final response contains the Japanese translation and the back-translation side by side.
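
The two templates aren't shown above. Judging from the inputs each chain receives, get_translate_template and get_back2english_template presumably return prompt templates along the lines of the sketch below; the prompt wording is guessed, not copied from the repository.

# Guessed version of az_langchain.templates; the real prompt text may differ.
from langchain_core.prompts import ChatPromptTemplate


def get_translate_template() -> ChatPromptTemplate:
    # Uses the {language} and {text} keys passed to chain.invoke().
    return ChatPromptTemplate.from_template(
        "Translate the following text to {language}. "
        "Reply with the translation only.\n\n{text}"
    )


def get_back2english_template() -> ChatPromptTemplate:
    # A single-variable template, so it accepts the first chain's
    # plain-string output directly.
    return ChatPromptTemplate.from_template(
        "Translate the following text back to English. "
        "Reply with the translation only.\n\n{text}"
    )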

Here is the output:

{'translate': 'メリークリスマス', 'back2english': ' Merry Christmas.'}

Pretty cool. This simplifies a lot of code.



