Learn how to monitor and troubleshoot LangChain applications in production.
The LangChain library is extremely useful for building AI applications on top of LLMs: chaining inference calls, tools, and actions is a natural way to work with these models.
When LangChain apps are deployed, especially if they face external users, it becomes important to ensure low latency and high reliability. Because multiple API calls and actions can be chained in a single run to complete a task, chain-level visibility into runs is essential, for example, to identify the slowest components or to analyze token rate limits.
Graphsignal automatically instruments chains and starts tracing and monitoring them. You only need to set it up by providing a Graphsignal API key and a deployment name.
import graphsignal
# Provide an API key directly or via GRAPHSIGNAL_API_KEY environment variable
graphsignal.configure(api_key='my-api-key', deployment='my-langchain-app-prod')
To additionally trace full runs and see a breakdown by chains and tools, wrap the calling routine, or the request handler in the case of a web framework.
with graphsignal.start_trace('mychain'):
    chain.run("some text")
You can also use a decorator. See the Quick Start for complete setup instructions.
To demonstrate, I'm running this example app, which simulates periodic chain runs. Once it's running, sample traces and metrics are continuously recorded and become available in the dashboard for analysis.
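Such a simulation can be as simple as a loop that runs the chain at an interval. A minimal sketch, where `chain_fn` is a stand-in for a traced `chain.run` call (the names and interval here are illustrative, not from the example app):

```python
import time

def simulate(chain_fn, runs=3, interval_sec=0.0):
    # Call the chain periodically and collect outputs; in the real app
    # each call would be an instrumented chain.run(...) producing a trace.
    outputs = []
    for i in range(runs):
        outputs.append(chain_fn(f"input {i}"))
        time.sleep(interval_sec)
    return outputs
```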
We can look into a particular trace sample to answer questions about high latency or errors, see the data statistics for that call, and determine whether it may have been the root cause of an issue.
If an OpenAI LLM is used, Graphsignal automatically instruments and traces the OpenAI API as well, providing additional insights such as token counts, data sizes, finish reasons, and more.
Additionally, performance metrics, data metrics, and resource utilization are available for every worker, making it possible to monitor applications over time and correlate any changes or issues.
Alerts can be set up to get notified on exceptions or outliers.
Give it a try and let us know what you think. Follow us at @GraphsignalAI for updates.