Observability

Introduction to LLM Observability

In the realm of AI development, understanding the inner workings of your models is crucial for optimizing performance and debugging issues. LangSmith provides powerful observability tools that allow you to monitor and analyze the behavior of your Large Language Models (LLMs). By integrating LangSmith with AgentForge, you can gain deep insights into each workflow run, ensuring that your AI agents operate as expected.

Getting Started with LangSmith

Step 1: Set Up Your Environment

To begin observing your LLM workflows, set the LANGCHAIN_API_KEY environment variable to your LangSmith API key and enable tracing with LANGCHAIN_TRACING_V2. With these set, AgentForge can send detailed telemetry data to the LangSmith console.

.env.local

...
LANGCHAIN_API_KEY=your_langsmith_api_key
LANGCHAIN_TRACING_V2=true
LANGCHAIN_PROJECT="Agent Forge"
LANGCHAIN_CALLBACKS_BACKGROUND=true
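Before running a workflow, it can help to confirm that the required variables actually reached the process. The following is a minimal sketch, not part of AgentForge itself; the helper name and the choice of which variables to require are assumptions for illustration.

```python
import os

# Variables assumed required for tracing, per the .env.local example above.
REQUIRED_VARS = ["LANGCHAIN_API_KEY", "LANGCHAIN_TRACING_V2"]

def missing_langsmith_vars(env=os.environ):
    """Return the names of required LangSmith variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    missing = missing_langsmith_vars()
    if missing:
        print("LangSmith tracing disabled; missing: " + ", ".join(missing))
    else:
        print("LangSmith tracing configured.")
```

Running this at startup gives an early, explicit signal when tracing is silently disabled because a variable was not loaded.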

Step 2: Run Your Workflows

With the environment variable set, any workflow run in AgentForge will automatically be tracked by LangSmith. This includes detailed information about each step, the data processed, and the interactions between different components.
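To make concrete what "detailed information about each step" means, here is a conceptual sketch of the kind of record a trace captures per step: inputs, outputs, errors, and timing. This is not LangSmith's actual API or AgentForge's internals; the `Span` shape and `run_traced` helper are invented for illustration only.

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class Span:
    """One traced step: roughly the data a tracing backend records."""
    name: str
    inputs: dict
    outputs: Optional[dict] = None
    error: Optional[str] = None
    duration_ms: float = 0.0

def run_traced(name, fn, **inputs):
    """Run one workflow step and record its inputs, outputs, error, and duration."""
    span = Span(name=name, inputs=inputs)
    start = time.perf_counter()
    try:
        span.outputs = {"result": fn(**inputs)}
    except Exception as exc:  # error tracking: keep the failure attached to the step
        span.error = repr(exc)
    span.duration_ms = (time.perf_counter() - start) * 1000
    return span
```

In AgentForge this bookkeeping happens automatically once tracing is enabled; the sketch only shows what ends up attached to each step in the console.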

Step 3: Explore the LangSmith Console

Navigate to the LangSmith console to view detailed logs and insights for each workflow run. The console provides a comprehensive interface to explore the execution flow, inspect inputs and outputs, and identify potential bottlenecks or errors.

Detailed Observability Features

LangSmith offers a range of observability features to help you understand and optimize your AI workflows:
  • Trace Visualization: View a visual representation of the entire workflow, showing each step and the data processed.
  • Input and Output Inspection: Dive into the details of the data passing through each step, including raw inputs and generated outputs.
  • Error Tracking: Identify and troubleshoot errors at any step in the workflow, with detailed error messages and stack traces.
  • Performance Metrics: Monitor performance metrics for each component, helping you to optimize processing times and resource usage.
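Once per-step timings are available, finding bottlenecks is a simple ranking exercise. The sketch below assumes a hypothetical list of trace records with a `duration_ms` field; it is not how the LangSmith console computes its metrics, just an illustration of the idea.

```python
def slowest_steps(spans, top_n=3):
    """Rank recorded workflow steps by duration to surface bottlenecks."""
    return sorted(spans, key=lambda s: s["duration_ms"], reverse=True)[:top_n]

# Hypothetical trace records from a single workflow run.
spans = [
    {"name": "retrieve", "duration_ms": 120.0},
    {"name": "generate", "duration_ms": 850.0},
    {"name": "parse", "duration_ms": 15.0},
]
```

Here the ranking would point at the "generate" step first, which is typically where LLM latency dominates a workflow.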

Example Workflow Run

The following screenshot provides an example of what you can expect in the LangSmith console. It shows a workflow run with detailed insights into each step, including function calls, data processed, and the interactions between different agents.
[Screenshot: LangSmith console showing a workflow run]

Conclusion

Integrating LangSmith with AgentForge provides powerful observability tools that are essential for developing, debugging, and optimizing AI workflows. By setting up the LANGCHAIN_API_KEY and utilizing the LangSmith console, you can gain deep insights into the behavior of your LLMs, ensuring they perform reliably and efficiently.