How to Trace LLM Calls with Langtrace AI and Visualize in Grafana


In the era of AI-driven applications, tracing and observing LLM (Large Language Model) calls is more important than ever. In this blog you will learn how to trace LLM calls with Langtrace and visualize the resulting data in Grafana, so you can optimize performance, manage costs, and keep your AI-powered applications reliable.

What is Langtrace?

Langtrace is an open-source observability tool that collects and analyzes traces in order to help you improve your LLM apps.

Langtrace has two components:

1. SDK – available for Python and TypeScript.

2. Dashboard – a web-based interface where you can view and analyze your traces.

Langtrace focuses on the following three pillars of observability for your LLM apps:

1. Usage - Tokens and Cost

2. Accuracy

3. Performance - Latency and Success Rate

HOW TO SET UP?
You can use the hosted Langtrace.ai web UI, or self-host the project locally using Docker, Docker Compose, or Kubernetes.

If you want to use the hosted dashboard, sign up to Langtrace with your email ID or GitHub account; you will then land on the project dashboard shown below.
Click Create Project to create a new project: give it a name, a description, and a project type, then click Create Project. Your project now appears in the dashboard.

Next, click the Setup Project option to set up your project in Python or TypeScript, and click Generate API Key to generate your Langtrace API key.


For this walkthrough, we will create a Python AI chatbot project using an AI coding assistant (Windsurf), then trace its LLM calls with Langtrace and visualize them in the Langtrace UI.
I used a vibe-coding tool to scaffold a Python-based AI chatbot that talks to an LLM and returns answers.
The prompt was roughly: "I want to create a Python-based AI chatbot which uses openai/gpt-4.1; create the required files in the location below." The tool generates the code at the given location.


Once the basic code has been generated with vibe coding, we need to set up the environment variables and wire the Langtrace SDK into our Python application.

To set up the Langtrace SDK, we only need to add two lines of code to our Python chatbot application (or ask the vibe-coding tool to generate them for us).

Step 1: Install the SDK

    pip install -U langtrace-python-sdk openai
Step 2: Set up environment variables
If you are setting up the environment locally, you can use the commands below:
    export LANGTRACE_API_KEY=YOUR_LANGTRACE_API_KEY

    export AZURE_OPENAI_ENDPOINT=YOUR_AZURE_OPENAI_ENDPOINT

    export AZURE_OPENAI_API_KEY=YOUR_AZURE_OPENAI_API_KEY

    export AZURE_API_VERSION=YOUR_API_VERSION

    export AZURE_DEPLOYMENT_NAME=YOUR_DEPLOYMENT_NAME
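Inside the application, it can help to fail fast if any of these variables is missing. The `require_env` helper below is my own illustration, not part of any SDK:

```python
import os

# The same variables the exports above define, read back inside the app.
REQUIRED = [
    "LANGTRACE_API_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_API_KEY",
    "AZURE_API_VERSION",
    "AZURE_DEPLOYMENT_NAME",
]

def require_env(names):
    """Return a dict of the named env vars, raising if any is missing or empty."""
    missing = [n for n in names if not os.environ.get(n)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {n: os.environ[n] for n in names}
```

Calling `require_env(REQUIRED)` at startup surfaces a clear error before the first LLM call instead of a confusing failure deep inside the client library.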

Step 3: Initialize the SDK in your code

    # Import Langtrace into your project; this must precede any LLM module imports
    from langtrace_python_sdk import langtrace

    langtrace.init(api_key='<LANGTRACE_API_KEY>')
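To make the chatbot side concrete, here is a sketch of how the conversation history for a chat-completions API is typically maintained. The helper names (`build_messages`, `record_turn`) and the system prompt are my own illustration, not from the generated project; the actual Azure OpenAI call is shown only as a comment, since it needs the openai package and live credentials.

```python
# Illustrative conversation handling for a chat-completions-style API.
SYSTEM_PROMPT = "You are a helpful assistant."  # assumed prompt

def build_messages(history, user_input):
    """Build the messages list the chat API expects:
    one system message, prior turns, then the new user message."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    messages.extend(history)
    messages.append({"role": "user", "content": user_input})
    return messages

def record_turn(history, user_input, assistant_reply):
    """Append a completed user/assistant exchange to the history."""
    history.append({"role": "user", "content": user_input})
    history.append({"role": "assistant", "content": assistant_reply})
    return history

# In main.py, the chat loop would look roughly like:
#   reply = client.chat.completions.create(
#       model=os.environ["AZURE_DEPLOYMENT_NAME"],
#       messages=build_messages(history, question),
#   ).choices[0].message.content
#   record_turn(history, question, reply)
```

Because `langtrace.init` is called before the openai import, every `chat.completions.create` call in that loop is captured as a trace automatically.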

I had already deployed the model in Azure OpenAI, so I set the required environment variables locally. To test the application, run:

    python main.py

Once the command runs, the Python AI chatbot opens in the terminal, and you can ask it some questions.

Wow!! Now I am getting responses from the AI bot. Next, let's check the traces for these chats in the Langtrace dashboard.


Now click Metrics; there you can find metrics such as input tokens, output tokens, total cached tokens, and token cost, as well as latency.

We can now visualize the data in Langtrace; next, let's visualize the same data in Grafana.

HOW TO INTEGRATE WITH GRAFANA?
For my use case, I am using a free Grafana Cloud account.
In Grafana, go to Administration → Users and access → Cloud access policies and click Create access policy.
When creating the policy, choose which scopes need read, write, or delete access; you can add extra scopes as well.
Click Create to create the policy.
Once the policy has been created, click Add token, give the token a name, and set its expiry period.
Once the token is created, click Create data source, which provisions a Tempo data source for you.



Now click Home → OpenTelemetry → Quick start, choose your language (Python for our use case), select Use an existing token, pick the token created earlier, and click Use Token.

Now it will give you instructions to auto-instrument your Python application.

Next, install the Python packages required for auto-instrumentation.

Ref: https://docs.langtrace.ai/supported-integrations/observability-tools/grafana
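The install step usually follows the standard OpenTelemetry Python bootstrap flow; the exact package list comes from the OpenTelemetry docs referenced above, so treat this as a sketch:

```shell
# Install the OpenTelemetry distro and OTLP exporter, then let the bootstrap
# tool detect and install instrumentation for libraries already present in
# your environment (e.g. openai, requests).
pip install opentelemetry-distro opentelemetry-exporter-otlp
opentelemetry-bootstrap -a install
```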
Once the installation is complete, we need to set a few environment variables; since the app was running locally, export them in the shell:
export OTEL_RESOURCE_ATTRIBUTES="service.name=my-app,service.namespace=my-application-group,deployment.environment=production" 

export OTEL_EXPORTER_OTLP_ENDPOINT="https://otlp-gateway-prod-ap-south-1.grafana.net/otlp" 

export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Basic%20<YOUR_BASE64_ENCODED_TOKEN>"

export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"

Run your application with the OpenTelemetry Python automatic instrumentation tool, opentelemetry-instrument:

    opentelemetry-instrument python main.py

Chat with the bot to generate some data, then check the trace data in the Grafana dashboard.


We can now see the traces for our AI bot application in the Grafana dashboard.

Ref:
1. https://docs.langtrace.ai/quickstart

2. https://docs.langtrace.ai/supported-integrations/llm-tools/azure-openai

3. https://docs.langtrace.ai/supported-integrations/observability-tools/grafana

4. https://grafana.com/auth/sign-in/




