Wednesday, July 3, 2024

Install OpenLIT Locally - Best Free Tool for LLM Monitoring and Tracing

This video walks through installing OpenLIT locally and integrating it with local Ollama models. OpenLIT is an OpenTelemetry-native tool designed to help developers gain insight into the performance of their LLM applications in production. It automatically collects LLM input and output metadata, and monitors GPU performance for self-hosted LLMs.
Code:

conda create -n lit python=3.11 -y && conda activate lit

pip install torch
pip install git+https://github.com/huggingface/transformers

git clone https://github.com/openlit/openlit.git
cd openlit

docker compose up -d

pip install openlit
pip install ollama

import ollama
import openlit

# Initialize OpenLIT before making LLM calls so they are traced
openlit.init(otlp_endpoint="http://127.0.0.1:4318", trace_content=False)

prompt = "what is happiness"
response = ollama.generate(model='llama3', prompt=prompt)
print(response['response'])
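OpenLIT instruments the Ollama call automatically, capturing metadata such as the model name, request/response sizes, and latency. To make that concrete, here is a stdlib-only sketch of the kind of bookkeeping OpenLIT does behind the scenes; the `fake_generate` stub is hypothetical and stands in for a real `ollama.generate` call so no server is needed:

```python
import time

def record_llm_call(model, prompt, generate_fn):
    """Capture the sort of metadata OpenLIT collects automatically:
    model name, prompt/response sizes, and call latency."""
    start = time.perf_counter()
    response = generate_fn(model, prompt)
    latency_ms = (time.perf_counter() - start) * 1000
    return {
        "model": model,
        "prompt_chars": len(prompt),
        "response_chars": len(response),
        "latency_ms": round(latency_ms, 2),
        "response": response,
    }

# Hypothetical stub standing in for ollama.generate (no server required).
def fake_generate(model, prompt):
    return f"{model} says: happiness is subjective."

span = record_llm_call("llama3", "what is happiness", fake_generate)
print(span["model"], span["latency_ms"])
```

In the real setup none of this is written by hand: `openlit.init()` hooks the supported LLM clients and exports these attributes as OpenTelemetry spans to the OTLP endpoint.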
