6 changes: 6 additions & 0 deletions Appraisals
@@ -49,3 +49,9 @@ appraise "opentelemetry-latest" do
gem "opentelemetry-sdk", ">= 1.10"
gem "opentelemetry-exporter-otlp", ">= 0.31"
end

# for multiple_projects.rb only, test both openai and ruby_llm
appraise "ruby-llm-openai" do
gem "openai", ">= 0.34"
gem "ruby_llm", ">= 1.9"
end
5 changes: 3 additions & 2 deletions Gemfile.lock
@@ -2,6 +2,7 @@ PATH
remote: .
specs:
braintrust (0.0.11)
base64 (~> 0.2)
openssl (~> 3.3.1)
opentelemetry-exporter-otlp (~> 0.28)
opentelemetry-sdk (~> 1.3)
@@ -50,7 +51,7 @@ GEM
builder
minitest (>= 5.0)
ruby-progressbar
openssl (3.3.1)
openssl (3.3.2)
opentelemetry-api (1.7.0)
opentelemetry-common (0.23.0)
opentelemetry-api (~> 1.0)
@@ -153,4 +154,4 @@ DEPENDENCIES
yard (~> 0.9)

BUNDLED WITH
2.4.19
2.4.19
1 change: 1 addition & 0 deletions Rakefile
@@ -32,6 +32,7 @@ end

def appraisal_for(example)
case example
when /multiple_projects/ then "ruby-llm-openai"
when /ruby_llm/ then "ruby_llm"
when /ruby-openai/, /ruby_openai/, /alexrudall/ then "ruby-openai"
when /anthropic/ then "anthropic"
4 changes: 4 additions & 0 deletions braintrust.gemspec
@@ -31,6 +31,10 @@ Gem::Specification.new do |spec|
# Runtime dependencies
spec.add_runtime_dependency "opentelemetry-sdk", "~> 1.3"
spec.add_runtime_dependency "opentelemetry-exporter-otlp", "~> 0.28"
# Ruby 3.4+ ships base64 as a bundled gem (it was removed from the default gems);
# on Ruby < 3.4, Bundler uses the default base64 library.
# https://stdgems.org/base64/
spec.add_runtime_dependency "base64", "~> 0.2"

# OpenSSL 3.3.1+ fixes macOS CRL (Certificate Revocation List) verification issues
# that occur with OpenSSL 3.6 + Ruby (certificate verify failed: unable to get certificate CRL).
47 changes: 40 additions & 7 deletions examples/README.md
@@ -14,24 +14,57 @@ export BRAINTRUST_API_KEY="your-api-key-here"

## Running Examples

### Using Rake (Recommended)

The rake task automatically uses the correct gemfile for each example:

```bash
# Run a single example
rake 'example[examples/trace/multiple_projects.rb]'

# Run all examples
rake examples
```
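
Under the hood, the Rakefile's `appraisal_for` helper (shown in the Rakefile change above) maps each example path to an Appraisal name. Here is a minimal sketch of how such a task could run an example under the matching gemfile, assuming Appraisal's standard `gemfiles/<name>.gemfile` naming; the task body below is illustrative, not the actual Rakefile implementation:

```ruby
# Illustrative sketch only -- the real task lives in the project's Rakefile.
# appraisal_for(example) returns names like "ruby-llm-openai" (see Rakefile diff).
task :example, [:path] do |_t, args|
  path = args[:path] or abort "usage: rake 'example[examples/foo.rb]'"
  gemfile = "gemfiles/#{appraisal_for(path).tr("-", "_")}.gemfile"
  # Run the example under the Appraisal-generated gemfile
  sh({ "BUNDLE_GEMFILE" => gemfile }, "bundle", "exec", "ruby", path)
end
```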

### Running Directly

From the project root:

```bash
# Run a specific example
ruby examples/login/login_basic.rb
ruby examples/login.rb

# Enable debug logging
BRAINTRUST_DEBUG=true ruby examples/login/login_basic.rb
BRAINTRUST_DEBUG=true ruby examples/login.rb
```

## Available Examples

### Login Examples

- **`login/login_basic.rb`**: Basic login example showing how to authenticate and retrieve organization information
- **`login.rb`**: Basic login example showing how to authenticate and retrieve organization information


### Tracing Examples

- **`trace.rb`**: Basic OpenTelemetry tracing example
- **`trace/span_filtering.rb`**: Example of filtering out non-AI spans in traces to reduce noise
- **`trace/trace_attachments.rb`**: Example of adding attachments (images, PDFs, BLOBs) to traces
- **`trace/multiple_projects.rb`**: Example of logging traces to multiple Braintrust projects simultaneously

### LLM Integration Examples

- **`openai.rb`**: OpenAI integration example
- **`anthropic.rb`**: Anthropic integration example
- **`ruby_llm.rb`**: Ruby LLM integration example
- **`alexrudall_openai.rb`**: Alexrudall's ruby-openai gem integration example

### Evaluation Examples

- **`eval.rb`**: Defining scorers and running evals
- **`eval/dataset.rb`**: Running an evaluation against a dataset
- **`eval/remote_functions.rb`**: Using remote functions (server-side prompts) in evaluations

## Coming Soon
### API Examples

- OpenTelemetry tracing examples
- OpenAI integration examples
- Eval framework examples
- **`api/dataset.rb`**: Dataset API usage example
200 changes: 200 additions & 0 deletions examples/trace/multiple_projects.rb
@@ -0,0 +1,200 @@
#!/usr/bin/env ruby
# frozen_string_literal: true

require "bundler/setup"
require "braintrust"
require "opentelemetry/sdk"
require "ruby_llm"
require "openai"

project1 = "Project-A"
project2 = "Project-B"
model1 = "gpt-4o-mini"
model2 = "claude-sonnet-4"

# check for API keys
unless ENV["OPENAI_API_KEY"] && ENV["ANTHROPIC_API_KEY"]
[Reviewer comment (Contributor): this can be removed]
puts "Error: Both OPENAI_API_KEY and ANTHROPIC_API_KEY environment variables are required"
puts "Get your API key from: https://platform.openai.com/api-keys"
puts "Get your Anthropic API key from: https://console.anthropic.com/"
puts "Set with `export OPENAI_API_KEY=<your_key> and export ANTHROPIC_API_KEY=<your_key>`"
exit 1
end

unless ENV["BRAINTRUST_API_KEY"]
puts "Error: BRAINTRUST_API_KEY environment variable is required"
puts "Get your API key from https://www.braintrust.dev/app/settings or ask your org administrator"
exit 1
end

# Example: Log/Trace to Multiple Projects with Separate States
#
# This example demonstrates how to:
# 1. Create multiple Braintrust states for different projects
# 2. Set up separate tracer providers for each project
# 3. Log traces to different projects simultaneously
#
# Usage:
# bundle exec ruby examples/trace/multiple_projects.rb

# Create first state for Project A (non-global)
state_a = Braintrust.init(
default_project: project1,
set_global: false,
enable_tracing: false, # We'll manually set up tracing
blocking_login: true # Ensure login completes before tracing setup
# Not strictly required for tracing alone, but login is async by default and permalinks can be broken if login has not completed
)
# Create second state for Project B (non-global)
state_b = Braintrust.init(
default_project: project2,
set_global: false,
enable_tracing: false,
blocking_login: true
)

# Wrap all instances of RubyLLM client
Braintrust::Trace::Contrib::Github::Crmne::RubyLLM.wrap

RubyLLM.configure do |config|
config.openai_api_key = ENV["OPENAI_API_KEY"]
config.anthropic_api_key = ENV["ANTHROPIC_API_KEY"]
end

chat_openai = RubyLLM.chat(model: model1)
chat_anthropic = RubyLLM.chat(model: model2)

# Create first tracer provider
tracer_provider_a = OpenTelemetry::SDK::Trace::TracerProvider.new

# Setup using Trace.setup
# When you pass an explicit tracer_provider, it won't set it as global
Braintrust::Trace.setup(state_a, tracer_provider_a)

# Get tracer for Project A
tracer_a = tracer_provider_a.tracer("MultiTurn")

# Note: You can also use Trace.enable instead of Trace.setup:
# Braintrust::Trace.enable(tracer_provider_a, state: state_a)
# Braintrust::Trace.enable(tracer_provider_b, state: state_b)
# Both work the same when you provide explicit providers

# Now create spans in first project
puts "\nProject A: Multi-turn conversation"
puts "=" * 50
root_span_a = nil
tracer_a.in_span("chat_ask") do |span|
root_span_a = span
span.set_attribute("project", project1)

# Nested spans for multi-turn convo
tracer_a.in_span("turn1") do |nested_t1|
# Using OTEL GenAI Semantic Conventions for properties
# https://www.braintrust.dev/docs/integrations/sdk-integrations/opentelemetry#manual-tracing
# Braintrust automatically maps `gen_ai.*` attributes to native Braintrust fields
# tracer_b will use native fields
nested_t1.set_attribute("gen_ai.operation.name", "chat")
nested_t1.set_attribute("gen_ai.request.model", model1)
input = "What is the best season to visit Japan?"
puts "\nTurn 1 (#{model1}):"
puts "Q: #{input}"
output = chat_openai.ask(input)

nested_t1.set_attribute("gen_ai.prompt", input)
nested_t1.set_attribute("gen_ai.completion", output.content)
puts "A: #{output.content[0..100]}..."
puts " Tokens: #{output.to_h[:input_tokens]} in, #{output.to_h[:output_tokens]} out"

tracer_a.in_span("turn2") do |nested_t2|
nested_t2.set_attribute("gen_ai.operation.name", "chat")
nested_t2.set_attribute("gen_ai.request.model", model2)
input = "Which airlines fly to Japan from SFO?"
puts "\nTurn 2 (#{model2}):"
puts "Q: #{input}"
output = chat_anthropic.ask(input)

nested_t2.set_attribute("gen_ai.prompt", input)
nested_t2.set_attribute("gen_ai.completion", output.content)
puts "A: #{output.content[0..100]}..."
puts " Tokens: #{output.to_h[:input_tokens]} in, #{output.to_h[:output_tokens]} out"
end
end
end

puts "\n✓ Multi-turn conversation completed"
puts "\n✓ View Project A trace in Braintrust:"
puts " #{Braintrust::Trace.permalink(root_span_a)}"

url = "https://upload.wikimedia.org/wikipedia/commons/thumb/6/65/Tokyo_Tower_during_daytime.jpg/330px-Tokyo_Tower_during_daytime.jpg"

# For the second project, we'll use the OpenAI Ruby client
# You can log to multiple projects even if your clients use different client libs
client = OpenAI::Client.new(api_key: ENV["OPENAI_API_KEY"])

# Create second tracer provider
tracer_provider_b = OpenTelemetry::SDK::Trace::TracerProvider.new
Braintrust::Trace.setup(state_b, tracer_provider_b)

# Get tracer for Project B
tracer_b = tracer_provider_b.tracer("ImageUpload")

# Wrapping OpenAI client with second trace provider
# We could simply call `wrap` without tracer_provider, but then it would be bound to our global state
Braintrust::Trace::OpenAI.wrap(client, tracer_provider: tracer_provider_b)

puts "\nProject B: Describe Image"
puts "=" * 50

# The wrapped chat completion call should automatically nest under this span
root_span_b = nil
tracer_b.in_span("vision") do |span|
root_span_b = span
# Example 1: Vision - Image Understanding
puts "\n Vision (Image Understanding)"
puts "-" * 50

input = "Tell me about this landmark."
tracer_b.in_span("example-vision") do |nested|
response = client.chat.completions.create(
model: model1,
messages: [
{
role: "user",
content: [
{type: "text", text: input},
{
type: "image_url",
image_url: {
url: url
}
}
]
}
],
max_tokens: 100
)

# Using Braintrust native span attributes
# For comparisons with OTEL GenAI semantic convention properties,
# see https://www.braintrust.dev/docs/integrations/sdk-integrations/opentelemetry#manual-tracing
nested.set_attribute("braintrust.span_attributes.type", "llm")
nested.set_attribute("metadata.model", model1)
nested.set_attribute("braintrust.input", input)
nested.set_attribute("braintrust.output", response.choices[0].message.content.to_s)

puts "✓ Vision response: #{response.choices[0].message.content[0..100]}..."
puts " Tokens: #{response.usage.total_tokens}"
rescue OpenAI::Errors::BadRequestError => e
puts "⊘ Skipped - Image URL error (#{e.message.split("\n").first[0..80]}...)"
rescue => e
puts "⊘ Error: #{e.class}"
end
end

puts "\n✓ Vision example completed"
puts "\n✓ View Project B trace in Braintrust:"
puts " #{Braintrust::Trace.permalink(root_span_b)}"

# Shutdown both tracer providers to flush spans
tracer_provider_a.shutdown
tracer_provider_b.shutdown
9 changes: 9 additions & 0 deletions gemfiles/ruby_llm_openai.gemfile
@@ -0,0 +1,9 @@
# This file was generated by Appraisal

source "https://rubygems.org"

gem "minitest-reporters", "~> 1.6"
gem "openai", ">= 0.34"
gem "ruby_llm", ">= 1.9"

gemspec path: "../"