Open-MBEE/flexo-performance-test

Flexo Performance Test Suite

Setup

  1. Generate Flashlight.json (if not already done):

    python3 main.py
  2. Create CSV data file:

    python3 create_csv_data.py

    This creates flashlight_data.csv with the JSON content in the LARGEFIELD column.
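The CSV layout can be sketched as follows. This is an approximation of what create_csv_data.py does, with a small stand-in model inlined here instead of reading the real Flashlight.json; the script's actual logic may differ:

```python
import csv
import json

# Stand-in model; the real script reads the generated Flashlight.json instead.
model = {"elements": [{"id": "flashlight-1", "type": "Block"}]}

with open("flashlight_data.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["LARGEFIELD"])       # single column Postman iterates over
    writer.writerow([json.dumps(model)])  # whole JSON document in one CSV field
```

Using the csv module rather than string concatenation matters here: it quotes the embedded commas and double quotes in the JSON so the document survives as a single field.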

Running Tests in Postman

  1. Import the collection: Flexo_Performance_Test.postman_collection.json

  2. Create/Select Environment with:

    • flexo_endpoint = https://flexo.openmbee.org (or your endpoint)
  3. Run Collection with CSV:

    • Open Collection Runner
    • Go to Data tab
    • Click Select File
    • Choose flashlight_data.csv
    • Set iterations:
      • "1. Create Project": 20 iterations
      • "2. Commit Project": 20 iterations
      • "3. Get Projects": 1 iteration
      • "4. Get Elements": 20 iterations
    • Click Run

How it works:

  • The CSV file contains the large JSON in the LARGEFIELD column
  • Pre-request script reads it: pm.iterationData.get('LARGEFIELD')
  • Sets environment variable: pm.environment.set('envLARGE_FIELD', ...)
  • Request body uses: {{envLARGE_FIELD}}
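Put together, the pre-request script amounts to the following (it runs inside Postman's sandbox, where the `pm` object is provided):

```javascript
// Pre-request script: copy the current CSV row's large JSON into an
// environment variable so the request body can reference {{envLARGE_FIELD}}.
const largeField = pm.iterationData.get('LARGEFIELD');
pm.environment.set('envLARGE_FIELD', largeField);
```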

Running Tests with Newman CLI

./run_with_newman.sh

This automatically loads Flashlight.json and runs all tests.
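A minimal sketch of what such a wrapper might invoke, assuming Newman's standard `-d` (iteration data), `--env-var`, and `-n` (iteration count) flags; the actual script's contents may differ:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Run the collection with the CSV as iteration data.
newman run Flexo_Performance_Test.postman_collection.json \
  -d flashlight_data.csv \
  --env-var "flexo_endpoint=https://flexo.openmbee.org" \
  -n 20
```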

Running Performance Tests with Python Script

The run_performance_tests.py script provides comprehensive performance testing with sequential and concurrent modes, plus performance charts.

Setup

  1. Install dependencies (in starforge conda environment):

    conda activate starforge
    pip install aiohttp matplotlib python-dotenv
  2. Create .env file with your bearer token:

    echo "FLEXO_TOKEN=your_bearer_token_here" > .env
  3. Run tests (the token is loaded automatically from .env):

    # Sequential mode only
    python run_performance_tests.py --mode sequential
    
    # Concurrent mode only
    python run_performance_tests.py --mode concurrent
    
    # Both modes (default)
    python run_performance_tests.py

Options

  • --mode: Test mode - sequential, concurrent, or both (default: both)
  • --iterations: Number of iterations to run (default: 10)
  • --concurrency: Concurrency level for concurrent mode (default: 5)
  • --endpoint: Flexo API endpoint (default: https://flexo.openmbee.org)
  • --token: Bearer token for authentication (optional - defaults to FLEXO_TOKEN from .env)
  • --model: Path to test model JSON file (default: Flashlight.json)
  • --output: Output directory for charts and results (default: performance_results)
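The options above map onto an argparse interface along these lines. This is a sketch of the documented surface, not the script's actual source:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirrors the documented options and their defaults.
    p = argparse.ArgumentParser(description="Flexo performance test runner")
    p.add_argument("--mode", choices=["sequential", "concurrent", "both"],
                   default="both", help="test mode")
    p.add_argument("--iterations", type=int, default=10)
    p.add_argument("--concurrency", type=int, default=5)
    p.add_argument("--endpoint", default="https://flexo.openmbee.org")
    p.add_argument("--token", default=None,
                   help="falls back to FLEXO_TOKEN from .env")
    p.add_argument("--model", default="Flashlight.json")
    p.add_argument("--output", default="performance_results")
    return p

args = build_parser().parse_args(["--mode", "sequential", "--iterations", "20"])
```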

Examples

Simple sequential test (token from .env):

python run_performance_tests.py --mode sequential

Sequential test with 20 iterations:

python run_performance_tests.py --mode sequential --iterations 20

Concurrent test with 50 iterations and concurrency of 10:

python run_performance_tests.py --mode concurrent --iterations 50 --concurrency 10

Both modes with custom output directory:

python run_performance_tests.py --mode both --iterations 15 --output my_results

Override token from command line (if needed):

python run_performance_tests.py --mode sequential --token YOUR_TOKEN
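Concurrent mode caps the number of in-flight iterations at --concurrency. One way to express that cap, sketched here with an asyncio.Semaphore and dummy work rather than the script's actual implementation:

```python
import asyncio

async def run_concurrent(iterations: int, concurrency: int) -> list[int]:
    sem = asyncio.Semaphore(concurrency)
    done: list[int] = []

    async def one_iteration(i: int) -> None:
        async with sem:             # at most `concurrency` iterations run at once
            await asyncio.sleep(0)  # stand-in for the HTTP workflow
            done.append(i)

    await asyncio.gather(*(one_iteration(i) for i in range(iterations)))
    return done

completed = asyncio.run(run_concurrent(10, 5))
```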

Output

The script generates:

  • Performance charts: Response times by operation, response times over time, and triple count over time
  • JSON results: Detailed statistics and metrics in JSON format
  • Console summary: Performance statistics printed to console

All outputs are saved to the specified output directory (default: performance_results/).
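The JSON results might look roughly like this; the field names and layout here are illustrative, and the script's actual schema may differ:

```python
import json
import statistics
from pathlib import Path

# Illustrative timings; real values come from the measured HTTP calls.
timings = {"create_project": [0.21, 0.19, 0.25],
           "get_elements": [0.05, 0.07, 0.06]}

summary = {
    op: {"min": min(t), "max": max(t),
         "avg": statistics.mean(t), "count": len(t)}
    for op, t in timings.items()
}

out = Path("performance_results")
out.mkdir(exist_ok=True)
(out / "results.json").write_text(json.dumps(summary, indent=2))
```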

Test Workflow

Each test iteration follows this workflow:

  1. Create Project: Create a new project
  2. Commit Model: Commit the test model to the project
  3. Get Elements: Retrieve elements from the project
  4. Get Projects: Retrieve all projects (runs once, at the end)
  5. Track Triples: Monitor triple count over time
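Per-operation timing for one iteration can be sketched like this, with dummy coroutines standing in for the actual aiohttp calls:

```python
import asyncio
import time

async def timed(name: str, coro, results: dict) -> None:
    # Record the wall-clock duration of one awaited operation.
    start = time.perf_counter()
    await coro
    results.setdefault(name, []).append(time.perf_counter() - start)

async def run_iteration(results: dict) -> None:
    # Each step would be an HTTP call against the Flexo API (create project,
    # commit model, get elements); asyncio.sleep(0) stands in here.
    await timed("create_project", asyncio.sleep(0), results)
    await timed("commit_model", asyncio.sleep(0), results)
    await timed("get_elements", asyncio.sleep(0), results)

results: dict = {}
asyncio.run(run_iteration(results))
```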

The script tracks:

  • Response times for each operation (min, max, average)
  • Triple count growth over time
  • Performance metrics over time
