From 6f17d211e4d176b1e51a7bbf1f92cdaafea7a9f6 Mon Sep 17 00:00:00 2001 From: Naveen GOPU Date: Fri, 21 Nov 2025 12:34:02 +0530 Subject: [PATCH 1/4] NLB-7299: NginxaaS cost analysis tool for standardv3 plan --- .../billing/usage-and-cost-estimator.md | 126 ++++++ static/scripts/nginxaas_cost_analysis.py | 405 ++++++++++++++++++ 2 files changed, 531 insertions(+) create mode 100644 static/scripts/nginxaas_cost_analysis.py diff --git a/content/nginxaas-azure/billing/usage-and-cost-estimator.md b/content/nginxaas-azure/billing/usage-and-cost-estimator.md index 2ea97e8a1..9b788f374 100644 --- a/content/nginxaas-azure/billing/usage-and-cost-estimator.md +++ b/content/nginxaas-azure/billing/usage-and-cost-estimator.md @@ -145,3 +145,129 @@ Max( {{< /raw-html >}} + +

## Cost Analysis Tool for Standard V3 Plan

+ +

### Overview

+ +The NGINXaaS for Azure cost analysis tool provides a detailed hourly cost breakdown of your NGINXaaS deployment usage for each component (NCU, WAF, Ports, Data). It fetches real-time metrics directly from Azure Monitor using the Azure SDK and calculates costs based on actual usage. + +

### Prerequisites

+ +Before using the cost analysis script: + +1. **Python 3.7+** installed on your system +2. **pip3** (Python package manager) installed +3. **Azure SDK for Python** installed: + + ```bash + pip3 install azure-identity azure-mgmt-monitor + ``` + +4. **NGINXaaS for Azure deployment** with monitoring enabled +5. **Azure AD Tenant ID** (required for authentication) +6. **Monitoring Reader permissions** on your NGINXaaS resource + +
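After installing the packages in step 3, you can sanity-check the environment with a quick import test. This is only a convenience check (the exact one-liner is illustrative, not part of the tool):

```bash
# Confirm the Azure SDK modules used by the script can be imported.
python3 -c "import azure.identity, azure.mgmt.monitor; print('Azure SDK modules available')"
```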

### Setting up Azure Permissions

+ +Get Tenant ID: + +1. Go to Azure Portal → Microsoft Entra ID → Overview +2. Copy the Tenant ID + +Grant Access: + +1. Go to your NGINX resource → Access control (IAM) → Add role assignment +2. Role: Monitoring Reader → Assign to your user account + +
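If you prefer the Azure CLI to the portal steps above, a role assignment along these lines is a rough equivalent (the signed-in user and resource ID below are placeholders; substitute your own values):

```bash
# Print the tenant ID of the currently signed-in account.
az account show --query tenantId -o tsv

# Grant Monitoring Reader on the NGINXaaS deployment.
az role assignment create \
  --assignee "user@example.com" \
  --role "Monitoring Reader" \
  --scope "/subscriptions/xxx/resourceGroups/my-rg/providers/Nginx.NginxPlus/nginxDeployments/my-nginx"
```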

### Download and Usage

+ +#### Download Script + +{{}} {{}} + +#### Basic Usage + +Run the script with the required parameters: + +```bash +python3 nginxaas_cost_analysis.py \ + --resource-id "/subscriptions/xxx/resourceGroups/my-rg/providers/Nginx.NginxPlus/nginxDeployments/my-nginx" \ + --location "eastus2" \ + --date-range "2025-11-18T00:00:00Z/2025-11-19T23:59:59Z" \ + --tenant-id "your-tenant-id" \ + --output "my-cost-analysis.csv" +``` + +#### Required Parameters + +| Parameter | Description | Example | +|-------------------|---------------------------------------------|----------------------------------------------| +| `--resource-id` | Azure resource ID of NGINXaaS deployment | `/subscriptions/.../my-nginx` | +| `--location` | Azure region for pricing tier | `eastus2`, `westus2` | +| `--date-range` | Analysis period (max 30 days) | `2025-11-18T00:00:00Z/2025-11-19T23:59:59Z` | +| `--tenant-id` | Azure AD Tenant ID (required for login) | `12345678-1234-...` | +| `--output` | Output CSV filename (optional) | `my-cost-analysis.csv` | + +#### Sample Output + +{{< details "View sample output" >}} + +``` +🌐 Using InteractiveBrowserCredential with tenant: d106871e-7b91-4733-8423-f98586303b68 +📈 Processing 72 hours of data... +============================================================ +📈 COST ANALYSIS SUMMARY +============================================================ +Total Analysis Period: 72 hours +Total Cost: $32.40 + +🕐 HOURLY COST BREAKDOWN (First 5 hours): +------------------------------------------------------------ +Hour 1 - 2025-11-18T00:00:00Z + Fixed: $0.250 | WAF: $0.000 | NCU: $0.160 + Ports: $0.000 | Data: $0.000 | Total: $0.410 + +Hour 2 - 2025-11-18T01:00:00Z + Fixed: $0.250 | WAF: $0.000 | NCU: $0.160 + Ports: $0.000 | Data: $0.000 | Total: $0.410 + +Hour 3 - 2025-11-18T02:00:00Z + Fixed: $0.250 | WAF: $0.000 | NCU: $0.160 + Ports: $0.000 | Data: $0.000 | Total: $0.410 + +Hour 4 - 2025-11-18T03:00:00Z + Fixed: $0.250 | WAF: $0.000 | NCU: $0.160 + Ports: $0.000 | Data: $0.000 | Total: $0.410 + +Hour 5 - 2025-11-18T04:00:00Z + Fixed: $0.250 | WAF: $0.000 | NCU: $0.160 + Ports: $0.000 | Data: $0.000 | Total: $0.410 + +... and 67 more hours + +✅ Cost breakdown exported to nginxaas_cost_breakdown.csv + 📊 Summary: 72 hours, Total cost: $32.40 +✅ Cost analysis completed successfully! +``` + +{{< /details >}} + +

### Understanding the Results

+ +

### Cost Components

+ +- **Fixed costs**: Fixed deployment cost (varies by region and WAF usage) +- **NCU costs**: Variable costs based on actual NCU consumption +- **WAF costs**: Additional costs when Web Application Firewall is enabled +- **Port costs**: Additional costs for listen ports beyond the first 5 +- **Data processing**: Costs for data processed ($0.005/GB across all regions) + +
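As a rough illustration of how these components add up, the snippet below reproduces the $0.410/hour figure from the sample output above. It assumes a Tier 1 region such as eastus2, no WAF, 20 provisioned NCUs, no listen ports beyond the included 5, and negligible data processed; the rates mirror the Tier 1 values in `nginxaas_cost_analysis.py`:

```python
# Hypothetical hourly cost for a Tier 1 deployment without WAF.
fixed_per_hour = 0.25      # Tier 1 fixed deployment cost ($/hour)
ncu_rate = 0.008           # Tier 1 price per provisioned NCU ($/hour)
data_rate = 0.005          # data processing price ($/GB, all regions)

ncus_provisioned = 20      # assumed deployment size
data_processed_gb = 0.0    # assumed negligible traffic

hourly_cost = fixed_per_hour + ncus_provisioned * ncu_rate + data_processed_gb * data_rate
print(f"${hourly_cost:.3f}/hour")  # -> $0.410/hour
```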

### Additional Billing Resources

+ +For comprehensive billing information and cost planning, refer to these additional resources: + +- **[Usage and Cost Estimator]({{< relref "usage-and-cost-estimator.md" >}})**: Interactive tool for planning and estimating costs before deployment +- **[Billing Overview]({{< relref "overview.md" >}})**: Complete billing model explanation and pricing details + +This cost analysis tool helps you understand your actual NGINX for Azure spending by analyzing real usage metrics, enabling you to optimize costs and plan future deployments effectively. diff --git a/static/scripts/nginxaas_cost_analysis.py b/static/scripts/nginxaas_cost_analysis.py new file mode 100644 index 000000000..797860de4 --- /dev/null +++ b/static/scripts/nginxaas_cost_analysis.py @@ -0,0 +1,405 @@ +#!/usr/bin/env python3 +""" +NGINX for Azure Cost Analysis Tool + +This script analyzes your actual NGINX for Azure usage and calculates precise hourly costs +based on real Azure Monitor metrics with 1-minute granularity aggregated to hourly intervals. +""" + +from datetime import datetime +from azure.identity import InteractiveBrowserCredential +from azure.mgmt.monitor import MonitorManagementClient +import argparse +import sys + +# Tier-specific pricing for NGINX and NGINX + WAF +PRICING = { + "Tier 1": { + "fixed": {"NGINX": 0.25, "NGINX + WAF": 0.45}, + "ncu": {"NGINX": 0.008, "NGINX + WAF": 0.0144}, + "data_processing": 0.005 # Per GB + }, + "Tier 2": { + "fixed": {"NGINX": 0.33, "NGINX + WAF": 0.594}, + "ncu": {"NGINX": 0.01064, "NGINX + WAF": 0.01952}, + "data_processing": 0.005 # Per GB + }, + "Tier 3": { + "fixed": {"NGINX": 0.42, "NGINX + WAF": 0.75}, + "ncu": {"NGINX": 0.01328, "NGINX + WAF": 0.0239}, + "data_processing": 0.005 # Per GB + } +} + +# Regional tier mapping +TIER_MAPPING = { + "Tier 1": ["eastus2","centraluseuap", "northeurope", "southcentralus", "westcentralus", "westus2", "westus3"], + "Tier 2": ["canadacentral", "centralindia", "centralus", "eastus", "germanywestcentral", "koreacentral", + "northcentralus", "southeastasia", "swedencentral", "westeurope", "westus"], + "Tier 3": ["australiaeast", "brazilsouth", "japaneast", "southindia", "uksouth", "ukwest"] +} + +def determine_tier(location): + """Determine the regional tier based on location.""" + for tier, locations in TIER_MAPPING.items(): + if location in locations: + return tier + raise ValueError(f"Location '{location}' is not recognized in any tier mapping.") + +def validate_date_range(date_range): + """Validates that the date range does not exceed 30 days.""" + try: + start_date, end_date = date_range.split("/") + start_date = datetime.fromisoformat(start_date.replace("Z", "+00:00")) + end_date = datetime.fromisoformat(end_date.replace("Z", "+00:00")) + if end_date < start_date: + raise ValueError("End date must be greater than the start date.") + + days_diff = (end_date - start_date).days + if days_diff > 30: + raise ValueError(f"The date range exceeds the allowed maximum of 30 days. 
Current range: {days_diff} days.") + + except ValueError as e: + raise ValueError(f"Invalid date range: {e}") + +def get_metrics_azure_sdk(client, resource_id, metric_name, start_time, end_time): + """Fetch metrics using Azure Monitor SDK with 1-minute granularity.""" + metrics_data = client.metrics.list( + resource_uri=resource_id, + timespan=f"{start_time}/{end_time}", + interval="PT1M", # 1-minute intervals + metricnames=metric_name, + aggregation="Average,Maximum,Total", + ) + return metrics_data + +def aggregate_hourly_data(metrics_data, metric_name): + """Aggregate 1-minute metric data into hourly values.""" + from datetime import datetime, timezone + + # Check if we have valid SDK format data + if not (metrics_data.value and len(metrics_data.value) > 0 and + metrics_data.value[0].timeseries and len(metrics_data.value[0].timeseries) > 0): + return [] + + data_points = metrics_data.value[0].timeseries[0].data + hourly_aggregates = {} + + for point in data_points: + # SDK format - use time_stamp attribute + timestamp = point.time_stamp + hour_key = timestamp.replace(minute=0, second=0, microsecond=0) + + if hour_key not in hourly_aggregates: + hourly_aggregates[hour_key] = { + 'timestamp': hour_key.isoformat().replace('+00:00', 'Z'), + 'values': [], + 'totals': [] + } + + if metric_name == "system.interface.total_bytes": + total_val = point.total or 0 + hourly_aggregates[hour_key]['totals'].append(total_val) + else: + avg_val = point.average or 0 + max_val = point.maximum or 0 + hourly_aggregates[hour_key]['values'].append(max_val if max_val > 0 else avg_val) + + result = [] + for hour_key in sorted(hourly_aggregates.keys()): + hour_data = hourly_aggregates[hour_key] + + if metric_name == "system.interface.total_bytes": + final_value = sum(hour_data['totals']) + else: + final_value = max(hour_data['values']) if hour_data['values'] else 0 + + result.append({ + 'timestamp': hour_data['timestamp'], + 'value': final_value + }) + + return result + +def calculate_cost_breakdown(date_range, resource_id, location, credential, subscription_id=None): + """Calculates the cost breakdown for the specified date range and resource using Azure SDK.""" + validate_date_range(date_range) + + start_time, end_time = date_range.split("/") + + if subscription_id is None: + try: + subscription_id = resource_id.split('/')[2] + except IndexError: + raise ValueError("Unable to extract subscription ID from resource ID. 
Please provide it explicitly.") + + client = MonitorManagementClient(credential, subscription_id) + + tier = determine_tier(location) + pricing = PRICING[tier] + + waf_metrics_raw = get_metrics_azure_sdk(client, resource_id, "waf.enabled", start_time, end_time) + ports_metrics_raw = get_metrics_azure_sdk(client, resource_id, "ports.used", start_time, end_time) + ncu_metrics_raw = get_metrics_azure_sdk(client, resource_id, "ncu.provisioned", start_time, end_time) + network_metrics_raw = get_metrics_azure_sdk(client, resource_id, "system.interface.total_bytes", start_time, end_time) + + waf_hourly = aggregate_hourly_data(waf_metrics_raw, "waf.enabled") + ports_hourly = aggregate_hourly_data(ports_metrics_raw, "ports.used") + ncu_hourly = aggregate_hourly_data(ncu_metrics_raw, "ncu.provisioned") + network_hourly = aggregate_hourly_data(network_metrics_raw, "system.interface.total_bytes") + + cost_breakdown = [] + + if not (waf_hourly and ports_hourly and ncu_hourly and network_hourly): + raise Exception("No metric data available for the specified time range") + + min_length = min(len(waf_hourly), len(ports_hourly), len(ncu_hourly), len(network_hourly)) + print(f"📈 Processing {min_length} hours of data...") + + for i in range(min_length): + timestamp = waf_hourly[i]['timestamp'] + waf_enabled = waf_hourly[i]['value'] == 1 # 1 means WAF is enabled + ports_used = ports_hourly[i]['value'] + ncu_provisioned = ncu_hourly[i]['value'] + total_bytes = network_hourly[i]['value'] + + # Convert total bytes to GB + data_processed_gb = total_bytes / (1024 ** 3) if total_bytes else 0 + + # Check if deployment is active (both NCU and ports > 0 indicates active deployment) + deployment_active = ncu_provisioned > 0 or ports_used > 0 + + # Cost components - only charge fixed costs if deployment is active + if deployment_active: + fixed_deployment_cost = pricing["fixed"]["NGINX"] + waf_cost = pricing["fixed"]["NGINX + WAF"] - fixed_deployment_cost if waf_enabled else 0 + else: + fixed_deployment_cost = 0 + waf_cost = 0 + + base_ncu_cost = ncu_provisioned * pricing["ncu"]["NGINX"] + waf_ncu_cost = (ncu_provisioned * (pricing["ncu"]["NGINX + WAF"] - pricing["ncu"]["NGINX"])) if waf_enabled else 0 + + ports_cost = (max(ports_used - 5, 0) * pricing["ncu"]["NGINX"]) # Cost for ports > 5 + data_processing_cost = data_processed_gb * pricing["data_processing"] + + # Total hourly cost + total_cost = fixed_deployment_cost + waf_cost + base_ncu_cost + waf_ncu_cost + ports_cost + data_processing_cost + + # Append hourly cost breakdown + cost_breakdown.append({ + "timestamp": timestamp, + "fixed_deployment_cost": round(fixed_deployment_cost, 6), + "waf_cost": round(waf_cost, 6), + "base_ncu_cost": round(base_ncu_cost, 6), + "waf_ncu_cost": round(waf_ncu_cost, 6), + "ports_cost": round(ports_cost, 6), + "data_processing_cost": round(data_processing_cost, 6), + "total_cost": round(total_cost, 6) + }) + + return cost_breakdown + +def parse_arguments(): + """Parse command-line arguments.""" + parser = argparse.ArgumentParser( + description='NGINX for Azure Cost Analysis Tool (Interactive Login)', + formatter_class=argparse.RawDescriptionHelpFormatter, + epilog=""" +Example usage: + python3 nginxaas_cost_analysis.py \ + --resource-id "/subscriptions/xxx/resourceGroups/my-rg/providers/Nginx.NginxPlus/nginxDeployments/my-nginx" \ + --location "eastus2" \ + --date-range "2025-11-18T00:00:00Z/2025-11-19T23:59:59Z" \ + --tenant-id "your-tenant-id" \ + --output "my-cost-analysis.csv" + +Note: --tenant-id is required for authentication. 
+ """ + ) + + # Required arguments + parser.add_argument('--resource-id', '-r', required=True, + help='Azure resource ID of the NGINX deployment') + parser.add_argument('--location', '-l', required=True, + help='Azure region where NGINX is deployed (e.g., eastus2, westus2)') + parser.add_argument('--date-range', '-d', required=True, + help='Analysis period in ISO format: start/end (e.g., 2025-11-18T00:00:00Z/2025-11-19T23:59:59Z)') + + # Required tenant-id argument + parser.add_argument('--tenant-id', '-t', required=True, + help='Azure AD Tenant ID (required for authentication)') + parser.add_argument('--subscription-id', + help='Azure Subscription ID (extracted from resource-id if not provided)') + parser.add_argument('--output', '-o', default='nginxaas_cost_breakdown.csv', + help='Output CSV filename (default: nginxaas_cost_breakdown.csv)') + parser.add_argument('--verbose', '-v', action='store_true', + help='Enable verbose debug output') + + return parser.parse_args() + +def main(): + """Main function to run the cost breakdown analysis.""" + + args = parse_arguments() + + config = { + "subscription_id": args.subscription_id, + "resource_id": args.resource_id, + "location": args.location, + "date_range": args.date_range, + "tenant_id": args.tenant_id, + "output_file": args.output, + "verbose": args.verbose + } + + # Validate required arguments + if not config["resource_id"].startswith("/subscriptions/"): + print("❌ Error: Invalid resource ID format") + print(" Resource ID should start with /subscriptions/") + print(" Example: /subscriptions/xxx/resourceGroups/my-rg/providers/Nginx.NginxPlus/nginxDeployments/my-nginx") + return 1 + + if not config["date_range"] or "/" not in config["date_range"]: + print("❌ Error: Invalid date range format") + print(" Use format: start/end (e.g., 2025-11-18T00:00:00Z/2025-11-19T23:59:59Z)") + return 1 + + try: + print("🔧 Initializing NGINX for Azure Cost Analysis...") + print(f"📍 Location: {config['location']}") + print(f"📅 Date Range: {config['date_range']}") + print(f"🔗 Resource: {config['resource_id'].split('/')[-1]}") + print() + + # Authentication with Azure + print("🔐 Authenticating with Azure...") + + # Use InteractiveBrowserCredential with required tenant_id + credential = InteractiveBrowserCredential( + tenant_id=config["tenant_id"] + ) + print(f"🌐 Using InteractiveBrowserCredential with tenant: {config['tenant_id']}") + + # Run the cost calculation + result = calculate_cost_breakdown( + config["date_range"], + config["resource_id"], + config["location"], + credential, + config["subscription_id"] + ) + + # Display summary statistics + total_hours = len(result) + total_cost = sum(entry["total_cost"] for entry in result) + avg_hourly_cost = total_cost / total_hours if total_hours > 0 else 0 + + print("=" * 60) + print("📈 COST ANALYSIS SUMMARY") + print("=" * 60) + print(f"Total Analysis Period: {total_hours} hours") + print(f"Total Cost: ${total_cost:.2f}") + print() + + # Display detailed breakdown for first few hours + print("🕐 HOURLY COST BREAKDOWN (First 5 hours):") + print("-" * 60) + for i, entry in enumerate(result[:5]): + print(f"Hour {i+1} - {entry['timestamp']}") + print(f" Fixed: ${entry['fixed_deployment_cost']:.3f} | WAF: ${entry['waf_cost']:.3f} | NCU: ${entry['base_ncu_cost']:.3f}") + print(f" Ports: ${entry['ports_cost']:.3f} | Data: ${entry['data_processing_cost']:.3f} | Total: ${entry['total_cost']:.3f}") + print() + + if total_hours > 5: + print(f"... 
and {total_hours - 5} more hours") + print() + + # Export to CSV + export_to_csv(result, config["output_file"]) + + print("✅ Cost analysis completed successfully!") + return 0 + + except Exception as e: + error_message = str(e) + print(f"❌ Error during cost analysis: {error_message}") + + if "authorization" in error_message.lower() or "403" in error_message: + print("\n🔐 PERMISSIONS ERROR") + print("=" * 25) + print("Your Azure account needs access to read metrics from this NGINX resource.") + print("This typically requires 'Monitoring Reader' or 'Reader' role on the resource.") + else: + print(f"\n💡 Please check:") + print(" • Your Azure permissions (Monitoring Reader role)") + print(" • That the resource ID is correct") + print(" • That the date range is within the last 30 days") + print(" • Try running 'az login' first for DefaultAzureCredential") + + return 1 + +def export_to_csv(cost_breakdown, filename="nginx_cost_breakdown.csv"): + """Export cost breakdown to CSV file.""" + import csv + + if not cost_breakdown: + print("⚠️ No data to export") + return + + try: + with open(filename, 'w', newline='') as csvfile: + fieldnames = cost_breakdown[0].keys() + header_mapping = { + "timestamp": "Timestamp", + "fixed_deployment_cost": "Base Fixed Cost ($USD)", + "waf_cost": "WAF Cost ($USD)", + "base_ncu_cost": "Base NCU Cost ($USD)", + "waf_ncu_cost": "WAF NCU Cost ($USD)", + "ports_cost": "Ports Cost ($USD)", + "data_processing_cost": "Data Processing Cost ($USD)", + "total_cost": "Total Cost ($USD)" + } + + writer = csv.DictWriter(csvfile, fieldnames=fieldnames) + + # Write custom headers + writer.writerow(header_mapping) + + # Write hourly data + for row in cost_breakdown: + writer.writerow(row) + + # Calculate and write summary totals + total_hours = len(cost_breakdown) + total_fixed_deployment = sum(entry["fixed_deployment_cost"] for entry in cost_breakdown) + total_waf = sum(entry["waf_cost"] for entry in cost_breakdown) + total_base_ncu = sum(entry["base_ncu_cost"] for entry in cost_breakdown) + total_waf_ncu = sum(entry["waf_ncu_cost"] for entry in cost_breakdown) + total_ports = sum(entry["ports_cost"] for entry in cost_breakdown) + total_data_processing = sum(entry["data_processing_cost"] for entry in cost_breakdown) + total_cost = sum(entry["total_cost"] for entry in cost_breakdown) + + # Add separator row + writer.writerow({field: "" for field in fieldnames}) + + # Add totals row with dollar signs + totals_row = { + "timestamp": f"TOTALS ({total_hours} hours)", + "fixed_deployment_cost": f"${round(total_fixed_deployment, 4):.4f}", + "waf_cost": f"${round(total_waf, 4):.4f}", + "base_ncu_cost": f"${round(total_base_ncu, 4):.4f}", + "waf_ncu_cost": f"${round(total_waf_ncu, 4):.4f}", + "ports_cost": f"${round(total_ports, 4):.4f}", + "data_processing_cost": f"${round(total_data_processing, 4):.4f}", + "total_cost": f"${round(total_cost, 2):.2f}" + } + writer.writerow(totals_row) + + print(f"✅ Cost breakdown exported to {filename}") + print(f" 📊 Summary: {total_hours} hours, Total cost: ${total_cost:.2f}") + except Exception as e: + print(f"❌ Error exporting to CSV: {e}") + +if __name__ == "__main__": + sys.exit(main()) \ No newline at end of file From f81c54429d68f57647778622518bebcc4612d325 Mon Sep 17 00:00:00 2001 From: Naveen GOPU Date: Mon, 24 Nov 2025 11:37:04 +0530 Subject: [PATCH 2/4] NLB-7299: Cleanedup code to remove print statements --- .../billing/usage-and-cost-estimator.md | 40 +---- static/scripts/nginxaas_cost_analysis.py | 151 +++++------------- 2 files changed, 45 
insertions(+), 146 deletions(-) diff --git a/content/nginxaas-azure/billing/usage-and-cost-estimator.md b/content/nginxaas-azure/billing/usage-and-cost-estimator.md index 9b788f374..91a1cfb11 100644 --- a/content/nginxaas-azure/billing/usage-and-cost-estimator.md +++ b/content/nginxaas-azure/billing/usage-and-cost-estimator.md @@ -164,7 +164,7 @@ Before using the cost analysis script: pip3 install azure-identity azure-mgmt-monitor ``` -4. **NGINXaaS for Azure deployment** with monitoring enabled +4. **NGINXaaS for Azure deployment on Standard V3 Plan** with monitoring enabled 5. **Azure AD Tenant ID** (required for authentication) 6. **Monitoring Reader permissions** on your NGINXaaS resource @@ -214,41 +214,9 @@ python3 nginxaas_cost_analysis.py \ {{< details "View sample output" >}} ``` -🌐 Using InteractiveBrowserCredential with tenant: d106871e-7b91-4733-8423-f98586303b68 -📈 Processing 72 hours of data... -============================================================ -📈 COST ANALYSIS SUMMARY -============================================================ -Total Analysis Period: 72 hours -Total Cost: $32.40 - -🕐 HOURLY COST BREAKDOWN (First 5 hours): ------------------------------------------------------------- -Hour 1 - 2025-11-18T00:00:00Z - Fixed: $0.250 | WAF: $0.000 | NCU: $0.160 - Ports: $0.000 | Data: $0.000 | Total: $0.410 - -Hour 2 - 2025-11-18T01:00:00Z - Fixed: $0.250 | WAF: $0.000 | NCU: $0.160 - Ports: $0.000 | Data: $0.000 | Total: $0.410 - -Hour 3 - 2025-11-18T02:00:00Z - Fixed: $0.250 | WAF: $0.000 | NCU: $0.160 - Ports: $0.000 | Data: $0.000 | Total: $0.410 - -Hour 4 - 2025-11-18T03:00:00Z - Fixed: $0.250 | WAF: $0.000 | NCU: $0.160 - Ports: $0.000 | Data: $0.000 | Total: $0.410 - -Hour 5 - 2025-11-18T04:00:00Z - Fixed: $0.250 | WAF: $0.000 | NCU: $0.160 - Ports: $0.000 | Data: $0.000 | Total: $0.410 - -... and 67 more hours - -✅ Cost breakdown exported to nginxaas_cost_breakdown.csv - 📊 Summary: 72 hours, Total cost: $32.40 -✅ Cost analysis completed successfully! +Cost breakdown exported to nginxaas_cost_breakdown.csv +Summary: 96 hours, Total cost: $71.66 +Cost analysis completed successfully! 
``` {{< /details >}} diff --git a/static/scripts/nginxaas_cost_analysis.py b/static/scripts/nginxaas_cost_analysis.py index 797860de4..62b5b4f7d 100644 --- a/static/scripts/nginxaas_cost_analysis.py +++ b/static/scripts/nginxaas_cost_analysis.py @@ -69,7 +69,7 @@ def get_metrics_azure_sdk(client, resource_id, metric_name, start_time, end_time timespan=f"{start_time}/{end_time}", interval="PT1M", # 1-minute intervals metricnames=metric_name, - aggregation="Average,Maximum,Total", + aggregation="Total", ) return metrics_data @@ -86,24 +86,17 @@ def aggregate_hourly_data(metrics_data, metric_name): hourly_aggregates = {} for point in data_points: - # SDK format - use time_stamp attribute + if point.total is None: + continue timestamp = point.time_stamp hour_key = timestamp.replace(minute=0, second=0, microsecond=0) - if hour_key not in hourly_aggregates: hourly_aggregates[hour_key] = { 'timestamp': hour_key.isoformat().replace('+00:00', 'Z'), - 'values': [], 'totals': [] } - - if metric_name == "system.interface.total_bytes": - total_val = point.total or 0 - hourly_aggregates[hour_key]['totals'].append(total_val) - else: - avg_val = point.average or 0 - max_val = point.maximum or 0 - hourly_aggregates[hour_key]['values'].append(max_val if max_val > 0 else avg_val) + total_val = point.total + hourly_aggregates[hour_key]['totals'].append(total_val) result = [] for hour_key in sorted(hourly_aggregates.keys()): @@ -112,7 +105,7 @@ def aggregate_hourly_data(metrics_data, metric_name): if metric_name == "system.interface.total_bytes": final_value = sum(hour_data['totals']) else: - final_value = max(hour_data['values']) if hour_data['values'] else 0 + final_value = max(hour_data['totals']) if hour_data['totals'] else 0 result.append({ 'timestamp': hour_data['timestamp'], @@ -141,59 +134,48 @@ def calculate_cost_breakdown(date_range, resource_id, location, credential, subs waf_metrics_raw = get_metrics_azure_sdk(client, resource_id, "waf.enabled", start_time, end_time) ports_metrics_raw = get_metrics_azure_sdk(client, resource_id, "ports.used", start_time, end_time) ncu_metrics_raw = get_metrics_azure_sdk(client, resource_id, "ncu.provisioned", start_time, end_time) - network_metrics_raw = get_metrics_azure_sdk(client, resource_id, "system.interface.total_bytes", start_time, end_time) + data_processed_metrics_raw = get_metrics_azure_sdk(client, resource_id, "system.interface.total_bytes", start_time, end_time) waf_hourly = aggregate_hourly_data(waf_metrics_raw, "waf.enabled") ports_hourly = aggregate_hourly_data(ports_metrics_raw, "ports.used") ncu_hourly = aggregate_hourly_data(ncu_metrics_raw, "ncu.provisioned") - network_hourly = aggregate_hourly_data(network_metrics_raw, "system.interface.total_bytes") + data_processed_hourly = aggregate_hourly_data(data_processed_metrics_raw, "system.interface.total_bytes") cost_breakdown = [] - if not (waf_hourly and ports_hourly and ncu_hourly and network_hourly): + if not (waf_hourly and ports_hourly and ncu_hourly and data_processed_hourly): raise Exception("No metric data available for the specified time range") - min_length = min(len(waf_hourly), len(ports_hourly), len(ncu_hourly), len(network_hourly)) - print(f"📈 Processing {min_length} hours of data...") - + min_length = min(len(waf_hourly), len(ports_hourly), len(ncu_hourly), len(data_processed_hourly)) for i in range(min_length): timestamp = waf_hourly[i]['timestamp'] - waf_enabled = waf_hourly[i]['value'] == 1 # 1 means WAF is enabled + waf_enabled = float(waf_hourly[i]['value']) == 1.0 # 1.0 means 
WAF is enabled ports_used = ports_hourly[i]['value'] ncu_provisioned = ncu_hourly[i]['value'] - total_bytes = network_hourly[i]['value'] + total_bytes = data_processed_hourly[i]['value'] # Convert total bytes to GB data_processed_gb = total_bytes / (1024 ** 3) if total_bytes else 0 - # Check if deployment is active (both NCU and ports > 0 indicates active deployment) - deployment_active = ncu_provisioned > 0 or ports_used > 0 - - # Cost components - only charge fixed costs if deployment is active - if deployment_active: - fixed_deployment_cost = pricing["fixed"]["NGINX"] - waf_cost = pricing["fixed"]["NGINX + WAF"] - fixed_deployment_cost if waf_enabled else 0 - else: - fixed_deployment_cost = 0 - waf_cost = 0 - + base_deployment_cost = pricing["fixed"]["NGINX"] + waf_deployment_cost = pricing["fixed"]["NGINX + WAF"] - base_deployment_cost if waf_enabled else 0 base_ncu_cost = ncu_provisioned * pricing["ncu"]["NGINX"] waf_ncu_cost = (ncu_provisioned * (pricing["ncu"]["NGINX + WAF"] - pricing["ncu"]["NGINX"])) if waf_enabled else 0 - ports_cost = (max(ports_used - 5, 0) * pricing["ncu"]["NGINX"]) # Cost for ports > 5 + ports_ncu_cost = (max(ports_used - 5, 0) * 2 * pricing["ncu"]["NGINX"]) # Cost for ports > 5 data_processing_cost = data_processed_gb * pricing["data_processing"] # Total hourly cost - total_cost = fixed_deployment_cost + waf_cost + base_ncu_cost + waf_ncu_cost + ports_cost + data_processing_cost + total_cost = base_deployment_cost + waf_deployment_cost + base_ncu_cost + waf_ncu_cost + ports_ncu_cost + data_processing_cost # Append hourly cost breakdown cost_breakdown.append({ "timestamp": timestamp, - "fixed_deployment_cost": round(fixed_deployment_cost, 6), - "waf_cost": round(waf_cost, 6), + "base_deployment_cost": round(base_deployment_cost, 6), + "waf_deployment_cost": round(waf_deployment_cost, 6), "base_ncu_cost": round(base_ncu_cost, 6), "waf_ncu_cost": round(waf_ncu_cost, 6), - "ports_cost": round(ports_cost, 6), + "ports_ncu_cost": round(ports_ncu_cost, 6), "data_processing_cost": round(data_processing_cost, 6), "total_cost": round(total_cost, 6) }) @@ -225,16 +207,13 @@ def parse_arguments(): help='Azure region where NGINX is deployed (e.g., eastus2, westus2)') parser.add_argument('--date-range', '-d', required=True, help='Analysis period in ISO format: start/end (e.g., 2025-11-18T00:00:00Z/2025-11-19T23:59:59Z)') - - # Required tenant-id argument parser.add_argument('--tenant-id', '-t', required=True, help='Azure AD Tenant ID (required for authentication)') + # Optional arguments parser.add_argument('--subscription-id', help='Azure Subscription ID (extracted from resource-id if not provided)') parser.add_argument('--output', '-o', default='nginxaas_cost_breakdown.csv', help='Output CSV filename (default: nginxaas_cost_breakdown.csv)') - parser.add_argument('--verbose', '-v', action='store_true', - help='Enable verbose debug output') return parser.parse_args() @@ -250,36 +229,22 @@ def main(): "date_range": args.date_range, "tenant_id": args.tenant_id, "output_file": args.output, - "verbose": args.verbose } # Validate required arguments if not config["resource_id"].startswith("/subscriptions/"): - print("❌ Error: Invalid resource ID format") - print(" Resource ID should start with /subscriptions/") - print(" Example: /subscriptions/xxx/resourceGroups/my-rg/providers/Nginx.NginxPlus/nginxDeployments/my-nginx") + print("Error: Invalid resource ID format.\nResource ID should start with /subscriptions/.\nExample: 
/subscriptions/xxx/resourceGroups/my-rg/providers/Nginx.NginxPlus/nginxDeployments/my-nginx") return 1 - + if not config["date_range"] or "/" not in config["date_range"]: - print("❌ Error: Invalid date range format") - print(" Use format: start/end (e.g., 2025-11-18T00:00:00Z/2025-11-19T23:59:59Z)") + print("Error: Invalid date range format.\nUse format: start/end (e.g., 2025-11-18T00:00:00Z/2025-11-19T23:59:59Z)") return 1 - try: - print("🔧 Initializing NGINX for Azure Cost Analysis...") - print(f"📍 Location: {config['location']}") - print(f"📅 Date Range: {config['date_range']}") - print(f"🔗 Resource: {config['resource_id'].split('/')[-1]}") - print() - - # Authentication with Azure - print("🔐 Authenticating with Azure...") - + try: # Use InteractiveBrowserCredential with required tenant_id credential = InteractiveBrowserCredential( tenant_id=config["tenant_id"] ) - print(f"🌐 Using InteractiveBrowserCredential with tenant: {config['tenant_id']}") # Run the cost calculation result = calculate_cost_breakdown( @@ -290,53 +255,19 @@ def main(): config["subscription_id"] ) - # Display summary statistics - total_hours = len(result) - total_cost = sum(entry["total_cost"] for entry in result) - avg_hourly_cost = total_cost / total_hours if total_hours > 0 else 0 - - print("=" * 60) - print("📈 COST ANALYSIS SUMMARY") - print("=" * 60) - print(f"Total Analysis Period: {total_hours} hours") - print(f"Total Cost: ${total_cost:.2f}") - print() - - # Display detailed breakdown for first few hours - print("🕐 HOURLY COST BREAKDOWN (First 5 hours):") - print("-" * 60) - for i, entry in enumerate(result[:5]): - print(f"Hour {i+1} - {entry['timestamp']}") - print(f" Fixed: ${entry['fixed_deployment_cost']:.3f} | WAF: ${entry['waf_cost']:.3f} | NCU: ${entry['base_ncu_cost']:.3f}") - print(f" Ports: ${entry['ports_cost']:.3f} | Data: ${entry['data_processing_cost']:.3f} | Total: ${entry['total_cost']:.3f}") - print() - - if total_hours > 5: - print(f"... 
and {total_hours - 5} more hours") - print() - # Export to CSV export_to_csv(result, config["output_file"]) - print("✅ Cost analysis completed successfully!") + print("Cost analysis completed successfully!") return 0 except Exception as e: error_message = str(e) - print(f"❌ Error during cost analysis: {error_message}") - + print(f"Error during cost analysis: {error_message}") if "authorization" in error_message.lower() or "403" in error_message: - print("\n🔐 PERMISSIONS ERROR") - print("=" * 25) - print("Your Azure account needs access to read metrics from this NGINX resource.") - print("This typically requires 'Monitoring Reader' or 'Reader' role on the resource.") + print("\nPERMISSIONS ERROR\n" + "=" * 25 + "\nYour Azure account needs access to read metrics from this NGINX resource.\nThis typically requires 'Monitoring Reader' or 'Reader' role on the resource.") else: - print(f"\n💡 Please check:") - print(" • Your Azure permissions (Monitoring Reader role)") - print(" • That the resource ID is correct") - print(" • That the date range is within the last 30 days") - print(" • Try running 'az login' first for DefaultAzureCredential") - + print("\nPlease check:\n - Your Azure permissions (Monitoring Reader role)\n - That the resource ID is correct\n - That the date range is within the last 30 days\n") return 1 def export_to_csv(cost_breakdown, filename="nginx_cost_breakdown.csv"): @@ -344,7 +275,7 @@ def export_to_csv(cost_breakdown, filename="nginx_cost_breakdown.csv"): import csv if not cost_breakdown: - print("⚠️ No data to export") + print("No data to export") return try: @@ -352,11 +283,11 @@ def export_to_csv(cost_breakdown, filename="nginx_cost_breakdown.csv"): fieldnames = cost_breakdown[0].keys() header_mapping = { "timestamp": "Timestamp", - "fixed_deployment_cost": "Base Fixed Cost ($USD)", - "waf_cost": "WAF Cost ($USD)", + "base_deployment_cost": "Base Deployment Cost ($USD)", + "waf_deployment_cost": "WAF Deployment Cost ($USD)", "base_ncu_cost": "Base NCU Cost ($USD)", "waf_ncu_cost": "WAF NCU Cost ($USD)", - "ports_cost": "Ports Cost ($USD)", + "ports_ncu_cost": "Ports NCU Cost ($USD)", "data_processing_cost": "Data Processing Cost ($USD)", "total_cost": "Total Cost ($USD)" } @@ -372,11 +303,11 @@ def export_to_csv(cost_breakdown, filename="nginx_cost_breakdown.csv"): # Calculate and write summary totals total_hours = len(cost_breakdown) - total_fixed_deployment = sum(entry["fixed_deployment_cost"] for entry in cost_breakdown) - total_waf = sum(entry["waf_cost"] for entry in cost_breakdown) + total_fixed_deployment = sum(entry["base_deployment_cost"] for entry in cost_breakdown) + total_waf = sum(entry["waf_deployment_cost"] for entry in cost_breakdown) total_base_ncu = sum(entry["base_ncu_cost"] for entry in cost_breakdown) total_waf_ncu = sum(entry["waf_ncu_cost"] for entry in cost_breakdown) - total_ports = sum(entry["ports_cost"] for entry in cost_breakdown) + total_ports = sum(entry["ports_ncu_cost"] for entry in cost_breakdown) total_data_processing = sum(entry["data_processing_cost"] for entry in cost_breakdown) total_cost = sum(entry["total_cost"] for entry in cost_breakdown) @@ -386,20 +317,20 @@ def export_to_csv(cost_breakdown, filename="nginx_cost_breakdown.csv"): # Add totals row with dollar signs totals_row = { "timestamp": f"TOTALS ({total_hours} hours)", - "fixed_deployment_cost": f"${round(total_fixed_deployment, 4):.4f}", - "waf_cost": f"${round(total_waf, 4):.4f}", + "base_deployment_cost": f"${round(total_fixed_deployment, 4):.4f}", + 
"waf_deployment_cost": f"${round(total_waf, 4):.4f}", "base_ncu_cost": f"${round(total_base_ncu, 4):.4f}", "waf_ncu_cost": f"${round(total_waf_ncu, 4):.4f}", - "ports_cost": f"${round(total_ports, 4):.4f}", + "ports_ncu_cost": f"${round(total_ports, 4):.4f}", "data_processing_cost": f"${round(total_data_processing, 4):.4f}", "total_cost": f"${round(total_cost, 2):.2f}" } writer.writerow(totals_row) - print(f"✅ Cost breakdown exported to {filename}") - print(f" 📊 Summary: {total_hours} hours, Total cost: ${total_cost:.2f}") + print(f"Cost breakdown exported to {filename}") + print(f"Summary: {total_hours} hours, Total cost: ${total_cost:.2f}") except Exception as e: - print(f"❌ Error exporting to CSV: {e}") + print(f"Error exporting to CSV: {e}") if __name__ == "__main__": sys.exit(main()) \ No newline at end of file From d088aebd19ce597dbfac0dcabd5287b29a00619d Mon Sep 17 00:00:00 2001 From: Naveen GOPU Date: Mon, 24 Nov 2025 23:16:41 +0530 Subject: [PATCH 3/4] NLB-7299: Remove implementation details --- content/nginxaas-azure/billing/usage-and-cost-estimator.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/content/nginxaas-azure/billing/usage-and-cost-estimator.md b/content/nginxaas-azure/billing/usage-and-cost-estimator.md index 91a1cfb11..91eac3b82 100644 --- a/content/nginxaas-azure/billing/usage-and-cost-estimator.md +++ b/content/nginxaas-azure/billing/usage-and-cost-estimator.md @@ -150,7 +150,7 @@ Max(

### Overview

-The NGINXaaS for Azure cost analysis tool provides a detailed hourly cost breakdown of your NGINXaaS deployment usage for each component (NCU, WAF, Ports, Data). It fetches real-time metrics directly from Azure Monitor using the Azure SDK and calculates costs based on actual usage. +The NGINXaaS for Azure cost analysis tool provides a detailed hourly cost breakdown of your NGINXaaS deployment usage for each component (NCU, WAF, Ports, Data). It fetches real-time metrics directly from Azure Monitor and calculates costs based on actual usage.

### Prerequisites

From 959aa0237127a3601ca524e7971b28b73c335385 Mon Sep 17 00:00:00 2001 From: Naveen GOPU Date: Tue, 9 Dec 2025 10:31:21 +0530 Subject: [PATCH 4/4] NLB-7299: Update headings to sentence case --- .../billing/usage-and-cost-estimator.md | 20 +++++++++---------- 1 file changed, 10 insertions(+), 10 deletions(-) diff --git a/content/nginxaas-azure/billing/usage-and-cost-estimator.md b/content/nginxaas-azure/billing/usage-and-cost-estimator.md index 91eac3b82..d194ccbb6 100644 --- a/content/nginxaas-azure/billing/usage-and-cost-estimator.md +++ b/content/nginxaas-azure/billing/usage-and-cost-estimator.md @@ -146,7 +146,7 @@ Max( {{< /raw-html >}} -

## Cost Analysis Tool for Standard V3 Plan

+

## Cost analysis tool for standard V3 plan

Overview

@@ -168,7 +168,7 @@ Before using the cost analysis script: 5. **Azure AD Tenant ID** (required for authentication) 6. **Monitoring Reader permissions** on your NGINXaaS resource -

### Setting up Azure Permissions

+

### Setting up Azure permissions

Get Tenant ID: @@ -180,13 +180,13 @@ Before using the cost analysis script: 1. Go to your NGINX resource → Access control (IAM) → Add role assignment 2. Role: Monitoring Reader → Assign to your user account -

### Download and Usage

+

### Download and usage

-#### Download Script +#### Download script {{}} {{}} -#### Basic Usage +#### Basic usage Run the script with the required parameters: @@ -199,7 +199,7 @@ python3 nginxaas_cost_analysis.py \ --output "my-cost-analysis.csv" ``` -#### Required Parameters +#### Required parameters | Parameter | Description | Example | |-------------------|---------------------------------------------|----------------------------------------------| @@ -209,7 +209,7 @@ python3 nginxaas_cost_analysis.py \ | `--tenant-id` | Azure AD Tenant ID (required for login) | `12345678-1234-...` | | `--output` | Output CSV filename (optional) | `my-cost-analysis.csv` | -#### Sample Output +#### Sample output {{< details "View sample output" >}} @@ -221,9 +221,9 @@ Cost analysis completed successfully! {{< /details >}} -

### Understanding the Results

+

### Understanding the results

-

### Cost Components

+

### Cost components

- **Fixed costs**: Fixed deployment cost (varies by region and WAF usage) - **NCU costs**: Variable costs based on actual NCU consumption @@ -231,7 +231,7 @@ Cost analysis completed successfully! - **Port costs**: Additional costs for listen ports beyond the first 5 - **Data processing**: Costs for data processed ($0.005/GB across all regions) -

### Additional Billing Resources

+

### Additional billing resources

For comprehensive billing information and cost planning, refer to these additional resources: