Stop Manually Downloading Event Logs: Automate Your API Monitoring

Monitoring API usage isn’t just a technical task — it’s a core part of keeping your org healthy. As an admin, you’re the guardian of your platform’s limits and performance. Having a clear view of your API consumption helps you answer leadership questions, flag risky integrations before they break, and reduce overall operational risk. 

The Event Log Browser provides access to API Total Usage events — CSV files that show every API call made to your Salesforce org. But here’s the challenge: Standard orgs only retain Event Log Files for 24 hours. Miss a single day and your data is gone.

If you need to track API consumption for capacity planning, troubleshoot integration issues, or provide usage reports to your team, this manual daily process simply doesn’t scale. This post shows you how to automate it so you can stop “firefighting” and start managing your org proactively.

Why automate?

  • Zero data loss: Capture every event within the 24-hour retention window.
  • Historical records: Build a long-term database for trend analysis and capacity planning. Answer “Why did we spike yesterday?” with data, not guesses.
  • Operational peace of mind: Set it once and let it run automatically.
  • Reduce risk: No more forgotten downloads or manual mistakes.

Who is this for? 

This approach is ideal if you’re an admin who’s comfortable with light scripting or already working with command line interface (CLI) tools. Even if you don’t plan on maintaining the code yourself, understanding this flow is a massive level up. It gives you the language you need to partner with a developer or IT teammate to get this into production. If you’re the admin who owns integrations, works closely with dev teams, or gets asked about “hitting the limits,” this is a powerful pattern to have in your toolkit.

Why this matters for admins

When the script runs, it’s also generating a log file. If an integration breaks or a sync goes haywire, you won’t be scrambling. You’ll have a timestamped audit trail ready for your next meeting with IT. 

This isn’t just about a Python script — it’s about operational excellence. You’re moving from being an admin who reacts to “Limit Exceeded” emails to an admin who predicts them.

Automate event log file downloads in 5 steps

The following flow illustrates how the script interacts with Salesforce securely.

Script architecture for API total usage.

Step 1: Install the prerequisites

These tools form the foundation for automated, unattended data collection. Most admins already have these installed if they work with CLI tools.

Check if you have Python installed:

# Check Python version
python3 --version

If you don’t have Python:

  • Mac: brew install python3
  • Windows: Download from python.org
  • Linux: sudo apt install python3 (Ubuntu/Debian)

Salesforce CLI

The script uses the CLI to manage secure authentication.

# Install via npm (cross-platform)
npm install -g @salesforce/cli

# Verify installation
sf --version

Step 2: Create an External Client App

Modern Salesforce authentication uses External Client Apps (ECAs) with the JSON Web Token (JWT) flow. This is more secure than traditional Connected Apps, and it’s what Salesforce recommends for automated, server-to-server access. Getting security right here protects your org from unauthorized access while enabling reliable automation.

Why JWT?

JWT authentication uses certificate-based verification instead of passwords, which means no credentials are stored or transmitted that could be compromised. It’s designed for exactly this use case: automated processes that need to authenticate without human interaction, making it ideal for scheduled scripts and meeting enterprise security standards.
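To make that concrete, the JWT bearer flow sends a short-lived set of signed claims instead of a password. The Salesforce CLI builds and signs the real token for you, so you never write this yourself; the sketch below (with a placeholder Consumer Key) is only to demystify what gets sent.

```python
import json
import time

def build_jwt_claims(consumer_key: str, username: str,
                     audience: str = "https://login.salesforce.com") -> dict:
    """Claims for the Salesforce JWT bearer flow. The Salesforce CLI
    builds and signs the real token; this is illustration only."""
    return {
        "iss": consumer_key,            # the ECA's Consumer Key
        "sub": username,                # the integration user
        "aud": audience,                # login endpoint that validates the token
        "exp": int(time.time()) + 180,  # short-lived: expires in 3 minutes
    }

# "3MVG9...EXAMPLE" is a placeholder, not a real Consumer Key
print(json.dumps(build_jwt_claims("3MVG9...EXAMPLE", "admin@yourcompany.com"), indent=2))
```

Because the token expires within minutes and is signed with your private key, intercepting one is far less damaging than leaking a password.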

Generate certificates

First, create a private key and certificate.

# Generate private key (keep this secure - never share it)
openssl genrsa -out server.key 2048

# Generate self-signed certificate (upload this to Salesforce)
openssl req -new -x509 -key server.key -out server.crt -days 365

# Set secure permissions on private key (Mac/Linux)
chmod 600 server.key

Important: You’ll have two files:

  • server.crt (Certificate): Upload this to Salesforce (one-time setup).
  • server.key (Private Key): Use this when running the script (keep it secure).

When you generate the certificate, openssl will ask you a few questions (Country, Organization, etc.), and you can hit Enter through most of them. The defaults work fine for internal use.

Configure the ECA in Salesforce

  1. Navigate to Setup → App Manager → New External Client App.
  2. Fill in the basic information:
    • App Name: API Usage Monitor
    • API Name: API_Usage_Monitor
  3. Under OAuth Settings, enable “Use digital signatures”.
  4. Upload your server.crt certificate file.
  5. Add these OAuth scopes:
    • Manage user data via APIs (api)
    • Manage user data via Web browsers (web)
    • Perform requests at any time (refresh_token, offline_access)
  6. Save and note the Consumer Key (Client ID).

This setup allows the script to authenticate without storing passwords, which is critical for meeting security compliance requirements and audit reviews.

Set up user permissions

Assign appropriate permissions to the user who will run the script. The user must have the View Event Log Files permission, which grants Read access to Event Log Files. Keep it minimal: This user needs that access and nothing more.
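To see what that least-privilege access has to cover, here’s the shape of the SOQL query such a script issues against the EventLogFile object. The exact query inside the script may differ; this is an illustration of the request the integration user’s permissions must allow.

```python
from datetime import datetime, timedelta, timezone

def build_event_log_query(day: datetime) -> str:
    """SOQL for one day's ApiTotalUsage Event Log Files. Illustrative;
    the script's actual query may differ."""
    start = day.strftime("%Y-%m-%dT00:00:00.000Z")
    end = (day + timedelta(days=1)).strftime("%Y-%m-%dT00:00:00.000Z")
    return (
        "SELECT Id, EventType, LogDate, LogFileLength FROM EventLogFile "
        "WHERE EventType = 'ApiTotalUsage' "
        f"AND LogDate >= {start} AND LogDate < {end}"
    )

# Query yesterday's log files (they describe the previous day's API calls)
yesterday = datetime.now(timezone.utc) - timedelta(days=1)
print(build_event_log_query(yesterday))
```

If this query fails with an insufficient-access error when run as the integration user, the View Event Log Files permission is missing.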

The ECA creation with JWT Bearer flow.

Step 3: Download and configure the script

Now we’ll get the actual script and set up authentication. This step creates the foundation for hands-off daily collection. Once configured, it runs without your involvement.

Get the script

# Clone the repository
git clone https://github.com/riyasurisalesforce/ApiTotalUsageScript.git

# Navigate to the script directory
cd ApiTotalUsageScript

# Install Python dependencies
pip install -r requirements.txt

The script only needs one external library (requests), which keeps things simple and reduces maintenance overhead.

Step 4: Test the script

Before setting up the scheduled job, validate that the script works for you by running the command manually once. Here’s an example.

Tip: Run `python3 extract_total_usage_calls.py --help` to see all required and optional parameters with their descriptions. To set up an org alias (if you don’t have one), refer to the README.

List of optional and required arguments.

Now you can run the command as follows:

# Run the script manually
python3 extract_total_usage_calls.py \
  --client-id "YOUR_CONSUMER_KEY" \
  --username "admin@yourcompany.com" \
  --jwt-key-file "/path/to/server.key" \
  --instance-url "https://yourcompany.my.salesforce.com" \
  --org-alias "your_org_alias" \
  --output-dir "/path/to/output"

You should see output indicating the creation of the CSV and log files. Check the output directory; you’ll find:

  • logs/extract_usage_YYYYMMDD.log: Detailed execution log
  • output/ApiTotalUsage_YYYYMMDD_EventLogFileId.csv: Your API usage data

This is the part people usually get stuck on. If the script fails, check the log file first. It will tell you exactly what went wrong. Common issues are file path errors (use absolute paths, not relative ones) or permission problems with the output directory.

Pro tip: Open the CSV file to confirm it contains actual API usage data. You’re looking for rows with timestamps, user IDs, and API call details. If you see data, you’re ready to automate.
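If you’d rather script that sanity check, a few lines of Python can confirm the header and row count. The column names here come from the field list later in this post, so treat them as assumptions; your file’s actual header is authoritative. The sample data is made up; swap in `open("/path/to/output/your_file.csv")` to check a real download.

```python
import csv
import io

# Columns taken from the field list in this post; assumptions, not a spec
EXPECTED = {"CLIENT_NAME", "USER_ID", "TIMESTAMP"}

def check_usage_csv(fileobj):
    """Return (missing_columns, data_row_count) for a usage CSV."""
    reader = csv.DictReader(fileobj)
    missing = EXPECTED - set(reader.fieldnames or [])
    return sorted(missing), sum(1 for _ in reader)

# Made-up sample; replace with open("/path/to/output/your_file.csv")
sample = io.StringIO(
    "CLIENT_NAME,USER_ID,TIMESTAMP,API_FAMILY\n"
    "DataLoader,005xx0000000001,20260222150000.0,REST\n"
)
print(check_usage_csv(sample))  # ([], 1)
```

An empty missing-columns list and a nonzero row count mean you’re ready to automate.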

Step 5: Set up automation

Now you’re ready to schedule this to run daily. This is where the real value kicks in: You set it once and never think about it again. Automation eliminates the human error that leads to missing days of data, which means you can answer questions with complete historical records instead of explaining, “I only have data since last Tuesday.”

Mac/Linux (cron)

# Edit your crontab
crontab -e

# Add this job to run daily at 2:00 AM
# (a crontab entry must be a single line; backslash continuations won't work here)

0 2 * * * /usr/bin/python3 /path/to/ApiTotalUsageScript/extract_total_usage_calls.py --client-id "YOUR_CONSUMER_KEY" --username "admin@yourcompany.com" --jwt-key-file "/secure/path/server.key" --instance-url "https://yourcompany.my.salesforce.com" --org-alias "api-monitor" --output-dir "/data/salesforce-logs"

Windows (Task Scheduler)

  1. Open Task Scheduler.
  2. Create a Basic Task.
  3. Set trigger to Daily at 2:00 AM.
  4. Set action to Start a Program:
    • Program/script: python (or the full path to your python.exe)
    • Add arguments: C:\path\to\ApiTotalUsageScript\extract_total_usage_calls.py --client-id "YOUR_CONSUMER_KEY" --username "admin@yourcompany.com" --jwt-key-file "C:\secure\path\server.key" --instance-url "https://yourcompany.my.salesforce.com" --org-alias "your_org_alias" --output-dir "C:\data\salesforce-logs"

Pro tip: In Task Scheduler, enable Run whether the user is logged on or not so it runs even when the machine is idle. This ensures consistent collection regardless of who’s logged in.

After scheduling, wait 24 hours and check your output directory to confirm the first automated run succeeded. Once you see that working, you’ve achieved operational peace of mind: Your org’s API usage data is being collected automatically.
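One way to make that check repeatable is a quick existence test against the output directory. This snippet assumes the file naming convention shown in Step 4 (ApiTotalUsage_YYYYMMDD_EventLogFileId.csv); the demo runs against a throwaway directory with a fake file.

```python
import glob
import os
import tempfile
from datetime import datetime, timedelta, timezone

def latest_run_succeeded(output_dir: str) -> bool:
    """True if yesterday's CSV exists, assuming the naming convention
    ApiTotalUsage_YYYYMMDD_<EventLogFileId>.csv shown earlier."""
    stamp = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y%m%d")
    matches = glob.glob(os.path.join(output_dir, f"ApiTotalUsage_{stamp}_*.csv"))
    return len(matches) > 0

# Demo against a temporary directory containing a fake file
with tempfile.TemporaryDirectory() as d:
    stamp = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y%m%d")
    open(os.path.join(d, f"ApiTotalUsage_{stamp}_0ATxx0000000001.csv"), "w").close()
    print(latest_run_succeeded(d))  # True
```

Point it at your real output directory and you have the start of a simple daily health check.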

Understanding your API usage data

The CSV files contain detailed information about every API call. Here are some of the important fields.

  • CLIENT_NAME: The app or client making the call
  • API_RESOURCE: The endpoint or object accessed
  • USER_ID: The Salesforce user who made the call
  • TIMESTAMP: When the API call occurred
  • API_FAMILY: The type of API (REST, SOAP, Bulk, etc.)
  • HTTP_METHOD: The REST operation (GET, POST, PUT, DELETE)
Most admins start by sorting on CLIENT_NAME to see which integrations are consuming the most API calls, then drill into TIMESTAMP to identify daily patterns. The combination of these fields lets you answer specific questions, such as “Which integration spiked at 3 PM yesterday?” or “Is our mobile app making more REST calls than it should?”
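That first sort is easy to script with nothing but the standard library. The rows below are invented for illustration; point `csv.DictReader` at one of your downloaded files instead.

```python
import csv
import io
from collections import Counter

# Invented sample rows in the shape of the fields above;
# replace with open("/path/to/output/your_file.csv")
sample = io.StringIO(
    "CLIENT_NAME,API_FAMILY,TIMESTAMP\n"
    "MobileApp,REST,20260222150001.0\n"
    "MobileApp,REST,20260222150002.0\n"
    "DataLoader,Bulk,20260222030000.0\n"
)

# Count API calls per client, busiest first
calls_by_client = Counter(row["CLIENT_NAME"] for row in csv.DictReader(sample))
for client, count in calls_by_client.most_common():
    print(f"{client}: {count}")
# MobileApp: 2
# DataLoader: 1
```

Swap CLIENT_NAME for USER_ID or API_FAMILY to slice the same data by user or API type.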

Why this script?

You might wonder what makes this particular script suitable for production use. Here’s why it’s reliable for daily, unattended operation.

  • Production ready: It’s built with robust error handling and comprehensive logging. When something goes wrong, the logs tell you exactly what and why, which is critical for troubleshooting without manual monitoring. 
  • Memory efficient: It streams large CSV files directly to disk without loading everything into memory. This matters for large orgs with a huge number of API calls, as the script won’t crash or slow down regardless of file size.
  • Modern authentication: It uses ECAs with JWT flow for secure access. This is the Salesforce-recommended approach that passes security reviews and audit requirements.
  • Minimal dependencies: It uses only standard Python libraries plus requests. Fewer dependencies mean less maintenance and fewer things that can break over time.
  • Organized output: It creates separate directories for logs and CSV files, which makes it easy to archive or analyze data without hunting through messy file structures.

All of this means you can trust the script to run quietly in the background while you focus on higher-value admin work.

Built-in error handling and logging

One of the script’s key strengths is comprehensive error handling. When something goes wrong, you get clear, actionable error messages, not cryptic stack traces. This is critical when the script runs unattended via cron or Task Scheduler.

What the script catches

The script validates and handles errors at every stage.

  • Configuration validation: Checks that all required parameters are provided before starting
  • File system checks: Verifies the JWT key file exists and output directories are writable
  • Authentication failures: Catches and reports JWT authentication errors with specific guidance
  • API errors: Handles Salesforce API rate limits, timeouts, and invalid responses
  • Data streaming issues: Manages network interruptions during CSV downloads
  • Logging failures: Ensures logs are written even when other operations fail
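The general shape of that per-stage handling looks something like the sketch below. This is an illustration of the pattern, not the script’s actual code: each stage logs its outcome, and failures are re-raised so the scheduler sees a nonzero exit.

```python
import logging
import sys

# Log to stdout for the demo; the real script writes to logs/ instead
logging.basicConfig(stream=sys.stdout, level=logging.INFO,
                    format="[%(asctime)s] %(levelname)s: %(message)s")

def run_stage(name, fn):
    """Run one stage, log the outcome, and re-raise failures so cron or
    Task Scheduler sees a nonzero exit code."""
    try:
        result = fn()
        logging.info("%s succeeded", name)
        return result
    except Exception as exc:
        logging.error("%s failed: %s", name, exc)
        raise

def download_csv():  # stand-in for the real download step
    raise IOError("network interrupted during CSV download")

run_stage("authenticate", lambda: "session-token")
try:
    run_stage("download", download_csv)
except IOError:
    pass  # already logged; a real script would exit nonzero here
```

The key point is that the error is recorded before it propagates, so an unattended failure still leaves evidence in the log.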

Clear error messages

When errors occur, the script logs specific error details. Here are the exact messages you’ll see based on the actual script code.

File System Error (Missing JWT Key):

[2026-02-23 02:00:10] Starting API Total Usage extraction...
[2026-02-23 02:00:10] ERROR: JWT key file not found: /path/to/server.key

Configuration Error:

[2026-02-23 02:00:10] ERROR: Missing required parameters: client_id, username

Authentication Error:

[2026-02-23 02:00:12] Authenticating with Salesforce using JWT flow...
[2026-02-23 02:00:15] ERROR: SF CLI command failed: sf org login jwt --client-id ...
[2026-02-23 02:00:15] ERROR: Error: <SF CLI error details>
[2026-02-23 02:00:15] ERROR: Authentication failed: SF CLI error: <message>

Query Error (Permissions):

[2026-02-23 02:00:16] Querying Event Log Files for date range: 2026-02-22T00:00:00.000Z...
[2026-02-23 02:00:17] ERROR: Failed to query EventLogFile metadata: <exception details>

Download Error:

[2026-02-23 02:00:17] Processing EventLogFile: 0ATxx0000000001CAA
[2026-02-23 02:00:18] ERROR: Failed to download CSV for EventLogFile 0ATxx0000000001CAA: <exception>

Comprehensive logging

Every run generates a timestamped log file in the logs/ directory. The logs include:

  • Start and completion timestamps
  • Authentication status
  • EventLogFile query results (how many records found)
  • File download progress (file size, records processed)
  • Success or failure status with details

Successful Run example:

[2026-02-23 02:00:10] Starting API Total Usage extraction...
[2026-02-23 02:00:12] Authenticating with Salesforce using JWT flow...
[2026-02-23 02:00:14] JWT authentication successful
[2026-02-23 02:00:15] Salesforce org version: v62.0
[2026-02-23 02:00:16] Found 1 EventLogFile record(s)
[2026-02-23 02:00:17] Processing EventLogFile: 0ATxx0000000001CAA
[2026-02-23 02:00:17] File size: 2,456,789 bytes
[2026-02-23 02:00:22] Saved complete ApiTotalUsage file with 15,243 total API calls
[2026-02-23 02:00:22] SUCCESS: API Total Usage extraction completed!
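Given that log format, a post-run health check can be as simple as scanning for the SUCCESS line. The format here is assumed to match the samples above; verify against your own logs before relying on it.

```python
# Sample text in the format of the run logs shown above
log_text = (
    "[2026-02-23 02:00:10] Starting API Total Usage extraction...\n"
    "[2026-02-23 02:00:22] SUCCESS: API Total Usage extraction completed!\n"
)

def run_succeeded(text: str) -> bool:
    """True if any line reports SUCCESS (format assumed from the samples)."""
    return any("SUCCESS:" in line for line in text.splitlines())

print(run_succeeded(log_text))  # True
```

Read the newest file in logs/ into this check and you can alert yourself the morning after a failed run instead of discovering it a week later.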

Troubleshooting common issues

The script’s error handling, detailed in the previous section, will guide you when issues occur. Here are the most common scenarios and their solutions.

“Permission denied” error when running script

Cause: The private key file (server.key) doesn’t have correct permissions.
Solution: Run chmod 600 server.key (Mac/Linux) or check file permissions (Windows).

“JWT validation failed” error

Cause: The ECA isn’t properly configured or the certificate doesn’t match.
Solution: Verify the Consumer Key and ensure you uploaded the correct server.crt file in Step 2.

Ready to get started?

By automating your API Total Usage downloads, you’ll never lose valuable usage data again. This simple script provides a practical solution for admins who need consistent, reliable API usage tracking, and transforms how you manage your org’s operational health.

The setup takes a couple of minutes and, once configured, it runs automatically every day. You’ll build a historical record of your API usage that leads to better capacity planning, faster troubleshooting, and more informed decision-making. 

Questions? Review the resources below or reach out on LinkedIn. I’d love to hear how you’re using this approach in your org.

Resources

  • Awesome Admin Highlights from Dreamforce 2023
  • Integrations Are Easier Than Ever with Flow HTTP Callouts
  • How I Solved It: Manage Data with MuleSoft