A command-line interface and JavaScript client library for interacting with the LoadFocus Load Testing API.
Install globally:

```bash
npm install -g @loadfocus/loadfocus-api-client
```

Or add it as a project dependency:

```bash
npm install @loadfocus/loadfocus-api-client
```

Or run it directly with npx:

```bash
npx @loadfocus/loadfocus-api-client <command>
```
Before using the client, you need to configure your API key and team ID:
```bash
loadfocus-api config set --apikey YOUR_API_KEY --teamid YOUR_TEAM_ID
```
By default, the client uses `https://loadfocus.com` as the API base URL. This URL is locked in production to ensure security and reliability. For development purposes only, you can change the API URL by setting the environment variable `NODE_ENV=development` or `LOADFOCUS_DEV=true`:
```bash
# Development mode
NODE_ENV=development loadfocus-api config set --url http://localhost:3000

# Or alternatively
LOADFOCUS_DEV=true loadfocus-api config set --url http://localhost:3000
```
You can also create a `.dev-mode` file in your project root to enable development mode.
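A minimal sketch, assuming only the file's presence matters (the note above does not specify any required contents):

```bash
# Create an empty .dev-mode marker file in the project root
touch .dev-mode

# Development mode is now enabled, so the API URL can be changed
loadfocus-api config set --url http://localhost:3000
```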
View your current configuration:

```bash
loadfocus-api config show
```
This section provides guidance on integrating the LoadFocus JMeter API Client with popular CI/CD platforms for automated performance testing.
Regardless of the CI/CD platform, the common steps to integrate the LoadFocus JMeter API Client are the same (a generic sketch follows this list):

- **Install the client**: install `@loadfocus/loadfocus-api-client` at the start of your workflow
- **Configure credentials**: set up your API key and team ID securely
- **Execute tests**: run the tests as part of your pipeline
- **Process results**: collect and analyze the test results
- **Make decisions**: based on the test results, determine whether to proceed with deployment
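As a rough sketch, those steps map onto shell commands like the following. The test name and thresholds are illustrative, and the exit-code check assumes `run-test` exits nonzero when thresholds fail (worth verifying against your client version):

```bash
# 1. Install the client
npm install -g @loadfocus/loadfocus-api-client

# 2. Configure credentials (from CI secret variables)
loadfocus-api config set apiKey "$LOADFOCUS_API_KEY"
loadfocus-api config set teamId "$LOADFOCUS_TEAM_ID"

# 3 & 4. Execute the test and capture the results
loadfocus-api jmeter run-test \
  --name "ci-performance-check" \
  --thresholds "avgresponse<=200,errors==0,p95<=250" \
  --format json > performance_results.json

# 5. Gate the deployment on the outcome (assumes a nonzero exit code on failure)
if [ $? -ne 0 ]; then
  echo "Performance thresholds not met; stopping the pipeline."
  exit 1
fi
```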
**CircleCI** (`.circleci/config.yml`):

```yaml
version: 2.1

jobs:
  performance_test:
    docker:
      - image: cimg/node:16.13
    steps:
      - checkout
      - run:
          name: Install LoadFocus JMeter API Client
          command: npm install -g @loadfocus/loadfocus-api-client
      - run:
          name: Configure LoadFocus API Client
          command: |
            loadfocus-api config set apiKey $LOADFOCUS_API_KEY
            loadfocus-api config set teamId $LOADFOCUS_TEAM_ID
      - run:
          name: Run Performance Tests
          command: |
            loadfocus-api jmeter run-test \
              --name "CircleCI_${CIRCLE_PROJECT_REPONAME}_${CIRCLE_BRANCH}" \
              --thresholds "avgresponse<=200,errors==0,p95<=250" \
              --format json > performance_results.json
      - store_artifacts:
          path: performance_results.json
          destination: performance-test-results

# A workflow is required for the job to run in CircleCI 2.1
workflows:
  performance:
    jobs:
      - performance_test
```
**GitHub Actions** (`.github/workflows/performance.yml`):

```yaml
name: Performance Tests

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]

jobs:
  performance-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'

      - name: Install LoadFocus JMeter API Client
        run: npm install -g @loadfocus/loadfocus-api-client

      - name: Configure LoadFocus API Client
        run: |
          loadfocus-api config set apiKey ${{ secrets.LOADFOCUS_API_KEY }}
          loadfocus-api config set teamId ${{ secrets.LOADFOCUS_TEAM_ID }}

      - name: Run Performance Tests
        run: |
          loadfocus-api jmeter run-test \
            --name "GitHub_${{ github.repository }}_${{ github.ref_name }}" \
            --thresholds "avgresponse<=200,errors==0,p95<=250" \
            --format json > performance_results.json

      - name: Upload Performance Test Results
        uses: actions/upload-artifact@v3
        with:
          name: performance-test-results
          path: performance_results.json
```
**Azure Pipelines** (`azure-pipelines.yml`):

```yaml
trigger:
  - main
  - develop

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '16.x'
    displayName: 'Install Node.js'

  - script: |
      npm install -g @loadfocus/loadfocus-api-client
    displayName: 'Install LoadFocus JMeter API Client'

  - script: |
      loadfocus-api config set apiKey $(LOADFOCUS_API_KEY)
      loadfocus-api config set teamId $(LOADFOCUS_TEAM_ID)
    displayName: 'Configure LoadFocus API Client'

  - script: |
      loadfocus-api jmeter run-test \
        --name "AzureDevOps_$(Build.Repository.Name)_$(Build.SourceBranchName)" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > $(Build.ArtifactStagingDirectory)/performance_results.json
    displayName: 'Run Performance Tests'

  - task: PublishBuildArtifacts@1
    inputs:
      pathtoPublish: '$(Build.ArtifactStagingDirectory)'
      artifactName: 'performance-test-results'
    displayName: 'Publish Performance Test Results'
```
**Jenkins** (`Jenkinsfile`):

```groovy
pipeline {
    agent {
        docker {
            image 'node:16-alpine'
        }
    }
    environment {
        LOADFOCUS_API_KEY = credentials('loadfocus-api-key')
        LOADFOCUS_TEAM_ID = credentials('loadfocus-team-id')
    }
    stages {
        stage('Performance Test') {
            steps {
                // Install LoadFocus JMeter API Client
                sh 'npm install -g @loadfocus/loadfocus-api-client'

                // Configure LoadFocus API Client
                sh 'loadfocus-api config set apiKey $LOADFOCUS_API_KEY'
                sh 'loadfocus-api config set teamId $LOADFOCUS_TEAM_ID'

                // Run Performance Tests
                sh '''
                    loadfocus-api jmeter run-test \
                        --name "Jenkins_${JOB_NAME}_${BUILD_NUMBER}" \
                        --thresholds "avgresponse<=200,errors==0,p95<=250" \
                        --format json > performance_results.json
                '''

                // Archive the results
                archiveArtifacts artifacts: 'performance_results.json', fingerprint: true
            }
        }
    }
}
```
**GitLab CI** (`.gitlab-ci.yml`):

```yaml
# The custom stage must be declared for the job below to run
stages:
  - performance

performance_test:
  stage: performance
  image: node:16
  script:
    # Install LoadFocus JMeter API Client
    - npm install -g @loadfocus/loadfocus-api-client
    # Configure LoadFocus API Client
    - loadfocus-api config set apiKey $LOADFOCUS_API_KEY
    - loadfocus-api config set teamId $LOADFOCUS_TEAM_ID
    # Run Performance Tests
    - |
      loadfocus-api jmeter run-test \
        --name "GitLab_${CI_PROJECT_NAME}_${CI_COMMIT_REF_NAME}" \
        --thresholds "avgresponse<=200,errors==0,p95<=250" \
        --format json > performance_results.json
  artifacts:
    paths:
      - performance_results.json
    expire_in: 1 week
```
For more detailed documentation, refer to the full documentation in the package's `docs` directory after installation.
Execute a test:

```bash
loadfocus-api jmeter execute --name "my-test-name"
```

Get the status of a test:

```bash
loadfocus-api jmeter status --name "my-test-name" --id "test-id"
```

Get results from a test:

```bash
loadfocus-api jmeter results --name "my-test-name" --id "test-id"
```

Execute a test, wait for completion, and evaluate the results against thresholds. Each threshold is a comma-separated `metric<operator>value` expression, using the same metrics and operators as the configuration-file format below:

```bash
loadfocus-api jmeter run-test --name "my-test-name" --thresholds "avgresponse<=200,errors==0,p95<=250"
```

Execute multiple tests from a configuration file:

```bash
loadfocus-api jmeter run-tests --config path/to/tests-config.json
```
Example configuration file (JSON):
```json
{
  "mode": "sequential",
  "tests": [
    {
      "name": "Test1",
      "thresholds": {
        "avgresponse": {
          "operator": "<=",
          "value": 200
        },
        "errors": {
          "operator": "==",
          "value": 0
        }
      }
    },
    {
      "name": "Test2",
      "thresholds": {
        "avgresponse": {
          "operator": "<=",
          "value": 150
        }
      }
    }
  ]
}
```
Or with YAML:
```yaml
mode: sequential
tests:
  - name: Test1
    thresholds:
      avgresponse:
        operator: "<="
        value: 200
      errors:
        operator: "=="
        value: 0
  - name: Test2
    thresholds:
      avgresponse:
        operator: "<="
        value: 150
```
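Both formats are passed to `run-tests` the same way; a small sketch (the YAML file path is an assumption based on the "Or with YAML" note above, and a `parallel` mode value is inferred from the `run-tests` description below, which says tests can run sequentially or in parallel):

```bash
# Run the suite defined in a YAML config file
loadfocus-api jmeter run-tests --config path/to/tests-config.yaml
```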
Using the client library from JavaScript:

```javascript
const { jmeterService } = require('@loadfocus/loadfocus-api-client');

async function runMyTest() {
  // Create a client configured with your credentials
  const client = jmeterService.createClient({
    apiKey: 'your-api-key',
    teamId: 'your-team-id'
  });

  // Execute a test
  const executeResult = await jmeterService.commands.execute({
    name: 'my-test-name'
  });
  console.log(`Test started with ID: ${executeResult.testId}`);
}

runMyTest().catch(console.error);
```
- `config`: Manage configuration settings
- `jmeter execute`: Execute a JMeter test
- `jmeter status`: Get the status of a JMeter test
- `jmeter results`: Get results from a JMeter test
- `jmeter runs`: Get recent test runs
- `jmeter labels`: Get labels from a JMeter test
- `jmeter plan`: Get your current plan information
- `jmeter config`: Get the configuration of a JMeter test
- `jmeter run-test`: Execute a test, wait for completion, and evaluate results against thresholds
- `jmeter run-tests`: Execute multiple tests from a configuration file, either sequentially or in parallel
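A few of these commands are not shown in the examples above. A hedged sketch (the exact flags are assumptions; check the package's `docs` directory for the authoritative options):

```bash
# Show current plan information (assumed to take no arguments)
loadfocus-api jmeter plan

# List recent test runs (assumed to take no arguments)
loadfocus-api jmeter runs
```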
License: MIT