Enabling Lighthouse for mobile
emerge-score is a community-driven framework for quantifying mobile app quality through standardized metrics and evaluations.
emerge-score helps teams measure, track, and improve mobile app quality by providing:
- A crowdsourced collection of app quality indicators and metrics
- An open scoring algorithm that converts indicators into a single numerical score
- Cross-platform support for iOS and Android
- Data-driven insights based on ecosystem comparisons
- Open-source implementation for transparency and community contributions
Think Lighthouse for web performance, but specifically designed for mobile apps.
- Standardized Scoring: Generate a clear, comparable quality score for your mobile app
- Multi-section Analysis: Evaluate various aspects of your app including Performance, Size, Accessibility, Privacy/Security, and Best Practices
- Interactive CLI: Guided process for evaluating your app
- Visual Reports: Generate HTML reports with detailed breakdowns of your app's scores
- Cross-platform: Support for both iOS and Android applications
The easiest way to run emerge-score is via `npx` or `pnpx`:
```shell
# Check the version
npx @emergetools/emerge-score --version

# Start the scoring process
npx @emergetools/emerge-score score
```
The CLI will guide you through selecting a platform, choosing a rubric, and evaluating your app across different quality metrics.
emerge-score evaluates your app through sections containing individual audits:
- Sections: Grouped aspects of app quality (e.g., Performance, Size) with weighted importance
- Audits: Individual measurements of specific quality attributes
Audit types:
- Metrics: Numeric values compared against ecosystem data (scored from 0-N points)
- Checks: Binary yes/no evaluations (scored as 0 or N points, typically N=1)
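To make the two audit types concrete, here is a minimal sketch of how a metric and a check might be scored. The type names, the `scoreAudit` helper, and the scoring curve are all illustrative assumptions, not the actual emerge-score API; the real algorithm scores metrics against ecosystem data rather than a simple clamp.

```typescript
// Hypothetical result shapes for the two audit types (illustrative, not the
// actual emerge-score API).
type MetricResult = { kind: "metric"; value: number };
type CheckResult = { kind: "check"; passed: boolean };

// A check is all-or-nothing; a metric earns partial credit.
function scoreAudit(result: MetricResult | CheckResult, maxPoints: number): number {
  if (result.kind === "check") {
    return result.passed ? maxPoints : 0;
  }
  // Placeholder curve: clamp the raw value into [0, maxPoints]. The real
  // algorithm compares the value against ecosystem data instead.
  return Math.max(0, Math.min(maxPoints, result.value));
}

console.log(scoreAudit({ kind: "check", passed: true }, 1)); // 1
console.log(scoreAudit({ kind: "metric", value: 3.5 }, 5));  // 3.5
```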
The final score is calculated by:
- Summing weighted audit points within each section
- Combining weighted section scores into a raw total
- Normalizing to a 0-100 integer score
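The three aggregation steps above can be sketched as follows. The interfaces, weights, and helper name are illustrative assumptions for explanation only; the actual algorithm definition lives in the library's `ALGORITHM` export.

```typescript
// Illustrative aggregation: weighted audit points per section, weighted
// section scores combined into a raw total, then normalized to 0-100.
interface AuditScore { points: number; maxPoints: number; weight: number }
interface SectionScore { audits: AuditScore[]; weight: number }

function normalizeScore(sections: SectionScore[]): number {
  let raw = 0;
  let rawMax = 0;
  for (const section of sections) {
    let earned = 0;
    let possible = 0;
    for (const a of section.audits) {
      earned += a.points * a.weight;     // step 1: weighted audit points
      possible += a.maxPoints * a.weight;
    }
    raw += (possible > 0 ? earned / possible : 0) * section.weight; // step 2
    rawMax += section.weight;
  }
  // Step 3: normalize the weighted raw total to a 0-100 integer score.
  return Math.round((rawMax > 0 ? raw / rawMax : 0) * 100);
}

const demo: SectionScore[] = [
  { weight: 2, audits: [{ points: 4, maxPoints: 5, weight: 1 }] }, // 80% of weight 2
  { weight: 1, audits: [{ points: 1, maxPoints: 1, weight: 1 }] }, // 100% of weight 1
];
console.log(normalizeScore(demo)); // 87
```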
Example output: HTML Report | JSON Data
```shell
# View available commands
npx @emergetools/emerge-score --help

# Score your app
npx @emergetools/emerge-score score

# Generate a report from existing results
npx @emergetools/emerge-score report --input=results.json --output=report.html

# View the current scoring rubric
npx @emergetools/emerge-score rubric
```
emerge-score can also be used as a TypeScript library:
```typescript
import { computeAlgorithm, ALGORITHM } from "@emergetools/emerge-score";

// Your audit results
const results = {
  metricA: 100,
  checkB: true,
  // ...
};

// Platform context
const context = { platform: "ios" };

// Calculate score
const score = computeAlgorithm(ALGORITHM, results, context);
console.log(`Your app score: ${score.total}`);
```
```shell
# Clone the repository
git clone https://github.com/emergeTools/emerge-score
cd emerge-score

# Install dependencies
pnpm install --frozen-lockfile

# Build the project
./tools/build

# Run tests
./tools/test

# Format code
./tools/fix

# Run all pre-submit checks
./tools/presubmit

# Use the latest main branch build
npx @emergetools/emerge-score@next
```
We welcome contributions from the community! See CONTRIBUTING.md for detailed guidelines.
Areas where you can help:
- Adding new audits and metrics
- Improving documentation
- Enhancing UI/reporting
- Fixing bugs
- Adding tests
| Term | Description |
|---|---|
| Algorithm | The overall scoring system definition |
| Section | A category of related audits (e.g., Performance) |
| Audit | An individual quality measurement |
| Metric | A numeric measurement compared to ecosystem data |
| Check | A binary (yes/no) verification |
| Result | The outcome of a single audit |
| Results | The collection of all audit results |
| Context | Platform and environment information |
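The glossary above could be modeled roughly like this in TypeScript. These type definitions mirror the glossary terms but are illustrative assumptions, not the library's actual exported types.

```typescript
// Illustrative types mirroring the glossary (not the actual library exports).
type Platform = "ios" | "android";

interface Context { platform: Platform }                            // Platform/environment info
interface Metric { kind: "metric"; id: string; maxPoints: number }  // Numeric, ecosystem-compared
interface Check { kind: "check"; id: string; maxPoints: number }    // Binary yes/no verification
type Audit = Metric | Check;                                        // An individual measurement

interface Section { name: string; weight: number; audits: Audit[] }
interface Algorithm { sections: Section[] }                         // The overall scoring system

type Result = number | boolean;             // Outcome of a single audit
type Results = Record<string, Result>;      // All audit results, keyed by audit id

// Tiny example instance (audit id is hypothetical).
const exampleAlgorithm: Algorithm = {
  sections: [
    { name: "Performance", weight: 2, audits: [{ kind: "metric", id: "startup_time", maxPoints: 5 }] },
  ],
};
const ctx: Context = { platform: "ios" };
console.log(exampleAlgorithm.sections[0].name, ctx.platform);
```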
MIT License - see LICENSE for details.