Regression Detector
Compare test results between runs to catch regressions before they ship. Identify new failures and fixed tests, and diff API response snapshots.
Get the CLI Tool
Run the regression detector locally as an MCP server, or try it online below.
npx @clinetools/regression
- Compares before/after test results to detect new failures
- Identifies regressions, fixes, new tests, and removed tests
- Deep-diffs API response snapshots with path-level detail
- Severity assessment from none to critical
- Zero config — just run with npx
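The before/after comparison behind these features can be sketched as a status diff keyed by test name. This is a minimal illustration of the idea, not the package's actual implementation:

```javascript
// Illustrative sketch only — not the actual @clinetools/regression code.
// Classifies tests as regressions, fixes, added, or removed by comparing
// the status of each test name across the two runs.
function detectRegressions(before, after) {
  const baseline = new Map(before.map((t) => [t.name, t.status]));
  const current = new Map(after.map((t) => [t.name, t.status]));
  const report = { regressions: [], fixes: [], added: [], removed: [] };

  for (const [name, status] of current) {
    if (!baseline.has(name)) report.added.push(name);
    else if (baseline.get(name) === "pass" && status === "fail")
      report.regressions.push(name); // passed before, fails now
    else if (baseline.get(name) === "fail" && status === "pass")
      report.fixes.push(name); // failed before, passes now
  }
  for (const name of baseline.keys()) {
    if (!current.has(name)) report.removed.push(name);
  }
  return report;
}

const before = [
  { name: "auth login", status: "pass" },
  { name: "old flow", status: "pass" },
];
const after = [
  { name: "auth login", status: "fail", error: "timeout" },
  { name: "signup", status: "pass" },
];
console.log(detectRegressions(before, after));
// regressions: ["auth login"], added: ["signup"], removed: ["old flow"]
```

The real tool adds severity assessment and snapshot diffing on top of this basic classification.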
How to Use It
Three ways to detect regressions — pick the one that fits your workflow.
Try Online
Use the interactive demo below to compare test results — no install needed.
Use via CLI
Run as a local MCP server and connect any MCP-compatible client.
Add to AI Agent
Add the tool to your MCP settings for instant regression detection from your AI assistant.
MCP Client Configuration (Cline)
{
"mcpServers": {
"regression": {
"command": "npx",
"args": ["@clinetools/regression"]
}
}
}
Claude Code Configuration
# In your project's .mcp.json:
{
"mcpServers": {
"regression": {
"command": "npx",
"args": ["@clinetools/regression"]
}
}
}
Example Prompt: Detect Regressions After Refactor
// Prompt to your AI agent:
"Compare the test results from before and after my refactor"
// The agent calls:
detect_regressions({
before: '[{"name":"auth login","status":"pass"}, ...]',
after: '[{"name":"auth login","status":"fail","error":"timeout"}, ...]'
})
// Returns a detailed regression report with severity
Try It Online
Compare before and after test results to find regressions instantly.
Test Result Comparison
Paste baseline and current test results as JSON arrays
Paste test results and click Detect Regressions to compare runs.
Details
Understanding Regressions
Why regressions happen and how to prevent them from reaching production.
Why Regressions Happen
Regressions occur when new code changes break existing functionality. Common causes include refactors that miss edge cases, dependency updates with breaking changes, merge conflicts resolved incorrectly, and shared state mutations. Automated detection catches them before users do.
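As a concrete illustration of the shared-state cause: a test that mutates a shared fixture can pass in isolation yet regress when run after another test. The example below is hypothetical, showing only the failure mode:

```javascript
// Hypothetical order-dependent regression: both tests touch the same
// shared config object instead of using an isolated fixture.
const config = { retries: 3 };

function testDisablesRetries() {
  config.retries = 0; // mutates shared state
  return config.retries === 0;
}

function testUsesDefaultRetries() {
  return config.retries === 3; // passes alone, fails after the test above
}

console.log(testUsesDefaultRetries()); // true when run first
console.log(testDisablesRetries()); // true, but leaves config mutated
console.log(testUsesDefaultRetries()); // false — an order-dependent failure
```

Regressions like this surface as "flaky" results that depend on test order, which is exactly the kind of drift a baseline comparison makes visible.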
CI/CD Integration
Run regression detection in your CI pipeline after every test suite execution. Compare against the last passing baseline to instantly flag new failures. Block merges when regressions are detected and require explicit sign-off to proceed.
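A CI gate for this can be a small script that compares the current run against the stored baseline and fails the build on new failures. The sketch below assumes both runs are available as arrays of `{name, status}` objects (e.g. loaded from pipeline artifacts); the data and function names are illustrative:

```javascript
// Hypothetical CI gate: flag tests that passed in the baseline run
// but fail in the current run. In a real pipeline, the two arrays
// would be loaded from stored artifacts (last passing run vs. current).
function newFailures(baseline, current) {
  const passedBefore = new Set(
    baseline.filter((t) => t.status === "pass").map((t) => t.name)
  );
  return current
    .filter((t) => t.status === "fail" && passedBefore.has(t.name))
    .map((t) => t.name);
}

const failures = newFailures(
  [{ name: "checkout total", status: "pass" }],
  [{ name: "checkout total", status: "fail" }]
);
if (failures.length > 0) {
  console.error("Regressions detected:", failures.join(", "));
  // In CI: exit nonzero here to block the merge until sign-off.
}
```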
Snapshot Testing
Capture API response snapshots and compare them between deploys. Detect added, removed, or changed fields with exact path-level detail. Use ignore paths for dynamic fields like timestamps and request IDs that change every run.
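Path-level snapshot diffing with ignore paths can be sketched as a recursive walk that records each added, removed, or changed field by its dotted path. This is an illustrative sketch; the tool's actual diff logic may differ:

```javascript
// Minimal sketch of a path-level snapshot diff with ignorable paths
// (illustrative — not the tool's actual implementation).
function diffSnapshots(before, after, ignorePaths = [], path = "") {
  const changes = [];
  const keys = new Set([...Object.keys(before), ...Object.keys(after)]);
  for (const key of keys) {
    const p = path ? `${path}.${key}` : key;
    if (ignorePaths.includes(p)) continue; // skip dynamic fields
    const a = before[key];
    const b = after[key];
    if (!(key in before)) changes.push({ path: p, kind: "added" });
    else if (!(key in after)) changes.push({ path: p, kind: "removed" });
    else if (
      typeof a === "object" && a !== null &&
      typeof b === "object" && b !== null
    )
      changes.push(...diffSnapshots(a, b, ignorePaths, p)); // recurse
    else if (a !== b) changes.push({ path: p, kind: "changed" });
  }
  return changes;
}

const changes = diffSnapshots(
  { user: { id: 1, name: "Ada" }, timestamp: "2024-01-01" },
  { user: { id: 1, name: "Grace" }, timestamp: "2024-06-01" },
  ["timestamp"] // ignore the field that changes every run
);
console.log(changes); // only user.name is reported as changed
```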
Regression Prevention
Write focused unit tests for every bug fix to prevent the same regression twice. Maintain a stable baseline of test results. Use feature flags to isolate changes. Run the full test suite before merging and monitor test stability trends over time.
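The "one test per bug fix" practice above can be as small as a single assertion pinning the fixed behavior. The bug and function below are hypothetical:

```javascript
// Hypothetical bug fix: totals were a cent short because float prices
// were summed directly; the fix rounds each price to integer cents.
function orderTotal(items) {
  const cents = items.reduce((sum, i) => sum + Math.round(i.price * 100), 0);
  return cents / 100;
}

// Focused regression test pinning the fix: float drift (0.1 + 0.2)
// must not change the total. Keeping this test in the suite forever
// means the same bug cannot ship twice.
console.log(orderTotal([{ price: 0.1 }, { price: 0.2 }]) === 0.3); // true
```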
Never Ship a Regression
Add the Regression Detector to your agent's toolkit and catch test failures before they reach production.
View Plans