# Slow Test Suite Documentation

**Date:** 2026-01-28
**Status:** ✅ Active
## Overview
The Slow Test Suite contains tests that explicitly validate the application's performance characteristics. These tests are intentionally slower because they use larger datasets or exercise performance-critical paths (e.g., N+1 query prevention, filter performance with many records).

These tests are marked with `@tag :slow` or `@moduletag :slow` and are excluded from standard test runs to keep developer feedback loops fast while maintaining comprehensive coverage.
## Purpose
Performance tests serve to:

- **Validate Performance Characteristics**: Ensure queries and operations perform within acceptable time limits
- **Prevent Regressions**: Catch performance regressions before they reach production
- **Test Scalability**: Verify that the application handles larger datasets efficiently
- **N+1 Query Prevention**: Ensure proper preloading and query optimization (see the sketch below)
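As a sketch of that last point, the tests assert the preloaded pattern rather than the per-row pattern (the `Group`, `Member`, and `Repo` names here are illustrative placeholders, not necessarily this app's modules):

```elixir
import Ecto.Query

# N+1 pattern: one query for the groups, then one additional
# query per group to load its members
groups = Repo.all(Group)
groups = Enum.map(groups, &Repo.preload(&1, :members))

# Preloaded pattern: two queries total, regardless of group count
groups = Group |> preload(:members) |> Repo.all()
```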
## Identified Performance Tests
### 1. Member LiveView - Boolean Filter Performance

- **File:** `test/mv_web/member_live/index_test.exs`
- **Test:** "boolean filter performance with 150 members"
- **Duration:** ~3.8 seconds
- **Purpose:** Validates that boolean custom field filtering performs efficiently with 150 members

**What it tests:**
- Creates 150 members (75 with `true`, 75 with `false` for a boolean custom field; see the seeding sketch below)
- Tests filter performance (< 1 second requirement)
- Verifies correct filtering behavior
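With datasets this size, setup is usually the dominant cost. A minimal seeding sketch using a single bulk insert, assuming a plain boolean column (the `Member` schema and `active` field are placeholders; this app stores the value in a custom field):

```elixir
now = NaiveDateTime.utc_now() |> NaiveDateTime.truncate(:second)

rows =
  for i <- 1..150 do
    %{
      name: "Member #{i}",
      # alternate the boolean value: 75 true, 75 false
      active: rem(i, 2) == 0,
      inserted_at: now,
      updated_at: now
    }
  end

# insert_all writes all 150 rows in one query; timestamps must be
# set manually because insert_all skips autogenerated fields
Repo.insert_all(Member, rows)
```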
### 2. Group LiveView - Index Performance

- **File:** `test/mv_web/live/group_live/index_test.exs`
- **Describe Block:** "performance"
- **Tests:** 2 tests
- **Purpose:** Validates efficient page loading and member count calculation

**What it tests:**
- Page loads efficiently with many groups (no N+1 queries)
- Member count calculation is efficient (see the aggregation sketch below)
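A sketch of the technique under test: computing all member counts in one aggregate query instead of issuing a count query per group (schema and association names are assumptions):

```elixir
import Ecto.Query

# Single query returning a %{group_id => member_count} map
member_counts =
  from(g in Group,
    left_join: m in assoc(g, :members),
    group_by: g.id,
    select: {g.id, count(m.id)}
  )
  |> Repo.all()
  |> Map.new()
```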
### 3. Group LiveView - Show Performance

- **File:** `test/mv_web/live/group_live/show_test.exs`
- **Describe Block:** "performance"
- **Tests:** 2 tests
- **Purpose:** Validates efficient member list loading and slug lookup

**What it tests:**
- Member list is loaded efficiently (no N+1 queries)
- Slug lookup uses the `unique_slug` index efficiently (see the migration sketch below)
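The `unique_slug` index referenced above would typically come from a migration along these lines (a hypothetical sketch; the table name and migration module are assumptions):

```elixir
defmodule Mv.Repo.Migrations.AddUniqueSlugToGroups do
  use Ecto.Migration

  def change do
    # A unique index both enforces slug uniqueness and turns
    # slug lookups into an index scan instead of a table scan
    create unique_index(:groups, [:slug], name: :unique_slug)
  end
end
```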
### 4. Member LiveView - Membership Fee Status Performance

- **File:** `test/mv_web/member_live/index_membership_fee_status_test.exs`
- **Describe Block:** "performance"
- **Tests:** 1 test
- **Purpose:** Validates efficient cycle loading without N+1 queries

**What it tests:**
- Cycles are loaded efficiently without N+1 queries
- Multiple members with cycles render without performance issues
### 5. Group Integration - Query Performance

- **File:** `test/membership/group_integration_test.exs`
- **Describe Block:** "Query Performance"
- **Tests:** 1 test
- **Purpose:** Validates N+1 query prevention for group-member relationships

**What it tests:**
- Preloading groups with members avoids N+1 queries
- Query optimization for many-to-many relationships
## Running Slow Tests

### Local Development
**Run only fast tests (default):**

```bash
just test-fast
# or
mix test --exclude slow

# With specific files or options
just test-fast test/membership/member_test.exs
just test-fast --seed 123
```
**Run only performance tests:**

```bash
just test-slow
# or
mix test --only slow

# With specific files or options
just test-slow test/mv_web/member_live/index_test.exs
just test-slow --seed 123
```
**Run all tests (fast + slow):**

```bash
just test
# or
mix test

# With specific files or options
just test-all test/mv_web/
just test-all --max-failures 5
```
**Note:** All suite commands (`test-fast`, `test-slow`, `test-all`) support additional arguments. The suite semantics (tags) are always preserved; additional arguments are appended to the command.
### CI/CD
**Standard CI Pipeline:**
- Runs `mix test --exclude slow` for faster feedback
- Executes on every push to any branch

**Nightly Pipeline:**
- Runs `mix test --only slow` for comprehensive performance coverage
- Executes daily at 2 AM via a cron trigger
- Pipeline name: `nightly-tests`
**Manual Execution:**

The nightly pipeline can be triggered manually.

**Drone CLI:**

```bash
drone build start <owner>/<repo> <branch> --event custom
```

**Drone Web UI:**
- Navigate to the repository in Drone
- Go to the `nightly-tests` pipeline
- Click "Run" or "Restart" and select event type `custom`

**Example:**

```bash
# Trigger the nightly-tests pipeline manually
drone build start local-it/mitgliederverwaltung main --event custom
```
## Best Practices for New Performance Tests

### When to Tag as `:slow`
Tag tests as `:slow` when they:

- **Explicitly test performance**: Tests that measure execution time or validate performance characteristics
- **Use large datasets**: Tests that create many records (e.g., 50+ members, 100+ records)
- **Test scalability**: Tests that verify the application handles larger workloads
- **Validate query optimization**: Tests that ensure N+1 queries are prevented
### When NOT to Tag as `:slow`

Do NOT tag tests as `:slow` if they are:

- **Simply slow by accident**: Tests that are slow due to inefficient setup, not intentional performance testing
- **Slow due to bugs**: Tests that are slow because of actual performance bugs (fix the bug instead)
- **Integration tests**: Integration tests should be tagged separately if needed (`@tag :integration`)
### Tagging Guidelines

**For single tests:**

```elixir
@tag :slow
test "boolean filter performance with 150 members" do
  # test implementation
end
```
**For all tests in a describe block**, use `@describetag` inside the block. (`@moduletag :slow` tags every test in the module, not just the block, so reserve it for modules where all tests are slow.)

```elixir
describe "performance" do
  @describetag :slow

  test "page loads efficiently" do
    # test implementation
  end

  test "member count is efficient" do
    # test implementation
  end
end
```
### Performance Test Structure

**Recommended structure:**

```elixir
defmodule MvWeb.SomeLiveViewTest do
  use MvWeb.ConnCase, async: true

  # Regular tests here (not tagged)

  describe "performance" do
    # scopes the :slow tag to this block only
    @describetag :slow

    test "loads efficiently without N+1 queries" do
      # Create test data
      # Measure/validate performance
      # Assert correct behavior
    end

    test "handles large datasets efficiently" do
      # Create large dataset
      # Measure performance
      # Assert performance requirements
    end
  end
end
```
### Performance Assertions

Performance tests should include explicit performance assertions:

```elixir
# Example: time-based assertion
start_time = System.monotonic_time(:millisecond)
# ... perform operation ...
end_time = System.monotonic_time(:millisecond)
duration = end_time - start_time

assert duration < 1000, "Operation took #{duration}ms, expected < 1000ms"

# Example: query count assertion (using Ecto query logging)
# Verify no N+1 queries by checking the query count (see the sketch below)
```
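One way to make the query-count comment concrete is to count Ecto's telemetry events around the operation. A sketch, assuming the Repo emits the default `[:mv, :repo, :query]` event (the prefix depends on the OTP app name and Repo configuration), with `Groups.list_groups_with_members/0` as a hypothetical function under test:

```elixir
defp count_queries(fun) do
  {:ok, counter} = Agent.start_link(fn -> 0 end)
  handler_id = {:query_counter, make_ref()}

  # Increment the counter on every query the Repo executes
  :telemetry.attach(
    handler_id,
    [:mv, :repo, :query],
    fn _event, _measurements, _metadata, _config ->
      Agent.update(counter, &(&1 + 1))
    end,
    nil
  )

  try do
    fun.()
    Agent.get(counter, & &1)
  after
    :telemetry.detach(handler_id)
    Agent.stop(counter)
  end
end

# Usage in a test:
# queries = count_queries(fn -> Groups.list_groups_with_members() end)
# assert queries <= 2, "Expected at most 2 queries, got #{queries}"
```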
## Monitoring and Maintenance

### Regular Review

- **Quarterly Review**: Review slow tests quarterly to ensure they're still relevant
- **Performance Baselines**: Update performance assertions when legitimate performance improvements occur
- **Test Cleanup**: Remove or optimize tests that become redundant
### Success Metrics
- ✅ Performance tests catch regressions before production
- ✅ Standard test runs complete in < 3 minutes
- ✅ Nightly builds complete successfully
- ✅ No false positives in performance tests
## Troubleshooting

**If a performance test fails:**

- **Check if it's a real regression**: Compare with previous runs
- **Check the CI environment**: Ensure CI has adequate resources
- **Review test data**: Ensure the test data setup is correct
- **Check for flakiness**: Run the test multiple times to verify consistency

**If a performance test is too slow:**

- **Review the test implementation**: Look for inefficiencies in test setup
- **Consider reducing the dataset size**: If it remains representative
- **Split it into smaller tests**: If it covers multiple concerns
## Related Documentation

- Test Optimization Summary: `docs/test-optimization-summary.md`
- Seeds Test Optimization: `docs/test-optimization-seeds.md`
- Testing Standards: `CODE_GUIDELINES.md`, Section 4 (Testing Standards)
- CI/CD Configuration: `.drone.yml`
## Changelog

### 2026-01-28: Initial Setup

- Marked 5 performance test suites with `@tag :slow` or `@moduletag :slow`
- Added `test-fast`, `test-slow`, and `test-all` commands to the Justfile
- Updated CI to exclude slow tests from standard runs
- Added a nightly CI pipeline for slow tests
- Created this documentation