Implementation Plan
1. Integrate Supertester as Test Dependency
- Add Supertester to DSPex and configure for OTP-compliant testing
- Maintain three-layer testing architecture with Supertester
- Requirements: 1.1, 3.1, 8.1
1.1 Add Supertester dependency to mix.exs
- Add {:supertester, github: "nshkrdotcom/supertester", only: :test} to deps (see the deps sketch below)
- Keep existing dependencies (snakepit, instructor_lite, etc.)
- Run mix deps.get to fetch Supertester
- Verify Supertester modules are available
- Requirements: 1.1, 1.2
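A minimal deps/0 sketch for this step. Only the supertester entry comes from this plan; the other entries and their version constraints are illustrative placeholders for the existing dependencies.

```elixir
# mix.exs — deps/0 sketch; existing entries and versions are illustrative placeholders
defp deps do
  [
    {:snakepit, "~> 0.1"},
    {:instructor_lite, "~> 0.1"},
    # test-only OTP testing toolkit
    {:supertester, github: "nshkrdotcom/supertester", only: :test}
  ]
end
```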
1.2 Update test_helper.exs for Supertester integration
- Keep three-layer test mode configuration
- Import Supertester configuration after ExUnit.start() (see the test_helper.exs sketch below)
- Configure ExUnit for async tests with Supertester
- Ensure test modes work with Supertester isolation
- Requirements: 1.1, 3.1, 3.5
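A possible shape for test/test_helper.exs, assuming the layer is selected via a TEST_MODE environment variable, stored under the :dspex application env, and that Layer 3 tests carry a :layer_3 tag; all three names are assumptions, not confirmed DSPex conventions.

```elixir
# test/test_helper.exs — sketch; TEST_MODE, the :dspex config key, and the :layer_3 tag are assumptions
test_mode =
  case System.get_env("TEST_MODE", "mock_adapter") do
    "bridge_mock"      -> :bridge_mock
    "full_integration" -> :full_integration
    _                  -> :mock_adapter
  end

Application.put_env(:dspex, :test_mode, test_mode)

ExUnit.start()

# Supertester-related ExUnit configuration goes after ExUnit.start/0.
# Full-integration tests are opt-in; async settings favor isolated Layer 1/2 runs.
ExUnit.configure(
  exclude: (if test_mode == :full_integration, do: [], else: [:layer_3]),
  max_cases: System.schedulers_online() * 2
)
```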
1.3 Update test aliases in mix.exs
- Modify test.fast to use Supertester with mock adapter
- Update test.protocol to use Supertester with bridge mock
- Enhance test.integration with Supertester for full Python integration runs
- Add test.pattern_check for Process.sleep detection (see the aliases sketch below)
- Requirements: 3.1, 3.2, 3.3, 3.4
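One way to wire these aliases, sketched under the same TEST_MODE assumption as above; the grep-based pattern check is a simple stand-in for detection tooling, not a Supertester feature.

```elixir
# mix.exs — aliases/0 sketch; the TEST_MODE env var is an assumption
defp aliases do
  [
    "test.fast": &run_layer(&1, "mock_adapter"),
    "test.protocol": &run_layer(&1, "bridge_mock"),
    "test.integration": &run_layer(&1, "full_integration"),
    "test.pattern_check": fn _args ->
      # Fail the task if any test still reaches for Process.sleep/1.
      {hits, _status} = System.cmd("grep", ["-rn", "Process.sleep", "test/"])
      if hits != "", do: Mix.raise("Process.sleep found in tests:\n" <> hits)
    end
  ]
end

defp run_layer(args, mode) do
  System.put_env("TEST_MODE", mode)
  Mix.Task.run("test", args)
end
```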
2. Create DSPex Test Support Modules
- Build DSPex-specific helpers on top of Supertester
- Support three-layer architecture properly
- Requirements: 1.2, 3.1, 4.1
2.1 Implement DSPex.TestHelpers
- Create with_test_router/3 using Supertester.OTPHelpers
- Implement with_python_bridge/2 for three-layer support
- Add wait_for_pipeline_completion/2 without Process.sleep (see the helper sketch below)
- Create router-specific assertion helpers
- Import all necessary Supertester modules
- Requirements: 2.1, 3.1, 3.2, 5.1
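A sketch of the helper module's core pieces. The plan names with_test_router/3; a simplified two-argument form is shown here, and DSPex.Pipeline.status/1 plus the DSPex.Router child spec are assumptions about the application API.

```elixir
# test/support/test_helpers.ex — sketch; Pipeline.status/1 and the Router child spec are assumptions
defmodule DSPex.TestHelpers do
  import ExUnit.Assertions
  import ExUnit.Callbacks, only: [start_supervised!: 1]

  # Start a router under the test supervisor and hand it to the given function.
  def with_test_router(opts \\ [], fun) do
    router = start_supervised!({DSPex.Router, opts})
    fun.(router)
  end

  # Poll for completion instead of Process.sleep/1; fails the test on timeout.
  def wait_for_pipeline_completion(pipeline, timeout \\ 5_000) do
    deadline = System.monotonic_time(:millisecond) + timeout
    poll_until(fn -> DSPex.Pipeline.status(pipeline) == :completed end, deadline)
  end

  defp poll_until(done?, deadline) do
    cond do
      done?.() -> :ok
      System.monotonic_time(:millisecond) > deadline -> flunk("pipeline did not complete in time")
      true ->
        # Short timer-based backoff; no message is expected, only the after clause fires.
        receive do
        after
          10 -> poll_until(done?, deadline)
        end
    end
  end
end
```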
2.2 Implement DSPex.MockPythonBridge
- Create Layer 2 bridge mock using Supertester patterns
- Use Supertester.MessageHelpers for protocol testing
- Implement Snakepit-compatible interface
- Add message history tracking for verification
- Use OTP timers instead of Process.sleep (see the mock sketch below)
- Requirements: 2.2, 3.2, 4.3
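A sketch of the Layer 2 mock as a GenServer. The execute/3 surface is only an assumption about the Snakepit-compatible shape; latency is simulated with Process.send_after/3 plus GenServer.reply/2 rather than Process.sleep/1.

```elixir
# test/support/mock_python_bridge.ex — sketch; execute/3 is an assumed interface, not Snakepit's actual API
defmodule DSPex.MockPythonBridge do
  use GenServer

  def start_link(opts \\ []), do: GenServer.start_link(__MODULE__, opts)

  # Snakepit-compatible entry point (shape assumed for illustration).
  def execute(bridge, command, args), do: GenServer.call(bridge, {:execute, command, args})

  # Message history for protocol-level assertions.
  def history(bridge), do: GenServer.call(bridge, :history)

  @impl true
  def init(opts), do: {:ok, %{history: [], latency: Keyword.get(opts, :latency, 0)}}

  @impl true
  def handle_call({:execute, command, args} = msg, from, state) do
    # Defer the reply with an OTP timer instead of sleeping in the caller or the server.
    reply = {:ok, %{command: command, args: args, mocked: true}}
    Process.send_after(self(), {:reply_to, from, reply}, state.latency)
    {:noreply, %{state | history: [msg | state.history]}}
  end

  def handle_call(:history, _from, state), do: {:reply, Enum.reverse(state.history), state}

  @impl true
  def handle_info({:reply_to, from, reply}, state) do
    GenServer.reply(from, reply)
    {:noreply, state}
  end
end
```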
2.3 Create test mode configuration module
- Support mock_adapter, bridge_mock, full_integration modes (sketch below)
- Use Supertester.UnifiedTestFoundation isolation modes
- Configure adapter availability per mode
- Set up Python bridge behavior per mode
- Requirements: 3.1, 3.5, 4.1
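A possible shape for the mode configuration module; the isolation atoms and adapter module names are placeholders, not confirmed Supertester or DSPex values.

```elixir
# test/support/test_mode.ex — sketch; isolation atoms and adapter modules are placeholders
defmodule DSPex.TestMode do
  @modes [:mock_adapter, :bridge_mock, :full_integration]

  def modes, do: @modes

  def current, do: Application.get_env(:dspex, :test_mode, :mock_adapter)

  # Isolation level to request from Supertester.UnifiedTestFoundation per layer (values assumed).
  def isolation(:mock_adapter), do: :basic
  def isolation(:bridge_mock), do: :registry
  def isolation(:full_integration), do: :full_isolation

  # Adapter the router should be wired to in each mode (module names hypothetical).
  def adapter(:mock_adapter), do: DSPex.LLM.Adapters.Mock
  def adapter(:bridge_mock), do: DSPex.MockPythonBridge
  def adapter(:full_integration), do: DSPex.PythonBridge
end
```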
3. Migrate Unit Tests to Supertester
- Replace existing tests with Supertester-based tests
- Eliminate any Process.sleep usage
- Requirements: 1.1, 1.3, 1.4
3.1 Migrate DSPex.RouterTest
- Use setup_isolated_genserver for router setup
- Replace timing-based tests with cast_and_sync (see the test sketch below)
- Add assert_genserver_state for state verification
- Test concurrent routing with concurrent_calls
- Use stress_test_server for load testing
- Requirements: 5.1, 5.2, 5.4
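A sketch of the migrated router test. The module placement, signatures, and return values of setup_isolated_genserver, cast_and_sync, and assert_genserver_state are assumptions beyond the helper names this plan lists; the router's state shape is hypothetical.

```elixir
# test/dspex/router_test.exs — sketch; Supertester helper placement and signatures are assumed
defmodule DSPex.RouterTest do
  use ExUnit.Case, async: true

  import Supertester.OTPHelpers
  import Supertester.GenServerHelpers
  import Supertester.Assertions

  setup do
    # Isolated, uniquely named router per test — no shared state across async tests.
    {:ok, router} = setup_isolated_genserver(DSPex.Router)
    %{router: router}
  end

  test "registers an adapter without timing-based waits", %{router: router} do
    # cast_and_sync replaces cast + Process.sleep with a deterministic synchronization point.
    cast_and_sync(router, {:register_adapter, :mock})
    assert_genserver_state(router, fn state -> :mock in state.adapters end)
  end
end
```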
3.2 Migrate DSPex.PipelineTest
- Test sequential execution with proper synchronization
- Use Task.async with Supertester patterns for parallel execution (see the sketch below)
- Implement wait_for_pipeline_completion helper
- Verify error propagation without sleep
- Test mixed native/Python pipelines
- Requirements: 5.2, 5.3, 5.4, 5.5
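A sketch of a parallel-execution test built on the helper above; the DSPex.Pipeline start/run API and step names shown are assumptions.

```elixir
# test/dspex/pipeline_test.exs — sketch; Pipeline.start_link/1 and run/2 are assumed APIs
defmodule DSPex.PipelineTest do
  use ExUnit.Case, async: true

  import DSPex.TestHelpers

  test "parallel steps complete without Process.sleep" do
    {:ok, pipeline} = DSPex.Pipeline.start_link(steps: [:native_parse, :native_validate])

    # Drive the pipeline from a task so the test process is free to synchronize on status.
    run = Task.async(fn -> DSPex.Pipeline.run(pipeline, %{question: "example input"}) end)

    assert :ok = wait_for_pipeline_completion(pipeline)
    assert {:ok, _result} = Task.await(run)
  end
end
```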
3.3 Migrate DSPex.Native tests
- Update Signature tests with Supertester.Assertions
- Migrate Template tests to use OTP patterns
- Update Validator tests with proper helpers
- Ensure all tests use async: true
- Requirements: 1.1, 1.4
3.4 Migrate DSPex.LLM tests
- Update adapter tests with Supertester patterns
- Test adapter switching with proper synchronization
- Migrate client tests using OTPHelpers
- Test timeout handling without sleep (example below)
- Requirements: 4.1, 4.2, 4.4, 4.5
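A sketch of the timeout case; the client and adapter options are hypothetical and only illustrate asserting on a timeout result instead of sleeping.

```elixir
# test/dspex/llm/client_test.exs — sketch; client and adapter options are hypothetical
defmodule DSPex.LLM.ClientTimeoutTest do
  use ExUnit.Case, async: true

  test "a slow adapter surfaces {:error, :timeout} instead of hanging the test" do
    {:ok, client} =
      DSPex.LLM.Client.start_link(
        adapter: DSPex.LLM.Adapters.Mock,
        adapter_opts: [delay_ms: 200]
      )

    # The client-side timeout is shorter than the mock's delay, so the call must time out.
    assert {:error, :timeout} = DSPex.LLM.Client.generate(client, "prompt", timeout: 50)
  end
end
```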
4. Create Integration Tests with Supertester
- Test cross-component interactions properly
- Verify Python bridge integration
- Requirements: 2.1, 2.2, 3.3
4.1 Create Python bridge integration tests
- Test Snakepit integration with Supertester.SupervisorHelpers
- Verify session affinity using unique_session_id (see the sketch below)
- Test Python process failures with ChaosHelpers
- Validate graceful degradation patterns
- Use wait_for_process_restart for recovery
- Requirements: 2.1, 2.2, 2.3, 2.4
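A sketch of the session-affinity check; the PythonBridge execute/3 surface and the worker_id field are assumptions, and the locally generated id can be swapped for Supertester's unique_session_id helper named above.

```elixir
# test/integration/python_bridge_integration_test.exs — sketch; bridge API and result fields are assumptions
defmodule DSPex.PythonBridgeIntegrationTest do
  use ExUnit.Case, async: false

  @moduletag :layer_3

  test "two calls in one session land on the same Python worker" do
    # Locally generated id; Supertester's unique_session_id helper can be used instead.
    session = "session_#{System.unique_integer([:positive])}"

    {:ok, first} = DSPex.PythonBridge.execute("predict", %{q: "a"}, session_id: session)
    {:ok, second} = DSPex.PythonBridge.execute("predict", %{q: "b"}, session_id: session)

    # worker_id is a hypothetical field used only to observe affinity.
    assert first.worker_id == second.worker_id
  end
end
```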
4.2 Create router integration tests
- Test routing decisions across adapters
- Verify fallback behavior with failures
- Test performance metric collection
- Validate smart routing logic
- Requirements: 5.1, 5.4
4.3 Create pipeline integration tests
- Test end-to-end pipeline execution
- Verify mixed native/Python steps
- Test error handling across components
- Validate parallel execution coordination
- Requirements: 5.3, 5.4, 5.5
4.4 Create three-layer integration tests
- Test layer switching during execution
- Verify mock adapter layer isolation
- Test bridge mock protocol compliance
- Validate full integration with Python
- Requirements: 3.1, 3.2, 3.3, 3.4
5. Implement Performance Tests
- Use Supertester.PerformanceHelpers throughout
- Benchmark all critical paths
- Requirements: 6.1, 6.2, 6.3
5.1 Create router benchmarks
- Use benchmark_operations for routing overhead (benchmark sketch below)
- Test with varying adapter counts
- Apply workload_pattern_test for load patterns
- Measure decision latency percentiles
- Requirements: 6.1, 6.2
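A sketch of the routing-overhead benchmark; the benchmark_operations options, its result shape, and the latency field asserted on are assumptions, as is the DSPex.Router API.

```elixir
# test/performance/router_bench_test.exs — sketch; benchmark_operations result shape is assumed
defmodule DSPex.RouterBenchTest do
  use ExUnit.Case, async: true

  import Supertester.PerformanceHelpers

  @moduletag :performance

  test "routing decision overhead stays flat as adapter count grows" do
    for count <- [1, 5, 25] do
      adapters = Enum.map(1..count, &{:"adapter_#{&1}", DSPex.LLM.Adapters.Mock})
      router = start_supervised!({DSPex.Router, adapters: adapters}, id: {:router, count})

      results =
        benchmark_operations(
          [{"route", fn -> DSPex.Router.route(router, %{task: :classify}) end}],
          iterations: 1_000
        )

      # Field name is illustrative; assert against whatever latency metric Supertester reports.
      assert results["route"].average_us < 500
    end
  end
end
```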
5.2 Create pipeline benchmarks
- Benchmark sequential vs parallel execution
- Measure overhead of mixed native/Python steps
- Test pipeline throughput limits
- Use concurrent_load_test for stress testing
- Requirements: 6.2, 6.3
5.3 Create Python bridge benchmarks
- Measure Snakepit communication overhead
- Benchmark serialization/deserialization
- Test session affinity performance
- Compare performance across the three layers
- Requirements: 6.2, 6.3
5.4 Set up performance regression detection
- Create baselines with collect_performance_metrics
- Configure benchmark_with_thresholds
- Set up performance_regression_detector
- Integrate with CI/CD pipeline
- Requirements: 6.4, 6.5
6. Implement Chaos Engineering Tests
- Use Supertester.ChaosHelpers for all scenarios
- Test DSPex-specific failure modes
- Requirements: 7.1, 7.2, 7.3
6.1 Create Python process chaos tests
- Use inject_process_failure for Python crashes (chaos sketch below)
- Test Snakepit pool recovery
- Simulate Python interpreter issues
- Verify session recovery patterns
- Requirements: 7.1, 7.4, 7.5
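A sketch of a worker-crash scenario; the bridge introspection call, supervisor name, and the exact argument shapes of inject_process_failure and wait_for_process_restart are assumptions beyond the helper names this plan lists.

```elixir
# test/chaos/python_bridge_chaos_test.exs — sketch; helper arguments and bridge API are assumptions
defmodule DSPex.PythonBridgeChaosTest do
  use ExUnit.Case, async: false

  import Supertester.ChaosHelpers
  import Supertester.OTPHelpers

  @moduletag :chaos

  test "the pool recovers after a Python worker is killed mid-session" do
    assert {:ok, _} = DSPex.PythonBridge.execute("predict", %{q: "warmup"})

    # current_worker/0 is a hypothetical introspection call used to pick a victim process.
    worker = DSPex.PythonBridge.current_worker()

    inject_process_failure(worker, :kill)
    wait_for_process_restart(worker, DSPex.PythonBridge.Supervisor)

    # A fresh call must succeed once the pool has replaced the crashed worker.
    assert {:ok, _} = DSPex.PythonBridge.execute("predict", %{q: "after crash"})
  end
end
```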
6.2 Create adapter chaos tests
- Test LLM API failures with simulate_network_corruption
- Inject rate limiting scenarios
- Test adapter timeout handling
- Verify fallback adapter selection
- Requirements: 7.2, 7.5
6.3 Create pipeline chaos tests
- Test cascading failures across steps
- Apply create_memory_pressure during execution
- Test partial result handling
- Verify errors propagate with clear messages
- Requirements: 7.3, 7.4
6.4 Create recovery verification tests
- Use verify_system_recovery for all components
- Test router recovery after failures
- Verify pipeline resumption capabilities
- Validate Python bridge reconnection
- Requirements: 7.5
7. Update Test Documentation
- Document Supertester usage for DSPex
- Explain three-layer architecture with Supertester
- Requirements: 8.1, 8.2, 8.3
7.1 Document DSPex test helpers
- Create guide for DSPex.TestHelpers usage
- Document MockPythonBridge for Layer 2
- Explain three-layer test mode selection
- Provide troubleshooting for common issues
- Requirements: 8.1, 8.2
7.2 Create DSPex test examples
- Provide router testing examples
- Show pipeline testing patterns
- Demonstrate Python bridge testing
- Include performance test examples
- Requirements: 8.2, 8.3
7.3 Document CI/CD integration
- Update GitHub Actions for three layers
- Configure test.pattern_check task
- Set up layer-specific test runs
- Document performance gates
- Requirements: 8.4, 8.5
7.4 Create migration guide
- Document moving from current tests to Supertester
- Show before/after test examples
- List common pitfalls and solutions
- Reference comprehensive-otp-testing-standards.md
- Requirements: 8.3, 8.5
8. Validate and Optimize Test Suite
- Ensure all tests follow OTP standards
- Verify three-layer architecture works properly
- Requirements: 1.1, 1.4, 3.1
8.1 Validate test coverage
- Run coverage analysis per layer
- Ensure >95% coverage for DSPex modules
- Identify and fill coverage gaps
- Verify all components tested
- Requirements: 1.1, 1.4
8.2 Validate OTP compliance
- Run pattern check for Process.sleep
- Verify all tests use async: true
- Check Supertester helper usage
- Ensure proper isolation per layer
- Requirements: 1.3, 1.5
8.3 Optimize test execution times
- Measure Layer 1 execution (<100ms target)
- Measure Layer 2 execution (<500ms target)
- Measure Layer 3 execution (<5s target)
- Optimize slow tests while maintaining coverage
- Requirements: 3.1, 6.1
8.4 Validate three-layer architecture
- Ensure clean separation between layers
- Verify appropriate tests in each layer
- Test layer switching works correctly
- Document layer selection criteria
- Requirements: 3.1, 3.2, 3.3, 3.4
9. Final Integration Steps
- Complete DSPex integration with Supertester
- Ensure production readiness
- Requirements: 1.1, 8.5
9.1 Update project documentation
- Add Supertester to README dependencies
- Document three-layer testing approach
- Reference code standards compliance
- Update contribution guidelines
- Requirements: 8.1, 8.5
9.2 Configure CI/CD pipeline
- Set up matrix builds for three layers
- Add quality gates per layer
- Configure performance tracking
- Enable chaos test runs
- Requirements: 6.4, 8.4
9.3 Create DSPex test playbook
- Document when to use each layer
- Provide decision tree for test placement
- Include performance expectations
- Reference Supertester best practices
- Requirements: 8.3, 8.5