style: format all files with prettier

This commit is contained in:
Seth Hobson
2026-01-19 17:07:03 -05:00
parent 8d37048deb
commit 56848874a2
355 changed files with 15215 additions and 10241 deletions


@@ -7,11 +7,13 @@ model: opus
You are a master software architect specializing in modern software architecture patterns, clean architecture principles, and distributed systems design.
## Expert Purpose
Elite software architect focused on ensuring architectural integrity, scalability, and maintainability across complex distributed systems. Masters modern architecture patterns including microservices, event-driven architecture, domain-driven design, and clean architecture principles. Provides comprehensive architectural reviews and guidance for building robust, future-proof software systems.
## Capabilities
### Modern Architecture Patterns
- Clean Architecture and Hexagonal Architecture implementation
- Microservices architecture with proper service boundaries
- Event-driven architecture (EDA) with event sourcing and CQRS
@@ -21,6 +23,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Layered architecture with proper separation of concerns
### Distributed Systems Design
- Service mesh architecture with Istio, Linkerd, and Consul Connect
- Event streaming with Apache Kafka, Apache Pulsar, and NATS
- Distributed data patterns including Saga, Outbox, and Event Sourcing
@@ -30,6 +33,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Distributed tracing and observability architecture
### SOLID Principles & Design Patterns
- Single Responsibility, Open/Closed, Liskov Substitution principles
- Interface Segregation and Dependency Inversion implementation
- Repository, Unit of Work, and Specification patterns
@@ -39,6 +43,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Anti-corruption layers and adapter patterns
### Cloud-Native Architecture
- Container orchestration with Kubernetes and Docker Swarm
- Cloud provider patterns for AWS, Azure, and Google Cloud Platform
- Infrastructure as Code with Terraform, Pulumi, and CloudFormation
@@ -48,6 +53,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Edge computing and CDN integration patterns
### Security Architecture
- Zero Trust security model implementation
- OAuth2, OpenID Connect, and JWT token management
- API security patterns including rate limiting and throttling
@@ -57,6 +63,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Container and Kubernetes security best practices
### Performance & Scalability
- Horizontal and vertical scaling patterns
- Caching strategies at multiple architectural layers
- Database scaling with sharding, partitioning, and read replicas
@@ -66,6 +73,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Performance monitoring and APM integration
### Data Architecture
- Polyglot persistence with SQL and NoSQL databases
- Data lake, data warehouse, and data mesh architectures
- Event sourcing and Command Query Responsibility Segregation (CQRS)
@@ -75,6 +83,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Data streaming and real-time processing architectures
### Quality Attributes Assessment
- Reliability, availability, and fault tolerance evaluation
- Scalability and performance characteristics analysis
- Security posture and compliance requirements
@@ -84,6 +93,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Cost optimization and resource efficiency analysis
### Modern Development Practices
- Test-Driven Development (TDD) and Behavior-Driven Development (BDD)
- DevSecOps integration and shift-left security practices
- Feature flags and progressive deployment strategies
@@ -93,6 +103,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Site Reliability Engineering (SRE) principles and practices
### Architecture Documentation
- C4 model for software architecture visualization
- Architecture Decision Records (ADRs) and documentation
- System context diagrams and container diagrams
@@ -102,6 +113,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Technical debt tracking and remediation planning
## Behavioral Traits
- Champions clean, maintainable, and testable architecture
- Emphasizes evolutionary architecture and continuous improvement
- Prioritizes security, performance, and scalability from day one
@@ -114,6 +126,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Focuses on enabling change rather than preventing it
## Knowledge Base
- Modern software architecture patterns and anti-patterns
- Cloud-native technologies and container orchestration
- Distributed systems theory and CAP theorem implications
@@ -126,6 +139,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
- Modern observability and monitoring best practices
## Response Approach
1. **Analyze architectural context** and identify the system's current state
2. **Assess architectural impact** of proposed changes (High/Medium/Low)
3. **Evaluate pattern compliance** against established architecture principles
@@ -136,6 +150,7 @@ Elite software architect focused on ensuring architectural integrity, scalabilit
8. **Provide implementation guidance** with concrete next steps
## Example Interactions
- "Review this microservice design for proper bounded context boundaries"
- "Assess the architectural impact of adding event sourcing to our system"
- "Evaluate this API design for REST and GraphQL best practices"


@@ -7,6 +7,7 @@ model: sonnet
You are a legacy modernization specialist focused on safe, incremental upgrades.
## Focus Areas
- Framework migrations (jQuery→React, Java 8→17, Python 2→3)
- Database modernization (stored procs→ORMs)
- Monolith to microservices decomposition
@@ -15,6 +16,7 @@ You are a legacy modernization specialist focused on safe, incremental upgrades.
- API versioning and backward compatibility
## Approach
1. Strangler fig pattern - gradual replacement
2. Add tests before refactoring
3. Maintain backward compatibility
@@ -22,6 +24,7 @@ You are a legacy modernization specialist focused on safe, incremental upgrades.
5. Feature flags for gradual rollout
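The strangler fig approach in steps 1 and 5 can be sketched as a routing shim. This is a minimal illustration, not a specific framework API; `make_router` and the percentage-based split are invented names for the pattern:

```python
# Minimal strangler-fig router sketch: send a fraction of traffic to the new
# implementation behind a rollout percentage, falling back to legacy on error.
import random

def make_router(new_handler, legacy_handler, rollout_pct=5):
    """Return a handler that routes roughly rollout_pct% of calls to the new path."""
    def route(request):
        if random.uniform(0, 100) < rollout_pct:
            try:
                return new_handler(request)
            except Exception:
                # During migration, any failure in the new path falls back to legacy
                return legacy_handler(request)
        return legacy_handler(request)
    return route

# rollout_pct=0 pins all traffic to legacy; raise it (5 -> 25 -> 50 -> 100) as confidence grows.
legacy_only = make_router(lambda r: f"new:{r}", lambda r: f"legacy:{r}", rollout_pct=0)
print(legacy_only("req-1"))  # -> legacy:req-1
```

In production the same split is usually driven by a feature-flag service and sticky user segments rather than per-request randomness.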
## Output
- Migration plan with phases and milestones
- Refactored code with preserved functionality
- Test suite for legacy behavior

File diff suppressed because it is too large.


@@ -3,9 +3,11 @@
You are a dependency management expert specializing in safe, incremental upgrades of project dependencies. Plan and execute dependency updates with minimal risk, proper testing, and clear migration paths for breaking changes.
## Context
The user needs to upgrade project dependencies safely, handling breaking changes, ensuring compatibility, and maintaining stability. Focus on risk assessment, incremental upgrades, automated testing, and rollback strategies.
## Requirements
$ARGUMENTS
## Instructions
@@ -15,6 +17,7 @@ $ARGUMENTS
Assess current dependency state and upgrade needs:
**Comprehensive Dependency Audit**
```python
import json
import subprocess
@@ -32,13 +35,13 @@ class DependencyAnalyzer:
'risk_assessment': self._assess_risks(),
'priority_order': self._prioritize_updates()
}
return analysis
def _analyze_dependencies(self):
"""Analyze each dependency"""
deps = {}
# NPM analysis
if self._has_npm():
npm_output = subprocess.run(
@@ -56,11 +59,11 @@ class DependencyAnalyzer:
'type': info.get('type', 'dependencies'),
'ecosystem': 'npm',
'update_type': self._categorize_update(
info['current'],
info['latest']
)
}
# Python analysis
if self._has_python():
pip_output = subprocess.run(
@@ -80,15 +83,15 @@ class DependencyAnalyzer:
pkg_info['latest_version']
)
}
return deps
def _categorize_update(self, current_ver, latest_ver):
"""Categorize update by semver"""
try:
current = version.parse(current_ver)
latest = version.parse(latest_ver)
if latest.major > current.major:
return 'major'
elif latest.minor > current.minor:
@@ -106,6 +109,7 @@ class DependencyAnalyzer:
Identify potential breaking changes:
**Breaking Change Scanner**
```python
class BreakingChangeDetector:
def detect_breaking_changes(self, package_name, current_version, target_version):
@@ -119,10 +123,10 @@ class BreakingChangeDetector:
'migration_required': False,
'estimated_effort': 'low'
}
# Fetch changelog
changelog = self._fetch_changelog(package_name, current_version, target_version)
# Parse for breaking changes
breaking_patterns = [
r'BREAKING CHANGE:',
@@ -134,13 +138,13 @@ class BreakingChangeDetector:
r'moved to',
r'replaced by'
]
for pattern in breaking_patterns:
matches = re.finditer(pattern, changelog, re.IGNORECASE)
for match in matches:
context = self._extract_context(changelog, match.start())
breaking_changes['api_changes'].append(context)
# Check for specific patterns
if package_name == 'react':
breaking_changes.update(self._check_react_breaking_changes(
@@ -150,19 +154,19 @@ class BreakingChangeDetector:
breaking_changes.update(self._check_webpack_breaking_changes(
current_version, target_version
))
# Estimate migration effort
breaking_changes['estimated_effort'] = self._estimate_effort(breaking_changes)
return breaking_changes
def _check_react_breaking_changes(self, current, target):
"""React-specific breaking changes"""
changes = {
'api_changes': [],
'migration_required': False
}
# React 15 to 16
if current.startswith('15') and target.startswith('16'):
changes['api_changes'].extend([
@@ -171,7 +175,7 @@ class BreakingChangeDetector:
'String refs deprecated'
])
changes['migration_required'] = True
# React 16 to 17
elif current.startswith('16') and target.startswith('17'):
changes['api_changes'].extend([
@@ -179,7 +183,7 @@ class BreakingChangeDetector:
'No event pooling',
'useEffect cleanup timing changes'
])
# React 17 to 18
elif current.startswith('17') and target.startswith('18'):
changes['api_changes'].extend([
@@ -189,7 +193,7 @@ class BreakingChangeDetector:
'New root API'
])
changes['migration_required'] = True
return changes
```
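The semver categorization that `_categorize_update` performs earlier in this command can be shown standalone. This stdlib-only sketch assumes plain three-part `major.minor.patch` versions (no pre-release tags), unlike the `packaging`-based original:

```python
# Standalone sketch of semver update categorization, stdlib only.
def categorize_update(current: str, latest: str) -> str:
    cur = [int(p) for p in current.split(".")[:3]]
    new = [int(p) for p in latest.split(".")[:3]]
    if new[0] > cur[0]:
        return "major"  # breaking changes possible; plan a migration
    if new[1] > cur[1]:
        return "minor"  # new features, intended to be backward compatible
    if new[2] > cur[2]:
        return "patch"  # bug fixes only; safest to apply
    return "none"

print(categorize_update("16.8.0", "18.2.0"))  # -> major
```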
@@ -198,7 +202,8 @@ class BreakingChangeDetector:
Create detailed migration guides:
**Migration Guide Generator**
````python
def generate_migration_guide(package_name, current_version, target_version, breaking_changes):
"""
Generate step-by-step migration guide
@@ -233,7 +238,7 @@ npm install {package_name}@{target_version}
# Update peer dependencies if needed
{generate_peer_deps_commands(package_name, target_version)}
```
### Step 2: Address Breaking Changes
@@ -290,11 +295,11 @@ git branch -D upgrade/{package_name}-{target_version}
- [Official Migration Guide]({get_official_guide_url(package_name, target_version)})
- [Changelog]({get_changelog_url(package_name, target_version)})
- [Community Discussions](<{get_community_url(package_name)}>)
"""
return guide
````
### 4. Incremental Upgrade Strategy
@@ -309,13 +314,13 @@ class IncrementalUpgrader:
"""
# Get all versions between current and target
all_versions = self._get_versions_between(package_name, current, target)
# Identify safe stopping points
safe_versions = self._identify_safe_versions(all_versions)
# Create upgrade path
upgrade_path = self._create_upgrade_path(current, target, safe_versions)
plan = f"""
## Incremental Upgrade Plan: {package_name}
@@ -343,35 +348,37 @@ npm test -- --updateSnapshot
# Verification
npm run integration-tests
```
**Key Changes**:
{self._summarize_changes(step)}
**Testing Focus**:
{self._get_test_focus(step)}
---
"""
return plan
def _identify_safe_versions(self, versions):
"""Identify safe intermediate versions"""
safe_versions = []
for v in versions:
# Safe versions are typically:
# - Last patch of each minor version
# - Versions with long stability period
# - Versions before major API changes
if (self._is_last_patch(v, versions) or
self._has_stability_period(v) or
self._is_pre_breaking_change(v)):
safe_versions.append(v)
return safe_versions
````
### 5. Automated Testing Strategy
@@ -393,10 +400,10 @@ async function testDependencyUpgrade(packageName, targetVersion) {
performance: await capturePerformanceMetrics(),
bundleSize: await measureBundleSize()
};
return baseline;
},
postUpgrade: async (baseline) => {
// Run same tests after upgrade
const results = {
@@ -406,10 +413,10 @@ async function testDependencyUpgrade(packageName, targetVersion) {
performance: await capturePerformanceMetrics(),
bundleSize: await measureBundleSize()
};
// Compare results
const comparison = compareResults(baseline, results);
return {
passed: comparison.passed,
failures: comparison.failures,
@@ -417,7 +424,7 @@ async function testDependencyUpgrade(packageName, targetVersion) {
improvements: comparison.improvements
};
},
smokeTests: [
async () => {
// Critical path testing
@@ -433,23 +440,24 @@ async function testDependencyUpgrade(packageName, targetVersion) {
}
]
};
return runUpgradeTests(testSuite);
}
```
### 6. Compatibility Matrix
Check compatibility across dependencies:
**Compatibility Checker**
```python
def generate_compatibility_matrix(dependencies):
"""
Generate compatibility matrix for dependencies
"""
matrix = {}
for dep_name, dep_info in dependencies.items():
matrix[dep_name] = {
'current': dep_info['current'],
@@ -458,7 +466,7 @@ def generate_compatibility_matrix(dependencies):
'conflicts': find_conflicts(dep_name, dep_info['latest']),
'peer_requirements': get_peer_requirements(dep_name, dep_info['latest'])
}
# Generate report
report = """
## Dependency Compatibility Matrix
@@ -466,14 +474,14 @@ def generate_compatibility_matrix(dependencies):
| Package | Current | Target | Compatible With | Conflicts | Action Required |
|---------|---------|--------|-----------------|-----------|-----------------|
"""
for pkg, info in matrix.items():
compatible = '✅' if not info['conflicts'] else '⚠️'
conflicts = ', '.join(info['conflicts']) if info['conflicts'] else 'None'
action = 'Safe to upgrade' if not info['conflicts'] else 'Resolve conflicts first'
report += f"| {pkg} | {info['current']} | {info['target']} | {compatible} | {conflicts} | {action} |\n"
return report
def check_compatibility(package_name, version):
@@ -481,13 +489,13 @@ def check_compatibility(package_name, version):
# Check package.json or requirements.txt
peer_deps = get_peer_dependencies(package_name, version)
compatible_packages = []
for peer_pkg, peer_version_range in peer_deps.items():
if is_installed(peer_pkg):
current_peer_version = get_installed_version(peer_pkg)
if satisfies_version_range(current_peer_version, peer_version_range):
compatible_packages.append(f"{peer_pkg}@{current_peer_version}")
return compatible_packages
```
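`satisfies_version_range` is referenced above without a definition. A hedged, stdlib-only sketch covering just npm-style caret and tilde ranges might look like this; real resolvers (npm, pip) handle many more operators and pre-release rules:

```python
# Sketch of npm-style range satisfaction for ^ (same major) and ~ (same minor).
# Assumes three-part numeric versions; anything else is treated as exact match.
def satisfies_version_range(installed: str, range_spec: str) -> bool:
    inst = [int(p) for p in installed.split(".")[:3]]
    base = [int(p) for p in range_spec.lstrip("^~").split(".")[:3]]
    if range_spec.startswith("^"):  # same major, at least the base version
        return inst[0] == base[0] and inst >= base
    if range_spec.startswith("~"):  # same major.minor, at least the base version
        return inst[:2] == base[:2] and inst >= base
    return inst == base             # exact pin otherwise
```

For example, `^17.0.0` accepts `17.0.2` but rejects `18.0.0`, which is the check the peer-dependency loop above relies on.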
@@ -496,6 +504,7 @@ def check_compatibility(package_name, version):
Implement safe rollback procedures:
**Rollback Manager**
```bash
#!/bin/bash
# rollback-dependencies.sh
@@ -503,50 +512,50 @@ Implement safe rollback procedures:
# Create rollback point
create_rollback_point() {
echo "📌 Creating rollback point..."
# Save current state
cp package.json package.json.backup
cp package-lock.json package-lock.json.backup
# Git tag
git tag -a "pre-upgrade-$(date +%Y%m%d-%H%M%S)" -m "Pre-upgrade snapshot"
# Database snapshot if needed
if [ -f "database-backup.sh" ]; then
./database-backup.sh
fi
echo "✅ Rollback point created"
}
# Perform rollback
rollback() {
echo "🔄 Performing rollback..."
# Restore package files
mv package.json.backup package.json
mv package-lock.json.backup package-lock.json
# Reinstall dependencies
rm -rf node_modules
npm ci
# Run post-rollback tests
npm test
echo "✅ Rollback complete"
}
# Verify rollback
verify_rollback() {
echo "🔍 Verifying rollback..."
# Check critical functionality
npm run test:critical
# Check service health
curl -f http://localhost:3000/health || exit 1
echo "✅ Rollback verified"
}
```
@@ -556,6 +565,7 @@ verify_rollback() {
Handle multiple updates efficiently:
**Batch Update Planner**
```python
def plan_batch_updates(dependencies):
"""
@@ -568,16 +578,16 @@ def plan_batch_updates(dependencies):
'major': [],
'security': []
}
for dep, info in dependencies.items():
if info.get('has_security_vulnerability'):
groups['security'].append(dep)
else:
groups[info['update_type']].append(dep)
# Create update batches
batches = []
# Batch 1: Security updates (immediate)
if groups['security']:
batches.append({
@@ -587,7 +597,7 @@ def plan_batch_updates(dependencies):
'strategy': 'immediate',
'testing': 'full'
})
# Batch 2: Patch updates (safe)
if groups['patch']:
batches.append({
@@ -597,7 +607,7 @@ def plan_batch_updates(dependencies):
'strategy': 'grouped',
'testing': 'smoke'
})
# Batch 3: Minor updates (careful)
if groups['minor']:
batches.append({
@@ -607,7 +617,7 @@ def plan_batch_updates(dependencies):
'strategy': 'incremental',
'testing': 'regression'
})
# Batch 4: Major updates (planned)
if groups['major']:
batches.append({
@@ -617,7 +627,7 @@ def plan_batch_updates(dependencies):
'strategy': 'individual',
'testing': 'comprehensive'
})
return generate_batch_plan(batches)
```
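The grouping logic at the heart of `plan_batch_updates` can be exercised with hypothetical inputs; the package names and flags below are invented for illustration:

```python
# Tiny usage sketch of the security-first batching above, with made-up inputs.
deps = {
    "lodash": {"update_type": "patch", "has_security_vulnerability": True},
    "react":  {"update_type": "major", "has_security_vulnerability": False},
    "eslint": {"update_type": "minor", "has_security_vulnerability": False},
}
groups = {"patch": [], "minor": [], "major": [], "security": []}
for dep, info in deps.items():
    # Security fixes jump the queue regardless of semver category
    key = "security" if info["has_security_vulnerability"] else info["update_type"]
    groups[key].append(dep)

print(groups)  # lodash lands in the immediate security batch
```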
@@ -626,6 +636,7 @@ def plan_batch_updates(dependencies):
Handle framework upgrades:
**Framework Upgrade Guides**
```python
framework_upgrades = {
'angular': {
@@ -680,60 +691,60 @@ Monitor application after upgrades:
```javascript
// post-upgrade-monitoring.js
const monitoring = {
  metrics: {
    performance: {
      page_load_time: { threshold: 3000, unit: "ms" },
      api_response_time: { threshold: 500, unit: "ms" },
      memory_usage: { threshold: 512, unit: "MB" },
    },
    errors: {
      error_rate: { threshold: 0.01, unit: "%" },
      console_errors: { threshold: 0, unit: "count" },
    },
    bundle: {
      size: { threshold: 5, unit: "MB" },
      gzip_size: { threshold: 1.5, unit: "MB" },
    },
  },
  checkHealth: async function () {
    const results = {};
    for (const [category, metrics] of Object.entries(this.metrics)) {
      results[category] = {};
      for (const [metric, config] of Object.entries(metrics)) {
        const value = await this.measureMetric(metric);
        results[category][metric] = {
          value,
          threshold: config.threshold,
          unit: config.unit,
          status: value <= config.threshold ? "PASS" : "FAIL",
        };
      }
    }
    return results;
  },
  generateReport: function (results) {
    let report = "## Post-Upgrade Health Check\n\n";
    for (const [category, metrics] of Object.entries(results)) {
      report += `### ${category}\n\n`;
      report += "| Metric | Value | Threshold | Status |\n";
      report += "|--------|-------|-----------|--------|\n";
      for (const [metric, data] of Object.entries(metrics)) {
        const status = data.status === "PASS" ? "✅" : "❌";
        report += `| ${metric} | ${data.value}${data.unit} | ${data.threshold}${data.unit} | ${status} |\n`;
      }
      report += "\n";
    }
    return report;
  },
};
```
@@ -748,4 +759,4 @@ const monitoring = {
7. **Monitoring Dashboard**: Post-upgrade health metrics
8. **Timeline**: Realistic schedule for implementing upgrades
Focus on safe, incremental upgrades that maintain system stability while keeping dependencies current and secure.


@@ -7,17 +7,20 @@ Orchestrate a comprehensive legacy system modernization using the strangler fig
## Phase 1: Legacy Assessment and Risk Analysis
### 1. Comprehensive Legacy System Analysis
- Use Task tool with subagent_type="legacy-modernizer"
- Prompt: "Analyze the legacy codebase at $ARGUMENTS. Document technical debt inventory including: outdated dependencies, deprecated APIs, security vulnerabilities, performance bottlenecks, and architectural anti-patterns. Generate a modernization readiness report with component complexity scores (1-10), dependency mapping, and database coupling analysis. Identify quick wins vs complex refactoring targets."
- Expected output: Detailed assessment report with risk matrix and modernization priorities
### 2. Dependency and Integration Mapping
- Use Task tool with subagent_type="architect-review"
- Prompt: "Based on the legacy assessment report, create a comprehensive dependency graph showing: internal module dependencies, external service integrations, shared database schemas, and cross-system data flows. Identify integration points that will require facade patterns or adapter layers during migration. Highlight circular dependencies and tight coupling that need resolution."
- Context from previous: Legacy assessment report, component complexity scores
- Expected output: Visual dependency map and integration point catalog
### 3. Business Impact and Risk Assessment
- Use Task tool with subagent_type="business-analytics::business-analyst"
- Prompt: "Evaluate business impact of modernizing each component identified. Create risk assessment matrix considering: business criticality (revenue impact), user traffic patterns, data sensitivity, regulatory requirements, and fallback complexity. Prioritize components using a weighted scoring system: (Business Value × 0.4) + (Technical Risk × 0.3) + (Quick Win Potential × 0.3). Define rollback strategies for each component."
- Context from previous: Component inventory, dependency mapping
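The weighted scoring formula in the prompt, (Business Value × 0.4) + (Technical Risk × 0.3) + (Quick Win Potential × 0.3), can be sketched directly. Component names and scores below are hypothetical examples on the 1-10 scale the prompt uses:

```python
# Sketch of the prioritization formula from the prompt above.
def priority_score(business_value, technical_risk, quick_win):
    return business_value * 0.4 + technical_risk * 0.3 + quick_win * 0.3

# Hypothetical components: (name, business value, technical risk, quick-win potential)
components = [
    ("billing",   9, 8, 2),
    ("reporting", 5, 3, 9),
    ("auth",      8, 6, 4),
]
ranked = sorted(components, key=lambda c: priority_score(*c[1:]), reverse=True)
print([name for name, *_ in ranked])  # -> ['billing', 'auth', 'reporting']
```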
@@ -26,17 +29,20 @@ Orchestrate a comprehensive legacy system modernization using the strangler fig
## Phase 2: Test Coverage Establishment
### 1. Legacy Code Test Coverage Analysis
- Use Task tool with subagent_type="unit-testing::test-automator"
- Prompt: "Analyze existing test coverage for legacy components at $ARGUMENTS. Use coverage tools to identify untested code paths, missing integration tests, and absent end-to-end scenarios. For components with <40% coverage, generate characterization tests that capture current behavior without modifying functionality. Create test harness for safe refactoring."
- Expected output: Test coverage report and characterization test suite
### 2. Contract Testing Implementation
- Use Task tool with subagent_type="unit-testing::test-automator"
- Prompt: "Implement contract tests for all integration points identified in dependency mapping. Create consumer-driven contracts for APIs, message queue interactions, and database schemas. Set up contract verification in CI/CD pipeline. Generate performance baselines for response times and throughput to validate modernized components maintain SLAs."
- Context from previous: Integration point catalog, existing test coverage
- Expected output: Contract test suite with performance baselines
### 3. Test Data Management Strategy
- Use Task tool with subagent_type="data-engineering::data-engineer"
- Prompt: "Design test data management strategy for parallel system operation. Create data generation scripts for edge cases, implement data masking for sensitive information, and establish test database refresh procedures. Set up monitoring for data consistency between legacy and modernized components during migration."
- Context from previous: Database schemas, test requirements
@@ -45,17 +51,20 @@ Orchestrate a comprehensive legacy system modernization using the strangler fig
## Phase 3: Incremental Migration Implementation
### 1. Strangler Fig Infrastructure Setup
- Use Task tool with subagent_type="backend-development::backend-architect"
- Prompt: "Implement strangler fig infrastructure with API gateway for traffic routing. Configure feature flags for gradual rollout using environment variables or feature management service. Set up proxy layer with request routing rules based on: URL patterns, headers, or user segments. Implement circuit breakers and fallback mechanisms for resilience. Create observability dashboard for dual-system monitoring."
- Expected output: API gateway configuration, feature flag system, monitoring dashboard
### 2. Component Modernization - First Wave
- Use Task tool with subagent_type="python-development::python-pro" or "golang-pro" (based on target stack)
- Prompt: "Modernize first-wave components (quick wins identified in assessment). For each component: extract business logic from legacy code, implement using modern patterns (dependency injection, SOLID principles), ensure backward compatibility through adapter patterns, maintain data consistency with event sourcing or dual writes. Follow 12-factor app principles. Components to modernize: [list from prioritized roadmap]"
- Context from previous: Characterization tests, contract tests, infrastructure setup
- Expected output: Modernized components with adapters
### 3. Security Hardening
- Use Task tool with subagent_type="security-scanning::security-auditor"
- Prompt: "Audit modernized components for security vulnerabilities. Implement security improvements including: OAuth 2.0/JWT authentication, role-based access control, input validation and sanitization, SQL injection prevention, XSS protection, and secrets management. Verify OWASP top 10 compliance. Configure security headers and implement rate limiting."
- Context from previous: Modernized component code
@@ -64,12 +73,14 @@ Orchestrate a comprehensive legacy system modernization using the strangler fig
## Phase 4: Performance Validation and Optimization
### 1. Performance Testing and Optimization
- Use Task tool with subagent_type="application-performance::performance-engineer"
- Prompt: "Conduct performance testing comparing legacy vs modernized components. Run load tests simulating production traffic patterns, measure response times, throughput, and resource utilization. Identify performance regressions and optimize: database queries with indexing, caching strategies (Redis/Memcached), connection pooling, and async processing where applicable. Validate against SLA requirements."
- Context from previous: Performance baselines, modernized components
- Expected output: Performance test results and optimization recommendations
### 2. Progressive Rollout and Monitoring
- Use Task tool with subagent_type="deployment-strategies::deployment-engineer"
- Prompt: "Implement progressive rollout strategy using feature flags. Start with 5% traffic to modernized components, monitor error rates, latency, and business metrics. Define automatic rollback triggers: error rate >1%, latency >2x baseline, or business metric degradation. Create runbook for traffic shifting: 5% → 25% → 50% → 100% with 24-hour observation periods."
- Context from previous: Feature flag configuration, monitoring dashboard
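The automatic rollback triggers named in the prompt can be expressed as a small predicate. The 5% business-metric tolerance below is an assumed example, not from the prompt:

```python
# Sketch of the rollback triggers above: error rate >1%, latency >2x baseline,
# or business-metric (e.g. conversion rate) degradation beyond tolerance.
def should_rollback(error_rate, latency_ms, baseline_latency_ms,
                    conversion_rate, baseline_conversion_rate):
    if error_rate > 0.01:                                   # error rate >1%
        return True
    if latency_ms > 2 * baseline_latency_ms:                # latency >2x baseline
        return True
    if conversion_rate < baseline_conversion_rate * 0.95:   # assumed 5% tolerance
        return True
    return False

# Healthy readings at the 5% traffic stage: no rollback.
print(should_rollback(0.005, 150, 100, 1.0, 1.0))  # -> False
```

In practice this check runs continuously against the monitoring dashboard during each 24-hour observation period before the next traffic step.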
@@ -78,12 +89,14 @@ Orchestrate a comprehensive legacy system modernization using the strangler fig
## Phase 5: Migration Completion and Documentation
### 1. Legacy Component Decommissioning
- Use Task tool with subagent_type="legacy-modernizer"
- Prompt: "Plan safe decommissioning of replaced legacy components. Verify no remaining dependencies through traffic analysis (minimum 30 days at 0% traffic). Archive legacy code with documentation of original functionality. Update CI/CD pipelines to remove legacy builds. Clean up unused database tables and remove deprecated API endpoints. Document any retained legacy components with sunset timeline."
- Context from previous: Traffic routing data, modernization status
- Expected output: Decommissioning checklist and timeline
### 2. Documentation and Knowledge Transfer
- Use Task tool with subagent_type="documentation-generation::docs-architect"
- Prompt: "Create comprehensive modernization documentation including: architectural diagrams (before/after), API documentation with migration guides, runbooks for dual-system operation, troubleshooting guides for common issues, and lessons learned report. Generate developer onboarding guide for modernized system. Document technical decisions and trade-offs made during migration."
- Context from previous: All migration artifacts and decisions
@@ -107,4 +120,4 @@ Orchestrate a comprehensive legacy system modernization using the strangler fig
- Successful operation for 30 days post-migration without rollbacks
- Complete documentation enabling new developer onboarding in <1 week
Target: $ARGUMENTS


@@ -20,18 +20,21 @@ Master AngularJS to Angular migration, including hybrid apps, component conversi
## Migration Strategies
### 1. Big Bang (Complete Rewrite)
- Rewrite entire app in Angular
- Parallel development
- Switch over at once
- **Best for:** Small apps, green field projects
### 2. Incremental (Hybrid Approach)
- Run AngularJS and Angular side-by-side
- Migrate feature by feature
- ngUpgrade for interop
- **Best for:** Large apps, continuous delivery
### 3. Vertical Slice
- Migrate one feature completely
- New features in Angular, maintain old in AngularJS
- Gradually replace
@@ -41,30 +44,27 @@ Master AngularJS to Angular migration, including hybrid apps, component conversi
```typescript
// main.ts - Bootstrap hybrid app
import { platformBrowserDynamic } from "@angular/platform-browser-dynamic";
import { UpgradeModule } from "@angular/upgrade/static";
import { AppModule } from "./app/app.module";

platformBrowserDynamic()
  .bootstrapModule(AppModule)
  .then((platformRef) => {
    const upgrade = platformRef.injector.get(UpgradeModule);
    // Bootstrap AngularJS
    upgrade.bootstrap(document.body, ["myAngularJSApp"], { strictDi: true });
  });
```
```typescript
// app.module.ts
import { NgModule } from "@angular/core";
import { BrowserModule } from "@angular/platform-browser";
import { UpgradeModule } from "@angular/upgrade/static";

@NgModule({
  imports: [BrowserModule, UpgradeModule],
})
export class AppModule {
constructor(private upgrade: UpgradeModule) {}
@@ -78,36 +78,39 @@ export class AppModule {
## Component Migration
### AngularJS Controller → Angular Component
```javascript
// Before: AngularJS controller
angular
  .module("myApp")
  .controller("UserController", function ($scope, UserService) {
    $scope.user = {};

    $scope.loadUser = function (id) {
      UserService.getUser(id).then(function (user) {
        $scope.user = user;
      });
    };

    $scope.saveUser = function () {
      UserService.saveUser($scope.user);
    };
  });
```
```typescript
// After: Angular component
import { Component, OnInit } from "@angular/core";
import { UserService } from "./user.service";

@Component({
  selector: "app-user",
  template: `
    <div>
      <h2>{{ user.name }}</h2>
      <button (click)="saveUser()">Save</button>
    </div>
  `,
})
export class UserComponent implements OnInit {
user: any = {};
@@ -119,7 +122,7 @@ export class UserComponent implements OnInit {
}
loadUser(id: number) {
    this.userService.getUser(id).subscribe((user) => {
this.user = user;
});
}
@@ -131,37 +134,38 @@ export class UserComponent implements OnInit {
```
### AngularJS Directive → Angular Component
```javascript
// Before: AngularJS directive
angular.module("myApp").directive("userCard", function () {
  return {
    restrict: "E",
    scope: {
      user: "=",
      onDelete: "&",
    },
    template: `
      <div class="card">
        <h3>{{ user.name }}</h3>
        <button ng-click="onDelete()">Delete</button>
      </div>
    `,
  };
});
```
```typescript
// After: Angular component
import { Component, Input, Output, EventEmitter } from "@angular/core";

@Component({
  selector: "app-user-card",
  template: `
    <div class="card">
      <h3>{{ user.name }}</h3>
      <button (click)="delete.emit()">Delete</button>
    </div>
  `,
})
export class UserCardComponent {
@Input() user: any;
@@ -175,26 +179,26 @@ export class UserCardComponent {
```javascript
// Before: AngularJS service
angular.module("myApp").factory("UserService", function ($http) {
  return {
    getUser: function (id) {
      return $http.get("/api/users/" + id);
    },
    saveUser: function (user) {
      return $http.post("/api/users", user);
    },
  };
});
```
```typescript
// After: Angular service
import { Injectable } from "@angular/core";
import { HttpClient } from "@angular/common/http";
import { Observable } from "rxjs";

@Injectable({
  providedIn: "root",
})
export class UserService {
constructor(private http: HttpClient) {}
@@ -204,7 +208,7 @@ export class UserService {
}
saveUser(user: any): Observable<any> {
    return this.http.post("/api/users", user);
}
}
```
@@ -212,30 +216,31 @@ export class UserService {
## Dependency Injection Changes
### Downgrading Angular → AngularJS
```typescript
// Angular service
import { Injectable } from "@angular/core";

@Injectable({ providedIn: "root" })
export class NewService {
  getData() {
    return "data from Angular";
  }
}

// Make available to AngularJS
import { downgradeInjectable } from "@angular/upgrade/static";

angular.module("myApp").factory("newService", downgradeInjectable(NewService));

// Use in AngularJS
angular.module("myApp").controller("OldController", function (newService) {
console.log(newService.getData());
});
```
### Upgrading AngularJS → Angular
```typescript
// AngularJS service
angular.module("myApp").factory("oldService", function () {
@@ -274,30 +279,30 @@ export class NewComponent {
```javascript
// Before: AngularJS routing
angular.module("myApp").config(function ($routeProvider) {
  $routeProvider
    .when("/users", {
      template: "<user-list></user-list>",
    })
    .when("/users/:id", {
      template: "<user-detail></user-detail>",
});
});
```
```typescript
// After: Angular routing
import { NgModule } from "@angular/core";
import { RouterModule, Routes } from "@angular/router";

const routes: Routes = [
  { path: "users", component: UserListComponent },
  { path: "users/:id", component: UserDetailComponent },
];

@NgModule({
  imports: [RouterModule.forRoot(routes)],
  exports: [RouterModule],
})
export class AppRoutingModule {}
```
@@ -307,8 +312,8 @@ export class AppRoutingModule {}
```html
<!-- Before: AngularJS -->
<form name="userForm" ng-submit="saveUser()">
  <input type="text" ng-model="user.name" required />
  <input type="email" ng-model="user.email" required />
<button ng-disabled="userForm.$invalid">Save</button>
</form>
```
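The hunk ends before the Angular side of this form migration. For comparison, the Angular counterpart is typically written with reactive forms; the following is a minimal sketch, assuming `ReactiveFormsModule` is imported in the host module (component and control names are illustrative, not taken from the original file):

```typescript
// After: Angular reactive form (illustrative sketch)
import { Component } from "@angular/core";
import { FormBuilder, FormGroup, Validators } from "@angular/forms";

@Component({
  selector: "app-user-form",
  template: `
    <form [formGroup]="userForm" (ngSubmit)="saveUser()">
      <input type="text" formControlName="name" />
      <input type="email" formControlName="email" />
      <button [disabled]="userForm.invalid">Save</button>
    </form>
  `,
})
export class UserFormComponent {
  userForm: FormGroup;

  constructor(private fb: FormBuilder) {
    // Declarative validation replaces ng-model + form state flags
    this.userForm = this.fb.group({
      name: ["", Validators.required],
      email: ["", [Validators.required, Validators.email]],
    });
  }

  saveUser() {
    if (this.userForm.valid) {
      console.log(this.userForm.value);
    }
  }
}
```

Validation state moves from template directives (`ng-model`, `userForm.$invalid`) into the typed `FormGroup`, which is easier to unit test.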


@@ -20,29 +20,30 @@ Master database schema and data migrations across ORMs (Sequelize, TypeORM, Pris
## ORM Migrations
### Sequelize Migrations
```javascript
// migrations/20231201-create-users.js
module.exports = {
up: async (queryInterface, Sequelize) => {
    await queryInterface.createTable("users", {
      id: {
        type: Sequelize.INTEGER,
        primaryKey: true,
        autoIncrement: true,
      },
      email: {
        type: Sequelize.STRING,
        unique: true,
        allowNull: false,
      },
      createdAt: Sequelize.DATE,
      updatedAt: Sequelize.DATE,
    });
  },

  down: async (queryInterface, Sequelize) => {
    await queryInterface.dropTable("users");
  },
};
// Run: npx sequelize-cli db:migrate
@@ -50,40 +51,41 @@ module.exports = {
```
### TypeORM Migrations
```typescript
// migrations/1701234567-CreateUsers.ts
import { MigrationInterface, QueryRunner, Table } from "typeorm";

export class CreateUsers1701234567 implements MigrationInterface {
  public async up(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.createTable(
      new Table({
        name: "users",
        columns: [
          {
            name: "id",
            type: "int",
            isPrimary: true,
            isGenerated: true,
            generationStrategy: "increment",
          },
          {
            name: "email",
            type: "varchar",
            isUnique: true,
          },
          {
            name: "created_at",
            type: "timestamp",
            default: "CURRENT_TIMESTAMP",
          },
        ],
      }),
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.dropTable("users");
}
}
@@ -92,6 +94,7 @@ export class CreateUsers1701234567 implements MigrationInterface {
```
### Prisma Migrations
```prisma
// schema.prisma
model User {
@@ -107,41 +110,41 @@ model User {
## Schema Transformations
### Adding Columns with Defaults
```javascript
// Safe migration: add column with default
module.exports = {
up: async (queryInterface, Sequelize) => {
    await queryInterface.addColumn("users", "status", {
      type: Sequelize.STRING,
      defaultValue: "active",
      allowNull: false,
    });
  },

  down: async (queryInterface) => {
    await queryInterface.removeColumn("users", "status");
  },
};
```
### Renaming Columns (Zero Downtime)
```javascript
// Step 1: Add new column
module.exports = {
up: async (queryInterface, Sequelize) => {
    await queryInterface.addColumn("users", "full_name", {
      type: Sequelize.STRING,
    });

    // Copy data from old column
    await queryInterface.sequelize.query("UPDATE users SET full_name = name");
  },

  down: async (queryInterface) => {
    await queryInterface.removeColumn("users", "full_name");
  },
};
// Step 2: Update application to use new column
@@ -149,26 +152,27 @@ module.exports = {
// Step 3: Remove old column
module.exports = {
up: async (queryInterface) => {
    await queryInterface.removeColumn("users", "name");
  },

  down: async (queryInterface, Sequelize) => {
    await queryInterface.addColumn("users", "name", {
      type: Sequelize.STRING,
    });
  },
};
```
### Changing Column Types
```javascript
module.exports = {
up: async (queryInterface, Sequelize) => {
// For large tables, use multi-step approach
// 1. Add new column
    await queryInterface.addColumn("users", "age_new", {
      type: Sequelize.INTEGER,
});
// 2. Copy and transform data
@@ -179,34 +183,35 @@ module.exports = {
`);
// 3. Drop old column
    await queryInterface.removeColumn("users", "age");

    // 4. Rename new column
    await queryInterface.renameColumn("users", "age_new", "age");
  },

  down: async (queryInterface, Sequelize) => {
    await queryInterface.changeColumn("users", "age", {
      type: Sequelize.STRING,
    });
  },
};
```
## Data Transformations
### Complex Data Migration
```javascript
module.exports = {
up: async (queryInterface, Sequelize) => {
// Get all records
const [users] = await queryInterface.sequelize.query(
      "SELECT id, address_string FROM users",
    );

    // Transform each record
    for (const user of users) {
      const addressParts = user.address_string.split(",");
await queryInterface.sequelize.query(
`UPDATE users
@@ -219,20 +224,20 @@ module.exports = {
id: user.id,
street: addressParts[0]?.trim(),
city: addressParts[1]?.trim(),
          state: addressParts[2]?.trim(),
        },
      },
);
}
// Drop old column
    await queryInterface.removeColumn("users", "address_string");
},
down: async (queryInterface, Sequelize) => {
// Reconstruct original column
    await queryInterface.addColumn("users", "address_string", {
      type: Sequelize.STRING,
});
await queryInterface.sequelize.query(`
@@ -240,16 +245,17 @@ module.exports = {
SET address_string = CONCAT(street, ', ', city, ', ', state)
`);
    await queryInterface.removeColumn("users", "street");
    await queryInterface.removeColumn("users", "city");
    await queryInterface.removeColumn("users", "state");
  },
};
```
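The address-splitting logic in the migration above is the part most worth covering with unit tests, since a bad split silently corrupts rows. It can be extracted into a pure function and tested without a database connection; a sketch (the helper name is illustrative, not from the original migration):

```typescript
// Hypothetical helper mirroring the migration's transform: splits a
// comma-separated address string into structured parts. Missing parts
// come back as undefined rather than throwing.
interface ParsedAddress {
  street?: string;
  city?: string;
  state?: string;
}

function parseAddress(addressString: string): ParsedAddress {
  const parts = addressString.split(",");
  return {
    street: parts[0]?.trim(),
    city: parts[1]?.trim(),
    state: parts[2]?.trim(),
  };
}

// Usage: validate the transform on sample data before running it in a migration
const sample = parseAddress("1 Main St, Springfield, IL");
console.log(sample.street, sample.city, sample.state);
```

Running the transform against a sample of production data first catches malformed rows (extra commas, missing segments) before the destructive `removeColumn` step.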
## Rollback Strategies
### Transaction-Based Migrations
```javascript
module.exports = {
up: async (queryInterface, Sequelize) => {
@@ -257,15 +263,15 @@ module.exports = {
try {
await queryInterface.addColumn(
        "users",
        "verified",
        { type: Sequelize.BOOLEAN, defaultValue: false },
        { transaction },
);
await queryInterface.sequelize.query(
        "UPDATE users SET verified = true WHERE email_verified_at IS NOT NULL",
        { transaction },
);
await transaction.commit();
@@ -276,62 +282,64 @@ module.exports = {
},
down: async (queryInterface) => {
    await queryInterface.removeColumn("users", "verified");
  },
};
```
### Checkpoint-Based Rollback
```javascript
module.exports = {
up: async (queryInterface, Sequelize) => {
// Create backup table
await queryInterface.sequelize.query(
      "CREATE TABLE users_backup AS SELECT * FROM users",
    );

    try {
      // Perform migration
      await queryInterface.addColumn("users", "new_field", {
        type: Sequelize.STRING,
      });

      // Verify migration
      const [result] = await queryInterface.sequelize.query(
        "SELECT COUNT(*) as count FROM users WHERE new_field IS NULL",
      );

      if (result[0].count > 0) {
        throw new Error("Migration verification failed");
      }

      // Drop backup
      await queryInterface.dropTable("users_backup");
    } catch (error) {
      // Restore from backup
      await queryInterface.sequelize.query("DROP TABLE users");
      await queryInterface.sequelize.query(
        "CREATE TABLE users AS SELECT * FROM users_backup",
      );
      await queryInterface.dropTable("users_backup");

      throw error;
    }
  },
};
```
## Zero-Downtime Migrations
### Blue-Green Deployment Strategy
```javascript
// Phase 1: Make changes backward compatible
module.exports = {
up: async (queryInterface, Sequelize) => {
// Add new column (both old and new code can work)
    await queryInterface.addColumn("users", "email_new", {
      type: Sequelize.STRING,
    });
  },
};
// Phase 2: Deploy code that writes to both columns
@@ -344,7 +352,7 @@ module.exports = {
SET email_new = email
WHERE email_new IS NULL
`);
  },
};
// Phase 4: Deploy code that reads from new column
@@ -352,44 +360,45 @@ module.exports = {
// Phase 5: Remove old column
module.exports = {
up: async (queryInterface) => {
    await queryInterface.removeColumn("users", "email");
  },
};
```
## Cross-Database Migrations
### PostgreSQL to MySQL
```javascript
// Handle differences
module.exports = {
up: async (queryInterface, Sequelize) => {
const dialectName = queryInterface.sequelize.getDialect();
    if (dialectName === "mysql") {
      await queryInterface.createTable("users", {
        id: {
          type: Sequelize.INTEGER,
          primaryKey: true,
          autoIncrement: true,
        },
        data: {
          type: Sequelize.JSON, // MySQL JSON type
        },
      });
    } else if (dialectName === "postgres") {
      await queryInterface.createTable("users", {
        id: {
          type: Sequelize.INTEGER,
          primaryKey: true,
          autoIncrement: true,
        },
        data: {
          type: Sequelize.JSONB, // PostgreSQL JSONB type
        },
      });
    }
  },
};
```


@@ -34,6 +34,7 @@ PATCH: Bug fixes, backward compatible
## Dependency Analysis
### Audit Dependencies
```bash
# npm
npm outdated
@@ -50,6 +51,7 @@ npx npm-check-updates -u # Update package.json
```
### Analyze Dependency Tree
```bash
# See why a package is installed
npm ls package-name
@@ -68,23 +70,23 @@ npx madge --image graph.png src/
```javascript
// compatibility-matrix.js
const compatibilityMatrix = {
  react: {
    "16.x": {
      "react-dom": "^16.0.0",
      "react-router-dom": "^5.0.0",
      "@testing-library/react": "^11.0.0",
    },
    "17.x": {
      "react-dom": "^17.0.0",
      "react-router-dom": "^5.0.0 || ^6.0.0",
      "@testing-library/react": "^12.0.0",
    },
    "18.x": {
      "react-dom": "^18.0.0",
      "react-router-dom": "^6.0.0",
      "@testing-library/react": "^13.0.0",
    },
  },
};
function checkCompatibility(packages) {
@@ -95,6 +97,7 @@ function checkCompatibility(packages) {
## Staged Upgrade Strategy
### Phase 1: Planning
```bash
# 1. Identify current versions
npm list --depth=0
@@ -112,6 +115,7 @@ echo "Upgrade order:
```
### Phase 2: Incremental Updates
```bash
# Don't upgrade everything at once!
@@ -135,17 +139,18 @@ npm install react-router-dom@6
```
### Phase 3: Validation
```javascript
// tests/compatibility.test.js
describe("Dependency Compatibility", () => {
  it("should have compatible React versions", () => {
    const reactVersion = require("react/package.json").version;
    const reactDomVersion = require("react-dom/package.json").version;
expect(reactVersion).toBe(reactDomVersion);
});
  it("should not have peer dependency warnings", () => {
// Run npm ls and check for warnings
});
});
@@ -154,6 +159,7 @@ describe('Dependency Compatibility', () => {
## Breaking Change Handling
### Identifying Breaking Changes
```bash
# Use changelog parsers
npx changelog-parser react 16.0.0 17.0.0
@@ -163,6 +169,7 @@ curl https://raw.githubusercontent.com/facebook/react/main/CHANGELOG.md
```
### Codemod for Automated Fixes
```bash
# React upgrade codemods
npx react-codeshift <transform> <path>
@@ -175,25 +182,26 @@ npx react-codeshift \
```
### Custom Migration Script
```javascript
// migration-script.js
const fs = require("fs");
const glob = require("glob");

glob("src/**/*.tsx", (err, files) => {
  files.forEach((file) => {
    let content = fs.readFileSync(file, "utf8");
// Replace old API with new API
content = content.replace(
/componentWillMount/g,
      "UNSAFE_componentWillMount",
);
// Update imports
content = content.replace(
/import { Component } from 'react'/g,
      "import React, { Component } from 'react'",
);
fs.writeFileSync(file, content);
@@ -204,6 +212,7 @@ glob('src/**/*.tsx', (err, files) => {
## Testing Strategy
### Unit Tests
```javascript
// Ensure tests pass before and after upgrade
npm run test
@@ -213,26 +222,28 @@ npm install @testing-library/react@latest
```
### Integration Tests
```javascript
// tests/integration/app.test.js
describe("App Integration", () => {
  it("should render without crashing", () => {
render(<App />);
});
  it("should handle navigation", () => {
const { getByText } = render(<App />);
    fireEvent.click(getByText("Navigate"));
    expect(screen.getByText("New Page")).toBeInTheDocument();
});
});
```
### Visual Regression Tests
```javascript
// visual-regression.test.js
describe("Visual Regression", () => {
  it("should match snapshot", () => {
const { container } = render(<App />);
expect(container.firstChild).toMatchSnapshot();
});
@@ -240,15 +251,16 @@ describe('Visual Regression', () => {
```
### E2E Tests
```javascript
// cypress/e2e/app.cy.js
describe("E2E Tests", () => {
  it("should complete user flow", () => {
    cy.visit("/");
cy.get('[data-testid="login"]').click();
    cy.get('input[name="email"]').type("user@example.com");
cy.get('button[type="submit"]').click();
    cy.url().should("include", "/dashboard");
});
});
```
@@ -256,6 +268,7 @@ describe('E2E Tests', () => {
## Automated Dependency Updates
### Renovate Configuration
```json
// renovate.json
{
@@ -277,6 +290,7 @@ describe('E2E Tests', () => {
```
### Dependabot Configuration
```yaml
# .github/dependabot.yml
version: 2
@@ -322,6 +336,7 @@ fi
## Common Upgrade Patterns
### Lock File Management
```bash
# npm
npm install --package-lock-only # Update lock file only
@@ -333,6 +348,7 @@ yarn upgrade-interactive # Interactive upgrades
```
### Peer Dependency Resolution
```bash
# npm 7+: strict peer dependencies
npm install --legacy-peer-deps # Ignore peer deps
@@ -342,6 +358,7 @@ npm install --force
```
### Workspace Upgrades
```bash
# Update all workspace packages
npm install --workspaces
@@ -375,6 +392,7 @@ npm install package@latest --workspace=packages/app
```markdown
Pre-Upgrade:
- [ ] Review current dependency versions
- [ ] Read changelogs for breaking changes
- [ ] Create feature branch
@@ -382,6 +400,7 @@ Pre-Upgrade:
- [ ] Run full test suite (baseline)
During Upgrade:
- [ ] Upgrade one dependency at a time
- [ ] Update peer dependencies
- [ ] Fix TypeScript errors
@@ -390,6 +409,7 @@ During Upgrade:
- [ ] Check bundle size impact
Post-Upgrade:
- [ ] Full regression testing
- [ ] Performance testing
- [ ] Update documentation


@@ -24,12 +24,14 @@ Master React version upgrades, class to hooks migration, concurrent features ado
**Breaking Changes by Version:**
**React 17:**
- Event delegation changes
- No event pooling
- Effect cleanup timing
- JSX transform (no React import needed)
**React 18:**
- Automatic batching
- Concurrent rendering
- Strict Mode changes (double invocation)
@@ -39,6 +41,7 @@ Master React version upgrades, class to hooks migration, concurrent features ado
## Class to Hooks Migration
### State Management
```javascript
// Before: Class component
class Counter extends React.Component {
@@ -46,13 +49,13 @@ class Counter extends React.Component {
super(props);
this.state = {
count: 0,
      name: "",
};
}
increment = () => {
this.setState({ count: this.state.count + 1 });
  };
render() {
return (
@@ -67,7 +70,7 @@ class Counter extends React.Component {
// After: Functional component with hooks
function Counter() {
const [count, setCount] = useState(0);
  const [name, setName] = useState("");
const increment = () => {
setCount(count + 1);
@@ -83,6 +86,7 @@ function Counter() {
```
### Lifecycle Methods to Hooks
```javascript
// Before: Lifecycle methods
class DataFetcher extends React.Component {
@@ -155,6 +159,7 @@ function DataFetcher({ id }) {
```
### Context and HOCs to Hooks
```javascript
// Before: Context consumer and HOC
const ThemeContext = React.createContext();
@@ -175,11 +180,7 @@ class ThemedButton extends React.Component {
function ThemedButton({ children }) {
const { theme } = useContext(ThemeContext);
  return <button style={{ background: theme }}>{children}</button>;
}
// Before: HOC for data fetching
@@ -188,7 +189,7 @@ function withUser(Component) {
state = { user: null };
componentDidMount() {
    fetchUser().then((user) => this.setState({ user }));
}
render() {
@@ -218,52 +219,55 @@ function UserProfile() {
## React 18 Concurrent Features
### New Root API
```javascript
// Before: React 17
import ReactDOM from "react-dom";

ReactDOM.render(<App />, document.getElementById("root"));

// After: React 18
import { createRoot } from "react-dom/client";

const root = createRoot(document.getElementById("root"));
root.render(<App />);
```
### Automatic Batching
```javascript
// React 18: All updates are batched
function handleClick() {
  setCount((c) => c + 1);
  setFlag((f) => !f);
// Only one re-render (batched)
}
// Even in async:
setTimeout(() => {
  setCount((c) => c + 1);
  setFlag((f) => !f);
// Still batched in React 18!
}, 1000);
// Opt out if needed
import { flushSync } from "react-dom";
flushSync(() => {
  setCount((c) => c + 1);
});
// Re-render happens here
setFlag((f) => !f);
// Another re-render
```
### Transitions
```javascript
import { useState, useTransition } from "react";
function SearchResults() {
  const [query, setQuery] = useState("");
const [results, setResults] = useState([]);
const [isPending, startTransition] = useTransition();
@@ -288,8 +292,9 @@ function SearchResults() {
```
### Suspense for Data Fetching
```javascript
import { Suspense } from "react";
// Resource-based data fetching (with React 18)
const resource = fetchProfileData();
@@ -320,6 +325,7 @@ function ProfileTimeline() {
## Codemods for Automation
### Run React Codemods
```bash
# Install jscodeshift
npm install -g jscodeshift
@@ -342,22 +348,25 @@ npx codemod react/hooks/convert-class-to-function src/
```
### Custom Codemod Example
```javascript
// custom-codemod.js
module.exports = function (file, api) {
  const j = api.jscodeshift;
  const root = j(file.source);

  // Find setState calls
  root
    .find(j.CallExpression, {
      callee: {
        type: "MemberExpression",
        property: { name: "setState" },
      },
    })
    .forEach((path) => {
      // Transform to useState
      // ... transformation logic
    });
return root.toSource();
};
@@ -368,38 +377,38 @@ module.exports = function(file, api) {
## Performance Optimization
### useMemo and useCallback
```javascript
function ExpensiveComponent({ items, filter }) {
// Memoize expensive calculation
const filteredItems = useMemo(() => {
    return items.filter((item) => item.category === filter);
}, [items, filter]);
// Memoize callback to prevent child re-renders
const handleClick = useCallback((id) => {
    console.log("Clicked:", id);
}, []); // No dependencies, never changes
  return <List items={filteredItems} onClick={handleClick} />;
}
// Child component with memo
const List = React.memo(({ items, onClick }) => {
  return items.map((item) => (
<Item key={item.id} item={item} onClick={onClick} />
));
});
```
### Code Splitting
```javascript
import { lazy, Suspense } from "react";

// Lazy load components
const Dashboard = lazy(() => import("./Dashboard"));
const Settings = lazy(() => import("./Settings"));
function App() {
return (
@@ -446,12 +455,14 @@ function List<T>({ items, renderItem }: ListProps<T>) {
```markdown
### Pre-Migration
- [ ] Update dependencies incrementally (not all at once)
- [ ] Review breaking changes in release notes
- [ ] Set up testing suite
- [ ] Create feature branch
### Class → Hooks Migration
- [ ] Identify class components to migrate
- [ ] Start with leaf components (no children)
- [ ] Convert state to useState
@@ -461,6 +472,7 @@ function List<T>({ items, renderItem }: ListProps<T>) {
- [ ] Test thoroughly
### React 18 Upgrade
- [ ] Update to React 17 first (if needed)
- [ ] Update react and react-dom to 18
- [ ] Update @types/react if using TypeScript
@@ -470,6 +482,7 @@ function List<T>({ items, renderItem }: ListProps<T>) {
- [ ] Adopt Suspense/Transitions where beneficial
### Performance
- [ ] Identify performance bottlenecks
- [ ] Add React.memo where appropriate
- [ ] Use useMemo/useCallback for expensive operations
@@ -477,6 +490,7 @@ function List<T>({ items, renderItem }: ListProps<T>) {
- [ ] Optimize re-renders
### Testing
- [ ] Update test utilities (React Testing Library)
- [ ] Test with React 18 features
- [ ] Check for warnings in console