review of all AI-dev system #01

add refactoring phase
complete implementation phase
fix wrong links and file names
This commit is contained in:
Oleksandr Bezdieniezhnykh
2025-12-09 12:11:29 +02:00
parent d5c036e6f7
commit 73cbe43397
35 changed files with 1215 additions and 206 deletions
@@ -1,10 +1,11 @@
# Create Initial Structure
## Initial data:
- Problem description: `@_docs/00_problem/problem_description.md`.
- Input data: `@_docs/00_problem/input_data`. It is for reference only, but serves as an example of the real data.
- Restrictions: `@_docs/00_problem/restrictions.md`.
- Acceptance criteria: `@_docs/00_problem/acceptance_criteria.md`.
- Security approach: `@_docs/00_problem/security_approach.md`.
- Full Solution Description: `@_docs/01_solution/solution.md`
- Components with Features specifications: `@_docs/02_components`
@@ -13,24 +14,31 @@
## Task
- Read carefully all the component specs and features in the components folder: `@_docs/02_components`
- Investigate on the internet what the best ways and tools are to implement the components and their features
- Make a plan for creating the initial structure:
- DTOs
- component's interfaces
- empty implementations
- helpers - empty implementations or interfaces
- Add .gitignore appropriate for the project's language/framework
- Add .env.example with required environment variables
- Add CI/CD skeleton (GitHub Actions, GitLab CI, or appropriate)
- Add database migration setup if applicable
- Add README.md describing the project per `@_docs/01_solution/solution.md`
- Create a separate folder for the integration tests (not a separate repo)
## Example
The structure should look roughly like this:
- .gitignore
- .env.example
- .github/workflows/ (or .gitlab-ci.yml)
- api
- components
- component1_folder
- component2_folder
- ...
- db
- migrations/
- helpers
- models
- tests
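The skeleton above could be generated with a small script; a minimal sketch, assuming the illustrative folder names from the example (the real component folders come from `@_docs/02_components`):

```python
from pathlib import Path

# Illustrative skeleton; folder names mirror the example structure above.
FOLDERS = [
    ".github/workflows",
    "api",
    "components/component1_folder",
    "components/component2_folder",
    "db/migrations",
    "helpers",
    "models",
    "tests",
]

def scaffold(root: str) -> None:
    """Create the empty project skeleton under `root`."""
    root_path = Path(root)
    for folder in FOLDERS:
        (root_path / folder).mkdir(parents=True, exist_ok=True)
    # Top-level files start empty; real content is filled in by later phases.
    for name in (".gitignore", ".env.example", "README.md"):
        (root_path / name).touch()

if __name__ == "__main__":
    scaffold("my_project")
```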
@@ -1,4 +1,4 @@
# Implement Component and Features by Spec
## Input parameter
`component_folder`
@@ -7,7 +7,8 @@
- Problem description: `@_docs/00_problem/problem_description.md`.
- Input data: `@_docs/00_problem/input_data`. It is for reference only, but serves as an example of the real data.
- Restrictions: `@_docs/00_problem/restrictions.md`.
- Acceptance criteria: `@_docs/00_problem/acceptance_criteria.md`.
- Security approach: `@_docs/00_problem/security_approach.md`.
- Full Solution Description: `@_docs/01_solution/solution.md`
## Role
@@ -16,16 +17,19 @@
## Task
- Read carefully initial data and component spec in the component_folder: `@_docs/02_components/[##]_[component_name]/[##]._component_[component_name]`
- Read carefully all the component features in the component_folder: `@_docs/02_components/[##]_[component_name]/[##].[##]_feature_[feature_name]`
- Investigate on the internet what the best ways and tools are to implement the component and its features
- During the investigation it is possible that the solutions found require an architectural reorganization of the features. That is fine: propose it, and if the user agrees, include the reorganization in the build-feature plan. It is also possible that an interface needs to be changed, removed, or that a new one needs to be added. That is fine too.
- Analyze the existing codebase and get full context for the component's implementation
- Make sure each feature connects and communicates properly with other features and existing code
- If the component depends on another one, create a temporary mock for the dependency
- For each feature:
- Implement the feature
- Implement error handling per defined strategy
- Implement logging per defined strategy
- Implement all unit tests from the test-case descriptions, and add test-result checks to the plan steps
- Implement all integration tests for the feature, and add test-result checks to the plan steps. Analyze the existing tests and decide whether to create a new one or extend an existing one
- Add a description of all the component's integration tests to the implementation plan, and add test-result checks to the plan steps
- After the component is complete, replace mocks with real implementations (mock cleanup)
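The temporary-mock step above can be sketched as follows; a minimal example in Python, where `PaymentGateway`, `CheckoutComponent`, and all names are hypothetical, not from the specs. The point is dependency injection: the component takes the dependency's interface, so the mock can later be swapped for the real implementation during mock cleanup:

```python
from typing import Protocol

class PaymentGateway(Protocol):
    """Hypothetical interface of a not-yet-implemented dependency component."""
    def charge(self, amount_cents: int) -> str: ...

class MockPaymentGateway:
    """Temporary stand-in; records calls and returns a deterministic fake id."""
    def __init__(self) -> None:
        self.charges: list[int] = []

    def charge(self, amount_cents: int) -> str:
        self.charges.append(amount_cents)
        return "mock-txn-001"

class CheckoutComponent:
    """Component under implementation; the dependency is injected, so the
    mock is replaced by the real gateway without touching this code."""
    def __init__(self, gateway: PaymentGateway) -> None:
        self._gateway = gateway

    def checkout(self, amount_cents: int) -> str:
        return self._gateway.charge(amount_cents)
```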
## Notes
- Ask as many questions as needed; it should be completely clear how to implement each feature
@@ -0,0 +1,39 @@
# Code Review
## Initial data:
- Problem description: `@_docs/00_problem/problem_description.md`.
- Acceptance criteria: `@_docs/00_problem/acceptance_criteria.md`.
- Security approach: `@_docs/00_problem/security_approach.md`.
- Full Solution Description: `@_docs/01_solution/solution.md`
- Components: `@_docs/02_components`
## Role
You are a senior software engineer performing a code review
## Task
- Review implemented code against component specifications
- Check code quality: readability, maintainability, SOLID principles
- Check error handling consistency
- Check logging implementation
- Check security requirements are met
- Check test coverage is adequate
- Identify code smells and technical debt
## Output
### Issues Found
For each issue:
- File/Location
- Issue type (Bug/Security/Performance/Style/Debt)
- Description
- Suggested fix
- Priority (High/Medium/Low)
### Summary
- Total issues by type
- Blocking issues that must be fixed
- Recommended improvements
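The issue format above maps naturally onto a record type; a minimal sketch (field names and sample values are illustrative, not part of the spec) that also produces the "total issues by type" summary:

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class ReviewIssue:
    # Fields mirror the "Issues Found" format above.
    location: str       # File/Location
    issue_type: str     # Bug / Security / Performance / Style / Debt
    description: str
    suggested_fix: str
    priority: str       # High / Medium / Low

def total_by_type(issues: list[ReviewIssue]) -> dict[str, int]:
    """Summary: total issues grouped by issue type."""
    return dict(Counter(issue.issue_type for issue in issues))
```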
## Notes
- Can also use Cursor's built-in review feature
- Focus on critical issues first
@@ -0,0 +1,42 @@
# CI/CD Setup
## Initial data:
- Problem description: `@_docs/00_problem/problem_description.md`.
- Restrictions: `@_docs/00_problem/restrictions.md`.
- Full Solution Description: `@_docs/01_solution/solution.md`
- Components: `@_docs/02_components`
## Role
You are a DevOps engineer
## Task
- Review project structure and dependencies
- Configure CI/CD pipeline with stages:
- Build
- Lint
- Unit tests
- Integration tests
- Security scan (if applicable)
- Deploy to staging (if applicable)
- Configure environment variables handling
- Set up test reporting
- Recommend branch protection rules
## Output
### Pipeline Configuration
- Pipeline file(s) created/updated
- Stages description
- Triggers (on push, PR, etc.)
### Environment Setup
- Required secrets/variables
- Environment-specific configs
### Deployment Strategy
- Staging deployment steps
- Production deployment steps (if applicable)
## Notes
- Use project-appropriate CI/CD tool (GitHub Actions, GitLab CI, Azure DevOps, etc.)
- Keep the pipeline fast; parallelize where possible
@@ -1,4 +1,4 @@
# Implement Tests by Spec
## Initial data:
- Problem description: `@_docs/00_problem/problem_description.md`.
@@ -13,14 +13,22 @@
## Task
- Read carefully all the initial data and understand whole system goals
- Check that a separate folder for tests exists (it should have been generated by @3.05_implement_initial_structure.md)
- Set up Docker environment for testing:
- Create docker-compose.yml for test environment
- Configure test database container
- Configure application container
- For each test description:
- Prepare all the data necessary for testing, or check that it already exists
- Check the existing integration tests, and if a similar test already exists, update it
- Implement the test by specification
- Implement test data management:
- Setup fixtures/factories
- Teardown/cleanup procedures
- Run system and integration tests in Docker containers
- If tests fail, fix all the problems until they pass. If one or more tests fail due to data missing from the user, an API, or another system, request it from the developer.
- Repeat the test cycle, iteratively fixing the bugs found, until no tests fail. Ask the user for additional information if something new comes up
- Ensure tests run in CI pipeline
- Compose the final test results in a CSV file with the following format:
- Test filename
- Execution time
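The final results CSV could be produced like this; a minimal sketch where the sample rows and the `status` column are assumptions beyond the two columns listed above:

```python
import csv

# Hypothetical result rows; the spec requires at least the test filename
# and execution time. The `status` column is an added assumption.
RESULTS = [
    {"test_filename": "test_checkout.py", "execution_time_s": 0.42, "status": "passed"},
    {"test_filename": "test_inventory.py", "execution_time_s": 1.31, "status": "passed"},
]

def write_results_csv(path: str) -> None:
    """Write the final test-results CSV with a header row."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["test_filename", "execution_time_s", "status"]
        )
        writer.writeheader()
        writer.writerows(RESULTS)
```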
@@ -28,3 +36,4 @@
## Notes
- Ask as many questions as needed; it should be completely clear how to implement each feature