Phase 1: Discovery

Role: Principal software architect
Goal: Analyze existing code and produce RUN_DIR/list-of-changes.md
Constraints: Document what exists, identify what needs to change. No code changes.

Skip condition (Targeted mode): If COMPONENTS_DIR and SOLUTION_DIR already contain documentation for the target area, skip to Phase 2. Ask user to confirm skip.

Mode Branch

Determine the input mode set during Context Resolution (see SKILL.md):

  • Guided mode: input file provided → start with 1g below
  • Automatic mode: no input file → start with 1a below

Guided Mode

1g. Read and Validate Input File

  1. Read the provided input file (e.g., list-of-changes.md from the autopilot testability revision step or user-provided file)
  2. Extract file paths, problem descriptions, and proposed changes from each entry
  3. For each entry, verify against actual codebase:
    • Referenced files exist
    • Described problems are accurate (read the code, confirm the issue)
    • Proposed changes are feasible
  4. Flag any entries that reference nonexistent files or describe inaccurate problems — ASK user
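The verification in steps 3–4 can be partially automated. A minimal sketch, assuming entries reference files in backticks (the actual input-file format may differ and takes precedence):

```python
import re
from pathlib import Path

def extract_paths(entry_text: str) -> list[str]:
    """Pull backtick-quoted file paths out of an input-file entry.

    Assumes files are referenced as `path/to/file.py`; adjust the
    pattern to the real entry format.
    """
    return re.findall(r"`([\w./-]+\.\w+)`", entry_text)

def validate_entries(entry_text: str, repo_root: Path) -> list[str]:
    """Return referenced paths that do not exist in the codebase.

    Anything returned here must be flagged to the user, not silently
    dropped or fixed.
    """
    missing = []
    for rel in extract_paths(entry_text):
        if not (repo_root / rel).exists():
            missing.append(rel)
    return missing
```

Accuracy of the described problems still requires reading the code; only the existence check is mechanical.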

1h. Scoped Component Analysis

For each file/area referenced in the input file:

  1. Analyze the specific modules and their immediate dependencies
  2. Document component structure, interfaces, and coupling points relevant to the proposed changes
  3. Identify additional issues not in the input file but discovered during analysis of the same areas

Write per-component to RUN_DIR/discovery/components/[##]_[name].md (same format as automatic mode, but scoped to affected areas only).

1i. Produce List of Changes

  1. Start from the validated input file entries
  2. Enrich each entry with:
    • Exact file paths confirmed from code
    • Risk assessment (low/medium/high)
    • Dependencies between changes
  3. Add any additional issues discovered during scoped analysis (1h)
  4. Write RUN_DIR/list-of-changes.md using templates/list-of-changes.md format
    • Set Mode: guided
    • Set Source: path to the original input file

Skip to Save action below.


Automatic Mode

1a. Document Components

For each component in the codebase:

  1. Analyze project structure, directories, files
  2. Go file by file, analyze each method
  3. Analyze connections between components

Write per component to RUN_DIR/discovery/components/[##]_[name].md:

  • Purpose and architectural patterns
  • Mermaid diagrams for logic flows
  • API reference table (name, description, input, output)
  • Implementation details: algorithmic complexity, state management, dependencies
  • Caveats, edge cases, known limitations
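The [##]_[name].md filenames above can be generated deterministically. A small sketch — the zero-padding matches the pattern in the path, but the slug rules are an assumption:

```python
import re

def component_doc_name(index: int, name: str) -> str:
    """Build the [##]_[name].md filename for RUN_DIR/discovery/components/.

    Zero-pads the index to two digits and slugifies the component name
    (lowercase, runs of non-alphanumerics collapsed to underscores).
    """
    slug = re.sub(r"[^a-z0-9]+", "_", name.lower()).strip("_")
    return f"{index:02d}_{slug}.md"
```

Deterministic names keep the component docs sorted in analysis order across reruns.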

1b. Synthesize Solution & Flows

  1. Review all generated component documentation
  2. Synthesize into a cohesive solution description
  3. Create flow diagrams showing component interactions

Write:

  • RUN_DIR/discovery/solution.md — product description, component overview, interaction diagram
  • RUN_DIR/discovery/system_flows.md — Mermaid flowcharts per major use case

Also copy to project standard locations:

  • SOLUTION_DIR/solution.md
  • DOCUMENT_DIR/system_flows.md
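The copy step can be sketched as follows (directory names come from the skill's context variables; nothing else is assumed):

```python
import shutil
from pathlib import Path

def publish_artifacts(run_dir: Path, solution_dir: Path, document_dir: Path) -> None:
    """Copy discovery outputs to the project's standard documentation locations."""
    solution_dir.mkdir(parents=True, exist_ok=True)
    document_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(run_dir / "discovery" / "solution.md",
                 solution_dir / "solution.md")
    shutil.copy2(run_dir / "discovery" / "system_flows.md",
                 document_dir / "system_flows.md")
```

copy2 preserves timestamps, so the published copies reflect when discovery actually ran.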

1c. Produce List of Changes

From the component analysis and solution synthesis, identify all issues that need refactoring:

  1. Hardcoded values (paths, config, magic numbers)
  2. Tight coupling between components
  3. Missing dependency injection / non-configurable parameters
  4. Global mutable state
  5. Code duplication
  6. Missing error handling
  7. Testability blockers (code that cannot be exercised in isolation)
  8. Security concerns
  9. Performance bottlenecks

Write RUN_DIR/list-of-changes.md using templates/list-of-changes.md format:

  • Set Mode: automatic
  • Set Source: self-discovered

Save action (both modes)

Write all discovery artifacts to RUN_DIR.

Self-verification:

  • Every referenced file in list-of-changes.md exists in the codebase
  • Each change entry has file paths, problem, change description, risk, and dependencies
  • Component documentation covers all areas affected by the changes
  • In guided mode: all input file entries are validated or flagged
  • In automatic mode: solution description covers all components
  • Mermaid diagrams are syntactically correct
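The last check can be partially automated by verifying that every fenced `mermaid` block is closed and opens with a known diagram keyword — a shallow sanity check, not a full Mermaid parser:

```python
MERMAID_KEYWORDS = ("flowchart", "graph", "sequenceDiagram",
                    "classDiagram", "stateDiagram")

def check_mermaid_blocks(markdown_text: str) -> list[str]:
    """Return problems found in fenced mermaid blocks."""
    problems = []
    lines = markdown_text.splitlines()
    i = 0
    while i < len(lines):
        if lines[i].strip() == "```mermaid":
            j = i + 1  # scan for the closing fence
            while j < len(lines) and lines[j].strip() != "```":
                j += 1
            if j == len(lines):
                problems.append(f"unclosed mermaid fence at line {i + 1}")
                break
            body = [l for l in lines[i + 1 : j] if l.strip()]
            if not body or not body[0].strip().startswith(MERMAID_KEYWORDS):
                problems.append(f"unknown diagram type at line {i + 1}")
            i = j
        i += 1
    return problems
```

Diagrams that pass this check can still be semantically wrong; rendering them remains the authoritative test.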

BLOCKING: Present discovery summary and list-of-changes.md to user. Do NOT proceed until user confirms documentation accuracy and change list completeness.