Refine coding standards and testing guidelines. Updated coderule.mdc to emphasize readability, meaningful comments, and scope discipline. Adjusted testing.mdc to set a 75% coverage threshold for business logic and clarified test data requirements. Enhanced tracker.mdc with a mechanism for handling Jira connection issues and added completeness audit steps in research skills.

Author: Oleksandr Bezdieniezhnykh
Date: 2026-04-17 20:28:48 +03:00
commit 0b3bb2fc55
parent 57ff6dcd22
17 changed files with 275 additions and 90 deletions
@@ -32,3 +32,17 @@
6. Applicable scenarios
7. Team capability requirements
8. Migration difficulty
## Decomposition Completeness Probes (Completeness Audit Reference)
These probes are used during Step 1's Decomposition Completeness Audit. After generating sub-questions, test each probe against the current decomposition. If a probe reveals an uncovered area, add a sub-question for it.
| Probe | What it catches |
|-------|-----------------|
| **What does this cost — in money, time, resources, or trade-offs?** | Budget, pricing, licensing, tax, opportunity cost, maintenance burden |
| **What are the hard constraints — physical, legal, regulatory, environmental?** | Regulations, certifications, spectrum/frequency rules, export controls, physics limits, IP restrictions |
| **What are the dependencies and assumptions that could break?** | Supply chain, vendor lock-in, API stability, single points of failure, standards evolution |
| **What does the operating environment actually look like?** | Terrain, weather, connectivity, infrastructure, power, latency, user skill level |
| **What failure modes exist and what happens when they trigger?** | Degraded operation, fallback, safety margins, blast radius, recovery time |
| **What do practitioners who solved similar problems say matters most?** | Field-tested priorities that don't appear in specs or papers |
| **What changes over time — and what looks stable now but isn't?** | Technology roadmaps, regulatory shifts, deprecation risk, scaling effects |
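The audit loop the table supports can be sketched in code. This is a minimal, hypothetical illustration, not part of the commit: the names (`PROBES`, `covers`, `audit_decomposition`) and the keyword-based coverage check are invented for demonstration; a real audit is a judgment call made by the researcher, not a string match.

```python
# Hypothetical sketch of the Decomposition Completeness Audit loop.
# A subset of the probe table, keyed by the area each probe targets.
PROBES = {
    "cost": "What does this cost -- in money, time, resources, or trade-offs?",
    "constraints": "What are the hard constraints -- physical, legal, regulatory, environmental?",
    "dependencies": "What are the dependencies and assumptions that could break?",
}

def covers(sub_questions, probe_key):
    """Placeholder coverage check: does any sub-question mention the probe's area?

    In practice this judgment is made by a human (or an LLM pass), not by
    naive substring matching.
    """
    return any(probe_key in q.lower() for q in sub_questions)

def audit_decomposition(sub_questions):
    """Return the probe questions not yet covered by the decomposition.

    Each returned question should become a new sub-question (per the
    guideline above: an uncovered probe means an uncovered area).
    """
    return [q for key, q in PROBES.items() if not covers(sub_questions, key)]
```

For example, a decomposition whose only sub-question mentions cost would come back with the constraints and dependencies probes flagged as gaps, prompting two new sub-questions before research proceeds.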