Lint

Scan the wiki for gaps, contradictions, staleness, and broken links.

Usage

CLI

vcro lint

Slash command

/lint

No arguments. Lint runs over the entire wiki.

How it works

Two stages. First, a mechanical pre-pass script scans every entity file and produces a structured JSON scan file. Then four sub-skills read the scan and classify findings.

Stage 1: Mechanical scan

python3 scripts/lint_scan.py --wiki store/wiki --raw store/raw \
  --out store/lint/2026-04-09_scan.json

Pure Python, no LLM. It counts dimensions per entity, checks source IDs against store/raw/, detects broken [[slug]] links, and flags entities whose last_compiled is older than 90 days.
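A minimal sketch of two of those per-entity checks, broken [[slug]] links and the 90-day staleness flag. The field names and the scan-record shape here are assumptions for illustration, not the script's actual schema.

```python
import re
from datetime import date, timedelta

WIKI_LINK = re.compile(r"\[\[([a-z0-9_-]+)\]\]")  # matches [[slug]] links
STALE_AFTER = timedelta(days=90)

def scan_entity(slug, body, last_compiled, known_slugs, today=None):
    """Return one scan record: outbound links with no target entity,
    and whether last_compiled is past the 90-day staleness window."""
    today = today or date.today()
    broken = [t for t in WIKI_LINK.findall(body) if t not in known_slugs]
    return {
        "slug": slug,
        "broken_links": broken,
        "stale": (today - last_compiled) > STALE_AFTER,
    }
```

A record like this would be one element of the per-entity array in the scan JSON; the four sub-skills then read these records rather than re-parsing the wiki.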

Stage 2: Four sub-skills

Gaps

Which entities are missing dimensions that matter for scoring? Which cohorts lack dimension 15 (biospecimen retention) or dimension 20 (collection protocol)?

Consistency

Do two entities claim conflicting facts about the same cohort? Does an investigator slug point to the wrong institution?

Staleness

Which entities were last compiled more than 90 days ago? Which source papers have been updated or retracted since compilation?

Connections

Which entities should be cross-linked but are not? Which back-references are missing? Uses the graph layer if available.
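The missing back-reference check can be sketched as a pass over the link graph, assuming it is available as a slug-to-targets mapping (an assumed representation, not the graph layer's real interface):

```python
def missing_backrefs(links):
    """links: slug -> set of outbound [[slug]] targets. Return (a, b)
    pairs where a links to b but b never links back to a."""
    return sorted(
        (a, b)
        for a, targets in links.items()
        for b in targets
        if a not in links.get(b, set())
    )
```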

Findings are not reports

Lint findings feed back into compile. They are the orchestrator's queue, not a user-facing dashboard. Each finding says: this entity has this gap, here is the evidence, here is the fix. The user decides whether to re-compile.
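A hypothetical finding record, mirroring the gap / evidence / fix structure described above. The key names and values are illustrative; the actual on-disk schema may differ.

```python
# Illustrative only: one finding as the compile queue might consume it.
finding = {
    "entity": "cohort-adni",
    "kind": "gap",
    "detail": "missing dimension 15 (biospecimen retention)",
    "evidence": "no dimension-15 content found for this entity",
    "fix": "re-compile the entity from a source covering retention",
}
```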

What gets written to disk

store/lint/
  2026-04-09_scan.json                   # Mechanical pre-pass
  2026-04-09_report/
    gaps.md                              # Dimension coverage gaps
    consistency.md                       # Conflicting claims
    staleness.md                         # Stale entities
    connections.md                       # Missing cross-links
    report.md                            # Aggregated summary

Example

$ vcro lint

Lint workflow. Scanned 328 entities.
14 gaps: 8 cohorts missing dimension 15, 3 missing dimension 20,
3 investigators with no linked cohorts. 2 consistency conflicts:
ADNI institution slug referenced two ways. 0 staleness flags.
5 missing connections from graph analysis.
Report at store/lint/2026-04-09_report/report.md.