diff --git a/CURRENT_TASK.md b/CURRENT_TASK.md
index 5be3df21..aa7c6f44 100644
--- a/CURRENT_TASK.md
+++ b/CURRENT_TASK.md
@@ -39,6 +39,9 @@ Pattern8 static box filter を single_planner から撤去し、planner/facts
 **2025-12-29: Phase 29ak P4 COMPLETE (remove Pattern1 guard in single_planner)**
 Removed the Pattern1 guard from single_planner and unified it into the planner/facts-side SSOT plus fallback suppression.
 
+**2025-12-29: Phase 29ak P5 COMPLETE (planner candidate ctx gate)**
+Consolidated Pattern1/8 candidate suppression on the planner side and removed the Pattern1 suppression from single_planner.
+
 **2025-12-29: Phase 29aj P10 COMPLETE (single_planner unified shape)**
 Unified single_planner into the common planner-first → extractor-fallback shape for all patterns (behavior unchanged).
 
diff --git a/docs/development/current/main/10-Now.md b/docs/development/current/main/10-Now.md
index 6a7faf78..468d94a3 100644
--- a/docs/development/current/main/10-Now.md
+++ b/docs/development/current/main/10-Now.md
@@ -2,10 +2,15 @@
 
 ## Current Focus: Phase 29ak (PlanRuleOrder + PlannerContext)
 
-Next: Phase 29ak P5 (TBD)
+Next: Phase 29ak P6 (TBD)
 Operating rule: do not run phase143_* via the integration filter (JoinIR regressions use the phase29ae pack only)
 Operating rule: phase286_pattern9_* uses the legacy pack (SKIP)
 
+**2025-12-29: Phase 29ak P5 complete** ✅
+- Goal: make the planner's candidate gate the SSOT and remove the Pattern1 suppression from single_planner (behavior unchanged)
+- Implementation: `src/mir/builder/control_flow/plan/planner/build.rs` / `src/mir/builder/control_flow/plan/planner/outcome.rs` / `src/mir/builder/control_flow/plan/single_planner/rules.rs`
+- Verification: `cargo build --release` / `./tools/smokes/v2/run.sh --profile quick` / `./tools/smokes/v2/profiles/integration/joinir/phase29ae_regression_pack_vm.sh` PASS
+
 **2025-12-29: Phase 29ak P4 complete** ✅
 - Goal: remove the Pattern1 guard from single_planner (behavior unchanged)
 - Implementation: `src/mir/builder/control_flow/plan/single_planner/rules.rs`
diff --git a/docs/development/current/main/30-Backlog.md b/docs/development/current/main/30-Backlog.md
index 7f2d4182..4bea4d99 100644
--- a/docs/development/current/main/30-Backlog.md
+++ b/docs/development/current/main/30-Backlog.md
@@ -26,8 +26,8 @@ Related:
 
 - **Phase 29ak (candidate): PlanRuleOrder SSOT + PlannerContext plumbing**
   - Entry point: `docs/development/current/main/phases/phase-29ak/README.md`
-  - Status: P0/P1/P2/P3/P4 ✅ complete
-  - Next: Phase 29ak P5 (TBD)
+  - Status: P0/P1/P2/P3/P4/P5 ✅ complete
+  - Next: Phase 29ak P6 (TBD)
 
 - **Phase 29ai (candidate): Plan/Frag single-planner (Facts SSOT)**
   - Entry point: `docs/development/current/main/phases/phase-29ai/README.md`
diff --git a/docs/development/current/main/investigations/phase-29ak-domainplan-coreplan-consult.md b/docs/development/current/main/investigations/phase-29ak-domainplan-coreplan-consult.md
new file mode 100644
index 00000000..a87943e4
--- /dev/null
+++ b/docs/development/current/main/investigations/phase-29ak-domainplan-coreplan-consult.md
@@ -0,0 +1,139 @@
+# DomainPlan / CorePlan Design Consult (for ChatGPT Pro)
+
+Status: Draft (Phase 29ak, consult-only)
+Goal: Ask for design feedback without requiring source code reading.
+
+## 0. What is Hakorune (1-paragraph summary)
+
+Hakorune is a compiler pipeline that lowers `.hako` (Nyash) source into MIR and executes it via VM/LLVM backends. A large, tricky part is the **JoinIR line**: it recognizes “loop/if-shaped” regions and rewrites/lowers them through a structured pipeline (Plan → Frag → MIR merge) with strict SSOT boundaries so we can keep behavior stable while refactoring aggressively.
+
+## 1. The current JoinIR/Plan/Frag architecture (SSOT view)
+
+We are migrating from “pattern-name routing at the entry” to a **single pipeline**:
+
+```
+AST
+  → Facts (observations + derived facts)
+  → Normalize (canonicalize facts; pure transform)
+  → Planner (candidate-set; 0/1/2+ → None/Some/Freeze)
+  → DomainPlan (pattern-specific recipe vocabulary)
+  → Normalizer (DomainPlan → CorePlan; SSOT conversion)
+  → CorePlan (fixed vocabulary structure nodes only)
+  → Lowerer/Emit (CorePlan → Frag; generation only)
+  → Merge (Frag + Boundary → MIR blocks; contract checks)
+```
+
+Key SSOT docs:
+- Plan/Frag SSOT registry: `docs/development/current/main/design/planfrag-ssot-registry.md`
+- Freeze taxonomy: `docs/development/current/main/design/planfrag-freeze-taxonomy.md`
+- JoinIR Plan/Frag responsibilities: `docs/development/current/main/design/joinir-plan-frag-ssot.md`
+- EdgeCFG/Frag overview: `docs/development/current/main/design/edgecfg-fragments.md`
+
+## 2. Why this refactor exists (design constraints)
+
+We want to remove “pattern-name entry dispatch” and avoid “by-name hacks” while keeping behavior stable:
+
+- Default behavior must not change (release builds are the baseline).
+- Fail-Fast is allowed only on clear contract violations; silent fallback is forbidden.
+- No “hardcode to pass smoke” (e.g., `if pattern == "PatternX" { ... }` in arbitrary places).
+- Avoid new permanent env vars; use existing strict/dev gates if needed.
+- Observability should be stable, tagged, and SSOT’d (strict/dev only).
+
+## 3. What `DomainPlan` and `CorePlan` are today
+
+### 3.1 DomainPlan (pattern-specific recipe vocabulary)
+
+`DomainPlan` is the “recipe” layer that still knows pattern semantics (Pattern1–9). Examples:
+
+- `Pattern1SimpleWhile`
+- `Pattern2Break`
+- `Pattern3IfPhi`
+- `Pattern4Continue`
+- `Pattern5InfiniteEarlyExit` (loop(true) + early exit)
+- `ScanWithInit` / `SplitScan` / `BoolPredicateScan`
+- `Pattern9AccumConstLoop`
+
+DomainPlan is consumed by a SSOT `PlanNormalizer` that emits `CorePlan`.
+
+### 3.2 CorePlan (fixed vocabulary)
+
+`CorePlan` is a stable, low-level “structure node” vocabulary meant to be composed, verified, and emitted without re-parsing AST or re-analyzing CFG. The intent is that **composition lives here** long-term.
+
+## 4. Where we are today (implementation progress, without code details)
+
+We have been moving pattern extraction SSOT into the plan layer and shifting routing to planner-first:
+
+- Facts/Planner-first are implemented conservatively for Pattern1–9 (subsets) and adopted only when the planner output variant matches the expected rule.
+- `single_planner` still exists as the order SSOT, but it has been simplified:
+  - rule order and pattern names are SSOT’d (`rule_order.rs`)
+  - special-case guards/filters (Pattern1 guard, Pattern8 static-box filter) were moved into planner/facts and removed from `single_planner`
+  - `PlannerContext` exists to pass `pattern_kind` / `in_static_box` / `debug` to planner-side gating
+
+JoinIR regression gate (SSOT):
+- `./tools/smokes/v2/profiles/integration/joinir/phase29ae_regression_pack_vm.sh`
+
+Legacy integration packs that are intentionally SKIP’d (SSOT) exist for cases whose environment assumptions no longer match (e.g. LoopBuilder removal / plugins disabled).
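+
+For concreteness, the planner-side gating described above can be pictured roughly as follows. This is an editorial sketch, not the actual planner code: the `PlannerContext` fields match the real struct, but the two helper functions and the reduced `LoopPatternKind` enum are illustrative stand-ins.
+
+```rust
+// Sketch only: a simplified restatement of the ctx gate, not the real implementation.
+enum LoopPatternKind {
+    Pattern1SimpleWhile,
+    Other, // stands in for every non-Pattern1 kind
+}
+
+struct PlannerContext {
+    pattern_kind: Option<LoopPatternKind>, // expected rule, when the caller knows it
+    in_static_box: bool,                   // static-box bodies must not take Pattern8
+    debug: bool,                           // strict/dev-only observability
+}
+
+// Pattern1 candidates are emitted only when the caller expects Pattern1 or has no expectation.
+fn ctx_allows_pattern1(ctx: &PlannerContext) -> bool {
+    matches!(ctx.pattern_kind, None | Some(LoopPatternKind::Pattern1SimpleWhile))
+}
+
+// Pattern8 candidates are suppressed inside static boxes.
+fn ctx_allows_pattern8(ctx: &PlannerContext) -> bool {
+    !ctx.in_static_box
+}
+```
+
+The point is that the gate consumes context, not pattern names at the entry: candidates that the ctx rules out are simply never pushed, and the legacy fallback behaves exactly as before.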
+
+## 5. The design question
+
+We believe the clean end-state is "compose with orthogonal parts" rather than "add more overlapping complete patterns".
+
+The uncertainty: **what should remain in DomainPlan**, and **what should be expressed as CorePlan composition**?
+
+## 6. Specific questions for ChatGPT Pro
+
+### Q1) Should DomainPlan remain long-term, or shrink into a migration-only layer?
+
+- A) Keep DomainPlan as a permanent public-ish vocabulary (pattern semantics live here forever)
+- B) Treat DomainPlan as a transient recipe layer, and over time represent most structures as CorePlan composition
+
+What end-state is the cleanest and most maintainable?
+
+### Q2) What belongs in DomainPlan vs CorePlan?
+
+We suspect:
+- DomainPlan should keep “semantic-heavy” plans (scan/split/predicate) because they encode algorithmic intent.
+- CorePlan should encode generic control structure (loop/if/join/exit/phi-ish edges) and be composable.
+
+Is that a good boundary? If not, what boundary is better?
+
+### Q3) How to make “non-overlapping” rules without pattern explosion?
+
+We currently use CandidateSet (0/1/2+ → None/Some/Freeze) and a strict Freeze taxonomy.
+
+What is the best practice to make rules “not overlap” in a compose-first system?
+- Make Facts more lossless?
+- Make Normalize stronger (canonicalize more forms)?
+- Prefer “component inference” (emit parts) over “whole-plan inference”?
+
+### Q4) How to keep observability stable while migrating?
+
+We keep strict/dev-only tagged logs and forbid silent fallbacks.
+
+What is the cleanest “observability contract” across Facts/Planner/Normalizer/Emit so that:
+- release logs don’t change
+- strict/dev can show stable tags
+- diagnostics remain local (no re-analysis in emit)
+
+### Q5) What minimal “final-form” invariants should be SSOT’d next?
+
+If we pick only 1–2 SSOT docs/invariants to harden next (without adding major features), what gives the most leverage?
+- “post-phi representation” invariants?
+- effect classification (pure/control/rc/observability) invariants?
+- unwind/ExitKind forward design?
+
+## 7. Constraints recap (so advice is actionable)
+
+- Behavior-preserving refactor is the priority.
+- No hardcoded by-name hacks.
+- Fail-Fast only on clear contract violations (strict/dev gates OK).
+- Avoid new env vars; prefer existing strict/dev mechanism.
+- SSOT-first: boundary docs + verification commands are first-class.
+
+## 8. What kind of answer is most useful
+
+Please answer with:
+- A recommended “clean end-state” (DomainPlan vs CorePlan) and why.
+- A short checklist of invariants to SSOT next.
+- A migration strategy that keeps behavior stable (how to gradually shrink DomainPlan or keep it).
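+
+## 9. Appendix: candidate-set boundary sketch (illustrative)
+
+So that the 0/1/2+ boundary from Q3 can be discussed without reading source, here is a minimal editorial sketch. `PlanCandidate` / `CandidateSet` follow the names in `planner/candidates.rs`, but the `Freeze` payload shown here is hypothetical (the real taxonomy lives in `planner/freeze.rs`); the finalize rule itself (0 → None, 1 → Some(plan), 2+ → Freeze) is the actual contract.
+
+```rust
+// Illustrative sketch only, not the planner implementation.
+// `DomainPlan` and `Freeze` stand in for the real types.
+#[derive(Debug)]
+enum DomainPlan {
+    Pattern1SimpleWhile, // real variants carry per-pattern payloads
+}
+
+#[derive(Debug)]
+struct Freeze {
+    overlapping_rules: Vec<&'static str>, // hypothetical payload
+}
+
+struct PlanCandidate {
+    plan: DomainPlan,
+    rule: &'static str, // e.g. "loop/pattern1_simplewhile"
+}
+
+#[derive(Default)]
+struct CandidateSet {
+    candidates: Vec<PlanCandidate>,
+}
+
+impl CandidateSet {
+    fn push(&mut self, candidate: PlanCandidate) {
+        self.candidates.push(candidate);
+    }
+
+    // SSOT boundary: 0 candidates → legacy fallback, 1 → the plan, 2+ → fail fast.
+    fn finalize(mut self) -> Result<Option<DomainPlan>, Freeze> {
+        match self.candidates.len() {
+            0 => Ok(None),
+            1 => Ok(self.candidates.pop().map(|c| c.plan)),
+            _ => Err(Freeze {
+                overlapping_rules: self.candidates.iter().map(|c| c.rule).collect(),
+            }),
+        }
+    }
+}
+```
+
+With this boundary, “non-overlapping” becomes an observable property: any overlap surfaces as a Freeze instead of an arbitrary pick.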
+
diff --git a/docs/development/current/main/phases/phase-29ak/P5-PLANNER-CANDIDATE-CTX-GATE-SSOT-INSTRUCTIONS.md b/docs/development/current/main/phases/phase-29ak/P5-PLANNER-CANDIDATE-CTX-GATE-SSOT-INSTRUCTIONS.md
new file mode 100644
index 00000000..fbc01ca5
--- /dev/null
+++ b/docs/development/current/main/phases/phase-29ak/P5-PLANNER-CANDIDATE-CTX-GATE-SSOT-INSTRUCTIONS.md
@@ -0,0 +1,65 @@
+# Phase 29ak P5: Planner candidate ctx gate SSOT
+
+Date: 2025-12-29
+Status: Ready for execution
+Scope: Make the planner's CandidateSet construction ctx-aware and thin out single_planner (behavior unchanged)
+Goal: Unify the ctx gate in the planner and minimize special cases on the fallback side
+
+## Objective
+
+- Consolidate Pattern1/8 candidate suppression on the planner side
+- Remove the Pattern1 fallback suppression from single_planner
+- Behavior is unchanged (we only stop producing the candidate; the legacy fallback runs as before)
+
+## Non-goals
+
+- Making CandidateSet ordering an SSOT
+- Removing the extractor fallback
+- Adding new env vars / new logs
+
+## Risk / Gotchas
+
+- Once the Pattern1 fallback suppression is removed, the Pattern1 extractor can mis-match nested loops (phase1883's Pattern6NestedLoopMinimal would be absorbed into the plan path). The Pattern1 extractor must bail out with `Ok(None)` on nested loops.
+
+## Implementation Steps
+
+### Step 1: Make build_plan_from_facts ctx-aware
+
+Update:
+- `src/mir/builder/control_flow/plan/planner/build.rs`
+- `src/mir/builder/control_flow/plan/planner/mod.rs`
+- `src/mir/builder/control_flow/plan/planner/outcome.rs`
+- `src/mir/builder/control_flow/plan/planner/context.rs`
+
+Notes:
+- Add `build_plan_from_facts_ctx(ctx, facts)`
+- The existing `build_plan_from_facts` delegates with a legacy ctx
+- Apply the ctx gate immediately before each candidate push
+
+### Step 2: Update the ctx-aware entry point in outcome
+
+Update:
+- `src/mir/builder/control_flow/plan/planner/outcome.rs`
+
+### Step 3: Remove the Pattern1 fallback suppression from single_planner
+
+Update:
+- `src/mir/builder/control_flow/plan/single_planner/rules.rs`
+
+### Step 4: Update docs / CURRENT_TASK
+
+Update:
+- `docs/development/current/main/phases/phase-29ak/README.md`
+- `docs/development/current/main/10-Now.md`
+- `docs/development/current/main/30-Backlog.md`
+- `CURRENT_TASK.md`
+
+## Verification
+
+- `cargo build --release`
+- `./tools/smokes/v2/run.sh --profile quick`
+- `./tools/smokes/v2/profiles/integration/joinir/phase29ae_regression_pack_vm.sh`
+
+## Commit
+
+- `git add -A && git commit -m "phase29ak(p5): ssot ctx gating in planner candidates"`
diff --git a/docs/development/current/main/phases/phase-29ak/README.md b/docs/development/current/main/phases/phase-29ak/README.md
index 3a257cae..4192632e 100644
--- a/docs/development/current/main/phases/phase-29ak/README.md
+++ b/docs/development/current/main/phases/phase-29ak/README.md
@@ -36,3 +36,11 @@ Goal: single_planner の「順序・名前・ガード」の SSOT を 1 箇所
 - Aim: unify the Pattern1 guard into the planner/facts-side SSOT
 - Done: removed the guard from single_planner while keeping the same contract on the fallback side
 - Verification: `cargo build --release` / `./tools/smokes/v2/run.sh --profile quick` / `./tools/smokes/v2/profiles/integration/joinir/phase29ae_regression_pack_vm.sh`
+
+## P5: Consolidate the ctx gate on the planner side (candidate suppression)
+
+- Instructions: `docs/development/current/main/phases/phase-29ak/P5-PLANNER-CANDIDATE-CTX-GATE-SSOT-INSTRUCTIONS.md`
+- Aim: make Pattern1/8 candidate suppression an SSOT in the planner's candidate construction
+- Done: centralized the ctx gate in build_plan_from_facts_ctx and removed the Pattern1 fallback suppression from single_planner
+- Note: the Pattern1 extractor bails out with `Ok(None)` on nested loops so that phase1883 (Pattern6NestedLoopMinimal) is not absorbed into the plan path
+- Verification: `cargo build --release` / `./tools/smokes/v2/run.sh --profile quick` / `./tools/smokes/v2/profiles/integration/joinir/phase29ae_regression_pack_vm.sh`
diff --git a/src/mir/builder/control_flow/plan/extractors/common_helpers.rs b/src/mir/builder/control_flow/plan/extractors/common_helpers.rs
index 1a091084..29d1dc59 100644
--- a/src/mir/builder/control_flow/plan/extractors/common_helpers.rs
+++ b/src/mir/builder/control_flow/plan/extractors/common_helpers.rs
@@ -88,12 +88,22 @@ pub(crate) fn count_control_flow(
             ASTNode::Return { .. } if detector.count_returns => {
                 counts.return_count += 1;
             }
-            ASTNode::Loop { .. } if depth > 0 => {
+            ASTNode::ScopeBox { body, .. } => {
+                for stmt in body {
+                    scan_node(stmt, counts, detector, depth);
+                }
+            }
+            ASTNode::Loop { body, .. }
+            | ASTNode::While { body, .. }
+            | ASTNode::ForRange { body, .. } => {
                 counts.has_nested_loop = true;
                 // Skip nested loop bodies if configured
                 if detector.skip_nested_control_flow {
                     return;
                 }
+                for stmt in body {
+                    scan_node(stmt, counts, detector, depth + 1);
+                }
             }
             ASTNode::If {
                 then_body,
@@ -358,6 +368,17 @@ mod tests {
         }
     }
 
+    fn make_nested_loop() -> ASTNode {
+        ASTNode::Loop {
+            condition: Box::new(ASTNode::Variable {
+                name: "cond".to_string(),
+                span: Span::unknown(),
+            }),
+            body: vec![make_break()],
+            span: Span::unknown(),
+        }
+    }
+
     #[test]
     fn test_count_control_flow_break() {
         let body = vec![make_break()];
@@ -417,6 +438,22 @@ mod tests {
         assert!(!has_control_flow_statement(&body));
     }
 
+    #[test]
+    fn test_count_control_flow_detects_nested_loop_at_top_level() {
+        let body = vec![make_nested_loop()];
+        let counts = count_control_flow(&body, ControlFlowDetector::default());
+        assert!(counts.has_nested_loop);
+    }
+
+    #[test]
+    fn test_has_control_flow_statement_break_in_scopebox() {
+        let body = vec![ASTNode::ScopeBox {
+            body: vec![make_break()],
+            span: Span::unknown(),
+        }];
+        assert!(has_control_flow_statement(&body));
+    }
+
     #[test]
     fn test_extract_loop_variable_success() {
         let condition = ASTNode::BinaryOp {
diff --git a/src/mir/builder/control_flow/plan/extractors/pattern1.rs b/src/mir/builder/control_flow/plan/extractors/pattern1.rs
index 37337952..915c6c43 100644
--- a/src/mir/builder/control_flow/plan/extractors/pattern1.rs
+++ b/src/mir/builder/control_flow/plan/extractors/pattern1.rs
@@ -17,7 +17,7 @@ pub(crate) struct Pattern1Parts {
 /// # Detection Criteria (hardened against mis-matches)
 ///
 /// 1. **Condition**: a comparison (<, <=, >, >=, ==, !=) whose left-hand side is a variable
-/// 2. **Body**: No break/continue/if-else-phi (return is allowed - it's not loop control flow)
+/// 2. **Body**: No break/continue/nested-loop/if-else-phi (return is allowed - it's not loop control flow)
 /// 3. **Step**: a simple increment/decrement pattern (i = i + 1, i = i - 1, etc.)
 ///
 /// # Four-Phase Validation
@@ -48,6 +48,18 @@ pub(crate) fn extract_simple_while_parts(
         return Ok(None);
     }
 
+    // Phase 29ak: Reject nested loops.
+    //
+    // Nested loops are Pattern6NestedLoopMinimal territory; letting Pattern1 match them
+    // causes routing to short-circuit before JoinIR can select the nested-loop lowerer.
+    let counts = super::common_helpers::count_control_flow(
+        body,
+        super::common_helpers::ControlFlowDetector::default(),
+    );
+    if counts.has_nested_loop {
+        return Ok(None);
+    }
+
     // Phase 286 P2.6: Reject if-else statements (Pattern3 territory)
     // Pattern1 allows simple if without else, but not if-else (which is Pattern3)
     if super::common_helpers::has_if_else_statement(body) {
diff --git a/src/mir/builder/control_flow/plan/planner/build.rs b/src/mir/builder/control_flow/plan/planner/build.rs
index 6b139360..6d9a63ac 100644
--- a/src/mir/builder/control_flow/plan/planner/build.rs
+++ b/src/mir/builder/control_flow/plan/planner/build.rs
@@ -7,6 +7,7 @@ use crate::ast::ASTNode;
 use crate::mir::builder::control_flow::plan::normalize::CanonicalLoopFacts;
 
 use super::candidates::{CandidateSet, PlanCandidate};
+use super::context::PlannerContext;
 use super::outcome::build_plan_with_facts;
 use super::Freeze;
 use crate::mir::builder::control_flow::plan::{
@@ -14,6 +15,7 @@ use crate::mir::builder::control_flow::plan::{
     Pattern4ContinuePlan, Pattern5InfiniteEarlyExitPlan, Pattern8BoolPredicateScanPlan,
     Pattern9AccumConstLoopPlan, ScanDirection, ScanWithInitPlan, SplitScanPlan,
 };
+use crate::mir::loop_pattern_detection::LoopPatternKind;
 
 /// Phase 29ai P0: External-ish SSOT entrypoint (skeleton)
 ///
@@ -27,6 +29,13 @@ pub(in crate::mir::builder) fn build_plan(
 
 pub(in crate::mir::builder) fn build_plan_from_facts(
     facts: CanonicalLoopFacts,
+) -> Result<Option<DomainPlan>, Freeze> {
+    build_plan_from_facts_ctx(&PlannerContext::default_for_legacy(), facts)
+}
+
+pub(in crate::mir::builder) fn build_plan_from_facts_ctx(
+    ctx: &PlannerContext,
+    facts: CanonicalLoopFacts,
 ) -> Result<Option<DomainPlan>, Freeze> {
     // Phase 29ai P3: CandidateSet-based boundary (SSOT)
     //
@@ -34,6 +43,12 @@ pub(in crate::mir::builder) fn build_plan_from_facts(
     // unreachable in normal execution today. We still implement the SSOT
     // boundary here so that future Facts work cannot drift.
 
+    let allow_pattern1 = match ctx.pattern_kind {
+        Some(LoopPatternKind::Pattern1SimpleWhile) | None => true,
+        Some(_) => false,
+    };
+    let allow_pattern8 = !ctx.in_static_box;
+
     let mut candidates = CandidateSet::new();
 
     if let Some(scan) = &facts.facts.scan_with_init {
@@ -133,18 +148,20 @@ pub(in crate::mir::builder) fn build_plan_from_facts(
         });
     }
 
-    if let Some(pattern8) = &facts.facts.pattern8_bool_predicate_scan {
-        candidates.push(PlanCandidate {
-            plan: DomainPlan::Pattern8BoolPredicateScan(Pattern8BoolPredicateScanPlan {
-                loop_var: pattern8.loop_var.clone(),
-                haystack: pattern8.haystack.clone(),
-                predicate_receiver: pattern8.predicate_receiver.clone(),
-                predicate_method: pattern8.predicate_method.clone(),
-                condition: pattern8.condition.clone(),
-                step_lit: pattern8.step_lit,
-            }),
-            rule: "loop/pattern8_bool_predicate_scan",
-        });
+    if allow_pattern8 {
+        if let Some(pattern8) = &facts.facts.pattern8_bool_predicate_scan {
+            candidates.push(PlanCandidate {
+                plan: DomainPlan::Pattern8BoolPredicateScan(Pattern8BoolPredicateScanPlan {
+                    loop_var: pattern8.loop_var.clone(),
+                    haystack: pattern8.haystack.clone(),
+                    predicate_receiver: pattern8.predicate_receiver.clone(),
+                    predicate_method: pattern8.predicate_method.clone(),
+                    condition: pattern8.condition.clone(),
+                    step_lit: pattern8.step_lit,
+                }),
+                rule: "loop/pattern8_bool_predicate_scan",
+            });
+        }
     }
 
     if let Some(pattern9) = &facts.facts.pattern9_accum_const_loop {
@@ -160,15 +177,17 @@ pub(in crate::mir::builder) fn build_plan_from_facts(
         });
     }
 
-    if let Some(pattern1) = &facts.facts.pattern1_simplewhile {
-        candidates.push(PlanCandidate {
-            plan: DomainPlan::Pattern1SimpleWhile(Pattern1SimpleWhilePlan {
-                loop_var: pattern1.loop_var.clone(),
-                condition: pattern1.condition.clone(),
-                loop_increment: pattern1.loop_increment.clone(),
-            }),
-            rule: "loop/pattern1_simplewhile",
-        });
+    if allow_pattern1 {
+        if let Some(pattern1) = &facts.facts.pattern1_simplewhile {
+            candidates.push(PlanCandidate {
+                plan: DomainPlan::Pattern1SimpleWhile(Pattern1SimpleWhilePlan {
+                    loop_var: pattern1.loop_var.clone(),
+                    condition: pattern1.condition.clone(),
+                    loop_increment: pattern1.loop_increment.clone(),
+                }),
+                rule: "loop/pattern1_simplewhile",
+            });
+        }
     }
 
     candidates.finalize()
diff --git a/src/mir/builder/control_flow/plan/planner/context.rs b/src/mir/builder/control_flow/plan/planner/context.rs
index ec9e96ac..6a57f848 100644
--- a/src/mir/builder/control_flow/plan/planner/context.rs
+++ b/src/mir/builder/control_flow/plan/planner/context.rs
@@ -6,3 +6,13 @@ pub(in crate::mir::builder) struct PlannerContext {
     pub in_static_box: bool,
     pub debug: bool,
 }
+
+impl PlannerContext {
+    pub(in crate::mir::builder) fn default_for_legacy() -> Self {
+        Self {
+            pattern_kind: None,
+            in_static_box: false,
+            debug: false,
+        }
+    }
+}
diff --git a/src/mir/builder/control_flow/plan/planner/mod.rs b/src/mir/builder/control_flow/plan/planner/mod.rs
index 6224b344..0c2d39b0 100644
--- a/src/mir/builder/control_flow/plan/planner/mod.rs
+++ b/src/mir/builder/control_flow/plan/planner/mod.rs
@@ -11,7 +11,7 @@ pub(in crate::mir::builder) mod context;
 pub(in crate::mir::builder) mod freeze;
 pub(in crate::mir::builder) mod outcome;
 
-pub(in crate::mir::builder) use build::build_plan;
+pub(in crate::mir::builder) use build::{build_plan, build_plan_from_facts_ctx};
 pub(in crate::mir::builder) use context::PlannerContext;
 pub(in crate::mir::builder) use freeze::Freeze;
 pub(in crate::mir::builder) use outcome::{
diff --git a/src/mir/builder/control_flow/plan/planner/outcome.rs b/src/mir/builder/control_flow/plan/planner/outcome.rs
index e22b3e7e..071fbaae 100644
--- a/src/mir/builder/control_flow/plan/planner/outcome.rs
+++ b/src/mir/builder/control_flow/plan/planner/outcome.rs
@@ -9,7 +9,7 @@ use crate::mir::builder::control_flow::plan::normalize::{
 };
 use crate::mir::builder::control_flow::plan::DomainPlan;
 
-use super::build::build_plan_from_facts;
+use super::build::build_plan_from_facts_ctx;
 use super::context::PlannerContext;
 use super::Freeze;
 
@@ -24,7 +24,8 @@ pub(in crate::mir::builder) fn build_plan_with_facts(
     body: &[ASTNode],
 ) -> Result<PlanBuildOutcome, Freeze> {
     let facts = try_build_loop_facts(condition, body)?;
-    build_plan_from_facts_opt(facts)
+    let legacy_ctx = PlannerContext::default_for_legacy();
+    build_plan_from_facts_opt_with(&legacy_ctx, facts)
 }
 
 pub(in crate::mir::builder) fn build_plan_with_facts_ctx(
@@ -33,10 +34,11 @@ pub(in crate::mir::builder) fn build_plan_with_facts_ctx(
     body: &[ASTNode],
 ) -> Result<PlanBuildOutcome, Freeze> {
     let facts = try_build_loop_facts_with_ctx(ctx, condition, body)?;
-    build_plan_from_facts_opt(facts)
+    build_plan_from_facts_opt_with(ctx, facts)
 }
 
-fn build_plan_from_facts_opt(
+fn build_plan_from_facts_opt_with(
+    ctx: &PlannerContext,
     facts: Option<LoopFacts>,
 ) -> Result<PlanBuildOutcome, Freeze> {
     let Some(facts) = facts else {
@@ -46,7 +48,7 @@ fn build_plan_from_facts_opt(
         });
     };
     let canonical = canonicalize_loop_facts(facts);
-    let plan = build_plan_from_facts(canonical.clone())?;
+    let plan = build_plan_from_facts_ctx(ctx, canonical.clone())?;
 
     Ok(PlanBuildOutcome {
         facts: Some(canonical),
diff --git a/src/mir/builder/control_flow/plan/single_planner/rules.rs b/src/mir/builder/control_flow/plan/single_planner/rules.rs
index fde26c03..8cf0f374 100644
--- a/src/mir/builder/control_flow/plan/single_planner/rules.rs
+++ b/src/mir/builder/control_flow/plan/single_planner/rules.rs
@@ -4,7 +4,6 @@
 //! (observability/behavior must not change).
 
 use crate::mir::builder::control_flow::joinir::patterns::router::LoopPatternContext;
-use crate::mir::loop_pattern_detection::LoopPatternKind;
 
 use crate::mir::builder::control_flow::plan::extractors;
 use crate::mir::builder::control_flow::plan::facts::pattern2_loopbodylocal_facts::LoopBodyLocalShape;
@@ -33,22 +32,11 @@ pub(super) fn try_build_domain_plan(ctx: &LoopPatternContext) -> Result<Option<
-    let allow_pattern1 = match ctx.pattern_kind {
-        Some(LoopPatternKind::Pattern1SimpleWhile) | None => true,
-        _ => false,
-    };
     let allow_pattern8 = !ctx.in_static_box;
 
     let (plan_opt, log_none) = if planner_hit.is_some() {
         (planner_hit, false)
     } else {
-        let plan_opt = fallback_extract(ctx, rule_id, allow_pattern1, allow_pattern8)?;
-        let log_none = if matches!(rule_id, PlanRuleId::Pattern1) {
-            allow_pattern1
-        } else {
-            true
-        };
-        (plan_opt, log_none)
+        (fallback_extract(ctx, rule_id, allow_pattern8)?, true)
     };
 
     let promotion_tag = if matches!(rule_id, PlanRuleId::Pattern2)
@@ -103,16 +91,10 @@ fn try_take_planner(planner_opt: &Option<DomainPlan>, kind: PlanRuleId) -> Optio
 fn fallback_extract(
     ctx: &LoopPatternContext,
     kind: PlanRuleId,
-    allow_pattern1: bool,
     allow_pattern8: bool,
 ) -> Result<Option<DomainPlan>, String> {
     match kind {
-        PlanRuleId::Pattern1 => {
-            if !allow_pattern1 {
-                return Ok(None);
-            }
-            extractors::pattern1::extract_pattern1_plan(ctx.condition, ctx.body)
-        }
+        PlanRuleId::Pattern1 => extractors::pattern1::extract_pattern1_plan(ctx.condition, ctx.body),
         PlanRuleId::Pattern2 => {
             extractors::pattern2_break::extract_pattern2_plan(ctx.condition, ctx.body)
         }
diff --git a/tools/smokes/v2/profiles/integration/joinir/phase1883_nested_minimal_vm.sh b/tools/smokes/v2/profiles/integration/joinir/phase1883_nested_minimal_vm.sh
index 0b73153e..e477c178 100644
--- a/tools/smokes/v2/profiles/integration/joinir/phase1883_nested_minimal_vm.sh
+++ b/tools/smokes/v2/profiles/integration/joinir/phase1883_nested_minimal_vm.sh
@@ -10,7 +10,7 @@ FIXTURE="$NYASH_ROOT/apps/tests/phase1883_nested_minimal.hako"
 RUN_TIMEOUT_SECS=${RUN_TIMEOUT_SECS:-10}
 
 set +e
-OUTPUT=$(timeout "$RUN_TIMEOUT_SECS" env NYASH_DISABLE_PLUGINS=1 HAKO_JOINIR_STRICT=1 "$NYASH_BIN" --backend vm "$FIXTURE" 2>&1)
+OUTPUT=$(timeout "$RUN_TIMEOUT_SECS" env HAKO_JOINIR_STRICT=1 "$NYASH_BIN" --backend vm "$FIXTURE" 2>&1)
 EXIT_CODE=$?
 set -e