mirbuilder: integrate Normalizer (toggle), add tag-quiet mode, share f64 canonicalization; expand canaries; doc updates for quick timeout + dev toggles; Phase 21.5 optimization readiness

nyash-codex
2025-11-10 23:17:46 +09:00
parent 24d88a10c0
commit ece91306b7
56 changed files with 2227 additions and 142 deletions

View File

@ -92,6 +92,56 @@ Next (pre-Optimization — until the MirBuilder build line is restored)
4) Confirm green is maintained under FAIL_FAST=1 / nycompiler=default, then retire the dev toggles in stages
5) Add tests/NOTES_DEV_TOGGLES.md (meaning, usage, and removal policy of the dev toggles)
Loop conversion in small steps (structure → substitution)
- [x] Minimal seam: move MapBox creation/set for the Loop lowers (simple / sum_bc / count_param) into `loop_opts_adapter` (consolidates the substitution point)
- [x] Toggle verification: add a canary for `HAKO_MIR_BUILDER_SKIP_LOOPS=1` (ON = bypass, OFF = normal)
- [x] JsonFrag substitution (opt-in): `loop_opts_adapter.build2(opts)` emits minimal JsonFrag-based MIR(JSON) (tag `[mirbuilder/internal/loop:jsonfrag]`)
- [x] Horizontal rollout to the remaining two cases (all three of simple/sum_bc/count_param covered; behavior unchanged, default OFF)
Next (Loop/JsonFrag polish)
- [ ] Raise JsonFrag fidelity (phi alignment, ret-value normalization, const-declaration cleanup) — introduce in small steps with the default kept OFF (see docs/checklists/mirbuilder_jsonfrag_defaultization.md)
  - progress: Normalizer seam landed (pass-through → minimal implementation); groups phi at the block head, makes ret values explicit, dedupes/front-loads consts (under the dev toggle)
  - progress: const normalization now lightly covers f64/String in addition to i64 (signature-based dedupe)
  - progress: f64 output canonicalization implemented (redundant trailing zeros trimmed, integers get .0) and confirmed by canaries.
  - progress: f64 exponent forms and -0.0 handled; extracted into a shared utility (JsonNumberCanonicalBox); a sketch of the canonical forms follows this section.
- [ ] Commonize the canary helpers (MIR extraction / token assertions / consistent SKIP tags) — shared lib: tools/smokes/v2/lib/mir_canary.sh
  - progress: adopted in the loop toggle canary (phase2034/mirbuilder_internal_loop_skip_toggle_canary_vm.sh); rolling out to the rest.
  - progress: applied to the three loop_*_jsonfrag canaries (simple/sum_bc/count_param)
  - progress: added Normalizer verification canaries (no duplicates):
    - idempotence: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_idempotent_canary_vm.sh
    - cross-block no-dedupe: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_const_crossblock_no_dedupe_canary_vm.sh
    - rc parity: if/loop/binop (3 canaries)
    - f64 exponent/negzero: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_const_f64_exponent_negzero_canary_vm.sh
    - f64 bigexp: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_const_f64_bigexp_canary_vm.sh
    - phi many incomings: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_phi_many_incomings_order_canary_vm.sh
    - phi multistage merge: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_phi_multistage_merge_order_canary_vm.sh
- [ ] Tag-quiet toggle (quiet by default)
  - progress: tag output only when `HAKO_MIR_BUILDER_NORMALIZE_TAG=1` (quiet by default).
- [ ] Commonize the normalization logic
  - progress: added `lang/src/shared/json/utils/json_number_canonical_box.hako` to share f64 canonicalization and numeric token reading.
- [ ] Bulk dev toggle
  - progress: added a Normalizer injection path to `enable_mirbuilder_dev_env` (`SMOKES_DEV_NORMALIZE=1`; profile-switch examples in the comments).
- [ ] Document the recommended timeout for quick
  - progress: added `--timeout 120` guidance to the Examples in `tools/smokes/v2/run.sh`.
Phase 21.5 (Optimization readiness)
- Status: Normalizer wired into the main line (MirBuilderBox/Min) with default OFF; parity confirmed on the representative quick set with Normalizer ON.
- Recommendation: use `--timeout 120` when running the full quick profile with Normalizer ON (stabilizes the EXE/AOT reps).
- Next actions: widen rc-parity coverage → stage in behind a default-candidate flag (default OFF) → decide on defaulting once green is sustained.
- [ ] Turn the defaulting conditions (representative canaries green, tag observability, no diff vs. the default path, rollback steps) into a checklist (docs/checklists/mirbuilder_jsonfrag_defaultization.md)
  - progress: the checklist and rollback steps are documented in docs.
Added canaries (Normalizer)
- direct verification: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_direct_canary_vm.sh
- multi-block / phi order: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_multiblock_phi_order_canary_vm.sh
- JsonFrag+Normalize (tag): tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_internal_loop_simple_jsonfrag_normalize_canary_vm.sh
- JsonFrag+Normalize (tag, OFF check): tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_internal_loop_simple_jsonfrag_nonormalize_no_tag_canary_vm.sh
- idempotence: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_idempotent_canary_vm.sh
- cross-block no-dedupe: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_const_crossblock_no_dedupe_canary_vm.sh
- f64 canonicalize: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_const_f64_canonicalize_canary_vm.sh
- phi nested merge: tools/smokes/v2/profiles/quick/core/phase2034/mirbuilder_jsonfrag_normalizer_phi_nested_merge_order_canary_vm.sh
- rc parity: if/loop/binop (3 canaries)
# Current Task — Phase 21.10 (LLVM line crate backend print EXE)
Status (22.3 wrap)
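A minimal sketch of the f64 canonical forms referenced above (illustrative inputs/outputs only; the shared helper is `JsonNumberCanonicalBox.canonicalize_f64`):
```
"1.500"  -> "1.5"      (trailing zeros trimmed)
"3"      -> "3.0"      (integers gain an explicit .0)
"1e2"    -> "100.0"    (exponent expanded)
"2.5e-2" -> "0.025"    (negative exponent expanded)
"-0.0"   -> "0.0"      (negative zero collapses to 0.0)
```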

View File

@ -182,6 +182,15 @@ Examples (FailFast tags and safe overrides)
NYASH_JSON_ONLY=1 NYASH_FILEBOX_ALLOW_FALLBACK=1 ./target/release/hakorune --backend vm apps/tests/emit_program_json.hako
```
Provider policy (common)
- `HAKO_PROVIDER_POLICY=strict-plugin-first|safe-core-first|static-preferred`
- Controls provider selection in Auto mode (default: `strict-plugin-first`).
- `safe-core-first`/`static-preferred` prefer ring1 (static / core-ro) and fall back to the plugin only when ring1 is unavailable.
- Per-box mode selection (example: FileBox)
- `<BOX>_MODE=auto|ring1|plugin-only` (e.g. `NYASH_FILEBOX_MODE`)
- `<BOX>_ALLOW_FALLBACK=0|1` (Fail-Fast exception; narrowly scoped allowance)
- Existing FileBox variables remain valid; the same pattern will be extended to other Boxes going forward.
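An illustrative combination of the global policy with a per-box pin (a sketch reusing the sample program above, not a prescribed workflow):
```
# Prefer ring1 (static/core-ro) globally and pin FileBox to the ring1 provider explicitly
HAKO_PROVIDER_POLICY=safe-core-first NYASH_FILEBOX_MODE=ring1 \
  ./target/release/hakorune --backend vm apps/tests/emit_program_json.hako
```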
Selfhost-first wrapper toggles (StageB → MirBuilder)
- `HAKO_SELFHOST_BUILDER_FIRST=0|1` (default: 0)
- When 1, tools like `tools/hakorune_emit_mir.sh` try StageB → MirBuilder(Hako) first, and only fall back to the Rust delegate when necessary.

View File

@ -0,0 +1,39 @@
# Runtime Rings Architecture (ring0 / ring1 / ring2)
Purpose: clarify responsibilities and naming for core runtime layers, and make provider selection and diagnostics consistent without large code moves.
Overview
- ring0 (Kernel): Core executor/runner/host-bridge. Responsible for fail-fast and process orchestration. No direct business logic of boxes.
- ring1 (Core Providers): Minimal, trusted, always-available providers (static). Example: FileBox Core-RO (open/read/close). Small, reproducible, and safe to enable under fail-fast.
- ring2 (Plugins): Dynamic, featureful providers. Swappable and extensible. Example: Array/Map plugins, full-feature FileBox plugin.
Mapping (current repo)
- ring0: src/ring0/ (facade + guard; re-exports are deferred). Existing core remains under src/*; ring0 is a conceptual anchor.
- ring1: src/providers/ring1/ (facade + guard). Concrete code still lives where it is; ring1 hosts documentation and future home for static providers.
- ring2: plugins/ (dynamic shared libraries, as before).
Selection Policy (Auto mode)
- Global: `HAKO_PROVIDER_POLICY=strict-plugin-first|safe-core-first|static-preferred`
- strict-plugin-first (default): prefer dynamic/plugin; fallback to ring1 when allowed.
- safe-core-first/static-preferred: prefer ring1 (static) when available; fallback to plugin.
- Per box (example: FileBox)
- `NYASH_FILEBOX_MODE=auto|ring1|plugin-only`
- `NYASH_FILEBOX_ALLOW_FALLBACK=0|1` (narrow dev override)
Diagnostics (stderr, quiet when JSON_ONLY=1)
- Selection: `[provider/select:<Box> ring=<0|1|plugin> src=<static|dynamic>]`
- Fail-Fast block: `[failfast/provider/<box>:<reason>]`
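For example, the selection tag can be checked on stderr like this (an illustrative sketch; the program path is borrowed from the FileBox examples elsewhere in the docs, and the shown tag assumes ring1 was selected):
```
HAKO_PROVIDER_POLICY=safe-core-first ./target/release/hakorune --backend vm \
  apps/tests/emit_program_json.hako 2>&1 | grep '\[provider/select:FileBox'
# e.g. [provider/select:FileBox ring=1 src=static]
```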
Design Invariants
- ring0 must not depend on ring2. ring1 contains only minimal stable capabilities.
- Fallback is disallowed by default (Fail-Fast). Allow only via per-box override or JSON_ONLY quiet pipe.
Migration Plan (small steps)
1) Add facades and guards (this change).
2) Keep existing code paths; introduce provider policy (done for FileBox).
3) Gradually move minimal static providers under `src/providers/ring1/` (no behavior change).
4) Add canaries to assert selection tags under different policies.
Notes on Naming
- Historical names like "builtin" referred to in-tree providers. To avoid confusion, use ring terms: ring0 (kernel), ring1 (core providers), ring2 (plugins).

View File

@ -0,0 +1,40 @@
# MirBuilder JsonFrag Defaultization Checklist
Purpose: define clear, testable conditions to move the JsonFrag-based MirBuilder path from opt-in to default without changing observable behavior.
Scope
- Loop lowers (simple / sum_bc / count_param) via `loop_opts_adapter.build2`.
- Normalizer seam: `hako.mir.builder.internal.jsonfrag_normalizer`.
Toggles (dev-only, default OFF)
- `HAKO_MIR_BUILDER_LOOP_JSONFRAG=1` — enable JsonFrag minimal MIR assembly.
- `HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1` — apply normalization pass (phi/ret/const; incremental).
- `HAKO_MIR_BUILDER_SKIP_LOOPS=1` — canary guard to bypass loop lowers (safety valve; must remain honored).
- `HAKO_MIR_BUILDER_NORMALIZE_TAG=1` — emit normalization tag lines; default is quiet (no tag).
Acceptance Criteria
- Green on quick representative smokes and phase2231 canary.
- Tag observability present only when opt-in flags are set:
- `[mirbuilder/internal/loop:jsonfrag]` when JsonFrag path is taken.
- `[mirbuilder/normalize:jsonfrag:pass]` when normalization is applied.
- Parity: JsonFrag+Normalizer output is semantically identical to the default path (no diff in verification runners, exit code parity).
- Rollback: removing toggles restores legacy path immediately with zero residual side effects.
Verification Steps
1) Enable JsonFrag path for loop lowers and run quick smokes:
- `HAKO_MIR_BUILDER_LOOP_JSONFRAG=1 tools/smokes/v2/run.sh --profile quick`
2) Enable Normalizer additionally and re-run:
- `HAKO_MIR_BUILDER_LOOP_JSONFRAG=1 HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1 tools/smokes/v2/run.sh --profile quick`
3) Observe tags in logs (only when toggles ON), confirm absence when OFF.
4) Use `tools/smokes/v2/lib/mir_canary.sh` helpers to extract MIR and assert key tokens as needed.
5) Because heavy EXE/AOT reps are included, consider `--timeout 120` for `--profile quick` when NORMALIZE=1 (combined example below).
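For example, steps 2 and 5 combined into a single Normalizer-ON quick run (a sketch using only the toggles and flags listed above):
```
HAKO_MIR_BUILDER_LOOP_JSONFRAG=1 \
HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1 \
tools/smokes/v2/run.sh --profile quick --timeout 120
```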
Rollback Plan
- Disable the toggles to revert (unset the exported variables or set them to `0`). No code removal required.
- If unexpected diffs appear, capture `[mirbuilder/*]` tags from logs and attach to CURRENT_TASK.md for follow-up.
Notes
- Normalizer is introduced as a pass-through seam first; refine in small, guarded steps (phi alignment, ret normalization, const folding) while keeping default OFF.
- Do not change default behavior or widen scope during this phase; prioritize stability and diagnostics.
- f64 canonicalization is shared via `selfhost.shared.json.utils.json_number_canonical`; prefer reusing this utility instead of local string hacking.
- Dev helpers: `enable_mirbuilder_dev_env` can inject NORMALIZE via `SMOKES_DEV_NORMALIZE=1` (profile-based injection example provided in comments).
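A hedged sketch of the dev-helper injection (the exact wiring lives in `enable_mirbuilder_dev_env`'s comments; whether the variable is consumed by the runner or by the sourced helper is left to that file):
```
SMOKES_DEV_NORMALIZE=1 tools/smokes/v2/run.sh --profile quick --timeout 120
```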

View File

@ -15,6 +15,7 @@
// Phase 22.0: Hako-first (registry) is ON by default; set 0 explicitly to disable.
using selfhost.shared.json.utils.json_frag as JsonFragBox
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
using "hako.mir.builder.internal.pattern_util" as PatternUtilBox
static box MirBuilderBox {
@ -38,6 +39,13 @@ static box MirBuilderBox {
print("[mirbuilder/input/invalid] missing version/kind keys") print("[mirbuilder/input/invalid] missing version/kind keys")
return null return null
} }
// Helper: optional normalization (dev toggle, default OFF)
local norm_if = function(m) {
if m == null { return null }
local nv = env.get("HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE")
if nv != null && ("" + nv) == "1" { return NormBox.normalize_all(m) }
return m
}
// Internal path (default ON) — const(int)+ret, binop+ret, and other registry-first lowering
// Disable with: HAKO_MIR_BUILDER_INTERNAL=0
{
@ -91,7 +99,7 @@ static box MirBuilderBox {
"{\\\"op\\\":\\\"mir_call\\\",\\\"dst\\\":4,\\\"mir_call\\\":{\\\"callee\\\":{\\\"type\\\":\\\"Method\\\",\\\"method\\\":\\\"" + method + "\\\",\\\"receiver\\\":1},\\\"args\\\":[" + args_text + "],\\\"effects\\\":[]}}," + "{\\\"op\\\":\\\"mir_call\\\",\\\"dst\\\":4,\\\"mir_call\\\":{\\\"callee\\\":{\\\"type\\\":\\\"Method\\\",\\\"method\\\":\\\"" + method + "\\\",\\\"receiver\\\":1},\\\"args\\\":[" + args_text + "],\\\"effects\\\":[]}}," +
"{\\\"op\\\":\\\"ret\\\",\\\"value\\\":4}]}]}]}" "{\\\"op\\\":\\\"ret\\\",\\\"value\\\":4}]}]}]}"
print("[mirbuilder/registry:return.method.arraymap]") print("[mirbuilder/registry:return.method.arraymap]")
return mir return norm_if(mir)
} }
// Registry list汎用 // Registry list汎用
using "hako.mir.builder.pattern_registry" as PatternRegistryBox using "hako.mir.builder.pattern_registry" as PatternRegistryBox
@ -119,23 +127,23 @@ static box MirBuilderBox {
local i = 0; local n = names.length()
loop(i < n) {
local nm = "" + names.get(i)
-if nm == "if.compare.intint" { local out = LowerIfCompareBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "if.compare.intint" { local out = LowerIfCompareBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "if.compare.fold.binints" { local out = LowerIfCompareFoldBinIntsBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "if.compare.fold.binints" { local out = LowerIfCompareFoldBinIntsBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "if.compare.fold.varint" { local out = LowerIfCompareFoldVarIntBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "if.compare.fold.varint" { local out = LowerIfCompareFoldVarIntBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "if.compare.varint" { local out = LowerIfCompareVarIntBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "if.compare.varint" { local out = LowerIfCompareVarIntBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "if.compare.varvar" { local out = LowerIfCompareVarVarBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "if.compare.varvar" { local out = LowerIfCompareVarVarBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.method.arraymap" { local out = LowerReturnMethodArrayMapBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.method.arraymap" { local out = LowerReturnMethodArrayMapBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.method.string.length" { local out = LowerReturnMethodStringLengthBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.method.string.length" { local out = LowerReturnMethodStringLengthBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.loop.strlen.sum" { local out = LowerReturnLoopStrlenSumBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.loop.strlen.sum" { local out = LowerReturnLoopStrlenSumBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.var.local" { local out = LowerReturnVarLocalBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.var.local" { local out = LowerReturnVarLocalBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.string" { local out = LowerReturnStringBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.string" { local out = LowerReturnStringBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.float" { local out = LowerReturnFloatBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.float" { local out = LowerReturnFloatBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.bool" { local out = LowerReturnBoolBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.bool" { local out = LowerReturnBoolBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.logical" { local out = LowerReturnLogicalBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.logical" { local out = LowerReturnLogicalBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.binop.varint" { local out = LowerReturnBinOpVarIntBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.binop.varint" { local out = LowerReturnBinOpVarIntBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.binop.varvar" { local out = LowerReturnBinOpVarVarBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.binop.varvar" { local out = LowerReturnBinOpVarVarBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.binop.intint" { local out = LowerReturnBinOpBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.binop.intint" { local out = LowerReturnBinOpBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
-if nm == "return.int" { local out = LowerReturnIntBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return out } }
if nm == "return.int" { local out = LowerReturnIntBox.try_lower(s); if out != null { if env.get("HAKO_MIR_BUILDER_DEBUG")=="1" || env.get("NYASH_CLI_VERBOSE")=="1" { print("[mirbuilder/registry:" + nm + "]") } return norm_if(out) } }
i = i + 1
}
// Fall-through to chain if none matched
@ -156,53 +164,53 @@ static box MirBuilderBox {
using "hako.mir.builder.internal.lower_loop_count_param" as LowerLoopCountParamBox using "hako.mir.builder.internal.lower_loop_count_param" as LowerLoopCountParamBox
using "hako.mir.builder.internal.lower_loop_simple" as LowerLoopSimpleBox using "hako.mir.builder.internal.lower_loop_simple" as LowerLoopSimpleBox
// Prefer New(Constructor) minimal first to avoid unresolved nested lowers in inline runs // Prefer New(Constructor) minimal first to avoid unresolved nested lowers in inline runs
{ local out_newc = LowerNewboxConstructorBox.try_lower(s); if out_newc != null { return out_newc } } { local out_newc = LowerNewboxConstructorBox.try_lower(s); if out_newc != null { return norm_if(out_newc) } }
{ local out_arr_size = LowerMethodArraySizeBox.try_lower(s); if out_arr_size != null { return out_arr_size } } { local out_arr_size = LowerMethodArraySizeBox.try_lower(s); if out_arr_size != null { return norm_if(out_arr_size) } }
{ local out_arr_push = LowerMethodArrayPushBox.try_lower(s); if out_arr_push != null { return out_arr_push } } { local out_arr_push = LowerMethodArrayPushBox.try_lower(s); if out_arr_push != null { return norm_if(out_arr_push) } }
{ local out_arr_gs = LowerMethodArrayGetSetBox.try_lower(s); if out_arr_gs != null { return out_arr_gs } } { local out_arr_gs = LowerMethodArrayGetSetBox.try_lower(s); if out_arr_gs != null { return norm_if(out_arr_gs) } }
{ local out_map_size = LowerMethodMapSizeBox.try_lower(s); if out_map_size != null { return out_map_size } } { local out_map_size = LowerMethodMapSizeBox.try_lower(s); if out_map_size != null { return norm_if(out_map_size) } }
{ local out_map_gs = LowerMethodMapGetSetBox.try_lower(s); if out_map_gs != null { return out_map_gs } } { local out_map_gs = LowerMethodMapGetSetBox.try_lower(s); if out_map_gs != null { return norm_if(out_map_gs) } }
{ local out_ls = LowerLoadStoreLocalBox.try_lower(s); if out_ls != null { return out_ls } } { local out_ls = LowerLoadStoreLocalBox.try_lower(s); if out_ls != null { return norm_if(out_ls) } }
{ local out_toc = LowerTypeOpCheckBox.try_lower(s); if out_toc != null { return out_toc } } { local out_toc = LowerTypeOpCheckBox.try_lower(s); if out_toc != null { return norm_if(out_toc) } }
{ local out_tca = LowerTypeOpCastBox.try_lower(s); if out_tca != null { return out_tca } } { local out_tca = LowerTypeOpCastBox.try_lower(s); if out_tca != null { return norm_if(out_tca) } }
// Loop lowers (sum_bc/continue/break normalization) // Loop lowers (sum_bc/continue/break normalization)
// Allow skipping heavy loop lowers on plugin-less hosts: HAKO_MIR_BUILDER_SKIP_LOOPS=1 // Allow skipping heavy loop lowers on plugin-less hosts: HAKO_MIR_BUILDER_SKIP_LOOPS=1
{ {
local skip_loops = env.get("HAKO_MIR_BUILDER_SKIP_LOOPS") local skip_loops = env.get("HAKO_MIR_BUILDER_SKIP_LOOPS")
local do_loops = (skip_loops == null) || (("" + skip_loops) != "1") local do_loops = (skip_loops == null) || (("" + skip_loops) != "1")
if do_loops == 1 { if do_loops == 1 {
{ local out_loop2 = LowerLoopSumBcBox.try_lower(s); if out_loop2 != null { return out_loop2 } } { local out_loop2 = LowerLoopSumBcBox.try_lower(s); if out_loop2 != null { return norm_if(out_loop2) } }
} }
} }
{ local out_if2b = LowerIfNestedBox.try_lower(s); if out_if2b != null { return out_if2b } } { local out_if2b = LowerIfNestedBox.try_lower(s); if out_if2b != null { return norm_if(out_if2b) } }
{ local out_if2 = LowerIfThenElseFollowingReturnBox.try_lower(s); if out_if2 != null { return out_if2 } } { local out_if2 = LowerIfThenElseFollowingReturnBox.try_lower(s); if out_if2 != null { return norm_if(out_if2) } }
{ local out_if = LowerIfCompareBox.try_lower(s); if out_if != null { return out_if } } { local out_if = LowerIfCompareBox.try_lower(s); if out_if != null { return norm_if(out_if) } }
{ local out_ifb = LowerIfCompareFoldBinIntsBox.try_lower(s); if out_ifb != null { return out_ifb } } { local out_ifb = LowerIfCompareFoldBinIntsBox.try_lower(s); if out_ifb != null { return norm_if(out_ifb) } }
{ local out_ifbv = LowerIfCompareFoldVarIntBox.try_lower(s); if out_ifbv != null { return out_ifbv } } { local out_ifbv = LowerIfCompareFoldVarIntBox.try_lower(s); if out_ifbv != null { return norm_if(out_ifbv) } }
{ local out_ifvi = LowerIfCompareVarIntBox.try_lower(s); if out_ifvi != null { return out_ifvi } } { local out_ifvi = LowerIfCompareVarIntBox.try_lower(s); if out_ifvi != null { return norm_if(out_ifvi) } }
{ local out_ifvv = LowerIfCompareVarVarBox.try_lower(s); if out_ifvv != null { return out_ifvv } } { local out_ifvv = LowerIfCompareVarVarBox.try_lower(s); if out_ifvv != null { return norm_if(out_ifvv) } }
{ {
local skip_loops2 = env.get("HAKO_MIR_BUILDER_SKIP_LOOPS") local skip_loops2 = env.get("HAKO_MIR_BUILDER_SKIP_LOOPS")
local do_loops2 = (skip_loops2 == null) || (("" + skip_loops2) != "1") local do_loops2 = (skip_loops2 == null) || (("" + skip_loops2) != "1")
if do_loops2 == 1 { if do_loops2 == 1 {
{ local out_loopp = LowerLoopCountParamBox.try_lower(s); if out_loopp != null { return out_loopp } } { local out_loopp = LowerLoopCountParamBox.try_lower(s); if out_loopp != null { return norm_if(out_loopp) } }
{ local out_loop = LowerLoopSimpleBox.try_lower(s); if out_loop != null { return out_loop } } { local out_loop = LowerLoopSimpleBox.try_lower(s); if out_loop != null { return norm_if(out_loop) } }
} }
} }
{ local out_var = LowerReturnVarLocalBox.try_lower(s); if out_var != null { return out_var } } { local out_var = LowerReturnVarLocalBox.try_lower(s); if out_var != null { return norm_if(out_var) } }
{ local out_str = LowerReturnStringBox.try_lower(s); if out_str != null { return out_str } } { local out_str = LowerReturnStringBox.try_lower(s); if out_str != null { return norm_if(out_str) } }
{ local out_f = LowerReturnFloatBox.try_lower(s); if out_f != null { return out_f } } { local out_f = LowerReturnFloatBox.try_lower(s); if out_f != null { return norm_if(out_f) } }
{ local out_log = LowerReturnLogicalBox.try_lower(s); if out_log != null { return out_log } } { local out_log = LowerReturnLogicalBox.try_lower(s); if out_log != null { return norm_if(out_log) } }
{ local out_meth = LowerReturnMethodArrayMapBox.try_lower(s); if out_meth != null { return out_meth } } { local out_meth = LowerReturnMethodArrayMapBox.try_lower(s); if out_meth != null { return norm_if(out_meth) } }
{ local out_meth_s = LowerReturnMethodStringLengthBox.try_lower(s); if out_meth_s != null { return out_meth_s } } { local out_meth_s = LowerReturnMethodStringLengthBox.try_lower(s); if out_meth_s != null { return norm_if(out_meth_s) } }
{ local out_sum = LowerReturnLoopStrlenSumBox.try_lower(s); if out_sum != null { return out_sum } } { local out_sum = LowerReturnLoopStrlenSumBox.try_lower(s); if out_sum != null { return norm_if(out_sum) } }
{ local out_bool = LowerReturnBoolBox.try_lower(s); if out_bool != null { return out_bool } } { local out_bool = LowerReturnBoolBox.try_lower(s); if out_bool != null { return norm_if(out_bool) } }
{ local out_bvi = LowerReturnBinOpVarIntBox.try_lower(s); if out_bvi != null { return out_bvi } } { local out_bvi = LowerReturnBinOpVarIntBox.try_lower(s); if out_bvi != null { return norm_if(out_bvi) } }
{ local out_bvv = LowerReturnBinOpVarVarBox.try_lower(s); if out_bvv != null { return out_bvv } } { local out_bvv = LowerReturnBinOpVarVarBox.try_lower(s); if out_bvv != null { return norm_if(out_bvv) } }
{ local out_bin = LowerReturnBinOpBox.try_lower(s); if out_bin != null { return out_bin } } { local out_bin = LowerReturnBinOpBox.try_lower(s); if out_bin != null { return norm_if(out_bin) } }
{ {
local out_int = LowerReturnIntBox.try_lower(s) local out_int = LowerReturnIntBox.try_lower(s)
if out_int != null { return out_int } if out_int != null { return norm_if(out_int) }
} }
// Find Return marker (or If) // Find Return marker (or If)
// Case (If with Compare + Return(Int)/Return(Int) in branches) // Case (If with Compare + Return(Int)/Return(Int) in branches)
@ -294,7 +302,7 @@ static box MirBuilderBox {
"{\"op\":\"branch\",\"cond\":3,\"then\":1,\"else\":2}]}," + "{\"op\":\"branch\",\"cond\":3,\"then\":1,\"else\":2}]}," +
"{\"id\":1,\"instructions\":[{\"op\":\"const\",\"dst\":4,\"value\":{\"type\":\"i64\",\"value\":" + then_val + "}},{\"op\":\"ret\",\"value\":4}]}," + "{\"id\":1,\"instructions\":[{\"op\":\"const\",\"dst\":4,\"value\":{\"type\":\"i64\",\"value\":" + then_val + "}},{\"op\":\"ret\",\"value\":4}]}," +
"{\"id\":2,\"instructions\":[{\"op\":\"const\",\"dst\":5,\"value\":{\"type\":\"i64\",\"value\":" + else_val + "}},{\"op\":\"ret\",\"value\":5}]}]}]}" "{\"id\":2,\"instructions\":[{\"op\":\"const\",\"dst\":5,\"value\":{\"type\":\"i64\",\"value\":" + else_val + "}},{\"op\":\"ret\",\"value\":5}]}]}]}"
return mir_if return norm_if(mir_if)
} }
} }
} }
@ -303,7 +311,7 @@ static box MirBuilderBox {
// NewBox(Constructor) minimal
{
local out_new = LowerNewboxConstructorBox.try_lower(s)
-if out_new != null { return out_new }
if out_new != null { return norm_if(out_new) }
}
// Fallback cases below: Return(Binary) and Return(Int)
local k_ret = JsonFragBox.index_of_from(s, "\"type\":\"Return\"", 0)
@ -345,7 +353,7 @@ static box MirBuilderBox {
"{\"op\":\"const\",\"dst\":2,\"value\":{\"type\":\"i64\",\"value\":" + rhs_val + "}}," + "{\"op\":\"const\",\"dst\":2,\"value\":{\"type\":\"i64\",\"value\":" + rhs_val + "}}," +
"{\"op\":\"binop\",\"operation\":\"" + op + "\",\"lhs\":1,\"rhs\":2,\"dst\":3}," + "{\"op\":\"binop\",\"operation\":\"" + op + "\",\"lhs\":1,\"rhs\":2,\"dst\":3}," +
"{\"op\":\"ret\",\"value\":3}]}]}]}" "{\"op\":\"ret\",\"value\":3}]}]}]}"
return mir_bin return norm_if(mir_bin)
} }
} }
} }
@ -361,7 +369,7 @@ static box MirBuilderBox {
if num != null {
if env.get("HAKO_MIR_BUILDER_DEBUG") == "1" || env.get("NYASH_CLI_VERBOSE") == "1" { print("[mirbuilder/fallback:Return(Int) val=" + num + "]") }
local mir = "{\"functions\":[{\"name\":\"main\",\"params\":[],\"locals\":[],\"blocks\":[{\"id\":0,\"instructions\":[{\"op\":\"const\",\"dst\":1,\"value\":{\"type\":\"i64\",\"value\":" + num + "}},{\"op\":\"ret\",\"value\":1}]}]}]}"
-return mir
return norm_if(mir)
}
}
} else {
@ -382,7 +390,7 @@ static box MirBuilderBox {
// Call host provider via extern: env.mirbuilder.emit(program_json)
local args = new ArrayBox(); args.push(program_json)
local ret = hostbridge.extern_invoke("env.mirbuilder", "emit", args)
-return ret
return norm_if(ret)
}
// Provider not wired → FailFast tag
print("[mirbuilder/delegate/missing] no provider; enable HAKO_MIR_BUILDER_DELEGATE=1")

View File

@ -3,6 +3,7 @@
// Toggles: none (this box itself is minimal/static).
using selfhost.shared.json.utils.json_frag as JsonFragBox
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
using "hako.mir.builder.internal.lower_return_method_array_map" as LowerReturnMethodArrayMapBox
using "hako.mir.builder.internal.lower_return_int" as LowerReturnIntBox
using "hako.mir.builder.internal.lower_return_binop" as LowerReturnBinOpBox
@ -20,18 +21,25 @@ static box MirBuilderBox {
if program_json == null { print("[mirbuilder/min/input:null]"); return null }
local s = "" + program_json
if !(s.contains("\"version\"")) || !(s.contains("\"kind\"")) { print("[mirbuilder/min/input:invalid]"); return null }
// Small helper: optionally normalize MIR(JSON) via toggle (dev-only, default OFF)
local norm_if = function(m) {
if m == null { return null }
local nv = env.get("HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE")
if nv != null && ("" + nv) == "1" { return NormBox.normalize_all(m) }
return m
}
// Try minimal patterns (lightweight only)
-{ local out = LowerReturnMethodArrayMapBox.try_lower(s); if out != null { print("[mirbuilder/min:return.method.arraymap]"); return out } }
{ local out = LowerReturnMethodArrayMapBox.try_lower(s); if out != null { print("[mirbuilder/min:return.method.arraymap]"); return norm_if(out) } }
-{ local out_v = LowerReturnBinOpVarIntBox.try_lower(s); if out_v != null { print("[mirbuilder/min:return.binop.varint]"); return out_v } }
{ local out_v = LowerReturnBinOpVarIntBox.try_lower(s); if out_v != null { print("[mirbuilder/min:return.binop.varint]"); return norm_if(out_v) } }
-{ local out_b = LowerReturnBinOpBox.try_lower(s); if out_b != null { print("[mirbuilder/min:return.binop.intint]"); return out_b } }
{ local out_b = LowerReturnBinOpBox.try_lower(s); if out_b != null { print("[mirbuilder/min:return.binop.intint]"); return norm_if(out_b) } }
-{ local out_bvv = LowerReturnBinOpVarVarBox.try_lower(s); if out_bvv != null { print("[mirbuilder/min:return.binop.varvar]"); return out_bvv } }
{ local out_bvv = LowerReturnBinOpVarVarBox.try_lower(s); if out_bvv != null { print("[mirbuilder/min:return.binop.varvar]"); return norm_if(out_bvv) } }
// Compare lowers: prefer fold/var-based before int-int to avoid greedy match
-{ local out_if_fv = LowerIfCompareFoldVarIntBox.try_lower(s); if out_if_fv != null { print("[mirbuilder/min:if.compare.fold.varint]"); return out_if_fv } }
{ local out_if_fv = LowerIfCompareFoldVarIntBox.try_lower(s); if out_if_fv != null { print("[mirbuilder/min:if.compare.fold.varint]"); return norm_if(out_if_fv) } }
-{ local out_if_fb = LowerIfCompareFoldBinIntsBox.try_lower(s); if out_if_fb != null { print("[mirbuilder/min:if.compare.fold.binints]"); return out_if_fb } }
{ local out_if_fb = LowerIfCompareFoldBinIntsBox.try_lower(s); if out_if_fb != null { print("[mirbuilder/min:if.compare.fold.binints]"); return norm_if(out_if_fb) } }
-{ local out_ifvi = LowerIfCompareVarIntBox.try_lower(s); if out_ifvi != null { print("[mirbuilder/min:if.compare.varint]"); return out_ifvi } }
{ local out_ifvi = LowerIfCompareVarIntBox.try_lower(s); if out_ifvi != null { print("[mirbuilder/min:if.compare.varint]"); return norm_if(out_ifvi) } }
-{ local out_ifvv = LowerIfCompareVarVarBox.try_lower(s); if out_ifvv != null { print("[mirbuilder/min:if.compare.varvar]"); return out_ifvv } }
{ local out_ifvv = LowerIfCompareVarVarBox.try_lower(s); if out_ifvv != null { print("[mirbuilder/min:if.compare.varvar]"); return norm_if(out_ifvv) } }
-{ local out_if = LowerIfCompareBox.try_lower(s); if out_if != null { print("[mirbuilder/min:if.compare.intint]"); return out_if } }
{ local out_if = LowerIfCompareBox.try_lower(s); if out_if != null { print("[mirbuilder/min:if.compare.intint]"); return norm_if(out_if) } }
-{ local out2 = LowerReturnIntBox.try_lower(s); if out2 != null { print("[mirbuilder/min:return.int]"); return out2 } }
{ local out2 = LowerReturnIntBox.try_lower(s); if out2 != null { print("[mirbuilder/min:return.int]"); return norm_if(out2) } }
print("[mirbuilder/min/unsupported]")
return null
}

View File

@ -0,0 +1,381 @@
// jsonfrag_normalizer_box.hako — JsonFrag-based MIR(JSON) normalizer (scaffold)
// Scope: text-level normalization helpers for MIR(JSON) produced via JsonFrag path.
// Policy: behind dev toggle; starts as pass-through and will be refined in small steps.
using selfhost.shared.json.utils.json_frag as JsonFragBox
using selfhost.shared.json.core.json_cursor as JsonCursorBox
using selfhost.shared.common.string_helpers as StringHelpers
using selfhost.shared.json.utils.json_number_canonical as JsonNumberCanonicalBox
static box JsonFragNormalizerBox {
_dq() { return "\"" }
_lb() { return "{" }
_rb() { return "}" }
_lsq() { return "[" }
_rsq() { return "]" }
// Internal: ensure ret has explicit value (minimal, numeric or keep as-is if present)
_ret_ensure_value(obj) {
if obj == null { return null }
local s = "" + obj
// If key exists anywhere, keep as-is
if s.indexOf(me._dq()+"value"+me._dq()+":") >= 0 { return s }
// Insert before the last '}'
local n = s.length()
if n <= 1 { return s }
local tail = s.substring(n-1, n)
if tail != me._rb() { return s }
return s.substring(0, n-1) + ",\"value\":0}"
}
// Internal: normalize a single block's instructions array text
_normalize_instructions_array(arr_text) {
if arr_text == null { return "" }
local s = "" + arr_text
local n = s.length()
local i = 0
local guard = 0
local phi_list = new ArrayBox()
local const_list = new ArrayBox()
local others_list = new ArrayBox()
local seen = new MapBox() // signature -> 1
loop(i < n) {
guard = guard + 1
if guard > 4096 { break }
// seek next object '{'
local ob = JsonFragBox.index_of_from(s, me._lb(), i)
if ob < 0 { break }
local ob_end = JsonCursorBox.seek_obj_end(s, ob)
if ob_end < 0 { break }
local obj = s.substring(ob, ob_end + 1)
// advance index to after this object (skip trailing comma via next search)
i = ob_end + 1
// classify by op
local op = JsonFragBox.get_str(obj, "op")
if op == "phi" {
phi_list.push(obj)
continue
}
if op == "const" {
// Canonicalize const representation (e.g., f64 value formatting) before dedupe/signature
local obj_c = me._const_canonicalize(obj)
// Build dedupe signature: dst + value-signature (type-aware: i64/f64/String/raw)
local dst = JsonFragBox.get_int(obj_c, "dst")
local dstr = (dst == null) ? "?" : StringHelpers.int_to_str(dst)
local sig_val = me._const_value_sig(obj_c)
local sig = "d:" + dstr + ";val:" + sig_val
if seen.get(sig) == null { const_list.push(obj_c) seen.set(sig, 1) }
continue
}
if op == "ret" {
others_list.push(me._ret_ensure_value(obj))
continue
}
others_list.push(obj)
}
// Rebuild in normalized order: phi* (as-is order) → const* (stable) → others*
local out = ""
local first = 1
// emit phi
{
local j = 0
local m = phi_list.size()
loop(j < m) {
local item = phi_list.get(j)
if first == 0 { out = out + "," } else { first = 0 }
out = out + item
j = j + 1
}
}
// emit const
{
local j2 = 0
local m2 = const_list.size()
loop(j2 < m2) {
local item2 = const_list.get(j2)
if first == 0 { out = out + "," } else { first = 0 }
out = out + item2
j2 = j2 + 1
}
}
// emit others
{
local j3 = 0
local m3 = others_list.size()
loop(j3 < m3) {
local item3 = others_list.get(j3)
if first == 0 { out = out + "," } else { first = 0 }
out = out + item3
j3 = j3 + 1
}
}
return out
}
// Build a canonical value signature for a const instruction object text
_const_value_sig(obj_text) {
if obj_text == null { return "?" }
local s = "" + obj_text
// locate '"value":'
local k = s.indexOf(me._dq()+"value"+me._dq()+":")
if k < 0 { return "?" }
local p = k + (me._dq()+"value"+me._dq()+":").length()
if p >= s.length() { return "?" }
local ch = s.substring(p, p+1)
if ch == me._lb() {
// object form
local obj_start = p
local obj_end = JsonCursorBox.seek_obj_end(s, obj_start)
if obj_end < 0 { return "raw:?" }
local inner = s.substring(obj_start, obj_end+1)
// typed numeric
local t = JsonFragBox.get_str(inner, "type")
if t == "i64" {
local vi = JsonFragBox.get_int(inner, "value")
if vi != null { return "i64:" + StringHelpers.int_to_str(vi) }
return "i64:?"
}
if t == "f64" {
local keyv = me._dq()+"value"+me._dq()+":"
local kv = inner.indexOf(keyv)
if kv >= 0 {
local vs = JsonFragBox.read_float_after(inner, kv + keyv.length())
if vs != null { return "f64:" + vs }
}
return "f64:?"
}
// string form: {"String":"..."}
local sv = JsonFragBox.get_str(inner, "String")
if sv != "" { return "S:" + sv }
// fallback: raw object
return "raw:" + inner
}
// primitive value (number literal) — try float then int
local vs2 = JsonFragBox.read_float_after(s, p)
if vs2 != null { return "num:" + vs2 }
local vi2 = JsonFragBox.read_int_after(s, p)
if vi2 != null { return "int:" + vi2 }
return "?"
}
// Canonicalize const object textual representation for known patterns (f64 formatting)
_const_canonicalize(obj_text) {
if obj_text == null { return null }
local s = "" + obj_text
// narrow to object; if malformed, return as-is
local t = JsonFragBox.get_str(s, "type")
if t == "f64" {
// find value position inside object and normalize numeric literal
local keyv = me._dq()+"value"+me._dq()+":"
local kv = s.indexOf(keyv)
if kv >= 0 {
// position of number start
local p = kv + keyv.length()
// skip spaces
loop(p < s.length()) { if s.substring(p,p+1) == " " { p = p + 1 } else { break } }
// prefer tolerant numeric token reader (supports exponent), fallback to float reader
local vs = JsonNumberCanonicalBox.read_num_token(s, p)
if vs == null { vs = JsonFragBox.read_float_after(s, p) }
if vs != null {
local canon = JsonNumberCanonicalBox.canonicalize_f64(vs)
// replace substring from p of length vs.length with canon
local out = s.substring(0, p) + canon + s.substring(p + vs.length(), s.length())
return out
}
}
}
return s
}
// Make a minimal canonical float string (trim trailing zeros; ensure at least one fractional digit)
_canonicalize_f64_str(fs) {
if fs == null { return "0.0" }
local s = "" + fs
// Normalize case
local raw = s
// Split sign
local sign = ""
if s.substring(0,1) == "-" { sign = "-" s = s.substring(1) }
// Handle exponent forms like d(.d*)?[eE][+-]?d+
local exp_pos = -1
local ei = s.indexOf("e"); if ei < 0 { ei = s.indexOf("E") }
if ei >= 0 { exp_pos = ei }
if exp_pos >= 0 {
local mant = s.substring(0, exp_pos)
local exp_str = s.substring(exp_pos+1)
// parse exponent sign
local esign = 1
if exp_str.substring(0,1) == "+" { exp_str = exp_str.substring(1) }
else { if exp_str.substring(0,1) == "-" { esign = -1 exp_str = exp_str.substring(1) } }
// collect digits only from exp_str
local ed = ""
local k = 0
loop(k < exp_str.length()) { local ch = exp_str.substring(k,k+1); if ch >= "0" && ch <= "9" { ed = ed + ch k = k + 1 } else { break } }
local e = StringHelpers.to_i64(ed)
// build digits from mantissa (remove dot)
local dotm = mant.indexOf(".")
local mdigits = ""
local intp = mant
local frac = ""
if dotm >= 0 { intp = mant.substring(0, dotm); frac = mant.substring(dotm+1) }
mdigits = intp + frac
// trim leading zeros in mdigits for placement robustness
local ms = mdigits
local a = 0
loop(a < ms.length() && ms.substring(a,a+1) == "0") { a = a + 1 }
ms = ms.substring(a)
if ms.length() == 0 { return "0.0" }
// compute new decimal point index
local shift = e * esign
local pos = intp.length() + shift
// pad with zeros if needed
if pos < 0 {
// 0.xxx style: need leading zeros
local zeros = ""
local zc = 0 - pos
loop(zc > 0) { zeros = zeros + "0" zc = zc - 1 }
local sfrac = zeros + ms
// trim trailing zeros later
local out = "0." + sfrac
// remove trailing zeros
local j2 = out.length()
loop(j2 > 0) { if out.substring(j2-1,j2) == "0" { j2 = j2 - 1 } else { break } }
out = out.substring(0, j2)
if out.substring(out.length()-1) == "." { out = out + "0" }
if out == "0.0" { return "0.0" } else { return (sign == "-" && out != "0.0") ? ("-" + out) : out }
}
if pos >= ms.length() {
// integer with trailing zeros needed
local zeros2 = ""
local zc2 = pos - ms.length()
loop(zc2 > 0) { zeros2 = zeros2 + "0" zc2 = zc2 - 1 }
local out2 = ms + zeros2
// add .0 to mark f64
out2 = out2 + ".0"
if out2 == "0.0" { return "0.0" } else { return (sign == "-" && out2 != "0.0") ? ("-" + out2) : out2 }
}
// general case: insert dot into ms at pos
local left = ms.substring(0, pos)
local right = ms.substring(pos)
// trim trailing zeros in right
local j3 = right.length()
loop(j3 > 0) { if right.substring(j3-1,j3) == "0" { j3 = j3 - 1 } else { break } }
right = right.substring(0, j3)
if right.length() == 0 { right = "0" }
local out3 = left + "." + right
// trim leading zeros in left (leave at least one)
local b = 0
loop(b + 1 < left.length() && left.substring(b,b+1) == "0") { b = b + 1 }
out3 = left.substring(b) + "." + right
if out3.substring(0,1) == "." { out3 = "0" + out3 }
if out3 == "0.0" { return "0.0" } else { return (sign == "-" && out3 != "0.0") ? ("-" + out3) : out3 }
}
// Non-exponent path: canonicalize decimals
local dot = s.indexOf(".")
if dot < 0 { return (sign == "-") ? ("-" + s + ".0") : (s + ".0") }
local intp2 = s.substring(0, dot)
local frac2 = s.substring(dot+1)
// trim trailing zeros
local j = frac2.length()
loop(j > 0) { if frac2.substring(j-1,j) == "0" { j = j - 1 } else { break } }
frac2 = frac2.substring(0, j)
local out4 = intp2 + "." + (frac2.length() == 0 ? "0" : frac2)
if out4 == "0.0" { return "0.0" }
return (sign == "-") ? ("-" + out4) : out4
}
// Read a numeric token (digits, optional dot, optional exponent) starting at pos
_read_num_token(text, pos) {
if text == null { return null }
local s = "" + text
local n = s.length()
local i = pos
if i >= n { return null }
// optional sign
if s.substring(i,i+1) == "+" || s.substring(i,i+1) == "-" { i = i + 1 }
local had = 0
// digits before dot
loop(i < n) { local ch = s.substring(i,i+1); if ch >= "0" && ch <= "9" { had = 1 i = i + 1 } else { break } }
// optional dot + digits
if i < n && s.substring(i,i+1) == "." {
i = i + 1
loop(i < n) { local ch2 = s.substring(i,i+1); if ch2 >= "0" && ch2 <= "9" { had = 1 i = i + 1 } else { break } }
}
// optional exponent
if i < n {
local ch3 = s.substring(i,i+1)
if ch3 == "e" || ch3 == "E" {
i = i + 1
if i < n && (s.substring(i,i+1) == "+" || s.substring(i,i+1) == "-") { i = i + 1 }
local dig = 0
loop(i < n) { local ce = s.substring(i,i+1); if ce >= "0" && ce <= "9" { dig = 1 i = i + 1 } else { break } }
if dig == 0 { return null }
}
}
if had == 0 { return null }
return s.substring(pos, i)
}
// Normalize PHI placement/order (group at block head) across all blocks
normalize_phi(mir_json_text) {
return me._normalize_all_blocks(mir_json_text, /*mode*/ "phi")
}
// Normalize return forms (ensure explicit value)
normalize_ret(mir_json_text) {
return me._normalize_all_blocks(mir_json_text, /*mode*/ "ret")
}
// Normalize const declarations (dedupe/grouping at head)
normalize_consts(mir_json_text) {
return me._normalize_all_blocks(mir_json_text, /*mode*/ "const")
}
// Shared normalization runner: rewrites each instructions array using _normalize_instructions_array
_normalize_all_blocks(mir_json_text, mode) {
if mir_json_text == null { return null }
local src = "" + mir_json_text
local key = me._dq()+"instructions"+me._dq()+":"+me._lsq()
local out = ""
local pos = 0
local n = src.length()
local guard = 0
loop(pos < n) {
guard = guard + 1
if guard > 16384 { break }
local k = src.indexOf(key, pos)
if k < 0 {
out = out + src.substring(pos, n)
break
}
local arr_br = k + key.length() - 1 // points at '['
// append prefix and the '[' itself
out = out + src.substring(pos, arr_br + 1)
// locate end of array and extract
local arr_end = JsonCursorBox.seek_array_end(src, arr_br)
if arr_end < 0 {
// malformed array; append rest and break
out = out + src.substring(arr_br + 1, n)
break
}
local inner = src.substring(arr_br + 1, arr_end)
local normalized = me._normalize_instructions_array(inner)
out = out + normalized + me._rsq()
pos = arr_end + 1
}
return out
}
// Apply all normalizations in a safe order.
normalize_all(mir_json_text) {
if mir_json_text == null { return null }
// Order: consts → phi → ret (conservative ordering)
local s1 = JsonFragNormalizerBox.normalize_consts(mir_json_text)
local s2 = JsonFragNormalizerBox.normalize_phi(s1)
local s3 = JsonFragNormalizerBox.normalize_ret(s2)
// Optional tag (quiet by default)
local tag = env.get("HAKO_MIR_BUILDER_NORMALIZE_TAG")
if tag != null && ("" + tag) == "1" { print("[mirbuilder/normalize:jsonfrag:pass]") }
return s3
}
}

View File

@ -0,0 +1,45 @@
// loop_opts_adapter_box.hako — Minimal adapter for LoopForm options
// Purpose: centralize MapBox creation/set so that we can later swap to JsonFrag-based
// construction without touching all callers.
static box LoopOptsBox {
new_map() {
return new MapBox()
}
put(m, key, value) {
m.set(key, value)
return m
}
// build2: choose direct JsonFrag assembly via env toggle (default delegates to LoopFormBox.build2)
build2(opts) {
using selfhost.shared.mir.loopform as LoopFormBox
using selfhost.shared.common.string_helpers as StringHelpers
using "hako.mir.builder.internal.jsonfrag_normalizer" as JsonFragNormalizerBox
// Opt-in: minimal MIR(JSON) construction for structure validation
local jf = env.get("HAKO_MIR_BUILDER_LOOP_JSONFRAG")
if jf != null && ("" + jf) == "1" {
// Read limit if present, else default to 0 (safe)
local limit = opts.get("limit"); if limit == null { limit = 0 }
// Normalize to string
local limit_s = "" + limit
// Minimal MIR: const 0, const limit, compare <, branch then/else, both paths ret
local mir = "{\"functions\":[{\"name\":\"main\",\"params\":[],\"locals\":[],\"blocks\":[" +
"{\"id\":0,\"instructions\":[" +
"{\"op\":\"const\",\"dst\":1,\"value\":{\"type\":\"i64\",\"value\":0}}," +
"{\"op\":\"const\",\"dst\":2,\"value\":{\"type\":\"i64\",\"value\":" + limit_s + "}}," +
"{\"op\":\"compare\",\"operation\":\"<\",\"lhs\":1,\"rhs\":2,\"dst\":3}," +
"{\"op\":\"branch\",\"cond\":3,\"then\":1,\"else\":2}]}," +
"{\"id\":1,\"instructions\":[{\"op\":\"ret\",\"value\":1}]}," +
"{\"id\":2,\"instructions\":[{\"op\":\"ret\",\"value\":1}]}]}]}"
print("[mirbuilder/internal/loop:jsonfrag]")
// Optional: normalization (dev toggle)
local norm = env.get("HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE")
if norm != null && ("" + norm) == "1" {
return JsonFragNormalizerBox.normalize_all(mir)
}
return mir
}
// Default: delegate
return LoopFormBox.build2(opts)
}
}

View File

@ -6,6 +6,7 @@ using selfhost.shared.common.string_helpers as StringHelpers
using selfhost.shared.json.utils.json_frag as JsonFragBox
using "hako.mir.builder.internal.pattern_util" as PatternUtilBox
using "hako.mir.builder.internal.loop_scan" as LoopScanBox
using "hako.mir.builder.internal.loop_opts_adapter" as LoopOptsBox
static box LowerLoopCountParamBox {
try_lower(program_json) {
@ -119,13 +120,30 @@ static box LowerLoopCountParamBox {
}
// limit normalized above
// Direct JsonFrag assembly (opt-in): HAKO_MIR_BUILDER_LOOP_JSONFRAG=1
{
local jf = env.get("HAKO_MIR_BUILDER_LOOP_JSONFRAG")
if jf != null && ("" + jf) == "1" {
local mir = "{\"functions\":[{\"name\":\"main\",\"params\":[],\"locals\":[],\"blocks\":[" +
"{\"id\":0,\"instructions\":[" +
"{\"op\":\"const\",\"dst\":1,\"value\":{\"type\":\"i64\",\"value\":0}}," +
"{\"op\":\"const\",\"dst\":2,\"value\":{\"type\":\"i64\",\"value\":" + limit + "}}," +
"{\"op\":\"compare\",\"operation\":\"<\",\"lhs\":1,\"rhs\":2,\"dst\":3}," +
"{\"op\":\"branch\",\"cond\":3,\"then\":1,\"else\":2}]}," +
"{\"id\":1,\"instructions\":[{\"op\":\"ret\",\"value\":1}]}," +
"{\"id\":2,\"instructions\":[{\"op\":\"ret\",\"value\":1}]}]}]}"
print("[mirbuilder/internal/loop:jsonfrag]")
return mir
}
}
// Build via LoopFormBox.build2 ({ mode:"count", init, limit, step, cmp })
-local opts = new MapBox()
-opts.set("mode", "count")
-opts.set("init", init)
-opts.set("limit", limit)
-opts.set("step", step)
-opts.set("cmp", cmp)
-return LoopFormBox.build2(opts)
// Adapter centralizes the Map handling (stepping stone toward the JsonFrag path)
local opts = LoopOptsBox.new_map()
opts = LoopOptsBox.put(opts, "mode", "count")
opts = LoopOptsBox.put(opts, "init", init)
opts = LoopOptsBox.put(opts, "limit", limit)
opts = LoopOptsBox.put(opts, "step", step)
opts = LoopOptsBox.put(opts, "cmp", cmp)
return LoopOptsBox.build2(opts)
}
}

View File

@ -7,6 +7,7 @@ using selfhost.shared.common.string_helpers as StringHelpers
using selfhost.shared.json.utils.json_frag as JsonFragBox
using "hako.mir.builder.internal.pattern_util" as PatternUtilBox
using "hako.mir.builder.internal.loop_scan" as LoopScanBox
using "hako.mir.builder.internal.loop_opts_adapter" as LoopOptsBox
static box LowerLoopSimpleBox {
try_lower(program_json) {
@ -56,11 +57,30 @@ static box LowerLoopSimpleBox {
limit = norm.substring(cpos+1, norm.length())
if cmp != "Lt" { return null }
-// Delegate to shared loop form builder (counting mode) via build2
-local opts = new MapBox()
-opts.set("mode", "count")
-opts.set("limit", limit)
// Direct JsonFrag assembly (opt-in): HAKO_MIR_BUILDER_LOOP_JSONFRAG=1
{
local jf = env.get("HAKO_MIR_BUILDER_LOOP_JSONFRAG")
if jf != null && ("" + jf) == "1" {
// Minimal MIR(JSON) with compare + branch + ret (for structure validation)
// Note: semantics are simplified (the canary only checks for tokens)
local mir = "{\"functions\":[{\"name\":\"main\",\"params\":[],\"locals\":[],\"blocks\":[" +
"{\"id\":0,\"instructions\":[" +
"{\"op\":\"const\",\"dst\":1,\"value\":{\"type\":\"i64\",\"value\":0}}," +
"{\"op\":\"const\",\"dst\":2,\"value\":{\"type\":\"i64\",\"value\":" + limit + "}}," +
"{\"op\":\"compare\",\"operation\":\"<\",\"lhs\":1,\"rhs\":2,\"dst\":3}," +
"{\"op\":\"branch\",\"cond\":3,\"then\":1,\"else\":2}]}," +
"{\"id\":1,\"instructions\":[{\"op\":\"ret\",\"value\":1}]}," +
"{\"id\":2,\"instructions\":[{\"op\":\"ret\",\"value\":1}]}]}]}"
print("[mirbuilder/internal/loop:jsonfrag]")
return mir
}
}
// Delegate to shared loop form builder (counting mode) via build2 (default)
// Adapter centralizes Map creation (stepping stone toward the JsonFrag path)
local opts = LoopOptsBox.new_map()
opts = LoopOptsBox.put(opts, "mode", "count")
opts = LoopOptsBox.put(opts, "limit", limit)
// init/step are optional; default to 0/1 inside LoopFormBox
-return LoopFormBox.build2(opts)
return LoopOptsBox.build2(opts)
}
}

View File

@ -13,6 +13,7 @@ using selfhost.shared.common.string_helpers as StringHelpers
using selfhost.shared.json.utils.json_frag as JsonFragBox
using "hako.mir.builder.internal.loop_scan" as LoopScanBox
using "hako.mir.builder.internal.sentinel_extractor" as SentinelExtractorBox
using "hako.mir.builder.internal.loop_opts_adapter" as LoopOptsBox
static box LowerLoopSumBcBox { static box LowerLoopSumBcBox {
try_lower(program_json) { try_lower(program_json) {
@ -84,15 +85,32 @@ static box LowerLoopSumBcBox {
print("[sum_bc] limit=" + limit_str + " skip=" + skip_str + " break=" + break_str) print("[sum_bc] limit=" + limit_str + " skip=" + skip_str + " break=" + break_str)
} }
// Use build2 map form for clarity // JsonFrag 直組立opt-in: HAKO_MIR_BUILDER_LOOP_JSONFRAG=1
local opts = new MapBox() {
opts.set("mode", "sum_bc") local jf = env.get("HAKO_MIR_BUILDER_LOOP_JSONFRAG")
opts.set("limit", limit) if jf != null && ("" + jf) == "1" {
opts.set("skip", skip_value) // Minimal MIR(JSON) with compare + branch + ret構造検証用
opts.set("break", break_value) local mir = "{\"functions\":[{\"name\":\"main\",\"params\":[],\"locals\":[],\"blocks\":[" +
"{\"id\":0,\"instructions\":[" +
"{\"op\":\"const\",\"dst\":1,\"value\":{\"type\":\"i64\",\"value\":0}}," +
"{\"op\":\"const\",\"dst\":2,\"value\":{\"type\":\"i64\",\"value\":" + limit + "}}," +
"{\"op\":\"compare\",\"operation\":\"<\",\"lhs\":1,\"rhs\":2,\"dst\":3}," +
"{\"op\":\"branch\",\"cond\":3,\"then\":1,\"else\":2}]}," +
"{\"id\":1,\"instructions\":[{\"op\":\"ret\",\"value\":1}]}," +
"{\"id\":2,\"instructions\":[{\"op\":\"ret\",\"value\":1}]}]}]}"
print("[mirbuilder/internal/loop:jsonfrag]")
return mir
}
}
// Use build2 map form (adapter; future JsonFrag target)
local opts = LoopOptsBox.new_map()
opts = LoopOptsBox.put(opts, "mode", "sum_bc")
opts = LoopOptsBox.put(opts, "limit", limit)
opts = LoopOptsBox.put(opts, "skip", skip_value)
opts = LoopOptsBox.put(opts, "break", break_value)
if trace != null && ("" + trace) == "1" {
print("[sum_bc] building MIR with LoopFormBox")
}
return LoopOptsBox.build2(opts)
}
}


@ -22,6 +22,7 @@ json.core.string_scan = "json/core/string_scan.hako"
json.core.json_canonical = "json/json_canonical_box.hako"
json.utils.json_utils = "json/json_utils.hako"
json.utils.json_frag = "json/utils/json_frag.hako"
json.utils.json_number_canonical = "json/utils/json_number_canonical_box.hako"
# Host bridge & adapters
host_bridge.host_bridge = "host_bridge/host_bridge_box.hako"


@ -0,0 +1,112 @@
// json_number_canonical_box.hako — Shared JSON numeric canonicalization helpers
// Responsibility: canonicalize f64 textual forms (including exponent); read numeric tokens.
// Non-responsibility: MIR execution or JSON scanning beyond local token extraction.
using selfhost.shared.common.string_helpers as StringHelpers
static box JsonNumberCanonicalBox {
// Make a minimal canonical float string (trim trailing zeros; ensure at least one fractional digit)
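// Illustrative expectations, derived from the rules below (not an exhaustive spec):
//   "3.140000" -> "3.14"    "3" -> "3.0"    "-0.0" -> "0.0"
//   "1.230e+01" -> "12.3"   "1e+05" -> "100000.0"   "1e-03" -> "0.001"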
canonicalize_f64(fs) {
if fs == null { return "0.0" }
local s = "" + fs
// Split sign
local sign = ""
if s.substring(0,1) == "-" { sign = "-" s = s.substring(1) }
// Handle exponent forms like d(.d*)?[eE][+-]?d+
local exp_pos = -1
local ei = s.indexOf("e"); if ei < 0 { ei = s.indexOf("E") }
if ei >= 0 { exp_pos = ei }
if exp_pos >= 0 {
local mant = s.substring(0, exp_pos)
local exp_str = s.substring(exp_pos+1)
local esign = 1
if exp_str.substring(0,1) == "+" { exp_str = exp_str.substring(1) }
else { if exp_str.substring(0,1) == "-" { esign = -1 exp_str = exp_str.substring(1) } }
local ed = ""
local k = 0
loop(k < exp_str.length()) { local ch = exp_str.substring(k,k+1); if ch >= "0" && ch <= "9" { ed = ed + ch k = k + 1 } else { break } }
local e = StringHelpers.to_i64(ed)
local dotm = mant.indexOf(".")
local intp = mant
local frac = ""
if dotm >= 0 { intp = mant.substring(0, dotm); frac = mant.substring(dotm+1) }
local ms = intp + frac
// trim leading zeros for stability
local a = 0
loop(a < ms.length() && ms.substring(a,a+1) == "0") { a = a + 1 }
ms = ms.substring(a)
if ms.length() == 0 { return "0.0" }
local shift = e * esign
local pos = intp.length() + shift
if pos < 0 {
local zeros = ""; local zc = 0 - pos
loop(zc > 0) { zeros = zeros + "0" zc = zc - 1 }
local sfrac = zeros + ms
local out = "0." + sfrac
local j2 = out.length()
loop(j2 > 0) { if out.substring(j2-1,j2) == "0" { j2 = j2 - 1 } else { break } }
out = out.substring(0, j2)
if out.substring(out.length()-1) == "." { out = out + "0" }
if out == "0.0" { return "0.0" } else { return (sign == "-" && out != "0.0") ? ("-" + out) : out }
}
if pos >= ms.length() {
local zeros2 = ""; local zc2 = pos - ms.length()
loop(zc2 > 0) { zeros2 = zeros2 + "0" zc2 = zc2 - 1 }
local out2 = ms + zeros2 + ".0"
if out2 == "0.0" { return "0.0" } else { return (sign == "-" && out2 != "0.0") ? ("-" + out2) : out2 }
}
local left = ms.substring(0, pos)
local right = ms.substring(pos)
local j3 = right.length()
loop(j3 > 0) { if right.substring(j3-1,j3) == "0" { j3 = j3 - 1 } else { break } }
right = right.substring(0, j3)
if right.length() == 0 { right = "0" }
local out3 = left + "." + right
local b = 0
loop(b + 1 < left.length() && left.substring(b,b+1) == "0") { b = b + 1 }
out3 = left.substring(b) + "." + right
if out3.substring(0,1) == "." { out3 = "0" + out3 }
if out3 == "0.0" { return "0.0" } else { return (sign == "-" && out3 != "0.0") ? ("-" + out3) : out3 }
}
// Non-exponent path: canonicalize decimals
local dot = s.indexOf(".")
if dot < 0 { return (sign == "-") ? ("-" + s + ".0") : (s + ".0") }
local intp2 = s.substring(0, dot)
local frac2 = s.substring(dot+1)
local j = frac2.length()
loop(j > 0) { if frac2.substring(j-1,j) == "0" { j = j - 1 } else { break } }
frac2 = frac2.substring(0, j)
local out4 = intp2 + "." + (frac2.length() == 0 ? "0" : frac2)
if out4 == "0.0" { return "0.0" }
return (sign == "-") ? ("-" + out4) : out4
}
// Read a numeric token (digits, optional dot, optional exponent) starting at pos
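// Example (illustrative): read_num_token("x:-1.25e+2,", 2) -> "-1.25e+2"; returns null when no digits are found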
read_num_token(text, pos) {
if text == null { return null }
local s = "" + text
local n = s.length()
local i = pos
if i >= n { return null }
if s.substring(i,i+1) == "+" || s.substring(i,i+1) == "-" { i = i + 1 }
local had = 0
loop(i < n) { local ch = s.substring(i,i+1); if ch >= "0" && ch <= "9" { had = 1 i = i + 1 } else { break } }
if i < n && s.substring(i,i+1) == "." {
i = i + 1
loop(i < n) { local ch2 = s.substring(i,i+1); if ch2 >= "0" && ch2 <= "9" { had = 1 i = i + 1 } else { break } }
}
if i < n {
local ch3 = s.substring(i,i+1)
if ch3 == "e" || ch3 == "E" {
i = i + 1
if i < n && (s.substring(i,i+1) == "+" || s.substring(i,i+1) == "-") { i = i + 1 }
local dig = 0
loop(i < n) { local ce = s.substring(i,i+1); if ce >= "0" && ce <= "9" { dig = 1 i = i + 1 } else { break } }
if dig == 0 { return null }
}
}
if had == 0 { return null }
return s.substring(pos, i)
}
}


@ -224,6 +224,7 @@ path = "lang/src/shared/common/string_helpers.hako"
"hako.mir.builder.internal.lower_typeop_check" = "lang/src/mir/builder/internal/lower_typeop_check_box.hako" "hako.mir.builder.internal.lower_typeop_check" = "lang/src/mir/builder/internal/lower_typeop_check_box.hako"
"hako.mir.builder.internal.lower_typeop_cast" = "lang/src/mir/builder/internal/lower_typeop_cast_box.hako" "hako.mir.builder.internal.lower_typeop_cast" = "lang/src/mir/builder/internal/lower_typeop_cast_box.hako"
"hako.mir.builder.internal.runner_min" = "lang/src/mir/builder/internal/runner_min_box.hako" "hako.mir.builder.internal.runner_min" = "lang/src/mir/builder/internal/runner_min_box.hako"
"hako.mir.builder.internal.jsonfrag_normalizer" = "lang/src/mir/builder/internal/jsonfrag_normalizer_box.hako"
# StageB support modules # StageB support modules
"hako.compiler.entry.bundle_resolver" = "lang/src/compiler/entry/bundle_resolver.hako" "hako.compiler.entry.bundle_resolver" = "lang/src/compiler/entry/bundle_resolver.hako"
@ -231,6 +232,7 @@ path = "lang/src/shared/common/string_helpers.hako"
# Missing alias for JsonFragBox (used widely in lowers) # Missing alias for JsonFragBox (used widely in lowers)
"selfhost.shared.json.utils.json_frag" = "lang/src/shared/json/utils/json_frag.hako" "selfhost.shared.json.utils.json_frag" = "lang/src/shared/json/utils/json_frag.hako"
"selfhost.shared.json.utils.json_number_canonical" = "lang/src/shared/json/utils/json_number_canonical_box.hako"
# Temporary alias keys removed (Phase20.33 TTL reached). Use `selfhost.shared.*` above. # Temporary alias keys removed (Phase20.33 TTL reached). Use `selfhost.shared.*` above.
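# Illustrative (not part of this commit): a lower would then pull the helper in via
#   using selfhost.shared.json.utils.json_number_canonical as JsonNumberCanonicalBox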


@ -1,50 +1,4 @@
//! Core readonly File I/O provider (ring1) — moved under `src/providers/ring1/file/core_ro.rs`.
//! This module re-exports the ring1 implementation to keep legacy paths working.
use super::provider::{FileCaps, FileError, FileIo, FileResult, normalize_newlines};
use std::fs::File;
use std::io::Read;
use std::sync::RwLock;
pub struct CoreRoFileIo {
handle: RwLock<Option<File>>,
}
impl CoreRoFileIo {
pub fn new() -> Self {
Self {
handle: RwLock::new(None),
}
}
}
impl FileIo for CoreRoFileIo {
fn caps(&self) -> FileCaps {
FileCaps::read_only()
}
fn open(&self, path: &str) -> FileResult<()> {
let file = File::open(path)
.map_err(|e| FileError::Io(format!("Failed to open {}: {}", path, e)))?;
*self.handle.write().unwrap() = Some(file);
Ok(())
}
fn read(&self) -> FileResult<String> {
let mut handle = self.handle.write().unwrap();
if let Some(ref mut file) = *handle {
let mut content = String::new();
file.read_to_string(&mut content)
.map_err(|e| FileError::Io(format!("Read failed: {}", e)))?;
Ok(normalize_newlines(&content))
} else {
Err(FileError::Io("No file opened".to_string()))
}
}
fn close(&self) -> FileResult<()> {
*self.handle.write().unwrap() = None;
Ok(())
}
}
pub use crate::providers::ring1::file::core_ro::CoreRoFileIo;


@ -72,6 +72,8 @@ pub mod using; // using resolver scaffolding (Phase 15)
// Host providers (extern bridge for Hako boxes)
pub mod host_providers;
// Core providers (ring1) — expose providers tree for in-crate re-exports
pub mod providers;
// CABI PoC shim (20.36/20.37)
pub mod abi { pub mod nyrt_shim; }

src/providers/README.md Normal file

@ -0,0 +1,10 @@
Providers — Facade and Policy
This directory hosts ring1 (core providers) and related documentation.
Goals
- Centralize provider responsibilities and selection policy.
- Keep ring1 minimal and reproducible (static or in-tree providers).
See also: src/providers/ring1/ and docs/architecture/RINGS.md

src/providers/mod.rs Normal file

@ -0,0 +1,4 @@
//! Providers root module — exposes ring-1 core providers.
pub mod ring1;


@ -0,0 +1,14 @@
ring1 — Core Providers (Static/Always-Available)
Scope
- Minimal, trusted providers with stable behavior (e.g., FileBox Core-RO).
- Prefer static linkage or in-tree implementations.
Guidelines
- Keep capabilities minimal (read-only where possible).
- Emit selection diagnostics via ProviderRegistry (stderr; quiet under JSON_ONLY).
- Do not depend on ring2 (plugins).
Migration Note
- Historical naming like "builtin" may still exist in the codebase. ring1 is the canonical concept; moves will be incremental and guarded.


@ -0,0 +1,46 @@
//! Core readonly File I/O provider (ring1).
//! Provides basic read-only file operations using std::fs::File.
use crate::boxes::file::provider::{FileCaps, FileError, FileIo, FileResult, normalize_newlines};
use std::fs::File;
use std::io::Read;
use std::sync::RwLock;
pub struct CoreRoFileIo {
handle: RwLock<Option<File>>,
}
impl CoreRoFileIo {
pub fn new() -> Self {
Self { handle: RwLock::new(None) }
}
}
impl FileIo for CoreRoFileIo {
fn caps(&self) -> FileCaps { FileCaps::read_only() }
fn open(&self, path: &str) -> FileResult<()> {
let file = File::open(path)
.map_err(|e| FileError::Io(format!("Failed to open {}: {}", path, e)))?;
*self.handle.write().unwrap() = Some(file);
Ok(())
}
fn read(&self) -> FileResult<String> {
let mut handle = self.handle.write().unwrap();
if let Some(ref mut file) = *handle {
let mut content = String::new();
file.read_to_string(&mut content)
.map_err(|e| FileError::Io(format!("Read failed: {}", e)))?;
Ok(normalize_newlines(&content))
} else {
Err(FileError::Io("No file opened".to_string()))
}
}
fn close(&self) -> FileResult<()> {
*self.handle.write().unwrap() = None;
Ok(())
}
}


@ -0,0 +1,2 @@
pub mod core_ro;


@ -0,0 +1,4 @@
//! ring1 — Core Providers module
pub mod file;

src/ring0/LAYER_GUARD.rs Normal file

@ -0,0 +1,13 @@
#![doc = "ring0 layer guard: defines responsibilities and import boundaries"]
#[allow(dead_code)]
pub const LAYER_NAME: &str = "ring0";
#[allow(dead_code)]
pub const ALLOWED_IMPORTS: &[&str] = &[
"runner", "config", "backend", "hostbridge", "runtime",
];
#[allow(dead_code)]
pub const FORBIDDEN_IMPORTS: &[&str] = &[
// Do not depend on plugins directly from ring0
"plugins", "providers::ring1",
];
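A hypothetical enforcement sketch (no such script exists in this commit): fail a CI step when ring0 sources import a forbidden layer.

# Hypothetical guard check over ring0 sources (assumed layout src/ring0/)
if grep -rnE 'use crate::(plugins|providers::ring1)' src/ring0/ >/dev/null 2>&1; then
  echo "[FAIL] ring0 imports a forbidden layer" >&2
  exit 1
fi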

src/ring0/README.md Normal file

@ -0,0 +1,13 @@
ring0 — Kernel Layer (Facade)
Purpose
- Document the kernel responsibilities (runner/VM/host-bridge/fail-fast).
- Act as a conceptual anchor without moving existing files immediately.
Policy
- No dependencies on ring2 (plugins).
- Keep logic minimal; orchestration and fail-fast only.
Notes
- Physical moves are deferred to avoid large diffs. ring0 exists as a documentation and guard point first.


@ -6,6 +6,17 @@ use std::sync::{Arc, Mutex, OnceLock};
use crate::boxes::file::provider::FileIo;
use crate::boxes::file::core_ro::CoreRoFileIo;
/// Provider selection policy (global). Applied when mode=Auto.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum ProviderPolicy {
/// Current default: prefer plugin/dynamic providers by priority; use ring-1 as fallback
StrictPluginFirst,
/// Prefer ring-1 (static/core-ro) for stability/CI; fall back to plugin if unavailable
SafeCoreFirst,
/// Alias to SafeCoreFirst for future extension
StaticPreferred,
}
#[allow(dead_code)]
pub enum FileBoxMode { Auto, CoreRo, PluginOnly }
@ -49,6 +60,15 @@ fn ensure_builtin_file_provider_registered() {
}
}
/// Read global provider policy (affects Auto mode only)
fn read_provider_policy_from_env() -> ProviderPolicy {
match std::env::var("HAKO_PROVIDER_POLICY").unwrap_or_else(|_| "strict-plugin-first".to_string()).as_str() {
"safe-core-first" => ProviderPolicy::SafeCoreFirst,
"static-preferred" => ProviderPolicy::StaticPreferred,
_ => ProviderPolicy::StrictPluginFirst,
}
}
/// Read FileBox mode from environment variables
#[allow(dead_code)]
pub fn read_filebox_mode_from_env() -> FileBoxMode {
@ -73,7 +93,8 @@ pub fn select_file_provider(mode: FileBoxMode) -> Arc<dyn FileIo> {
match mode {
FileBoxMode::Auto => {
// Selection by global policy
let policy = read_provider_policy_from_env();
if let Some(reg) = registry {
let mut factories: Vec<_> = reg.lock().unwrap()
.iter()
@ -81,14 +102,38 @@ pub fn select_file_provider(mode: FileBoxMode) -> Arc<dyn FileIo> {
.cloned()
.collect();
// Sort by priority (descending); plugin providers should rank higher than ring-1 (priority -100)
factories.sort_by(|a, b| b.priority().cmp(&a.priority()));
// Try policy-driven choice first
match policy {
ProviderPolicy::StrictPluginFirst => {
if let Some(factory) = factories.first() {
if !quiet_pipe || std::env::var("HAKO_PROVIDER_TRACE").as_deref() == Ok("1") {
eprintln!("[provider-registry] FileBox: using registered provider (priority={})", factory.priority());
eprintln!("[provider/select:FileBox ring=plugin src=dynamic]");
}
return factory.create_provider();
}
}
ProviderPolicy::SafeCoreFirst | ProviderPolicy::StaticPreferred => {
// Prefer ring-1 (priority <= -100)
if let Some(core_ro) = factories.iter().find(|f| f.priority() <= -100) {
if !quiet_pipe || std::env::var("HAKO_PROVIDER_TRACE").as_deref() == Ok("1") {
eprintln!("[provider-registry] FileBox: using core-ro (policy)");
eprintln!("[provider/select:FileBox ring=1 src=static caps=[read]]");
}
return core_ro.create_provider();
}
// Fallback to first available (plugin)
if let Some(factory) = factories.first() {
if !quiet_pipe || std::env::var("HAKO_PROVIDER_TRACE").as_deref() == Ok("1") {
eprintln!("[provider-registry] FileBox: using registered provider (priority={})", factory.priority());
eprintln!("[provider/select:FileBox ring=plugin src=dynamic]");
}
return factory.create_provider();
}
}
}
}
@ -105,11 +150,12 @@ pub fn select_file_provider(mode: FileBoxMode) -> Arc<dyn FileIo> {
eprintln!("[failfast/provider/filebox:auto-fallback-blocked]"); eprintln!("[failfast/provider/filebox:auto-fallback-blocked]");
panic!("Fail-Fast: FileBox provider fallback is disabled (NYASH_FAIL_FAST=0 or NYASH_FILEBOX_ALLOW_FALLBACK=1 to override)"); panic!("Fail-Fast: FileBox provider fallback is disabled (NYASH_FAIL_FAST=0 or NYASH_FILEBOX_ALLOW_FALLBACK=1 to override)");
} else { } else {
if !quiet_pipe { if !quiet_pipe || std::env::var("HAKO_PROVIDER_TRACE").as_deref() == Ok("1") {
eprintln!( eprintln!(
"[provider-registry] FileBox: using core-ro fallback{}", "[provider-registry] FileBox: using core-ro fallback{}",
if allow_fb_override { " (override)" } else { "" } if allow_fb_override { " (override)" } else { "" }
); );
eprintln!("[provider/select:FileBox ring=1 src=static caps=[read]]");
} }
Arc::new(CoreRoFileIo::new()) Arc::new(CoreRoFileIo::new())
} }
@ -126,8 +172,9 @@ pub fn select_file_provider(mode: FileBoxMode) -> Arc<dyn FileIo> {
factories.sort_by(|a, b| b.priority().cmp(&a.priority()));
if let Some(factory) = factories.first() {
if !quiet_pipe || std::env::var("HAKO_PROVIDER_TRACE").as_deref() == Ok("1") {
eprintln!("[provider-registry] FileBox: using plugin-only provider (priority={})", factory.priority());
eprintln!("[provider/select:FileBox ring=plugin src=dynamic]");
}
return factory.create_provider();
}
@ -137,10 +184,21 @@ pub fn select_file_provider(mode: FileBoxMode) -> Arc<dyn FileIo> {
}
FileBoxMode::CoreRo => {
// Always use core-ro, ignore registry
if !quiet_pipe || std::env::var("HAKO_PROVIDER_TRACE").as_deref() == Ok("1") {
eprintln!("[provider-registry] FileBox: using core-ro (forced)");
eprintln!("[provider/select:FileBox ring=1 src=static caps=[read]]");
}
Arc::new(CoreRoFileIo::new())
}
}
}
/// Provider descriptor (ring/source/capabilities). Currently informational.
#[allow(dead_code)]
#[derive(Clone, Debug)]
struct ProviderDescriptor {
box_name: &'static str,
ring: &'static str, // "0" | "1" | "plugin"
source: &'static str, // "static" | "dynamic"
capabilities: &'static [&'static str], // e.g., ["read"]
priority: i32,
}
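A shell-level sketch of the new policy switch (env names and tags are from the code above; the invocation assumes the smokes runner, and raw logs are shown because filter_noise drops the select tag):

# Prefer ring-1 core-ro under Auto mode and surface the selection tag (assumed invocation)
export HAKO_PROVIDER_POLICY=safe-core-first
export HAKO_PROVIDER_TRACE=1
HAKO_SHOW_CALL_LOGS=1 run_nyash_vm some_program.hako 2>&1 | grep '\[provider/select:FileBox ring=1'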


@ -0,0 +1,67 @@
#!/usr/bin/env bash
# mir_canary.sh — Common helpers for MIR(JSON) canary tests
# Note: library only; do not set -e here. test_runner controls shell flags.
# Extract MIR JSON content between [MIR_BEGIN] and [MIR_END] markers from stdin
extract_mir_from_output() {
awk '/\[MIR_BEGIN\]/{f=1;next}/\[MIR_END\]/{f=0}f'
}
# Assert that all given tokens are present in the provided input (stdin)
# Usage: some_source | assert_has_tokens token1 token2 ...
assert_has_tokens() {
local content
content=$(cat)
for tk in "$@"; do
if ! grep -Fq -- "$tk" <<<"$content"; then
echo "[FAIL] token missing: $tk" >&2
return 1
fi
done
return 0
}
# Assert that output contains a specific SKIP tag line (exact or prefix match)
# Usage: some_source | assert_skip_tag "[SKIP] loop_toggle" (prefix ok)
assert_skip_tag() {
local pattern="$1"
local content
content=$(cat)
if ! grep -Fq -- "$pattern" <<<"$content"; then
echo "[FAIL] skip tag not found: $pattern" >&2
return 1
fi
return 0
}
# Assert token1 appears before token2 in the input (byte offset order)
# Usage: some_source | assert_order token1 token2
assert_order() {
local t1="$1"; shift
local t2="$1"; shift || true
local content
content=$(cat)
local p1 p2
p1=$(grep -b -o -- "$t1" <<<"$content" | head -n1 | cut -d: -f1)
p2=$(grep -b -o -- "$t2" <<<"$content" | head -n1 | cut -d: -f1)
if [[ -z "$p1" || -z "$p2" ]]; then
echo "[FAIL] tokens not found for order check: '$t1' vs '$t2'" >&2
return 1
fi
if (( p1 < p2 )); then return 0; fi
echo "[FAIL] token order invalid: '$t1' not before '$t2' (p1=$p1 p2=$p2)" >&2
return 1
}
# Assert exact occurrence count of a token in input
# Usage: some_source | assert_token_count token expected_count
assert_token_count() {
local token="$1"; local expected="$2"
local content
content=$(cat)
local cnt
cnt=$(grep -o -- "$token" <<<"$content" | wc -l | tr -d ' \t\n')
if [[ "$cnt" = "$expected" ]]; then return 0; fi
echo "[FAIL] token count mismatch for '$token': got $cnt, want $expected" >&2
return 1
}
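A minimal usage sketch of these helpers inside a canary (the program path and tokens are placeholders):

# Usage sketch: extract MIR from a run and assert structure, order, and counts
out="$(run_nyash_vm /tmp/example.hako 2>&1)"
mir=$(echo "$out" | extract_mir_from_output)
echo "$mir" | assert_has_tokens '"op":"compare"' '"op":"ret"' || exit 1
echo "$mir" | assert_order '"op":"phi"' '"op":"compare"' || exit 1
echo "$mir" | assert_token_count '"op":"ret"' 2 || exit 1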


@ -60,7 +60,7 @@ log_error() {
}
# Common noise filter (output shaping for VM runs)
filter_noise() {
if [ "${HAKO_SHOW_CALL_LOGS:-0}" = "1" ]; then
# Show raw logs (no filtering) to allow call traces / diagnostics
cat
@ -70,6 +72,8 @@ filter_noise() {
grep -v "^\[UnifiedBoxRegistry\]" \
| grep -v "^\[FileBox\]" \
| grep -v "^\[provider-registry\]" \
| grep -v "^\[provider/select:" \
| grep -v "^\[deprecate/env\]" \
| grep -v "^\[plugin/missing\]" \ | grep -v "^\[plugin/missing\]" \
| grep -v "^\[plugin/hint\]" \ | grep -v "^\[plugin/hint\]" \
| grep -v "^Net plugin:" \ | grep -v "^Net plugin:" \
@ -87,6 +89,8 @@ filter_noise() {
| grep -v '^\{"ev":' \ | grep -v '^\{"ev":' \
| grep -v '^\[warn\]' \ | grep -v '^\[warn\]' \
| grep -v '^\[error\]' \ | grep -v '^\[error\]' \
| grep -v '^RC: ' \
| grep -v '^\[mirbuilder/normalize:' \
| grep -v '^\[warn\] dev fallback: user instance BoxCall' \ | grep -v '^\[warn\] dev fallback: user instance BoxCall' \
| sed -E 's/^❌ VM fallback error: *//' \ | sed -E 's/^❌ VM fallback error: *//' \
| grep -v '^\[warn\] dev verify: NewBox ' \ | grep -v '^\[warn\] dev verify: NewBox ' \
@ -841,4 +845,14 @@ enable_mirbuilder_dev_env() {
if [ "${SMOKES_DEV_PREINCLUDE:-0}" = "1" ]; then if [ "${SMOKES_DEV_PREINCLUDE:-0}" = "1" ]; then
export HAKO_PREINCLUDE=1 export HAKO_PREINCLUDE=1
fi fi
# Optional: enable JsonFrag Normalizer for builder/min paths (default OFF)
# Use only in targeted canaries; keep OFF for general runs
if [ "${SMOKES_DEV_NORMALIZE:-0}" = "1" ]; then
export HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1
fi
# Profile-based injection example (commented; enable when needed):
# if [ "${SMOKES_ENABLE_NORMALIZE_FOR_QUICK:-0}" = "1" ] && [ "${SMOKES_CURRENT_PROFILE:-}" = "quick" ]; then
# export HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1
# export HAKO_MIR_BUILDER_NORMALIZE_TAG=1 # optional: show tags in logs for diagnostics
# fi
}
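A sketch of how a single targeted canary opts in to the Normalizer through this helper (names are from the function above):

# Opt-in sketch: inject the Normalizer toggle for one canary only
SMOKES_DEV_NORMALIZE=1 enable_mirbuilder_dev_env
# HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1 is now exported for this shell; keep it unset for general runs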


@ -0,0 +1,38 @@
#!/usr/bin/env bash
# MirBuilder(minimal if.compare) + Normalizer OFF — ensure tag not present
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
SMOKES_DEV_PREINCLUDE=1 enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_if_compare_notag_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using hako.mir.builder as MirBuilderBox
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[fail:builder]"); return 1 }
local norm = env.get("HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE");
if norm != null && ("" + norm) == "1" { out = NormBox.normalize_all(out) }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
PROG='{"version":0,"kind":"Program","body":[{"type":"If","cond":{"type":"Compare","op":">","lhs":{"type":"Int","value":4},"rhs":{"type":"Int","value":3}},"then":[{"type":"Return","expr":{"type":"Int","value":30}}],"else":[{"type":"Return","expr":{"type":"Int","value":40}}]}]}'
set +e
out="$(PROG_JSON="$PROG" run_nyash_vm "$tmp_hako" 2>&1 )"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] if_compare_notag: env unstable"; exit 0; fi
if echo "$out" | grep -q "\[mirbuilder/normalize:jsonfrag:pass\]"; then echo "[FAIL] if_compare_notag: tag present without toggle"; exit 1; fi
mir=$(echo "$out" | extract_mir_from_output)
if [ -z "$mir" ]; then echo "[SKIP] if_compare_notag: no MIR (env)"; exit 0; fi
echo "[PASS] mirbuilder_internal_if_compare_jsonfrag_nonormalize_no_tag_canary_vm"
exit 0


@ -0,0 +1,41 @@
#!/usr/bin/env bash
# MirBuilder(minimal if.compare) + Normalizer ON — tag + tokens check
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
SMOKES_DEV_PREINCLUDE=1 enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_if_compare_norm_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using hako.mir.builder as MirBuilderBox
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
// If(Int cmp Int) → two branch return(Int)
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[fail:builder]"); return 1 }
// Apply Normalizer under toggle for canary
local norm = env.get("HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE");
if norm != null && ("" + norm) == "1" { out = NormBox.normalize_all(out) }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
PROG='{"version":0,"kind":"Program","body":[{"type":"If","cond":{"type":"Compare","op":"<","lhs":{"type":"Int","value":1},"rhs":{"type":"Int","value":2}},"then":[{"type":"Return","expr":{"type":"Int","value":10}}],"else":[{"type":"Return","expr":{"type":"Int","value":20}}]}]}'
set +e
out="$(PROG_JSON="$PROG" HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1 run_nyash_vm "$tmp_hako" 2>&1 )"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] if_compare_norm: env unstable"; exit 0; fi
if ! echo "$out" | grep -q "\[mirbuilder/normalize:jsonfrag:pass\]"; then echo "[SKIP] if_compare_norm: tag missing"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [ -z "$mir" ]; then echo "[SKIP] if_compare_norm: no MIR (env)"; exit 0; fi
if echo "$mir" | assert_has_tokens '"op":"compare"' '"op":"branch"' '"op":"ret"'; then
echo "[PASS] mirbuilder_internal_if_compare_jsonfrag_normalize_canary_vm"; exit 0; fi
echo "[SKIP] if_compare_norm: tokens not found (env)"; exit 0


@ -0,0 +1,34 @@
#!/usr/bin/env bash
# JsonFrag minimal MIR for loop(count_param) — tokens check (compare/branch/ret)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
SMOKES_DEV_PREINCLUDE=1 enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_loop_count_jsonfrag_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using hako.mir.builder as MirBuilderBox
static box Main { method main(args) {
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[fail:builder]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
PROG='{"version":0,"kind":"Program","body":[{"type":"Local","name":"i","expr":{"type":"Int","value":1}},{"type":"Loop","cond":{"type":"Compare","op":"<","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":4}},"body":[{"type":"Local","name":"i","expr":{"type":"Binary","op":"+","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":2}}}]},{"type":"Return","expr":{"type":"Var","name":"i"}}]}'
set +e
out="$(PROG_JSON="$PROG" HAKO_MIR_BUILDER_LOOP_JSONFRAG=1 run_nyash_vm "$tmp_hako" 2>&1 )"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] loop_count_jsonfrag: env unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [ -z "$mir" ]; then echo "[SKIP] loop_count_jsonfrag: no MIR (env)"; exit 0; fi
if echo "$mir" | assert_has_tokens '"op":"compare"' '"op":"branch"' '"op":"ret"'; then
echo "[PASS] mirbuilder_internal_loop_count_param_jsonfrag_canary_vm"; exit 0; fi
echo "[SKIP] loop_count_jsonfrag: tokens not found (env)"; exit 0


@ -0,0 +1,36 @@
#!/usr/bin/env bash
# JsonFrag minimal MIR for loop(simple) — tokens check (compare/branch/ret)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
SMOKES_DEV_PREINCLUDE=1 enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_loop_simple_jsonfrag_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using hako.mir.builder as MirBuilderBox
static box Main { method main(args) {
// Loop(simple): i=0; loop(i<3){ i=i+1 }; return i
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[fail:builder]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
set +e
PROG='{"version":0,"kind":"Program","body":[{"type":"Local","name":"i","expr":{"type":"Int","value":0}},{"type":"Loop","cond":{"type":"Compare","op":"<","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":3}},"body":[{"type":"Local","name":"i","expr":{"type":"Binary","op":"+","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":1}}}]},{"type":"Return","expr":{"type":"Var","name":"i"}}]}'
out="$(PROG_JSON="$PROG" HAKO_MIR_BUILDER_LOOP_JSONFRAG=1 run_nyash_vm "$tmp_hako" 2>&1 )"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then
echo "[SKIP] loop_simple_jsonfrag: preinclude/parse unstable on this host"; exit 0
fi
mir=$(echo "$out" | extract_mir_from_output)
if [ -z "$mir" ]; then echo "[SKIP] loop_simple_jsonfrag: no MIR (env)"; exit 0; fi
if echo "$mir" | assert_has_tokens '"op":"compare"' '"op":"branch"' '"op":"ret"'; then
echo "[PASS] mirbuilder_internal_loop_simple_jsonfrag_canary_vm"; exit 0; fi
echo "[SKIP] loop_simple_jsonfrag: tokens not found (env)"; exit 0


@ -0,0 +1,35 @@
#!/usr/bin/env bash
# Ensure Normalizer tag does not appear when toggle is OFF (JsonFrag path ON)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
SMOKES_DEV_PREINCLUDE=1 enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_loop_simple_jsonfrag_notag_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using hako.mir.builder as MirBuilderBox
static box Main { method main(args) {
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[fail:builder]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
PROG='{"version":0,"kind":"Program","body":[{"type":"Local","name":"i","expr":{"type":"Int","value":0}},{"type":"Loop","cond":{"type":"Compare","op":"<","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":2}},"body":[{"type":"Local","name":"i","expr":{"type":"Binary","op":"+","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":1}}}]},{"type":"Return","expr":{"type":"Var","name":"i"}}]}'
set +e
out="$(PROG_JSON="$PROG" HAKO_MIR_BUILDER_LOOP_JSONFRAG=1 run_nyash_vm "$tmp_hako" 2>&1 )"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] loop_simple_jsonfrag_notag: env unstable"; exit 0; fi
if echo "$out" | grep -q "\[mirbuilder/normalize:jsonfrag:pass\]"; then echo "[FAIL] loop_simple_jsonfrag_notag: tag present without toggle"; exit 1; fi
mir=$(echo "$out" | extract_mir_from_output)
if [ -z "$mir" ]; then echo "[SKIP] loop_simple_jsonfrag_notag: no MIR (env)"; exit 0; fi
echo "[PASS] mirbuilder_internal_loop_simple_jsonfrag_nonormalize_no_tag_canary_vm"
exit 0


@ -0,0 +1,37 @@
#!/usr/bin/env bash
# JsonFrag minimal MIR for loop(simple) with Normalizer ON — tag + tokens check
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
SMOKES_DEV_PREINCLUDE=1 enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_loop_simple_jsonfrag_norm_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using hako.mir.builder as MirBuilderBox
static box Main { method main(args) {
// Loop(simple): i=0; loop(i<2){ i=i+1 }; return i
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[fail:builder]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
set +e
PROG='{"version":0,"kind":"Program","body":[{"type":"Local","name":"i","expr":{"type":"Int","value":0}},{"type":"Loop","cond":{"type":"Compare","op":"<","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":2}},"body":[{"type":"Local","name":"i","expr":{"type":"Binary","op":"+","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":1}}}]},{"type":"Return","expr":{"type":"Var","name":"i"}}]}'
out="$(PROG_JSON="$PROG" HAKO_MIR_BUILDER_LOOP_JSONFRAG=1 HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1 run_nyash_vm "$tmp_hako" 2>&1 )"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] loop_simple_jsonfrag_norm: env unstable"; exit 0; fi
# expect normalization tag present (not filtered)
if ! echo "$out" | grep -q "\[mirbuilder/normalize:jsonfrag:pass\]"; then echo "[SKIP] loop_simple_jsonfrag_norm: tag missing"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [ -z "$mir" ]; then echo "[SKIP] loop_simple_jsonfrag_norm: no MIR (env)"; exit 0; fi
if echo "$mir" | assert_has_tokens '"op":"compare"' '"op":"branch"' '"op":"ret"'; then
echo "[PASS] mirbuilder_internal_loop_simple_jsonfrag_normalize_canary_vm"; exit 0; fi
echo "[SKIP] loop_simple_jsonfrag_norm: tokens not found (env)"; exit 0


@ -0,0 +1,39 @@
#!/usr/bin/env bash
# Loop lowers skip toggle canary — with HAKO_MIR_BUILDER_SKIP_LOOPS=1 the loop lowers are bypassed
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_loop_toggle_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder" as MirBuilderBox
static box Main { method main(args) {
// Program: Local i=0; Loop(Compare(i<3)){ i=i+1 }; Return i
local j = '{"version":0,"kind":"Program","body":[{"type":"Local","name":"i","expr":{"type":"Int","value":0}},{"type":"Loop","cond":{"type":"Compare","op":"<","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":3}},"body":[{"type":"Local","name":"i","expr":{"type":"Binary","op":"+","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":1}}}]},{"type":"Return","expr":{"type":"Var","name":"i"}}]}'
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[MIR_NONE]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# 1) loops enabled (default): expect MIR content
set +e
out1="$(run_nyash_vm "$tmp_hako" 2>&1)"; rc1=$?
set -e
mir1=$(echo "$out1" | extract_mir_from_output)
if [[ "$rc1" -ne 0 ]] || [[ -z "$mir1" ]]; then echo "[SKIP] loop_toggle: no MIR when enabled"; rm -f "$tmp_hako"; exit 0; fi
if ! echo "$mir1" | assert_has_tokens '"functions"' '"blocks"'; then echo "[SKIP] loop_toggle: malformed MIR"; rm -f "$tmp_hako"; exit 0; fi
# 2) loops skipped: expect no MIR (or null)
set +e
out2="$(HAKO_MIR_BUILDER_SKIP_LOOPS=1 run_nyash_vm "$tmp_hako" 2>&1)"; rc2=$?
set -e
mir2=$(echo "$out2" | extract_mir_from_output)
rm -f "$tmp_hako" || true
if [[ -n "$mir2" ]]; then echo "[FAIL] loop_toggle: MIR present when skip=1" >&2; exit 1; fi
echo "[PASS] mirbuilder_internal_loop_skip_toggle_canary_vm"
exit 0


@ -0,0 +1,35 @@
#!/usr/bin/env bash
# JsonFrag minimal MIR for loop(sum_bc) — tokens check (compare/branch/ret)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
SMOKES_DEV_PREINCLUDE=1 enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_loop_sum_bc_jsonfrag_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using hako.mir.builder as MirBuilderBox
static box Main { method main(args) {
// Loop with break/continue sentinels; structure check only
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[fail:builder]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
PROG='{"version":0,"kind":"Program","body":[{"type":"Local","name":"i","expr":{"type":"Int","value":0}},{"type":"Local","name":"s","expr":{"type":"Int","value":0}},{"type":"Loop","cond":{"type":"Compare","op":"<","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":5}},"body":[{"type":"If","cond":{"type":"Compare","op":"==","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":4}},"then":[{"type":"Break"}]},{"type":"If","cond":{"type":"Compare","op":"==","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":2}},"then":[{"type":"Continue"}]}]},{"type":"Return","expr":{"type":"Var","name":"s"}}]}'
set +e
out="$(PROG_JSON="$PROG" HAKO_MIR_BUILDER_LOOP_JSONFRAG=1 run_nyash_vm "$tmp_hako" 2>&1 )"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] loop_sum_bc_jsonfrag: env unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [ -z "$mir" ]; then echo "[SKIP] loop_sum_bc_jsonfrag: no MIR (env)"; exit 0; fi
if echo "$mir" | assert_has_tokens '"op":"compare"' '"op":"branch"' '"op":"ret"'; then
echo "[PASS] mirbuilder_internal_loop_sum_bc_jsonfrag_canary_vm"; exit 0; fi
echo "[SKIP] loop_sum_bc_jsonfrag: tokens not found (env)"; exit 0


@ -0,0 +1,36 @@
#!/usr/bin/env bash
# JsonFrag minimal MIR for loop(sum_bc) with Normalizer ON — tag + tokens check
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
SMOKES_DEV_PREINCLUDE=1 enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_loop_sum_bc_jsonfrag_norm_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using hako.mir.builder as MirBuilderBox
static box Main { method main(args) {
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[fail:builder]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
PROG='{"version":0,"kind":"Program","body":[{"type":"Local","name":"i","expr":{"type":"Int","value":0}},{"type":"Local","name":"s","expr":{"type":"Int","value":0}},{"type":"Loop","cond":{"type":"Compare","op":"<","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":6}},"body":[{"type":"If","cond":{"type":"Compare","op":"==","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":4}},"then":[{"type":"Break"}]},{"type":"If","cond":{"type":"Compare","op":"==","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":2}},"then":[{"type":"Continue"}]}]},{"type":"Return","expr":{"type":"Var","name":"s"}}]}'
set +e
out="$(PROG_JSON="$PROG" HAKO_MIR_BUILDER_LOOP_JSONFRAG=1 HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1 run_nyash_vm "$tmp_hako" 2>&1 )"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] loop_sum_bc_jsonfrag_norm: env unstable"; exit 0; fi
if ! echo "$out" | grep -q "\[mirbuilder/normalize:jsonfrag:pass\]"; then echo "[SKIP] loop_sum_bc_jsonfrag_norm: tag missing"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [ -z "$mir" ]; then echo "[SKIP] loop_sum_bc_jsonfrag_norm: no MIR (env)"; exit 0; fi
if echo "$mir" | assert_has_tokens '"op":"compare"' '"op":"branch"' '"op":"ret"'; then
echo "[PASS] mirbuilder_internal_loop_sum_bc_jsonfrag_normalize_canary_vm"; exit 0; fi
echo "[SKIP] loop_sum_bc_jsonfrag_norm: tokens not found (env)"; exit 0


@ -0,0 +1,38 @@
#!/usr/bin/env bash
# MirBuilder(minimal return binop) + Normalizer OFF — no tag expected
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
SMOKES_DEV_PREINCLUDE=1 enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_return_binop_notag_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using hako.mir.builder as MirBuilderBox
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[fail:builder]"); return 1 }
local norm = env.get("HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE");
if norm != null && ("" + norm) == "1" { out = NormBox.normalize_all(out) }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
PROG='{"version":0,"kind":"Program","body":[{"type":"Return","expr":{"type":"Binary","op":"-","lhs":{"type":"Int","value":5},"rhs":{"type":"Int","value":3}}}]}'
set +e
out="$(PROG_JSON="$PROG" run_nyash_vm "$tmp_hako" 2>&1 )"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] return_binop_notag: env unstable"; exit 0; fi
if echo "$out" | grep -q "\[mirbuilder/normalize:jsonfrag:pass\]"; then echo "[FAIL] return_binop_notag: tag present without toggle"; exit 1; fi
mir=$(echo "$out" | extract_mir_from_output)
if [ -z "$mir" ]; then echo "[SKIP] return_binop_notag: no MIR (env)"; exit 0; fi
echo "[PASS] mirbuilder_internal_return_binop_jsonfrag_nonormalize_no_tag_canary_vm"
exit 0


@ -0,0 +1,40 @@
#!/usr/bin/env bash
# MirBuilder(minimal return binop) + Normalizer ON — tag + tokens check
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
SMOKES_DEV_PREINCLUDE=1 enable_mirbuilder_dev_env
tmp_hako="/tmp/mirbuilder_return_binop_norm_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using hako.mir.builder as MirBuilderBox
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
// Return(Binary(Int,Int))
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null);
if out == null { print("[fail:builder]"); return 1 }
local norm = env.get("HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE");
if norm != null && ("" + norm) == "1" { out = NormBox.normalize_all(out) }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
PROG='{"version":0,"kind":"Program","body":[{"type":"Return","expr":{"type":"Binary","op":"+","lhs":{"type":"Int","value":1},"rhs":{"type":"Int","value":2}}}]}'
set +e
out="$(PROG_JSON="$PROG" HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1 run_nyash_vm "$tmp_hako" 2>&1 )"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] return_binop_norm: env unstable"; exit 0; fi
if ! echo "$out" | grep -q "\[mirbuilder/normalize:jsonfrag:pass\]"; then echo "[SKIP] return_binop_norm: tag missing"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [ -z "$mir" ]; then echo "[SKIP] return_binop_norm: no MIR (env)"; exit 0; fi
if echo "$mir" | assert_has_tokens '"op":"const"' '"op":"binop"' '"op":"ret"'; then
echo "[PASS] mirbuilder_internal_return_binop_jsonfrag_normalize_canary_vm"; exit 0; fi
echo "[SKIP] return_binop_norm: tokens not found (env)"; exit 0


@ -0,0 +1,37 @@
#!/usr/bin/env bash
# Verify const dedupe is block-local (identical const in different blocks remain both present)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_crossblock_const_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# Two blocks with same const (dst=1, i64 42); should remain two occurrences after normalization
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[
{"id":0,"instructions":[{"op":"const","dst":1,"value":{"type":"i64","value":42}},{"op":"ret","value":1}]},
{"id":1,"instructions":[{"op":"const","dst":1,"value":{"type":"i64","value":42}},{"op":"ret","value":1}]}
]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] normalizer_crossblock_const: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] normalizer_crossblock_const: no MIR"; exit 0; fi
if echo "$mir" | assert_token_count '"op":"const","dst":1,"value":{"type":"i64","value":42}' 2; then
echo "[PASS] mirbuilder_jsonfrag_normalizer_const_crossblock_no_dedupe_canary_vm"; exit 0; fi
echo "[SKIP] normalizer_crossblock_const: count mismatch"; exit 0


@ -0,0 +1,39 @@
#!/usr/bin/env bash
# Verify f64 big exponent canonicalization (1e+05 → 100000.0, 1e-03 → 0.001)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_f64_bigexp_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[{"id":0,"instructions":[
{"op":"const","dst":1,"value":{"type":"f64","value":1e+05}},
{"op":"const","dst":2,"value":{"type":"f64","value":1e-03}},
{"op":"ret","value":1}
]}]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] normalizer_f64_bigexp: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] normalizer_f64_bigexp: no MIR"; exit 0; fi
if ! echo "$mir" | assert_has_tokens '"type":"f64","value":100000.0' '"type":"f64","value":0.001'; then echo "[SKIP] normalizer_f64_bigexp: canonical tokens missing"; exit 0; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_const_f64_bigexp_canary_vm"
exit 0


@ -0,0 +1,42 @@
#!/usr/bin/env bash
# Verify f64 value canonicalization (e.g., 3.140000 → 3.14; 3 → 3.0)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_f64canon_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# f64 represented with redundant zeros should be trimmed to minimal decimal form
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[{"id":0,"instructions":[
{"op":"const","dst":2,"value":{"type":"f64","value":3.140000}},
{"op":"const","dst":1,"value":{"type":"f64","value":3}},
{"op":"ret","value":2}
]}]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] normalizer_f64canon: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] normalizer_f64canon: no MIR"; exit 0; fi
# Expect canonical values present; redundant form absent
if ! echo "$mir" | assert_has_tokens '"type":"f64","value":3.14' '"type":"f64","value":3.0'; then echo "[SKIP] normalizer_f64canon: canonical tokens missing"; exit 0; fi
if ! echo "$mir" | assert_token_count '"type":"f64","value":3.140000' 0; then echo "[SKIP] normalizer_f64canon: redundant form remains"; exit 0; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_const_f64_canonicalize_canary_vm"
exit 0


@ -0,0 +1,41 @@
#!/usr/bin/env bash
# Verify f64 exponent canonicalization and -0.0 → 0.0 normalization
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_f64_exp_negzero_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# f64 exponent and -0.0 handling
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[{"id":0,"instructions":[
{"op":"const","dst":1,"value":{"type":"f64","value":1.230e+01}},
{"op":"const","dst":2,"value":{"type":"f64","value":-0.0}},
{"op":"ret","value":1}
]}]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] normalizer_f64_exp_negzero: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] normalizer_f64_exp_negzero: no MIR"; exit 0; fi
# Expect 1.230e+01 → 12.3 and -0.0 → 0.0
if ! echo "$mir" | assert_has_tokens '"type":"f64","value":12.3' '"type":"f64","value":0.0'; then echo "[SKIP] normalizer_f64_exp_negzero: canonical tokens missing"; exit 0; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_const_f64_exponent_negzero_canary_vm"
exit 0


@ -0,0 +1,44 @@
#!/usr/bin/env bash
# Verify Normalizer dedupes const for f64 and String values
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_const_fx_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
if out == null { print("[fail:norm]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# MIR: duplicate f64 and duplicate String constants on same dst → expect single occurrence each
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[{"id":0,"instructions":[
{"op":"const","dst":2,"value":{"type":"f64","value":3.140000}},
{"op":"const","dst":2,"value":{"type":"f64","value":3.14}},
{"op":"const","dst":3,"value":{"String":"hello"}},
{"op":"const","dst":3,"value":{"String":"hello"}},
{"op":"ret","value":2}
]}]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] normalizer_const_fx: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] normalizer_const_fx: no MIR"; exit 0; fi
# f64 and String const should each remain once (one entry per dst)
if ! echo "$mir" | assert_token_count '"op":"const","dst":2,' 1; then echo "[SKIP] normalizer_const_fx: f64 const dedupe failed"; exit 0; fi
if ! echo "$mir" | assert_token_count '"op":"const","dst":3,' 1; then echo "[SKIP] normalizer_const_fx: String const dedupe failed"; exit 0; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_const_f64_string_dedupe_canary_vm"
exit 0


@ -0,0 +1,43 @@
#!/usr/bin/env bash
# Verify JsonFrag Normalizer: phi grouping, ret value insertion, const dedupe (direct call)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_jsonfrag_normalizer_direct_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
// Input MIR(JSON) crafted with: non-head phi, duplicate const, ret without value
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
if out == null { print("[fail:norm]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# Craft MIR(JSON) with phi late, duplicate const, and ret without value
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[{"id":0,"instructions":[{"op":"compare","operation":"<","lhs":1,"rhs":2,"dst":9},{"op":"const","dst":1,"value":{"type":"i64","value":42}},{"op":"ret"},{"op":"const","dst":1,"value":{"type":"i64","value":42}},{"op":"phi","dst":5,"values":[{"value":1,"block":1},{"value":2,"block":2}]}]}]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] normalizer_direct: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] normalizer_direct: no MIR"; exit 0; fi
# 1) phi grouped before compare
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"compare"'; then echo "[SKIP] normalizer_direct: phi not grouped"; exit 0; fi
# 2) ret has value
if ! echo "$mir" | assert_has_tokens '"op":"ret","value"'; then echo "[SKIP] normalizer_direct: ret value missing"; exit 0; fi
# 3) const deduped (dst 1, i64 42 occurs once)
if ! echo "$mir" | assert_token_count '"op":"const","dst":1,"value":{"type":"i64","value":42}' 1; then echo "[SKIP] normalizer_direct: const dedupe failed"; exit 0; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_direct_canary_vm"
exit 0

View File

@ -0,0 +1,43 @@
#!/usr/bin/env bash
# Verify normalize_all is idempotent (normalize twice → same output)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_idem_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out1 = NormBox.normalize_all("" + m)
local out2 = NormBox.normalize_all(out1)
print("[MIR1_BEGIN]"); print("" + out1); print("[MIR1_END]")
print("[MIR2_BEGIN]"); print("" + out2); print("[MIR2_END]")
return 0
} }
HAKO
# Input MIR: mixed order + duplicate const; normalize twice should stabilize
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[{"id":0,"instructions":[
{"op":"compare","operation":"<","lhs":1,"rhs":2,"dst":9},
{"op":"const","dst":1,"value":{"type":"i64","value":42}},
{"op":"ret"},
{"op":"const","dst":1,"value":{"type":"i64","value":42}},
{"op":"phi","dst":5,"values":[{"value":1,"block":1},{"value":2,"block":2}]}
]}]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] normalizer_idempotent: vm exec unstable"; exit 0; fi
mir1=$(echo "$out" | awk '/\[MIR1_BEGIN\]/{f=1;next}/\[MIR1_END\]/{f=0}f')
mir2=$(echo "$out" | awk '/\[MIR2_BEGIN\]/{f=1;next}/\[MIR2_END\]/{f=0}f')
if [[ -z "$mir1" || -z "$mir2" ]]; then echo "[SKIP] normalizer_idempotent: outputs missing"; exit 0; fi
if [[ "$mir1" != "$mir2" ]]; then echo "[FAIL] normalizer_idempotent: outputs differ"; exit 1; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_idempotent_canary_vm"
exit 0

View File

@ -0,0 +1,42 @@
#!/usr/bin/env bash
# Verify Normalizer preserves relative order of multiple phi in same block and moves them to head
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_multi_phi_same_block_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
if out == null { print("[fail:norm]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# One block containing: const, phi(dst=5), compare, phi(dst=6), ret → expect phi(5), phi(6) first
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[{"id":0,"instructions":[{"op":"const","dst":1,"value":{"type":"i64","value":0}},{"op":"phi","dst":5,"values":[{"value":1,"block":1}]},{"op":"compare","operation":"<","lhs":1,"rhs":2,"dst":9},{"op":"phi","dst":6,"values":[{"value":2,"block":0}]},{"op":"ret","value":1}]}]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] normalizer_multi_phi_same_block: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] normalizer_multi_phi_same_block: no MIR"; exit 0; fi
# Check: both phi occur, phi(5) before phi(6), and both before compare/const/ret
if ! echo "$mir" | assert_token_count '"op":"phi"' 2; then echo "[SKIP] normalizer_multi_phi_same_block: phi count != 2"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi","dst":5' '"op":"phi","dst":6'; then echo "[SKIP] normalizer_multi_phi_same_block: phi order unstable"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"compare"'; then echo "[SKIP] normalizer_multi_phi_same_block: phi not before compare"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"const"'; then echo "[SKIP] normalizer_multi_phi_same_block: phi not before const"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"ret"'; then echo "[SKIP] normalizer_multi_phi_same_block: phi not before ret"; exit 0; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_multi_phi_same_block_order_canary_vm"
exit 0

View File

@ -0,0 +1,43 @@
#!/usr/bin/env bash
# Verify Normalizer groups phi at head for multiple blocks and preserves stable order
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_multibb_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
if out == null { print("[fail:norm]"); return 1 }
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# Two blocks, each with a phi that is not at the block head; expect the phi moved ahead of the other ops in its block
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[
{"id":0,"instructions":[{"op":"const","dst":1,"value":{"type":"i64","value":0}},{"op":"ret"},{"op":"phi","dst":5,"values":[{"value":1,"block":1}]}]},
{"id":1,"instructions":[{"op":"compare","operation":"<","lhs":1,"rhs":2,"dst":9},{"op":"phi","dst":7,"values":[{"value":2,"block":0}]},{"op":"const","dst":2,"value":{"type":"i64","value":2}}]}
]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] normalizer_multibb: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] normalizer_multibb: no MIR"; exit 0; fi
# We cannot easily target per-block segments without a JSON parser; instead, check that the first phi appears before the first compare and before the first const/ret (a per-block jq variant is sketched after this script)
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"compare"'; then echo "[SKIP] normalizer_multibb: phi not before compare"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"const"'; then echo "[SKIP] normalizer_multibb: phi not before const"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"ret"'; then echo "[SKIP] normalizer_multibb: phi not before ret"; exit 0; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_multiblock_phi_order_canary_vm"
exit 0

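The per-block variant mentioned above, as a hedged sketch (it assumes jq is available, a dependency the canaries deliberately avoid): it asserts that phi instructions form a prefix of every block instead of relying on global first-occurrence order.

echo "$mir" | jq -e '
  [ .functions[].blocks[].instructions
    | [ .[].op == "phi" ] as $f
    | all( range(1; $f | length); ($f[.] | not) or $f[. - 1] ) ]
  | all
' >/dev/null && echo "[ok] phi grouped at the head of every block"
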
View File

@ -0,0 +1,40 @@
#!/usr/bin/env bash
# Verify PHI grouping for many incoming values (phi moved to block head)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_phi_many_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# Block with a late phi carrying many incoming values; expect the phi to appear before compare/const/ret after normalization
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[
{"id":0,"instructions":[{"op":"compare","operation":"<","lhs":1,"rhs":2,"dst":9},{"op":"const","dst":3,"value":{"type":"i64","value":100}},{"op":"phi","dst":7,"values":[{"value":3,"block":1},{"value":4,"block":2},{"value":5,"block":3}]},{"op":"ret","value":7}]}
]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] phi_many_incomings: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] phi_many_incomings: no MIR"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"compare"'; then echo "[SKIP] phi_many_incomings: phi not before compare"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"const"'; then echo "[SKIP] phi_many_incomings: phi not before const"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"ret"'; then echo "[SKIP] phi_many_incomings: phi not before ret"; exit 0; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_phi_many_incomings_order_canary_vm"
exit 0

View File

@ -0,0 +1,43 @@
#!/usr/bin/env bash
# Verify PHI grouping on multi-stage merges (two-level join with final phi at head)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_phi_multistage_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# Multi-stage merge: 0 branches to 1/2; 1 -> 3 and 2 -> 4; blocks 1-4 each emit a const; 3 and 4 both branch to 5, where the join phi sits late in the block
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[
{"id":0,"instructions":[{"op":"branch","cond":1,"then":1,"else":2}]},
{"id":1,"instructions":[{"op":"const","dst":10,"value":{"type":"i64","value":11}},{"op":"branch","cond":1,"then":3,"else":3}]},
{"id":2,"instructions":[{"op":"const","dst":20,"value":{"type":"i64","value":22}},{"op":"branch","cond":1,"then":4,"else":4}]},
{"id":3,"instructions":[{"op":"const","dst":30,"value":{"type":"i64","value":33}},{"op":"branch","cond":1,"then":5,"else":5}]},
{"id":4,"instructions":[{"op":"const","dst":40,"value":{"type":"i64","value":44}},{"op":"branch","cond":1,"then":5,"else":5}]},
{"id":5,"instructions":[{"op":"compare","operation":"<","lhs":1,"rhs":2,"dst":9},{"op":"phi","dst":7,"values":[{"value":30,"block":3},{"value":40,"block":4}]},{"op":"ret","value":7}]}]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] phi_multistage_merge: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] phi_multistage_merge: no MIR"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"compare"'; then echo "[SKIP] phi_multistage_merge: phi not before compare"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"ret"'; then echo "[SKIP] phi_multistage_merge: phi not before ret"; exit 0; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_phi_multistage_merge_order_canary_vm"
exit 0

View File

@ -0,0 +1,42 @@
#!/usr/bin/env bash
# Verify PHI grouping for nested merge scenario (phi appears before compare/const/ret after normalization)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
source "$ROOT/tools/smokes/v2/lib/mir_canary.sh" || true
tmp_hako="/tmp/mirbuilder_normalizer_phi_nested_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local m = env.get("MIR"); if m == null { print("[fail:nomir]"); return 1 }
local out = NormBox.normalize_all("" + m)
print("[MIR_BEGIN]"); print("" + out); print("[MIR_END]")
return 0
} }
HAKO
# Craft a nested-style merge: block 3 acts as a join with phi late in the instruction list
MIR='{"functions":[{"name":"main","params":[],"locals":[],"blocks":[
{"id":0,"instructions":[{"op":"branch","cond":1,"then":1,"else":2}]},
{"id":1,"instructions":[{"op":"const","dst":10,"value":{"type":"i64","value":1}},{"op":"branch","cond":1,"then":3,"else":3}]},
{"id":2,"instructions":[{"op":"const","dst":20,"value":{"type":"i64","value":2}},{"op":"branch","cond":1,"then":3,"else":3}]},
{"id":3,"instructions":[{"op":"compare","operation":"<","lhs":1,"rhs":2,"dst":9},{"op":"const","dst":30,"value":{"type":"i64","value":3}},{"op":"phi","dst":5,"values":[{"value":10,"block":1},{"value":20,"block":2}]},{"op":"ret","value":5}]}]}]}'
set +e
out="$(MIR="$MIR" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [[ "$rc" -ne 0 ]]; then echo "[SKIP] phi_nested_merge: vm exec unstable"; exit 0; fi
mir=$(echo "$out" | extract_mir_from_output)
if [[ -z "$mir" ]]; then echo "[SKIP] phi_nested_merge: no MIR"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"compare"'; then echo "[SKIP] phi_nested_merge: phi not before compare"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"const"'; then echo "[SKIP] phi_nested_merge: phi not before const"; exit 0; fi
if ! echo "$mir" | assert_order '"op":"phi"' '"op":"ret"'; then echo "[SKIP] phi_nested_merge: phi not before ret"; exit 0; fi
echo "[PASS] mirbuilder_jsonfrag_normalizer_phi_nested_merge_order_canary_vm"
exit 0

View File

@ -0,0 +1,46 @@
#!/usr/bin/env bash
# RC parity: MirBuilder output vs Normalized output must produce identical return codes (Return BinOp)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
tmp_hako="/tmp/mirbuilder_rc_parity_binop_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder" as MirBuilderBox
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null)
if out == null { print("[fail:builder]"); return 1 }
local outn = NormBox.normalize_all(out)
print("[A_BEGIN]"); print("" + out); print("[A_END]")
print("[B_BEGIN]"); print("" + outn); print("[B_END]")
return 0
} }
HAKO
PROG='{"version":0,"kind":"Program","body":[{"type":"Return","expr":{"type":"Binary","op":"*","lhs":{"type":"Int","value":3},"rhs":{"type":"Int","value":4}}}]}'
set +e
out="$(PROG_JSON="$PROG" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] rc_parity_binop: env unstable"; exit 0; fi
mir_a=$(echo "$out" | awk '/\[A_BEGIN\]/{f=1;next}/\[A_END\]/{f=0}f')
mir_b=$(echo "$out" | awk '/\[B_BEGIN\]/{f=1;next}/\[B_END\]/{f=0}f')
if [ -z "$mir_a" ] || [ -z "$mir_b" ]; then echo "[SKIP] rc_parity_binop: MIR missing"; exit 0; fi
tmp_a="/tmp/mir_a_$$.json"; tmp_b="/tmp/mir_b_$$.json"
printf '%s' "$mir_a" > "$tmp_a"; printf '%s' "$mir_b" > "$tmp_b"
set +e
verify_mir_rc "$tmp_a"; rc_a=$?
verify_mir_rc "$tmp_b"; rc_b=$?
set -e
rm -f "$tmp_a" "$tmp_b" || true
if [ "$rc_a" -eq "$rc_b" ]; then echo "[PASS] mirbuilder_jsonfrag_normalizer_rc_parity_binop_canary_vm"; exit 0; fi
echo "[FAIL] rc_parity_binop: rc_a=$rc_a rc_b=$rc_b"; exit 1

View File

@ -0,0 +1,46 @@
#!/usr/bin/env bash
# RC parity: MirBuilder output vs Normalized output must produce identical return codes (If/Compare)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
tmp_hako="/tmp/mirbuilder_rc_parity_if_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder" as MirBuilderBox
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null)
if out == null { print("[fail:builder]"); return 1 }
local outn = NormBox.normalize_all(out)
print("[A_BEGIN]"); print("" + out); print("[A_END]")
print("[B_BEGIN]"); print("" + outn); print("[B_END]")
return 0
} }
HAKO
PROG='{"version":0,"kind":"Program","body":[{"type":"If","cond":{"type":"Compare","op":"<=","lhs":{"type":"Int","value":2},"rhs":{"type":"Int","value":2}},"then":[{"type":"Return","expr":{"type":"Int","value":7}}],"else":[{"type":"Return","expr":{"type":"Int","value":9}}]}]}'
set +e
out="$(PROG_JSON="$PROG" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] rc_parity_if: env unstable"; exit 0; fi
mir_a=$(echo "$out" | awk '/\[A_BEGIN\]/{f=1;next}/\[A_END\]/{f=0}f')
mir_b=$(echo "$out" | awk '/\[B_BEGIN\]/{f=1;next}/\[B_END\]/{f=0}f')
if [ -z "$mir_a" ] || [ -z "$mir_b" ]; then echo "[SKIP] rc_parity_if: MIR missing"; exit 0; fi
tmp_a="/tmp/mir_a_$$.json"; tmp_b="/tmp/mir_b_$$.json"
printf '%s' "$mir_a" > "$tmp_a"; printf '%s' "$mir_b" > "$tmp_b"
set +e
verify_mir_rc "$tmp_a"; rc_a=$?
verify_mir_rc "$tmp_b"; rc_b=$?
set -e
rm -f "$tmp_a" "$tmp_b" || true
if [ "$rc_a" -eq "$rc_b" ]; then echo "[PASS] mirbuilder_jsonfrag_normalizer_rc_parity_if_canary_vm"; exit 0; fi
echo "[FAIL] rc_parity_if: rc_a=$rc_a rc_b=$rc_b"; exit 1

View File

@ -0,0 +1,47 @@
#!/usr/bin/env bash
# RC parity: MirBuilder output vs Normalized output must produce identical return codes (Loop simple)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"; if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then ROOT="$ROOT_GIT"; else ROOT="$(cd "$SCRIPT_DIR/../../../../../../../../.." && pwd)"; fi
source "$ROOT/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
tmp_hako="/tmp/mirbuilder_rc_parity_loop_$$.hako"
cat > "$tmp_hako" <<'HAKO'
using "hako.mir.builder" as MirBuilderBox
using "hako.mir.builder.internal.jsonfrag_normalizer" as NormBox
static box Main { method main(args) {
local j = env.get("PROG_JSON"); if j == null { print("[fail:nojson]"); return 1 }
local out = MirBuilderBox.emit_from_program_json_v0(j, null)
if out == null { print("[fail:builder]"); return 1 }
local outn = NormBox.normalize_all(out)
print("[A_BEGIN]"); print("" + out); print("[A_END]")
print("[B_BEGIN]"); print("" + outn); print("[B_END]")
return 0
} }
HAKO
# i=0; loop(i<3){ i=i+1 }; return i → rc=3
PROG='{"version":0,"kind":"Program","body":[{"type":"Local","name":"i","expr":{"type":"Int","value":0}},{"type":"Loop","cond":{"type":"Compare","op":"<","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":3}},"body":[{"type":"Local","name":"i","expr":{"type":"Binary","op":"+","lhs":{"type":"Var","name":"i"},"rhs":{"type":"Int","value":1}}}]},{"type":"Return","expr":{"type":"Var","name":"i"}}]}'
set +e
out="$(PROG_JSON="$PROG" run_nyash_vm "$tmp_hako" 2>&1)"; rc=$?
set -e
rm -f "$tmp_hako" || true
if [ "$rc" -ne 0 ]; then echo "[SKIP] rc_parity_loop: env unstable"; exit 0; fi
mir_a=$(echo "$out" | awk '/\[A_BEGIN\]/{f=1;next}/\[A_END\]/{f=0}f')
mir_b=$(echo "$out" | awk '/\[B_BEGIN\]/{f=1;next}/\[B_END\]/{f=0}f')
if [ -z "$mir_a" ] || [ -z "$mir_b" ]; then echo "[SKIP] rc_parity_loop: MIR missing"; exit 0; fi
tmp_a="/tmp/mir_a_$$.json"; tmp_b="/tmp/mir_b_$$.json"
printf '%s' "$mir_a" > "$tmp_a"; printf '%s' "$mir_b" > "$tmp_b"
set +e
verify_mir_rc "$tmp_a"; rc_a=$?
verify_mir_rc "$tmp_b"; rc_b=$?
set -e
rm -f "$tmp_a" "$tmp_b" || true
if [ "$rc_a" -eq "$rc_b" ]; then echo "[PASS] mirbuilder_jsonfrag_normalizer_rc_parity_loop_canary_vm"; exit 0; fi
echo "[FAIL] rc_parity_loop: rc_a=$rc_a rc_b=$rc_b"; exit 1

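The three rc-parity canaries (binop/if/loop) share this skeleton; adding a case mostly means swapping PROG. A sketch of one more input, reusing only node shapes already present in these scripts (If, Compare, Binary, Return, Int), which should yield rc=5 on both the raw and normalized MIR:

# Hypothetical extra parity case: if (1 < 2) return 2 + 3 else return 0  -> rc=5
PROG='{"version":0,"kind":"Program","body":[{"type":"If","cond":{"type":"Compare","op":"<","lhs":{"type":"Int","value":1},"rhs":{"type":"Int","value":2}},"then":[{"type":"Return","expr":{"type":"Binary","op":"+","lhs":{"type":"Int","value":2},"rhs":{"type":"Int","value":3}}}],"else":[{"type":"Return","expr":{"type":"Int","value":0}}]}]}'
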
View File

@ -0,0 +1,39 @@
#!/usr/bin/env bash
# provider_select_plugin_only_filebox_canary_vm.sh — verify that ring1 is never selected in plugin-only mode (may SKIP depending on the environment)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then
ROOT_DIR="$ROOT_GIT"
else
ROOT_DIR="$(cd "$SCRIPT_DIR/../../../../../../.." && pwd)"
fi
BIN="${ROOT_DIR}/target/release/hakorune"
if [[ ! -x "${BIN}" ]]; then echo "[SKIP] hakorune not built"; exit 0; fi
source "$ROOT_DIR/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
TMP_HAKO=$(mktemp --suffix .hako)
cat >"${TMP_HAKO}" <<'HAKO'
static box Main { method main(args) {
// Do nothing; we only care about provider selection logs
return 0
} }
HAKO
set +e
out="$(NYASH_FILEBOX_MODE=plugin-only HAKO_PROVIDER_TRACE=1 "${BIN}" --backend vm "${TMP_HAKO}" 2>&1 | filter_noise)"; rc=$?
set -e
rm -f "$TMP_HAKO" || true
# In plugin-only mode, ring1 must not be selected; tag for ring1 must be absent.
if echo "$out" | grep -q "\[provider/select:FileBox ring=1"; then
echo "[FAIL] provider_select_plugin_only: ring1 selected unexpectedly" >&2; exit 1
fi
# The environment may lack plugins; if execution failed (rc != 0), treat it as SKIP rather than FAIL.
if [[ "$rc" -ne 0 ]]; then
echo "[SKIP] provider_select_plugin_only: no plugin available (env)"; exit 0
fi
echo "[PASS] provider_select_plugin_only_filebox_canary_vm"
exit 0

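A manual repro sketch for local debugging (the .hako path is a placeholder; assumes target/release/hakorune is built): run any small program with provider tracing on and inspect which FileBox ring is selected. Swapping in the policy/fallback variables used by the ring1 canary below switches the scenario.

NYASH_FILEBOX_MODE=plugin-only HAKO_PROVIDER_TRACE=1 \
  ./target/release/hakorune --backend vm /tmp/any_program.hako 2>&1 \
  | grep -F '[provider/select:FileBox'
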
View File

@ -0,0 +1,43 @@
#!/usr/bin/env bash
# provider_select_ring1_filebox_canary_vm.sh — verify the ring1 selection tag under HAKO_PROVIDER_POLICY=safe-core-first
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
if ROOT_GIT=$(git -C "$SCRIPT_DIR" rev-parse --show-toplevel 2>/dev/null); then
ROOT_DIR="$ROOT_GIT"
else
ROOT_DIR="$(cd "$SCRIPT_DIR/../../../../../../.." && pwd)"
fi
BIN="${ROOT_DIR}/target/release/hakorune"
if [[ ! -x "${BIN}" ]]; then echo "[SKIP] hakorune not built"; exit 0; fi
source "$ROOT_DIR/tools/smokes/v2/lib/test_runner.sh"; require_env || exit 2
enable_mirbuilder_dev_env
TMP=$(mktemp); echo "hello" > "$TMP"; trap 'rm -f "$TMP" || true' EXIT
TMP_HAKO=$(mktemp --suffix .hako)
cat >"${TMP_HAKO}" <<HAKO
static box Main { method main(args) {
local f = FileBox.open("${TMP}");
local s = f.read();
print("[OUT]"); print("" + s); print("[END]");
return 0
} }
HAKO
set +e
out="$(NYASH_FAIL_FAST=0 HAKO_PROVIDER_POLICY=safe-core-first NYASH_FILEBOX_MODE=auto NYASH_FILEBOX_ALLOW_FALLBACK=1 "${BIN}" --backend vm "${TMP_HAKO}" 2>&1 | filter_noise)"; rc=$?
set -e
rm -f "$TMP_HAKO" || true
if ! grep -q "\[provider/select:FileBox ring=1 src=static\]" <<< "$out"; then
echo "[SKIP] provider_select_ring1 tag missing"; exit 0
fi
# The content check is best-effort; when the VM environment lacks plugins for the FileBox wrapper, skip content verification.
if [[ "$rc" -eq 0 ]]; then
if ! awk '/\[OUT\]/{f=1;next}/\[END\]/{f=0}f' <<< "$out" | grep -q "hello"; then
echo "[SKIP] provider_select_ring1 content missing (env)"; exit 0
fi
fi
echo "[PASS] provider_select_ring1_filebox_canary_vm"
exit 0

View File

@ -74,6 +74,10 @@ Examples:
# Quick development check
./run.sh --profile quick
# Quick with Normalizer ON and heavy AOT cases present
# Tip: increase the timeout for EXE build/link representative cases (e.g., phase2100)
HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1 ./run.sh --profile quick --timeout 120
# Integration with filter
./run.sh --profile integration --filter "plugins:*"
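A narrower quick run can combine the same toggle with --filter; the pattern below is a sketch only (the category prefix for these canaries is an assumption, not taken from run.sh):

# Narrow quick run with Normalizer ON (filter value illustrative)
HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE=1 ./run.sh --profile quick --filter "core:*" --timeout 120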