selfhost/runtime: Stage 0-1 runner + MIR JSON loader (summary) with trace; compiler: scopebox/loopform prepass wiring (flags, child args); libs: add P1 standard boxes (console/string/array/map) as thin wrappers; runner: pass --box-pref via env; ops_calls dispatcher skeleton; docs: selfhost executor roadmap + scopebox/loopform notes; smokes: selfhost runner + identity prepasses; CURRENT_TASK: update plan and box lib schedule
.github/workflows/min-gate.yml (vendored, 43 changed lines)
@@ -25,7 +25,9 @@ jobs:
      - name: Cargo check (default features)
        run: cargo check --all-targets -q

  # Disabled by default to keep CI light; run locally when needed
  pyvm-smoke:
    if: ${{ false }}
    runs-on: ubuntu-latest
    timeout-minutes: 20
    needs: rust-check
@@ -51,7 +53,9 @@ jobs:
      - name: JSON smoke — parse error (PyVM+plugins)
        run: bash tools/test/smoke/selfhost/jsonbox_parse_err.sh

  # Disabled by default to keep CI light; run locally when needed
  macro-golden:
    if: ${{ false }}
    runs-on: ubuntu-latest
    timeout-minutes: 20
    needs: rust-check
@@ -97,7 +101,9 @@ jobs:
      - name: Macro golden — if then → loopform (gated)
        run: bash tools/test/golden/macro/if_then_loopform_user_macro_golden.sh

  # Disabled by default to keep CI light; run locally when needed
  macro-smokes-lite:
    if: ${{ false }}
    runs-on: ubuntu-latest
    timeout-minutes: 20
    needs: rust-check
@@ -134,12 +140,49 @@ jobs:
        run: bash tools/test/smoke/mir/hints_scope_loop_if_smoke.sh
      - name: Smoke — ScopeBox enabled (no-op)
        run: bash tools/test/smoke/mir/scopebox_enable_smoke.sh
      - name: Smoke — Loop PHI values (then-continue)
        run: bash tools/test/smoke/loop_phi_values.sh
      - name: Smoke — MacroCtx JSON (ctx caps)
        run: bash tools/test/smoke/macro/macro_ctx_json_smoke.sh
      - name: Smoke — UTF-8 CP strings (length/indexOf/substring)
        run: bash tools/test/smoke/strings/utf8_cp_smoke.sh

  # Disabled by default to keep CI light; run locally when needed
  llvm-phi-smoke:
    if: ${{ false }}
    runs-on: ubuntu-latest
    timeout-minutes: 20
    needs: rust-check
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup Rust (stable)
        uses: dtolnay/rust-toolchain@stable

      - name: Cache cargo
        uses: Swatinem/rust-cache@v2

      - name: Install ripgrep (for smokes)
        run: sudo apt-get update && sudo apt-get install -y ripgrep

      - name: Build (llvm + phi)
        run: |
          set -euo pipefail
          if command -v llvm-config-18 >/dev/null 2>&1; then
            LLVM_SYS_180_PREFIX="$(llvm-config-18 --prefix)" cargo build --release --features "llvm,phi-legacy" -q
          else
            cargo build --release --features "llvm,phi-legacy" -q
          fi

      - name: LLVM harness — empty PHI check (batch)
        env:
          NYASH_MIR_NO_PHI: "0"
        run: bash tools/test/smoke/llvm/ir_phi_empty_check_all.sh

  # Disabled by default to keep CI light; run locally when needed
  selfhost-preexpand-smoke:
    if: ${{ false }}
    runs-on: ubuntu-latest
    timeout-minutes: 20
    needs: rust-check
@@ -1,6 +1,6 @@
# Current Task — Stability Polish (Concise)

Updated: 2025‑09‑21
Updated: 2025‑09‑22

## Compressed Snapshot (Short)
- Strings (UTF‑8/CP vs Byte): baseline done
@@ -27,6 +27,36 @@ Principles (feature‑pause)
- Keep changes minimal/local; no spec changes unless to fix critical issues, and guard any optional paths behind default‑OFF flags.

### Delta (since last update)
- Added the Self‑Host Ny Executor (MIR→Ny executor) plan (default OFF, staged rollout)
  - Docs added: `docs/development/roadmap/selfhosting-ny-executor.md`
  - Covers goals, principles, flags, staged plan, acceptance criteria, rollback, and risks
  - Default remains PyVM; `NYASH_SELFHOST_EXEC=1` delegates to the Ny Executor (no-op for now, implemented incrementally)
- Stage 0 implemented: scaffold + runner wiring (default OFF / no-op)
  - Added: `apps/selfhost-runtime/{runner.nyash,mir_loader.nyash,ops_core.nyash,ops_calls.nyash,boxes_std.nyash}` (skeletons)
  - Wiring: `src/runner/modes/pyvm.rs` invokes the Ny runner when `NYASH_SELFHOST_EXEC=1` is detected (the flag is not inherited by the child)
  - Acceptance: cargo check green, default behavior unchanged, and with the flag ON a no-op run (exit 0) is possible
- Stage 1 (partial): minimal MIR (JSON v0) loader (summary extraction only)
  - `apps/selfhost-runtime/mir_loader.nyash`: read via FileBox → parse via JsonDocBox → summarize version/kind/body_len
  - `apps/selfhost-runtime/runner.nyash`: reads the JSON at args[0] and validates {version:0, kind:"Program"} (non-zero exit on failure)
  - Accepts both v0 and the harness JSON ({"functions":…}); `--trace` prints the v0 summary / statement counts, or the harness summary (function count)
  - Smoke added (optional): `tools/test/smoke/selfhost/selfhost_runner_smoke.sh`
- Identity-check smokes (direct Selfhost Compiler invocation)
  - ScopeBox: `tools/test/smoke/selfhost/scopebox_identity_smoke.sh`
  - LoopForm: `tools/test/smoke/selfhost/loopform_identity_smoke.sh`
- Selfhost Compiler prepass wiring (default OFF)
  - Added: `apps/lib/{scopebox_inject.nyash,loopform_normalize.nyash}` (identity versions; scaffolding for future extension)
  - Wiring: `apps/selfhost/compiler/compiler.nyash` accepts `--scopebox`/`--loopform` and pre-processes the JSON
  - Runner side: `src/runner/selfhost.rs` maps env vars to child args (`NYASH_SCOPEBOX_ENABLE=1` → `--scopebox`, `NYASH_LOOPFORM_NORMALIZE=1` → `--loopform`); see the sketch after this list
  - Optional: `NYASH_SELFHOST_CHILD_ARGS` passes extra arguments through
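A minimal Rust sketch of the env-to-child-arg mapping described in the runner bullets above. The helper names `flag_on` and `selfhost_child_args` are illustrative assumptions; only the flag names come from this change:

```rust
// Sketch only: build the extra CLI args for the selfhost compiler child process.
fn flag_on(name: &str) -> bool {
    std::env::var(name).map(|v| v == "1").unwrap_or(false)
}

fn selfhost_child_args() -> Vec<String> {
    let mut args = Vec::new();
    if flag_on("NYASH_SCOPEBOX_ENABLE") {
        args.push("--scopebox".to_string());
    }
    if flag_on("NYASH_LOOPFORM_NORMALIZE") {
        args.push("--loopform".to_string());
    }
    // Optional passthrough of extra, space-separated arguments.
    if let Ok(extra) = std::env::var("NYASH_SELFHOST_CHILD_ARGS") {
        args.extend(extra.split_whitespace().map(str::to_string));
    }
    args
}
```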
- PHI decisions at if-merge points inside loops are now normalized in MIR (no spec change; robustness)
  - Changed: `src/mir/loop_builder.rs`
  - Variables assigned in then/else are collected recursively, then PHI vs direct binding is decided per variable at the merge
  - Only branches that actually reach the merge are used as incomings (branches terminated by break/continue are excluded)
  - In `no_phi_mode`, edge-copies are emitted for the reaching predecessors (semantically equivalent)
  - Effect: carrier variables such as i/printed are unified correctly after the merge; fixes infinite loops and stale SSA values at the root
- MiniVmPrints (JSON path): stabilized the total output count (no spec change)
  - Each Print statement now adds exactly one count per actual output (Compare results are printed directly as 1/0)
  - Representative probe: A/B/7/1/7/5 → count=6 confirmed (matches PyVM)
- JSON provider (yyjson vendoring complete; switching happens at runtime)
  - `plugins/nyash-json-plugin/c/yyjson/{yyjson.h,yyjson.c,LICENSE}` are bundled and built self-contained via `build.rs + cc`.
  - env `NYASH_JSON_PROVIDER=serde|yyjson` (default = serde); can also be set from [env] in nyash.toml.
@@ -191,6 +221,48 @@ JSON / Plugin v2 (appended to current status)
- [ ] Unify the method_id cache (maintained per Box inside the loader as an LRU/Hash)
- [x] MiniVmPrints fallback is a development-only toggle (default OFF)

### Box Std Lib (library extraction plan / commonization)

Goal
- Distill a minimal-surface set of standard boxes usable across VM/PyVM/self-host executor into apps/lib, with unified names and return values (semantics unchanged, default OFF).

Location (Ny libs)
- `apps/lib/boxes/{console_std.nyash,string_std.nyash,array_std.nyash,map_std.nyash,path_std.nyash,json_std.nyash}` (added in stages)

Wiring / toggles (default OFF)
- Preference switch (self-host executor only): `NYASH_SELFHOST_BOX_PREF=plugin|ny` (default = plugin); a sketch of the env mapping follows this section
  - `plugin`: prefer the existing plugins/shims (backward compatible)
  - `ny`: prefer the lib/boxes implementations (methods the plugin does not define fall back safely)
- Adoption is include/using based (only the adopting side switches; no repo-wide renames)

Priority (staged rollout)
1) P1 (high frequency, minimal surface)
   - ConsoleBox: `print/println/log` → i64 (status)
   - String: `length/substring/indexOf/lastIndexOf/esc_json`
   - ArrayBox: `size/len/get/set/push/toString`
   - MapBox: `size/has/get/set/toString`
2) P2 (peripheral utilities)
   - PathBox: `dirname/join` (POSIX-style)
   - JsonDocBox/JsonNodeBox adapters: thin wrappers over the plugin (same names as the PyVM shims)
3) P3 (auxiliary)
   - StringBuilderBox (minimal) / pattern helpers (tidy up the existing libs)

Acceptance criteria
- Behavior unchanged with the default OFF (pref=plugin); with `pref=ny` the VM/LLVM/PyVM outputs still match.
- Representative smokes (strings/array/map/path/json) stay green; unimplemented methods fall back safely to no-op/None.

Rollback / risks
- Rollback: `NYASH_SELFHOST_BOX_PREF=plugin` restores the previous behavior immediately.
- Naming collisions are confined to the adopter's include/using (no broad changes to existing files).

TODO (staged tasks)
- [ ] P1: add console_std/string_std/array_std/map_std skeletons (apps/lib/boxes/)
- [ ] P1: introduce `NYASH_SELFHOST_BOX_PREF` into the self-host executor's BoxCall dispatch (default = plugin)
- [ ] P1: add minimal strings/array/map smokes on the selfhost path
- [ ] P2: path_std/json_std skeletons and adapters (plugin-preferring wrappers)
- [ ] P2: path/json smokes (dirname/join, parse/root/get/size/at/str/int/bool)
- [ ] P3: lightweight string_builder_std plus tests (optional)
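A hedged sketch of how the runner can forward the box preference to the Ny runner as a `--box-pref` child argument (the commit summary says "runner: pass --box-pref via env"); the helper name is hypothetical:

```rust
// Sketch only: map NYASH_SELFHOST_BOX_PREF to a --box-pref=<value> child argument.
fn box_pref_arg() -> Option<String> {
    match std::env::var("NYASH_SELFHOST_BOX_PREF").ok().as_deref() {
        Some("ny") => Some("--box-pref=ny".to_string()),
        Some("plugin") => Some("--box-pref=plugin".to_string()),
        // Anything else: pass nothing and let the runner keep its built-in "plugin" default.
        _ => None,
    }
}
```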
Checklist (updated)
- [x] One pinned self-host pre-expansion smoke (upper_string)
- [x] One MacroCtx ctx JSON smoke (wired into CI)
@@ -263,3 +335,24 @@ Trigger: stabilization of nyash_vm (major smokes green, self-host path
- Build out and measure the method_id resolution cache
- Add a YYJSON backend skeleton (default OFF; designed so the plugin can be switched)
- Add the JSON smokes to the minimal CI gate
- Self-Host (self-hosted executor)
  - [ ] Stage 0: flags/runner wiring only (no-op Ny runner)
  - [ ] Stage 1: MIR loader (JSON → structs)
  - [ ] Stage 2: core instructions (const/binop/compare/branch/jump/ret/phi)
  - [ ] Stage 3: call/externcall/boxcall (MVP)
  - [ ] Stage 4: minimal Array/Map methods
  - [ ] Stage 5: stabilize representative using/seam cases
  - [ ] Stage 6: parity harness / CI (non-blocking → promoted)

### Self-Host flags / rollback procedure (record)
- Flags (default OFF)
  - `NYASH_SELFHOST_EXEC=1`: enable the Ny Executor
  - `NYASH_SELFHOST_TRACE=1`: trace logging
  - `NYASH_SELFHOST_STEP_MAX`: step limit
  - `NYASH_SELFHOST_STRICT=1`: strict mode
- Rollback: turning the flag OFF immediately restores PyVM (the default); changes are introduced as minimal, local diffs.

### Notes (Stage 0 wiring)
- The Rust side writes MIR (JSON) to `tmp/nyash_selfhost_mir.json` and hands it to the Ny runner (apps/selfhost-runtime/runner.nyash).
- `NYASH_SELFHOST_EXEC` is not propagated to the child process (prevents recursive wiring).
- The current Ny runner is a no-op that returns 0; the loader/dispatch arrive in the next stages.
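The Stage 0 wiring above, condensed into a hedged Rust sketch. The function name, binary path, and argument convention are illustrative assumptions; only the env flag, the temp path, and the "do not propagate" rule come from the notes:

```rust
use std::process::Command;

// Sketch only: Stage 0 shape of the PyVM-path delegation (the real code lives in
// src/runner/modes/pyvm.rs and may differ in names and plumbing).
fn try_selfhost_exec(mir_json: &str) -> std::io::Result<Option<i32>> {
    let enabled = std::env::var("NYASH_SELFHOST_EXEC").map(|v| v == "1").unwrap_or(false);
    if !enabled {
        return Ok(None); // default path: stay on PyVM
    }
    let path = "tmp/nyash_selfhost_mir.json";
    std::fs::create_dir_all("tmp")?;
    std::fs::write(path, mir_json)?;
    let status = Command::new("./target/release/nyash")
        .arg("apps/selfhost-runtime/runner.nyash")
        .arg(path)
        // Do not propagate the flag, so the child never re-enters this wiring.
        .env_remove("NYASH_SELFHOST_EXEC")
        .status()?;
    Ok(Some(status.code().unwrap_or(1)))
}
```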
@@ -17,6 +17,10 @@ e2e = []
cli = []
plugins-only = []
builtin-core = []
## Silence check-cfg warnings for historical cfg guards (kept off by default)
interpreter-legacy = []
vm-legacy = []
phi-legacy = []
## JIT-direct only mode: disable legacy VM-arg fallback and plugin-builtins branches
## (keeps code compiling; VM-integrated JIT paths remain but are inert)
jit-direct-only = []
@@ -24,6 +24,8 @@ ExternCall (env.*) and println normalization: `docs/reference/runtime/externcall.m
- Required invariants: `docs/reference/invariants.md`
- Constraints (known / temporary / resolved): `docs/reference/constraints.md`
- PHI and SSA design: `docs/architecture/phi-and-ssa.md`
  - Default PHI behavior: PHI-ON when the build enables `phi-legacy` (recommended); otherwise it falls back to PHI-OFF (edge copies) for stability.
  - Runtime switch: `NYASH_MIR_NO_PHI=0` (PHI-ON), `NYASH_MIR_NO_PHI=1` (PHI-OFF).
- Test matrix (spec → test mapping): `docs/guides/testing-matrix.md`
- Comparison with other languages: `docs/comparison/nyash-vs-others.md`
@@ -31,7 +31,10 @@ User Macros (Phase 2): `docs/guides/user-macros.md`
Exceptions (postfix catch/cleanup): `docs/guides/exception-handling.md`
ScopeBox & MIR hints: `docs/guides/scopebox.md`
AST JSON v0 (macro/bridge): `docs/reference/ir/ast-json-v0.md`
MIR mode note: default is MIR13 (PHI-off). See `docs/development/mir/MIR13_MODE.md`.
MIR mode note: Default PHI behavior
- Default is PHI-ON when the build enables `phi-legacy` (recommended). Otherwise it falls back to PHI‑OFF (edge‑copy) for stability.
- Force at runtime: `NYASH_MIR_NO_PHI=0` (PHI‑ON), `NYASH_MIR_NO_PHI=1` (PHI‑OFF).
- See `docs/architecture/phi-and-ssa.md`.
Self‑hosting one‑pager: `docs/how-to/self-hosting.md`.
ExternCall (env.*) and println normalization: `docs/reference/runtime/externcall.md`.
apps/lib/boxes/array_std.nyash (new file, 33 lines)
@@ -0,0 +1,33 @@
// ArrayStd — minimal standard array helpers (commonized API)

static box ArrayStd {
    size(a) {
        if a == null { return 0 }
        // support both size/len
        if a.size { return a.size() }
        if a.len { return a.len() }
        return 0
    }
    len(a) { return me.size(a) }
    get(a, i) {
        if a == null { return null }
        if a.get { return a.get(i) }
        return null
    }
    set(a, i, v) {
        if a == null { return 0 }
        if a.set { a.set(i, v) return 0 }
        return 0
    }
    push(a, v) {
        if a == null { return 0 }
        if a.push { return a.push(v) }
        return 0
    }
    toString(a) {
        if a == null { return "[]" }
        if a.toString { return a.toString() }
        return "[]"
    }
}
apps/lib/boxes/console_std.nyash (new file, 21 lines)
@@ -0,0 +1,21 @@
// ConsoleStd — minimal standard console helpers (commonized API)

static box ConsoleStd {
    // print/println/log: convert null to "" and return 0
    print(x) {
        if x == null { x = "" }
        print(x)
        return 0
    }
    println(x) {
        if x == null { x = "" }
        print(x)
        return 0
    }
    log(x) {
        if x == null { x = "" }
        print(x)
        return 0
    }
}
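// Editorial note: the bare print(x) calls above are assumed to resolve to the builtin
// print statement rather than recursing into ConsoleStd.print; if method names shadow
// builtins in this position, these wrappers would need to call the builtin explicitly.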
apps/lib/boxes/map_std.nyash (new file, 32 lines)
@@ -0,0 +1,32 @@
// MapStd — minimal standard map helpers (commonized API)

static box MapStd {
    size(m) {
        if m == null { return 0 }
        if m.size { return m.size() }
        return 0
    }
    has(m, key) {
        if m == null { return 0 }
        if m.has { return m.has(key) }
        // fallback: compare get() to null
        if m.get { return m.get(key) != null ? 1 : 0 }
        return 0
    }
    get(m, key) {
        if m == null { return null }
        if m.get { return m.get(key) }
        return null
    }
    set(m, key, val) {
        if m == null { return 0 }
        if m.set { m.set(key, val) return 0 }
        return 0
    }
    toString(m) {
        if m == null { return "{}" }
        if m.toString { return m.toString() }
        return "{}"
    }
}
apps/lib/boxes/string_std.nyash (new file, 39 lines)
@@ -0,0 +1,39 @@
// StringStd — minimal standard string helpers (commonized API)

static box StringStd {
    length(s) {
        if s == null { return 0 }
        return s.length()
    }
    substring(s, start, end) {
        if s == null { return "" }
        if start == null { start = 0 }
        if end == null { end = s.length() }
        return s.substring(start, end)
    }
    indexOf(s, needle, start) {
        if s == null { return -1 }
        if needle == null { needle = "" }
        if start == null { return s.indexOf(needle) }
        return s.indexOf(needle, start)
    }
    lastIndexOf(s, needle, start) {
        if s == null { return -1 }
        if needle == null { needle = "" }
        if start == null { return s.lastIndexOf(needle) }
        return s.lastIndexOf(needle, start)
    }
    esc_json(s) {
        if s == null { return "" }
        local out = ""
        local i = 0
        local n = s.length()
        loop(i < n) {
            local ch = s.substring(i, i+1)
            if ch == "\\" { out = out + "\\\\" } else { if ch == "\"" { out = out + "\\\"" } else { out = out + ch } }
            i = i + 1
        }
        return out
    }
}
apps/lib/loopform_normalize.nyash (new file, 12 lines)
@@ -0,0 +1,12 @@
// loopform_normalize.nyash — loop normalization prepass (identity version)
// Purpose: stabilize while-style loops (key ordering, simple carrier alignment).
// Currently identity; a safe minimal normalization will be implemented in a later stage.

static box LoopFormNormalize {
    // Apply LoopForm normalization on JSON v0 text. Returns transformed JSON text.
    // MVP: identity (no-op)
    apply(json_text) {
        return json_text
    }
}
apps/lib/scopebox_inject.nyash (new file, 12 lines)
@@ -0,0 +1,12 @@
// scopebox_inject.nyash — JSON v0 prepass (identity version)
// Purpose: provide the wiring for a prepass that wraps If.then/else and Loop.body in ScopeBox equivalents.
// Currently identity; safe wrapping will be implemented in a later stage.

static box ScopeBoxInject {
    // Apply ScopeBox-style wrapping on JSON v0 text. Returns transformed JSON text.
    // MVP: identity (no-op)
    apply(json_text) {
        return json_text
    }
}
apps/selfhost-runtime/boxes_std.nyash (new file, 6 lines)
@@ -0,0 +1,6 @@
// Ny Executor Stage 0 scaffold — std boxes placeholder

static box BoxesStd {
    main(args) { return 0 }
}
apps/selfhost-runtime/mir_loader.nyash (new file, 128 lines)
@@ -0,0 +1,128 @@
// Ny Executor — Stage 1 minimal MIR(JSON v0) loader
// - Reads JSON from file (args[0]) using FileBox
// - Parses via JsonDocBox → root JsonNodeBox
// - Extracts simple summary: version, kind, body_len

static box MirLoader {
    // Load file content as string; returns null on error
    read_file(path) {
        if !path { return null }
        @f = new FileBox()
        @ok = f.open(path, "r")
        if ok != 1 { return null }
        @s = f.read()
        f.close()
        return s
    }

    // Parse JSON text into JsonNodeBox root; returns null on error
    parse_root(json_text) {
        if !json_text { return null }
        @doc = new JsonDocBox()
        doc.parse(json_text)
        @root = doc.root()
        return root
    }

    // Summarize MIR(JSON v0) — returns a MapBox with keys: version(int), kind(string), body_len(int)
    // Returns null if JSON is invalid or missing required fields.
    summarize_root(root) {
        if !root { return null }
        @vnode = root.get("version")
        @knode = root.get("kind")
        @bnode = root.get("body")
        @ver = vnode.int()
        @kind = knode.str()
        @blen = bnode.size()
        @m = new MapBox()
        m.set("version", ver)
        m.set("kind", kind)
        m.set("body_len", blen)
        return m
    }

    // Detect JSON format: "v0_program" | "harness" | "unknown"
    detect_format(root) {
        if !root { return "unknown" }
        @v = root.get("version")
        @k = root.get("kind")
        if v && k { return "v0_program" }
        @f = root.get("functions")
        if f { return "harness" }
        return "unknown"
    }

    // Summarize harness MIR JSON (root.functions)
    summarize_harness(root) {
        if !root { return null }
        @fn = root.get("functions")
        @m = new MapBox()
        if fn {
            @flen = fn.size()
            m.set("functions_len", flen)
        } else {
            m.set("functions_len", 0)
        }
        return m
    }

    // Count statement kinds in Program.body (known subset)
    summarize_stmt_kinds(root) {
        if !root { return null }
        @bnode = root.get("body")
        @n = bnode.size()
        @r = 0 @e = 0 @l = 0 @iff = 0 @lp = 0 @br = 0 @ct = 0 @tr = 0 @ex = 0
        @i = 0
        while i < n {
            @st = bnode.at(i)
            @t = st.get("type").str()
            if t == "Return" { r = r + 1 }
            else if t == "Expr" { e = e + 1 }
            else if t == "Local" { l = l + 1 }
            else if t == "If" { iff = iff + 1 }
            else if t == "Loop" { lp = lp + 1 }
            else if t == "Break" { br = br + 1 }
            else if t == "Continue" { ct = ct + 1 }
            else if t == "Try" { tr = tr + 1 }
            else if t == "Extern" { ex = ex + 1 }
            i = i + 1
        }
        @m = new MapBox()
        m.set("Return", r)
        m.set("Expr", e)
        m.set("Local", l)
        m.set("If", iff)
        m.set("Loop", lp)
        m.set("Break", br)
        m.set("Continue", ct)
        m.set("Try", tr)
        m.set("Extern", ex)
        return m
    }

    // High-level: load+parse+summarize from path
    load_summary(path) {
        @s = me.read_file(path)
        if !s { return null }
        @root = me.parse_root(s)
        if !root { return null }
        @fmt = me.detect_format(root)
        @m = new MapBox()
        m.set("format", fmt)
        if fmt == "v0_program" {
            @v0 = me.summarize_root(root)
            if v0 {
                // merge into m
                m.set("version", v0.get("version"))
                m.set("kind", v0.get("kind"))
                m.set("body_len", v0.get("body_len"))
            }
        } else if fmt == "harness" {
            @h = me.summarize_harness(root)
            if h { m.set("functions_len", h.get("functions_len")) }
        }
        return m
    }

    main(args) { return 0 }
}
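For readers more comfortable with Rust, the same summary the loader extracts can be written against serde_json as follows; the field names mirror the Ny code above, while the overall JSON schema remains an assumption of this sketch:

```rust
use serde_json::{json, Value};

// Sketch only: mirrors MirLoader.detect_format/summarize_* for the two accepted shapes.
fn summarize(text: &str) -> Option<Value> {
    let root: Value = serde_json::from_str(text).ok()?;
    if root.get("version").is_some() && root.get("kind").is_some() {
        let body_len = root.get("body").and_then(Value::as_array).map_or(0, |a| a.len());
        Some(json!({
            "format": "v0_program",
            "version": root["version"].clone(),
            "kind": root["kind"].clone(),
            "body_len": body_len,
        }))
    } else if let Some(fns) = root.get("functions").and_then(Value::as_array) {
        Some(json!({ "format": "harness", "functions_len": fns.len() }))
    } else {
        None
    }
}
```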
apps/selfhost-runtime/ops_calls.nyash (new file, 51 lines)
@@ -0,0 +1,51 @@
// OpsCalls — Selfhost runtime BoxCall dispatcher (skeleton)
// - pref: "plugin" (default) or "ny" (use lib std boxes)

box OpsCalls {
    pref

    birth() { me.pref = "plugin" return 0 }
    set_pref(p) {
        if p == "ny" { me.pref = "ny" } else { me.pref = "plugin" }
        return 0
    }

    // Dispatch method call on receiver `recv` by policy.
    // MVP: when pref=ny and method is recognized, route to lib wrappers; otherwise call receiver method.
    dispatch(recv, method, args) {
        if me.pref == "ny" {
            // Console
            if method == "print" { return new ConsoleStd().print(args.get(0)) }
            if method == "println" { return new ConsoleStd().println(args.get(0)) }
            if method == "log" { return new ConsoleStd().log(args.get(0)) }
            // String
            if method == "length" { return new StringStd().length(recv) }
            if method == "substring" { return new StringStd().substring(recv, args.get(0), args.get(1)) }
            if method == "indexOf" { return new StringStd().indexOf(recv, args.get(0), args.get(1)) }
            if method == "lastIndexOf" { return new StringStd().lastIndexOf(recv, args.get(0), args.get(1)) }
            if method == "esc_json" { return new StringStd().esc_json(args.get(0)) }
            // Array
            if method == "size" { return new ArrayStd().size(recv) }
            if method == "len" { return new ArrayStd().len(recv) }
            if method == "get" { return new ArrayStd().get(recv, args.get(0)) }
            if method == "set" { return new ArrayStd().set(recv, args.get(0), args.get(1)) }
            if method == "push" { return new ArrayStd().push(recv, args.get(0)) }
            if method == "toString" { return new ArrayStd().toString(recv) }
            // Map
            if method == "size" { return new MapStd().size(recv) }
            if method == "has" { return new MapStd().has(recv, args.get(0)) }
            if method == "get" { return new MapStd().get(recv, args.get(0)) }
            if method == "set" { return new MapStd().set(recv, args.get(0), args.get(1)) }
            if method == "toString" { return new MapStd().toString(recv) }
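            // Editorial note: size/get/set/toString already matched the Array branch above,
            // so these Map entries are currently shadowed (only "has" is reachable here);
            // distinguishing Array vs Map receivers is left to a later stage.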
        }
        // Fallback: direct method invocation on receiver
        // Note: This will route to plugin or built-in box implementations.
        if args == null { return recv[method]() }
        local a0 = args.get(0)
        local a1 = args.get(1)
        local a2 = args.get(2)
        // call with up to 3 args (MVP)
        return recv[method](a0, a1, a2)
    }
}
apps/selfhost-runtime/ops_core.nyash (new file, 6 lines)
@@ -0,0 +1,6 @@
// Ny Executor Stage 0 scaffold — core ops placeholder

static box OpsCore {
    main(args) { return 0 }
}
apps/selfhost-runtime/runner.nyash (new file, 92 lines)
@@ -0,0 +1,92 @@
// Ny Executor — Stage 1 entry (silent by default)
// Reads MIR(JSON v0) from args[0], parses minimal summary, and exits 0 on success.
// No side-effects; prints nothing unless later stages add tracing.

include "apps/selfhost-runtime/mir_loader.nyash"
include "apps/selfhost-runtime/ops_calls.nyash"
include "apps/lib/boxes/console_std.nyash"
include "apps/lib/boxes/string_std.nyash"
include "apps/lib/boxes/array_std.nyash"
include "apps/lib/boxes/map_std.nyash"

static box Main {
    main(args) {
        @path = null
        if args { if args.size() > 0 { @p = args.get(0) if p { path = p } } }
        @box_pref = "plugin"
        // parse optional args like --trace / --box-pref=ny
        if args { if args.size() > 1 {
            @i = 1
            while i < args.size() {
                @flagx = args.get(i)
                if flagx && flagx.length() > 0 {
                    if flagx == "--trace" { /* handled later */ }
                    else {
                        @kpos = flagx.indexOf("--box-pref=")
                        if kpos == 0 {
                            @val = flagx.substring(11, flagx.length())
                            if val == "ny" || val == "plugin" { box_pref = val }
                        }
                    }
                }
                i = i + 1
            }
        } }
        if !path { return 0 }

        @loader = new MirLoader()
        @sum = loader.load_summary(path)
        if !sum { return 1 }

        // Optional ad-hoc trace (development only): pass "--trace" in args
        @trace_env = "0"
        // NOTE: env access is not available inside Ny script; placeholder for future
        // we emulate it with CLI arg for now.
        if args { if args.size() > 1 {
            @flag = args.get(1)
            if flag == "--trace" {
                @fmt = sum.get("format")
                if fmt == "v0_program" {
                    @ver0 = sum.get("version")
                    @kind0 = sum.get("kind")
                    @blen0 = sum.get("body_len")
                    print("[selfhost] v0 summary: version=" + ver0 + ", kind=" + kind0 + ", body_len=" + blen0)
                    @k = loader.summarize_stmt_kinds( loader.parse_root( loader.read_file(path) ) )
                    if k {
                        @s = ""
                        function app(name, n) { if n > 0 { if s.length() > 0 { s = s + "," } s = s + name + "=" + n } }
                        app("Return", k.get("Return"))
                        app("Expr", k.get("Expr"))
                        app("Local", k.get("Local"))
                        app("If", k.get("If"))
                        app("Loop", k.get("Loop"))
                        app("Break", k.get("Break"))
                        app("Continue", k.get("Continue"))
                        app("Try", k.get("Try"))
                        app("Extern", k.get("Extern"))
                        if s.length() > 0 { print("[selfhost] body stmt counts: " + s) }
                    }
                } else if fmt == "harness" {
                    @flen = sum.get("functions_len")
                    print("[selfhost] harness summary: functions=" + flen)
                } else {
                    print("[selfhost] unknown JSON format")
                }
            }
        } }

        // Initialize BoxCall policy (skeleton)
        @oc = new OpsCalls()
        oc.set_pref(box_pref)

        // Basic schema guard for v0 Program
        @fmt2 = sum.get("format")
        if fmt2 == "v0_program" {
            @ver = sum.get("version")
            @kind = sum.get("kind")
            if ver != 0 { return 2 }
            if kind != "Program" { return 3 }
        }
        return 0
    }
}
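// Exit codes as implemented above: 0 = success (or no input path), 1 = load/parse failure,
// 2 = version != 0, 3 = kind != "Program".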
@@ -13,6 +13,9 @@ include "apps/selfhost-compiler/boxes/debug_box.nyash"
include "apps/selfhost-compiler/boxes/parser_box.nyash"
include "apps/selfhost-compiler/boxes/emitter_box.nyash"
include "apps/selfhost-compiler/boxes/mir_emitter_box.nyash"
// Prepass libs (ScopeBox/LoopForm)
include "apps/lib/scopebox_inject.nyash"
include "apps/lib/loopform_normalize.nyash"

static box Main {
    // ---- IO helper ----
@@ -117,14 +120,34 @@ static box Main {
            ast_json = me.parse_program(src, stage3_mode)
        }

        // Optional prepasses driven by CLI flags (mapped from env by runner)
        local do_scopebox = 0
        local do_loopform = 0
        if args != null {
            local alen3 = args.length()
            local i3 = 0
            loop(i3 < alen3) {
                local a3 = args.get(i3)
                if a3 == "--scopebox" { do_scopebox = 1 }
                if a3 == "--loopform" { do_loopform = 1 }
                i3 = i3 + 1
            }
        }

        if emit_mir == 1 {
            // Lower minimal AST to MIR JSON (Return(Int) only for MVP)
            local mir = new MirEmitterBoxMod.MirEmitterBox()
            json = mir.emit_mir_min(ast_json)
            local aj = ast_json
            if do_scopebox == 1 { aj = new ScopeBoxInject().apply(aj) }
            if do_loopform == 1 { aj = new LoopFormNormalize().apply(aj) }
            json = mir.emit_mir_min(aj)
        } else {
            // Emit Stage‑1 JSON with metadata
            local emitter = new EmitterBoxMod.EmitterBox()
            json = emitter.emit_program(ast_json, me._usings)
            local aj2 = ast_json
            if do_scopebox == 1 { aj2 = new ScopeBoxInject().apply(aj2) }
            if do_loopform == 1 { aj2 = new LoopFormNormalize().apply(aj2) }
            json = emitter.emit_program(aj2, me._usings)
        }

        // Output JSON
apps/tests/loop_if_phi_continue.nyash (new file, 19 lines)
@@ -0,0 +1,19 @@
static box Main {
    main(args) {
        local i = 0
        local printed = 0
        loop(i < 6) {
            if (i % 2 == 0) {
                i = i + 1
                printed = printed + 1
                continue
            } else {
                i = i + 2
            }
        }
        print(i)
        print(printed)
        return 0
    }
}
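// Hand-traced expectation (editorial, not part of the commit): i advances 0 → 1 → 3 → 5 → 7
// and the even branch runs only once, so the program prints 7 then 1 and returns 0.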
docs/development/roadmap/selfhosting-ny-executor.md (new file, 84 lines)
@@ -0,0 +1,84 @@
# Self-hosting Ny Executor (Nyash → MIR → Ny execution) plan

Updated: 2025-09-22

## Goals
- Keep the existing Python PyVM as the reference executor while gradually bringing up a minimal executor (Ny Executor) that runs MIR (JSON) in Nyash itself.
- Do not break the LLVM line or the existing CI; respect the feature-pause principles (no spec changes, default OFF).

## Principles
- Semantics are decided on the MIR side: MIR decides PHI, and llvmlite only materializes the IR phi.
- Small, guarded steps: new paths are gated behind explicit default-OFF flags and stay easy to roll back.
- Parity first: use PyVM/LLVM as oracles and confirm agreement via goldens and smokes.
- Observability (TRACE) and a safety valve (STEP_MAX).

## Flags (default OFF)
- `NYASH_SELFHOST_EXEC=1`: enable the Ny Executor (delegate to the Ny script instead of PyVM).
- `NYASH_SELFHOST_TRACE=1`: structured logging for the Ny Executor (JSON lines or formatted strings).
- `NYASH_SELFHOST_STEP_MAX=<int>`: maximum instructions per run (default roughly 200000).
- `NYASH_SELFHOST_STRICT=1`: strict mode (stronger type/value checks, reject unknown externs).
- For reference: `NYASH_MIR_NO_PHI=1` (development only, used to exercise the edge-copy path)

## Layout (new Ny files)
- `apps/selfhost-runtime/`
  - `mir_loader.nyash`: MIR (JSON v0) loader
  - `ops_core.nyash`: const/binop/compare/branch/jump/ret/phi
  - `ops_calls.nyash`: call/externcall/boxcall (MVP)
  - `boxes_std.nyash`: minimal String/Array/Map/Console methods
  - `runner.nyash`: entry point / dispatch / step guard / TRACE

On the Rust runner side, MIR (JSON) is handed to the Ny Executor only when `NYASH_SELFHOST_EXEC=1` is detected on the PyVM path (the default remains PyVM).

## Stage plan and acceptance criteria

### Stage 0 — scaffold (1–2 days)
- Flags/entry wiring only; the Ny Executor stays a no-op and exits 0.
- Acceptance: build green, default behavior unchanged, no-op run possible with the flag ON.

### Stage 1 — MIR loader (2–3 days)
- Load JSON v0 in `mir_loader.nyash` and expand it into function/block/instruction structures (summary only at first).
- Acceptance: load-only smoke (verifying counts of syntactic elements).
- Note: early on, the PyVM harness MIR JSON (`{"functions":…}`) is also accepted and only summarized (function count) (default OFF).

### Stage 2 — core instructions (3–5 days)
- Implement `const/binop/compare/branch/jump/ret/phi` in `ops_core.nyash`.
- Implement the step budget and TRACE in `runner.nyash`.
- Acceptance: stdout/exit code match PyVM on a set of small MIR programs.

### Stage 3 — call/externcall/boxcall (4–7 days)
- Implement function calls / extern calls / Box method calls (MVP) in `ops_calls.nyash`.
- Round out the minimal String/Console methods; reject unknown externs under STRICT=1.
- Acceptance: parity green on the existing small smokes (strings, print family).

### Stage 4 — collections / JSON adapter alignment (4–7 days)
- Implement the minimal Array/Map methods (size/len/get/set/push, etc.) on the Ny side.
- Acceptance: parity green on the existing collection smokes in `apps/tests/`.

### Stage 5 — using/seam stabilization (3–5 days)
- using resolution stays on the Rust side; the Ny Executor consumes MIR (JSON) only.
- Run representative cases with the seam guards ON (`NYASH_RESOLVE_FIX_BRACES`/`NYASH_RESOLVE_DEDUP_BOX`).
- Acceptance: no hangs and matching output on mixed-using smokes.

### Stage 6 — parity harness / CI (5–7 days)
- Add `tools/parity.sh --lhs selfhost --rhs pyvm`.
- Add a non-blocking Ny Executor job to CI (promote to blocking once stably green).
- Acceptance: selfhost = PyVM on the representative set (optional LLVM comparison where the environment allows).

### Stage 7 — speed / diagnostics / hardening (ongoing)
- Hot-path optimization, TRACE compaction, richer error messages, more boundary-case tests.

## Rollback / safeguards
- The default is PyVM; the Ny Executor is never enabled unless `NYASH_SELFHOST_EXEC=1` is set explicitly.
- All additions land as minimal, default-OFF diffs; turning the flag OFF restores the original state immediately.

## Risks and mitigations
- MIR JSON schema changes: the loader checks the version; unknown fields are safely ignored or fail explicitly.
- PHI/loop corner cases: MIR decides the semantics (PHI or edge copy); watch with goldens against PyVM/LLVM.
- Boxcall divergence: implement the MVP only, reject unknowns under STRICT, and compare on every smoke.

## Timeline (rough)
- Stage 0–1: 3–5 days / Stage 2: 3–5 days / Stage 3–4: 8–14 days / Stage 5–6: 8–12 days / hardening: ongoing

## Acceptance indicators (lightweight)
- Matching stdout/exit codes, no hangs, zero diffs in TRACE (optional).
- CI stays green (existing jobs unchanged); the new selfhost job starts non-blocking.
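The Stage 2 bullet implies a small fetch-dispatch loop with a step budget and optional trace. A hedged Rust sketch of that shape follows; the instruction set and names are illustrative and do not reflect the actual MIR schema:

```rust
// Sketch only: the Stage 2 execution-loop shape (step budget + trace), not the real runner.
#[derive(Debug)]
enum Instr {
    Const { dst: usize, value: i64 },
    BinAdd { dst: usize, a: usize, b: usize },
    Ret { src: usize },
}

fn run(body: &[Instr], step_max: usize, trace: bool) -> Result<i64, String> {
    let mut regs = vec![0i64; 16]; // toy register file, large enough for the sketch
    let mut steps = 0usize;
    let mut pc = 0usize;
    while pc < body.len() {
        steps += 1;
        if steps > step_max {
            return Err(format!("step budget exceeded ({step_max})"));
        }
        if trace {
            eprintln!("[trace] pc={pc} {:?}", body[pc]);
        }
        match &body[pc] {
            Instr::Const { dst, value } => regs[*dst] = *value,
            Instr::BinAdd { dst, a, b } => regs[*dst] = regs[*a] + regs[*b],
            Instr::Ret { src } => return Ok(regs[*src]),
        }
        pc += 1;
    }
    Ok(0)
}
```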
@@ -126,9 +126,13 @@ for / foreach sugar and normalization (overview)
- `tools/test/golden/macro/loop_two_vars_user_macro_golden.sh`
- Output-match smoke (VM)
  - `tools/test/smoke/macro/loop_two_vars_output_smoke.sh`
- Self-host pre-expansion (via PyVM)
  - `NYASH_VM_USE_PY=1 NYASH_USE_NY_COMPILER=1 NYASH_MACRO_ENABLE=1 NYASH_MACRO_PATHS=apps/macros/examples/loop_normalize_macro.nyash ./target/release/nyash --macro-preexpand --backend vm apps/tests/macro/loopform/simple.nyash`

Selfhost compiler prepass (identity → minimal normalization)
- The Runner maps `NYASH_LOOPFORM_NORMALIZE=1` to `--loopform`, passes it to the child, and applies the prepass in `apps/lib/loopform_normalize.nyash` (currently identity).
- Default OFF; key-order normalization and then simple carrier alignment will be added in stages.

Implementation notes (built-in transform route / Rust)
- The default macro execution is internal-child (built into Rust). LoopNormalize normalizes under the conservative guards below.
  - The top-level body contains no Break/Continue
@@ -6,6 +6,7 @@ Overview
How to enable
- Inject ScopeBox wrappers during core normalization by setting:
  - `NYASH_SCOPEBOX_ENABLE=1`
- Selfhost compiler path: Runner maps `NYASH_SCOPEBOX_ENABLE=1` to child arg `--scopebox` and applies a JSON prepass via `apps/lib/scopebox_inject.nyash` (currently identity: syntax check only).
- Injection points:
  - If.then / If.else bodies
  - Loop.body
@@ -29,3 +30,5 @@ MIR Scope Hints (unified env)
Zero-cost policy
- ScopeBox is removed implicitly during MIR lowering (treated as a block). ScopeEnter/ScopeLeave hints are observational only. Execution and IR are unchanged.

Notes (Selfhost path)
- For now no `ScopeBox` type is introduced into JSON v0 (compatibility is preserved); the prepass starts as identity, and safe wrapping / hint emission will be considered later.
@@ -22,6 +22,7 @@ language/
├── concurrency/              # concurrency Boxes (design complete, documented)
├── flow-blocks/              # flow operators (design complete, documented)
├── scope-reuse/              # scope operators (design complete, documented)
├── pure-functional-blocks.md # [] pure functional blocks vs {} ordinary blocks (NEW!)
├── pattern-matching/         # pattern-matching extensions
├── async-await/              # async syntax sugar
└── metaprogramming/          # metaprogramming features
@@ -76,6 +77,7 @@ experimental/

### 🔥 High priority (implement right after bootstrap)
- **CAX (C-ABI Explorer)**: novel debugging tool (claimed as a world first)
- **Pure Functional []Blocks**: pure functional blocks vs ordinary {} blocks (NEW!)
- **Nyash Self-VM**: unify the Python/Rust VMs
- **Flow Blocks**: design complete, implementation remains
- **Concurrency Boxes**: concurrency beyond Go
docs/ideas/language/pure-functional-blocks.md (new file, 158 lines)
@@ -0,0 +1,158 @@
# Design proposal: pure functional [] blocks vs ordinary {} blocks

## Overview

A design that clearly differentiates Nyash's **{} local scope** from a **[] pure scope**.
Within the "everything is a Box" philosophy, it lets ordinary side-effecting processing and pure functional processing coexist.

## Basic differentiation

### {} block (existing): the ordinary Nyash scope
```nyash
{
    local counter = new IntegerBox(0)   // creating Boxes is OK
    counter.set(counter.get() + 1)      // side effects are OK
    SomeGlobalBox.modify()              // mutating external state is OK
    ChannelBox.send("data")             // I/O side effects are OK
    // ordinary Nyash syntax, no restrictions
}
```

### [] block (proposed): the pure functional scope
```nyash
[
    // the compiler strictly checks the following:
    input.map(x => x * 2)            // ✅ pure transformations only
        .filter(x => x > 10)         // ✅ no side effects
        .reduce((a, b) => a + b, 0)  // ✅ referential transparency guaranteed

    // the following are compile errors:
    // SomeBox.mutate()              // ❌ mutating external state is forbidden
    // print("debug")                // ❌ I/O side effects are forbidden
    // local x = 1; x = 2            // ❌ variable reassignment is forbidden
]
```

## Deeper differentiation points

### 1. Compiler-level constraints
- **{}**: no constraints, ordinary Nyash syntax
- **[]**: strict purity checking; detected side effects are compile errors

### 2. Memory model
- **{}**: new Boxes can be created and mutated
- **[]**: only transformations of existing data; creating new state is forbidden

### 3. Concurrency safety
- **{}**: data races are possible
- **[]**: no side effects, hence automatically thread-safe

### 4. Optimization
- **{}**: ordinary optimizations
- **[]**: the compiler can optimize aggressively (purity is guaranteed)

### 5. Debuggability
- **{}**: ordinary debugger
- **[]**: only input → output needs tracking, so debugging is much simpler

## Practical usage example

```nyash
box DataProcessor {
    processData(input: ArrayBox) -> ArrayBox {
        // ordinary {} scope: setup and post-processing
        {
            local logger = new LoggerBox()
            logger.info("Processing started")

            // pure [] scope: core computation
            local result = [
                input
                    .filter(x => x > 0)
                    .map(x => Math.sqrt(x))
                    .sort()
            ]

            logger.info("Processing completed")
            return result
        }
    }
}
```

## The real value of [] blocks

### 1. Mathematical provability
Processing inside [] can be proven mathematically

### 2. Full reproducibility
Same input → always the same output

### 3. Automatic parallelization
The compiler can parallelize automatically

### 4. Hot-swappable
The [] parts can be swapped safely even while the program is running

## Implementation strategy via Boxes

### PureCalculatorBox example
```nyash
box PureCalculatorBox {
    birth() { /* no state, minimal initialization */ }

    calculate(input: ArrayBox) -> ArrayBox {
        // use a [] block so the language guarantees purity
        local result = [
            input.map(x => x * 2).filter(x => x > 10)
        ]
        return result
    }
}
```

### The compiler itself as Boxes
The "build a box that analyzes only []" approach:
- `PurityCheckerBox`: strictly checks the purity of [] blocks
- `ParserBox`: parsing dedicated to []
- `OptimizerBox`: optimizations dedicated to []

## Philosophical integration

### Coexistence of {} and [] blocks
- **{}**: "the interface to the real world" (side effects, state management)
- **[]**: "the core of pure computation" (logic, transformation, calculation)

With this differentiation, Nyash can offer:
- **practicality** (flexible state management in {})
- **mathematical elegance** (purity in [])

satisfying these **seemingly conflicting requirements at the same time**.

## Consistency with the "everything is a Box" philosophy

This does not break the "everything is a Box" philosophy; rather, it **clearly separates different kinds of Boxes (side-effecting vs pure)**.

- Boxes with side effects are used inside {}
- Pure Boxes are used inside []
- Each kind of Box does its best work in the appropriate scope

## Implementation challenges and solutions

### Challenges
1. Increased compiler complexity
2. Data exchange between [] and {}
3. Clarity of error messages

### Solutions
1. Isolate the complexity in Box-ified compiler components (PurityCheckerBox, etc.)
2. A clear data-conversion API ([] → {} is always safe; {} → [] requires a purity check)
3. Context-specific error messages

## Conclusion

With this design, Nyash could become a first-of-its-kind "hybrid pure language" that enjoys the benefits of pure functional programming while still handling practical side effects naturally.

**Date**: 2025-09-22
**Status**: idea stage
**Priority**: recommended for consideration in Phase 16 or later
@@ -6,7 +6,6 @@ mod args;
mod groups;
mod utils;

use groups::*;

/// Command-line configuration structure
#[derive(Debug, Clone)]
@@ -217,4 +216,3 @@ mod tests {
        assert_eq!(config.iterations, 10);
    }
}
@@ -106,8 +106,9 @@ pub fn await_max_ms() -> u64 {
        .unwrap_or(5000)
}

// ---- MIR PHI-less (edge-copy) mode ----
// ---- MIR PHI / PHI-less (edge-copy) mode ----
/// Enable MIR PHI non-generation. Bridge/Builder emit edge copies instead of PHI.
/// Default: PHI-ON when the build supports it (feature `phi-legacy`), otherwise fall back to PHI-OFF.
pub fn mir_no_phi() -> bool {
    match std::env::var("NYASH_MIR_NO_PHI").ok() {
        Some(v) => {
@@ -116,7 +117,7 @@ pub fn mir_no_phi() -> bool {
            if requested_no_phi {
                return true;
            }
            // PHI-on requested
            // PHI-on requested explicitly
            #[cfg(feature = "phi-legacy")]
            {
                return false;
@@ -131,8 +132,22 @@ pub fn mir_no_phi() -> bool {
                return true;
            }
        }
        // Default: ON for MIR13 stability (PHI generation off by default)
        None => true,
        None => {
            // Default preference: PHI-ON if available
            #[cfg(feature = "phi-legacy")]
            {
                return false;
            }
            #[cfg(not(feature = "phi-legacy"))]
            {
                if PHI_ON_GATED_WARNED.set(()).is_ok() {
                    eprintln!(
                        "[nyash] Default PHI-on requested but build lacks 'phi-legacy' feature; falling back to PHI-off."
                    );
                }
                return true;
            }
        }
    }
}
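To make the new default easier to scan, the resolution above can be summarized as a pure decision function. This is a hedged paraphrase of the branches shown, with the raw env value passed in and the feature flag as a bool; the exact truthiness check for explicit PHI-ON is an assumption:

```rust
// Sketch only: summarizes mir_no_phi() as a decision table.
// env is the raw NYASH_MIR_NO_PHI value, phi_legacy is whether the build has the feature.
fn resolve_no_phi(env: Option<&str>, phi_legacy: bool) -> bool {
    match env {
        Some("1") => true,          // explicit PHI-OFF always wins
        Some(_) => !phi_legacy,     // explicit PHI-ON, honored only when the build supports it
        None => !phi_legacy,        // default: PHI-ON if available, else PHI-OFF with a warning
    }
}
```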
@@ -5,6 +5,9 @@
*/

use super::{Effect, EffectMask, ValueId};
use crate::mir::types::{
    BarrierOp, BinaryOp, CompareOp, ConstValue, MirType, TypeOpKind, UnaryOp, WeakRefOp,
};
// use crate::value::NyashValue; // Commented out to avoid circular dependency
use std::fmt;

@@ -291,76 +294,7 @@ pub enum MirInstruction {
    },
}

/// Constant values in MIR
#[derive(Debug, Clone, PartialEq)]
pub enum ConstValue {
    Integer(i64),
    Float(f64),
    Bool(bool),
    String(String),
    Null,
    Void,
}

/// Binary operations
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum BinaryOp {
    // Arithmetic
    Add,
    Sub,
    Mul,
    Div,
    Mod,

    // Bitwise
    BitAnd,
    BitOr,
    BitXor,
    Shl,
    Shr,

    // Logical
    And,
    Or,
}

/// Unary operations
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum UnaryOp {
    // Arithmetic
    Neg,

    // Logical
    Not,

    // Bitwise
    BitNot,
}

/// Comparison operations
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum CompareOp {
    Eq,
    Ne,
    Lt,
    Le,
    Gt,
    Ge,
}

/// MIR type system
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum MirType {
    Integer,
    Float,
    Bool,
    String,
    Box(String), // Box type with name
    Array(Box<MirType>),
    Future(Box<MirType>), // Future containing a type
    Void,
    Unknown,
}
// types moved to crate::mir::types
impl MirInstruction {
    /// Get the effect mask for this instruction
@@ -587,26 +521,7 @@
    }
}

/// Kind of unified type operation (PoC)
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum TypeOpKind {
    Check,
    Cast,
}

/// Kind of unified weak reference operation (PoC)
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum WeakRefOp {
    New,
    Load,
}

/// Kind of unified barrier operation (PoC)
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum BarrierOp {
    Read,
    Write,
}
// enums TypeOpKind/WeakRefOp/BarrierOp moved to crate::mir::types

impl ConstValue {
    /*

@@ -767,18 +682,7 @@ impl fmt::Display for MirInstruction {
    }
}

impl fmt::Display for ConstValue {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            ConstValue::Integer(n) => write!(f, "{}", n),
            ConstValue::Float(fl) => write!(f, "{}", fl),
            ConstValue::Bool(b) => write!(f, "{}", b),
            ConstValue::String(s) => write!(f, "\"{}\"", s),
            ConstValue::Null => write!(f, "null"),
            ConstValue::Void => write!(f, "void"),
        }
    }
}
// Display for ConstValue moved to crate::mir::types

#[cfg(test)]
@@ -6,9 +6,10 @@
//! in `MirInstruction`.

use super::{BasicBlockId, ConstValue, Effect, EffectMask, ValueId};
use crate::mir::instruction::{
    BarrierOp as MirBarrierOp, BinaryOp as MirBinOp, MirInstruction, MirType,
    TypeOpKind as MirTypeOpKind, WeakRefOp as MirWeakRefOp,
use crate::mir::instruction::MirInstruction;
use crate::mir::types::{
    BarrierOp as MirBarrierOp, BinaryOp as MirBinOp, MirType, TypeOpKind as MirTypeOpKind,
    WeakRefOp as MirWeakRefOp,
};

// Local macro utilities for generating InstructionMeta boilerplate.

@@ -114,6 +115,13 @@ pub fn effects_via_meta(i: &MirInstruction) -> Option<EffectMask> {
    if let Some(k) = JumpInst::from_mir(i) { return Some(k.effects()); }
    if let Some(k) = PrintInst::from_mir(i) { return Some(k.effects()); }
    if let Some(k) = DebugInst::from_mir(i) { return Some(k.effects()); }
    if let Some(k) = TypeCheckInst::from_mir(i) { return Some(k.effects()); }
    if let Some(k) = CopyInst::from_mir(i) { return Some(k.effects()); }
    if let Some(k) = NopInst::from_mir(i) { return Some(k.effects()); }
    if let Some(k) = ThrowInst::from_mir(i) { return Some(k.effects()); }
    if let Some(k) = CatchInst::from_mir(i) { return Some(k.effects()); }
    if let Some(k) = SafepointInst::from_mir(i) { return Some(k.effects()); }
    if let Some(k) = FunctionNewInst::from_mir(i) { return Some(k.effects()); }
    None
}

@@ -136,6 +144,13 @@ pub fn dst_via_meta(i: &MirInstruction) -> Option<ValueId> {
    if let Some(_k) = PrintInst::from_mir(i) { return None; }
    if let Some(_k) = DebugInst::from_mir(i) { return None; }
    if let Some(k) = CallLikeInst::from_mir(i) { return k.dst(); }
    if let Some(k) = TypeCheckInst::from_mir(i) { return k.dst(); }
    if let Some(k) = CopyInst::from_mir(i) { return k.dst(); }
    if let Some(_k) = NopInst::from_mir(i) { return None; }
    if let Some(_k) = ThrowInst::from_mir(i) { return None; }
    if let Some(k) = CatchInst::from_mir(i) { return k.dst(); }
    if let Some(_k) = SafepointInst::from_mir(i) { return None; }
    if let Some(k) = FunctionNewInst::from_mir(i) { return k.dst(); }
    None
}

@@ -158,6 +173,13 @@ pub fn used_via_meta(i: &MirInstruction) -> Option<Vec<ValueId>> {
    if let Some(k) = PrintInst::from_mir(i) { return Some(k.used()); }
    if let Some(k) = DebugInst::from_mir(i) { return Some(k.used()); }
    if let Some(k) = CallLikeInst::from_mir(i) { return Some(k.used()); }
    if let Some(k) = TypeCheckInst::from_mir(i) { return Some(k.used()); }
    if let Some(k) = CopyInst::from_mir(i) { return Some(k.used()); }
    if let Some(k) = NopInst::from_mir(i) { return Some(k.used()); }
    if let Some(k) = ThrowInst::from_mir(i) { return Some(k.used()); }
    if let Some(k) = CatchInst::from_mir(i) { return Some(k.used()); }
    if let Some(k) = SafepointInst::from_mir(i) { return Some(k.used()); }
    if let Some(k) = FunctionNewInst::from_mir(i) { return Some(k.used()); }
    None
}
@@ -369,6 +391,17 @@ inst_meta! {
    }
}

// ---- FunctionNew ---- (macro-generated)
inst_meta! {
    pub struct FunctionNewInst { dst: ValueId, captures: Vec<(String, ValueId)>, me: Option<ValueId> }
    => {
        from_mir = |i| match i { MirInstruction::FunctionNew { dst, captures, me, .. } => Some(FunctionNewInst { dst: *dst, captures: captures.clone(), me: *me }), _ => None };
        effects = |_: &Self| EffectMask::PURE.add(Effect::Alloc);
        dst = |s: &Self| Some(s.dst);
        used = |s: &Self| { let mut v: Vec<ValueId> = s.captures.iter().map(|(_, id)| *id).collect(); if let Some(m) = s.me { v.push(m); } v };
    }
}

// ---- Store ---- (macro-generated)
inst_meta! {
    pub struct StoreInst { value: ValueId, ptr: ValueId }
@@ -446,6 +479,72 @@ inst_meta! {
    }
}

// ---- TypeCheck ---- (macro-generated)
inst_meta! {
    pub struct TypeCheckInst { dst: ValueId, value: ValueId }
    => {
        from_mir = |i| match i { MirInstruction::TypeCheck { dst, value, .. } => Some(TypeCheckInst { dst: *dst, value: *value }), _ => None };
        effects = |_: &Self| EffectMask::PURE;
        dst = |s: &Self| Some(s.dst);
        used = |s: &Self| vec![s.value];
    }
}

// ---- Copy ---- (macro-generated)
inst_meta! {
    pub struct CopyInst { dst: ValueId, src: ValueId }
    => {
        from_mir = |i| match i { MirInstruction::Copy { dst, src } => Some(CopyInst { dst: *dst, src: *src }), _ => None };
        effects = |_: &Self| EffectMask::PURE;
        dst = |s: &Self| Some(s.dst);
        used = |s: &Self| vec![s.src];
    }
}

// ---- Nop ---- (macro-generated)
inst_meta! {
    pub struct NopInst { }
    => {
        from_mir = |i| match i { MirInstruction::Nop => Some(NopInst {}), _ => None };
        effects = |_: &Self| EffectMask::PURE;
        dst = |_: &Self| None;
        used = |_: &Self| Vec::new();
    }
}

// ---- Throw ---- (macro-generated)
inst_meta! {
    pub struct ThrowInst { exception: ValueId, effects_mask: EffectMask }
    => {
        from_mir = |i| match i { MirInstruction::Throw { exception, effects } => Some(ThrowInst { exception: *exception, effects_mask: *effects }), _ => None };
        effects = |s: &Self| s.effects_mask;
        dst = |_: &Self| None;
        used = |s: &Self| vec![s.exception];
    }
}

// ---- Catch ---- (macro-generated)
inst_meta! {
    pub struct CatchInst { exception_value: ValueId }
    => {
        from_mir = |i| match i { MirInstruction::Catch { exception_value, .. } => Some(CatchInst { exception_value: *exception_value }), _ => None };
        effects = |_: &Self| EffectMask::CONTROL;
        dst = |s: &Self| Some(s.exception_value);
        used = |_: &Self| Vec::new();
    }
}

// ---- Safepoint ---- (macro-generated)
inst_meta! {
    pub struct SafepointInst { }
    => {
        from_mir = |i| match i { MirInstruction::Safepoint => Some(SafepointInst {}), _ => None };
        effects = |_: &Self| EffectMask::PURE;
        dst = |_: &Self| None;
        used = |_: &Self| Vec::new();
    }
}

// ---- Call-like (dst/used only; effects fallback in MirInstruction) ----
#[derive(Debug, Clone)]
pub enum CallLikeInst {
@@ -75,7 +75,7 @@ pub fn build_simple_loop<L: LoopBuilderApi>(
    let void_id = lb.new_value();
    lb.emit(MirInstruction::Const {
        dst: void_id,
        value: super::instruction::ConstValue::Void,
        value: crate::mir::ConstValue::Void,
    })?;
    Ok(void_id)
}
@@ -42,47 +42,14 @@ pub struct LoopBuilder<'a> {
    no_phi_mode: bool,
}

// Local copy: detect a variable name assigned within an AST fragment
fn extract_assigned_var_local(ast: &ASTNode) -> Option<String> {
    match ast {
        ASTNode::Assignment { target, .. } => {
            if let ASTNode::Variable { name, .. } = target.as_ref() {
                Some(name.clone())
            } else {
                None
            }
        }
        ASTNode::Program { statements, .. } => statements
            .last()
            .and_then(|st| extract_assigned_var_local(st)),
        ASTNode::If {
            then_body,
            else_body,
            ..
        } => {
            let then_prog = ASTNode::Program {
                statements: then_body.clone(),
                span: crate::ast::Span::unknown(),
            };
            let tvar = extract_assigned_var_local(&then_prog);
            let evar = else_body.as_ref().and_then(|eb| {
                let ep = ASTNode::Program {
                    statements: eb.clone(),
                    span: crate::ast::Span::unknown(),
                };
                extract_assigned_var_local(&ep)
            });
            match (tvar, evar) {
                (Some(tv), Some(ev)) if tv == ev => Some(tv),
                _ => None,
            }
        }
        _ => None,
    }
}
// (removed) extract_assigned_var_local was a local helper used during
// diagnostics and is no longer referenced. Keep the file lean and avoid
// dead_code warnings.

impl<'a> LoopBuilder<'a> {
    // --- Small helpers for continue/break commonization ---
    // =============================================================
    // Control Helpers — break/continue/jumps/unreachable handling
    // =============================================================

    /// Emit a jump to `target` from the current block and record predecessor metadata.
    fn jump_with_pred(&mut self, target: BasicBlockId) -> Result<(), String> {
@@ -124,6 +91,9 @@ impl<'a> LoopBuilder<'a> {
        self.switch_to_unreachable_block_with_void()
    }

    // =============================================================
    // Lifecycle — create builder, main loop construction
    // =============================================================
    /// Create a new loop builder
    pub fn new(parent: &'a mut super::builder::MirBuilder) -> Self {
        let no_phi_mode = parent.is_no_phi_mode();
@@ -272,6 +242,9 @@ impl<'a> LoopBuilder<'a> {
        Ok(void_dst)
    }

    // =============================================================
    // PHI Helpers — prepare/finalize PHIs and block sealing
    // =============================================================
    /// Prepare loop variables (pre-detected or lazily created)
    fn prepare_loop_variables(
        &mut self,
@@ -400,6 +373,7 @@ impl<'a> LoopBuilder<'a> {
            .emit_instruction(MirInstruction::Const { dst, value })
    }

    /// Insert a PHI instruction at the start of a block (invariant: PHIs always come first in the block)
    fn emit_phi_at_block_start(
        &mut self,
        block_id: BasicBlockId,
@@ -454,6 +428,9 @@ impl<'a> LoopBuilder<'a> {
        }
    }

    // =============================================================
    // Variable Map Utilities — snapshots and rebinding
    // =============================================================
    fn get_current_variable_map(&self) -> HashMap<String, ValueId> {
        self.parent_builder.variable_map.clone()
    }
@@ -509,192 +486,157 @@ impl<'a> LoopBuilder<'a> {
                void_id
            }))
        }
        ASTNode::If {
            condition,
            then_body,
            else_body,
            ..
        } => {
            // Lower a simple if inside loop, ensuring continue/break inside branches are handled
            let cond_val = self.parent_builder.build_expression(*condition.clone())?;
            let then_bb = self.new_block();
            let else_bb = self.new_block();
            let merge_bb = self.new_block();
            self.emit_branch(cond_val, then_bb, else_bb)?;

            // Capture pre-if variable map (used for phi normalization)
            let pre_if_var_map = self.get_current_variable_map();
            let pre_then_var_value = pre_if_var_map.clone();

            // then
            self.set_current_block(then_bb)?;
            for s in then_body.iter().cloned() {
                let _ = self.build_statement(s)?;
                // Stop if block terminated
                let cur_id = self.current_block()?;
                let terminated = {
                    if let Some(ref fun_ro) = self.parent_builder.current_function {
                        if let Some(bb) = fun_ro.get_block(cur_id) {
                            bb.is_terminated()
                        } else {
                            false
                        }
                    } else {
                        false
                    }
                };
                if terminated {
                    break;
                }
            }
            let then_var_map_end = self.get_current_variable_map();
            // Only jump to merge if not already terminated (e.g., continue/break)
            let then_reaches_merge = {
                let cur_id = self.current_block()?;
                let need_jump = {
                    if let Some(ref fun_ro) = self.parent_builder.current_function {
                        if let Some(bb) = fun_ro.get_block(cur_id) {
                            !bb.is_terminated()
                        } else {
                            false
                        }
                    } else {
                        false
                    }
                };
                if need_jump {
                    self.emit_jump(merge_bb)?;
                    true
                } else {
                    false
                }
            };

            // else
            self.set_current_block(else_bb)?;
            let mut else_var_map_end_opt: Option<
                std::collections::HashMap<String, super::ValueId>,
            > = None;
            if let Some(es) = else_body.clone() {
                for s in es.into_iter() {
                    let _ = self.build_statement(s)?;
                    let cur_id = self.current_block()?;
                    let terminated = {
                        if let Some(ref fun_ro) = self.parent_builder.current_function {
                            if let Some(bb) = fun_ro.get_block(cur_id) {
                                bb.is_terminated()
                            } else {
                                false
                            }
                        } else {
                            false
                        }
                    };
                    if terminated {
                        break;
                    }
                }
                else_var_map_end_opt = Some(self.get_current_variable_map());
            }
            let else_reaches_merge = {
                let cur_id = self.current_block()?;
                let need_jump = {
                    if let Some(ref fun_ro) = self.parent_builder.current_function {
                        if let Some(bb) = fun_ro.get_block(cur_id) {
                            !bb.is_terminated()
                        } else {
                            false
                        }
                    } else {
                        false
                    }
                };
                if need_jump {
                    self.emit_jump(merge_bb)?;
                    true
                } else {
                    false
                }
            };

            // Continue at merge
            self.set_current_block(merge_bb)?;
            // If branches assign variables, emit PHIs per variable and bind them.
            // Previous logic handled only a single variable; here we generalize to all assigned vars.
            fn collect_assigned_vars(ast: &ASTNode, out: &mut std::collections::HashSet<String>) {
                match ast {
                    ASTNode::Assignment { target, .. } => {
                        if let ASTNode::Variable { name, .. } = target.as_ref() {
                            out.insert(name.clone());
                        }
                    }
                    ASTNode::Program { statements, .. } => {
                        for s in statements { collect_assigned_vars(s, out); }
                    }
                    ASTNode::If { then_body, else_body, .. } => {
                        let tp = ASTNode::Program { statements: then_body.clone(), span: crate::ast::Span::unknown() };
                        collect_assigned_vars(&tp, out);
                        if let Some(eb) = else_body {
                            let ep = ASTNode::Program { statements: eb.clone(), span: crate::ast::Span::unknown() };
                            collect_assigned_vars(&ep, out);
                        }
                    }
                    _ => {}
                }
            }

            let mut vars: std::collections::HashSet<String> = std::collections::HashSet::new();
            let then_prog = ASTNode::Program { statements: then_body.clone(), span: crate::ast::Span::unknown() };
            collect_assigned_vars(&then_prog, &mut vars);
            if let Some(es) = &else_body {
                let else_prog = ASTNode::Program { statements: es.clone(), span: crate::ast::Span::unknown() };
                collect_assigned_vars(&else_prog, &mut vars);
            }

            // Reset to pre-if map before rebinding to ensure a clean environment
            self.parent_builder.variable_map = pre_if_var_map.clone();
            for var_name in vars.into_iter() {
                // then-side value: from then end map if assigned there; otherwise pre-if value
                let then_val = then_var_map_end.get(&var_name).copied().or_else(|| pre_then_var_value.get(&var_name).copied());
                // else-side value: prefer else end map when else assigns; otherwise pre-if value
                let else_val = else_var_map_end_opt
                    .as_ref()
                    .and_then(|m| m.get(&var_name).copied())
                    .or_else(|| pre_then_var_value.get(&var_name).copied());

                if let (Some(tv), Some(ev)) = (then_val, else_val) {
                    // Build incoming list only for predecessors that actually reach merge
                    let mut incomings: Vec<(BasicBlockId, ValueId)> = Vec::new();
                    if then_reaches_merge { incomings.push((then_bb, tv)); }
                    if else_reaches_merge { incomings.push((else_bb, ev)); }
                    match incomings.len() {
                        0 => { /* neither branch reaches merge: nothing to bind */ }
                        1 => {
                            // Single predecessor reaching merge: no PHI needed
                            let (_pred, v) = incomings[0];
                            self.parent_builder.variable_map.insert(var_name, v);
                        }
                        _ => {
                            let phi_id = self.new_value();
                            if self.no_phi_mode {
                                for (pred, v) in incomings.iter().copied() {
                                    self.parent_builder.insert_edge_copy(pred, phi_id, v)?;
                                }
                            } else {
                                self.emit_phi_at_block_start(merge_bb, phi_id, incomings)?;
                            }
                            self.parent_builder.variable_map.insert(var_name, phi_id);
                        }
                    }
                }
            }
            let void_id = self.new_value();
            self.emit_const(void_id, ConstValue::Void)?;
            Ok(void_id)
}
|
||||
ASTNode::If { condition, then_body, else_body, .. } =>
|
||||
self.lower_if_in_loop(*condition, then_body, else_body),
|
||||
ASTNode::Break { .. } => self.do_break(),
|
||||
ASTNode::Continue { .. } => self.do_continue(),
|
||||
other => self.parent_builder.build_expression(other),
|
||||
}
|
||||
}
|
||||
|
||||
/// Lower an if-statement inside a loop, preserving continue/break semantics and emitting PHIs per assigned variable.
|
||||
fn lower_if_in_loop(
|
||||
&mut self,
|
||||
condition: ASTNode,
|
||||
then_body: Vec<ASTNode>,
|
||||
else_body: Option<Vec<ASTNode>>,
|
||||
) -> Result<ValueId, String> {
|
||||
// Evaluate condition and create blocks
|
||||
let cond_val = self.parent_builder.build_expression(condition)?;
|
||||
let then_bb = self.new_block();
|
||||
let else_bb = self.new_block();
|
||||
let merge_bb = self.new_block();
|
||||
self.emit_branch(cond_val, then_bb, else_bb)?;
|
||||
|
||||
// Capture pre-if variable map (used for phi normalization)
|
||||
let pre_if_var_map = self.get_current_variable_map();
|
||||
let pre_then_var_value = pre_if_var_map.clone();
|
||||
|
||||
// then branch
|
||||
self.set_current_block(then_bb)?;
|
||||
for s in then_body.iter().cloned() {
|
||||
let _ = self.build_statement(s)?;
|
||||
// Stop if block terminated
|
||||
let cur_id = self.current_block()?;
|
||||
let terminated = {
|
||||
if let Some(ref fun_ro) = self.parent_builder.current_function {
|
||||
if let Some(bb) = fun_ro.get_block(cur_id) { bb.is_terminated() } else { false }
|
||||
} else { false }
|
||||
};
|
||||
if terminated { break; }
|
||||
}
|
||||
let then_var_map_end = self.get_current_variable_map();
|
||||
// Only jump to merge if not already terminated (e.g., continue/break)
|
||||
// Capture the actual predecessor block that reaches merge (entry block may not be the exit).
|
||||
let then_pred_to_merge: Option<BasicBlockId> = {
|
||||
let cur_id = self.current_block()?;
|
||||
let need_jump = {
|
||||
if let Some(ref fun_ro) = self.parent_builder.current_function {
|
||||
if let Some(bb) = fun_ro.get_block(cur_id) { !bb.is_terminated() } else { false }
|
||||
} else { false }
|
||||
};
|
||||
if need_jump {
|
||||
// Emit the edge now; record the real predecessor (cur_id), not the entry then_bb.
|
||||
self.emit_jump(merge_bb)?;
|
||||
Some(cur_id)
|
||||
} else {
|
||||
// Terminated path (e.g., continue/break) — no incoming to merge.
|
||||
None
|
||||
}
|
||||
};
|
||||
|
||||
// else branch
|
||||
self.set_current_block(else_bb)?;
|
||||
let mut else_var_map_end_opt: Option<HashMap<String, ValueId>> = None;
|
||||
if let Some(es) = else_body.clone() {
|
||||
for s in es.into_iter() {
|
||||
let _ = self.build_statement(s)?;
|
||||
let cur_id = self.current_block()?;
|
||||
let terminated = {
|
||||
if let Some(ref fun_ro) = self.parent_builder.current_function {
|
||||
if let Some(bb) = fun_ro.get_block(cur_id) { bb.is_terminated() } else { false }
|
||||
} else { false }
|
||||
};
|
||||
if terminated { break; }
|
||||
}
|
||||
else_var_map_end_opt = Some(self.get_current_variable_map());
|
||||
}
|
||||
let else_pred_to_merge: Option<BasicBlockId> = {
|
||||
let cur_id = self.current_block()?;
|
||||
let need_jump = {
|
||||
if let Some(ref fun_ro) = self.parent_builder.current_function {
|
||||
if let Some(bb) = fun_ro.get_block(cur_id) { !bb.is_terminated() } else { false }
|
||||
} else { false }
|
||||
};
|
||||
if need_jump {
|
||||
self.emit_jump(merge_bb)?;
|
||||
Some(cur_id)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
};
|
||||
|
||||
// Continue at merge
|
||||
self.set_current_block(merge_bb)?;
|
||||
// collect assigned variables in both branches
|
||||
fn collect_assigned_vars(ast: &ASTNode, out: &mut std::collections::HashSet<String>) {
|
||||
match ast {
|
||||
ASTNode::Assignment { target, .. } => {
|
||||
if let ASTNode::Variable { name, .. } = target.as_ref() { out.insert(name.clone()); }
|
||||
}
|
||||
ASTNode::Program { statements, .. } => { for s in statements { collect_assigned_vars(s, out); } }
|
||||
ASTNode::If { then_body, else_body, .. } => {
|
||||
let tp = ASTNode::Program { statements: then_body.clone(), span: crate::ast::Span::unknown() };
|
||||
collect_assigned_vars(&tp, out);
|
||||
if let Some(eb) = else_body {
|
||||
let ep = ASTNode::Program { statements: eb.clone(), span: crate::ast::Span::unknown() };
|
||||
collect_assigned_vars(&ep, out);
|
||||
}
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
|
||||
let mut vars: std::collections::HashSet<String> = std::collections::HashSet::new();
|
||||
let then_prog = ASTNode::Program { statements: then_body.clone(), span: crate::ast::Span::unknown() };
|
||||
collect_assigned_vars(&then_prog, &mut vars);
|
||||
if let Some(es) = &else_body {
|
||||
let else_prog = ASTNode::Program { statements: es.clone(), span: crate::ast::Span::unknown() };
|
||||
collect_assigned_vars(&else_prog, &mut vars);
|
||||
}
|
||||
|
||||
// Reset to pre-if map before rebinding to ensure a clean environment
|
||||
self.parent_builder.variable_map = pre_if_var_map.clone();
|
||||
for var_name in vars.into_iter() {
|
||||
let then_val = then_var_map_end.get(&var_name).copied().or_else(|| pre_then_var_value.get(&var_name).copied());
|
||||
let else_val = else_var_map_end_opt.as_ref().and_then(|m| m.get(&var_name).copied()).or_else(|| pre_then_var_value.get(&var_name).copied());
|
||||
|
||||
if let (Some(tv), Some(ev)) = (then_val, else_val) {
|
||||
let mut incomings: Vec<(BasicBlockId, ValueId)> = Vec::new();
|
||||
if let Some(pred) = then_pred_to_merge { incomings.push((pred, tv)); }
|
||||
if let Some(pred) = else_pred_to_merge { incomings.push((pred, ev)); }
|
||||
match incomings.len() {
|
||||
0 => {}
|
||||
1 => {
|
||||
let (_pred, v) = incomings[0];
|
||||
self.parent_builder.variable_map.insert(var_name, v);
|
||||
}
|
||||
_ => {
|
||||
let phi_id = self.new_value();
|
||||
if self.no_phi_mode {
|
||||
for (pred, v) in incomings.iter().copied() {
|
||||
self.parent_builder.insert_edge_copy(pred, phi_id, v)?;
|
||||
}
|
||||
} else {
|
||||
self.emit_phi_at_block_start(merge_bb, phi_id, incomings)?;
|
||||
}
|
||||
self.parent_builder.variable_map.insert(var_name, phi_id);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
let void_id = self.new_value();
|
||||
self.emit_const(void_id, ConstValue::Void)?;
|
||||
Ok(void_id)
|
||||
}
|
||||
}
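
For reference, a minimal standalone sketch (hypothetical helper types, not the crate's API) of the merge-binding rule the rewritten lower_if_in_loop applies: zero reaching predecessors bind nothing, one rebinds the value directly without a PHI, and two require a PHI (or edge copies in no-PHI mode).

// Standalone sketch: decide how a variable is rebound at the merge block,
// given which branch exits actually reach it.
#[derive(Debug, PartialEq)]
enum MergeBind {
    Unbound,              // neither branch reaches merge (both hit continue/break)
    Direct(u32),          // single reaching predecessor: reuse its value, no PHI
    Phi(Vec<(u32, u32)>), // (pred_block, value) incomings for a real PHI
}

fn bind_at_merge(
    then_exit: Option<(u32, u32)>, // (pred_block, value) if the then-branch falls through
    else_exit: Option<(u32, u32)>, // same for the else-branch
) -> MergeBind {
    let incomings: Vec<(u32, u32)> = then_exit.into_iter().chain(else_exit).collect();
    match incomings.len() {
        0 => MergeBind::Unbound,
        1 => MergeBind::Direct(incomings[0].1),
        _ => MergeBind::Phi(incomings),
    }
}

fn main() {
    // then-branch falls through with value 10; else-branch hits `continue`
    assert_eq!(bind_at_merge(Some((1, 10)), None), MergeBind::Direct(10));
    // both branches fall through: a two-incoming PHI is required
    assert_eq!(
        bind_at_merge(Some((1, 10)), Some((2, 20))),
        MergeBind::Phi(vec![(1, 10), (2, 20)])
    );
}
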
|
||||
|
||||
@ -14,6 +14,7 @@ pub mod function;
|
||||
pub mod instruction;
|
||||
pub mod instruction_kinds; // small kind-specific metadata (Const/BinOp)
|
||||
pub mod instruction_introspection; // Introspection helpers for tests (instruction names)
|
||||
pub mod types; // core MIR enums (ConstValue, Ops, MirType)
|
||||
pub mod loop_api; // Minimal LoopBuilder facade (adapter-ready)
|
||||
pub mod loop_builder; // SSA loop construction with phi nodes
|
||||
pub mod optimizer;
|
||||
@ -33,9 +34,9 @@ pub use basic_block::{BasicBlock, BasicBlockId, BasicBlockIdGenerator};
|
||||
pub use builder::MirBuilder;
|
||||
pub use effect::{Effect, EffectMask};
|
||||
pub use function::{FunctionSignature, MirFunction, MirModule};
|
||||
pub use instruction::{
|
||||
BarrierOp, BinaryOp, CompareOp, ConstValue, MirInstruction, MirType, TypeOpKind, UnaryOp,
|
||||
WeakRefOp,
|
||||
pub use instruction::MirInstruction;
|
||||
pub use types::{
|
||||
BarrierOp, BinaryOp, CompareOp, ConstValue, MirType, TypeOpKind, UnaryOp, WeakRefOp,
|
||||
};
|
||||
pub use optimizer::MirOptimizer;
|
||||
pub use printer::MirPrinter;
|
||||
|
||||
@ -225,13 +225,12 @@ fn resolve_type_from_value(
|
||||
def_map: &std::collections::HashMap<ValueId, (super::basic_block::BasicBlockId, usize)>,
|
||||
id: ValueId,
|
||||
) -> Option<MirType> {
|
||||
use super::instruction::ConstValue;
|
||||
if let Some((bb, idx)) = def_map.get(&id).copied() {
|
||||
if let Some(block) = function.blocks.get(&bb) {
|
||||
if idx < block.instructions.len() {
|
||||
match &block.instructions[idx] {
|
||||
MirInstruction::Const {
|
||||
value: ConstValue::String(s),
|
||||
value: crate::mir::ConstValue::String(s),
|
||||
..
|
||||
} => {
|
||||
return Some(map_type_name(s));
|
||||
@ -244,7 +243,7 @@ fn resolve_type_from_value(
|
||||
if let Some(sblock) = function.blocks.get(&sbb) {
|
||||
if sidx < sblock.instructions.len() {
|
||||
if let MirInstruction::Const {
|
||||
value: ConstValue::String(s),
|
||||
value: crate::mir::ConstValue::String(s),
|
||||
..
|
||||
} = &sblock.instructions[sidx]
|
||||
{
|
||||
@ -313,7 +312,7 @@ fn diagnose_unlowered_type_ops(
|
||||
if let Some(b) = function.blocks.get(&bb) {
|
||||
if idx < b.instructions.len() {
|
||||
if let MirInstruction::Const {
|
||||
value: super::instruction::ConstValue::String(s),
|
||||
value: crate::mir::ConstValue::String(s),
|
||||
..
|
||||
} = &b.instructions[idx]
|
||||
{
|
||||
|
||||
@ -41,7 +41,7 @@ pub fn diagnose_unlowered_type_ops(
|
||||
if let Some(b) = function.blocks.get(&bb) {
|
||||
if idx < b.instructions.len() {
|
||||
if let MirInstruction::Const {
|
||||
value: crate::mir::instruction::ConstValue::String(s),
|
||||
value: crate::mir::ConstValue::String(s),
|
||||
..
|
||||
} = &b.instructions[idx]
|
||||
{
|
||||
|
||||
@ -239,7 +239,7 @@ pub fn normalize_legacy_instructions(
|
||||
value,
|
||||
expected_type,
|
||||
} => {
|
||||
let ty = crate::mir::instruction::MirType::Box(expected_type.clone());
|
||||
let ty = crate::mir::MirType::Box(expected_type.clone());
|
||||
*term = I::TypeOp {
|
||||
dst: *dst,
|
||||
op: TypeOpKind::Check,
|
||||
@ -370,7 +370,7 @@ pub fn normalize_ref_field_access(
|
||||
function.next_value_id += 1;
|
||||
out.push(I::Const {
|
||||
dst: new_id,
|
||||
value: crate::mir::instruction::ConstValue::String(field),
|
||||
value: crate::mir::ConstValue::String(field),
|
||||
});
|
||||
out.push(I::BoxCall {
|
||||
dst: Some(dst),
|
||||
@ -391,7 +391,7 @@ pub fn normalize_ref_field_access(
|
||||
function.next_value_id += 1;
|
||||
out.push(I::Const {
|
||||
dst: new_id,
|
||||
value: crate::mir::instruction::ConstValue::String(field),
|
||||
value: crate::mir::ConstValue::String(field),
|
||||
});
|
||||
out.push(I::Barrier {
|
||||
op: BarrierOp::Write,
|
||||
@ -423,7 +423,7 @@ pub fn normalize_ref_field_access(
|
||||
function.next_value_id += 1;
|
||||
block.instructions.push(I::Const {
|
||||
dst: new_id,
|
||||
value: crate::mir::instruction::ConstValue::String(field),
|
||||
value: crate::mir::ConstValue::String(field),
|
||||
});
|
||||
I::BoxCall {
|
||||
dst: Some(dst),
|
||||
@ -443,7 +443,7 @@ pub fn normalize_ref_field_access(
|
||||
function.next_value_id += 1;
|
||||
block.instructions.push(I::Const {
|
||||
dst: new_id,
|
||||
value: crate::mir::instruction::ConstValue::String(field),
|
||||
value: crate::mir::ConstValue::String(field),
|
||||
});
|
||||
block.instructions.push(I::Barrier {
|
||||
op: BarrierOp::Write,
|
||||
|
||||
@ -11,7 +11,7 @@ use crate::mir::{BinaryOp, CompareOp, EffectMask, MirInstruction as I, MirModule
|
||||
/// Not x => Compare(Eq, x, Const false)
|
||||
/// BitNot x => BinOp(BitXor, x, Const(-1))
|
||||
pub fn normalize_pure_core13(_opt: &mut MirOptimizer, module: &mut MirModule) -> OptimizationStats {
|
||||
use crate::mir::instruction::ConstValue;
|
||||
use crate::mir::types::ConstValue;
|
||||
let mut stats = OptimizationStats::new();
|
||||
for (_fname, function) in &mut module.functions {
|
||||
for (_bb, block) in &mut function.blocks {
|
||||
@ -43,7 +43,7 @@ pub fn normalize_pure_core13(_opt: &mut MirOptimizer, module: &mut MirModule) ->
|
||||
// prepend type name as Const String
|
||||
let ty_id = ValueId::new(function.next_value_id);
|
||||
function.next_value_id += 1;
|
||||
out.push(I::Const { dst: ty_id, value: ConstValue::String(box_type) });
|
||||
out.push(I::Const { dst: ty_id, value: crate::mir::ConstValue::String(box_type) });
|
||||
let mut call_args = Vec::with_capacity(1 + args.len());
|
||||
call_args.push(ty_id);
|
||||
call_args.append(&mut args);
|
||||
@ -61,19 +61,19 @@ pub fn normalize_pure_core13(_opt: &mut MirOptimizer, module: &mut MirModule) ->
|
||||
crate::mir::UnaryOp::Neg => {
|
||||
let zero = ValueId::new(function.next_value_id);
|
||||
function.next_value_id += 1;
|
||||
out.push(I::Const { dst: zero, value: ConstValue::Integer(0) });
|
||||
out.push(I::Const { dst: zero, value: crate::mir::ConstValue::Integer(0) });
|
||||
out.push(I::BinOp { dst, op: BinaryOp::Sub, lhs: zero, rhs: operand });
|
||||
}
|
||||
crate::mir::UnaryOp::Not => {
|
||||
let f = ValueId::new(function.next_value_id);
|
||||
function.next_value_id += 1;
|
||||
out.push(I::Const { dst: f, value: ConstValue::Bool(false) });
|
||||
out.push(I::Const { dst: f, value: crate::mir::ConstValue::Bool(false) });
|
||||
out.push(I::Compare { dst, op: CompareOp::Eq, lhs: operand, rhs: f });
|
||||
}
|
||||
crate::mir::UnaryOp::BitNot => {
|
||||
let all1 = ValueId::new(function.next_value_id);
|
||||
function.next_value_id += 1;
|
||||
out.push(I::Const { dst: all1, value: ConstValue::Integer(-1) });
|
||||
out.push(I::Const { dst: all1, value: crate::mir::ConstValue::Integer(-1) });
|
||||
out.push(I::BinOp { dst, op: BinaryOp::BitXor, lhs: operand, rhs: all1 });
|
||||
}
|
||||
}
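
A quick standalone check (plain Rust, no crate dependencies) of the algebraic identities this normalization relies on when lowering UnaryOp to the core instruction set:

// Not x    => Compare(Eq, x, false)
// BitNot x => BinOp(BitXor, x, -1)
// Neg x    => BinOp(Sub, 0, x)
fn main() {
    for b in [true, false] {
        assert_eq!(!b, b == false); // logical Not via equality with false
    }
    for x in [0i64, 1, -7, i64::MAX, i64::MIN] {
        assert_eq!(!x, x ^ -1); // bitwise NOT via XOR with all-ones
        assert_eq!(x.wrapping_neg(), 0i64.wrapping_sub(x)); // negation via 0 - x
    }
}
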
|
||||
|
||||
@ -338,10 +338,6 @@ impl MirPrinter {
|
||||
output
|
||||
}
|
||||
|
||||
fn format_dst(&self, dst: &ValueId, types: &HashMap<ValueId, MirType>) -> String {
|
||||
printer_helpers::format_dst(dst, types)
|
||||
}
|
||||
|
||||
/// Format a single instruction
|
||||
fn format_instruction(
|
||||
&self,
|
||||
|
||||
src/mir/types.rs (new file, 111 lines)
@ -0,0 +1,111 @@
/*!
 * MIR Core Types — split from instruction.rs for clarity (behavior-preserving)
 */

use std::fmt;

/// Constant values in MIR
#[derive(Debug, Clone, PartialEq)]
pub enum ConstValue {
    Integer(i64),
    Float(f64),
    Bool(bool),
    String(String),
    Null,
    Void,
}

impl fmt::Display for ConstValue {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            ConstValue::Integer(n) => write!(f, "{}", n),
            ConstValue::Float(fl) => write!(f, "{}", fl),
            ConstValue::Bool(b) => write!(f, "{}", b),
            ConstValue::String(s) => write!(f, "\"{}\"", s),
            ConstValue::Null => write!(f, "null"),
            ConstValue::Void => write!(f, "void"),
        }
    }
}

/// Binary operations
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum BinaryOp {
    // Arithmetic
    Add,
    Sub,
    Mul,
    Div,
    Mod,

    // Bitwise
    BitAnd,
    BitOr,
    BitXor,
    Shl,
    Shr,

    // Logical
    And,
    Or,
}

/// Unary operations
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum UnaryOp {
    // Arithmetic
    Neg,

    // Logical
    Not,

    // Bitwise
    BitNot,
}

/// Comparison operations
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum CompareOp {
    Eq,
    Ne,
    Lt,
    Le,
    Gt,
    Ge,
}

/// MIR type system
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum MirType {
    Integer,
    Float,
    Bool,
    String,
    Box(String), // Box type with name
    Array(Box<MirType>),
    Future(Box<MirType>), // Future containing a type
    Void,
    Unknown,
}

/// Kind of unified type operation (PoC)
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum TypeOpKind {
    Check,
    Cast,
}

/// Kind of unified weak reference operation (PoC)
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum WeakRefOp {
    New,
    Load,
}

/// Kind of unified barrier operation (PoC)
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum BarrierOp {
    Read,
    Write,
}

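One caveat worth noting for the new types.rs (standalone illustration below, using a local mini-enum rather than the crate type): deriving PartialEq on an enum that carries an f64 inherits IEEE-754 comparison, so a NaN Float constant never compares equal to itself. Worth keeping in mind if ConstValue is ever compared in dedup/CSE-style passes.

// Local mini-enum, not the crate's ConstValue; shows the derived-PartialEq behavior.
#[derive(Debug, PartialEq)]
enum MiniConst {
    Integer(i64),
    Float(f64),
}

fn main() {
    assert_eq!(MiniConst::Integer(1), MiniConst::Integer(1));
    assert_ne!(MiniConst::Float(f64::NAN), MiniConst::Float(f64::NAN));
}
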
@ -4,10 +4,9 @@
|
||||
* Implements dominance checking, SSA verification, and semantic analysis
|
||||
*/
|
||||
|
||||
use super::{BasicBlockId, MirFunction, MirModule, ValueId};
|
||||
use super::{MirFunction, MirModule};
|
||||
use crate::debug::log as dlog;
|
||||
use crate::mir::verification_types::VerificationError;
|
||||
use std::collections::HashMap;
|
||||
mod cfg;
|
||||
mod dom;
|
||||
mod awaits;
|
||||
|
||||
@ -69,7 +69,7 @@ pub fn check_weakref_and_barrier(function: &MirFunction) -> Result<(), Vec<Verif
|
||||
| MirInstruction::BarrierWrite { ptr } => {
|
||||
if let Some((_db, _di, def_inst)) = def_map.get(ptr) {
|
||||
if let MirInstruction::Const {
|
||||
value: crate::mir::instruction::ConstValue::Void,
|
||||
value: crate::mir::ConstValue::Void,
|
||||
..
|
||||
} = def_inst
|
||||
{
|
||||
|
||||
@ -6,7 +6,7 @@ use crate::tokenizer::TokenType;
|
||||
|
||||
/// Parse the leading header of a box declaration and return
|
||||
/// (name, type_params, extends, implements). Does not consume the opening '{'.
|
||||
pub fn parse_header(
|
||||
pub(crate) fn parse_header(
|
||||
p: &mut NyashParser,
|
||||
) -> Result<(String, Vec<String>, Vec<String>, Vec<String>), ParseError> {
|
||||
// Name
|
||||
|
||||
src/parser/declarations/box_def/interface.rs (new file, 107 lines)
@ -0,0 +1,107 @@
//! Interface box parser: `interface box Name { methods... }`
use crate::ast::{ASTNode, Span};
use crate::parser::{NyashParser, ParseError};
use crate::parser::common::ParserUtils;
use crate::tokenizer::TokenType;
use std::collections::HashMap;

/// Parse `interface box Name { methods... }` and return an AST BoxDeclaration.
/// Caller must be positioned at the beginning of `interface box`.
pub(crate) fn parse_interface_box(p: &mut NyashParser) -> Result<ASTNode, ParseError> {
    p.consume(TokenType::INTERFACE)?;
    p.consume(TokenType::BOX)?;

    let name = if let TokenType::IDENTIFIER(name) = &p.current_token().token_type {
        let name = name.clone();
        p.advance();
        name
    } else {
        let line = p.current_token().line;
        return Err(ParseError::UnexpectedToken {
            found: p.current_token().token_type.clone(),
            expected: "identifier".to_string(),
            line,
        });
    };

    p.consume(TokenType::LBRACE)?;
    p.skip_newlines(); // skip newlines after the opening brace

    let mut methods = HashMap::new();

    while !p.match_token(&TokenType::RBRACE) && !p.is_at_end() {
        p.skip_newlines(); // skip newlines at the top of each iteration
        if let TokenType::IDENTIFIER(method_name) = &p.current_token().token_type {
            let method_name = method_name.clone();
            p.advance();

            // Interface methods are signatures only
            if p.match_token(&TokenType::LPAREN) {
                p.advance(); // consume '('

                let mut params = Vec::new();
                while !p.match_token(&TokenType::RPAREN) && !p.is_at_end() {
                    if let TokenType::IDENTIFIER(param) = &p.current_token().token_type {
                        params.push(param.clone());
                        p.advance();
                    }

                    if p.match_token(&TokenType::COMMA) {
                        p.advance();
                    }
                }

                p.consume(TokenType::RPAREN)?;

                // Interface methods have no implementation (empty body)
                let method_decl = ASTNode::FunctionDeclaration {
                    name: method_name.clone(),
                    params,
                    body: vec![], // empty implementation
                    is_static: false, // interface methods are not static by default
                    is_override: false, // not an override by default
                    span: Span::unknown(),
                };

                methods.insert(method_name, method_decl);

                // skip newlines after the method declaration
                p.skip_newlines();
            } else {
                let line = p.current_token().line;
                return Err(ParseError::UnexpectedToken {
                    found: p.current_token().token_type.clone(),
                    expected: "(".to_string(),
                    line,
                });
            }
        } else {
            let line = p.current_token().line;
            return Err(ParseError::UnexpectedToken {
                found: p.current_token().token_type.clone(),
                expected: "method name".to_string(),
                line,
            });
        }
    }

    p.consume(TokenType::RBRACE)?;

    Ok(ASTNode::BoxDeclaration {
        name,
        fields: vec![], // interfaces have no fields
        public_fields: vec![],
        private_fields: vec![],
        methods,
        constructors: HashMap::new(), // interfaces have no constructors
        init_fields: vec![], // interfaces have no init block
        weak_fields: vec![], // interfaces have no weak fields
        is_interface: true, // interface flag
        extends: vec![], // Multi-delegation: None is represented as vec![]
        implements: vec![],
        type_parameters: Vec::new(), // generics are not supported for interfaces yet
        is_static: false, // interfaces are not static
        static_init: None, // interfaces have no static init
        span: Span::unknown(),
    })
}

@ -4,7 +4,7 @@ use crate::parser::common::ParserUtils;
|
||||
use crate::tokenizer::TokenType;
|
||||
|
||||
#[derive(Debug, Clone, PartialEq, Eq)]
|
||||
pub enum MemberKind {
|
||||
pub(crate) enum MemberKind {
|
||||
Field,
|
||||
Method,
|
||||
Constructor,
|
||||
@ -14,7 +14,7 @@ pub enum MemberKind {
|
||||
}
|
||||
|
||||
/// Decide member kind via simple lookahead (scaffold placeholder)
|
||||
pub fn classify_member(p: &mut NyashParser) -> Result<MemberKind, ParseError> {
|
||||
pub(crate) fn classify_member(p: &mut NyashParser) -> Result<MemberKind, ParseError> {
|
||||
// block-first: { body } as (once|birth_once)? name : Type
|
||||
if crate::config::env::unified_members() && p.match_any_token(&[TokenType::LBRACE]) {
|
||||
return Ok(MemberKind::PropertyComputed);
|
||||
|
||||
@ -7,7 +7,7 @@ use crate::tokenizer::TokenType;
|
||||
/// Try to parse a constructor at current position.
|
||||
/// Supported: `init(...) {}`, `pack(...) {}`, `birth(...) {}`.
|
||||
/// Returns Ok(Some((key, node))) when a constructor was parsed and consumed.
|
||||
pub fn try_parse_constructor(
|
||||
pub(crate) fn try_parse_constructor(
|
||||
p: &mut NyashParser,
|
||||
is_override: bool,
|
||||
) -> Result<Option<(String, ASTNode)>, ParseError> {
|
||||
|
||||
@ -12,7 +12,7 @@ use std::collections::HashMap;
|
||||
/// - `name: Type => expr` → computed property (getter function generated)
|
||||
/// - `name: Type { ... } [catch|cleanup]` → computed property block with optional postfix handlers
|
||||
/// Returns Ok(true) when this function consumed and handled the construct; Ok(false) if not applicable.
|
||||
pub fn try_parse_header_first_field_or_property(
|
||||
pub(crate) fn try_parse_header_first_field_or_property(
|
||||
p: &mut NyashParser,
|
||||
fname: String,
|
||||
methods: &mut HashMap<String, ASTNode>,
|
||||
@ -87,3 +87,104 @@ pub fn try_parse_header_first_field_or_property(
|
||||
fields.push(fname);
|
||||
Ok(true)
|
||||
}
|
||||
|
||||
/// Parse a visibility block or a single header-first property with visibility prefix.
|
||||
/// Handles:
|
||||
/// - `public { a, b, c }` → pushes to fields/public_fields
|
||||
/// - `private { ... }` → pushes to fields/private_fields
|
||||
/// - `public name: Type ...` (delegates to header-first field/property)
|
||||
/// Returns Ok(true) if consumed, Ok(false) if visibility keyword not matched.
|
||||
pub(crate) fn try_parse_visibility_block_or_single(
|
||||
p: &mut NyashParser,
|
||||
visibility: &str,
|
||||
methods: &mut HashMap<String, ASTNode>,
|
||||
fields: &mut Vec<String>,
|
||||
public_fields: &mut Vec<String>,
|
||||
private_fields: &mut Vec<String>,
|
||||
last_method_name: &mut Option<String>,
|
||||
) -> Result<bool, ParseError> {
|
||||
if visibility != "public" && visibility != "private" {
|
||||
return Ok(false);
|
||||
}
|
||||
if p.match_token(&TokenType::LBRACE) {
|
||||
p.advance();
|
||||
p.skip_newlines();
|
||||
while !p.match_token(&TokenType::RBRACE) && !p.is_at_end() {
|
||||
if let TokenType::IDENTIFIER(fname) = &p.current_token().token_type {
|
||||
let fname = fname.clone();
|
||||
if visibility == "public" { public_fields.push(fname.clone()); } else { private_fields.push(fname.clone()); }
|
||||
fields.push(fname);
|
||||
p.advance();
|
||||
if p.match_token(&TokenType::COMMA) { p.advance(); }
|
||||
p.skip_newlines();
|
||||
continue;
|
||||
}
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
expected: "identifier in visibility block".to_string(),
|
||||
found: p.current_token().token_type.clone(),
|
||||
line: p.current_token().line,
|
||||
});
|
||||
}
|
||||
p.consume(TokenType::RBRACE)?;
|
||||
p.skip_newlines();
|
||||
return Ok(true);
|
||||
}
|
||||
if let TokenType::IDENTIFIER(n) = &p.current_token().token_type {
|
||||
let fname = n.clone();
|
||||
p.advance();
|
||||
if try_parse_header_first_field_or_property(p, fname.clone(), methods, fields)? {
|
||||
if visibility == "public" { public_fields.push(fname.clone()); } else { private_fields.push(fname.clone()); }
|
||||
*last_method_name = None;
|
||||
p.skip_newlines();
|
||||
return Ok(true);
|
||||
} else {
|
||||
if visibility == "public" { public_fields.push(fname.clone()); } else { private_fields.push(fname.clone()); }
|
||||
fields.push(fname);
|
||||
p.skip_newlines();
|
||||
return Ok(true);
|
||||
}
|
||||
}
|
||||
Ok(false)
|
||||
}
|
||||
|
||||
/// Parse `init { ... }` non-call block to collect initializable fields and weak flags.
|
||||
/// Returns Ok(true) if consumed; Ok(false) if no `init {` at current position.
|
||||
pub(crate) fn parse_init_block_if_any(
|
||||
p: &mut NyashParser,
|
||||
init_fields: &mut Vec<String>,
|
||||
weak_fields: &mut Vec<String>,
|
||||
) -> Result<bool, ParseError> {
|
||||
if !(p.match_token(&TokenType::INIT) && p.peek_token() != &TokenType::LPAREN) {
|
||||
return Ok(false);
|
||||
}
|
||||
p.advance(); // consume 'init'
|
||||
p.consume(TokenType::LBRACE)?;
|
||||
while !p.match_token(&TokenType::RBRACE) && !p.is_at_end() {
|
||||
p.skip_newlines();
|
||||
if p.match_token(&TokenType::RBRACE) {
|
||||
break;
|
||||
}
|
||||
let is_weak = if p.match_token(&TokenType::WEAK) {
|
||||
p.advance();
|
||||
true
|
||||
} else {
|
||||
false
|
||||
};
|
||||
if let TokenType::IDENTIFIER(field_name) = &p.current_token().token_type {
|
||||
init_fields.push(field_name.clone());
|
||||
if is_weak {
|
||||
weak_fields.push(field_name.clone());
|
||||
}
|
||||
p.advance();
|
||||
if p.match_token(&TokenType::COMMA) { p.advance(); }
|
||||
} else {
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
expected: if is_weak { "field name after 'weak'" } else { "field name" }.to_string(),
|
||||
found: p.current_token().token_type.clone(),
|
||||
line: p.current_token().line,
|
||||
});
|
||||
}
|
||||
}
|
||||
p.consume(TokenType::RBRACE)?;
|
||||
Ok(true)
|
||||
}
|
||||
|
||||
@ -6,7 +6,7 @@ use crate::tokenizer::TokenType;
|
||||
|
||||
/// Try to parse a method declaration starting at `method_name` (already consumed identifier).
|
||||
/// Returns Some(method_node) when parsed; None when not applicable (i.e., next token is not '(').
|
||||
pub fn try_parse_method(
|
||||
pub(crate) fn try_parse_method(
|
||||
p: &mut NyashParser,
|
||||
method_name: String,
|
||||
is_override: bool,
|
||||
|
||||
@ -3,10 +3,11 @@ use crate::ast::{ASTNode, Span};
|
||||
use crate::parser::{NyashParser, ParseError};
|
||||
use crate::parser::common::ParserUtils;
|
||||
use crate::tokenizer::TokenType;
|
||||
use std::collections::HashMap;
|
||||
|
||||
/// If Stage-3 gate allows, parse optional catch/cleanup after a block body and wrap it.
|
||||
/// Returns a (possibly) wrapped body.
|
||||
pub fn wrap_with_optional_postfix(
|
||||
pub(crate) fn wrap_with_optional_postfix(
|
||||
p: &mut NyashParser,
|
||||
body: Vec<ASTNode>,
|
||||
) -> Result<Vec<ASTNode>, ParseError> {
|
||||
@ -52,3 +53,66 @@ pub fn wrap_with_optional_postfix(
|
||||
span: Span::unknown(),
|
||||
}])
|
||||
}
|
||||
|
||||
/// Try to parse method-level postfix catch/cleanup after the last parsed method.
|
||||
/// Attaches a TryCatch wrapper around the last method body.
|
||||
pub(crate) fn try_parse_method_postfix_after_last_method(
|
||||
p: &mut NyashParser,
|
||||
methods: &mut HashMap<String, ASTNode>,
|
||||
last_method_name: &Option<String>,
|
||||
) -> Result<bool, ParseError> {
|
||||
if !(p.match_token(&TokenType::CATCH) || p.match_token(&TokenType::CLEANUP)) || last_method_name.is_none() {
|
||||
return Ok(false);
|
||||
}
|
||||
let mname = last_method_name.clone().unwrap();
|
||||
let mut catch_clauses: Vec<crate::ast::CatchClause> = Vec::new();
|
||||
if p.match_token(&TokenType::CATCH) {
|
||||
p.advance();
|
||||
p.consume(TokenType::LPAREN)?;
|
||||
let (exc_ty, exc_var) = p.parse_catch_param()?;
|
||||
p.consume(TokenType::RPAREN)?;
|
||||
let catch_body = p.parse_block_statements()?;
|
||||
catch_clauses.push(crate::ast::CatchClause {
|
||||
exception_type: exc_ty,
|
||||
variable_name: exc_var,
|
||||
body: catch_body,
|
||||
span: Span::unknown(),
|
||||
});
|
||||
p.skip_newlines();
|
||||
if p.match_token(&TokenType::CATCH) {
|
||||
let line = p.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: p.current_token().token_type.clone(),
|
||||
expected: "single catch only after method body".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
}
|
||||
let finally_body = if p.match_token(&TokenType::CLEANUP) {
|
||||
p.advance();
|
||||
Some(p.parse_block_statements()?)
|
||||
} else {
|
||||
None
|
||||
};
|
||||
if let Some(mnode) = methods.get_mut(&mname) {
|
||||
if let crate::ast::ASTNode::FunctionDeclaration { body, .. } = mnode {
|
||||
let already = body.iter().any(|n| matches!(n, crate::ast::ASTNode::TryCatch { .. }));
|
||||
if already {
|
||||
let line = p.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: p.current_token().token_type.clone(),
|
||||
expected: "duplicate postfix catch/cleanup after method".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
let old = std::mem::take(body);
|
||||
*body = vec![crate::ast::ASTNode::TryCatch {
|
||||
try_body: old,
|
||||
catch_clauses,
|
||||
finally_body,
|
||||
span: crate::ast::Span::unknown(),
|
||||
}];
|
||||
}
|
||||
}
|
||||
Ok(true)
|
||||
}
|
||||
|
||||
@ -5,9 +5,10 @@ use crate::parser::common::ParserUtils;
|
||||
use crate::tokenizer::TokenType;
|
||||
use std::collections::HashMap;
|
||||
|
||||
|
||||
/// Try to parse a unified member property: `once name: Type ...` or `birth_once name: Type ...`
|
||||
/// Returns Ok(true) if consumed and handled; otherwise Ok(false).
|
||||
pub fn try_parse_unified_property(
|
||||
pub(crate) fn try_parse_unified_property(
|
||||
p: &mut NyashParser,
|
||||
kind_kw: &str,
|
||||
methods: &mut HashMap<String, ASTNode>,
|
||||
@ -115,3 +116,146 @@ pub fn try_parse_unified_property(
|
||||
methods.insert(getter_name, getter);
|
||||
Ok(true)
|
||||
}
|
||||
|
||||
/// Try to parse a block-first unified member: `{ body } as [once|birth_once]? name : Type [postfix]`
|
||||
/// Returns Ok(true) if a member was parsed and emitted into `methods`.
|
||||
pub(crate) fn try_parse_block_first_property(
|
||||
p: &mut NyashParser,
|
||||
methods: &mut HashMap<String, ASTNode>,
|
||||
birth_once_props: &mut Vec<String>,
|
||||
) -> Result<bool, ParseError> {
|
||||
if !(crate::config::env::unified_members() && p.match_token(&TokenType::LBRACE)) {
|
||||
return Ok(false);
|
||||
}
|
||||
// 1) Parse block body first
|
||||
let mut final_body = p.parse_block_statements()?;
|
||||
p.skip_newlines();
|
||||
|
||||
// 2) Expect 'as'
|
||||
if let TokenType::IDENTIFIER(kw) = &p.current_token().token_type {
|
||||
if kw != "as" {
|
||||
let line = p.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: p.current_token().token_type.clone(),
|
||||
expected: "'as' after block for block-first member".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
} else {
|
||||
let line = p.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: p.current_token().token_type.clone(),
|
||||
expected: "'as' after block for block-first member".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
p.advance(); // consume 'as'
|
||||
|
||||
// 3) Optional kind keyword: once | birth_once
|
||||
let mut kind = "computed".to_string();
|
||||
if let TokenType::IDENTIFIER(k) = &p.current_token().token_type {
|
||||
if k == "once" || k == "birth_once" {
|
||||
kind = k.clone();
|
||||
p.advance();
|
||||
}
|
||||
}
|
||||
|
||||
// 4) Name : Type
|
||||
let name = if let TokenType::IDENTIFIER(n) = &p.current_token().token_type {
|
||||
let s = n.clone();
|
||||
p.advance();
|
||||
s
|
||||
} else {
|
||||
let line = p.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: p.current_token().token_type.clone(),
|
||||
expected: "identifier for member name".to_string(),
|
||||
line,
|
||||
});
|
||||
};
|
||||
if p.match_token(&TokenType::COLON) {
|
||||
p.advance();
|
||||
if let TokenType::IDENTIFIER(_ty) = &p.current_token().token_type {
|
||||
p.advance();
|
||||
} else {
|
||||
let line = p.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: p.current_token().token_type.clone(),
|
||||
expected: "type name after ':'".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
} else {
|
||||
let line = p.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: p.current_token().token_type.clone(),
|
||||
expected: ": type".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
|
||||
// 5) Optional postfix handlers (Stage‑3) directly after block (shared helper)
|
||||
final_body = crate::parser::declarations::box_def::members::postfix::wrap_with_optional_postfix(p, final_body)?;
|
||||
|
||||
// 6) Generate methods per kind (fully equivalent to legacy inline branch)
|
||||
if kind == "once" {
|
||||
// __compute_once_<name>
|
||||
let compute_name = format!("__compute_once_{}", name);
|
||||
let compute = ASTNode::FunctionDeclaration {
|
||||
name: compute_name.clone(),
|
||||
params: vec![],
|
||||
body: final_body,
|
||||
is_static: false,
|
||||
is_override: false,
|
||||
span: Span::unknown(),
|
||||
};
|
||||
methods.insert(compute_name.clone(), compute);
|
||||
|
||||
// Getter with cache + poison handling
|
||||
let key = format!("__once_{}", name);
|
||||
let poison_key = format!("__once_poison_{}", name);
|
||||
let cached_local = format!("__ny_cached_{}", name);
|
||||
let poison_local = format!("__ny_poison_{}", name);
|
||||
let val_local = format!("__ny_val_{}", name);
|
||||
let me_node = ASTNode::Me { span: Span::unknown() };
|
||||
let get_cached = ASTNode::MethodCall {
|
||||
object: Box::new(me_node.clone()),
|
||||
method: "getField".to_string(),
|
||||
arguments: vec![ASTNode::Literal { value: crate::ast::LiteralValue::String(key.clone()), span: Span::unknown() }],
|
||||
span: Span::unknown(),
|
||||
};
|
||||
let local_cached = ASTNode::Local { variables: vec![cached_local.clone()], initial_values: vec![Some(Box::new(get_cached))], span: Span::unknown() };
|
||||
let cond_cached = ASTNode::BinaryOp { operator: crate::ast::BinaryOperator::NotEqual, left: Box::new(ASTNode::Variable { name: cached_local.clone(), span: Span::unknown() }), right: Box::new(ASTNode::Literal { value: crate::ast::LiteralValue::Null, span: Span::unknown() }), span: Span::unknown() };
|
||||
let then_ret_cached = vec![ASTNode::Return { value: Some(Box::new(ASTNode::Variable { name: cached_local.clone(), span: Span::unknown() })), span: Span::unknown() }];
|
||||
let if_cached = ASTNode::If { condition: Box::new(cond_cached), then_body: then_ret_cached, else_body: None, span: Span::unknown() };
|
||||
|
||||
let get_poison = ASTNode::MethodCall { object: Box::new(me_node.clone()), method: "getField".to_string(), arguments: vec![ASTNode::Literal { value: crate::ast::LiteralValue::String(poison_key.clone()), span: Span::unknown() }], span: Span::unknown() };
|
||||
let local_poison = ASTNode::Local { variables: vec![poison_local.clone()], initial_values: vec![Some(Box::new(get_poison))], span: Span::unknown() };
|
||||
let cond_poison = ASTNode::BinaryOp { operator: crate::ast::BinaryOperator::NotEqual, left: Box::new(ASTNode::Variable { name: poison_local.clone(), span: Span::unknown() }), right: Box::new(ASTNode::Literal { value: crate::ast::LiteralValue::Null, span: Span::unknown() }), span: Span::unknown() };
|
||||
let then_throw = vec![ASTNode::Throw { expression: Box::new(ASTNode::Literal { value: crate::ast::LiteralValue::String(format!("once '{}' previously failed", name)), span: Span::unknown() }), span: Span::unknown() }];
|
||||
let if_poison = ASTNode::If { condition: Box::new(cond_poison), then_body: then_throw, else_body: None, span: Span::unknown() };
|
||||
|
||||
let call_compute = ASTNode::MethodCall { object: Box::new(me_node.clone()), method: compute_name.clone(), arguments: vec![], span: Span::unknown() };
|
||||
let local_val = ASTNode::Local { variables: vec![val_local.clone()], initial_values: vec![Some(Box::new(call_compute))], span: Span::unknown() };
|
||||
let set_call = ASTNode::MethodCall { object: Box::new(me_node.clone()), method: "setField".to_string(), arguments: vec![ASTNode::Literal { value: crate::ast::LiteralValue::String(key.clone()), span: Span::unknown() }, ASTNode::Variable { name: val_local.clone(), span: Span::unknown() }], span: Span::unknown() };
|
||||
let ret_stmt = ASTNode::Return { value: Some(Box::new(ASTNode::Variable { name: val_local.clone(), span: Span::unknown() })), span: Span::unknown() };
|
||||
let getter_body = vec![local_cached, if_cached, local_poison, if_poison, local_val, set_call, ret_stmt];
|
||||
let getter_name = format!("__get_once_{}", name);
|
||||
let getter = ASTNode::FunctionDeclaration { name: getter_name.clone(), params: vec![], body: getter_body, is_static: false, is_override: false, span: Span::unknown() };
|
||||
methods.insert(getter_name, getter);
|
||||
return Ok(true);
|
||||
}
|
||||
|
||||
// birth_once
|
||||
birth_once_props.push(name.clone());
|
||||
let compute_name = format!("__compute_birth_{}", name);
|
||||
let compute = ASTNode::FunctionDeclaration { name: compute_name.clone(), params: vec![], body: final_body, is_static: false, is_override: false, span: Span::unknown() };
|
||||
methods.insert(compute_name.clone(), compute);
|
||||
let me_node = ASTNode::Me { span: Span::unknown() };
|
||||
let get_call = ASTNode::MethodCall { object: Box::new(me_node.clone()), method: "getField".to_string(), arguments: vec![ASTNode::Literal { value: crate::ast::LiteralValue::String(format!("__birth_{}", name)), span: Span::unknown() }], span: Span::unknown() };
|
||||
let getter_body = vec![ASTNode::Return { value: Some(Box::new(get_call)), span: Span::unknown() }];
|
||||
let getter_name = format!("__get_birth_{}", name);
|
||||
let getter = ASTNode::FunctionDeclaration { name: getter_name.clone(), params: vec![], body: getter_body, is_static: false, is_override: false, span: Span::unknown() };
|
||||
methods.insert(getter_name, getter);
|
||||
Ok(true)
|
||||
}
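
For orientation, a standalone Rust sketch (hypothetical OnceSlot type, not part of this commit) of the cache-plus-poison protocol the generated __get_once_<name> getters follow: serve the cached value if present, refuse permanently once poisoned, otherwise compute once, cache the result, and poison on failure.

struct OnceSlot {
    cached: Option<i64>,
    poisoned: bool,
}

impl OnceSlot {
    fn get(&mut self, compute: impl FnOnce() -> Result<i64, String>) -> Result<i64, String> {
        if let Some(v) = self.cached {
            return Ok(v); // fast path: already computed
        }
        if self.poisoned {
            return Err("once value previously failed".to_string());
        }
        match compute() {
            Ok(v) => {
                self.cached = Some(v); // cache for later reads
                Ok(v)
            }
            Err(e) => {
                self.poisoned = true; // remember the failure
                Err(e)
            }
        }
    }
}

fn main() {
    let mut slot = OnceSlot { cached: None, poisoned: false };
    assert_eq!(slot.get(|| Ok(42)), Ok(42));
    // second read hits the cache; the compute closure is not invoked again
    assert_eq!(slot.get(|| Err("should not be called again".into())), Ok(42));
}
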
|
||||
|
||||
@ -10,14 +10,16 @@ use crate::parser::{NyashParser, ParseError};
|
||||
|
||||
pub mod header;
|
||||
pub mod members;
|
||||
pub mod validators;
|
||||
pub mod interface;
|
||||
|
||||
/// Facade to host the staged migration.
|
||||
pub struct BoxDefParserFacade;
|
||||
pub(crate) struct BoxDefParserFacade;
|
||||
|
||||
impl BoxDefParserFacade {
|
||||
/// Entry planned: parse full box declaration (header + members).
|
||||
/// Not wired yet; use NyashParser::parse_box_declaration for now.
|
||||
pub fn parse_box(_p: &mut NyashParser) -> Result<ASTNode, ParseError> {
|
||||
pub(crate) fn parse_box(_p: &mut NyashParser) -> Result<ASTNode, ParseError> {
|
||||
Err(ParseError::UnexpectedToken {
|
||||
found: crate::tokenizer::TokenType::EOF,
|
||||
expected: "box declaration (facade not wired)".to_string(),
|
||||
|
||||
src/parser/declarations/box_def/validators.rs (new file, 142 lines)
@ -0,0 +1,142 @@
|
||||
//! Validators and light analysis for box members
|
||||
use crate::ast::ASTNode;
|
||||
use crate::parser::{NyashParser, ParseError};
|
||||
use crate::parser::common::ParserUtils;
|
||||
use crate::tokenizer::TokenType;
|
||||
use std::collections::{HashMap, HashSet};
|
||||
|
||||
/// Forbid user-defined methods named exactly as the box (constructor-like names).
|
||||
/// Nyash constructors are explicit: init/pack/birth. A method with the same name
|
||||
/// as the box is likely a mistaken constructor attempt; reject for clarity.
|
||||
pub(crate) fn validate_no_ctor_like_name(
|
||||
p: &mut NyashParser,
|
||||
box_name: &str,
|
||||
methods: &HashMap<String, ASTNode>,
|
||||
) -> Result<(), ParseError> {
|
||||
if methods.contains_key(box_name) {
|
||||
let line = p.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: p.current_token().token_type.clone(),
|
||||
expected: format!(
|
||||
"method name must not match box name '{}'; use init/pack/birth for constructors",
|
||||
box_name
|
||||
),
|
||||
line,
|
||||
});
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
/// Validate that birth_once properties do not have cyclic dependencies via me.<prop> references
|
||||
pub(crate) fn validate_birth_once_cycles(
|
||||
p: &mut NyashParser,
|
||||
methods: &HashMap<String, ASTNode>,
|
||||
) -> Result<(), ParseError> {
|
||||
if !crate::config::env::unified_members() {
|
||||
return Ok(());
|
||||
}
|
||||
// Collect birth_once compute bodies
|
||||
let mut birth_bodies: HashMap<String, Vec<ASTNode>> = HashMap::new();
|
||||
for (mname, mast) in methods {
|
||||
if let Some(prop) = mname.strip_prefix("__compute_birth_") {
|
||||
if let ASTNode::FunctionDeclaration { body, .. } = mast {
|
||||
birth_bodies.insert(prop.to_string(), body.clone());
|
||||
}
|
||||
}
|
||||
}
|
||||
if birth_bodies.is_empty() {
|
||||
return Ok(());
|
||||
}
|
||||
// Build dependency graph: A -> {B | me.B used inside A}
|
||||
let mut deps: HashMap<String, HashSet<String>> = HashMap::new();
|
||||
let props: HashSet<String> = birth_bodies.keys().cloned().collect();
|
||||
for (pname, body) in &birth_bodies {
|
||||
let used = ast_collect_me_fields(body);
|
||||
let mut set = HashSet::new();
|
||||
for u in used {
|
||||
if props.contains(&u) && u != *pname {
|
||||
set.insert(u);
|
||||
}
|
||||
}
|
||||
deps.insert(pname.clone(), set);
|
||||
}
|
||||
// Detect cycle via DFS
|
||||
fn has_cycle(
|
||||
node: &str,
|
||||
deps: &HashMap<String, HashSet<String>>,
|
||||
temp: &mut HashSet<String>,
|
||||
perm: &mut HashSet<String>,
|
||||
) -> bool {
|
||||
if perm.contains(node) { return false; }
|
||||
if !temp.insert(node.to_string()) { return true; } // back-edge
|
||||
if let Some(ns) = deps.get(node) {
|
||||
for n in ns {
|
||||
if has_cycle(n, deps, temp, perm) { return true; }
|
||||
}
|
||||
}
|
||||
temp.remove(node);
|
||||
perm.insert(node.to_string());
|
||||
false
|
||||
}
|
||||
let mut perm = HashSet::new();
|
||||
let mut temp = HashSet::new();
|
||||
for pname in deps.keys() {
|
||||
if has_cycle(pname, &deps, &mut temp, &mut perm) {
|
||||
let line = p.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: p.current_token().token_type.clone(),
|
||||
expected: "birth_once declarations must not have cyclic dependencies".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
/// Forbid constructor call with the same name as the box; enforce `birth()` usage.
|
||||
pub(crate) fn forbid_box_named_constructor(p: &mut NyashParser, box_name: &str) -> Result<(), ParseError> {
|
||||
if let TokenType::IDENTIFIER(id) = &p.current_token().token_type {
|
||||
if id == box_name && p.peek_token() == &TokenType::LPAREN {
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
expected: format!(
|
||||
"birth() constructor instead of {}(). Nyash uses birth() for unified constructor syntax.",
|
||||
box_name
|
||||
),
|
||||
found: TokenType::IDENTIFIER(box_name.to_string()),
|
||||
line: p.current_token().line,
|
||||
});
|
||||
}
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
|
||||
/// Collect all `me.<field>` accessed in nodes (flat set)
|
||||
fn ast_collect_me_fields(nodes: &Vec<ASTNode>) -> std::collections::HashSet<String> {
|
||||
use std::collections::HashSet;
|
||||
fn scan(nodes: &Vec<ASTNode>, out: &mut HashSet<String>) {
|
||||
for n in nodes {
|
||||
match n {
|
||||
ASTNode::FieldAccess { object, field, .. } => {
|
||||
if let ASTNode::Me { .. } = object.as_ref() {
|
||||
out.insert(field.clone());
|
||||
}
|
||||
}
|
||||
ASTNode::Program { statements, .. } => scan(statements, out),
|
||||
ASTNode::If { then_body, else_body, .. } => {
|
||||
scan(then_body, out);
|
||||
if let Some(eb) = else_body { scan(eb, out); }
|
||||
}
|
||||
ASTNode::TryCatch { try_body, catch_clauses, finally_body, .. } => {
|
||||
scan(try_body, out);
|
||||
for c in catch_clauses { scan(&c.body, out); }
|
||||
if let Some(fb) = finally_body { scan(fb, out); }
|
||||
}
|
||||
ASTNode::FunctionDeclaration { body, .. } => scan(body, out),
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
}
|
||||
let mut hs = HashSet::new();
|
||||
scan(nodes, &mut hs);
|
||||
hs
|
||||
}
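
A standalone check (hypothetical property names) that the three-color DFS used above flags a me.<prop> dependency cycle and accepts a DAG; has_cycle is reproduced verbatim so the example stays self-contained.

use std::collections::{HashMap, HashSet};

fn has_cycle(
    node: &str,
    deps: &HashMap<String, HashSet<String>>,
    temp: &mut HashSet<String>,
    perm: &mut HashSet<String>,
) -> bool {
    if perm.contains(node) {
        return false;
    }
    if !temp.insert(node.to_string()) {
        return true; // back-edge: node is already on the current DFS path
    }
    if let Some(ns) = deps.get(node) {
        for n in ns {
            if has_cycle(n, deps, temp, perm) {
                return true;
            }
        }
    }
    temp.remove(node);
    perm.insert(node.to_string());
    false
}

fn main() {
    // a -> b -> c -> a: a cycle that must be rejected
    let mut deps: HashMap<String, HashSet<String>> = HashMap::new();
    deps.insert("a".into(), HashSet::from(["b".to_string()]));
    deps.insert("b".into(), HashSet::from(["c".to_string()]));
    deps.insert("c".into(), HashSet::from(["a".to_string()]));
    let (mut temp, mut perm) = (HashSet::new(), HashSet::new());
    assert!(deps.keys().any(|k| has_cycle(k, &deps, &mut temp, &mut perm)));

    // a -> b, a -> c: a DAG that must pass
    let mut dag: HashMap<String, HashSet<String>> = HashMap::new();
    dag.insert("a".into(), HashSet::from(["b".to_string(), "c".to_string()]));
    let (mut temp, mut perm) = (HashSet::new(), HashSet::new());
    assert!(!dag.keys().any(|k| has_cycle(k, &dag, &mut temp, &mut perm)));
}
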
|
||||
@ -13,407 +13,50 @@ use crate::tokenizer::TokenType;
|
||||
use std::collections::HashMap;
|
||||
|
||||
impl NyashParser {
|
||||
/// Forbid user-defined methods named exactly as the box (constructor-like names).
|
||||
/// Nyash constructors are explicit: init/pack/birth. A method with the same name
|
||||
/// as the box is likely a mistaken constructor attempt; reject for clarity.
|
||||
fn validate_no_ctor_like_name(
|
||||
&mut self,
|
||||
box_name: &str,
|
||||
methods: &HashMap<String, ASTNode>,
|
||||
) -> Result<(), ParseError> {
|
||||
if methods.contains_key(box_name) {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: format!(
|
||||
"method name must not match box name '{}'; use init/pack/birth for constructors",
|
||||
box_name
|
||||
),
|
||||
line,
|
||||
});
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
/// Validate that birth_once properties do not have cyclic dependencies via me.<prop> references
|
||||
fn validate_birth_once_cycles(
|
||||
&mut self,
|
||||
methods: &HashMap<String, ASTNode>,
|
||||
) -> Result<(), ParseError> {
|
||||
if !crate::config::env::unified_members() {
|
||||
return Ok(());
|
||||
}
|
||||
use std::collections::{HashMap, HashSet};
|
||||
// Collect birth_once compute bodies
|
||||
let mut birth_bodies: HashMap<String, Vec<ASTNode>> = HashMap::new();
|
||||
for (mname, mast) in methods {
|
||||
if let Some(prop) = mname.strip_prefix("__compute_birth_") {
|
||||
if let ASTNode::FunctionDeclaration { body, .. } = mast {
|
||||
birth_bodies.insert(prop.to_string(), body.clone());
|
||||
}
|
||||
}
|
||||
}
|
||||
if birth_bodies.is_empty() {
|
||||
return Ok(());
|
||||
}
|
||||
// Build dependency graph: A -> {B | me.B used inside A}
|
||||
let mut deps: HashMap<String, HashSet<String>> = HashMap::new();
|
||||
let props: HashSet<String> = birth_bodies.keys().cloned().collect();
|
||||
for (p, body) in &birth_bodies {
|
||||
let used = self.ast_collect_me_fields(body);
|
||||
let mut set = HashSet::new();
|
||||
for u in used {
|
||||
if props.contains(&u) && u != *p {
|
||||
set.insert(u);
|
||||
}
|
||||
}
|
||||
deps.insert(p.clone(), set);
|
||||
}
|
||||
// Detect cycle via DFS
|
||||
fn has_cycle(
|
||||
node: &str,
|
||||
deps: &HashMap<String, HashSet<String>>,
|
||||
temp: &mut HashSet<String>,
|
||||
perm: &mut HashSet<String>,
|
||||
) -> bool {
|
||||
if perm.contains(node) {
|
||||
return false;
|
||||
}
|
||||
if !temp.insert(node.to_string()) {
|
||||
return true; // back-edge
|
||||
}
|
||||
if let Some(ns) = deps.get(node) {
|
||||
for n in ns {
|
||||
if has_cycle(n, deps, temp, perm) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
}
|
||||
temp.remove(node);
|
||||
perm.insert(node.to_string());
|
||||
false
|
||||
}
|
||||
let mut perm = HashSet::new();
|
||||
let mut temp = HashSet::new();
|
||||
for p in deps.keys() {
|
||||
if has_cycle(p, &deps, &mut temp, &mut perm) {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "birth_once declarations must not have cyclic dependencies".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
/// Extracted: parse block-first unified member: `{ body } as [once|birth_once]? name : Type [postfix]`
|
||||
/// Returns true if a member was parsed and emitted into `methods`.
|
||||
fn parse_unified_member_block_first(
|
||||
/// Thin wrappers to keep the main loop tidy (behavior-preserving)
|
||||
fn box_try_block_first_property(
|
||||
&mut self,
|
||||
methods: &mut HashMap<String, ASTNode>,
|
||||
birth_once_props: &mut Vec<String>,
|
||||
) -> Result<bool, ParseError> {
|
||||
if !(crate::config::env::unified_members() && self.match_token(&TokenType::LBRACE)) {
|
||||
return Ok(false);
|
||||
}
|
||||
// 1) Parse block body first
|
||||
let mut final_body = self.parse_block_statements()?;
|
||||
self.skip_newlines();
|
||||
|
||||
// 2) Expect 'as'
|
||||
if let TokenType::IDENTIFIER(kw) = &self.current_token().token_type {
|
||||
if kw != "as" {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "'as' after block for block-first member".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "'as' after block for block-first member".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
self.advance(); // consume 'as'
|
||||
|
||||
// 3) Optional kind keyword: once | birth_once
|
||||
let mut kind = "computed".to_string();
|
||||
if let TokenType::IDENTIFIER(k) = &self.current_token().token_type {
|
||||
if k == "once" || k == "birth_once" {
|
||||
kind = k.clone();
|
||||
self.advance();
|
||||
}
|
||||
}
|
||||
|
||||
// 4) Name : Type
|
||||
let name = if let TokenType::IDENTIFIER(n) = &self.current_token().token_type {
|
||||
let s = n.clone();
|
||||
self.advance();
|
||||
s
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "identifier for member name".to_string(),
|
||||
line,
|
||||
});
|
||||
};
|
||||
if self.match_token(&TokenType::COLON) {
|
||||
self.advance();
|
||||
if let TokenType::IDENTIFIER(_ty) = &self.current_token().token_type {
|
||||
self.advance();
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "type name after ':'".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: ": type".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
|
||||
// 5) Optional postfix handlers (Stage‑3) directly after block (shared helper)
|
||||
final_body = crate::parser::declarations::box_def::members::postfix::wrap_with_optional_postfix(self, final_body)?;
|
||||
|
||||
// 6) Generate methods per kind (fully equivalent to former inline branch)
|
||||
if kind == "once" {
|
||||
// __compute_once_<name>
|
||||
let compute_name = format!("__compute_once_{}", name);
|
||||
let compute = ASTNode::FunctionDeclaration {
|
||||
name: compute_name.clone(),
|
||||
params: vec![],
|
||||
body: final_body,
|
||||
is_static: false,
|
||||
is_override: false,
|
||||
span: Span::unknown(),
|
||||
};
|
||||
methods.insert(compute_name.clone(), compute);
|
||||
|
||||
// Getter with cache + poison handling
|
||||
let key = format!("__once_{}", name);
|
||||
let poison_key = format!("__once_poison_{}", name);
|
||||
let cached_local = format!("__ny_cached_{}", name);
|
||||
let poison_local = format!("__ny_poison_{}", name);
|
||||
let val_local = format!("__ny_val_{}", name);
|
||||
let me_node = ASTNode::Me { span: Span::unknown() };
|
||||
let get_cached = ASTNode::MethodCall {
|
||||
object: Box::new(me_node.clone()),
|
||||
method: "getField".to_string(),
|
||||
arguments: vec![ASTNode::Literal {
|
||||
value: crate::ast::LiteralValue::String(key.clone()),
|
||||
span: Span::unknown(),
|
||||
}],
|
||||
span: Span::unknown(),
|
||||
};
|
||||
let local_cached = ASTNode::Local {
|
||||
variables: vec![cached_local.clone()],
|
||||
initial_values: vec![Some(Box::new(get_cached))],
|
||||
span: Span::unknown(),
|
||||
};
|
||||
let cond_cached = ASTNode::BinaryOp {
|
||||
operator: crate::ast::BinaryOperator::NotEqual,
|
||||
left: Box::new(ASTNode::Variable { name: cached_local.clone(), span: Span::unknown() }),
|
||||
right: Box::new(ASTNode::Literal { value: crate::ast::LiteralValue::Null, span: Span::unknown() }),
|
||||
span: Span::unknown(),
|
||||
};
|
||||
let then_ret_cached = vec![ASTNode::Return {
|
||||
value: Some(Box::new(ASTNode::Variable { name: cached_local.clone(), span: Span::unknown() })),
|
||||
span: Span::unknown(),
|
||||
}];
|
||||
let if_cached = ASTNode::If { condition: Box::new(cond_cached), then_body: then_ret_cached, else_body: None, span: Span::unknown() };
|
||||
|
||||
let get_poison = ASTNode::MethodCall {
|
||||
object: Box::new(me_node.clone()),
|
||||
method: "getField".to_string(),
|
||||
arguments: vec![ASTNode::Literal { value: crate::ast::LiteralValue::String(poison_key.clone()), span: Span::unknown() }],
|
||||
span: Span::unknown(),
|
||||
};
|
||||
let local_poison = ASTNode::Local {
|
||||
variables: vec![poison_local.clone()],
|
||||
initial_values: vec![Some(Box::new(get_poison))],
|
||||
span: Span::unknown(),
|
||||
};
|
||||
let cond_poison = ASTNode::BinaryOp {
|
||||
operator: crate::ast::BinaryOperator::NotEqual,
|
||||
left: Box::new(ASTNode::Variable { name: poison_local.clone(), span: Span::unknown() }),
|
||||
right: Box::new(ASTNode::Literal { value: crate::ast::LiteralValue::Null, span: Span::unknown() }),
|
||||
span: Span::unknown(),
|
||||
};
|
||||
let then_throw = vec![ASTNode::Throw {
|
||||
expression: Box::new(ASTNode::Literal { value: crate::ast::LiteralValue::String(format!("once '{}' previously failed", name)), span: Span::unknown() }),
|
||||
span: Span::unknown(),
|
||||
}];
|
||||
let if_poison = ASTNode::If { condition: Box::new(cond_poison), then_body: then_throw, else_body: None, span: Span::unknown() };
|
||||
|
||||
let call_compute = ASTNode::MethodCall { object: Box::new(me_node.clone()), method: compute_name.clone(), arguments: vec![], span: Span::unknown() };
|
||||
let local_val = ASTNode::Local { variables: vec![val_local.clone()], initial_values: vec![Some(Box::new(call_compute))], span: Span::unknown() };
|
||||
let set_call = ASTNode::MethodCall { object: Box::new(me_node.clone()), method: "setField".to_string(), arguments: vec![ASTNode::Literal { value: crate::ast::LiteralValue::String(key.clone()), span: Span::unknown() }, ASTNode::Variable { name: val_local.clone(), span: Span::unknown() }], span: Span::unknown() };
|
||||
let ret_stmt = ASTNode::Return { value: Some(Box::new(ASTNode::Variable { name: val_local.clone(), span: Span::unknown() })), span: Span::unknown() };
|
||||
let try_body = vec![local_val, set_call, ret_stmt];
|
||||
let catch_body = vec![
|
||||
ASTNode::MethodCall { object: Box::new(me_node.clone()), method: "setField".to_string(), arguments: vec![ASTNode::Literal { value: crate::ast::LiteralValue::String(poison_key.clone()), span: Span::unknown() }, ASTNode::Literal { value: crate::ast::LiteralValue::Bool(true), span: Span::unknown() }], span: Span::unknown() },
|
||||
ASTNode::Throw { expression: Box::new(ASTNode::Literal { value: crate::ast::LiteralValue::String(format!("once '{}' init failed", name)), span: Span::unknown() }), span: Span::unknown() }
|
||||
];
|
||||
let trycatch = ASTNode::TryCatch { try_body, catch_clauses: vec![crate::ast::CatchClause { exception_type: None, variable_name: None, body: catch_body, span: Span::unknown() }], finally_body: None, span: Span::unknown() };
|
||||
let getter_body = vec![local_cached, if_cached, local_poison, if_poison, trycatch];
|
||||
let getter_name = format!("__get_once_{}", name);
|
||||
let getter = ASTNode::FunctionDeclaration { name: getter_name.clone(), params: vec![], body: getter_body, is_static: false, is_override: false, span: Span::unknown() };
|
||||
methods.insert(getter_name, getter);
|
||||
} else if kind == "birth_once" {
|
||||
// Self-cycle guard: birth_once cannot reference itself via me.<name>
|
||||
if self.ast_contains_me_field(&final_body, &name) {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken { found: self.current_token().token_type.clone(), expected: format!("birth_once '{}' must not reference itself", name), line });
|
||||
}
|
||||
birth_once_props.push(name.clone());
|
||||
let compute_name = format!("__compute_birth_{}", name);
|
||||
let compute = ASTNode::FunctionDeclaration { name: compute_name.clone(), params: vec![], body: final_body, is_static: false, is_override: false, span: Span::unknown() };
|
||||
methods.insert(compute_name.clone(), compute);
|
||||
let key = format!("__birth_{}", name);
|
||||
let me_node = ASTNode::Me { span: Span::unknown() };
|
||||
let get_call = ASTNode::MethodCall { object: Box::new(me_node.clone()), method: "getField".to_string(), arguments: vec![ASTNode::Literal { value: crate::ast::LiteralValue::String(key.clone()), span: Span::unknown() }], span: Span::unknown() };
|
||||
let getter_body = vec![ASTNode::Return { value: Some(Box::new(get_call)), span: Span::unknown() }];
|
||||
let getter_name = format!("__get_birth_{}", name);
|
||||
let getter = ASTNode::FunctionDeclaration { name: getter_name.clone(), params: vec![], body: getter_body, is_static: false, is_override: false, span: Span::unknown() };
|
||||
methods.insert(getter_name, getter);
|
||||
} else {
|
||||
// computed
|
||||
let getter_name = format!("__get_{}", name);
|
||||
let getter = ASTNode::FunctionDeclaration { name: getter_name.clone(), params: vec![], body: final_body, is_static: false, is_override: false, span: Span::unknown() };
|
||||
methods.insert(getter_name, getter);
|
||||
}
|
||||
|
||||
self.skip_newlines();
|
||||
Ok(true)
|
||||
crate::parser::declarations::box_def::members::properties::try_parse_block_first_property(
|
||||
self, methods, birth_once_props,
|
||||
)
|
||||
}
|
||||
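For orientation, the block-first member syntax handled above reads roughly as the following Nyash sketch, inferred from the tokens consumed (block, `as`, optional `once`/`birth_once`, name, `:`, type). Box, member, and type names here are placeholders, not from this commit:

    box Config {
        // computed: re-evaluated on every access
        { return 1 + 2 } as total : Integer

        // once: cached after the first successful run; a later failure is remembered as poison
        { return 40 + 2 } as once answer : Integer

        // birth_once: evaluated around construction; it must not read me.seed itself
        { return 7 } as birth_once seed : Integer
    }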
|
||||
/// Extracted: method-level postfix catch/cleanup after a method
|
||||
fn parse_method_postfix_after_last_method(
|
||||
fn box_try_method_postfix_after_last(
|
||||
&mut self,
|
||||
methods: &mut HashMap<String, ASTNode>,
|
||||
last_method_name: &Option<String>,
|
||||
) -> Result<bool, ParseError> {
|
||||
if !(self.match_token(&TokenType::CATCH) || self.match_token(&TokenType::CLEANUP))
|
||||
|| last_method_name.is_none()
|
||||
{
|
||||
return Ok(false);
|
||||
}
|
||||
let mname = last_method_name.clone().unwrap();
|
||||
let mut catch_clauses: Vec<crate::ast::CatchClause> = Vec::new();
|
||||
if self.match_token(&TokenType::CATCH) {
|
||||
self.advance();
|
||||
self.consume(TokenType::LPAREN)?;
|
||||
let (exc_ty, exc_var) = self.parse_catch_param()?;
|
||||
self.consume(TokenType::RPAREN)?;
|
||||
let catch_body = self.parse_block_statements()?;
|
||||
catch_clauses.push(crate::ast::CatchClause {
|
||||
exception_type: exc_ty,
|
||||
variable_name: exc_var,
|
||||
body: catch_body,
|
||||
span: crate::ast::Span::unknown(),
|
||||
});
|
||||
self.skip_newlines();
|
||||
if self.match_token(&TokenType::CATCH) {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "single catch only after method body".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
}
|
||||
let finally_body = if self.match_token(&TokenType::CLEANUP) {
|
||||
self.advance();
|
||||
Some(self.parse_block_statements()?)
|
||||
} else {
|
||||
None
|
||||
};
|
||||
if let Some(mnode) = methods.get_mut(&mname) {
|
||||
if let crate::ast::ASTNode::FunctionDeclaration { body, .. } = mnode {
|
||||
let already = body
|
||||
.iter()
|
||||
.any(|n| matches!(n, crate::ast::ASTNode::TryCatch { .. }));
|
||||
if already {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "duplicate postfix catch/cleanup after method".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
let old = std::mem::take(body);
|
||||
*body = vec![crate::ast::ASTNode::TryCatch {
|
||||
try_body: old,
|
||||
catch_clauses,
|
||||
finally_body,
|
||||
span: crate::ast::Span::unknown(),
|
||||
}];
|
||||
}
|
||||
}
|
||||
Ok(true)
|
||||
crate::parser::declarations::box_def::members::postfix::try_parse_method_postfix_after_last_method(
|
||||
self, methods, last_method_name,
|
||||
)
|
||||
}
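In surface syntax, the fallback handled here is a `catch`/`cleanup` written directly after a method body; at most one `catch` is accepted, and a second postfix on the same method is rejected. A minimal Nyash sketch with placeholder names:

    box Worker {
        run() {
            me.step()
        } catch (e) {
            me.handle(e)
        } cleanup {
            me.release()
        }
    }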
|
||||
|
||||
/// Extracted: init { ... } block (non-call form)
|
||||
fn parse_init_block_if_any(
|
||||
fn box_try_init_block(
|
||||
&mut self,
|
||||
init_fields: &mut Vec<String>,
|
||||
weak_fields: &mut Vec<String>,
|
||||
) -> Result<bool, ParseError> {
|
||||
if !(self.match_token(&TokenType::INIT) && self.peek_token() != &TokenType::LPAREN) {
|
||||
return Ok(false);
|
||||
}
|
||||
self.advance(); // consume 'init'
|
||||
self.consume(TokenType::LBRACE)?;
|
||||
while !self.match_token(&TokenType::RBRACE) && !self.is_at_end() {
|
||||
self.skip_newlines();
|
||||
if self.match_token(&TokenType::RBRACE) {
|
||||
break;
|
||||
}
|
||||
let is_weak = if self.match_token(&TokenType::WEAK) {
|
||||
self.advance();
|
||||
true
|
||||
} else {
|
||||
false
|
||||
};
|
||||
if let TokenType::IDENTIFIER(field_name) = &self.current_token().token_type {
|
||||
init_fields.push(field_name.clone());
|
||||
if is_weak {
|
||||
weak_fields.push(field_name.clone());
|
||||
}
|
||||
self.advance();
|
||||
if self.match_token(&TokenType::COMMA) {
|
||||
self.advance();
|
||||
}
|
||||
} else {
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
expected: if is_weak {
|
||||
"field name after 'weak'"
|
||||
} else {
|
||||
"field name"
|
||||
}
|
||||
.to_string(),
|
||||
found: self.current_token().token_type.clone(),
|
||||
line: self.current_token().line,
|
||||
});
|
||||
}
|
||||
}
|
||||
self.consume(TokenType::RBRACE)?;
|
||||
Ok(true)
|
||||
crate::parser::declarations::box_def::members::fields::parse_init_block_if_any(
|
||||
self, init_fields, weak_fields,
|
||||
)
|
||||
}
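The non-call `init { ... }` form parsed here lists field names, each optionally prefixed with `weak`; a minimal sketch (field names are placeholders):

    box Node {
        init { value, next, weak parent }
    }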
|
||||
|
||||
/// Extracted: visibility block or single property header-first
|
||||
fn parse_visibility_block_or_single(
|
||||
fn box_try_constructor(
|
||||
&mut self,
|
||||
is_override: bool,
|
||||
constructors: &mut HashMap<String, ASTNode>,
|
||||
) -> Result<bool, ParseError> {
|
||||
if let Some((key, node)) = crate::parser::declarations::box_def::members::constructors::try_parse_constructor(self, is_override)? {
|
||||
constructors.insert(key, node);
|
||||
return Ok(true);
|
||||
}
|
||||
Ok(false)
|
||||
}
|
||||
|
||||
fn box_try_visibility(
|
||||
&mut self,
|
||||
visibility: &str,
|
||||
methods: &mut HashMap<String, ASTNode>,
|
||||
@ -422,69 +65,50 @@ impl NyashParser {
|
||||
private_fields: &mut Vec<String>,
|
||||
last_method_name: &mut Option<String>,
|
||||
) -> Result<bool, ParseError> {
|
||||
if visibility != "public" && visibility != "private" {
|
||||
return Ok(false);
|
||||
}
|
||||
if self.match_token(&TokenType::LBRACE) {
|
||||
self.advance();
|
||||
self.skip_newlines();
|
||||
while !self.match_token(&TokenType::RBRACE) && !self.is_at_end() {
|
||||
if let TokenType::IDENTIFIER(fname) = &self.current_token().token_type {
|
||||
let fname = fname.clone();
|
||||
if visibility == "public" {
|
||||
public_fields.push(fname.clone());
|
||||
} else {
|
||||
private_fields.push(fname.clone());
|
||||
}
|
||||
fields.push(fname);
|
||||
self.advance();
|
||||
if self.match_token(&TokenType::COMMA) {
|
||||
self.advance();
|
||||
}
|
||||
self.skip_newlines();
|
||||
continue;
|
||||
}
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
expected: "identifier in visibility block".to_string(),
|
||||
found: self.current_token().token_type.clone(),
|
||||
line: self.current_token().line,
|
||||
});
|
||||
}
|
||||
self.consume(TokenType::RBRACE)?;
|
||||
self.skip_newlines();
|
||||
crate::parser::declarations::box_def::members::fields::try_parse_visibility_block_or_single(
|
||||
self,
|
||||
visibility,
|
||||
methods,
|
||||
fields,
|
||||
public_fields,
|
||||
private_fields,
|
||||
last_method_name,
|
||||
)
|
||||
}
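This path accepts both the block form and the single header-first form of a visibility declaration; a sketch with placeholder names:

    box Account {
        public { owner, balance }
        private { auditLog }
        public nickname
    }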
|
||||
|
||||
/// Parse either a method or a header-first field/property starting with `name`.
|
||||
/// Updates `methods`/`fields` and `last_method_name` as appropriate.
|
||||
fn box_try_method_or_field(
|
||||
&mut self,
|
||||
name: String,
|
||||
is_override: bool,
|
||||
methods: &mut HashMap<String, ASTNode>,
|
||||
fields: &mut Vec<String>,
|
||||
birth_once_props: &Vec<String>,
|
||||
last_method_name: &mut Option<String>,
|
||||
) -> Result<bool, ParseError> {
|
||||
if let Some(method) = crate::parser::declarations::box_def::members::methods::try_parse_method(
|
||||
self,
|
||||
name.clone(),
|
||||
is_override,
|
||||
birth_once_props,
|
||||
)? {
|
||||
*last_method_name = Some(name.clone());
|
||||
methods.insert(name, method);
|
||||
return Ok(true);
|
||||
}
|
||||
if let TokenType::IDENTIFIER(n) = &self.current_token().token_type {
|
||||
let fname = n.clone();
|
||||
self.advance();
|
||||
if crate::parser::declarations::box_def::members::fields::try_parse_header_first_field_or_property(
|
||||
self,
|
||||
fname.clone(),
|
||||
methods,
|
||||
fields,
|
||||
)? {
|
||||
if visibility == "public" {
|
||||
public_fields.push(fname.clone());
|
||||
} else {
|
||||
private_fields.push(fname.clone());
|
||||
}
|
||||
*last_method_name = None;
|
||||
self.skip_newlines();
|
||||
return Ok(true);
|
||||
} else {
|
||||
// Fallback: record visibility + field name for compatibility
|
||||
if visibility == "public" {
|
||||
public_fields.push(fname.clone());
|
||||
} else {
|
||||
private_fields.push(fname.clone());
|
||||
}
|
||||
fields.push(fname);
|
||||
self.skip_newlines();
|
||||
return Ok(true);
|
||||
}
|
||||
}
|
||||
Ok(false)
|
||||
// Fallback: header-first field/property (computed/once/birth_once handled inside)
|
||||
crate::parser::declarations::box_def::members::fields::try_parse_header_first_field_or_property(
|
||||
self,
|
||||
name,
|
||||
methods,
|
||||
fields,
|
||||
)
|
||||
}
|
||||
// parse_unified_member_block_first moved to members::properties
|
||||
|
||||
// parse_method_postfix_after_last_method moved to members::postfix
|
||||
|
||||
/// box宣言をパース: box Name { fields... methods... }
|
||||
pub fn parse_box_declaration(&mut self) -> Result<ASTNode, ParseError> {
|
||||
self.consume(TokenType::BOX)?;
|
||||
@ -515,10 +139,10 @@ impl NyashParser {
|
||||
}
|
||||
|
||||
// nyashモード(block-first): { body } as (once|birth_once)? name : Type
|
||||
if self.parse_unified_member_block_first(&mut methods, &mut birth_once_props)? { continue; }
|
||||
if self.box_try_block_first_property(&mut methods, &mut birth_once_props)? { continue; }
|
||||
|
||||
// Fallback: method-level postfix catch/cleanup after a method (non-static box)
|
||||
if self.parse_method_postfix_after_last_method(&mut methods, &last_method_name)? { continue; }
|
||||
if self.box_try_method_postfix_after_last(&mut methods, &last_method_name)? { continue; }
|
||||
|
||||
// RBRACEに到達していればループを抜ける
|
||||
if self.match_token(&TokenType::RBRACE) {
|
||||
@ -526,7 +150,7 @@ impl NyashParser {
|
||||
}
|
||||
|
||||
// initブロックの処理(initメソッドではない場合のみ)
|
||||
if self.parse_init_block_if_any(&mut init_fields, &mut weak_fields)? { continue; }
|
||||
if self.box_try_init_block(&mut init_fields, &mut weak_fields)? { continue; }
|
||||
|
||||
// overrideキーワードをチェック
|
||||
let mut is_override = false;
|
||||
@ -536,22 +160,10 @@ impl NyashParser {
|
||||
}
|
||||
|
||||
// constructor parsing moved to members::constructors
|
||||
if let Some((ctor_key, ctor_node)) = crate::parser::declarations::box_def::members::constructors::try_parse_constructor(self, is_override)? {
|
||||
constructors.insert(ctor_key, ctor_node);
|
||||
continue;
|
||||
}
|
||||
if self.box_try_constructor(is_override, &mut constructors)? { continue; }
|
||||
|
||||
// 🚨 birth()統一システム: Box名コンストラクタ無効化
|
||||
// Box名と同じ名前のコンストラクタは禁止(birth()のみ許可)
|
||||
if let TokenType::IDENTIFIER(id) = &self.current_token().token_type {
|
||||
if id == &name && self.peek_token() == &TokenType::LPAREN {
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
expected: format!("birth() constructor instead of {}(). Nyash uses birth() for unified constructor syntax.", name),
|
||||
found: TokenType::IDENTIFIER(name.clone()),
|
||||
line: self.current_token().line,
|
||||
});
|
||||
}
|
||||
}
|
||||
crate::parser::declarations::box_def::validators::forbid_box_named_constructor(self, &name)?;
|
||||
|
||||
// 通常のフィールド名またはメソッド名、または unified members の先頭キーワードを読み取り
|
||||
if let TokenType::IDENTIFIER(field_or_method) = &self.current_token().token_type {
|
||||
@ -559,7 +171,7 @@ impl NyashParser {
|
||||
self.advance();
|
||||
|
||||
// 可視性: public/private ブロック/単行
|
||||
if self.parse_visibility_block_or_single(
|
||||
if self.box_try_visibility(
|
||||
&field_or_method,
|
||||
&mut methods,
|
||||
&mut fields,
|
||||
@ -583,26 +195,14 @@ impl NyashParser {
|
||||
}
|
||||
|
||||
// メソッド or フィールド(委譲)
|
||||
if let Some(method) = crate::parser::declarations::box_def::members::methods::try_parse_method(
|
||||
self,
|
||||
field_or_method.clone(),
|
||||
if self.box_try_method_or_field(
|
||||
field_or_method,
|
||||
is_override,
|
||||
&mut methods,
|
||||
&mut fields,
|
||||
&birth_once_props,
|
||||
)? {
|
||||
last_method_name = Some(field_or_method.clone());
|
||||
methods.insert(field_or_method, method);
|
||||
} else {
|
||||
// フィールド or 統一メンバ(computed/once/birth_once header-first)
|
||||
let fname = field_or_method;
|
||||
if crate::parser::declarations::box_def::members::fields::try_parse_header_first_field_or_property(
|
||||
self,
|
||||
fname,
|
||||
&mut methods,
|
||||
&mut fields,
|
||||
)? {
|
||||
continue;
|
||||
}
|
||||
}
|
||||
&mut last_method_name,
|
||||
)? { continue; }
|
||||
} else {
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
expected: "method or field name".to_string(),
|
||||
@ -614,7 +214,7 @@ impl NyashParser {
|
||||
|
||||
self.consume(TokenType::RBRACE)?;
|
||||
// 🚫 Disallow method named same as the box (constructor-like confusion)
|
||||
self.validate_no_ctor_like_name(&name, &methods)?;
|
||||
crate::parser::declarations::box_def::validators::validate_no_ctor_like_name(self, &name, &methods)?;
|
||||
|
||||
// 🔥 Override validation
|
||||
for parent in &extends {
|
||||
@ -622,7 +222,7 @@ impl NyashParser {
|
||||
}
|
||||
|
||||
// birth_once 相互依存の簡易検出(宣言間の循環)
|
||||
self.validate_birth_once_cycles(&methods)?;
|
||||
crate::parser::declarations::box_def::validators::validate_birth_once_cycles(self, &methods)?;
|
||||
|
||||
Ok(ASTNode::BoxDeclaration {
|
||||
name,
|
||||
@ -645,169 +245,8 @@ impl NyashParser {
|
||||
|
||||
/// interface box宣言をパース: interface box Name { methods... }
|
||||
pub fn parse_interface_box_declaration(&mut self) -> Result<ASTNode, ParseError> {
|
||||
self.consume(TokenType::INTERFACE)?;
|
||||
self.consume(TokenType::BOX)?;
|
||||
|
||||
let name = if let TokenType::IDENTIFIER(name) = &self.current_token().token_type {
|
||||
let name = name.clone();
|
||||
self.advance();
|
||||
name
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "identifier".to_string(),
|
||||
line,
|
||||
});
|
||||
};
|
||||
|
||||
self.consume(TokenType::LBRACE)?;
|
||||
self.skip_newlines(); // ブレース後の改行をスキップ
|
||||
|
||||
let mut methods = HashMap::new();
|
||||
|
||||
while !self.match_token(&TokenType::RBRACE) && !self.is_at_end() {
|
||||
self.skip_newlines(); // ループ開始時に改行をスキップ
|
||||
if let TokenType::IDENTIFIER(method_name) = &self.current_token().token_type {
|
||||
let method_name = method_name.clone();
|
||||
self.advance();
|
||||
|
||||
// インターフェースメソッドはシグネチャのみ
|
||||
if self.match_token(&TokenType::LPAREN) {
|
||||
self.advance(); // consume '('
|
||||
|
||||
let mut params = Vec::new();
|
||||
while !self.match_token(&TokenType::RPAREN) && !self.is_at_end() {
|
||||
if let TokenType::IDENTIFIER(param) = &self.current_token().token_type {
|
||||
params.push(param.clone());
|
||||
self.advance();
|
||||
}
|
||||
|
||||
if self.match_token(&TokenType::COMMA) {
|
||||
self.advance();
|
||||
}
|
||||
}
|
||||
|
||||
self.consume(TokenType::RPAREN)?;
|
||||
|
||||
// インターフェースメソッドは実装なし(空のbody)
|
||||
let method_decl = ASTNode::FunctionDeclaration {
|
||||
name: method_name.clone(),
|
||||
params,
|
||||
body: vec![], // 空の実装
|
||||
is_static: false, // インターフェースメソッドは通常静的でない
|
||||
is_override: false, // デフォルトは非オーバーライド
|
||||
span: Span::unknown(),
|
||||
};
|
||||
|
||||
methods.insert(method_name, method_decl);
|
||||
|
||||
// メソッド宣言後の改行をスキップ
|
||||
self.skip_newlines();
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "(".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "method name".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
self.consume(TokenType::RBRACE)?;
|
||||
|
||||
Ok(ASTNode::BoxDeclaration {
|
||||
name,
|
||||
fields: vec![], // インターフェースはフィールドなし
|
||||
public_fields: vec![],
|
||||
private_fields: vec![],
|
||||
methods,
|
||||
constructors: HashMap::new(), // インターフェースにコンストラクタなし
|
||||
init_fields: vec![], // インターフェースにinitブロックなし
|
||||
weak_fields: vec![], // 🔗 インターフェースにweak fieldsなし
|
||||
is_interface: true, // インターフェースフラグ
|
||||
extends: vec![], // 🚀 Multi-delegation: Changed from None to vec![]
|
||||
implements: vec![],
|
||||
type_parameters: Vec::new(), // 🔥 インターフェースではジェネリクス未対応
|
||||
is_static: false, // インターフェースは非static
|
||||
static_init: None, // インターフェースにstatic initなし
|
||||
span: Span::unknown(),
|
||||
})
|
||||
crate::parser::declarations::box_def::interface::parse_interface_box(self)
|
||||
}
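The interface form accepted here (now delegated to box_def::interface::parse_interface_box) declares method signatures only, with no bodies; a sketch with placeholder names:

    interface box Greeter {
        greet(name)
        farewell(name)
    }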
|
||||
}
|
||||
|
||||
impl NyashParser {
|
||||
/// Minimal scan: does body contain `me.<field>` access (direct self-cycle guard)
|
||||
fn ast_contains_me_field(&self, nodes: &Vec<ASTNode>, field: &str) -> bool {
|
||||
fn scan(nodes: &Vec<ASTNode>, field: &str) -> bool {
|
||||
for n in nodes {
|
||||
match n {
|
||||
ASTNode::FieldAccess { object, field: f, .. } => {
|
||||
if f == field {
|
||||
if let ASTNode::Me { .. } = object.as_ref() {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
}
|
||||
ASTNode::Program { statements, .. } => {
|
||||
if scan(statements, field) { return true; }
|
||||
}
|
||||
ASTNode::If { then_body, else_body, .. } => {
|
||||
if scan(then_body, field) { return true; }
|
||||
if let Some(eb) = else_body { if scan(eb, field) { return true; } }
|
||||
}
|
||||
ASTNode::TryCatch { try_body, catch_clauses, finally_body, .. } => {
|
||||
if scan(try_body, field) { return true; }
|
||||
for c in catch_clauses { if scan(&c.body, field) { return true; } }
|
||||
if let Some(fb) = finally_body { if scan(fb, field) { return true; } }
|
||||
}
|
||||
ASTNode::FunctionDeclaration { body, .. } => {
|
||||
if scan(body, field) { return true; }
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
false
|
||||
}
|
||||
scan(nodes, field)
|
||||
}
|
||||
|
||||
/// Collect all `me.<field>` accessed in nodes (flat set)
|
||||
fn ast_collect_me_fields(&self, nodes: &Vec<ASTNode>) -> std::collections::HashSet<String> {
|
||||
use std::collections::HashSet;
|
||||
fn scan(nodes: &Vec<ASTNode>, out: &mut HashSet<String>) {
|
||||
for n in nodes {
|
||||
match n {
|
||||
ASTNode::FieldAccess { object, field, .. } => {
|
||||
if let ASTNode::Me { .. } = object.as_ref() {
|
||||
out.insert(field.clone());
|
||||
}
|
||||
}
|
||||
ASTNode::Program { statements, .. } => scan(statements, out),
|
||||
ASTNode::If { then_body, else_body, .. } => {
|
||||
scan(then_body, out);
|
||||
if let Some(eb) = else_body { scan(eb, out); }
|
||||
}
|
||||
ASTNode::TryCatch { try_body, catch_clauses, finally_body, .. } => {
|
||||
scan(try_body, out);
|
||||
for c in catch_clauses { scan(&c.body, out); }
|
||||
if let Some(fb) = finally_body { scan(fb, out); }
|
||||
}
|
||||
ASTNode::FunctionDeclaration { body, .. } => scan(body, out),
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
}
|
||||
let mut hs = HashSet::new();
|
||||
scan(nodes, &mut hs);
|
||||
hs
|
||||
}
|
||||
}
|
||||
// ast_collect_me_fields moved into box_def::validators (private helper)
|
||||
|
||||
@ -9,5 +9,6 @@ pub mod box_definition;
pub mod box_def;
pub mod dependency_helpers;
pub mod static_box;
pub mod static_def;

// Re-export commonly used items
@ -14,112 +14,8 @@ impl NyashParser {
|
||||
/// static box宣言をパース: static box Name { ... }
|
||||
pub fn parse_static_box(&mut self) -> Result<ASTNode, ParseError> {
|
||||
self.consume(TokenType::BOX)?;
|
||||
|
||||
let name = if let TokenType::IDENTIFIER(name) = &self.current_token().token_type {
|
||||
let name = name.clone();
|
||||
self.advance();
|
||||
name
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "identifier".to_string(),
|
||||
line,
|
||||
});
|
||||
};
|
||||
|
||||
// 🔥 ジェネリクス型パラメータのパース (<T, U>)
|
||||
let type_parameters = if self.match_token(&TokenType::LESS) {
|
||||
self.advance(); // consume '<'
|
||||
let mut params = Vec::new();
|
||||
|
||||
loop {
|
||||
if let TokenType::IDENTIFIER(param_name) = &self.current_token().token_type {
|
||||
params.push(param_name.clone());
|
||||
self.advance();
|
||||
|
||||
if self.match_token(&TokenType::COMMA) {
|
||||
self.advance(); // consume ','
|
||||
} else {
|
||||
break;
|
||||
}
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "type parameter name".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
self.consume(TokenType::GREATER)?; // consume '>'
|
||||
params
|
||||
} else {
|
||||
Vec::new()
|
||||
};
|
||||
|
||||
// from句のパース(Multi-delegation)- static boxでもデリゲーション可能 🚀
|
||||
let extends = if self.match_token(&TokenType::FROM) {
|
||||
self.advance(); // consume 'from'
|
||||
|
||||
let mut parent_list = Vec::new();
|
||||
|
||||
loop {
|
||||
if let TokenType::IDENTIFIER(parent_name) = &self.current_token().token_type {
|
||||
parent_list.push(parent_name.clone());
|
||||
self.advance();
|
||||
|
||||
if self.match_token(&TokenType::COMMA) {
|
||||
self.advance(); // consume ','
|
||||
} else {
|
||||
break;
|
||||
}
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "parent class name".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
parent_list
|
||||
} else {
|
||||
Vec::new()
|
||||
};
|
||||
|
||||
// interface句のパース(インターフェース実装)- static boxでもinterface実装可能
|
||||
let implements = if self.match_token(&TokenType::INTERFACE) {
|
||||
self.advance(); // consume 'interface'
|
||||
|
||||
let mut interface_list = Vec::new();
|
||||
|
||||
loop {
|
||||
if let TokenType::IDENTIFIER(interface_name) = &self.current_token().token_type {
|
||||
interface_list.push(interface_name.clone());
|
||||
self.advance();
|
||||
|
||||
if self.match_token(&TokenType::COMMA) {
|
||||
self.advance(); // consume ','
|
||||
} else {
|
||||
break;
|
||||
}
|
||||
} else {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "interface name".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
interface_list
|
||||
} else {
|
||||
vec![]
|
||||
};
|
||||
let (name, type_parameters, extends, implements) =
|
||||
crate::parser::declarations::static_def::header::parse_static_header(self)?;
|
||||
|
||||
self.consume(TokenType::LBRACE)?;
|
||||
self.skip_newlines(); // ブレース後の改行をスキップ
|
||||
@ -129,7 +25,7 @@ impl NyashParser {
|
||||
let constructors = HashMap::new();
|
||||
let mut init_fields = Vec::new();
|
||||
let mut weak_fields = Vec::new(); // 🔗 Track weak fields for static box
|
||||
let mut static_init = None;
|
||||
let mut static_init: Option<Vec<ASTNode>> = None;
|
||||
|
||||
// Track last inserted method name to allow postfix catch/cleanup fallback parsing
|
||||
let mut last_method_name: Option<String> = None;
|
||||
@ -137,212 +33,35 @@ impl NyashParser {
|
||||
self.skip_newlines(); // ループ開始時に改行をスキップ
|
||||
|
||||
// Fallback: method-level postfix catch/cleanup immediately following a method
|
||||
if (self.match_token(&TokenType::CATCH) || self.match_token(&TokenType::CLEANUP)) && last_method_name.is_some() {
|
||||
let mname = last_method_name.clone().unwrap();
|
||||
// Parse optional catch then optional cleanup
|
||||
let mut catch_clauses: Vec<crate::ast::CatchClause> = Vec::new();
|
||||
if self.match_token(&TokenType::CATCH) {
|
||||
self.advance();
|
||||
self.consume(TokenType::LPAREN)?;
|
||||
let (exc_ty, exc_var) = self.parse_catch_param()?;
|
||||
self.consume(TokenType::RPAREN)?;
|
||||
let catch_body = self.parse_block_statements()?;
|
||||
catch_clauses.push(crate::ast::CatchClause { exception_type: exc_ty, variable_name: exc_var, body: catch_body, span: crate::ast::Span::unknown() });
|
||||
self.skip_newlines();
|
||||
if self.match_token(&TokenType::CATCH) {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken { found: self.current_token().token_type.clone(), expected: "single catch only after method body".to_string(), line });
|
||||
}
|
||||
}
|
||||
let finally_body = if self.match_token(&TokenType::CLEANUP) { self.advance(); Some(self.parse_block_statements()?) } else { None };
|
||||
// Wrap existing method body
|
||||
if let Some(mnode) = methods.get_mut(&mname) {
|
||||
if let crate::ast::ASTNode::FunctionDeclaration { body, .. } = mnode {
|
||||
// If already TryCatch present, disallow duplicate postfix
|
||||
let already = body.iter().any(|n| matches!(n, crate::ast::ASTNode::TryCatch{..}));
|
||||
if already {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken { found: self.current_token().token_type.clone(), expected: "duplicate postfix catch/cleanup after method".to_string(), line });
|
||||
}
|
||||
let old = std::mem::take(body);
|
||||
*body = vec![crate::ast::ASTNode::TryCatch { try_body: old, catch_clauses, finally_body, span: crate::ast::Span::unknown() }];
|
||||
continue;
|
||||
}
|
||||
}
|
||||
}
|
||||
if crate::parser::declarations::box_def::members::postfix::try_parse_method_postfix_after_last_method(
|
||||
self, &mut methods, &last_method_name,
|
||||
)? { continue; }
|
||||
|
||||
// RBRACEに到達していればループを抜ける
|
||||
if self.match_token(&TokenType::RBRACE) {
|
||||
break;
|
||||
}
|
||||
|
||||
// 🔥 static 初期化子の処理
|
||||
// Gate: NYASH_PARSER_STATIC_INIT_STRICT=1 のとき、
|
||||
// - 直後が '{' の場合のみ static 初期化子として扱う
|
||||
// - 直後が 'box' or 'function' の場合は、トップレベル宣言の開始とみなし、この box 本体を閉じる
|
||||
// 既定(ゲートOFF)は従来挙動(常に static { ... } を期待)
|
||||
if self.match_token(&TokenType::STATIC) {
|
||||
let strict = std::env::var("NYASH_PARSER_STATIC_INIT_STRICT").ok().as_deref() == Some("1");
|
||||
if strict {
|
||||
match self.peek_token() {
|
||||
TokenType::LBRACE => {
|
||||
self.advance(); // consume 'static'
|
||||
let static_body = self.parse_block_statements()?;
|
||||
static_init = Some(static_body);
|
||||
continue;
|
||||
}
|
||||
TokenType::BOX | TokenType::FUNCTION => {
|
||||
// トップレベルの `static box|function` が続くシーム: ここで box を閉じる
|
||||
break;
|
||||
}
|
||||
_ => {
|
||||
// 不明な形は従来通り initializer として解釈(互換重視)
|
||||
self.advance();
|
||||
let static_body = self.parse_block_statements()?;
|
||||
static_init = Some(static_body);
|
||||
continue;
|
||||
}
|
||||
}
|
||||
} else {
|
||||
self.advance(); // consume 'static'
|
||||
let static_body = self.parse_block_statements()?;
|
||||
static_init = Some(static_body);
|
||||
continue;
|
||||
}
|
||||
}
|
||||
|
||||
// initブロックの処理
|
||||
if self.match_token(&TokenType::INIT) {
|
||||
self.advance(); // consume 'init'
|
||||
self.consume(TokenType::LBRACE)?;
|
||||
|
||||
// initブロック内のフィールド定義を読み込み
|
||||
while !self.match_token(&TokenType::RBRACE) && !self.is_at_end() {
|
||||
self.skip_newlines();
|
||||
|
||||
if self.match_token(&TokenType::RBRACE) {
|
||||
break;
|
||||
}
|
||||
|
||||
// Check for weak modifier
|
||||
let is_weak = if self.match_token(&TokenType::WEAK) {
|
||||
self.advance(); // consume 'weak'
|
||||
true
|
||||
} else {
|
||||
false
|
||||
};
|
||||
|
||||
if let TokenType::IDENTIFIER(field_name) = &self.current_token().token_type {
|
||||
init_fields.push(field_name.clone());
|
||||
if is_weak {
|
||||
weak_fields.push(field_name.clone()); // 🔗 Add to weak fields list
|
||||
}
|
||||
self.advance();
|
||||
|
||||
// カンマがあればスキップ
|
||||
if self.match_token(&TokenType::COMMA) {
|
||||
self.advance();
|
||||
}
|
||||
} else {
|
||||
// 不正なトークンがある場合はエラー
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
expected: if is_weak {
|
||||
"field name after 'weak'"
|
||||
} else {
|
||||
"field name"
|
||||
}
|
||||
.to_string(),
|
||||
found: self.current_token().token_type.clone(),
|
||||
line: self.current_token().line,
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
self.consume(TokenType::RBRACE)?;
|
||||
// 🔥 static 初期化子の処理(厳密ゲート互換)
|
||||
if let Some(body) = crate::parser::declarations::static_def::members::parse_static_initializer_if_any(self)? {
|
||||
static_init = Some(body);
|
||||
continue;
|
||||
} else if self.match_token(&TokenType::STATIC) {
|
||||
// STRICT で top-level seam を検出した場合は while を抜ける
|
||||
break;
|
||||
}
|
||||
|
||||
// initブロックの処理(共通ヘルパに委譲)
|
||||
if crate::parser::declarations::box_def::members::fields::parse_init_block_if_any(
|
||||
self, &mut init_fields, &mut weak_fields,
|
||||
)? { continue; }
|
||||
|
||||
if let TokenType::IDENTIFIER(field_or_method) = &self.current_token().token_type {
|
||||
let field_or_method = field_or_method.clone();
|
||||
self.advance();
|
||||
|
||||
// メソッド定義か?
|
||||
if self.match_token(&TokenType::LPAREN) {
|
||||
// メソッド定義
|
||||
self.advance(); // consume '('
|
||||
|
||||
let mut params = Vec::new();
|
||||
while !self.match_token(&TokenType::RPAREN) && !self.is_at_end() {
|
||||
if let TokenType::IDENTIFIER(param) = &self.current_token().token_type {
|
||||
params.push(param.clone());
|
||||
self.advance();
|
||||
}
|
||||
|
||||
if self.match_token(&TokenType::COMMA) {
|
||||
self.advance();
|
||||
}
|
||||
}
|
||||
|
||||
self.consume(TokenType::RPAREN)?;
|
||||
let mut body = self.parse_block_statements()?;
|
||||
self.skip_newlines();
|
||||
|
||||
// Method-level postfix catch/cleanup (gate)
|
||||
if self.match_token(&TokenType::CATCH) || self.match_token(&TokenType::CLEANUP)
|
||||
{
|
||||
let mut catch_clauses: Vec<crate::ast::CatchClause> = Vec::new();
|
||||
if self.match_token(&TokenType::CATCH) {
|
||||
self.advance(); // consume 'catch'
|
||||
self.consume(TokenType::LPAREN)?;
|
||||
let (exc_ty, exc_var) = self.parse_catch_param()?;
|
||||
self.consume(TokenType::RPAREN)?;
|
||||
let catch_body = self.parse_block_statements()?;
|
||||
catch_clauses.push(crate::ast::CatchClause {
|
||||
exception_type: exc_ty,
|
||||
variable_name: exc_var,
|
||||
body: catch_body,
|
||||
span: crate::ast::Span::unknown(),
|
||||
});
|
||||
self.skip_newlines();
|
||||
if self.match_token(&TokenType::CATCH) {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "single catch only after method body".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
}
|
||||
let finally_body = if self.match_token(&TokenType::CLEANUP) {
|
||||
self.advance();
|
||||
Some(self.parse_block_statements()?)
|
||||
} else {
|
||||
None
|
||||
};
|
||||
// Wrap original body with TryCatch
|
||||
body = vec![ASTNode::TryCatch {
|
||||
try_body: body,
|
||||
catch_clauses,
|
||||
finally_body,
|
||||
span: crate::ast::Span::unknown(),
|
||||
}];
|
||||
}
|
||||
|
||||
let method = ASTNode::FunctionDeclaration {
|
||||
name: field_or_method.clone(),
|
||||
params,
|
||||
body,
|
||||
is_static: false, // static box内のメソッドは通常メソッド
|
||||
is_override: false, // デフォルトは非オーバーライド
|
||||
span: Span::unknown(),
|
||||
};
|
||||
|
||||
last_method_name = Some(field_or_method.clone());
|
||||
methods.insert(field_or_method, method);
|
||||
} else {
|
||||
// フィールド定義
|
||||
fields.push(field_or_method);
|
||||
}
|
||||
crate::parser::declarations::static_def::members::try_parse_method_or_field(
|
||||
self, field_or_method, &mut methods, &mut fields, &mut last_method_name,
|
||||
)?;
|
||||
} else {
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
expected: "method or field name".to_string(),
|
||||
|
||||
91  src/parser/declarations/static_def/header.rs  (new file)
@ -0,0 +1,91 @@
//! Header parsing for `static box Name<T...> from Parent1, ... [interface ...]` (staged)
use crate::parser::{NyashParser, ParseError};
use crate::parser::common::ParserUtils;
use crate::tokenizer::TokenType;

/// Parse the leading header of a static box and return
/// (name, type_params, extends, implements). Does not consume the opening '{'.
pub(crate) fn parse_static_header(
    p: &mut NyashParser,
) -> Result<(String, Vec<String>, Vec<String>, Vec<String>), ParseError> {
    // Name
    let name = if let TokenType::IDENTIFIER(name) = &p.current_token().token_type {
        let name = name.clone();
        p.advance();
        name
    } else {
        let line = p.current_token().line;
        return Err(ParseError::UnexpectedToken {
            found: p.current_token().token_type.clone(),
            expected: "identifier".to_string(),
            line,
        });
    };

    // Generic type parameters: <T, U>
    let type_parameters = if p.match_token(&TokenType::LESS) {
        p.advance(); // consume '<'
        let mut params = Vec::new();
        loop {
            if let TokenType::IDENTIFIER(param_name) = &p.current_token().token_type {
                params.push(param_name.clone());
                p.advance();
                if p.match_token(&TokenType::COMMA) { p.advance(); } else { break; }
            } else {
                let line = p.current_token().line;
                return Err(ParseError::UnexpectedToken {
                    found: p.current_token().token_type.clone(),
                    expected: "type parameter name".to_string(),
                    line,
                });
            }
        }
        p.consume(TokenType::GREATER)?;
        params
    } else { Vec::new() };

    // extends: from Parent1, Parent2
    let extends = if p.match_token(&TokenType::FROM) {
        p.advance(); // consume 'from'
        let mut parents = Vec::new();
        loop {
            if let TokenType::IDENTIFIER(parent_name) = &p.current_token().token_type {
                parents.push(parent_name.clone());
                p.advance();
                if p.match_token(&TokenType::COMMA) { p.advance(); } else { break; }
            } else {
                let line = p.current_token().line;
                return Err(ParseError::UnexpectedToken {
                    found: p.current_token().token_type.clone(),
                    expected: "parent class name".to_string(),
                    line,
                });
            }
        }
        parents
    } else { Vec::new() };

    // implements: `interface A, B` (optional)
    let implements = if p.match_token(&TokenType::INTERFACE) {
        p.advance(); // consume 'interface'
        let mut list = Vec::new();
        loop {
            if let TokenType::IDENTIFIER(interface_name) = &p.current_token().token_type {
                list.push(interface_name.clone());
                p.advance();
                if p.match_token(&TokenType::COMMA) { p.advance(); } else { break; }
            } else {
                let line = p.current_token().line;
                return Err(ParseError::UnexpectedToken {
                    found: p.current_token().token_type.clone(),
                    expected: "interface name".to_string(),
                    line,
                });
            }
        }
        list
    } else { Vec::new() };

    Ok((name, type_parameters, extends, implements))
}
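Read back as surface syntax, the header accepted by parse_static_header is roughly the following sketch; the generics, `from`, and `interface` clauses are each optional, and all names are placeholders:

    static box Registry<T> from BaseBox, LoggerBox interface Printable {
        init { items }
        count() {
            return 0
        }
    }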
90  src/parser/declarations/static_def/members.rs  (new file)
@ -0,0 +1,90 @@
//! Members helpers for static box (staged)
use crate::ast::ASTNode;
use crate::parser::{NyashParser, ParseError};
use crate::parser::common::ParserUtils;
use crate::tokenizer::TokenType;
use std::collections::HashMap;

/// Parse a `static { ... }` initializer if present, honoring STRICT gate behavior.
/// Returns Ok(Some(body)) when consumed; Ok(None) otherwise.
pub(crate) fn parse_static_initializer_if_any(
    p: &mut NyashParser,
) -> Result<Option<Vec<ASTNode>>, ParseError> {
    if !p.match_token(&TokenType::STATIC) { return Ok(None); }
    let strict = std::env::var("NYASH_PARSER_STATIC_INIT_STRICT").ok().as_deref() == Some("1");
    if strict {
        match p.peek_token() {
            TokenType::LBRACE => {
                p.advance(); // consume 'static'
                let body = p.parse_block_statements()?;
                return Ok(Some(body));
            }
            TokenType::BOX | TokenType::FUNCTION => {
                // top-level seam: do not consume, let caller close the box
                return Ok(None);
            }
            _ => {
                // backward-compatible fallback: treat as initializer
                p.advance();
                let body = p.parse_block_statements()?;
                return Ok(Some(body));
            }
        }
    } else {
        p.advance(); // consume 'static'
        let body = p.parse_block_statements()?;
        Ok(Some(body))
    }
}
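The initializer recognized here is a `static { ... }` block inside a static box; with NYASH_PARSER_STATIC_INIT_STRICT=1, a `static` token followed by `box` or `function` is instead treated as the start of the next top-level declaration and closes the current box. A minimal sketch (the field name and the assignment inside the block are placeholders):

    static box Counter {
        init { total }
        static {
            me.total = 0
        }
    }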

/// Parse a method body and apply optional postfix catch/cleanup inline (Stage‑3 gate).
/// Caller must have consumed `(` and collected `params` and parsed `body`.
pub(crate) fn wrap_method_body_with_postfix_if_any(
    p: &mut NyashParser,
    body: Vec<ASTNode>,
) -> Result<Vec<ASTNode>, ParseError> {
    crate::parser::declarations::box_def::members::postfix::wrap_with_optional_postfix(p, body)
}

/// Parse either a method or a field in static box after consuming an identifier `name`.
/// - If next token is `(`, parses a method with optional postfix and inserts into `methods`.
/// - Otherwise, treats as a field name and pushes into `fields`.
pub(crate) fn try_parse_method_or_field(
    p: &mut NyashParser,
    name: String,
    methods: &mut HashMap<String, ASTNode>,
    fields: &mut Vec<String>,
    last_method_name: &mut Option<String>,
) -> Result<bool, ParseError> {
    if !p.match_token(&TokenType::LPAREN) {
        // Field
        fields.push(name);
        return Ok(true);
    }
    // Method
    p.advance(); // consume '('
    let mut params = Vec::new();
    while !p.match_token(&TokenType::RPAREN) && !p.is_at_end() {
        crate::must_advance!(p, _unused, "static method parameter parsing");
        if let TokenType::IDENTIFIER(param) = &p.current_token().token_type {
            params.push(param.clone());
            p.advance();
        }
        if p.match_token(&TokenType::COMMA) { p.advance(); }
    }
    p.consume(TokenType::RPAREN)?;
    let body = p.parse_block_statements()?;
    let body = wrap_method_body_with_postfix_if_any(p, body)?;
    // Construct method node
    let method = ASTNode::FunctionDeclaration {
        name: name.clone(),
        params,
        body,
        is_static: false,
        is_override: false,
        span: crate::ast::Span::unknown(),
    };
    *last_method_name = Some(name.clone());
    methods.insert(name, method);
    Ok(true)
}
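try_parse_method_or_field treats `name(...) { ... }` as a method (with the same optional postfix catch/cleanup handled above) and a bare identifier as a field; a sketch with placeholder names:

    static box Jobs {
        queue
        submit(task) {
            me.accept(task)
        } catch (e) {
            me.reject(task)
        }
    }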
23  src/parser/declarations/static_def/mod.rs  (new file)
@ -0,0 +1,23 @@
//! Static Box Definition (staged split)
#![allow(dead_code)]

use crate::ast::ASTNode;
use crate::parser::{NyashParser, ParseError};

pub mod header;
pub mod members;
pub mod validators;

/// Facade placeholder for static box parsing (to be wired gradually).
pub(crate) struct StaticDefFacade;

impl StaticDefFacade {
    pub(crate) fn parse_box(_p: &mut NyashParser) -> Result<ASTNode, ParseError> {
        Err(ParseError::UnexpectedToken {
            found: crate::tokenizer::TokenType::EOF,
            expected: "static box declaration (facade not wired)".to_string(),
            line: 0,
        })
    }
}
12  src/parser/declarations/static_def/validators.rs  (new file)
@ -0,0 +1,12 @@
//! Static box validators (placeholder for symmetry)
#![allow(dead_code)]

use crate::parser::{NyashParser, ParseError};

pub(crate) struct StaticValidators;

impl StaticValidators {
    #[allow(dead_code)]
    pub(crate) fn validate(_p: &mut NyashParser) -> Result<(), ParseError> { Ok(()) }
}
@ -11,6 +11,78 @@ use crate::ast::{ASTNode, CatchClause, Span};
use crate::tokenizer::TokenType;

impl NyashParser {
    /// Parse a standalone block `{ ... }` and optional postfix `catch/cleanup` sequence.
    /// Returns Program(body) when no postfix keywords follow.
    fn parse_standalone_block_statement(&mut self) -> Result<ASTNode, ParseError> {
        // Parse the block body first
        let try_body = self.parse_block_statements()?;

        // Allow whitespace/newlines between block and postfix keywords
        self.skip_newlines();

        if crate::config::env::block_postfix_catch()
            && (self.match_token(&TokenType::CATCH) || self.match_token(&TokenType::CLEANUP))
        {
            // Parse at most one catch, then optional cleanup
            let mut catch_clauses: Vec<CatchClause> = Vec::new();
            if self.match_token(&TokenType::CATCH) {
                self.advance(); // consume 'catch'
                self.consume(TokenType::LPAREN)?;
                let (exception_type, exception_var) = self.parse_catch_param()?;
                self.consume(TokenType::RPAREN)?;
                let catch_body = self.parse_block_statements()?;
                catch_clauses.push(CatchClause {
                    exception_type,
                    variable_name: exception_var,
                    body: catch_body,
                    span: Span::unknown(),
                });

                // Single‑catch policy (MVP): disallow multiple catch in postfix form
                self.skip_newlines();
                if self.match_token(&TokenType::CATCH) {
                    let line = self.current_token().line;
                    return Err(ParseError::UnexpectedToken {
                        found: self.current_token().token_type.clone(),
                        expected: "single catch only after standalone block".to_string(),
                        line,
                    });
                }
            }

            // Optional cleanup
            let finally_body = if self.match_token(&TokenType::CLEANUP) {
                self.advance(); // consume 'cleanup'
                Some(self.parse_block_statements()?)
            } else {
                None
            };

            Ok(ASTNode::TryCatch {
                try_body,
                catch_clauses,
                finally_body,
                span: Span::unknown(),
            })
        } else {
            // No postfix keywords. If gate is on, enforce MVP static check:
            // direct top-level `throw` inside the standalone block must be followed by catch
            if crate::config::env::block_postfix_catch()
                && try_body.iter().any(|n| matches!(n, ASTNode::Throw { .. }))
            {
                let line = self.current_token().line;
                return Err(ParseError::UnexpectedToken {
                    found: self.current_token().token_type.clone(),
                    expected: "block with direct 'throw' must be followed by 'catch'".to_string(),
                    line,
                });
            }
            Ok(ASTNode::Program {
                statements: try_body,
                span: Span::unknown(),
            })
        }
    }
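Under the NYASH_BLOCK_CATCH / Stage-3 gate, this accepts a standalone block followed by at most one `catch` and an optional `cleanup`; a block whose top level contains a direct `throw` must carry a `catch`. A sketch in which handleError and releaseResources are hypothetical helpers:

    {
        throw "boom"
    } catch (e) {
        handleError(e)
    } cleanup {
        releaseResources()
    }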
    /// Helper: parse a block `{ stmt* }` and return its statements
    pub(super) fn parse_block_statements(&mut self) -> Result<Vec<ASTNode>, ParseError> {
        self.consume(TokenType::LBRACE)?;
@ -25,6 +97,151 @@ impl NyashParser {
        Ok(body)
    }

    /// Grouped: declarations (box/interface/global/function/static/import)
    fn parse_declaration_statement(&mut self) -> Result<ASTNode, ParseError> {
        match &self.current_token().token_type {
            TokenType::BOX => self.parse_box_declaration(),
            TokenType::IMPORT => self.parse_import(),
            TokenType::INTERFACE => self.parse_interface_box_declaration(),
            TokenType::GLOBAL => self.parse_global_var(),
            TokenType::FUNCTION => self.parse_function_declaration(),
            TokenType::STATIC => self.parse_static_declaration(),
            _ => {
                let line = self.current_token().line;
                Err(ParseError::UnexpectedToken {
                    found: self.current_token().token_type.clone(),
                    expected: "declaration statement".to_string(),
                    line,
                })
            }
        }
    }

    /// Grouped: control flow (if/loop/break/continue/return)
    fn parse_control_flow_statement(&mut self) -> Result<ASTNode, ParseError> {
        match &self.current_token().token_type {
            TokenType::IF => self.parse_if(),
            TokenType::LOOP => self.parse_loop(),
            TokenType::BREAK => self.parse_break(),
            TokenType::CONTINUE => self.parse_continue(),
            TokenType::RETURN => self.parse_return(),
            _ => {
                let line = self.current_token().line;
                Err(ParseError::UnexpectedToken {
                    found: self.current_token().token_type.clone(),
                    expected: "control-flow statement".to_string(),
                    line,
                })
            }
        }
    }

    /// Grouped: IO/module-ish (print/nowait/include)
    fn parse_io_module_statement(&mut self) -> Result<ASTNode, ParseError> {
        match &self.current_token().token_type {
            TokenType::PRINT => self.parse_print(),
            TokenType::NOWAIT => self.parse_nowait(),
            TokenType::INCLUDE => self.parse_include(),
            _ => {
                let line = self.current_token().line;
                Err(ParseError::UnexpectedToken {
                    found: self.current_token().token_type.clone(),
                    expected: "io/module statement".to_string(),
                    line,
                })
            }
        }
    }

    /// Grouped: variable-related (local/outbox)
    fn parse_variable_declaration_statement(&mut self) -> Result<ASTNode, ParseError> {
        match &self.current_token().token_type {
            TokenType::LOCAL => self.parse_local(),
            TokenType::OUTBOX => self.parse_outbox(),
            _ => {
                let line = self.current_token().line;
                Err(ParseError::UnexpectedToken {
                    found: self.current_token().token_type.clone(),
                    expected: "variable declaration".to_string(),
                    line,
                })
            }
        }
    }

    /// Grouped: exception (try/throw) with gate checks preserved
    fn parse_exception_statement(&mut self) -> Result<ASTNode, ParseError> {
        match &self.current_token().token_type {
            TokenType::TRY => {
                if crate::config::env::parser_stage3() {
                    self.parse_try_catch()
                } else {
                    Err(ParseError::UnexpectedToken {
                        found: self.current_token().token_type.clone(),
                        expected: "enable NYASH_PARSER_STAGE3=1 to use 'try'".to_string(),
                        line: self.current_token().line,
                    })
                }
            }
            TokenType::THROW => {
                if crate::config::env::parser_stage3() {
                    self.parse_throw()
                } else {
                    Err(ParseError::UnexpectedToken {
                        found: self.current_token().token_type.clone(),
                        expected: "enable NYASH_PARSER_STAGE3=1 to use 'throw'".to_string(),
                        line: self.current_token().line,
                    })
                }
            }
            _ => {
                let line = self.current_token().line;
                Err(ParseError::UnexpectedToken {
                    found: self.current_token().token_type.clone(),
                    expected: "try/throw".to_string(),
                    line,
                })
            }
        }
    }

    /// Error helpers for standalone postfix keywords (catch/cleanup)
    fn parse_postfix_catch_cleanup_error(&mut self) -> Result<ASTNode, ParseError> {
        match &self.current_token().token_type {
            TokenType::CATCH => {
                if crate::config::env::block_postfix_catch() {
                    Err(ParseError::UnexpectedToken {
                        found: self.current_token().token_type.clone(),
                        expected: "postfix 'catch' is only allowed immediately after a standalone block: { ... } catch (...) { ... } (wrap if/else/loop in a standalone block)".to_string(),
                        line: self.current_token().line,
                    })
                } else {
                    Err(ParseError::UnexpectedToken {
                        found: self.current_token().token_type.clone(),
                        expected: "enable NYASH_BLOCK_CATCH=1 (or NYASH_PARSER_STAGE3=1) to use postfix 'catch' after a standalone block".to_string(),
                        line: self.current_token().line,
                    })
                }
            }
            TokenType::CLEANUP => {
                if crate::config::env::block_postfix_catch() {
                    Err(ParseError::UnexpectedToken {
                        found: self.current_token().token_type.clone(),
                        expected: "postfix 'cleanup' is only allowed immediately after a standalone block: { ... } cleanup { ... }".to_string(),
                        line: self.current_token().line,
                    })
                } else {
                    Err(ParseError::UnexpectedToken {
                        found: self.current_token().token_type.clone(),
                        expected: "enable NYASH_BLOCK_CATCH=1 (or NYASH_PARSER_STAGE3=1) to use postfix 'cleanup' after a standalone block".to_string(),
                        line: self.current_token().line,
                    })
                }
            }
            _ => unreachable!(),
        }
    }

    /// Helper: parse catch parameter inside parentheses (after '(' consumed)
    /// Forms: (Type ident) | (ident) | ()
    pub(super) fn parse_catch_param(&mut self) -> Result<(Option<String>, Option<String>), ParseError> {
@ -66,149 +283,22 @@ impl NyashParser {
|
||||
// For grammar diff: capture starting token to classify statement keyword
|
||||
let start_tok = self.current_token().token_type.clone();
|
||||
let result = match &start_tok {
|
||||
TokenType::LBRACE => {
|
||||
// Standalone block (Phase 15.5): may be followed by block‑postfix catch/finally
|
||||
// Only enabled under gate; otherwise treat as error via expression fallback
|
||||
// Parse the block body first
|
||||
let try_body = self.parse_block_statements()?;
|
||||
|
||||
// Allow whitespace/newlines between block and postfix keywords
|
||||
self.skip_newlines();
|
||||
|
||||
if crate::config::env::block_postfix_catch()
|
||||
&& (self.match_token(&TokenType::CATCH) || self.match_token(&TokenType::CLEANUP))
|
||||
{
|
||||
// Parse at most one catch, then optional cleanup
|
||||
let mut catch_clauses: Vec<CatchClause> = Vec::new();
|
||||
if self.match_token(&TokenType::CATCH) {
|
||||
self.advance(); // consume 'catch'
|
||||
self.consume(TokenType::LPAREN)?;
|
||||
let (exception_type, exception_var) = self.parse_catch_param()?;
|
||||
self.consume(TokenType::RPAREN)?;
|
||||
let catch_body = self.parse_block_statements()?;
|
||||
catch_clauses.push(CatchClause {
|
||||
exception_type,
|
||||
variable_name: exception_var,
|
||||
body: catch_body,
|
||||
span: Span::unknown(),
|
||||
});
|
||||
|
||||
// Single‑catch policy (MVP): disallow multiple catch in postfix form
|
||||
self.skip_newlines();
|
||||
if self.match_token(&TokenType::CATCH) {
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "single catch only after standalone block".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Optional cleanup
|
||||
let finally_body = if self.match_token(&TokenType::CLEANUP) {
|
||||
self.advance(); // consume 'cleanup'
|
||||
Some(self.parse_block_statements()?)
|
||||
} else {
|
||||
None
|
||||
};
|
||||
|
||||
Ok(ASTNode::TryCatch {
|
||||
try_body,
|
||||
catch_clauses,
|
||||
finally_body,
|
||||
span: Span::unknown(),
|
||||
})
|
||||
} else {
|
||||
// No postfix keywords. If gate is on, enforce MVP static check:
|
||||
// direct top-level `throw` inside the standalone block must be followed by catch
|
||||
if crate::config::env::block_postfix_catch()
|
||||
&& try_body.iter().any(|n| matches!(n, ASTNode::Throw { .. }))
|
||||
{
|
||||
let line = self.current_token().line;
|
||||
return Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "block with direct 'throw' must be followed by 'catch'".to_string(),
|
||||
line,
|
||||
});
|
||||
}
|
||||
Ok(ASTNode::Program {
|
||||
statements: try_body,
|
||||
span: Span::unknown(),
|
||||
})
|
||||
}
|
||||
}
|
||||
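// Illustrative Nyash input accepted by the LBRACE arm above (requires NYASH_BLOCK_CATCH=1; bodies/identifiers are placeholders):
//   { throw "boom" } catch (e) { } cleanup { }
// A second `catch` after the block is rejected (single-catch MVP), and a standalone block with a
// direct top-level `throw` but no following `catch` is a parse error.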
TokenType::BOX => self.parse_box_declaration(),
|
||||
TokenType::IMPORT => self.parse_import(),
|
||||
TokenType::INTERFACE => self.parse_interface_box_declaration(),
|
||||
TokenType::GLOBAL => self.parse_global_var(),
|
||||
TokenType::FUNCTION => self.parse_function_declaration(),
|
||||
TokenType::STATIC => {
|
||||
self.parse_static_declaration() // 🔥 静的宣言 (function/box)
|
||||
}
|
||||
TokenType::IF => self.parse_if(),
|
||||
TokenType::LOOP => self.parse_loop(),
|
||||
TokenType::BREAK => self.parse_break(),
|
||||
TokenType::CONTINUE => self.parse_continue(),
|
||||
TokenType::RETURN => self.parse_return(),
|
||||
TokenType::PRINT => self.parse_print(),
|
||||
TokenType::NOWAIT => self.parse_nowait(),
|
||||
TokenType::INCLUDE => self.parse_include(),
|
||||
TokenType::LOCAL => self.parse_local(),
|
||||
TokenType::OUTBOX => self.parse_outbox(),
|
||||
TokenType::TRY => {
|
||||
if crate::config::env::parser_stage3() {
|
||||
self.parse_try_catch()
|
||||
} else {
|
||||
Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "enable NYASH_PARSER_STAGE3=1 to use 'try'".to_string(),
|
||||
line: self.current_token().line,
|
||||
})
|
||||
}
|
||||
}
|
||||
TokenType::THROW => {
|
||||
if crate::config::env::parser_stage3() {
|
||||
self.parse_throw()
|
||||
} else {
|
||||
Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "enable NYASH_PARSER_STAGE3=1 to use 'throw'".to_string(),
|
||||
line: self.current_token().line,
|
||||
})
|
||||
}
|
||||
}
|
||||
TokenType::CATCH => {
|
||||
// Provide a friendlier error when someone writes: if { .. } catch { .. }
|
||||
if crate::config::env::block_postfix_catch() {
|
||||
Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "postfix 'catch' is only allowed immediately after a standalone block: { ... } catch (...) { ... } (wrap if/else/loop in a standalone block)".to_string(),
|
||||
line: self.current_token().line,
|
||||
})
|
||||
} else {
|
||||
Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "enable NYASH_BLOCK_CATCH=1 (or NYASH_PARSER_STAGE3=1) to use postfix 'catch' after a standalone block".to_string(),
|
||||
line: self.current_token().line,
|
||||
})
|
||||
}
|
||||
}
|
||||
TokenType::CLEANUP => {
|
||||
if crate::config::env::block_postfix_catch() {
|
||||
Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "postfix 'cleanup' is only allowed immediately after a standalone block: { ... } cleanup { ... }".to_string(),
|
||||
line: self.current_token().line,
|
||||
})
|
||||
} else {
|
||||
Err(ParseError::UnexpectedToken {
|
||||
found: self.current_token().token_type.clone(),
|
||||
expected: "enable NYASH_BLOCK_CATCH=1 (or NYASH_PARSER_STAGE3=1) to use postfix 'cleanup' after a standalone block".to_string(),
|
||||
line: self.current_token().line,
|
||||
})
|
||||
}
|
||||
}
|
||||
TokenType::LBRACE => self.parse_standalone_block_statement(),
|
||||
TokenType::BOX
|
||||
| TokenType::IMPORT
|
||||
| TokenType::INTERFACE
|
||||
| TokenType::GLOBAL
|
||||
| TokenType::FUNCTION
|
||||
| TokenType::STATIC => self.parse_declaration_statement(),
|
||||
TokenType::IF
|
||||
| TokenType::LOOP
|
||||
| TokenType::BREAK
|
||||
| TokenType::CONTINUE
|
||||
| TokenType::RETURN => self.parse_control_flow_statement(),
|
||||
TokenType::PRINT | TokenType::NOWAIT | TokenType::INCLUDE => self.parse_io_module_statement(),
|
||||
TokenType::LOCAL | TokenType::OUTBOX => self.parse_variable_declaration_statement(),
|
||||
TokenType::TRY | TokenType::THROW => self.parse_exception_statement(),
|
||||
TokenType::CATCH | TokenType::CLEANUP => self.parse_postfix_catch_cleanup_error(),
|
||||
TokenType::USING => self.parse_using(),
|
||||
TokenType::FROM => {
|
||||
// 🔥 from構文: from Parent.method(args) または from Parent.constructor(args)
|
||||
|
||||
@ -29,6 +29,7 @@ mod jit_direct;
|
||||
mod selfhost;
|
||||
mod tasks;
|
||||
mod trace;
|
||||
mod plugins;
|
||||
|
||||
// v2 plugin system imports
|
||||
use nyash_rust::runner_plugin_init;
|
||||
@ -37,7 +38,7 @@ use nyash_rust::runtime;
|
||||
|
||||
/// Resolve a using target according to priority: modules > relative > using-paths
|
||||
/// Returns Ok(resolved_path_or_token). On strict mode, ambiguous matches cause error.
|
||||
use pipeline::resolve_using_target;
|
||||
// use pipeline::resolve_using_target; // resolved within helpers; avoid unused warning
|
||||
|
||||
/// Main execution coordinator
|
||||
pub struct NyashRunner {
|
||||
@ -47,6 +48,7 @@ pub struct NyashRunner {
|
||||
/// Minimal task runner: read nyash.toml [env] and [tasks], run the named task via shell
|
||||
use tasks::run_named_task;
|
||||
|
||||
#[cfg(not(feature = "jit-direct-only"))]
|
||||
impl NyashRunner {
|
||||
/// Create a new runner with the given configuration
|
||||
pub fn new(config: CliConfig) -> Self {
|
||||
@ -55,402 +57,8 @@ impl NyashRunner {
|
||||
|
||||
/// Run Nyash based on the configuration
|
||||
pub fn run(&self) {
|
||||
// Macro sandbox child mode: --macro-expand-child <file>
|
||||
if let Some(ref macro_file) = self.config.macro_expand_child {
|
||||
crate::runner::modes::macro_child::run_macro_child(macro_file);
|
||||
return;
|
||||
}
|
||||
// Build system (MVP): nyash --build <nyash.toml>
|
||||
let groups = self.config.as_groups();
|
||||
if let Some(cfg_path) = groups.build.path.clone() {
|
||||
if let Err(e) = self.run_build_mvp(&cfg_path) {
|
||||
eprintln!("❌ build error: {}", e);
|
||||
std::process::exit(1);
|
||||
}
|
||||
return;
|
||||
}
|
||||
// Using/module overrides pre-processing
|
||||
let mut using_ctx = self.init_using_context();
|
||||
let mut pending_using: Vec<(String, Option<String>)> = Vec::new();
|
||||
// CLI --using SPEC entries (SPEC: 'ns', 'ns as Alias', '"path" as Alias')
|
||||
for spec in &groups.input.cli_usings {
|
||||
let s = spec.trim();
|
||||
if s.is_empty() {
|
||||
continue;
|
||||
}
|
||||
let (target, alias) = if let Some(pos) = s.find(" as ") {
|
||||
(
|
||||
s[..pos].trim().to_string(),
|
||||
Some(s[pos + 4..].trim().to_string()),
|
||||
)
|
||||
} else {
|
||||
(s.to_string(), None)
|
||||
};
|
||||
// Normalize quotes for path
|
||||
let is_path = target.starts_with('"')
|
||||
|| target.starts_with("./")
|
||||
|| target.starts_with('/')
|
||||
|| target.ends_with(".nyash");
|
||||
if is_path {
|
||||
let path = target.trim_matches('"').to_string();
|
||||
let name = alias.clone().unwrap_or_else(|| {
|
||||
std::path::Path::new(&path)
|
||||
.file_stem()
|
||||
.and_then(|s| s.to_str())
|
||||
.unwrap_or("module")
|
||||
.to_string()
|
||||
});
|
||||
pending_using.push((name, Some(path)));
|
||||
} else {
|
||||
pending_using.push((target, alias));
|
||||
}
|
||||
}
|
||||
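// Illustrative `--using SPEC` outcomes from the loop above (names/paths are placeholders):
//   --using mylib                  -> ("mylib", None)
//   --using mylib as M             -> ("mylib", Some("M"))
//   --using "./libs/m.nyash" as M  -> ("M", Some("./libs/m.nyash"))   // path form keys by alias or file stem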
for (ns, path) in using_ctx.pending_modules.iter() {
|
||||
let sb = crate::box_trait::StringBox::new(path.clone());
|
||||
crate::runtime::modules_registry::set(ns.clone(), Box::new(sb));
|
||||
}
|
||||
// Stage-1: Optional dependency tree bridge (log-only)
|
||||
if let Ok(dep_path) = std::env::var("NYASH_DEPS_JSON") {
|
||||
match std::fs::read_to_string(&dep_path) {
|
||||
Ok(s) => {
|
||||
let bytes = s.as_bytes().len();
|
||||
// Try to extract quick hints without failing
|
||||
let mut root_info = String::new();
|
||||
if let Ok(v) = serde_json::from_str::<serde_json::Value>(&s) {
|
||||
if let Some(r) = v.get("root_path").and_then(|x| x.as_str()) {
|
||||
root_info = format!(" root='{}'", r);
|
||||
}
|
||||
}
|
||||
crate::cli_v!(
|
||||
"[deps] loaded {} bytes from{} {}",
|
||||
bytes,
|
||||
if root_info.is_empty() { "" } else { ":" },
|
||||
root_info
|
||||
);
|
||||
}
|
||||
Err(e) => {
|
||||
crate::cli_v!("[deps] read error: {}", e);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Phase-15: JSON IR v0 bridge (stdin/file)
|
||||
if self.try_run_json_v0_pipe() {
|
||||
return;
|
||||
}
|
||||
// Run named task from nyash.toml (MVP)
|
||||
if let Some(task) = groups.run_task.clone() {
|
||||
if let Err(e) = run_named_task(&task) {
|
||||
eprintln!("❌ Task error: {}", e);
|
||||
process::exit(1);
|
||||
}
|
||||
return;
|
||||
}
|
||||
// Verbose CLI flag maps to env for downstream helpers/scripts
|
||||
if groups.debug.cli_verbose {
|
||||
std::env::set_var("NYASH_CLI_VERBOSE", "1");
|
||||
}
|
||||
// GC mode forwarding: map CLI --gc to NYASH_GC_MODE for downstream runtimes
|
||||
if let Some(ref m) = groups.gc_mode {
|
||||
if !m.trim().is_empty() {
|
||||
std::env::set_var("NYASH_GC_MODE", m);
|
||||
}
|
||||
}
|
||||
// Script-level env directives (special comments) — parse early
|
||||
// Supported:
|
||||
// // @env KEY=VALUE
|
||||
// // @jit-debug (preset: exec, threshold=1, events+trace)
|
||||
// // @plugin-builtins (NYASH_USE_PLUGIN_BUILTINS=1)
|
||||
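// Illustrative script header using these directives (values are placeholders):
//   // @env NYASH_CLI_VERBOSE=1
//   // @jit-debug
//   // @plugin-builtins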
if let Some(ref filename) = groups.input.file {
|
||||
if let Ok(code) = fs::read_to_string(filename) {
|
||||
// Apply script-level directives and lint
|
||||
let strict_fields =
|
||||
std::env::var("NYASH_FIELDS_TOP_STRICT").ok().as_deref() == Some("1");
|
||||
if let Err(e) = cli_directives::apply_cli_directives_from_source(
|
||||
&code,
|
||||
strict_fields,
|
||||
groups.debug.cli_verbose,
|
||||
) {
|
||||
eprintln!("❌ Lint/Directive error: {}", e);
|
||||
std::process::exit(1);
|
||||
}
|
||||
|
||||
// Env overrides for using rules
|
||||
// Merge late env overrides (if any)
|
||||
if let Ok(paths) = std::env::var("NYASH_USING_PATH") {
|
||||
for p in paths.split(':') {
|
||||
let p = p.trim();
|
||||
if !p.is_empty() {
|
||||
using_ctx.using_paths.push(p.to_string());
|
||||
}
|
||||
}
|
||||
}
|
||||
if let Ok(mods) = std::env::var("NYASH_MODULES") {
|
||||
for ent in mods.split(',') {
|
||||
if let Some((k, v)) = ent.split_once('=') {
|
||||
let k = k.trim();
|
||||
let v = v.trim();
|
||||
if !k.is_empty() && !v.is_empty() {
|
||||
using_ctx
|
||||
.pending_modules
|
||||
.push((k.to_string(), v.to_string()));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Apply pending modules to registry as StringBox (path or ns token)
|
||||
for (ns, path) in using_ctx.pending_modules.iter() {
|
||||
let sb = nyash_rust::box_trait::StringBox::new(path.clone());
|
||||
nyash_rust::runtime::modules_registry::set(ns.clone(), Box::new(sb));
|
||||
}
|
||||
// Resolve pending using with clear precedence and ambiguity handling
|
||||
let strict = std::env::var("NYASH_USING_STRICT").ok().as_deref() == Some("1");
|
||||
let verbose = crate::config::env::cli_verbose();
|
||||
let ctx = std::path::Path::new(filename).parent();
|
||||
for (ns, alias) in pending_using.iter() {
|
||||
let value = match resolve_using_target(
|
||||
ns,
|
||||
false,
|
||||
&using_ctx.pending_modules,
|
||||
&using_ctx.using_paths,
|
||||
&using_ctx.aliases,
|
||||
ctx,
|
||||
strict,
|
||||
verbose,
|
||||
) {
|
||||
Ok(v) => v,
|
||||
Err(e) => {
|
||||
eprintln!("❌ using: {}", e);
|
||||
std::process::exit(1);
|
||||
}
|
||||
};
|
||||
let sb = nyash_rust::box_trait::StringBox::new(value.clone());
|
||||
nyash_rust::runtime::modules_registry::set(ns.clone(), Box::new(sb));
|
||||
if let Some(a) = alias {
|
||||
let sb2 = nyash_rust::box_trait::StringBox::new(value);
|
||||
nyash_rust::runtime::modules_registry::set(a.clone(), Box::new(sb2));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// If strict mode requested via env, ensure handle-only shim behavior is enabled
|
||||
if std::env::var("NYASH_JIT_STRICT").ok().as_deref() == Some("1") {
|
||||
if std::env::var("NYASH_JIT_ARGS_HANDLE_ONLY").ok().is_none() {
|
||||
std::env::set_var("NYASH_JIT_ARGS_HANDLE_ONLY", "1");
|
||||
}
|
||||
// Enforce JIT-only by default in strict mode unless explicitly overridden
|
||||
if std::env::var("NYASH_JIT_ONLY").ok().is_none() {
|
||||
std::env::set_var("NYASH_JIT_ONLY", "1");
|
||||
}
|
||||
}
|
||||
|
||||
// 🏭 Phase 9.78b: Initialize unified registry
|
||||
runtime::init_global_unified_registry();
|
||||
|
||||
// Try to initialize BID plugins from nyash.toml (best-effort)
|
||||
// Allow disabling during snapshot/CI via NYASH_DISABLE_PLUGINS=1
|
||||
if std::env::var("NYASH_DISABLE_PLUGINS").ok().as_deref() != Some("1") {
|
||||
runner_plugin_init::init_bid_plugins();
|
||||
// Build BoxIndex after plugin host is initialized
|
||||
crate::runner::box_index::refresh_box_index();
|
||||
}
|
||||
// Allow interpreter to create plugin-backed boxes via unified registry
|
||||
// Opt-in by default for FileBox/TOMLBox which are required by ny-config and similar tools.
|
||||
if std::env::var("NYASH_USE_PLUGIN_BUILTINS").ok().is_none() {
|
||||
std::env::set_var("NYASH_USE_PLUGIN_BUILTINS", "1");
|
||||
}
|
||||
// Merge FileBox,TOMLBox with defaults if present
|
||||
let mut override_types: Vec<String> =
|
||||
if let Ok(list) = std::env::var("NYASH_PLUGIN_OVERRIDE_TYPES") {
|
||||
list.split(',')
|
||||
.map(|s| s.trim().to_string())
|
||||
.filter(|s| !s.is_empty())
|
||||
.collect()
|
||||
} else {
|
||||
vec!["ArrayBox".into(), "MapBox".into()]
|
||||
};
|
||||
for t in ["FileBox", "TOMLBox"] {
|
||||
if !override_types.iter().any(|x| x == t) {
|
||||
override_types.push(t.into());
|
||||
}
|
||||
}
|
||||
std::env::set_var("NYASH_PLUGIN_OVERRIDE_TYPES", override_types.join(","));
|
||||
|
||||
// Opt-in: load Ny script plugins listed in nyash.toml [ny_plugins]
|
||||
if groups.load_ny_plugins
|
||||
|| std::env::var("NYASH_LOAD_NY_PLUGINS").ok().as_deref() == Some("1")
|
||||
{
|
||||
if let Ok(text) = std::fs::read_to_string("nyash.toml") {
|
||||
if let Ok(doc) = toml::from_str::<toml::Value>(&text) {
|
||||
if let Some(np) = doc.get("ny_plugins") {
|
||||
let mut list: Vec<String> = Vec::new();
|
||||
if let Some(arr) = np.as_array() {
|
||||
for v in arr {
|
||||
if let Some(s) = v.as_str() {
|
||||
list.push(s.to_string());
|
||||
}
|
||||
}
|
||||
} else if let Some(tbl) = np.as_table() {
|
||||
for (_k, v) in tbl {
|
||||
if let Some(s) = v.as_str() {
|
||||
list.push(s.to_string());
|
||||
} else if let Some(arr) = v.as_array() {
|
||||
for e in arr {
|
||||
if let Some(s) = e.as_str() {
|
||||
list.push(s.to_string());
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
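// Both nyash.toml shapes handled above (paths are placeholders):
//   ny_plugins = ["plugins/hello.nyash"]                      # array of script paths
//   [ny_plugins]
//   tools = ["plugins/a.nyash", "plugins/b.nyash"]            # table: string or array values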
if !list.is_empty() {
|
||||
let list_only =
|
||||
std::env::var("NYASH_NY_PLUGINS_LIST_ONLY").ok().as_deref()
|
||||
== Some("1");
|
||||
println!("🧩 Ny script plugins ({}):", list.len());
|
||||
for p in list {
|
||||
if list_only {
|
||||
println!(" • {}", p);
|
||||
continue;
|
||||
}
|
||||
// Execute each script best-effort via interpreter
|
||||
match std::fs::read_to_string(&p) {
|
||||
Ok(code) => {
|
||||
match nyash_rust::parser::NyashParser::parse_from_string(
|
||||
&code,
|
||||
) {
|
||||
Ok(ast) => {
|
||||
let mut interpreter =
|
||||
nyash_rust::interpreter::NyashInterpreter::new(
|
||||
);
|
||||
match interpreter.execute(ast) {
|
||||
Ok(_) => println!("[ny_plugins] {}: OK", p),
|
||||
Err(e) => {
|
||||
println!(
|
||||
"[ny_plugins] {}: FAIL ({})",
|
||||
p, e
|
||||
);
|
||||
// continue to next
|
||||
}
|
||||
}
|
||||
}
|
||||
Err(e) => {
|
||||
println!("[ny_plugins] {}: FAIL (parse: {})", p, e);
|
||||
}
|
||||
}
|
||||
}
|
||||
Err(e) => {
|
||||
println!("[ny_plugins] {}: FAIL (read: {})", p, e);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Optional: enable VM stats via CLI flags
|
||||
if groups.backend.vm_stats {
|
||||
std::env::set_var("NYASH_VM_STATS", "1");
|
||||
}
|
||||
if groups.backend.vm_stats_json {
|
||||
// Prefer explicit JSON flag over any default
|
||||
std::env::set_var("NYASH_VM_STATS_JSON", "1");
|
||||
}
|
||||
// Optional: JIT controls via CLI flags (centralized)
|
||||
{
|
||||
// CLI opt-in for JSONL events
|
||||
if groups.backend.jit.events {
|
||||
std::env::set_var("NYASH_JIT_EVENTS", "1");
|
||||
}
|
||||
if groups.backend.jit.events_compile {
|
||||
std::env::set_var("NYASH_JIT_EVENTS_COMPILE", "1");
|
||||
}
|
||||
if groups.backend.jit.events_runtime {
|
||||
std::env::set_var("NYASH_JIT_EVENTS_RUNTIME", "1");
|
||||
}
|
||||
if let Some(ref p) = groups.backend.jit.events_path {
|
||||
std::env::set_var("NYASH_JIT_EVENTS_PATH", p);
|
||||
}
|
||||
let mut jc = nyash_rust::jit::config::JitConfig::from_env();
|
||||
jc.exec |= groups.backend.jit.exec;
|
||||
jc.stats |= groups.backend.jit.stats;
|
||||
jc.stats_json |= groups.backend.jit.stats_json;
|
||||
jc.dump |= groups.backend.jit.dump;
|
||||
if groups.backend.jit.threshold.is_some() {
|
||||
jc.threshold = groups.backend.jit.threshold;
|
||||
}
|
||||
jc.phi_min |= groups.backend.jit.phi_min;
|
||||
jc.hostcall |= groups.backend.jit.hostcall;
|
||||
jc.handle_debug |= groups.backend.jit.handle_debug;
|
||||
jc.native_f64 |= groups.backend.jit.native_f64;
|
||||
jc.native_bool |= groups.backend.jit.native_bool;
|
||||
// If observability is enabled and no threshold is provided, force threshold=1 so lowering runs and emits events
|
||||
let events_on = std::env::var("NYASH_JIT_EVENTS").ok().as_deref() == Some("1")
|
||||
|| std::env::var("NYASH_JIT_EVENTS_COMPILE").ok().as_deref() == Some("1")
|
||||
|| std::env::var("NYASH_JIT_EVENTS_RUNTIME").ok().as_deref() == Some("1");
|
||||
if events_on && jc.threshold.is_none() {
|
||||
jc.threshold = Some(1);
|
||||
}
|
||||
if groups.backend.jit.only {
|
||||
std::env::set_var("NYASH_JIT_ONLY", "1");
|
||||
}
|
||||
// Apply runtime capability probe (e.g., disable b1 ABI if unsupported)
|
||||
let caps = nyash_rust::jit::config::probe_capabilities();
|
||||
jc = nyash_rust::jit::config::apply_runtime_caps(jc, caps);
|
||||
// Optional DOT emit via CLI (ensures dump is on when path specified)
|
||||
if let Some(path) = &groups.emit.emit_cfg {
|
||||
std::env::set_var("NYASH_JIT_DOT", path);
|
||||
jc.dump = true;
|
||||
}
|
||||
// Persist to env (CLI parity) and set as current
|
||||
jc.apply_env();
|
||||
nyash_rust::jit::config::set_current(jc.clone());
|
||||
}
|
||||
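// Net effect of the block above: setting any of NYASH_JIT_EVENTS, NYASH_JIT_EVENTS_COMPILE, or
// NYASH_JIT_EVENTS_RUNTIME to "1" without an explicit threshold forces jc.threshold = Some(1),
// so lowering runs at least once and events are actually emitted.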
// Architectural pivot: JIT is compiler-only (EXE/AOT). Ensure VM runtime does not dispatch to JIT
|
||||
// unless explicitly requested via independent JIT mode, or when emitting AOT objects.
|
||||
if !groups.compile_native && !groups.backend.jit.direct {
|
||||
// When AOT object emission is requested, allow JIT to run for object generation
|
||||
let aot_obj = std::env::var("NYASH_AOT_OBJECT_OUT").ok();
|
||||
if aot_obj.is_none() || aot_obj.as_deref() == Some("") {
|
||||
// Force-disable runtime JIT execution path for VM/Interpreter flows
|
||||
std::env::set_var("NYASH_JIT_EXEC", "0");
|
||||
}
|
||||
}
|
||||
// Benchmark mode - can run without a file
|
||||
if groups.benchmark {
|
||||
println!("📊 Nyash Performance Benchmark Suite");
|
||||
println!("====================================");
|
||||
println!("Running {} iterations per test...", groups.iterations);
|
||||
println!();
|
||||
#[cfg(feature = "vm-legacy")]
|
||||
{
|
||||
self.execute_benchmark_mode();
|
||||
return;
|
||||
}
|
||||
#[cfg(not(feature = "vm-legacy"))]
|
||||
{
|
||||
eprintln!(
|
||||
"❌ Benchmark mode requires VM backend. Rebuild with --features vm-legacy."
|
||||
);
|
||||
std::process::exit(1);
|
||||
}
|
||||
}
|
||||
|
||||
if let Some(ref filename) = groups.input.file {
|
||||
// Independent JIT direct mode (no VM execute path)
|
||||
if groups.backend.jit.direct {
|
||||
self.run_file_jit_direct(filename);
|
||||
return;
|
||||
}
|
||||
// Delegate file-mode execution to modes::common dispatcher
|
||||
self.run_file(filename);
|
||||
} else {
|
||||
demos::run_all_demos();
|
||||
}
|
||||
// New behavior-preserving delegator
|
||||
self.run_refactored();
|
||||
}
|
||||
|
||||
// init_bid_plugins moved to runner_plugin_init.rs
|
||||
@ -466,6 +74,223 @@ impl NyashRunner {
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(not(feature = "jit-direct-only"))]
|
||||
impl NyashRunner {
|
||||
/// New behavior-preserving refactor of run(): structured into smaller helpers
|
||||
fn run_refactored(&self) {
|
||||
// Early: macro child
|
||||
if let Some(ref macro_file) = self.config.macro_expand_child {
|
||||
crate::runner::modes::macro_child::run_macro_child(macro_file);
|
||||
return;
|
||||
}
|
||||
let groups = self.config.as_groups();
|
||||
// Early: build
|
||||
if let Some(cfg_path) = groups.build.path.clone() {
|
||||
if let Err(e) = self.run_build_mvp(&cfg_path) {
|
||||
eprintln!("❌ build error: {}", e);
|
||||
std::process::exit(1);
|
||||
}
|
||||
return;
|
||||
}
|
||||
// Preprocess usings and directives (includes dep-tree log)
|
||||
self.preprocess_usings_and_directives(&groups);
|
||||
// JSON v0 bridge
|
||||
if self.try_run_json_v0_pipe() { return; }
|
||||
// Named task
|
||||
if let Some(task) = groups.run_task.clone() {
|
||||
if let Err(e) = run_named_task(&task) {
|
||||
eprintln!("❌ Task error: {}", e);
|
||||
process::exit(1);
|
||||
}
|
||||
return;
|
||||
}
|
||||
// Common env + runtime/plugins
|
||||
self.apply_common_env(&groups);
|
||||
self.init_runtime_and_plugins(&groups);
|
||||
// Backend config + policy
|
||||
self.configure_backend(&groups);
|
||||
self.enforce_runtime_jit_policy(&groups);
|
||||
// Benchmark
|
||||
if self.maybe_run_benchmark(&groups) { return; }
|
||||
// Dispatch
|
||||
self.dispatch_entry(&groups);
|
||||
}
|
||||
|
||||
// ---- Helpers (extracted from original run) ----
|
||||
|
||||
fn preprocess_usings_and_directives(&self, groups: &crate::cli::CliGroups) {
|
||||
use pipeline::resolve_using_target;
|
||||
// Initialize UsingContext (defaults + nyash.toml + env)
|
||||
let mut using_ctx = self.init_using_context();
|
||||
// Collect CLI --using SPEC into (target, alias)
|
||||
let mut pending_using: Vec<(String, Option<String>)> = Vec::new();
|
||||
for spec in &groups.input.cli_usings {
|
||||
let s = spec.trim();
|
||||
if s.is_empty() { continue; }
|
||||
let (target, alias) = if let Some(pos) = s.find(" as ") {
|
||||
(s[..pos].trim().to_string(), Some(s[pos + 4..].trim().to_string()))
|
||||
} else { (s.to_string(), None) };
|
||||
let is_path = target.starts_with('"') || target.starts_with("./") || target.starts_with('/') || target.ends_with(".nyash");
|
||||
if is_path {
|
||||
let path = target.trim_matches('"').to_string();
|
||||
let name = alias.clone().unwrap_or_else(|| {
|
||||
std::path::Path::new(&path).file_stem().and_then(|s| s.to_str()).unwrap_or("module").to_string()
|
||||
});
|
||||
pending_using.push((name, Some(path)));
|
||||
} else {
|
||||
pending_using.push((target, alias));
|
||||
}
|
||||
}
|
||||
// Apply pending modules (from context) to registry as StringBox
|
||||
for (ns, path) in using_ctx.pending_modules.iter() {
|
||||
let sb = crate::box_trait::StringBox::new(path.clone());
|
||||
crate::runtime::modules_registry::set(ns.clone(), Box::new(sb));
|
||||
}
|
||||
// Optional dependency tree bridge (log-only)
|
||||
if let Ok(dep_path) = std::env::var("NYASH_DEPS_JSON") {
|
||||
match std::fs::read_to_string(&dep_path) {
|
||||
Ok(s) => {
|
||||
let bytes = s.as_bytes().len();
|
||||
let mut root_info = String::new();
|
||||
if let Ok(v) = serde_json::from_str::<serde_json::Value>(&s) {
|
||||
if let Some(r) = v.get("root_path").and_then(|x| x.as_str()) {
|
||||
root_info = format!(" root='{}'", r);
|
||||
}
|
||||
}
|
||||
crate::cli_v!("[deps] loaded {} bytes from{} {}", bytes, if root_info.is_empty() { "" } else { ":" }, root_info);
|
||||
}
|
||||
Err(e) => { crate::cli_v!("[deps] read error: {}", e); }
|
||||
}
|
||||
}
|
||||
// If a file is provided, apply script-level directives and late using/env merges
|
||||
if let Some(ref filename) = groups.input.file {
|
||||
if let Ok(code) = fs::read_to_string(filename) {
|
||||
// Apply directives and lint
|
||||
let strict_fields = std::env::var("NYASH_FIELDS_TOP_STRICT").ok().as_deref() == Some("1");
|
||||
if let Err(e) = cli_directives::apply_cli_directives_from_source(&code, strict_fields, groups.debug.cli_verbose) {
|
||||
eprintln!("❌ Lint/Directive error: {}", e);
|
||||
std::process::exit(1);
|
||||
}
|
||||
// Late env overrides (paths/modules)
|
||||
if let Ok(paths) = std::env::var("NYASH_USING_PATH") {
|
||||
for p in paths.split(':') { let p = p.trim(); if !p.is_empty() { using_ctx.using_paths.push(p.to_string()); } }
|
||||
}
|
||||
if let Ok(mods) = std::env::var("NYASH_MODULES") {
|
||||
for ent in mods.split(',') {
|
||||
if let Some((k, v)) = ent.split_once('=') {
|
||||
let k = k.trim(); let v = v.trim();
|
||||
if !k.is_empty() && !v.is_empty() { using_ctx.pending_modules.push((k.to_string(), v.to_string())); }
|
||||
}
|
||||
}
|
||||
}
|
||||
// Re-apply pending modules in case env added more (idempotent)
|
||||
for (ns, path) in using_ctx.pending_modules.iter() {
|
||||
let sb = nyash_rust::box_trait::StringBox::new(path.clone());
|
||||
nyash_rust::runtime::modules_registry::set(ns.clone(), Box::new(sb));
|
||||
}
|
||||
// Resolve CLI --using entries against context and register values (with aliasing)
|
||||
let strict = std::env::var("NYASH_USING_STRICT").ok().as_deref() == Some("1");
|
||||
let verbose = crate::config::env::cli_verbose();
|
||||
let ctx = std::path::Path::new(filename).parent();
|
||||
for (ns, alias) in pending_using.iter() {
|
||||
let value = match resolve_using_target(ns, false, &using_ctx.pending_modules, &using_ctx.using_paths, &using_ctx.aliases, ctx, strict, verbose) {
|
||||
Ok(v) => v,
|
||||
Err(e) => { eprintln!("❌ using: {}", e); std::process::exit(1); }
|
||||
};
|
||||
let sb = nyash_rust::box_trait::StringBox::new(value.clone());
|
||||
nyash_rust::runtime::modules_registry::set(ns.clone(), Box::new(sb));
|
||||
if let Some(a) = alias {
|
||||
let sb2 = nyash_rust::box_trait::StringBox::new(value);
|
||||
nyash_rust::runtime::modules_registry::set(a.clone(), Box::new(sb2));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
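// Resolution order applied above via resolve_using_target: registered modules first, then a path
// relative to the entry file's directory, then using-paths; with NYASH_USING_STRICT=1 an ambiguous
// match is reported as an error instead of silently picking one candidate.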
|
||||
/// Apply early environment toggles that affect CLI behavior and VM stats.
|
||||
/// Side effects: sets `NYASH_CLI_VERBOSE`, `NYASH_GC_MODE` when specified by CLI groups.
|
||||
fn apply_common_env(&self, groups: &crate::cli::CliGroups) {
|
||||
if groups.debug.cli_verbose { std::env::set_var("NYASH_CLI_VERBOSE", "1"); }
|
||||
if let Some(ref m) = groups.gc_mode { if !m.trim().is_empty() { std::env::set_var("NYASH_GC_MODE", m); } }
|
||||
}
|
||||
|
||||
// init_runtime_and_plugins moved to runner/plugins.rs
|
||||
|
||||
/// Configure backend knobs (VM/JIT) from CLI flags and env, merging with runtime capabilities.
|
||||
/// Side effects: writes environment variables for flags and applies JIT config via nyash_rust::jit::config.
|
||||
fn configure_backend(&self, groups: &crate::cli::CliGroups) {
|
||||
if groups.backend.vm_stats { std::env::set_var("NYASH_VM_STATS", "1"); }
|
||||
if groups.backend.vm_stats_json { std::env::set_var("NYASH_VM_STATS_JSON", "1"); }
|
||||
{
|
||||
if groups.backend.jit.events { std::env::set_var("NYASH_JIT_EVENTS", "1"); }
|
||||
if groups.backend.jit.events_compile { std::env::set_var("NYASH_JIT_EVENTS_COMPILE", "1"); }
|
||||
if groups.backend.jit.events_runtime { std::env::set_var("NYASH_JIT_EVENTS_RUNTIME", "1"); }
|
||||
if let Some(ref p) = groups.backend.jit.events_path { std::env::set_var("NYASH_JIT_EVENTS_PATH", p); }
|
||||
let mut jc = nyash_rust::jit::config::JitConfig::from_env();
|
||||
jc.exec |= groups.backend.jit.exec;
|
||||
jc.stats |= groups.backend.jit.stats;
|
||||
jc.stats_json |= groups.backend.jit.stats_json;
|
||||
jc.dump |= groups.backend.jit.dump;
|
||||
if groups.backend.jit.threshold.is_some() { jc.threshold = groups.backend.jit.threshold; }
|
||||
jc.phi_min |= groups.backend.jit.phi_min;
|
||||
jc.hostcall |= groups.backend.jit.hostcall;
|
||||
jc.handle_debug |= groups.backend.jit.handle_debug;
|
||||
jc.native_f64 |= groups.backend.jit.native_f64;
|
||||
jc.native_bool |= groups.backend.jit.native_bool;
|
||||
let events_on = std::env::var("NYASH_JIT_EVENTS").ok().as_deref() == Some("1")
|
||||
|| std::env::var("NYASH_JIT_EVENTS_COMPILE").ok().as_deref() == Some("1")
|
||||
|| std::env::var("NYASH_JIT_EVENTS_RUNTIME").ok().as_deref() == Some("1");
|
||||
if events_on && jc.threshold.is_none() { jc.threshold = Some(1); }
|
||||
if groups.backend.jit.only { std::env::set_var("NYASH_JIT_ONLY", "1"); }
|
||||
let caps = nyash_rust::jit::config::probe_capabilities();
|
||||
jc = nyash_rust::jit::config::apply_runtime_caps(jc, caps);
|
||||
if let Some(path) = &groups.emit.emit_cfg { std::env::set_var("NYASH_JIT_DOT", path); jc.dump = true; }
|
||||
jc.apply_env();
|
||||
nyash_rust::jit::config::set_current(jc.clone());
|
||||
}
|
||||
if std::env::var("NYASH_JIT_STRICT").ok().as_deref() == Some("1") {
|
||||
if std::env::var("NYASH_JIT_ARGS_HANDLE_ONLY").ok().is_none() { std::env::set_var("NYASH_JIT_ARGS_HANDLE_ONLY", "1"); }
|
||||
if std::env::var("NYASH_JIT_ONLY").ok().is_none() { std::env::set_var("NYASH_JIT_ONLY", "1"); }
|
||||
}
|
||||
}
|
||||
|
||||
/// Enforce runtime policy for JIT execution when AOT object output is absent.
|
||||
/// Side effects: may set `NYASH_JIT_EXEC=0` when policy requires.
|
||||
fn enforce_runtime_jit_policy(&self, groups: &crate::cli::CliGroups) {
|
||||
if !groups.compile_native && !groups.backend.jit.direct {
|
||||
let aot_obj = std::env::var("NYASH_AOT_OBJECT_OUT").ok();
|
||||
if aot_obj.is_none() || aot_obj.as_deref() == Some("") {
|
||||
std::env::set_var("NYASH_JIT_EXEC", "0");
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// Optionally run the benchmark suite and exit, depending on CLI flags.
|
||||
/// Returns true when a benchmark run occurred.
|
||||
fn maybe_run_benchmark(&self, groups: &crate::cli::CliGroups) -> bool {
|
||||
if groups.benchmark {
|
||||
println!("📊 Nyash Performance Benchmark Suite");
|
||||
println!("====================================");
|
||||
println!("Running {} iterations per test...", groups.iterations);
|
||||
println!();
|
||||
#[cfg(feature = "vm-legacy")]
|
||||
{ self.execute_benchmark_mode(); return true; }
|
||||
#[cfg(not(feature = "vm-legacy"))]
|
||||
{ eprintln!("❌ Benchmark mode requires VM backend. Rebuild with --features vm-legacy."); std::process::exit(1); }
|
||||
}
|
||||
false
|
||||
}
|
||||
|
||||
/// Final dispatch to selected execution mode (file/JIT-direct or demos).
|
||||
fn dispatch_entry(&self, groups: &crate::cli::CliGroups) {
|
||||
if let Some(ref filename) = groups.input.file {
|
||||
if groups.backend.jit.direct { self.run_file_jit_direct(filename); return; }
|
||||
self.run_file(filename);
|
||||
} else { demos::run_all_demos(); }
|
||||
}
|
||||
}
|
||||
|
||||
impl NyashRunner {
|
||||
/// Run a file through independent JIT engine (no VM execute loop)
|
||||
fn run_file_jit_direct(&self, filename: &str) {
|
||||
|
||||
@ -2,7 +2,6 @@ use super::super::NyashRunner;
|
||||
use crate::runner::json_v0_bridge;
|
||||
use nyash_rust::{parser::NyashParser, interpreter::NyashInterpreter};
|
||||
// Use the library crate's plugin init module rather than the bin crate root
|
||||
use nyash_rust::runner_plugin_init;
|
||||
use std::{fs, process};
|
||||
use std::io::Read;
|
||||
use std::process::Stdio;
|
||||
@ -15,113 +14,7 @@ use crate::cli_v;
|
||||
// (moved) suggest_in_base is now in runner/pipeline.rs
|
||||
|
||||
impl NyashRunner {
|
||||
/// File-mode dispatcher (thin wrapper around backend/mode selection)
|
||||
#[allow(dead_code)]
|
||||
pub(crate) fn run_file_legacy(&self, filename: &str) {
|
||||
// Phase-15.3: Ny compiler MVP (Ny -> JSON v0) behind env gate
|
||||
if std::env::var("NYASH_USE_NY_COMPILER").ok().as_deref() == Some("1") {
|
||||
if self.try_run_selfhost_pipeline(filename) {
|
||||
return;
|
||||
} else if crate::config::env::cli_verbose() {
|
||||
eprintln!("[ny-compiler] fallback to default path (MVP unavailable for this input)");
|
||||
}
|
||||
}
|
||||
// Direct v0 bridge when requested via CLI/env
|
||||
let groups = self.config.as_groups();
|
||||
let use_ny_parser = groups.parser.parser_ny || std::env::var("NYASH_USE_NY_PARSER").ok().as_deref() == Some("1");
|
||||
if use_ny_parser {
|
||||
let code = match fs::read_to_string(filename) {
|
||||
Ok(content) => content,
|
||||
Err(e) => { eprintln!("❌ Error reading file {}: {}", filename, e); process::exit(1); }
|
||||
};
|
||||
match json_v0_bridge::parse_source_v0_to_module(&code) {
|
||||
Ok(module) => {
|
||||
if crate::config::env::cli_verbose() {
|
||||
println!("🚀 Nyash MIR Interpreter - (parser=ny) Executing file: {} 🚀", filename);
|
||||
}
|
||||
self.execute_mir_module(&module);
|
||||
return;
|
||||
}
|
||||
Err(e) => { eprintln!("❌ Direct bridge parse error: {}", e); process::exit(1); }
|
||||
}
|
||||
}
|
||||
// AST dump mode
|
||||
if groups.debug.dump_ast {
|
||||
println!("🧠 Nyash AST Dump - Processing file: {}", filename);
|
||||
let code = match fs::read_to_string(filename) {
|
||||
Ok(content) => content,
|
||||
Err(e) => { eprintln!("❌ Error reading file {}: {}", filename, e); process::exit(1); }
|
||||
};
|
||||
let ast = match NyashParser::parse_from_string(&code) {
|
||||
Ok(ast) => ast,
|
||||
Err(e) => { eprintln!("❌ Parse error: {}", e); process::exit(1); }
|
||||
};
|
||||
println!("{:#?}", ast);
|
||||
return;
|
||||
}
|
||||
|
||||
// MIR dump/verify
|
||||
if groups.debug.dump_mir || groups.debug.verify_mir {
|
||||
crate::cli_v!("🚀 Nyash MIR Compiler - Processing file: {} 🚀", filename);
|
||||
self.execute_mir_mode(filename);
|
||||
return;
|
||||
}
|
||||
|
||||
// WASM / AOT (feature-gated)
|
||||
if groups.compile_wasm {
|
||||
#[cfg(feature = "wasm-backend")]
|
||||
{ self.execute_wasm_mode(filename); return; }
|
||||
#[cfg(not(feature = "wasm-backend"))]
|
||||
{ eprintln!("❌ WASM backend not available. Please rebuild with: cargo build --features wasm-backend"); process::exit(1); }
|
||||
}
|
||||
if groups.compile_native {
|
||||
#[cfg(feature = "cranelift-jit")]
|
||||
{ self.execute_aot_mode(filename); return; }
|
||||
#[cfg(not(feature = "cranelift-jit"))]
|
||||
{ eprintln!("❌ Native AOT compilation requires Cranelift. Please rebuild: cargo build --features cranelift-jit"); process::exit(1); }
|
||||
}
|
||||
|
||||
// Backend selection
|
||||
match groups.backend.backend.as_str() {
|
||||
"mir" => {
|
||||
crate::cli_v!("🚀 Nyash MIR Interpreter - Executing file: {} 🚀", filename);
|
||||
self.execute_mir_interpreter_mode(filename);
|
||||
}
|
||||
"vm" => {
|
||||
crate::cli_v!("🚀 Nyash VM Backend - Executing file: {} 🚀", filename);
|
||||
self.execute_vm_mode(filename);
|
||||
}
|
||||
"cranelift" => {
|
||||
#[cfg(feature = "cranelift-jit")]
|
||||
{
|
||||
crate::cli_v!("⚙️ Nyash Cranelift JIT - Executing file: {}", filename);
|
||||
self.execute_cranelift_mode(filename);
|
||||
}
|
||||
#[cfg(not(feature = "cranelift-jit"))]
|
||||
{
|
||||
eprintln!("❌ Cranelift backend not available. Please rebuild with: cargo build --features cranelift-jit");
|
||||
process::exit(1);
|
||||
}
|
||||
}
|
||||
"llvm" => {
|
||||
crate::cli_v!("⚡ Nyash LLVM Backend - Executing file: {} ⚡", filename);
|
||||
self.execute_llvm_mode(filename);
|
||||
}
|
||||
_ => {
|
||||
if cli_verbose() {
|
||||
println!("🦀 Nyash Rust Implementation - Executing file: {} 🦀", filename);
|
||||
let groups = self.config.as_groups();
|
||||
if let Some(fuel) = groups.debug.debug_fuel {
|
||||
println!("🔥 Debug fuel limit: {} iterations", fuel);
|
||||
} else {
|
||||
println!("🔥 Debug fuel limit: unlimited");
|
||||
}
|
||||
println!("====================================================");
|
||||
}
|
||||
self.execute_nyash_file(filename);
|
||||
}
|
||||
}
|
||||
}
|
||||
// legacy run_file_legacy removed (was commented out)
|
||||
|
||||
/// Helper: run PyVM harness over a MIR module, returning the exit code
|
||||
fn run_pyvm_harness(&self, module: &nyash_rust::mir::MirModule, tag: &str) -> Result<i32, String> {
|
||||
@ -137,356 +30,15 @@ impl NyashRunner {
|
||||
/// Phase-15.3: Attempt Ny compiler pipeline (Ny -> JSON v0 via Ny program), then execute MIR
|
||||
pub(crate) fn try_run_ny_compiler_pipeline(&self, filename: &str) -> bool {
|
||||
// Delegate to centralized selfhost pipeline to avoid drift
|
||||
return self.try_run_selfhost_pipeline(filename);
|
||||
use std::io::Write;
|
||||
// Read input source
|
||||
let code = match fs::read_to_string(filename) {
|
||||
Ok(c) => c,
|
||||
Err(e) => { eprintln!("[ny-compiler] read error: {}", e); return false; }
|
||||
};
|
||||
// Optional Phase-15: strip `using` lines and register modules (same policy as execute_nyash_file)
|
||||
let enable_using = crate::config::env::enable_using();
|
||||
let mut code_ref: std::borrow::Cow<'_, str> = std::borrow::Cow::Borrowed(&code);
|
||||
if enable_using {
|
||||
let mut out = String::with_capacity(code.len());
|
||||
let mut used_names: Vec<(String, Option<String>)> = Vec::new();
|
||||
for line in code.lines() {
|
||||
let t = line.trim_start();
|
||||
if t.starts_with("using ") {
|
||||
cli_v!("[using] stripped(line→selfhost): {}", line);
|
||||
let rest0 = t.strip_prefix("using ").unwrap().trim();
|
||||
let rest0 = rest0.strip_suffix(';').unwrap_or(rest0).trim();
|
||||
let (target, alias) = if let Some(pos) = rest0.find(" as ") {
|
||||
(rest0[..pos].trim().to_string(), Some(rest0[pos+4..].trim().to_string()))
|
||||
} else { (rest0.to_string(), None) };
|
||||
let is_path = target.starts_with('"') || target.starts_with("./") || target.starts_with('/') || target.ends_with(".nyash");
|
||||
if is_path {
|
||||
let path = target.trim_matches('"').to_string();
|
||||
let name = alias.clone().unwrap_or_else(|| {
|
||||
std::path::Path::new(&path).file_stem().and_then(|s| s.to_str()).unwrap_or("module").to_string()
|
||||
});
|
||||
used_names.push((name, Some(path)));
|
||||
} else {
|
||||
used_names.push((target, alias));
|
||||
}
|
||||
continue;
|
||||
}
|
||||
out.push_str(line);
|
||||
out.push('\n');
|
||||
}
|
||||
// Register modules into minimal registry with best-effort path resolution
|
||||
for (ns_or_alias, alias_or_path) in used_names {
|
||||
if let Some(path) = alias_or_path {
|
||||
let sb = crate::box_trait::StringBox::new(path);
|
||||
crate::runtime::modules_registry::set(ns_or_alias, Box::new(sb));
|
||||
} else {
|
||||
let rel = format!("apps/{}.nyash", ns_or_alias.replace('.', "/"));
|
||||
let exists = std::path::Path::new(&rel).exists();
|
||||
let path_or_ns = if exists { rel } else { ns_or_alias.clone() };
|
||||
let sb = crate::box_trait::StringBox::new(path_or_ns);
|
||||
crate::runtime::modules_registry::set(ns_or_alias, Box::new(sb));
|
||||
}
|
||||
}
|
||||
code_ref = std::borrow::Cow::Owned(out);
|
||||
}
|
||||
|
||||
// Write to tmp/ny_parser_input.ny (as expected by Ny parser v0), unless forced to reuse existing tmp
|
||||
let use_tmp_only = crate::config::env::ny_compiler_use_tmp_only();
|
||||
let tmp_dir = std::path::Path::new("tmp");
|
||||
if let Err(e) = std::fs::create_dir_all(tmp_dir) {
|
||||
eprintln!("[ny-compiler] mkdir tmp failed: {}", e);
|
||||
return false;
|
||||
}
|
||||
let tmp_path = tmp_dir.join("ny_parser_input.ny");
|
||||
if !use_tmp_only {
|
||||
match std::fs::File::create(&tmp_path) {
|
||||
Ok(mut f) => {
|
||||
if let Err(e) = f.write_all(code_ref.as_bytes()) {
|
||||
eprintln!("[ny-compiler] write tmp failed: {}", e);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
Err(e) => { eprintln!("[ny-compiler] open tmp failed: {}", e); return false; }
|
||||
}
|
||||
}
|
||||
// EXE-first: if requested, try external parser EXE (nyash_compiler)
|
||||
if crate::config::env::use_ny_compiler_exe() {
|
||||
// Resolve parser EXE path
|
||||
let exe_path = if let Some(p) = crate::config::env::ny_compiler_exe_path() {
|
||||
std::path::PathBuf::from(p)
|
||||
} else {
|
||||
let mut p = std::path::PathBuf::from("dist/nyash_compiler");
|
||||
#[cfg(windows)]
|
||||
{ p.push("nyash_compiler.exe"); }
|
||||
#[cfg(not(windows))]
|
||||
{ p.push("nyash_compiler"); }
|
||||
if !p.exists() {
|
||||
// Try PATH
|
||||
if let Ok(w) = which::which("nyash_compiler") { w } else { p }
|
||||
} else { p }
|
||||
};
|
||||
if exe_path.exists() {
|
||||
let mut cmd = std::process::Command::new(&exe_path);
|
||||
// Prefer passing the original filename directly (parser EXE accepts positional path)
|
||||
cmd.arg(filename);
|
||||
// Gates
|
||||
if crate::config::env::ny_compiler_min_json() { cmd.arg("--min-json"); }
|
||||
if crate::config::env::selfhost_read_tmp() { cmd.arg("--read-tmp"); }
|
||||
if let Some(raw) = crate::config::env::ny_compiler_child_args() { for tok in raw.split_whitespace() { cmd.arg(tok); } }
|
||||
let timeout_ms: u64 = crate::config::env::ny_compiler_timeout_ms();
|
||||
let out = match super::common_util::io::spawn_with_timeout(cmd, timeout_ms) {
|
||||
Ok(o) => o,
|
||||
Err(e) => { eprintln!("[ny-compiler] exe spawn failed: {}", e); return false; }
|
||||
};
|
||||
if out.timed_out {
|
||||
let head = String::from_utf8_lossy(&out.stdout).chars().take(200).collect::<String>();
|
||||
eprintln!("[ny-compiler] exe timeout after {} ms; stdout(head)='{}'", timeout_ms, head.replace('\n', "\\n"));
|
||||
return false;
|
||||
}
|
||||
let stdout = match String::from_utf8(out.stdout) { Ok(s) => s, Err(_) => String::new() };
|
||||
let mut json_line = String::new();
|
||||
for line in stdout.lines() { let t = line.trim(); if t.starts_with('{') && t.contains("\"version\"") && t.contains("\"kind\"") { json_line = t.to_string(); break; } }
|
||||
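// Illustrative stdout line this scan accepts (field values are assumptions; only a '{'-prefixed
// line containing the "version" and "kind" keys is checked):
//   {"version":"v0","kind":"module","functions":[ ... ]}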
if json_line.is_empty() {
|
||||
if crate::config::env::cli_verbose() {
|
||||
let head: String = stdout.chars().take(200).collect();
|
||||
let errh: String = String::from_utf8_lossy(&out.stderr).chars().take(200).collect();
|
||||
eprintln!("[ny-compiler] exe produced no JSON; stdout(head)='{}' stderr(head)='{}'", head.replace('\n', "\\n"), errh.replace('\n', "\\n"));
|
||||
}
|
||||
return false;
|
||||
}
|
||||
// Parse JSON v0 → MIR module
|
||||
match json_v0_bridge::parse_json_v0_to_module(&json_line) {
|
||||
Ok(module) => {
|
||||
println!("🚀 Ny compiler EXE path (ny→json_v0) ON");
|
||||
json_v0_bridge::maybe_dump_mir(&module);
|
||||
let emit_only = crate::config::env::ny_compiler_emit_only();
|
||||
if emit_only {
|
||||
return false;
|
||||
} else {
|
||||
// Prefer PyVM when requested (reference semantics), regardless of BoxCall presence
|
||||
let prefer_pyvm = crate::config::env::vm_use_py();
|
||||
if prefer_pyvm {
|
||||
if let Ok(py3) = which::which("python3") {
|
||||
let runner = std::path::Path::new("tools/pyvm_runner.py");
|
||||
if runner.exists() {
|
||||
let tmp_dir = std::path::Path::new("tmp");
|
||||
let _ = std::fs::create_dir_all(tmp_dir);
|
||||
let mir_json_path = tmp_dir.join("nyash_pyvm_mir.json");
|
||||
if let Err(e) = crate::runner::mir_json_emit::emit_mir_json_for_harness_bin(&module, &mir_json_path) {
|
||||
eprintln!("❌ PyVM MIR JSON emit error: {}", e);
|
||||
return true; // prevent double-run fallback
|
||||
}
|
||||
let code = self.run_pyvm_harness(&module, "exe").unwrap_or(1);
|
||||
println!("Result: {}", code);
|
||||
std::process::exit(code);
|
||||
} else {
|
||||
eprintln!("❌ PyVM runner not found: {}", runner.display());
|
||||
std::process::exit(1);
|
||||
}
|
||||
} else {
|
||||
eprintln!("❌ python3 not found in PATH. Install Python 3 to use PyVM.");
|
||||
std::process::exit(1);
|
||||
}
|
||||
}
|
||||
// Default: execute via built-in MIR interpreter
|
||||
self.execute_mir_module(&module);
|
||||
return true;
|
||||
}
|
||||
}
|
||||
Err(e) => { eprintln!("[ny-compiler] JSON parse failed (exe): {}", e); return false; }
|
||||
}
|
||||
} else {
|
||||
if crate::config::env::cli_verbose() { eprintln!("[ny-compiler] exe not found at {}", exe_path.display()); }
|
||||
}
|
||||
}
|
||||
|
||||
// Locate current exe to invoke Ny VM for the Ny parser program
|
||||
let exe = match std::env::current_exe() {
|
||||
Ok(p) => p,
|
||||
Err(e) => { eprintln!("[ny-compiler] current_exe failed: {}", e); return false; }
|
||||
};
|
||||
// Select selfhost compiler entry
|
||||
// NYASH_NY_COMPILER_PREF=legacy|new|auto (default auto: prefer new when exists)
|
||||
let cand_new = std::path::Path::new("apps/selfhost/compiler/compiler.nyash");
|
||||
let cand_old = std::path::Path::new("apps/selfhost/parser/ny_parser_v0/main.nyash");
|
||||
let pref = std::env::var("NYASH_NY_COMPILER_PREF").ok();
|
||||
let parser_prog = match pref.as_deref() {
|
||||
Some("legacy") => cand_old,
|
||||
Some("new") => cand_new,
|
||||
_ => if cand_new.exists() { cand_new } else { cand_old },
|
||||
};
|
||||
if !parser_prog.exists() { eprintln!("[ny-compiler] compiler program not found: {}", parser_prog.display()); return false; }
|
||||
let mut cmd = std::process::Command::new(exe);
|
||||
cmd.arg("--backend").arg("vm").arg(parser_prog);
|
||||
// Gate: pass script args to child parser program
|
||||
// - NYASH_NY_COMPILER_MIN_JSON=1 → "-- --min-json"
|
||||
// - NYASH_SELFHOST_READ_TMP=1 → "-- --read-tmp"
|
||||
// - NYASH_NY_COMPILER_CHILD_ARGS: additional raw args (split by whitespace)
|
||||
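// Illustrative child invocation when both gates are "1" (the binary is whatever
// std::env::current_exe() resolved to above; the compiler path is the selected candidate):
//   <nyash> --backend vm apps/selfhost/compiler/compiler.nyash -- --min-json --read-tmp
// Extra NYASH_NY_COMPILER_CHILD_ARGS tokens are appended after the same "--" separator.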
let min_json = std::env::var("NYASH_NY_COMPILER_MIN_JSON").ok().unwrap_or_else(|| "0".to_string());
|
||||
let mut inserted_sep = false;
|
||||
if min_json == "1" {
|
||||
cmd.arg("--").arg("--min-json");
|
||||
inserted_sep = true;
|
||||
}
|
||||
if std::env::var("NYASH_SELFHOST_READ_TMP").ok().as_deref() == Some("1") {
|
||||
if !inserted_sep { cmd.arg("--"); inserted_sep = true; }
|
||||
cmd.arg("--read-tmp");
|
||||
}
|
||||
if let Ok(raw) = std::env::var("NYASH_NY_COMPILER_CHILD_ARGS") {
|
||||
if !inserted_sep { cmd.arg("--"); inserted_sep = true; }
|
||||
for tok in raw.split_whitespace() { cmd.arg(tok); }
|
||||
}
|
||||
// Propagate minimal env; prefer stdlib over plugins in child for stable stdout
|
||||
cmd.env_remove("NYASH_USE_NY_COMPILER");
|
||||
cmd.env_remove("NYASH_CLI_VERBOSE");
|
||||
cmd.env("NYASH_DISABLE_PLUGINS", "1");
|
||||
cmd.env_remove("NYASH_USE_PLUGIN_BUILTINS");
|
||||
// Suppress parent runner's result printing in child
|
||||
cmd.env("NYASH_JSON_ONLY", "1");
|
||||
// Prefer PyVM in child to ensure println/externcall are printed to stdout deterministically
|
||||
cmd.env("NYASH_VM_USE_PY", "1");
|
||||
// Propagate optional gates to child (if present)
|
||||
if let Ok(v) = std::env::var("NYASH_JSON_INCLUDE_USINGS") { cmd.env("NYASH_JSON_INCLUDE_USINGS", v); }
|
||||
if let Ok(v) = std::env::var("NYASH_ENABLE_USING") { cmd.env("NYASH_ENABLE_USING", v); }
|
||||
if let Ok(v) = std::env::var("NYASH_ENABLE_USING") { cmd.env("NYASH_ENABLE_USING", v); }
|
||||
// Child timeout guard (Hotfix for potential infinite loop in child Ny parser)
|
||||
// Config: NYASH_NY_COMPILER_TIMEOUT_MS (default 2000ms)
|
||||
let timeout_ms: u64 = std::env::var("NYASH_NY_COMPILER_TIMEOUT_MS")
|
||||
.ok()
|
||||
.and_then(|s| s.parse::<u64>().ok())
|
||||
.unwrap_or(2000);
|
||||
let out = match super::common_util::io::spawn_with_timeout(cmd, timeout_ms) {
|
||||
Ok(o) => o,
|
||||
Err(e) => { eprintln!("[ny-compiler] spawn failed: {}", e); return false; }
|
||||
};
|
||||
if out.timed_out {
|
||||
let head = String::from_utf8_lossy(&out.stdout).chars().take(200).collect::<String>();
|
||||
eprintln!("[ny-compiler] child timeout after {} ms; stdout(head)='{}'", timeout_ms, head.replace('\n', "\\n"));
|
||||
}
|
||||
let stdout = match String::from_utf8(out.stdout.clone()) { Ok(s) => s, Err(_) => String::new() };
|
||||
if out.timed_out {
// Fall back path will be taken below when json_line remains empty
} else if let Ok(s) = String::from_utf8(out.stderr.clone()) {
|
||||
// If the child exited non-zero and printed stderr, surface it and fallback
|
||||
// We cannot easily access ExitStatus here after try_wait loop; rely on JSON detection path.
|
||||
if s.trim().len() > 0 && crate::config::env::cli_verbose() {
|
||||
eprintln!("[ny-compiler] parser stderr:\n{}", s);
|
||||
}
|
||||
}
|
||||
let mut json_line = String::new();
|
||||
for line in stdout.lines() {
|
||||
let t = line.trim();
|
||||
if t.starts_with('{') && t.contains("\"version\"") && t.contains("\"kind\"") { json_line = t.to_string(); break; }
|
||||
}
|
||||
if json_line.is_empty() {
|
||||
// Fallback: try Python MVP parser to produce JSON v0 from the same tmp source (unless skipped).
|
||||
if crate::config::env::cli_verbose() {
|
||||
let head: String = stdout.chars().take(200).collect();
|
||||
cli_v!("[ny-compiler] JSON not found in child stdout (head): {}", head.replace('\\n', "\\n"));
|
||||
cli_v!("[ny-compiler] falling back to tools/ny_parser_mvp.py for this input");
|
||||
}
|
||||
if std::env::var("NYASH_NY_COMPILER_SKIP_PY").ok().as_deref() != Some("1") {
|
||||
let py = which::which("python3").ok();
|
||||
if let Some(py3) = py {
|
||||
let script = std::path::Path::new("tools/ny_parser_mvp.py");
|
||||
if script.exists() {
|
||||
let out2 = std::process::Command::new(py3)
|
||||
.arg(script)
|
||||
.arg(tmp_path.as_os_str())
|
||||
.output();
|
||||
match out2 {
|
||||
Ok(o2) if o2.status.success() => {
|
||||
if let Ok(s2) = String::from_utf8(o2.stdout) {
|
||||
// pick the first JSON-ish line
|
||||
for line in s2.lines() {
|
||||
let t = line.trim();
|
||||
if t.starts_with('{') && t.contains("\"version\"") && t.contains("\"kind\"") { json_line = t.to_string(); break; }
|
||||
}
|
||||
}
|
||||
}
|
||||
Ok(o2) => {
|
||||
let msg = String::from_utf8_lossy(&o2.stderr);
|
||||
eprintln!("[ny-compiler] python parser failed: {}", msg);
|
||||
}
|
||||
Err(e2) => {
|
||||
eprintln!("[ny-compiler] spawn python3 failed: {}", e2);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
}
|
||||
if json_line.is_empty() { return false; }
|
||||
}
|
||||
// Parse JSON v0 → MIR module
|
||||
match json_v0_bridge::parse_json_v0_to_module(&json_line) {
|
||||
Ok(module) => {
|
||||
let emit_only_default = "1".to_string();
|
||||
let emit_only = if emit_only_default == "1" { true } else { crate::config::env::ny_compiler_emit_only() };
|
||||
println!("🚀 Ny compiler MVP (ny→json_v0) path ON");
|
||||
json_v0_bridge::maybe_dump_mir(&module);
|
||||
if emit_only {
|
||||
// Do not execute; fall back to default path to keep final Result unaffected (Stage‑1 policy)
|
||||
false
|
||||
} else {
|
||||
// Prefer PyVM when requested (reference semantics)
|
||||
let prefer_pyvm = crate::config::env::vm_use_py();
|
||||
if prefer_pyvm {
|
||||
if let Ok(py3) = which::which("python3") {
|
||||
let runner = std::path::Path::new("tools/pyvm_runner.py");
|
||||
if runner.exists() {
|
||||
let tmp_dir = std::path::Path::new("tmp");
|
||||
let _ = std::fs::create_dir_all(tmp_dir);
|
||||
let mir_json_path = tmp_dir.join("nyash_pyvm_mir.json");
|
||||
if let Err(e) = crate::runner::mir_json_emit::emit_mir_json_for_harness_bin(&module, &mir_json_path) {
|
||||
eprintln!("❌ PyVM MIR JSON emit error: {}", e);
|
||||
return true; // prevent double-run fallback
|
||||
}
|
||||
if crate::config::env::cli_verbose() {
|
||||
eprintln!("[ny-compiler] using PyVM (mvp) → {}", mir_json_path.display());
|
||||
}
|
||||
// Determine entry function (prefer Main.main; top-level main only if allowed)
|
||||
let allow_top = crate::config::env::entry_allow_toplevel_main();
|
||||
let entry = if module.functions.contains_key("Main.main") {
|
||||
"Main.main"
|
||||
} else if allow_top && module.functions.contains_key("main") {
|
||||
"main"
|
||||
} else if module.functions.contains_key("main") {
|
||||
eprintln!("[entry] Warning: using top-level 'main' without explicit allow; set NYASH_ENTRY_ALLOW_TOPLEVEL_MAIN=1 to silence.");
|
||||
"main"
|
||||
} else {
|
||||
"Main.main"
|
||||
};
|
||||
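// Entry selection above, summarized: prefer "Main.main"; fall back to top-level "main" (silently
// when NYASH_ENTRY_ALLOW_TOPLEVEL_MAIN=1, with a warning otherwise); default to "Main.main" when
// neither function exists in the module.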
let code = self.run_pyvm_harness(&module, "mvp").unwrap_or(1);
|
||||
println!("Result: {}", code);
|
||||
std::process::exit(code);
|
||||
} else {
|
||||
eprintln!("❌ PyVM runner not found: {}", runner.display());
|
||||
std::process::exit(1);
|
||||
}
|
||||
} else {
|
||||
eprintln!("❌ python3 not found in PATH. Install Python 3 to use PyVM.");
|
||||
std::process::exit(1);
|
||||
}
|
||||
}
|
||||
// Default: execute via MIR interpreter
|
||||
self.execute_mir_module(&module);
|
||||
true
|
||||
}
|
||||
}
|
||||
Err(e) => {
|
||||
eprintln!("[ny-compiler] JSON parse failed: {}", e);
|
||||
false
|
||||
}
|
||||
}
|
||||
self.try_run_selfhost_pipeline(filename)
|
||||
}
|
||||
|
||||
/// Execute Nyash file with interpreter (common helper)
|
||||
pub(crate) fn execute_nyash_file(&self, filename: &str) {
|
||||
let quiet_pipe = std::env::var("NYASH_JSON_ONLY").ok().as_deref() == Some("1");
|
||||
// Ensure plugin host and provider mappings are initialized (idempotent)
|
||||
if std::env::var("NYASH_DISABLE_PLUGINS").ok().as_deref() != Some("1") {
|
||||
// Call via lib crate to avoid referring to the bin crate root
|
||||
runner_plugin_init::init_bid_plugins();
|
||||
}
|
||||
// Ensure runtime and plugins are initialized via unified helper (idempotent)
|
||||
let groups = self.config.as_groups();
|
||||
self.init_runtime_and_plugins(&groups);
|
||||
// Read the file
|
||||
let code = match fs::read_to_string(filename) {
|
||||
Ok(content) => content,
|
||||
|
||||
12
src/runner/modes/common_util/resolve/mod.rs
Normal file
12
src/runner/modes/common_util/resolve/mod.rs
Normal file
@ -0,0 +1,12 @@
|
||||
/*!
|
||||
* Using resolver utilities (split)
|
||||
* - strip: remove `using` lines, inline modules, register aliases/modules
|
||||
* - seam: seam logging and optional brace-fix at join points
|
||||
*/
|
||||
|
||||
pub mod strip;
|
||||
pub mod seam;
|
||||
|
||||
// Public re-exports to preserve existing call sites
|
||||
pub use strip::{strip_using_and_register, preexpand_at_local};
|
||||
|
||||
84
src/runner/modes/common_util/resolve/seam.rs
Normal file
@ -0,0 +1,84 @@
|
||||
/// Log tail of inlined prelude chunk for seam inspection.
|
||||
pub fn log_inlined_tail(path_key: &str, inlined_text: &str, seam_dbg: bool) {
|
||||
if !seam_dbg { return; }
|
||||
let tail = inlined_text
|
||||
.chars()
|
||||
.rev()
|
||||
.take(120)
|
||||
.collect::<String>()
|
||||
.chars()
|
||||
.rev()
|
||||
.collect::<String>();
|
||||
eprintln!(
|
||||
"[using][seam][inlined] {} tail=<<<{}>>>",
|
||||
path_key,
|
||||
tail.replace('\n', "\\n")
|
||||
);
|
||||
}
|
||||
|
||||
/// Log the seam between prelude and body for quick visual diff.
|
||||
pub fn log_prelude_body_seam(prelude_clean: &str, body: &str, seam_dbg: bool) {
|
||||
if !seam_dbg { return; }
|
||||
let tail = prelude_clean
|
||||
.chars()
|
||||
.rev()
|
||||
.take(160)
|
||||
.collect::<String>()
|
||||
.chars()
|
||||
.rev()
|
||||
.collect::<String>();
|
||||
let head = body.chars().take(160).collect::<String>();
|
||||
eprintln!("[using][seam] prelude_tail=<<<{}>>>", tail.replace('\n', "\\n"));
|
||||
eprintln!("[using][seam] body_head =<<<{}>>>", head.replace('\n', "\\n"));
|
||||
}
|
||||
|
||||
/// Apply optional seam safety: append missing '}' for unmatched '{' in prelude
|
||||
/// When `trace` is true, emits a short note with delta count.
|
||||
pub fn fix_prelude_braces_if_enabled(prelude_clean: &str, combined: &mut String, trace: bool) {
|
||||
if std::env::var("NYASH_RESOLVE_FIX_BRACES").ok().as_deref() != Some("1") {
|
||||
return;
|
||||
}
|
||||
// compute { } delta ignoring strings and comments
|
||||
let mut delta: i32 = 0;
|
||||
let mut it = prelude_clean.chars().peekable();
|
||||
let mut in_str = false;
|
||||
let mut in_sl = false;
|
||||
let mut in_ml = false;
|
||||
while let Some(c) = it.next() {
|
||||
if in_sl {
|
||||
if c == '\n' { in_sl = false; }
|
||||
continue;
|
||||
}
|
||||
if in_ml {
|
||||
if c == '*' {
|
||||
if let Some('/') = it.peek().copied() {
|
||||
it.next();
|
||||
in_ml = false;
|
||||
}
|
||||
}
|
||||
continue;
|
||||
}
|
||||
if in_str {
|
||||
if c == '\\' { it.next(); continue; }
|
||||
if c == '"' { in_str = false; }
|
||||
continue;
|
||||
}
|
||||
if c == '"' { in_str = true; continue; }
|
||||
if c == '/' {
|
||||
match it.peek().copied() {
|
||||
Some('/') => { in_sl = true; it.next(); continue; }
|
||||
Some('*') => { in_ml = true; it.next(); continue; }
|
||||
_ => {}
|
||||
}
|
||||
}
|
||||
if c == '{' { delta += 1; }
|
||||
if c == '}' { delta -= 1; }
|
||||
}
|
||||
if delta > 0 {
|
||||
if trace { eprintln!("[using][seam] fix: appending {} '}}' before body", delta); }
|
||||
for _ in 0..delta {
|
||||
combined.push('}');
|
||||
combined.push('\n');
|
||||
}
|
||||
}
|
||||
}
|
||||
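// Minimal usage sketch for the brace fixer above (assumes NYASH_RESOLVE_FIX_BRACES=1 and an
// illustrative prelude with one unmatched '{'; braces inside strings/comments are ignored):
//   let prelude = "box Demo {\n// body truncated at the seam\n";
//   let mut combined = String::from(prelude);
//   fix_prelude_braces_if_enabled(prelude, &mut combined, /*trace=*/ false);
//   assert!(combined.ends_with("}\n")); // delta of 1 → one '}' appended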
@ -19,8 +19,7 @@ pub fn strip_using_and_register(
let dedup_fn = std::env::var("NYASH_RESOLVE_DEDUP_FN").ok().as_deref() == Some("1");
let seam_dbg = std::env::var("NYASH_RESOLVE_SEAM_DEBUG").ok().as_deref() == Some("1");
let mut cmd = std::process::Command::new("python3");
cmd.arg("tools/using_combine.py")
.arg("--entry").arg(filename);
cmd.arg("tools/using_combine.py").arg("--entry").arg(filename);
if fix_braces { cmd.arg("--fix-braces"); }
if dedup_box { cmd.arg("--dedup-box"); }
if dedup_fn { cmd.arg("--dedup-fn"); }
@ -35,11 +34,10 @@ pub fn strip_using_and_register(
return Err(format!("using combiner failed: {}", err));
}
}
Err(e) => {
return Err(format!("using combiner spawn error: {}", e));
}
Err(e) => return Err(format!("using combiner spawn error: {}", e)),
}
}

fn strip_and_inline(
runner: &NyashRunner,
code: &str,
@ -57,21 +55,12 @@ pub fn strip_using_and_register(
let rest0 = rest0.strip_suffix(';').unwrap_or(rest0).trim();
let (target, alias) = if let Some(pos) = rest0.find(" as ") {
(rest0[..pos].trim().to_string(), Some(rest0[pos + 4..].trim().to_string()))
} else {
(rest0.to_string(), None)
};
let is_path = target.starts_with('"')
|| target.starts_with("./")
|| target.starts_with('/')
|| target.ends_with(".nyash");
} else { (rest0.to_string(), None) };
let is_path = target.starts_with('"') || target.starts_with("./") || target.starts_with('/') || target.ends_with(".nyash");
if is_path {
let path = target.trim_matches('"').to_string();
let name = alias.clone().unwrap_or_else(|| {
std::path::Path::new(&path)
.file_stem()
.and_then(|s| s.to_str())
.unwrap_or("module")
.to_string()
std::path::Path::new(&path).file_stem().and_then(|s| s.to_str()).unwrap_or("module").to_string()
});
used.push((name, Some(path)));
} else {
@ -98,8 +87,8 @@ pub fn strip_using_and_register(
}
for (ns, alias_opt) in used {
// Two forms:
// - using path "..." [as Alias] → handled earlier (stored as (name, Some(path)))
// - using namespace.with.dots [as Alias] → resolve ns → register alias → inline
// - using path "..." [as Alias]
// - using namespace.with.dots [as Alias]
let resolved_path = if let Some(alias) = alias_opt {
// alias case: resolve namespace to a concrete path
let mut found: Option<String> = using_ctx
@ -109,39 +98,7 @@ pub fn strip_using_and_register(
.map(|(_, p)| p.clone());
if trace {
if let Some(f) = &found {
eprintln!("[using] hit modules: {} -> {}", ns, f);
} else {
eprintln!("[using] miss modules: {}", ns);
}
}
if found.is_none() {
if let Ok(text) = std::fs::read_to_string("nyash.toml") {
if let Ok(doc) = toml::from_str::<toml::Value>(&text) {
if let Some(mut cur) = doc.get("modules").and_then(|v| v.as_table()) {
let mut segs = ns.split('.').peekable();
let mut hit: Option<String> = None;
while let Some(seg) = segs.next() {
if let Some(next) = cur.get(seg) {
if let Some(t) = next.as_table() {
cur = t;
continue;
}
if segs.peek().is_none() {
if let Some(s) = next.as_str() {
hit = Some(s.to_string());
}
}
}
break;
}
if hit.is_some() {
if trace {
eprintln!("[using] hit nyash.toml: {} -> {}", ns, hit.as_ref().unwrap());
}
found = hit;
}
}
}
eprintln!("[using/resolve] alias '{}' -> '{}'", ns, f);
}
}
if found.is_none() {
@ -157,11 +114,7 @@ pub fn strip_using_and_register(
) {
Ok(v) => {
// Treat unchanged token (namespace) as unresolved
if v == ns {
found = None;
} else {
found = Some(v)
}
if v == ns { found = None; } else { found = Some(v) }
}
Err(e) => return Err(format!("using: {}", e)),
}
@ -199,50 +152,34 @@ pub fn strip_using_and_register(
// Resolve relative to current file dir
// Guard: skip obvious namespace tokens (ns.ns without extension)
if (!path.contains('/') && !path.contains('\\')) && !path.ends_with(".nyash") && path.contains('.') {
if verbose {
eprintln!("[using] unresolved '{}' (namespace token, skip inline)", path);
}
if verbose { eprintln!("[using] unresolved '{}' (namespace token, skip inline)", path); }
continue;
}
let mut p = std::path::PathBuf::from(&path);
if p.is_relative() {
// If the raw relative path exists from CWD, use it.
// Otherwise, try relative to the current file's directory.
if !p.exists() {
if let Some(dir) = std::path::Path::new(filename).parent() {
let cand = dir.join(&p);
if cand.exists() {
p = cand;
}
if cand.exists() { p = cand; }
}
}
}
// normalize to absolute to stabilize de-dup
if let Ok(abs) = std::fs::canonicalize(&p) { p = abs; }
let key = p.to_string_lossy().to_string();
if visited.contains(&key) {
continue;
}
if visited.contains(&key) { continue; }
visited.insert(key.clone());
if let Ok(text) = std::fs::read_to_string(&p) {
let inlined = strip_and_inline(runner, &text, &key, visited)?;
prelude.push_str(&inlined);
prelude.push_str("\n");
if seam_dbg {
let tail = inlined.chars().rev().take(120).collect::<String>().chars().rev().collect::<String>();
eprintln!("[using][seam][inlined] {} tail=<<<{}>>>", key, tail.replace('\n', "\\n"));
}
crate::runner::modes::common_util::resolve::seam::log_inlined_tail(&key, &inlined, seam_dbg);
} else if verbose {
eprintln!("[using] warn: could not read {}", p.display());
}
}
}
// Prepend inlined modules so their boxes are defined before use
// Seam guard: collapse consecutive blank lines at the join (prelude || body) to a single blank line
if prelude.is_empty() {
return Ok(out);
}
// Optionally deduplicate repeated static boxes in prelude by name (default OFF)
if prelude.is_empty() { return Ok(out); }
// Optional de-dup of static boxes by name
let mut prelude_text = prelude;
if std::env::var("NYASH_RESOLVE_DEDUP_BOX").ok().as_deref() == Some("1") {
let mut seen: std::collections::HashSet<String> = std::collections::HashSet::new();
@ -250,19 +187,15 @@ pub fn strip_using_and_register(
let bytes: Vec<char> = prelude_text.chars().collect();
let mut i = 0usize;
while i < bytes.len() {
// naive scan for "static box "
if i + 12 < bytes.len() && bytes[i..].iter().take(11).collect::<String>() == "static box " {
// read name token
let mut j = i + 11;
let mut name = String::new();
while j < bytes.len() {
let c = bytes[j];
if c.is_alphanumeric() || c == '_' { name.push(c); j += 1; } else { break; }
}
// find opening brace '{'
while j < bytes.len() && bytes[j].is_whitespace() { j += 1; }
if j < bytes.len() && bytes[j] == '{' {
// scan to matching closing brace for this box
let mut k = j;
let mut depth = 0i32;
while k < bytes.len() {
@ -271,210 +204,110 @@ pub fn strip_using_and_register(
if c == '}' { depth -= 1; if depth == 0 { k += 1; break; } }
k += 1;
}
// decide
if seen.contains(&name) {
// skip duplicate box
i = k; // drop this block
continue;
} else {
if seen.contains(&name) { i = k; continue; } else {
seen.insert(name);
// keep this block as-is
out_txt.push_str(&bytes[i..k].iter().collect::<String>());
i = k;
continue;
i = k; continue;
}
}
}
// default: copy one char
out_txt.push(bytes[i]);
i += 1;
}
prelude_text = out_txt;
}
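The NYASH_RESOLVE_DEDUP_BOX path above keeps only the first definition of each `static box` in the inlined prelude. Below is a compact, illustrative sketch of the same keep-first scan; the helper name and sample input are hypothetical, and string/comment awareness is omitted.

// Illustrative sketch only; not the runner's API.
use std::collections::HashSet;

fn dedup_static_boxes(src: &str) -> String {
    let chars: Vec<char> = src.chars().collect();
    let mut seen: HashSet<String> = HashSet::new();
    let mut out = String::with_capacity(src.len());
    let mut i = 0usize;
    while i < chars.len() {
        let ahead: String = chars[i..chars.len().min(i + 11)].iter().collect();
        if ahead == "static box " {
            // read the box name
            let mut j = i + 11;
            let mut name = String::new();
            while j < chars.len() && (chars[j].is_alphanumeric() || chars[j] == '_') {
                name.push(chars[j]);
                j += 1;
            }
            while j < chars.len() && chars[j].is_whitespace() { j += 1; }
            if j < chars.len() && chars[j] == '{' {
                // scan to the matching closing brace
                let mut k = j;
                let mut depth = 0i32;
                while k < chars.len() {
                    if chars[k] == '{' { depth += 1; }
                    if chars[k] == '}' { depth -= 1; if depth == 0 { k += 1; break; } }
                    k += 1;
                }
                if !seen.insert(name) {
                    i = k; // duplicate: drop the whole block
                    continue;
                }
                out.push_str(&chars[i..k].iter().collect::<String>());
                i = k;
                continue;
            }
        }
        out.push(chars[i]);
        i += 1;
    }
    out
}

fn main() {
    let src = "static box A { m() { } }\nstatic box A { m() { } }\nstatic box B { }\n";
    let deduped = dedup_static_boxes(src);
    assert_eq!(deduped.matches("static box A").count(), 1);
    println!("{}", deduped);
}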
// Optional: de-duplicate repeated function definitions inside specific boxes (default OFF)
// Optional: function dedup (MiniVmPrints.print_prints_in_slice)
if std::env::var("NYASH_RESOLVE_DEDUP_FN").ok().as_deref() == Some("1") {
// Currently target MiniVmPrints.print_prints_in_slice only (low risk)
let mut out_txt = String::with_capacity(prelude_text.len());
let bytes: Vec<char> = prelude_text.chars().collect();
let mut i = 0usize;
while i < bytes.len() {
// scan for "static box "
let ahead: String = bytes[i..bytes.len().min(i + 12)].iter().collect();
if ahead.starts_with("static box ") {
// parse box name
let mut j = i + 11; // len("static box ") == 11
let mut j = i + 11;
let mut name = String::new();
while j < bytes.len() {
let c = bytes[j];
if c.is_ascii_alphanumeric() || c == '_' { name.push(c); j += 1; } else { break; }
}
// skip ws to '{'
while j < bytes.len() { let c = bytes[j]; if c.is_ascii_alphanumeric() || c == '_' { name.push(c); j += 1; } else { break; } }
while j < bytes.len() && bytes[j].is_whitespace() { j += 1; }
if j < bytes.len() && bytes[j] == '{' {
// find matching closing '}' for the box body
let mut k = j;
let mut depth = 0i32;
let mut in_str = false;
while k < bytes.len() {
let c = bytes[k];
if in_str {
if c == '\\' { k += 2; continue; }
if c == '"' { in_str = false; }
k += 1;
continue;
} else {
if c == '"' { in_str = true; k += 1; continue; }
if c == '{' { depth += 1; }
if c == '}' { depth -= 1; if depth == 0 { k += 1; break; } }
k += 1;
}
if in_str { if c == '\\' { k += 2; continue; } if c == '"' { in_str = false; } k += 1; continue; } else { if c == '"' { in_str = true; k += 1; continue; } if c == '{' { depth += 1; } if c == '}' { depth -= 1; if depth == 0 { k += 1; break; } } k += 1; }
}
// write header up to body start '{'
out_txt.push_str(&bytes[i..(j + 1)].iter().collect::<String>());
// process body (limited dedup for MiniVmPrints.print_prints_in_slice)
let body_end = k.saturating_sub(1);
if name == "MiniVmPrints" {
let mut kept = false;
let mut p = j + 1;
while p <= body_end {
// find next line start
let mut ls = p;
if ls > j + 1 {
while ls <= body_end && bytes[ls - 1] != '\n' { ls += 1; }
}
let mut ls = p; if ls > j + 1 { while ls <= body_end && bytes[ls - 1] != '\n' { ls += 1; } }
if ls > body_end { break; }
// skip spaces
let mut q = ls;
while q <= body_end && bytes[q].is_whitespace() && bytes[q] != '\n' { q += 1; }
// check for function definition of print_prints_in_slice
let mut q = ls; while q <= body_end && bytes[q].is_whitespace() && bytes[q] != '\n' { q += 1; }
let rem: String = bytes[q..(body_end + 1).min(q + 64)].iter().collect();
if rem.starts_with("print_prints_in_slice(") {
// find ')'
let mut r = q;
let mut dp = 0i32;
let mut in_s = false;
let mut r = q; let mut dp = 0i32; let mut instr = false;
while r <= body_end {
let c = bytes[r];
if in_s { if c == '\\' { r += 2; continue; } if c == '"' { in_s = false; } r += 1; continue; }
if c == '"' { in_s = true; r += 1; continue; }
if c == '(' { dp += 1; r += 1; continue; }
if c == ')' { dp -= 1; r += 1; if dp <= 0 { break; } continue; }
if instr { if c == '\\' { r += 2; continue; } if c == '"' { instr = false; } r += 1; continue; }
if c == '"' { instr = true; r += 1; continue; }
if c == '(' { dp += 1; }
if c == ')' { dp -= 1; if dp == 0 { r += 1; break; } }
if dp == 0 && c == '{' { break; }
r += 1;
}
while r <= body_end && bytes[r].is_whitespace() { r += 1; }
if r <= body_end && bytes[r] == '{' {
// find body end
let mut t = r;
let mut d2 = 0i32;
let mut in_s2 = false;
while t <= body_end {
let c2 = bytes[t];
if in_s2 { if c2 == '\\' { t += 2; continue; } if c2 == '"' { in_s2 = false; } t += 1; continue; }
if c2 == '"' { in_s2 = true; t += 1; continue; }
if c2 == '{' { d2 += 1; }
if c2 == '}' { d2 -= 1; if d2 == 0 { t += 1; break; } }
t += 1;
let mut s = r; let mut bd = 0i32; let mut is2 = false;
while s <= body_end {
let c = bytes[s];
if is2 { if c == '\\' { s += 2; continue; } if c == '"' { is2 = false; } s += 1; continue; }
if c == '"' { is2 = true; s += 1; continue; }
if c == '{' { bd += 1; }
if c == '}' { bd -= 1; if bd == 0 { s += 1; break; } }
s += 1;
}
// start-of-line
let mut sol = q;
while sol > j + 1 && bytes[sol - 1] != '\n' { sol -= 1; }
if !kept {
out_txt.push_str(&bytes[sol..t].iter().collect::<String>());
out_txt.push_str(&bytes[q..s].iter().collect::<String>());
kept = true;
}
p = t;
// advance outer scanner to the end of this function body
i = s;
let _ = i; // mark as read to satisfy unused_assignments lint
continue;
}
}
// copy this line
let mut eol = ls;
while eol <= body_end && bytes[eol] != '\n' { eol += 1; }
out_txt.push_str(&bytes[ls..(eol.min(body_end + 1))].iter().collect::<String>());
if eol <= body_end && bytes[eol] == '\n' { out_txt.push('\n'); }
p = eol + 1;
out_txt.push(bytes[p]); p += 1;
}
} else {
// copy body as-is
out_txt.push_str(&bytes[(j + 1)..=body_end].iter().collect::<String>());
}
// write closing '}'
out_txt.push('}');
i = k;
continue;
if !kept { out_txt.push_str(&bytes[j + 1..=body_end].iter().collect::<String>()); }
out_txt.push('}'); out_txt.push('\n'); i = k; continue;
} else { out_txt.push_str(&bytes[j + 1..k].iter().collect::<String>()); i = k; continue; }
}
}
// default: copy one char
out_txt.push(bytes[i]);
i += 1;
out_txt.push(bytes[i]); i += 1;
}
prelude_text = out_txt;
}
let prelude_clean = prelude_text.trim_end_matches(['\n', '\r']);
if seam_dbg {
let tail = prelude_clean.chars().rev().take(160).collect::<String>().chars().rev().collect::<String>();
let head = out.chars().take(160).collect::<String>();
eprintln!("[using][seam] prelude_tail=<<<{}>>>", tail.replace('\n', "\\n"));
eprintln!("[using][seam] body_head =<<<{}>>>", head.replace('\n', "\\n"));
}
// Seam join + optional fix
let prelude_clean = prelude_text.trim_end_matches('\n');
crate::runner::modes::common_util::resolve::seam::log_prelude_body_seam(prelude_clean, &out, seam_dbg);
let mut combined = String::with_capacity(prelude_clean.len() + out.len() + 1);
combined.push_str(prelude_clean);
combined.push('\n');
// Optional seam safety: append missing '}' for unmatched '{' in prelude
if std::env::var("NYASH_RESOLVE_FIX_BRACES").ok().as_deref() == Some("1") {
// compute { } delta ignoring strings and comments
let mut delta: i32 = 0;
let mut it = prelude_clean.chars().peekable();
let mut in_str = false;
let mut in_sl = false;
let mut in_ml = false;
while let Some(c) = it.next() {
if in_sl {
if c == '\n' { in_sl = false; }
continue;
}
if in_ml {
if c == '*' {
if let Some('/') = it.peek().copied() {
// consume '/'
it.next();
in_ml = false;
}
}
continue;
}
if in_str {
if c == '\\' { it.next(); continue; }
if c == '"' { in_str = false; }
continue;
}
if c == '"' { in_str = true; continue; }
if c == '/' {
match it.peek().copied() {
Some('/') => { in_sl = true; it.next(); continue; }
Some('*') => { in_ml = true; it.next(); continue; }
_ => {}
}
}
if c == '{' { delta += 1; }
if c == '}' { delta -= 1; }
}
if delta > 0 {
if trace { eprintln!("[using][seam] fix: appending {} '}}' before body", delta); }
for _ in 0..delta { combined.push('}'); combined.push('\n'); }
}
}
crate::runner::modes::common_util::resolve::seam::fix_prelude_braces_if_enabled(prelude_clean, &mut combined, trace);
combined.push_str(&out);
Ok(combined)
}

let mut visited = HashSet::new();
let combined = strip_and_inline(runner, code, filename, &mut visited)?;
// Dev sugar: always pre-expand @name[:T] = expr at line-head to keep sources readable
Ok(preexpand_at_local(&combined))
}

/// Pre-expand line-head `@name[: Type] = expr` into `local name[: Type] = expr`.
/// Minimal, safe, no semantics change. Applies only at line head (after spaces/tabs).
pub(crate) fn preexpand_at_local(src: &str) -> String {
pub fn preexpand_at_local(src: &str) -> String {
let mut out = String::with_capacity(src.len());
for line in src.lines() {
let bytes = line.as_bytes();
@ -483,33 +316,18 @@ pub(crate) fn preexpand_at_local(src: &str) -> String {
if i < bytes.len() && bytes[i] == b'@' {
// parse identifier
let mut j = i + 1;
// first char [A-Za-z_]
if j < bytes.len() && ((bytes[j] as char).is_ascii_alphabetic() || bytes[j] == b'_') {
j += 1;
while j < bytes.len() {
let c = bytes[j] as char;
if c.is_ascii_alphanumeric() || c == '_' { j += 1; } else { break; }
}
// optional type: spaces ':' spaces ident
let mut k = j;
while k < bytes.len() && (bytes[k] == b' ' || bytes[k] == b'\t') { k += 1; }
while j < bytes.len() { let c = bytes[j] as char; if c.is_ascii_alphanumeric() || c == '_' { j += 1; } else { break; } }
let mut k = j; while k < bytes.len() && (bytes[k] == b' ' || bytes[k] == b'\t') { k += 1; }
if k < bytes.len() && bytes[k] == b':' {
k += 1;
while k < bytes.len() && (bytes[k] == b' ' || bytes[k] == b'\t') { k += 1; }
// simple type ident
k += 1; while k < bytes.len() && (bytes[k] == b' ' || bytes[k] == b'\t') { k += 1; }
if k < bytes.len() && ((bytes[k] as char).is_ascii_alphabetic() || bytes[k] == b'_') {
k += 1;
while k < bytes.len() {
let c = bytes[k] as char;
if c.is_ascii_alphanumeric() || c == '_' { k += 1; } else { break; }
}
k += 1; while k < bytes.len() { let c = bytes[k] as char; if c.is_ascii_alphanumeric() || c == '_' { k += 1; } else { break; } }
}
}
// consume spaces to '='
let mut eqp = k;
while eqp < bytes.len() && (bytes[eqp] == b' ' || bytes[eqp] == b'\t') { eqp += 1; }
let mut eqp = k; while eqp < bytes.len() && (bytes[eqp] == b' ' || bytes[eqp] == b'\t') { eqp += 1; }
if eqp < bytes.len() && bytes[eqp] == b'=' {
// build transformed line: prefix + 'local ' + rest from after '@' up to '=' + ' =' + remainder
out.push_str(&line[..i]);
out.push_str("local ");
out.push_str(&line[i + 1..eqp]);
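A reduced sketch of the pre-expansion rule documented above, for the common `@name = expr` case without a type annotation. The helper name is illustrative; the real preexpand_at_local validates the identifier and an optional `: Type` before rewriting, and only acts at line head.

// Simplified illustration; not the function in strip.rs.
fn preexpand_at_local_sketch(src: &str) -> String {
    let mut out = String::with_capacity(src.len());
    for line in src.lines() {
        // split leading whitespace from the rest of the line
        let indent_len = line.len() - line.trim_start().len();
        let (indent, rest) = line.split_at(indent_len);
        let rewritten = match rest.strip_prefix('@') {
            // crude check: only rewrite when a '=' follows; the real code parses the identifier
            Some(tail) if tail.contains('=') => format!("{}local {}", indent, tail),
            _ => line.to_string(),
        };
        out.push_str(&rewritten);
        out.push('\n');
    }
    out
}

fn main() {
    let src = "@count = 0\nprint(count)\n";
    assert_eq!(preexpand_at_local_sketch(src), "local count = 0\nprint(count)\n");
}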
@ -98,6 +98,44 @@ pub fn execute_pyvm_only(runner: &NyashRunner, filename: &str) {
if removed > 0 { crate::cli_v!("[PyVM] escape_elide_barriers: removed {} barriers", removed); }
}

// Optional: delegate to the Ny selfhost executor (Stage 0/1: emit MIR JSON and invoke the runner)
if std::env::var("NYASH_SELFHOST_EXEC").ok().as_deref() == Some("1") {
// Emit MIR JSON to a temp file and invoke Ny runner script.
let tmp_dir = std::path::Path::new("tmp");
let _ = std::fs::create_dir_all(tmp_dir);
let mir_json_path = tmp_dir.join("nyash_selfhost_mir.json");
if let Err(e) = crate::runner::mir_json_emit::emit_mir_json_for_harness_bin(&compile_result.module, &mir_json_path) {
eprintln!("❌ Selfhost MIR JSON emit error: {}", e);
process::exit(1);
}
// Resolve nyash executable and runner path
let exe = std::env::current_exe().unwrap_or_else(|_| std::path::PathBuf::from("target/release/nyash"));
let runner = std::path::Path::new("apps/selfhost-runtime/runner.nyash");
if !runner.exists() {
eprintln!("❌ Selfhost runner missing: {}", runner.display());
process::exit(1);
}
let mut cmd = std::process::Command::new(&exe);
cmd.arg("--backend").arg("vm")
.arg(runner)
.arg("--")
.arg(mir_json_path.display().to_string());
// Optional: pass box pref to child (ny|plugin)
if let Ok(pref) = std::env::var("NYASH_SELFHOST_BOX_PREF") {
let p = pref.to_lowercase();
if p == "ny" || p == "plugin" {
cmd.arg(format!("--box-pref={}", p));
}
}
let status = cmd
// Avoid recursive selfhost delegation inside the child.
.env_remove("NYASH_SELFHOST_EXEC")
.status()
.unwrap_or_else(|e| { eprintln!("❌ spawn selfhost runner failed: {}", e); std::process::exit(1); });
let code = status.code().unwrap_or(1);
process::exit(code);
}

// Delegate to common PyVM harness
match crate::runner::modes::common_util::pyvm::run_pyvm_harness_lib(&compile_result.module, "pyvm") {
Ok(code) => { process::exit(code); }
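The box-preference plumbing above forwards a --box-pref flag only for the two accepted values of NYASH_SELFHOST_BOX_PREF. A minimal sketch of that mapping; the helper name is hypothetical, and the real code reads the value from the environment and appends the argument to the child Command shown above.

// Illustrative helper, not part of the runner.
fn box_pref_arg(pref: Option<&str>) -> Option<String> {
    let p = pref?.to_lowercase();
    if p == "ny" || p == "plugin" {
        Some(format!("--box-pref={}", p))
    } else {
        None
    }
}

fn main() {
    assert_eq!(box_pref_arg(Some("NY")), Some("--box-pref=ny".to_string()));
    assert_eq!(box_pref_arg(Some("native")), None); // unknown values are ignored
    assert_eq!(box_pref_arg(None), None);
}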
75 src/runner/plugins.rs Normal file
@ -0,0 +1,75 @@
/*!
 * Runner plugin/registry initialization (extracted)
 */

use super::*;

impl NyashRunner {
    /// Initialize global runtime registry and load configured plugins.
    ///
    /// Behavior (no-op changes to defaults):
    /// - Always initializes the unified type/box registry.
    /// - Loads native BID plugins unless `NYASH_DISABLE_PLUGINS=1`.
    /// - Ensures built‑ins override list includes ArrayBox/MapBox/FileBox/TOMLBox
    ///   by merging `NYASH_PLUGIN_OVERRIDE_TYPES` with required entries.
    /// - When `--load-ny-plugins` or `NYASH_LOAD_NY_PLUGINS=1` is set, best‑effort
    ///   loads Nyash scripts listed under `nyash.toml`'s `ny_plugins`.
    ///
    /// Side effects:
    /// - Sets `NYASH_USE_PLUGIN_BUILTINS=1` if unset, to allow interpreter side creation.
    /// - Updates `NYASH_PLUGIN_OVERRIDE_TYPES` env var (merged values).
    pub(crate) fn init_runtime_and_plugins(&self, groups: &crate::cli::CliGroups) {
        // Unified registry
        runtime::init_global_unified_registry();
        // Plugins (guarded)
        if std::env::var("NYASH_DISABLE_PLUGINS").ok().as_deref() != Some("1") {
            runner_plugin_init::init_bid_plugins();
            crate::runner::box_index::refresh_box_index();
        }
        // Allow interpreter to create plugin-backed boxes via unified registry
        if std::env::var("NYASH_USE_PLUGIN_BUILTINS").ok().is_none() {
            std::env::set_var("NYASH_USE_PLUGIN_BUILTINS", "1");
        }
        // Merge override types env (ensure FileBox/TOMLBox present)
        let mut override_types: Vec<String> = if let Ok(list) = std::env::var("NYASH_PLUGIN_OVERRIDE_TYPES") {
            list.split(',').map(|s| s.trim().to_string()).filter(|s| !s.is_empty()).collect()
        } else { vec!["ArrayBox".into(), "MapBox".into()] };
        for t in ["FileBox", "TOMLBox"] { if !override_types.iter().any(|x| x == t) { override_types.push(t.into()); } }
        std::env::set_var("NYASH_PLUGIN_OVERRIDE_TYPES", override_types.join(","));

        // Optional Ny script plugins loader (best-effort)
        if groups.load_ny_plugins || std::env::var("NYASH_LOAD_NY_PLUGINS").ok().as_deref() == Some("1") {
            if let Ok(text) = std::fs::read_to_string("nyash.toml") {
                if let Ok(doc) = toml::from_str::<toml::Value>(&text) {
                    if let Some(np) = doc.get("ny_plugins") {
                        let mut list: Vec<String> = Vec::new();
                        if let Some(arr) = np.as_array() { for v in arr { if let Some(s) = v.as_str() { list.push(s.to_string()); } } }
                        else if let Some(tbl) = np.as_table() { for (_k, v) in tbl { if let Some(s) = v.as_str() { list.push(s.to_string()); } else if let Some(arr) = v.as_array() { for e in arr { if let Some(s) = e.as_str() { list.push(s.to_string()); } } } } }
                        if !list.is_empty() {
                            let list_only = std::env::var("NYASH_NY_PLUGINS_LIST_ONLY").ok().as_deref() == Some("1");
                            println!("🧩 Ny script plugins ({}):", list.len());
                            for p in list {
                                if list_only { println!(" • {}", p); continue; }
                                match std::fs::read_to_string(&p) {
                                    Ok(code) => {
                                        match nyash_rust::parser::NyashParser::parse_from_string(&code) {
                                            Ok(ast) => {
                                                let mut interpreter = nyash_rust::interpreter::NyashInterpreter::new();
                                                match interpreter.execute(ast) {
                                                    Ok(_) => println!("[ny_plugins] {}: OK", p),
                                                    Err(e) => println!("[ny_plugins] {}: FAIL ({})", p, e),
                                                }
                                            }
                                            Err(e) => println!("[ny_plugins] {}: FAIL (parse: {})", p, e),
                                        }
                                    }
                                    Err(e) => println!("[ny_plugins] {}: FAIL (read: {})", p, e),
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}
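The override-type merge described in the doc comment above can be pictured in isolation. A small sketch with a hypothetical helper that takes the current list as a string instead of reading and writing the NYASH_PLUGIN_OVERRIDE_TYPES environment variable.

// Illustrative restatement of the merge behavior; not the runner's API.
fn merge_override_types(existing: Option<&str>) -> String {
    let mut types: Vec<String> = match existing {
        Some(list) => list.split(',').map(|s| s.trim().to_string()).filter(|s| !s.is_empty()).collect(),
        None => vec!["ArrayBox".into(), "MapBox".into()],
    };
    for required in ["FileBox", "TOMLBox"] {
        if !types.iter().any(|t| t == required) {
            types.push(required.into());
        }
    }
    types.join(",")
}

fn main() {
    // Unset env → defaults plus required entries.
    assert_eq!(merge_override_types(None), "ArrayBox,MapBox,FileBox,TOMLBox");
    // A user-provided list keeps its entries and gains the required ones.
    assert_eq!(merge_override_types(Some("MapBox, FileBox")), "MapBox,FileBox,TOMLBox");
}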
@ -112,14 +112,39 @@ impl NyashRunner {
let parser_prog = std::path::Path::new("apps/selfhost/compiler/compiler.nyash");
if parser_prog.exists() {
// Build extra args forwarded to child program
let mut extra: Vec<&str> = Vec::new();
let mut extra_owned: Vec<String> = Vec::new();
if crate::config::env::ny_compiler_min_json() {
extra.extend(["--", "--min-json"]);
extra_owned.push("--".to_string());
extra_owned.push("--min-json".to_string());
}
extra.extend(["--", "--read-tmp"]);
extra_owned.push("--".to_string());
extra_owned.push("--read-tmp".to_string());
if crate::config::env::ny_compiler_stage3() {
extra.extend(["--", "--stage3"]);
extra_owned.push("--".to_string());
extra_owned.push("--stage3".to_string());
}
// Optional: map env toggles to child args (prepasses)
if std::env::var("NYASH_SCOPEBOX_ENABLE").ok().as_deref() == Some("1") {
extra_owned.push("--".to_string());
extra_owned.push("--scopebox".to_string());
}
if std::env::var("NYASH_LOOPFORM_NORMALIZE").ok().as_deref() == Some("1") {
extra_owned.push("--".to_string());
extra_owned.push("--loopform".to_string());
}
// Optional: developer-provided child args passthrough (space-separated)
if let Ok(raw) = std::env::var("NYASH_SELFHOST_CHILD_ARGS") {
let items: Vec<String> = raw
.split(' ')
.filter(|s| !s.trim().is_empty())
.map(|s| s.to_string())
.collect();
if !items.is_empty() {
extra_owned.push("--".to_string());
for it in items { extra_owned.push(it); }
}
}
let extra: Vec<&str> = extra_owned.iter().map(|s| s.as_str()).collect();
let timeout_ms: u64 = crate::config::env::ny_compiler_timeout_ms();
if let Some(line) = child::run_ny_program_capture_json(
&exe,
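Condensed sketch of the env-to-child-args mapping in the hunk above. The helper name build_extra_args and the sample passthrough string are hypothetical; the real code also honors the min-json/read-tmp/stage3 config helpers and prefixes every forwarded group with a "--" separator, as shown.

// Illustrative only; not the runner's API.
fn build_extra_args(scopebox: bool, loopform: bool, child_args: Option<&str>) -> Vec<String> {
    let mut extra: Vec<String> = Vec::new();
    if scopebox {
        extra.push("--".to_string());
        extra.push("--scopebox".to_string());
    }
    if loopform {
        extra.push("--".to_string());
        extra.push("--loopform".to_string());
    }
    if let Some(raw) = child_args {
        // space-separated passthrough, empty items dropped
        let items: Vec<String> = raw
            .split(' ')
            .filter(|s| !s.trim().is_empty())
            .map(|s| s.to_string())
            .collect();
        if !items.is_empty() {
            extra.push("--".to_string());
            extra.extend(items);
        }
    }
    extra
}

fn main() {
    // e.g. NYASH_SCOPEBOX_ENABLE=1 with a hypothetical NYASH_SELFHOST_CHILD_ARGS="--trace --dump"
    let extra = build_extra_args(true, false, Some("--trace --dump"));
    assert_eq!(extra, vec!["--", "--scopebox", "--", "--trace", "--dump"]);
}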
65 src/tests/if_in_loop_no_phi_regression.rs Normal file
@ -0,0 +1,65 @@
#[cfg(test)]
mod tests {
    use crate::parser::NyashParser;
    use crate::mir::MirInstruction;

    // Regression for predecessor mis-selection on nested if inside loop in PHI-off mode
    // Path: cond == true && cond2 == false should observe inner else assignment at merge.
    #[test]
    fn nested_if_inside_loop_edges_copy_from_exiting_blocks() {
        // Force PHI-off
        std::env::set_var("NYASH_MIR_NO_PHI", "1");

        let code = r#"
x = 0
i = 0
loop (i < 1) {
    i = i + 1
    if (1 == 1) {
        if (1 == 0) {
            x = 1
        } else {
            x = 2
        }
    }
}
return x
"#;

        let ast = NyashParser::parse_from_string(code).expect("parse");
        let mut compiler = crate::mir::MirCompiler::new();
        let result = compiler.compile(ast).expect("compile");

        // Find return block/value id
        let f = result.module.functions.get("main").expect("main");
        let (ret_block, out_v) = f
            .blocks
            .iter()
            .find_map(|(bid, bb)| match &bb.terminator {
                Some(MirInstruction::Return { value: Some(v) }) => Some((*bid, *v)),
                _ => None,
            })
            .expect("ret block");

        // Every predecessor must carry a Copy to the merged value in PHI-off mode
        let preds: Vec<_> = f.blocks.get(&ret_block).unwrap().predecessors.iter().copied().collect();
        assert!(!preds.is_empty(), "ret must have predecessors");
        for p in preds {
            let bb = f.blocks.get(&p).unwrap();
            let has_copy = bb
                .instructions
                .iter()
                .any(|inst| matches!(inst, MirInstruction::Copy { dst, .. } if *dst == out_v));
            assert!(has_copy, "missing Copy to merged value in predecessor {:?}", p);
        }
        // ret block must not contain Copy to out_v
        let merge_has_copy = f
            .blocks
            .get(&ret_block)
            .unwrap()
            .instructions
            .iter()
            .any(|inst| matches!(inst, MirInstruction::Copy { dst, .. } if *dst == out_v));
        assert!(!merge_has_copy, "merge/ret must not contain Copy to merged value");
    }
}
@ -30,6 +30,16 @@ unset NYASH_PYVM_DUMP_CODE
unset NYASH_RESOLVE_SEAM_DEBUG
unset NYASH_RESOLVE_FIX_BRACES
unset NYASH_RESOLVE_DEDUP_BOX
"$BIN" --backend vm "$APP_INS"

# Run inspector and capture output
OUT=$("$BIN" --backend vm "$APP_INS")
echo "$OUT"

# CI guard: brace delta must be zero after FIX_BRACES normalization in the dump
echo "[dev] assert: prelude_brace_delta==0" >&2
echo "$OUT" | grep -q "^prelude_brace_delta=0$" || {
  echo "[error] prelude_brace_delta is not zero" >&2
  exit 1
}

popd >/dev/null

@ -50,6 +50,7 @@ check_case "apps/tests/macro/loopform/two_vars.nyash"
check_case "apps/tests/macro/loopform/with_continue.nyash"
check_case "apps/tests/macro/loopform/with_break.nyash"
check_case "apps/tests/llvm_phi_mix.nyash"
check_case "apps/tests/loop_if_phi_continue.nyash"

if [ "$fails" -ne 0 ]; then
  exit 2

31 tools/test/smoke/loop_phi_values.sh Normal file
@ -0,0 +1,31 @@
#!/usr/bin/env bash
set -euo pipefail
ROOT_DIR=$(cd "$(dirname "$0")/../../.." && pwd)

echo "[smoke] loop phi values (then-continue + per-var PHI)" >&2

pushd "$ROOT_DIR" >/dev/null

cargo build --release -q

BIN=./target/release/nyash
APP=apps/tests/loop_if_phi_continue.nyash

# Run VM (PyVM) and suppress runner result line to compare pure prints
export NYASH_VM_USE_PY=1
export NYASH_JSON_ONLY=1
out=$("$BIN" --backend vm "$APP")

expected=$'7\n1'
if [[ "$out" != "$expected" ]]; then
  echo "[smoke] FAIL: unexpected output" >&2
  echo "--- got ---" >&2
  printf '%s\n' "$out" >&2
  echo "--- exp ---" >&2
  printf '%s\n' "$expected" >&2
  exit 1
fi

echo "[smoke] OK: loop phi values correct" >&2
popd >/dev/null

@ -36,4 +36,15 @@ if [[ "$out" != "$expected" ]]; then
fi

echo "[smoke] OK: collect_prints using + mixed order" >&2

# Seam hygiene check: ensure prelude_brace_delta==0 on the dump
NYASH_PYVM_DUMP_CODE=1 NYASH_RESOLVE_SEAM_DEBUG=1 NYASH_RESOLVE_FIX_BRACES=1 NYASH_RESOLVE_DEDUP_BOX=1 \
  "$BIN" --backend vm "$APP" >/dev/null 2>&1 || true
INS_OUT=$("$BIN" --backend vm apps/tests/dev_seam_inspect_dump.nyash)
echo "$INS_OUT" | grep -q "^prelude_brace_delta=0$" || {
  echo "[smoke] FAIL: seam prelude_brace_delta is not zero" >&2
  echo "$INS_OUT" >&2
  exit 1
}
echo "[smoke] OK: seam prelude brace delta == 0" >&2
popd >/dev/null
28 tools/test/smoke/selfhost/loopform_identity_smoke.sh Normal file
@ -0,0 +1,28 @@
#!/usr/bin/env bash
set -euo pipefail

# Script lives under tools/test/smoke/selfhost, so the repo root is four levels up.
ROOT_DIR=$(cd "$(dirname "$0")/../../../.." && pwd)
cd "$ROOT_DIR"

echo "[smoke] build nyash (release)"
cargo build --release -q

BIN=./target/release/nyash
CHILD=apps/selfhost/compiler/compiler.nyash

echo "[smoke] run child (baseline)"
BASE=$("$BIN" --backend vm "$CHILD" -- --min-json)

echo "[smoke] run child (loopform on)"
WITH=$("$BIN" --backend vm "$CHILD" -- --min-json --loopform)

if [[ "$BASE" != "$WITH" ]]; then
  echo "❌ loopform identity prepass altered JSON" >&2
  diff -u <(echo "$BASE") <(echo "$WITH") || true
  exit 1
fi

echo "$BASE" | grep -q '"kind":"Program"' || { echo "❌ baseline JSON missing Program kind" >&2; exit 1; }

echo "✅ loopform identity smoke passed"
33 tools/test/smoke/selfhost/scopebox_identity_smoke.sh Normal file
@ -0,0 +1,33 @@
#!/usr/bin/env bash
set -euo pipefail

# Script lives under tools/test/smoke/selfhost, so the repo root is four levels up.
ROOT_DIR=$(cd "$(dirname "$0")/../../../.." && pwd)
cd "$ROOT_DIR"

echo "[smoke] build nyash (release)"
cargo build --release -q

BIN=./target/release/nyash
CHILD=apps/selfhost/compiler/compiler.nyash

if [[ ! -x "$BIN" ]]; then
  echo "nyash binary not found: $BIN" >&2
  exit 1
fi

echo "[smoke] run child (baseline)"
BASE=$("$BIN" --backend vm "$CHILD" -- --min-json)

echo "[smoke] run child (scopebox on)"
WITH=$("$BIN" --backend vm "$CHILD" -- --min-json --scopebox)

if [[ "$BASE" != "$WITH" ]]; then
  echo "❌ scopebox identity prepass altered JSON" >&2
  diff -u <(echo "$BASE") <(echo "$WITH") || true
  exit 1
fi

echo "$BASE" | grep -q '"kind":"Program"' || { echo "❌ baseline JSON missing Program kind" >&2; exit 1; }

echo "✅ scopebox identity smoke passed"
39 tools/test/smoke/selfhost/selfhost_runner_smoke.sh Normal file
@ -0,0 +1,39 @@
#!/usr/bin/env bash
set -euo pipefail

# Smoke: Stage 0/1 selfhost runner wiring (harness JSON path)

# Script lives under tools/test/smoke/selfhost, so the repo root is four levels up.
ROOT_DIR=$(cd "$(dirname "$0")/../../../.." && pwd)
cd "$ROOT_DIR"

echo "[smoke] building nyash (release)"
cargo build --release -q

BIN=./target/release/nyash
APP=apps/tests/dev_prints_count_probe.nyash

if [[ ! -x "$BIN" ]]; then
  echo "nyash binary not found: $BIN" >&2
  exit 1
fi

echo "[smoke] running selfhost runner (harness)"
set +e
OUT=$(NYASH_VM_USE_PY=1 NYASH_SELFHOST_EXEC=1 "$BIN" --backend vm "$APP" -- --trace 2>&1)
CODE=$?
set -e

echo "$OUT" | sed -e 's/^/[child] /'

if [[ $CODE -ne 0 ]]; then
  echo "❌ selfhost runner exited with code $CODE" >&2
  exit $CODE
fi

if ! echo "$OUT" | grep -q "\[selfhost\] harness summary: functions="; then
  echo "❌ expected harness summary line not found" >&2
  exit 1
fi

echo "✅ selfhost runner smoke passed"