AotPrep collections_hot matmul tuning and bench tweaks
@@ -1,5 +1,28 @@
 # Current Task — Phase 21.7 (Normalization & Unification: Methodize Static Boxes)
+
+Update (2025-11-14 — CollectionsHot rewrite expansion, waiting for Claude Code)
+- Status: pending (waiting on Claude Code to land rewrite coverage improvements)
+- Scope: AotPrep CollectionsHot — expand the Array/Map get/set/has rewrite to externcall by strengthening receiver type resolution.
+- Done (this host):
+  - Stage-3 `local` hint added in the builder (Undefined variable: local → guides the user to set NYASH_PARSER_STAGE3=1 / HAKO_PARSER_STAGE3=1).
+  - 2-arg lastIndexOf removed across .hako (prefix + 1-arg pattern) — AotPrep no longer trips PyVM.
+  - CollectionsHot: fixpoint type_table, on-the-fly phi peek, backward scan, CH trace logs; the push rewrite is working.
+  - Bench (arraymap, 10s budget): externcall>0 is observed, but boxcall remains in the loop; the ratio is still high.
+- Waiting (Claude Code tasks):
+  1) Implement tmap_backprop (receiver type from call-site signals), iterated to a small fixpoint (max 2 rounds); see the sketch after this hunk.
+  2) Integrate resolve_recv_type_backward into the decision order (after phi peek); widen the key/index heuristics.
+  3) Emit gated logs: "[aot/collections_hot] backprop recv=<vid> => arr|map via method=<mname>" (NYASH_AOT_CH_TRACE=1).
+  4) Update the README with the decision order and the NYASH_AOT_CH_BACKPROP toggle (default=1).
+- Acceptance:
+  - PREP.json contains nyash.array.get_h/set_h and/or nyash.map.get_h(hh)/set_h(hh); the boxcall count decreases; jsonfrag=0.
+  - Bench diag shows externcall>1 (push plus at least one of get/set); build/exe succeeds.
+- How to verify quickly:
+  - ORIG: HAKO_APPLY_AOT_PREP=0 NYASH_JSON_ONLY=1 tools/hakorune_emit_mir.sh /tmp/arraymap_min.hako tmp/arraymap_orig.json
+  - PREP: HAKO_APPLY_AOT_PREP=1 NYASH_AOT_COLLECTIONS_HOT=1 NYASH_LLVM_FAST=1 NYASH_MIR_LOOP_HOIST=1 NYASH_JSON_ONLY=1 tools/hakorune_emit_mir.sh /tmp/arraymap_min.hako tmp/arraymap_prep.json
+  - Bench: NYASH_SKIP_TOML_ENV=1 NYASH_DISABLE_PLUGINS=1 tools/perf/microbench.sh --case arraymap --exe --budget-ms 10000
+  - Optional logs: NYASH_AOT_CH_TRACE=1 HAKO_SELFHOST_TRACE=1 …/hakorune_emit_mir.sh

 Update (2025-11-12 — Optimization pre-work and bench harness)
 - Implemented (opt-in, defaults OFF)
   - Ret block purity verifier: NYASH_VERIFY_RET_PURITY=1 → fail-fast on side-effecting instructions immediately before Return (only Const/Copy/Phi/Nop are allowed). Kept as a safety valve for structural purification.
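For illustration (not part of this commit): task (1) above wants the receiver's box type inferred from call-site signals. A minimal Hako sketch of that classification follows; the box and method names are hypothetical, and the real pass works on MIR JSON text and iterates a type table to a small fixpoint.

```
// Hypothetical sketch only — classify a receiver type from one call-site method name.
static box BackpropSketchBox {
    classify(mname) {
        if mname == "push" { return "arr" }       // push is an Array-only signal
        if mname == "has" { return "map" }        // membership lookup is a Map-leaning signal
        if mname == "get" || mname == "set" { return "ambiguous" } // resolve via key/index heuristics
        return ""                                  // this call site gives no signal
    }
}
```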
@@ -54,7 +54,7 @@ Current effort: keep baking new hoist/CSE patterns so `arraymap`, `matmul`, and
 | `branch` | 75.00% | Goal met (≤125%). |
 | `arraymap` | 150.00% | Keep polishing the Array/Map hot path + binop CSE to reach ≤125%. |
 | `chip8` | 25.00% | Fast enough; FAST_INT/hoist are doing their job. |
-| `kilo` | 0.21% (N=200,000) | In EXE mode, the default N is auto-adjusted to 200k. The ratio is tiny because the C reference side is the heavier configuration. |
+| `kilo` | 0.21% (N=200,000) | On the LLVM backend the EXE path is forced and the default N is auto-adjusted to 200k. The ratio is tiny because the C reference side is the heavier configuration. |
 | `sieve` | 200.00% | Measured with `NYASH_VERIFY_RET_PURITY=1` ON. The auto key decision is still conservative. |
 | `matmul` | 300.00% | Still dominated by Array/Map get/set in the triple loop. Automatic CSE and the auto map key are next. |
 | `linidx` | 100.00% | Linear index case is at parity; hoist + CSE already help share SSA. |
@@ -63,7 +63,7 @@ Current effort: keep baking new hoist/CSE patterns so `arraymap`, `matmul`, and
 ### Notes

 - You can rerun any case with the command above; `NYASH_LLVM_SKIP_BUILD=1` keeps repeated ny-llvmc builds cheap once the binaries are ready.
-- `kilo`'s C reference side is heavy and the default N=5,000,000 runs too long, so in EXE mode with N unspecified the default N is now auto-adjusted to 200,000 (`tools/perf/microbench.sh`). Override with `--n <value>` if needed.
+- `kilo`'s C reference side is heavy and the default N=5,000,000 runs too long, so on the LLVM backend it is now always measured via the EXE path with a default N of 200,000 (`tools/perf/microbench.sh` automatically switches to the equivalent of `--exe` + `N=200000` when `--backend llvm` is used). Override with `--n <value>` if needed.
 - `lang/src/llvm_ir/boxes/aot_prep/README.md` now lists the StrlenFold / LoopHoist / ConstDedup / CollectionsHot passes and explains the NYASH_AOT_MAP_KEY_MODE={h|i64|hh|auto} switch. Next: keep strengthening hoist/collections to bring arraymap/matmul/maplin/sieve within 125%.
 - Representative measurements run with `NYASH_VERIFY_RET_PURITY=1` enabled, fail-fast-detecting side effects immediately before Return (note that even tiny handle/boxcall changes can make the ratio jump to 2–3×).

docs/development/troubleshooting/stage3-local-keyword-guide.md (new file, 173 lines)
@@ -0,0 +1,173 @@
+# Stage-3 Local Keyword Troubleshooting Guide
+
+## Problem Description
+
+When running Stage-B (self-hosted) code that contains the `local` keyword, you may encounter:
+```
+❌ MIR compilation error: Undefined variable: local
+```
+
+## Root Cause
+
+The `local` keyword is a **Stage-3 keyword** that requires explicit enablement via environment variables. Without these ENV variables:
+1. The tokenizer downgrades `local` from `TokenType::LOCAL` to `TokenType::IDENTIFIER`.
+2. The MIR builder then treats it as an undefined variable.
+
+## Quick Fix
+
+### For AotPrep Verification (Recommended)
+
+Use the provided script, which automatically sets all required ENV variables:
+
+```bash
+tools/hakorune_emit_mir.sh input.hako output.json
+```
+
+This script automatically enables:
+- `NYASH_PARSER_STAGE3=1`
+- `HAKO_PARSER_STAGE3=1`
+- `NYASH_PARSER_ALLOW_SEMICOLON=1`
+
+### For Manual Execution
+
+```bash
+NYASH_PARSER_STAGE3=1 \
+HAKO_PARSER_STAGE3=1 \
+NYASH_PARSER_ALLOW_SEMICOLON=1 \
+./target/release/hakorune --backend vm your_file.hako
+```
+
+### For AotPrep with CollectionsHot
+
+```bash
+NYASH_SKIP_TOML_ENV=1 \
+NYASH_DISABLE_PLUGINS=1 \
+HAKO_APPLY_AOT_PREP=1 \
+NYASH_AOT_COLLECTIONS_HOT=1 \
+NYASH_LLVM_FAST=1 \
+NYASH_MIR_LOOP_HOIST=1 \
+NYASH_JSON_ONLY=1 \
+tools/hakorune_emit_mir.sh input.hako output.json
+```
+
+## Diagnostic Tools
+
+### 1. Improved Error Message (New!)
+
+When you forget to enable Stage-3, you'll now see:
+
+```
+❌ MIR compilation error: Undefined variable: local
+Hint: 'local' is a Stage-3 keyword. Enable NYASH_PARSER_STAGE3=1 (and HAKO_PARSER_STAGE3=1 for Stage-B).
+For AotPrep verification, use tools/hakorune_emit_mir.sh which sets these automatically.
+```
+
+### 2. Tokenizer Trace
+
+To see exactly when keywords are being downgraded:
+
+```bash
+NYASH_TOK_TRACE=1 ./target/release/hakorune --backend vm your_file.hako
+```
+
+Output example:
+```
+[tok-stage3] Degrading LOCAL to IDENTIFIER (NYASH_PARSER_STAGE3=false)
+```
+
+## Stage-3 Keywords
+
+The following keywords require `NYASH_PARSER_STAGE3=1`:
+- `local` - Local variable declaration
+- `flow` - Flow control (reserved)
+- `try` - Exception handling
+- `catch` - Exception handling
+- `throw` - Exception handling
+- `while` - Loop construct
+- `for` - Loop construct
+- `in` - Loop/iteration operator
+
+## Code Changes Made
+
+### 1. Builder Error Message Enhancement
+
+**File**: `src/mir/builder.rs:382-406`
+
+Added Stage-3 keyword detection with helpful hints when `parser_stage3()` is disabled.
+
+### 2. Documentation Update
+
+**File**: `lang/src/llvm_ir/boxes/aot_prep/README.md`
+
+Added a "Stage-3 キーワード要件" (Stage-3 keyword requirements) section explaining:
+- Why the Stage-3 ENV variables are needed for AotPrep
+- Recommended usage of `tools/hakorune_emit_mir.sh`
+- Manual ENV variable requirements
+- Diagnostic options
+
+## Related Issues
+
+### CollectionsHot lastIndexOf Error
+
+If you see:
+```
+❌ VM error: Invalid instruction: lastIndexOf expects 1 arg(s), got 2
+```
+
+This is a **separate issue** from Stage-3 keywords. The CollectionsHot pass uses a two-argument `lastIndexOf(needle, start_pos)`, which is not yet implemented in the VM's StringBox.
+
+**Workaround**: Disable CollectionsHot until the VM implementation is updated:
+```bash
+# Omit NYASH_AOT_COLLECTIONS_HOT=1
+tools/hakorune_emit_mir.sh input.hako output.json
+```
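For illustration (not part of this commit): the "prefix + 1-arg pattern" mentioned in the task log above is enough to emulate the missing two-argument form with APIs the VM already has (`substring`, `length`, 1-arg `lastIndexOf`). A minimal sketch; the helper box and method name are hypothetical:

```
// Hypothetical helper: emulate lastIndexOf(needle, start_pos) using 1-arg APIs.
// Only the prefix that could still contain a match beginning at or before
// start_pos is searched.
static box StringCompatBox {
    last_index_of_from(s, needle, start_pos) {
        local bound = start_pos + needle.length()
        if bound > s.length() { bound = s.length() }
        return s.substring(0, bound).lastIndexOf(needle)
    }
}
```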
+## Testing
+
+### Test 1: Error Message Without ENV
+
+```bash
+cat > /tmp/test.hako << 'EOF'
+static box Main {
+    method main(args) {
+        local x
+        x = 42
+        return 0
+    }
+}
+EOF
+
+env -u NYASH_PARSER_STAGE3 -u HAKO_PARSER_STAGE3 \
+  ./target/release/hakorune --backend vm /tmp/test.hako
+```
+
+Expected: Error message with Stage-3 hint
+
+### Test 2: Success With ENV
+
+```bash
+NYASH_PARSER_STAGE3=1 HAKO_PARSER_STAGE3=1 NYASH_PARSER_ALLOW_SEMICOLON=1 \
+  ./target/release/hakorune --backend vm /tmp/test.hako
+```
+
+Expected: Program executes successfully
+
+### Test 3: Tokenizer Trace
+
+```bash
+NYASH_TOK_TRACE=1 env -u NYASH_PARSER_STAGE3 \
+  ./target/release/hakorune --backend vm /tmp/test.hako 2>&1 | grep tok-stage3
+```
+
+Expected: `[tok-stage3] Degrading LOCAL to IDENTIFIER (NYASH_PARSER_STAGE3=false)`
+
+## References
+
+- **Tokenizer Stage-3 Gate**: `src/tokenizer/lex_ident.rs:69-89`
+- **Parser Stage-3 Check**: `src/config/env.rs:495-504`
+- **Builder Error Generation**: `src/mir/builder.rs:382-406`
+- **AotPrep Documentation**: `lang/src/llvm_ir/boxes/aot_prep/README.md`
+- **Emit MIR Script**: `tools/hakorune_emit_mir.sh`
+
+## Summary
+
+✅ **Stage-3 keyword error resolved** - Improved error messages guide users to the fix
+✅ **Documentation updated** - AotPrep README now explains Stage-3 requirements
+✅ **Diagnostic tools available** - `NYASH_TOK_TRACE=1` for tokenizer debugging
+✅ **Recommended workflow** - Use `tools/hakorune_emit_mir.sh` for hassle-free execution
+
+The original issue was environmental configuration, not a bug in CollectionsHot itself. Once the Stage-3 ENV variables are properly set, Stage-B code compiles and executes correctly.
@@ -2,11 +2,35 @@ Exception Handling — Postfix catch / cleanup (Stage-3)

 Summary
 - Nyash adopts a flatter, postfix-first exception style:
-  - try is deprecated. Use postfix `catch` and `cleanup` instead.
+  - There is no `try` statement in the language spec. Use postfix `catch` and `cleanup` instead.
   - `catch` = handle exceptions from the immediately preceding expression/call.
   - `cleanup` = always-run finalization (formerly finally), regardless of success or failure.
   - This matches the language's scope unification and keeps blocks shallow and readable.
+
+Spec Clarifications (Stage-3)
+- Acceptance gates and profiles
+  - Expression-postfix: `NYASH_PARSER_STAGE3=1` enables `expr catch(...) {..} cleanup {..}` on calls/chains.
+  - Block-postfix: `NYASH_BLOCK_CATCH=1` or Stage-3 enables `{ ... } catch(...) {..} cleanup {..}` (standalone block statement).
+  - Method-postfix: `NYASH_METHOD_CATCH=1` or Stage-3 enables method-body postfix on the most recent method.
+- Cardinality and order
+  - Postfix (expr/block/method): at most one `catch` and at most one `cleanup` — in this order. A second `catch` after postfix is a parse error. Multiple `cleanup` are not allowed.
+  - Legacy compatibility: some builds may still accept the historical `try { ... } catch ... cleanup ...` form, but it is not part of the language spec and will be disabled by default. Prefer the postfix forms.
+- Binding and chaining
+  - Postfix binds to the immediately preceding expression (the last call in a chain) or to the just-parsed block/method body. It does not extend to the entire statement unless parentheses are used.
+  - After constructing the postfix `TryCatch`, further method chaining on that expression is not accepted.
+- Semantics and control-flow
+  - `cleanup` (finally) always runs, regardless of success/failure of the try part.
+  - `return` inside the try part is deferred until after `cleanup` executes. This is implemented by the MIR builder as a deferred return slot/jump to the `cleanup`/exit block.
+  - `return` inside `cleanup` is disallowed by default; enable with `NYASH_CLEANUP_ALLOW_RETURN=1`.
+  - `throw` inside `cleanup` is disallowed by default; enable with `NYASH_CLEANUP_ALLOW_THROW=1`.
+  - `break/continue` inside `cleanup` are allowed (no special guard); use with care. Cleanup executes before the loop transfer takes effect.
+  - Nested cleanup follows lexical unwinding order (inner cleanup runs before outer cleanup).
+  - If no `catch` is present, thrown exceptions still trigger `cleanup`, then propagate outward.
+- Diagnostics
+  - Method-postfix: a duplicate postfix after a method body is a parse error: "duplicate postfix catch/cleanup after method".
+  - Block-postfix: a standalone postfix without a preceding block is a parse error: "catch/cleanup must follow a try block or standalone block".
+  - Expression-postfix: only one `catch` is accepted at expression level; a second `catch` triggers a parse error.
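For illustration (not part of this diff), a minimal expression-postfix sketch following the rules above; `FileBox.read` is a hypothetical call:

```
static box Main { method main(args) {
    // Postfix binds to the immediately preceding call (the last call in a chain).
    FileBox.read(args.get(0)) catch(e) {
        print("read failed: " + e)
    } cleanup {
        print("always runs, on success or failure")
    }
    return 0
} }
```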

 Status
 - Phase 1: normalization sugar (existing)
   - `NYASH_CATCH_NEW=1` enables the core normalization pass.
@@ -63,6 +87,14 @@ Semantics
 - cleanup is always executed regardless of success/failure (formerly finally).
 - Multiple catch blocks match by type in order; the first match is taken.
 - In loops, `break/continue` cooperate with cleanup: cleanup is run before leaving the scope.
+- Return deferral: A `return` in the try section defers until after cleanup. `return`/`throw` inside cleanup are disabled by default; see env toggles below.
+
+Environment toggles
+- `NYASH_PARSER_STAGE3=1`: Enable Stage-3 syntax (postfix catch/cleanup for expressions; also gates the others by default)
+- `NYASH_BLOCK_CATCH=1`: Allow block-postfix (independent of Stage-3 if needed)
+- `NYASH_METHOD_CATCH=1`: Allow method-postfix (independent of Stage-3 if needed)
+- `NYASH_CLEANUP_ALLOW_RETURN=1`: Permit `return` inside cleanup (default: off)
+- `NYASH_CLEANUP_ALLOW_THROW=1`: Permit `throw` inside cleanup (default: off)
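For illustration (not part of this diff), a block-postfix sketch of the return-deferral rule (requires `NYASH_BLOCK_CATCH=1` or Stage-3):

```
method demo() {
    {
        print("try part")
        return 1          // deferred: the cleanup below runs first
    } cleanup {
        print("runs before the deferred return takes effect")
    }
}
```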

 Migration notes
 - try is deprecated: prefer postfix `catch/cleanup`.
@@ -427,6 +427,75 @@ static box Main {
         if bundles.length() > 0 || bundle_srcs.length() > 0 || require_mods.length() > 0 {
             local merged_prefix = BundleResolver.resolve(bundles, bundle_names, bundle_srcs, require_mods)
             if merged_prefix == null { return 1 }
+            // Debug: emit line-map for merged bundles so parse error line can be mapped
+            {
+                local dbg = env.get("HAKO_STAGEB_DEBUG")
+                if dbg != null && ("" + dbg) == "1" {
+                    // Count lines helper (inline)
+                    local total = 0
+                    {
+                        local s2 = merged_prefix
+                        if s2 == null { total = 0 } else {
+                            local i=0; local n=(""+s2).length(); local c=1
+                            loop(i<n){ if (""+s2).substring(i,i+1)=="\n" { c=c+1 } i=i+1 }
+                            total = c
+                        }
+                    }
+                    print("[stageb/line-map] prefix total lines=" + total)
+                    // bundle-src (anonymous)
+                    if bundles != null && bundles.length() > 0 {
+                        local i = 0; local acc = 1
+                        loop(i < bundles.length()) {
+                            local seg = "" + bundles.get(i)
+                            local ln = 0
+                            {
+                                local s2 = seg
+                                if s2 == null { ln = 0 } else {
+                                    local ii=0; local nn=(""+s2).length(); local cc=1
+                                    loop(ii<nn){ if (""+s2).substring(ii,ii+1)=="\n" { cc=cc+1 } ii=ii+1 }
+                                    ln = cc
+                                }
+                            }
+                            local start = acc
+                            local finish = acc + ln - 1
+                            print("[stageb/line-map] bundle-src[#" + i + "] " + start + ".." + finish)
+                            acc = finish + 1
+                            i = i + 1
+                        }
+                    }
+                    // bundle-mod (named)
+                    if bundle_names != null && bundle_srcs != null {
+                        local i2 = 0; local acc2 = 1
+                        if bundles != null {
+                            // count lines of joined bundle-src
+                            local joined = bundles.join("\n")
+                            if joined == null { acc2 = 1 } else {
+                                local ii=0; local nn=(""+joined).length(); local cc=1
+                                loop(ii<nn){ if (""+joined).substring(ii,ii+1)=="\n" { cc=cc+1 } ii=ii+1 }
+                                acc2 = cc + 1
+                            }
+                        }
+                        loop(i2 < bundle_srcs.length()) {
+                            local name = "" + bundle_names.get(i2)
+                            local seg = "" + bundle_srcs.get(i2)
+                            local ln = 0
+                            {
+                                local s2 = seg
+                                if s2 == null { ln = 0 } else {
+                                    local ii=0; local nn=(""+s2).length(); local cc=1
+                                    loop(ii<nn){ if (""+s2).substring(ii,ii+1)=="\n" { cc=cc+1 } ii=ii+1 }
+                                    ln = cc
+                                }
+                            }
+                            local start = acc2
+                            local finish = acc2 + ln - 1
+                            print("[stageb/line-map] bundle-mod[name=" + name + "] " + start + ".." + finish)
+                            acc2 = finish + 1
+                            i2 = i2 + 1
+                        }
+                    }
+                }
+            }
             body_src = merged_prefix + body_src
         }
@@ -4,75 +4,35 @@ using selfhost.shared.common.string_helpers as StringHelpers
 using selfhost.llvm.ir.aot_prep.helpers.common as AotPrepHelpers

 static box AotPrepBinopCSEBox {
-    run(json) {
-        if json == null { return null }
-        local pos = 0
-        local out = json
-        local read_field = fun(text, key) {
+    // Static helpers (replace anonymous fun)
+    read_field(text, key) {
         local needle = "\"" + key + "\":\""
         local idx = text.indexOf(needle)
         if idx < 0 { return "" }
         return JsonFragBox.read_string_after(text, idx + needle.length())
     }
-    local read_digits_field = fun(text, key) {
+    read_digits_field(text, key) {
         local needle = "\"" + key + "\":"
         local idx = text.indexOf(needle)
         if idx < 0 { return "" }
         return StringHelpers.read_digits(text, idx + needle.length())
     }
-    loop(true) {
-        local key = "\"instructions\":["
-        local kinst = out.indexOf(key, pos)
-        if kinst < 0 { break }
-        local lb = out.indexOf("[", kinst)
-        if lb < 0 { break }
-        local rb = JsonFragBox._seek_array_end(out, lb)
-        if rb < 0 { break }
-        local body = out.substring(lb+1, rb)
-        local insts = []
-        local i = 0
-        loop(true) {
-            local os = body.indexOf("{", i)
-            if os < 0 { break }
-            local oe = AotPrepHelpers._seek_object_end(body, os)
-            if oe < 0 { break }
-            insts.push(body.substring(os, oe+1))
-            i = oe + 1
-        }
-        local copy_src = {}
-        local new_body = ""
-        local first = 1
-        local append_item = fun(item) {
-            if first == 0 { new_body = new_body + "," }
-            new_body = new_body + item
-            first = 0
-        }
-        for inst in insts {
-            local op = read_field(inst, "op")
-            if op == "copy" {
-                local dst = read_digits_field(inst, "dst")
-                local src = read_digits_field(inst, "src")
-                if dst != "" && src != "" {
-                    copy_src[dst] = src
-                }
-            }
-        }
-        local resolve_copy = fun(vid) {
+    resolve_copy(copy_src, vid) {
         local current = vid
         local depth = 0
         loop(true) {
             if current == "" { break }
-            if !copy_src.contains(current) { break }
+            if !copy_src.has(current) { break }
-            current = copy_src[current]
+            current = copy_src.get(current)
             depth = depth + 1
             if depth >= 12 { break }
         }
         return current
     }
-    local canon_binop = fun(op, lhs, rhs) {
+    canon_binop(op, lhs, rhs, copy_src) {
         if lhs == "" || rhs == "" { return "" }
-        local key_lhs = resolve_copy(lhs)
+        local key_lhs = AotPrepBinopCSEBox.resolve_copy(copy_src, lhs)
-        local key_rhs = resolve_copy(rhs)
+        local key_rhs = AotPrepBinopCSEBox.resolve_copy(copy_src, rhs)
         if key_lhs == "" || key_rhs == "" { return "" }
         if op == "+" || op == "add" || op == "*" || op == "mul" {
             local li = StringHelpers.to_i64(key_lhs)
@@ -85,27 +45,76 @@ static box AotPrepBinopCSEBox {
         }
         return op + ":" + key_lhs + ":" + key_rhs
     }
-    for inst in insts {
-        local op = read_field(inst, "op")
+    run(json) {
+        if json == null { return null }
+        local pos = 0
+        local out = json
+        // Track seen canonicalized binops within each block
+        // Key format: op:lhs:rhs with copy chains resolved and commutativity normalized
+        // Reset per instructions-array span
+        loop(true) {
+            local key = "\"instructions\":["
+            local kinst = out.indexOf(key, pos)
+            if kinst < 0 { break }
+            local lb = out.indexOf("[", kinst)
+            if lb < 0 { break }
+            local rb = JsonFragBox._seek_array_end(out, lb)
+            if rb < 0 { break }
+            local body = out.substring(lb+1, rb)
+            // Pass-1: collect copy sources
+            local copy_src = new MapBox()
+            local seen = new MapBox()
+            local i1 = 0
+            loop(true) {
+                local os = body.indexOf("{", i1)
+                if os < 0 { break }
+                local oe = AotPrepHelpers._seek_object_end(body, os)
+                if oe < 0 { break }
+                local inst1 = body.substring(os, oe+1)
+                local op1 = AotPrepBinopCSEBox.read_field(inst1, "op")
+                if op1 == "copy" {
+                    local dst1 = AotPrepBinopCSEBox.read_digits_field(inst1, "dst")
+                    local src1 = AotPrepBinopCSEBox.read_digits_field(inst1, "src")
+                    if dst1 != "" && src1 != "" { copy_src.set(dst1, src1) }
+                }
+                i1 = oe + 1
+            }
+            // Pass-2: build new body with CSE
+            local new_body = ""
+            local first = 1
+            local i2 = 0
+            loop(true) {
+                local os2 = body.indexOf("{", i2)
+                if os2 < 0 { break }
+                local oe2 = AotPrepHelpers._seek_object_end(body, os2)
+                if oe2 < 0 { break }
+                local inst = body.substring(os2, oe2+1)
+                local op = AotPrepBinopCSEBox.read_field(inst, "op")
                 if op == "binop" {
-                    local operation = read_field(inst, "operation")
+                    local operation = AotPrepBinopCSEBox.read_field(inst, "operation")
-                    local lhs = read_digits_field(inst, "lhs")
+                    local lhs = AotPrepBinopCSEBox.read_digits_field(inst, "lhs")
-                    local rhs = read_digits_field(inst, "rhs")
+                    local rhs = AotPrepBinopCSEBox.read_digits_field(inst, "rhs")
-                    local key = canon_binop(operation, lhs, rhs)
+                    local key = AotPrepBinopCSEBox.canon_binop(operation, lhs, rhs, copy_src)
-                    if key != "" && seen.contains(key) {
+                    if key != "" && seen.has(key) {
-                        local dst = read_digits_field(inst, "dst")
+                        local dst = AotPrepBinopCSEBox.read_digits_field(inst, "dst")
                         if dst != "" {
-                            append_item("{\"op\":\"copy\",\"dst\":" + dst + ",\"src\":" + seen[key] + "}")
+                            local item = "{\"op\":\"copy\",\"dst\":" + dst + ",\"src\":" + seen.get(key) + "}"
+                            if first == 0 { new_body = new_body + "," }
+                            new_body = new_body + item
+                            first = 0
+                            i2 = oe2 + 1 // advance before continue (fix: otherwise this object is re-scanned forever)
                             continue
                         }
                     } else if key != "" {
-                        local dst = read_digits_field(inst, "dst")
+                        local dst = AotPrepBinopCSEBox.read_digits_field(inst, "dst")
                         if dst != "" {
-                            seen[key] = dst
+                            seen.set(key, dst)
                         }
                     }
                 }
-                append_item(inst)
+                if first == 0 { new_body = new_body + "," }
+                new_body = new_body + inst
+                first = 0
+                i2 = oe2 + 1
            }
            out = out.substring(0, lb+1) + new_body + out.substring(rb, out.length())
            pos = lb + new_body.length() + 1
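For illustration (not part of this commit), a minimal driver for the pass above; the JSON shape follows the needles used by the pass, and the `using` alias is an assumption based on the nyash.toml mappings added below:

```
using selfhost.llvm.ir.aot_prep.passes.binop_cse as BinopCSE   // alias assumed; see the [modules] entries below

static box Main { method main(args) {
    // Two identical adds: the second should be rewritten to a copy of the first dst,
    // e.g. {"op":"copy","dst":4,"src":3}.
    local src = "{\"instructions\":[" +
        "{\"op\":\"binop\",\"operation\":\"add\",\"dst\":3,\"lhs\":1,\"rhs\":2}," +
        "{\"op\":\"binop\",\"operation\":\"add\",\"dst\":4,\"lhs\":1,\"rhs\":2}]}"
    local out = BinopCSE.run(src)
    if out == null { print("[cse:fail]"); return 1 }
    print(out)
    return 0
} }
```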
@@ -17,7 +17,7 @@ static box AotPrepConstDedupBox {
         local body = out.substring(lb+1, rb)
         local i = 0
         local new_body = ""
-        local first_vid_by_value = {}
+        local first_vid_by_value = new MapBox()
         loop(i < body.length()) {
             local os = body.indexOf("{", i)
             if os < 0 {
@@ -37,12 +37,12 @@ static box AotPrepConstDedupBox {
             local kval = obj.indexOf("\"value\":{\"type\":\"i64\",\"value\":")
             local vals = (kval>=0 ? StringHelpers.read_digits(obj, kval+30) : "")
             if dsts != "" && vals != "" {
-                if first_vid_by_value.contains(vals) {
+                if first_vid_by_value.has(vals) {
-                    local src = first_vid_by_value[vals]
+                    local src = first_vid_by_value.get(vals)
                     local repl = "{\"op\":\"copy\",\"dst\":" + dsts + ",\"src\":" + src + "}"
                     new_body = new_body + repl
                 } else {
-                    first_vid_by_value[vals] = dsts
+                    first_vid_by_value.set(vals, dsts)
                     new_body = new_body + obj
                 }
             } else {
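For illustration (not part of this commit), the ConstDedup effect on a MIR JSON fragment; the shape is inferred from the needles in the pass above:

```
before: {"op":"const","dst":1,"value":{"type":"i64","value":42}},
        {"op":"const","dst":2,"value":{"type":"i64","value":42}}
after:  {"op":"const","dst":1,"value":{"type":"i64","value":42}},
        {"op":"copy","dst":2,"src":1}
```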
@@ -4,42 +4,29 @@ using selfhost.shared.common.string_helpers as StringHelpers
 using selfhost.llvm.ir.aot_prep.helpers.common as AotPrepHelpers // for evaluate_binop_constant

 static box AotPrepLoopHoistBox {
-    run(json) {
-        if json == null { return null }
-        local out = json
-        local pos = 0
-        local build_items = fun(body) {
-            local items = []
-            local i = 0
-            loop(true) {
-                local os = body.indexOf("{", i)
-                if os < 0 { break }
-                local oe = AotPrepHelpers._seek_object_end(body, os)
-                if oe < 0 { break }
-                items.push(body.substring(os, oe+1))
-                i = oe + 1
-            }
-            return items
-        }
-        local read_field = fun(text, key) {
+    // Static helpers (replace anonymous fun)
+    read_field(text, key) {
         local needle = "\"" + key + "\":\""
         local idx = text.indexOf(needle)
         if idx < 0 { return "" }
         return JsonFragBox.read_string_after(text, idx + needle.length())
     }
-    local read_digits_field = fun(text, key) {
+    read_digits_field(text, key) {
         local needle = "\"" + key + "\":"
         local idx = text.indexOf(needle)
         if idx < 0 { return "" }
         return StringHelpers.read_digits(text, idx + needle.length())
     }
-    local read_const_value = fun(text) {
+    read_const_value(text) {
         local needle = "\"value\":{\"type\":\"i64\",\"value\":"
         local idx = text.indexOf(needle)
         if idx < 0 { return "" }
         return StringHelpers.read_digits(text, idx + needle.length())
     }
-    local ref_fields = ["lhs", "rhs", "cond", "target"]
+    run(json) {
+        if json == null { return null }
+        local out = json
+        local pos = 0
         loop(true) {
             local key = "\"instructions\":["
             local kinst = out.indexOf(key, pos)
@@ -49,62 +36,125 @@ static box AotPrepLoopHoistBox {
             local rb = JsonFragBox._seek_array_end(out, lb)
             if rb < 0 { break }
             local body = out.substring(lb+1, rb)
-            local insts = build_items(body)
-            local const_defs = {}
-            local const_vals = {}
-            for inst in insts {
-                local op = read_field(inst, "op")
-                if op == "const" {
-                    local dst = read_digits_field(inst, "dst")
-                    local val = read_const_value(inst)
-                    if dst != "" && val != "" { const_defs[dst] = inst; const_vals[dst] = val }
+            local const_defs = new MapBox()
+            local const_vals = new MapBox()
+            // Pass-1: collect const defs
+            {
+                local i0 = 0
+                loop(true) {
+                    local os0 = body.indexOf("{", i0)
+                    if os0 < 0 { break }
+                    local oe0 = AotPrepHelpers._seek_object_end(body, os0)
+                    if oe0 < 0 { break }
+                    local inst0 = body.substring(os0, oe0+1)
+                    local op0 = AotPrepLoopHoistBox.read_field(inst0, "op")
+                    if op0 == "const" {
+                        local dst0 = AotPrepLoopHoistBox.read_digits_field(inst0, "dst")
+                        local val0 = AotPrepLoopHoistBox.read_const_value(inst0)
+                        if dst0 != "" && val0 != "" { const_defs.set(dst0, inst0); const_vals.set(dst0, val0) }
+                    }
+                    i0 = oe0 + 1
                 }
             }
             local folded = true
             while folded {
                 folded = false
-                for inst in insts {
-                    local op = read_field(inst, "op")
-                    if op != "binop" { continue }
-                    local dst = read_digits_field(inst, "dst")
-                    if dst == "" || const_vals.contains(dst) { continue }
-                    local lhs = read_digits_field(inst, "lhs")
-                    local rhs = read_digits_field(inst, "rhs")
-                    local operation = read_field(inst, "operation")
-                    if lhs == "" || rhs == "" || operation == "" { continue }
-                    local lhs_val = const_vals.contains(lhs) ? const_vals[lhs] : ""
-                    local rhs_val = const_vals.contains(rhs) ? const_vals[rhs] : ""
-                    if lhs_val == "" || rhs_val == "" { continue }
-                    local computed = AotPrepHelpers.evaluate_binop_constant(operation, lhs_val, rhs_val)
-                    if computed == "" { continue }
-                    const_defs[dst] = inst
-                    const_vals[dst] = computed
-                    folded = true
+                local i1 = 0
+                loop(true) {
+                    local os1 = body.indexOf("{", i1)
+                    if os1 < 0 { break }
+                    local oe1 = AotPrepHelpers._seek_object_end(body, os1)
+                    if oe1 < 0 { break }
+                    local inst1 = body.substring(os1, oe1+1)
+                    local op1 = AotPrepLoopHoistBox.read_field(inst1, "op")
+                    if op1 == "binop" {
+                        local dst1 = AotPrepLoopHoistBox.read_digits_field(inst1, "dst")
+                        if dst1 != "" && !const_vals.has(dst1) {
+                            local lhs1 = AotPrepLoopHoistBox.read_digits_field(inst1, "lhs")
+                            local rhs1 = AotPrepLoopHoistBox.read_digits_field(inst1, "rhs")
+                            local operation1 = AotPrepLoopHoistBox.read_field(inst1, "operation")
+                            if lhs1 != "" && rhs1 != "" && operation1 != "" {
+                                local lhs_val1 = const_vals.has(lhs1) ? const_vals.get(lhs1) : ""
+                                local rhs_val1 = const_vals.has(rhs1) ? const_vals.get(rhs1) : ""
+                                if lhs_val1 != "" && rhs_val1 != "" {
+                                    local computed1 = AotPrepHelpers.evaluate_binop_constant(operation1, lhs_val1, rhs_val1)
+                                    if computed1 != "" {
+                                        const_defs.set(dst1, inst1)
+                                        const_vals.set(dst1, computed1)
+                                        folded = true
+                                    }
+                                }
+                            }
+                        }
+                    }
+                    i1 = oe1 + 1
                 }
             }
-            local needed = {}
-            for inst in insts {
-                local op = read_field(inst, "op")
-                if op == "const" { continue }
-                for field in ref_fields {
-                    local ref = read_digits_field(inst, field)
-                    if ref != "" && const_defs.contains(ref) { needed[ref] = true }
+            local needed = new MapBox()
+            {
+                local i2 = 0
+                loop(true) {
+                    local os2 = body.indexOf("{", i2)
+                    if os2 < 0 { break }
+                    local oe2 = AotPrepHelpers._seek_object_end(body, os2)
+                    if oe2 < 0 { break }
+                    local inst2 = body.substring(os2, oe2+1)
+                    local op2 = AotPrepLoopHoistBox.read_field(inst2, "op")
+                    if op2 != "const" {
+                        // check lhs, rhs, cond, target
+                        local rf = AotPrepLoopHoistBox.read_digits_field(inst2, "lhs")
+                        if rf != "" && const_defs.has(rf) { needed.set(rf, true) }
+                        rf = AotPrepLoopHoistBox.read_digits_field(inst2, "rhs")
+                        if rf != "" && const_defs.has(rf) { needed.set(rf, true) }
+                        rf = AotPrepLoopHoistBox.read_digits_field(inst2, "cond")
+                        if rf != "" && const_defs.has(rf) { needed.set(rf, true) }
+                        rf = AotPrepLoopHoistBox.read_digits_field(inst2, "target")
+                        if rf != "" && const_defs.has(rf) { needed.set(rf, true) }
+                    }
+                    i2 = oe2 + 1
                 }
             }
             if needed.size() == 0 { pos = rb + 1; continue }
-            local hoist_items = []
-            local keep_items = []
-            for inst in insts {
-                local dst = read_digits_field(inst, "dst")
-                if dst != "" && needed.contains(dst) && const_defs.contains(dst) { hoist_items.push(inst); continue }
-                keep_items.push(inst)
-            }
-            if hoist_items.size() == 0 { pos = rb + 1; continue }
+            // Build merged: hoist first, then keep (two scans)
+            local any_hoist = 0
             local merged = ""
             local first = 1
-            local append_item = fun(item) { if first == 0 { merged = merged + "," } merged = merged + item; first = 0 }
-            for item in hoist_items { append_item(item) }
-            for item in keep_items { append_item(item) }
+            {
+                local i3 = 0
+                loop(true) {
+                    local os3 = body.indexOf("{", i3)
+                    if os3 < 0 { break }
+                    local oe3 = AotPrepHelpers._seek_object_end(body, os3)
+                    if oe3 < 0 { break }
+                    local inst3 = body.substring(os3, oe3+1)
+                    local dst3 = AotPrepLoopHoistBox.read_digits_field(inst3, "dst")
+                    if dst3 != "" && needed.has(dst3) && const_defs.has(dst3) {
+                        if first == 0 { merged = merged + "," }
+                        merged = merged + inst3
+                        first = 0
+                        any_hoist = 1
+                    }
+                    i3 = oe3 + 1
+                }
+            }
+            if any_hoist == 0 { pos = rb + 1; continue }
+            {
+                local i4 = 0
+                loop(true) {
+                    local os4 = body.indexOf("{", i4)
+                    if os4 < 0 { break }
+                    local oe4 = AotPrepHelpers._seek_object_end(body, os4)
+                    if oe4 < 0 { break }
+                    local inst4 = body.substring(os4, oe4+1)
+                    local dst4 = AotPrepLoopHoistBox.read_digits_field(inst4, "dst")
+                    if !(dst4 != "" && needed.has(dst4) && const_defs.has(dst4)) {
+                        if first == 0 { merged = merged + "," }
+                        merged = merged + inst4
+                        first = 0
+                    }
+                    i4 = oe4 + 1
+                }
+            }
             out = out.substring(0, lb+1) + merged + out.substring(rb, out.length())
             pos = lb + merged.length() + 1
         }
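For illustration (not part of this commit), the LoopHoist reordering on a MIR JSON fragment; the shape is inferred from the pass above (consts referenced via lhs/rhs/cond/target move to the front of the array, everything else keeps its original order):

```
before: {"op":"binop","operation":"add","dst":5,"lhs":3,"rhs":4},
        {"op":"const","dst":3,"value":{"type":"i64","value":1}},
        {"op":"const","dst":4,"value":{"type":"i64","value":2}}
after:  {"op":"const","dst":3,"value":{"type":"i64","value":1}},
        {"op":"const","dst":4,"value":{"type":"i64","value":2}},
        {"op":"binop","operation":"add","dst":5,"lhs":3,"rhs":4}
```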
nyash.toml (10 lines added)
@@ -214,6 +214,16 @@ path = "lang/src/shared/common/string_helpers.hako"
 # Phase 20.34 — Box-First selfhost build line (aliases for Hako boxes)
 "hako.mir.builder" = "lang/src/mir/builder/MirBuilderBox.hako"
 "hako.mir.builder.min" = "lang/src/mir/builder/MirBuilderMinBox.hako"
+"selfhost.llvm.ir.aot_prep" = "lang/src/llvm_ir/boxes/aot_prep.hako"
+"selfhost.llvm.ir.aot_prep.helpers.common" = "lang/src/llvm_ir/boxes/aot_prep/helpers/common.hako"
+"selfhost.llvm.ir.aot_prep.passes.strlen" = "lang/src/llvm_ir/boxes/aot_prep/passes/strlen.hako"
+"selfhost.llvm.ir.aot_prep.passes.loop_hoist" = "lang/src/llvm_ir/boxes/aot_prep/passes/loop_hoist.hako"
+"selfhost.llvm.ir.aot_prep.passes.const_dedup" = "lang/src/llvm_ir/boxes/aot_prep/passes/const_dedup.hako"
+"selfhost.llvm.ir.aot_prep.passes.binop_cse" = "lang/src/llvm_ir/boxes/aot_prep/passes/binop_cse.hako"
+"selfhost.llvm.ir.aot_prep.passes.collections_hot" = "lang/src/llvm_ir/boxes/aot_prep/passes/collections_hot.hako"
+"selfhost.llvm.ir.aot_prep.passes.fold_const_ret" = "lang/src/llvm_ir/boxes/aot_prep/passes/fold_const_ret.hako"
+"selfhost.shared.json.core.json_canonical" = "lang/src/shared/json/json_canonical_box.hako"
+"selfhost.shared.common.common_imports" = "lang/src/shared/common/common_imports.hako"
 "hako.mir.builder.pattern_registry" = "lang/src/mir/builder/pattern_registry.hako"
 "hako.using.resolve.ssot" = "lang/src/using/resolve_ssot_box.hako"
 "hako.llvm.emit" = "lang/src/llvm_ir/emit/LLVMEmitBox.hako"
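These [modules] entries are what the inline AotPrep runners in the emit script (further below) rely on to resolve their `using` lines. For illustration (not part of this commit), a minimal consumer; it assumes the alias can be called like the box it maps to, which is the same pattern the script uses:

```
using selfhost.llvm.ir.aot_prep as AotPrepBox

static box Main { method main(args) {
    // Run the whole AotPrep pipeline over a MIR JSON string (no-op on an empty array).
    local out = AotPrepBox.run_json("{\"instructions\":[]}")
    if out == null { print("[prep:fail]"); return 1 }
    print(out)
    return 0
} }
```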
@@ -385,6 +385,16 @@ impl MirBuilder {
         } else {
             // Enhance diagnostics using Using simple registry (Phase 1)
             let mut msg = format!("Undefined variable: {}", name);
+
+            // Stage-3 keyword diagnostic (local/flow/try/catch/throw)
+            if name == "local" && !crate::config::env::parser_stage3() {
+                msg.push_str("\nHint: 'local' is a Stage-3 keyword. Enable NYASH_PARSER_STAGE3=1 (and HAKO_PARSER_STAGE3=1 for Stage-B).");
+                msg.push_str("\nFor AotPrep verification, use tools/hakorune_emit_mir.sh which sets these automatically.");
+            } else if (name == "flow" || name == "try" || name == "catch" || name == "throw")
+                && !crate::config::env::parser_stage3() {
+                msg.push_str(&format!("\nHint: '{}' is a Stage-3 keyword. Enable NYASH_PARSER_STAGE3=1 (and HAKO_PARSER_STAGE3=1 for Stage-B).", name));
+            }
+
             let suggest = crate::using::simple_registry::suggest_using_for_symbol(&name);
             if !suggest.is_empty() {
                 msg.push_str("\nHint: symbol appears in using module(s): ");
src/runner/modes/common_util/resolve/context.rs (new file, 25 lines)
@@ -0,0 +1,25 @@
+//! Resolve context — capture per-thread prelude merge context for enriched diagnostics.
+use std::cell::RefCell;
+
+thread_local! {
+    static LAST_MERGED_PRELUDES: RefCell<Vec<String>> = RefCell::new(Vec::new());
+}
+
+/// Record the list of prelude file paths used for the last text merge in this thread.
+pub fn set_last_merged_preludes(paths: Vec<String>) {
+    LAST_MERGED_PRELUDES.with(|c| {
+        *c.borrow_mut() = paths;
+    });
+}
+
+/// Get a clone of the last recorded prelude file paths (if any).
+pub fn clone_last_merged_preludes() -> Vec<String> {
+    LAST_MERGED_PRELUDES.with(|c| c.borrow().clone())
+}
+
+/// Take and clear the last recorded prelude file paths.
+#[allow(dead_code)]
+pub fn take_last_merged_preludes() -> Vec<String> {
+    LAST_MERGED_PRELUDES.with(|c| std::mem::take(&mut *c.borrow_mut()))
+}
@@ -25,6 +25,7 @@ pub mod using_resolution;
 pub mod prelude_manager;
 pub mod selfhost_pipeline;
 pub mod path_util;
+pub mod context;

 // 📦 Expose the box-ified modules, nya!
 pub use using_resolution::{
@@ -54,3 +55,9 @@ pub use strip::{
     merge_prelude_asts_with_main,
     merge_prelude_text,
 };
+
+// Expose context helpers for enhanced diagnostics
+pub use context::{
+    set_last_merged_preludes,
+    clone_last_merged_preludes,
+};
@@ -777,6 +777,8 @@ pub fn merge_prelude_text(
         dfs_text(runner, p, &mut expanded, &mut seen)?;
     }
     let prelude_paths = &expanded;
+    // Record for enriched diagnostics (parse error context)
+    crate::runner::modes::common_util::resolve::set_last_merged_preludes(prelude_paths.clone());

     if prelude_paths.is_empty() {
         // No using statements, return original
@@ -66,6 +66,18 @@ impl NyashRunner {
             Ok(ast) => ast,
             Err(e) => {
                 eprintln!("❌ Parse error in {}: {}", filename, e);
+                // Enhanced context: list merged prelude files if any (from text-merge path)
+                let preludes = crate::runner::modes::common_util::resolve::clone_last_merged_preludes();
+                if !preludes.is_empty() {
+                    eprintln!("[parse/context] merged prelude files ({}):", preludes.len());
+                    let show = std::cmp::min(16, preludes.len());
+                    for p in preludes.iter().take(show) {
+                        eprintln!("  - {}", p);
+                    }
+                    if preludes.len() > show {
+                        eprintln!("  ... ({} more)", preludes.len() - show);
+                    }
+                }
                 process::exit(1);
             }
         };
@@ -183,6 +183,18 @@ impl NyashRunner {
             Ok(ast) => ast,
             Err(e) => {
                 eprintln!("❌ Parse error in {}: {}", filename, e);
+                // Enhanced context: list merged prelude files if any
+                let preludes = crate::runner::modes::common_util::resolve::clone_last_merged_preludes();
+                if !preludes.is_empty() {
+                    eprintln!("[parse/context] merged prelude files ({}):", preludes.len());
+                    let show = std::cmp::min(16, preludes.len());
+                    for p in preludes.iter().take(show) {
+                        eprintln!("  - {}", p);
+                    }
+                    if preludes.len() > show {
+                        eprintln!("  ... ({} more)", preludes.len() - show);
+                    }
+                }
                 process::exit(1);
             }
         };
@ -370,37 +370,62 @@ HCODE
|
|||||||
# Write raw MIR JSON first
|
# Write raw MIR JSON first
|
||||||
printf '%s' "$mir" > "$out_path"
|
printf '%s' "$mir" > "$out_path"
|
||||||
|
|
||||||
# Optional AOT prep stage (text-level, no FileBox required for JSON-in/out function)
|
# Optional AOT prep stage (run_json; no FileBox)
|
||||||
# Run only when fast/hoist/collections_hot are requested to avoid unnecessary overhead.
|
# Trigger when HAKO_APPLY_AOT_PREP=1 or when fast/hoist/collections_hot are requested.
|
||||||
if [ "${NYASH_AOT_COLLECTIONS_HOT:-0}" = "1" ] || [ "${NYASH_LLVM_FAST:-0}" = "1" ] || [ "${NYASH_MIR_LOOP_HOIST:-0}" = "1" ]; then
|
if [ "${HAKO_APPLY_AOT_PREP:-0}" = "1" ] || [ "${NYASH_AOT_COLLECTIONS_HOT:-0}" = "1" ] || [ "${NYASH_LLVM_FAST:-0}" = "1" ] || [ "${NYASH_MIR_LOOP_HOIST:-0}" = "1" ]; then
|
||||||
if [ "${HAKO_SELFHOST_TRACE:-0}" = "1" ]; then
|
if [ "${HAKO_SELFHOST_TRACE:-0}" = "1" ]; then
|
||||||
echo "[provider/emit:trace] Applying AotPrep passes to MIR JSON..." >&2
|
echo "[provider/emit:trace] Applying AotPrep.run_json to MIR JSON..." >&2
|
||||||
fi
|
fi
|
||||||
_prep_hako=$(mktemp --suffix .hako)
|
_prep_hako=$(mktemp --suffix .hako)
|
||||||
cat > "$_prep_hako" <<'HAKO'
|
cat > "$_prep_hako" <<'HAKO'
|
||||||
using selfhost.llvm.ir.aot_prep as AotPrepBox
|
using selfhost.llvm.ir.aot_prep as AotPrepBox
|
||||||
static box Main { method main(args) {
|
static box Main { method main(args) {
|
||||||
local in = args.get(0)
|
local src = env.get("HAKO_PREP_INPUT")
|
||||||
// Prefer file-path based prep to avoid huge argv issues; FileBox is core-ro in this runner
|
if src == null || src == "" { print("[prep:skip:empty]"); return 0 }
|
||||||
local out = AotPrepBox.prep(in)
|
local out = AotPrepBox.run_json(src)
|
||||||
if out == null { println("[prep:fail]"); return 1 }
|
print("[PREP_OUT_BEGIN]")
|
||||||
println(out)
|
print(out)
|
||||||
|
print("[PREP_OUT_END]")
|
||||||
return 0
|
return 0
|
||||||
} }
|
} }
|
||||||
HAKO
|
HAKO
|
||||||
|
# Read MIR JSON and pass via env; capture between markers
|
||||||
|
_prep_stdout=$(mktemp)
|
||||||
|
_prep_stderr=$(mktemp)
|
||||||
set +e
|
set +e
|
||||||
_prep_out=$(NYASH_ENABLE_USING=1 HAKO_ENABLE_USING=1 NYASH_FILEBOX_MODE=core-ro \
|
HAKO_PREP_INPUT="$(cat "$out_path")" \
|
||||||
|
NYASH_FILEBOX_MODE=core-ro \
|
||||||
|
NYASH_PARSER_STAGE3=1 HAKO_PARSER_STAGE3=1 NYASH_PARSER_ALLOW_SEMICOLON=1 \
|
||||||
|
NYASH_ENABLE_USING=1 HAKO_ENABLE_USING=1 HAKO_USING_RESOLVER_FIRST=1 \
|
||||||
NYASH_AOT_COLLECTIONS_HOT=${NYASH_AOT_COLLECTIONS_HOT:-0} NYASH_LLVM_FAST=${NYASH_LLVM_FAST:-0} NYASH_MIR_LOOP_HOIST=${NYASH_MIR_LOOP_HOIST:-0} NYASH_AOT_MAP_KEY_MODE=${NYASH_AOT_MAP_KEY_MODE:-auto} \
|
NYASH_AOT_COLLECTIONS_HOT=${NYASH_AOT_COLLECTIONS_HOT:-0} NYASH_LLVM_FAST=${NYASH_LLVM_FAST:-0} NYASH_MIR_LOOP_HOIST=${NYASH_MIR_LOOP_HOIST:-0} NYASH_AOT_MAP_KEY_MODE=${NYASH_AOT_MAP_KEY_MODE:-auto} \
|
||||||
"$NYASH_BIN" --backend vm "$_prep_hako" -- "$out_path" 2>/dev/null | tail -n 1)
|
NYASH_JSON_ONLY=${NYASH_JSON_ONLY:-1} \
|
||||||
|
"$NYASH_BIN" --backend vm "$_prep_hako" >"$_prep_stdout" 2>"$_prep_stderr"
|
||||||
_rc=$?
|
_rc=$?
|
||||||
set -e
|
set -e
|
||||||
if [ $_rc -eq 0 ] && [ -f "$_prep_out" ]; then
|
if [ "${HAKO_SELFHOST_TRACE:-0}" = "1" ]; then
|
||||||
mv -f "$_prep_out" "$out_path"
|
echo "[provider/emit:trace] AotPrep runner rc=$_rc" >&2
|
||||||
[ "${HAKO_SELFHOST_TRACE:-0}" = "1" ] && echo "[provider/emit:trace] AotPrep applied successfully" >&2
|
|
||||||
else
|
|
||||||
[ "${HAKO_SELFHOST_TRACE:-0}" = "1" ] && echo "[provider/emit:trace] AotPrep skipped or failed (rc=$_rc)" >&2
|
|
||||||
fi
|
fi
|
||||||
rm -f "$_prep_hako" 2>/dev/null || true
|
if [ $_rc -eq 0 ] && grep -q "\[PREP_OUT_BEGIN\]" "$_prep_stdout" && grep -q "\[PREP_OUT_END\]" "$_prep_stdout"; then
|
||||||
|
awk '/\[PREP_OUT_BEGIN\]/{flag=1;next}/\[PREP_OUT_END\]/{flag=0}flag' "$_prep_stdout" > "$out_path"
|
||||||
|
[ "${HAKO_SELFHOST_TRACE:-0}" = "1" ] && echo "[provider/emit:trace] AotPrep applied (run_json)" >&2
|
||||||
|
# Optional: surface CollectionsHot trace lines for diagnostics when requested
|
||||||
|
if [ "${NYASH_AOT_CH_TRACE:-0}" = "1" ]; then
|
||||||
|
if command -v rg >/dev/null 2>&1; then
|
||||||
|
rg -n '^\[aot/collections_hot\]' "$_prep_stdout" >&2 || true
|
||||||
|
else
|
||||||
|
grep '^\[aot/collections_hot\]' "$_prep_stdout" >&2 || true
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
else
|
||||||
|
if [ "${HAKO_SELFHOST_TRACE:-0}" = "1" ]; then
|
||||||
|
echo "[provider/emit:trace] AotPrep skipped or failed (run_json) rc=$_rc" >&2
|
||||||
|
if [ -s "$_prep_stderr" ]; then
|
||||||
|
echo "[provider/emit:trace] AotPrep stderr (tail):" >&2
|
||||||
|
tail -n 60 "$_prep_stderr" >&2 || true
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
rm -f "$_prep_hako" "$_prep_stdout" "$_prep_stderr" 2>/dev/null || true
|
||||||
fi
|
fi
|
||||||
|
|
||||||
echo "[OK] MIR JSON written (delegate:provider): $out_path"
|
echo "[OK] MIR JSON written (delegate:provider): $out_path"
|
||||||
@ -497,7 +522,7 @@ HCODE
|
|||||||
HAKO_MIR_BUILDER_NORMALIZE_TAG="${HAKO_MIR_BUILDER_NORMALIZE_TAG:-}" \
|
HAKO_MIR_BUILDER_NORMALIZE_TAG="${HAKO_MIR_BUILDER_NORMALIZE_TAG:-}" \
|
||||||
HAKO_MIR_BUILDER_DEBUG="${HAKO_MIR_BUILDER_DEBUG:-}" \
|
HAKO_MIR_BUILDER_DEBUG="${HAKO_MIR_BUILDER_DEBUG:-}" \
|
||||||
NYASH_DISABLE_PLUGINS="${NYASH_DISABLE_PLUGINS:-0}" NYASH_FILEBOX_MODE="core-ro" HAKO_PROVIDER_POLICY="safe-core-first" \
|
NYASH_DISABLE_PLUGINS="${NYASH_DISABLE_PLUGINS:-0}" NYASH_FILEBOX_MODE="core-ro" HAKO_PROVIDER_POLICY="safe-core-first" \
|
||||||
NYASH_ENABLE_USING=1 HAKO_ENABLE_USING=1 \
|
NYASH_ENABLE_USING=1 HAKO_ENABLE_USING=1 HAKO_USING_RESOLVER_FIRST=1 \
|
||||||
NYASH_PARSER_STAGE3=1 HAKO_PARSER_STAGE3=1 NYASH_PARSER_ALLOW_SEMICOLON=1 \
|
NYASH_PARSER_STAGE3=1 HAKO_PARSER_STAGE3=1 NYASH_PARSER_ALLOW_SEMICOLON=1 \
|
||||||
NYASH_USE_NY_COMPILER=0 HAKO_USE_NY_COMPILER=0 NYASH_DISABLE_NY_COMPILER=1 HAKO_DISABLE_NY_COMPILER=1 \
|
NYASH_USE_NY_COMPILER=0 HAKO_USE_NY_COMPILER=0 NYASH_DISABLE_NY_COMPILER=1 HAKO_DISABLE_NY_COMPILER=1 \
|
||||||
NYASH_MACRO_DISABLE=1 HAKO_MACRO_DISABLE=1 \
|
NYASH_MACRO_DISABLE=1 HAKO_MACRO_DISABLE=1 \
|
||||||
@ -523,6 +548,16 @@ HCODE
|
|||||||
head -n 20 "$tmp_stderr" >&2 || true
|
head -n 20 "$tmp_stderr" >&2 || true
|
||||||
echo "[builder/selfhost-first:fail:stderr] Last 40 lines:" >&2
|
echo "[builder/selfhost-first:fail:stderr] Last 40 lines:" >&2
|
||||||
tail -n 40 "$tmp_stderr" >&2 || true
|
tail -n 40 "$tmp_stderr" >&2 || true
|
||||||
|
# Pretty diagnostics for missing using modules
|
||||||
|
USING_MISSING=$(cat "$tmp_stdout" "$tmp_stderr" 2>/dev/null | grep -Eo "\[using\] not found: '[^']+'" | sort -u || true)
|
||||||
|
if [ -n "$USING_MISSING" ]; then
|
||||||
|
echo "[builder/selfhost-first:diagnose] Missing using modules detected:" >&2
|
||||||
|
echo "$USING_MISSING" >&2
|
||||||
|
echo "[builder/selfhost-first:diagnose] Hint: enable resolver-first (HAKO_USING_RESOLVER_FIRST=1) and ensure nyash.toml maps these modules." >&2
|
||||||
|
echo "[builder/selfhost-first:diagnose] Example entries (nyash.toml [modules]):" >&2
|
||||||
|
echo " \"selfhost.shared.json.core.json_canonical\" = \"lang/src/shared/json/json_canonical_box.hako\"" >&2
|
||||||
|
echo " \"selfhost.shared.common.common_imports\" = \"lang/src/shared/common/common_imports.hako\"" >&2
|
||||||
|
fi
|
||||||
fi
|
fi
|
||||||
fi
|
fi
|
||||||
# Don't return immediately - check for fallback below
|
# Don't return immediately - check for fallback below
|
||||||
@@ -597,6 +632,7 @@ HCODE
     NYASH_DISABLE_PLUGINS="${NYASH_DISABLE_PLUGINS:-0}" NYASH_FILEBOX_MODE="core-ro" \
     NYASH_PARSER_STAGE3=1 HAKO_PARSER_STAGE3=1 NYASH_PARSER_ALLOW_SEMICOLON=1 \
     HAKO_BUILDER_PROGRAM_JSON="$prog_json" \
+    NYASH_ENABLE_USING=1 HAKO_ENABLE_USING=1 HAKO_USING_RESOLVER_FIRST=1 \
     "$NYASH_BIN" --backend vm "$tmp_hako" 2>"$tmp_stderr" | tee "$tmp_stdout" >/dev/null)
   local rc=$?
   set -e
@@ -616,6 +652,12 @@ HCODE
     fi
     echo "[provider/emit:fail:stdout] Last 40 lines:" >&2
     tail -n 40 "$tmp_stdout" >&2 || true
+    USING_MISSING=$(cat "$tmp_stdout" "$tmp_stderr" 2>/dev/null | grep -Eo "\[using\] not found: '[^']+'" | sort -u || true)
+    if [ -n "$USING_MISSING" ]; then
+      echo "[provider/emit:diagnose] Missing using modules detected:" >&2
+      echo "$USING_MISSING" >&2
+      echo "[provider/emit:diagnose] Hint: enable resolver-first (HAKO_USING_RESOLVER_FIRST=1) and ensure nyash.toml maps these modules." >&2
+    fi
   fi
   return 1
 fi
@@ -629,7 +671,52 @@ HCODE
     return 1
   fi
 
+  # Write raw MIR JSON first
   printf '%s' "$mir" > "$out_path"
+
+  # Apply AotPrep via run_json when enabled
+  if [ "${HAKO_APPLY_AOT_PREP:-0}" = "1" ] || [ "${NYASH_AOT_COLLECTIONS_HOT:-0}" = "1" ] || [ "${NYASH_LLVM_FAST:-0}" = "1" ] || [ "${NYASH_MIR_LOOP_HOIST:-0}" = "1" ]; then
+    [ "${HAKO_SELFHOST_TRACE:-0}" = "1" ] && echo "[provider/emit:trace] Applying AotPrep(run_json)..." >&2
+    local aot_runner; aot_runner=$(mktemp --suffix=.hako)
+    cat > "$aot_runner" << 'EOF'
+using selfhost.llvm.ir.aot_prep as AotPrepBox
+static box Main { method main(args) {
+  local src = env.get("HAKO_PREP_INPUT")
+  if src == null || src == "" { print("[prep:skip:empty]"); return 0 }
+  local out = AotPrepBox.run_json(src)
+  print("[PREP_OUT_BEGIN]")
+  print(out)
+  print("[PREP_OUT_END]")
+  return 0
+} }
+EOF
+    local aot_rc=0
+    local prep_stdout; prep_stdout=$(mktemp)
+    local prep_stderr; prep_stderr=$(mktemp)
+    set +e
+    HAKO_PREP_INPUT="$(cat "$out_path")" \
+    NYASH_FILEBOX_MODE=core-ro \
+    NYASH_PARSER_STAGE3=1 HAKO_PARSER_STAGE3=1 NYASH_PARSER_ALLOW_SEMICOLON=1 \
+    NYASH_ENABLE_USING=1 HAKO_ENABLE_USING=1 HAKO_USING_RESOLVER_FIRST=1 \
+    NYASH_AOT_COLLECTIONS_HOT=${NYASH_AOT_COLLECTIONS_HOT:-0} NYASH_LLVM_FAST=${NYASH_LLVM_FAST:-0} NYASH_MIR_LOOP_HOIST=${NYASH_MIR_LOOP_HOIST:-0} NYASH_AOT_MAP_KEY_MODE=${NYASH_AOT_MAP_KEY_MODE:-auto} \
+    "$NYASH_BIN" --backend vm "$aot_runner" >"$prep_stdout" 2>"$prep_stderr"
+    aot_rc=$?
+    set -e
+    if [ $aot_rc -eq 0 ] && grep -q "\[PREP_OUT_BEGIN\]" "$prep_stdout" && grep -q "\[PREP_OUT_END\]" "$prep_stdout"; then
+      awk '/\[PREP_OUT_BEGIN\]/{flag=1;next}/\[PREP_OUT_END\]/{flag=0}flag' "$prep_stdout" > "$out_path"
+      [ "${HAKO_SELFHOST_TRACE:-0}" = "1" ] && echo "[prep:ok] AotPrep applied (run_json)" >&2
+    else
+      if [ "${HAKO_SELFHOST_TRACE:-0}" = "1" ]; then
+        echo "[prep:warn] AotPrep failed (rc=$aot_rc), using original MIR" >&2
+        if [ -s "$prep_stderr" ]; then
+          echo "[prep:stderr:tail]" >&2
+          tail -n 60 "$prep_stderr" >&2 || true
+        fi
+      fi
+    fi
+    rm -f "$aot_runner" "$prep_stdout" "$prep_stderr" 2>/dev/null || true
+  fi
+
   echo "[OK] MIR JSON written (delegate:provider): $out_path"
   return 0
 }
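Note: the run_json stage talks to the VM runner purely over stdout markers; a minimal sketch of the same awk extraction on illustrative output (the JSON payload here is made up):

    $ printf '[prep:ok]\n[PREP_OUT_BEGIN]\n{"kind":"module"}\n[PREP_OUT_END]\n' \
        | awk '/\[PREP_OUT_BEGIN\]/{flag=1;next}/\[PREP_OUT_END\]/{flag=0}flag'
    {"kind":"module"}

Everything outside the BEGIN/END pair (traces, warnings) is dropped, so only the rewritten MIR JSON reaches $out_path.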
tools/perf/microbench.sh
@@ -5,9 +5,9 @@ SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
 ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
 BIN="$ROOT/target/release/hakorune"
 
-usage() { echo "Usage: $0 --case {loop|strlen|box|branch|call|stringchain|arraymap|chip8|kilo|sieve|matmul|linidx|maplin} [--n N] [--runs R] [--backend {llvm|vm}] [--exe]"; }
+usage() { echo "Usage: $0 --case {loop|strlen|box|branch|call|stringchain|arraymap|chip8|kilo|sieve|matmul|linidx|maplin} [--n N] [--runs R] [--backend {llvm|vm}] [--exe] [--budget-ms B]"; }
 
-CASE="loop"; N=5000000; RUNS=5; BACKEND="llvm"; EXE_MODE=0
+CASE="loop"; N=5000000; RUNS=5; BACKEND="llvm"; EXE_MODE=0; BUDGET_MS=0
 while [[ $# -gt 0 ]]; do
   case "$1" in
     --case) CASE="$2"; shift 2;;
@@ -15,6 +15,7 @@ while [[ $# -gt 0 ]]; do
     --runs) RUNS="$2"; shift 2;;
     --backend) BACKEND="$2"; shift 2;;
     --exe) EXE_MODE=1; shift 1;;
+    --budget-ms) BUDGET_MS="$2"; shift 2;;
     --help|-h) usage; exit 0;;
     *) echo "Unknown arg: $1"; usage; exit 2;;
   esac
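Usage sketch for the new flag (case name and budget value are illustrative):

    # fixed run count (existing behavior)
    tools/perf/microbench.sh --case matmul --exe --runs 5
    # time budget: keep re-running until ~5s of Hakorune-side time is spent
    tools/perf/microbench.sh --case matmul --exe --budget-ms 5000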
@@ -46,6 +47,7 @@ bench_hako() {
   fi
   PYTHONPATH="${PYTHONPATH:-$ROOT}" \
   NYASH_AOT_COLLECTIONS_HOT=1 NYASH_LLVM_FAST=1 NYASH_MIR_LOOP_HOIST=1 NYASH_AOT_MAP_KEY_MODE=auto \
+  NYASH_ENABLE_USING=1 HAKO_ENABLE_USING=1 HAKO_USING_RESOLVER_FIRST=1 \
   NYASH_NY_LLVM_COMPILER="${NYASH_NY_LLVM_COMPILER:-$ROOT/target/release/ny-llvmc}" \
   NYASH_EMIT_EXE_NYRT="${NYASH_EMIT_EXE_NYRT:-$ROOT/target/release}" \
   NYASH_LLVM_USE_HARNESS=1 "$BIN" --backend llvm "$file" >/dev/null 2>&1
@@ -79,6 +81,17 @@ time_exe_run() {
 mktemp_hako() { mktemp --suffix .hako; }
 mktemp_c() { mktemp --suffix .c; }
 
+# Fallback diagnostics for EXE flow: check MIR JSON for externcall/boxcall/jsonfrag
+diag_mir_json() {
+  local json="$1"
+  local rewrites; rewrites=$(rg -c '"op":"externcall"' "$json" 2>/dev/null || echo 0)
+  local arrays; arrays=$(rg -c 'nyash\.array\.' "$json" 2>/dev/null || echo 0)
+  local maps; maps=$(rg -c 'nyash\.map\.' "$json" 2>/dev/null || echo 0)
+  local boxcalls; boxcalls=$(rg -c '"op":"boxcall"' "$json" 2>/dev/null || echo 0)
+  local jsonfrag; jsonfrag=$(rg -c '\[emit/jsonfrag\]' "$json" 2>/dev/null || echo 0)
+  echo "[diag] externcall=${rewrites} (array=${arrays}, map=${maps}), boxcall_left=${boxcalls}, jsonfrag=${jsonfrag}" >&2
+}
+
 case "$CASE" in
   loop)
     HAKO_FILE=$(mktemp_hako)
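Example of the line this helper emits (path and counts are illustrative; after a successful AotPrep pass one expects externcall>0 and jsonfrag=0):

    $ diag_mir_json tmp/prep.json
    [diag] externcall=3 (array=2, map=1), boxcall_left=4, jsonfrag=0

Since rg -c exits non-zero on zero matches, each "|| echo 0" keeps the counters numeric.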
@@ -684,10 +697,16 @@ C
     ;;
   kilo)
     # kilo's C reference side is heavy; the default N=5_000_000 makes runs extremely long.
-    # In EXE mode with N unspecified (still the default), lower N so measurement stays practical.
-    if [[ "$EXE_MODE" = "1" && "$N" = "5000000" ]]; then
+    # In the Phase 21.5 optimization phase, LLVM benches target the EXE path only:
+    # - LLVM backend with the default N (5_000_000): always lower N to 200_000.
+    # - LLVM backend with EXE_MODE=0: force-promote to the EXE path (VM fallback forbidden).
+    if [[ "$BACKEND" = "llvm" && "$N" = "5000000" ]]; then
       N=200000
     fi
+    if [[ "$BACKEND" = "llvm" && "$EXE_MODE" = "0" ]]; then
+      echo "[info] kilo: forcing --exe for llvm backend (Phase 21.5 optimization)" >&2
+      EXE_MODE=1
+    fi
     HAKO_FILE=$(mktemp_hako)
     cat >"$HAKO_FILE" <<HAKO
 box KiloBench {
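With this hunk, the two invocations below take the same path (kilo on the llvm backend is always promoted to EXE, and N drops to 200000 unless overridden):

    tools/perf/microbench.sh --case kilo --backend llvm         # auto --exe, N=200000
    tools/perf/microbench.sh --case kilo --backend llvm --exe   # explicit, same result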
@@ -814,45 +833,31 @@ if [[ "$EXE_MODE" = "1" ]]; then
   ensure_nyrt
   HAKO_EXE=$(mktemp --suffix .out)
   TMP_JSON=$(mktemp --suffix .json)
-  # Default: use jsonfrag (stable/fast). Set PERF_USE_PROVIDER=1 to prefer provider/selfhost MIR.
+  # Default: use provider-first with AotPrep for maximum optimization
+  # DEBUG: Show file paths
+  echo "[matmul/debug] HAKO_FILE=$HAKO_FILE TMP_JSON=$TMP_JSON" >&2
   if ! \
+    HAKO_SELFHOST_TRACE=1 \
     HAKO_SELFHOST_BUILDER_FIRST=0 HAKO_SELFHOST_NO_DELEGATE=0 \
+    HAKO_APPLY_AOT_PREP=1 \
     NYASH_AOT_COLLECTIONS_HOT=1 NYASH_LLVM_FAST=1 NYASH_MIR_LOOP_HOIST=1 NYASH_AOT_MAP_KEY_MODE=auto \
     HAKO_MIR_BUILDER_LOOP_JSONFRAG="${HAKO_MIR_BUILDER_LOOP_JSONFRAG:-$([[ "${PERF_USE_JSONFRAG:-0}" = 1 ]] && echo 1 || echo 0)}" \
     HAKO_MIR_BUILDER_LOOP_FORCE_JSONFRAG="${HAKO_MIR_BUILDER_LOOP_FORCE_JSONFRAG:-$([[ "${PERF_USE_JSONFRAG:-0}" = 1 ]] && echo 1 || echo 0)}" \
     HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE="${HAKO_MIR_BUILDER_JSONFRAG_NORMALIZE:-1}" \
     HAKO_MIR_BUILDER_JSONFRAG_PURIFY="${HAKO_MIR_BUILDER_JSONFRAG_PURIFY:-1}" \
     NYASH_ENABLE_USING=1 HAKO_ENABLE_USING=1 \
-    NYASH_JSON_ONLY=1 bash "$ROOT/tools/hakorune_emit_mir.sh" "$HAKO_FILE" "$TMP_JSON" >/dev/null 2>&1; then
+    NYASH_JSON_ONLY=1 bash "$ROOT/tools/hakorune_emit_mir.sh" "$HAKO_FILE" "$TMP_JSON" 2>&1 | tee /tmp/matmul_emit_log.txt | grep -E "\[prep:|provider/emit\]" >&2; then
     echo "[FAIL] emit MIR JSON failed (hint: set PERF_USE_PROVIDER=1 or HAKO_MIR_BUILDER_LOOP_FORCE_JSONFRAG=1)" >&2; exit 3
   fi
 
-  # Optional AOT prep stage: apply pre-normalization/passes on MIR JSON before building EXE
-  # Enabled when fast/hoist/collections_hot are ON (we already set them explicitly above)
-  # This ensures EXE path receives the same optimized JSON as harness runs.
-  (
-    PREP_HAKO=$(mktemp --suffix .hako)
-    cat >"$PREP_HAKO" <<'HAKO'
-using selfhost.llvm.ir.aot_prep as AotPrepBox
-static box Main { method main(args) {
-  local in = args.get(0)
-  local out = AotPrepBox.prep(in)
-  if out == null { println("[prep:fail]") return 1 }
-  println(out)
-  return 0
-} }
-HAKO
-    set +e
-    OUT_PATH=$(NYASH_ENABLE_USING=1 HAKO_ENABLE_USING=1 NYASH_FILEBOX_MODE=core-ro \
-      NYASH_AOT_COLLECTIONS_HOT=1 NYASH_LLVM_FAST=1 NYASH_MIR_LOOP_HOIST=1 NYASH_AOT_MAP_KEY_MODE=auto \
-      "$BIN" --backend vm "$PREP_HAKO" -- "$TMP_JSON" 2>/dev/null | tail -n 1)
-    rc=$?
-    set -e
-    if [[ $rc -eq 0 && -f "$OUT_PATH" ]]; then
-      mv -f "$OUT_PATH" "$TMP_JSON"
-    fi
-    rm -f "$PREP_HAKO" 2>/dev/null || true
-  )
+  # Quick diagnostics: ensure AotPrep rewrites are present and jsonfrag fallback is not used
+  # DEBUG: Copy TMP_JSON for inspection
+  cp "$TMP_JSON" /tmp/matmul_from_perf.json 2>/dev/null || true
+  echo "[matmul/debug] TMP_JSON copied to /tmp/matmul_from_perf.json" >&2
+  echo "[matmul/debug] Direct externcall count: $(grep -o '"op":"externcall"' "$TMP_JSON" 2>/dev/null | wc -l)" >&2
+  diag_mir_json "$TMP_JSON"
+
+  # AotPrep is now applied in hakorune_emit_mir.sh via HAKO_APPLY_AOT_PREP=1
   # Build EXE via helper (selects crate backend ny-llvmc under the hood)
   if ! NYASH_LLVM_BACKEND=crate NYASH_LLVM_SKIP_BUILD=1 \
     NYASH_NY_LLVM_COMPILER="${NYASH_NY_LLVM_COMPILER:-$ROOT/target/release/ny-llvmc}" \
@@ -862,6 +867,20 @@ HAKO
     echo "[FAIL] build Nyash EXE failed (crate backend). Ensure ny-llvmc exists or try NYASH_LLVM_BACKEND=crate." >&2; exit 3
   fi
 
+  # Execute runs. If BUDGET_MS>0, keep running until budget is exhausted.
+  if [[ "$BUDGET_MS" != "0" ]]; then
+    i=0; used=0
+    while true; do
+      i=$((i+1))
+      t_c=$(time_exe_run "$C_EXE"); t_h=$(time_exe_run "$HAKO_EXE")
+      sum_c=$((sum_c + t_c)); sum_h=$((sum_h + t_h)); used=$((used + t_h))
+      if command -v python3 >/dev/null 2>&1; then ratio=$(python3 -c "print(round(${t_h}/max(${t_c},1)*100,2))" 2>/dev/null || echo NA); else ratio=NA; fi
+      echo "run#$i c=${t_c}ms hak=${t_h}ms ratio=${ratio}% (budget used=${used}/${BUDGET_MS}ms)" >&2
+      if [[ $used -ge $BUDGET_MS ]]; then RUNS=$i; break; fi
+      # Safety valve to avoid infinite loop if t_h is 0ms
+      if [[ $i -ge 999 ]]; then RUNS=$i; break; fi
+    done
+  else
   for i in $(seq 1 "$RUNS"); do
     t_c=$(time_exe_run "$C_EXE")
     t_h=$(time_exe_run "$HAKO_EXE")
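Worked example of the budget accounting: only Hakorune-side time (t_h) counts against the budget. With --budget-ms 10000 and roughly 1200 ms per Hakorune run, the loop exits after run #9 (9 x 1200 = 10800 >= 10000) and sets RUNS=9, so the avg_c/avg_h division below stays consistent with the number of runs actually taken.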
@@ -873,6 +892,7 @@ HAKO
     fi
     echo "run#$i c=${t_c}ms hak=${t_h}ms ratio=${ratio}%" >&2
   done
+  fi
   avg_c=$((sum_c / RUNS)); avg_h=$((sum_h / RUNS))
   echo "avg c=${avg_c}ms hak=${avg_h}ms" >&2
   if [ "$avg_c" -lt 5 ]; then