Should EVALUATE:STEP bomb on an error?

Currently if you try something bad in EVALUATE:STEP, it panics (exception):

>> evaluate:step [unspaced null]
** PANIC: unspaced requires line argument to not be null

As EVALUATE:STEP is a relatively low-level service, it would seem more likely one would want to handle the error on the same basis as other possible return values:

>> [position product]: evaluate:step [unspaced null foo bar]
== [foo bar]

>> product
== <<unspaced-null>>  ; some error you can handle

In this case, the panic isn't particularly informative, and for a low-level service it seems reasonable to take a "user beware" stance: assume errors will happen and hand them back as values. The panic is also rather difficult to work around.
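
For instance, here's a sketch of the obvious workaround, wrapping the whole step in RESCUE (hypothetical usage, just to show the shape of the problem):

rescue [
    [pos product]: evaluate:step [unspaced null foo bar]
]

The panic is intercepted, but POS was never set: the [foo bar] resume position is lost along with the rest of the evaluator state, which defeats the point of stepwise evaluation.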

This sort of puts it in the same class as RESCUE with different semantics:

rescue [ok ok something bad] => [**something-bad null]
rescue [ok ok] => [null ok]
evaluate [something bad ok ok] => [[ok ok] **something-bad]
evaluate [ok ok] => [ok [ok]]

I guess the wrinkle here is how do you determine where something bad ends and ok ok resumes? That may or may not be obvious.


Quite right.

Rebol can't measure the span of a single step of evaluation without having the side-effect of running it. That's just the nature of the beast.

I'd once tried making a "neutral" mode of the evaluator which would only gather arguments but not have any side effects. This would be able to count through the function arguments, and the arguments to functions that were those arguments, and so on:

 >> eval-neutral [print "Hi" print "Bye"]
 == [print "Bye"]   ; no actual printing done, but arity of PRINT exploited

 >> eval-neutral [print "Bye"]
 == []

But this falls down the moment you run code which changes the definitions:

 >> redefine-print: func [] [print: does [print "PRINT is arity-0 now"]]

 >> eval-neutral [redefine-print print "Bye"]
 == [print "Bye"]  ; didn't actually *run* REDEFINE-PRINT

 >> eval-neutral [print "Bye"]
 == []  ; should have only stepped past PRINT, leaving "Bye"

Some aspect of this foundational problem applies any time you try to resume things. Hence, the only granularity of resumption can be the end of BLOCK!/GROUP!.

(It's this "we can't know the boundaries of expressions without running them" problem that also tripped up the idea of letting the MATH dialect for precedence reordering mix in executable expressions without requiring all of them to be put in GROUP!s between the operators.)
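
To make the MATH dialect point concrete, a hypothetical sketch (usage illustrative, not actual MATH behavior):

math [(1 + 2) * (3 + 4)]   ; operand boundaries made explicit via GROUP!s

math [negate 1 + 2 * 3]    ; how much does NEGATE consume? unknowable
                           ; without running it, so forms like this
                           ; can't be reordered reliably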

I've written some ideas on making a more formal contract between callers and things that error. Perhaps you would be able to weigh in there.

We can't do anything about "panics" (hard failures, which can occur anywhere in the middle of an incomplete expression, at any stack level, even from just looking up a word that's a typo). No hope of graceful handling there...

...BUT we can theoretically do something about the new and novel definitional ERROR! antiforms :atom: which emerge from the overall step, that have not yet been promoted to a PANIC. Because the ERROR! antiform is a legitimate evaluation product, and the flow of control has not yet been broken.

(And luckily, all meaningfully interceptible errors are definitional. Read the above link to understand why.)

Though It Turns Out To Be Tricky. :thinking:

EVALUATE:STEP gives back a multi-result pack, with the position as the first pack item, and the synthesized value as the second.
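
Concretely, a sketch of that shape (output illustrative, following the examples later in this thread):

>> [pos value]: evaluate:step [1 + 2 10 + 20]
== [10 + 20]  ; the remaining position

>> value
== 3  ; the product synthesized by the 1 + 2 step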

Usually if a function returns an ERROR!... we tend to say that's the only thing it can return. But in the case of EVALUATE:STEP we want both an error antiform and a position.

This means putting ERROR! antiforms in PACK, which is a little bit of a thorn.

Are There Alternative Approaches To ERROR! in PACK?

The expression completion position could be a field in the error itself.

Using some overlong descriptive names to illustrate:

[pos value]: evaluate:step [1 / 0 10 + 20] except e -> [
    if e.id = 'raised-error-in-evaluate-next [
        assert [e.error.id = 'divide-by-zero]  ; actual error is wrapped in e
        pos: e.resume-position  ; e.g. [10 + 20]
    ] else [
        panic e  ; some other error
    ]
]

There are more mundane approaches, such as adding :EXCEPT such that EVALUATE:STEP:EXCEPT produces a ~[pos value warning]~ pack instead of just a ~[pos value]~ pack. Then you have to remember to check that the warning (non-antiform ERROR!) is not NULL on all paths. That sounds less foolproof.
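
Under that alternative, a call might look like this (hypothetical :EXCEPT refinement, names illustrative):

[pos value warning]: evaluate:step:except [1 / 0 10 + 20]
if warning [  ; plain (non-antiform) ERROR!, must be checked on every path
    print ["Step raised:" warning.id]
]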

I decided to back down on the "no ERROR! antiforms in packs" policy, instead saying that packs containing ERROR! antiforms can't be elided (explicitly with ELIDE, or implicitly by multi-step evaluations).

The trick is that if you DECAY a pack to its first element (or try to ELIDE it), it first checks to see if any of the elements are raised errors...or packs with raised errors in them...and promotes them to PANICs. This way you don't accidentally gloss over them...
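
A hypothetical transcript of that promotion in action (messages illustrative):

>> evaluate:step [1 / 0 10 + 20]  ; console decays the pack to display it...
** PANIC: zero-divide  ; ...decay finds the raised error and promotes it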

So it's actually pretty trivial to accomplish the original desire now--another home run for isotopes! :baseball:

>> name: null

>> block: [1 + 2 1 / 0 10 + 20]

>> collect [
       while [[{block} ^result]: eval:step block] [
           if error? ^result [
               keep quasi (disarm ^result).id
           ] else [
               keep result
           ]
       ]
   ]
== [3 ~zero-divide~ 30]

That's pretty good. There's a bit of a question though about the "circling" of the block to avoid trying to decay the pack with an error in it... that might need an additional notation to say you want to ignore the error:

while [[{block} ~^result~]: eval:step block] [...]

If you're willing to say that, maybe it's enough to do that and the decay works without needing you to circle anything (though it would mean synthesizing a new pack with a TRASH! in it, or similar, as the overall expression result):

while [[block ~^result~]: eval:step block] [...]

A Note On Why You Can't Intercept UNSPACED NULL

unspaced ["hello" null] gives a definitional error due to the choice of UNSPACED to return a definitional error in that case. But unspaced null causes a parameter type checking error, and is a hard failure. Type check errors are not definitional, which is by design--and we would not want to do otherwise.
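
In console terms, a sketch of the difference (messages illustrative):

>> try unspaced ["hello" null]
== null  ; definitional error, so TRY can defuse it

>> try unspaced null
** PANIC: unspaced requires line argument to not be null
; the type check failed before UNSPACED ever ran, so TRY can't intercept it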

It would be like making typos interceptible. Imagine if typos raised definitional errors. You'd say try takke block and the TRY would suppress the "no such variable as TAKKE" error and turn it to NULL. Then BLOCK would be evaluated in isolation.

You only want definitional errors to come from inside the functions themselves once they've started running and have all their arguments.
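
A rough sketch of what that looks like from a function's own perspective (RAISE used here as a stand-in for however definitional errors get created; names illustrative):

example: func [arg [block!]] [
    if empty? arg [
        return raise 'empty-arg  ; definitional: callers can intercept this
    ]
    first arg  ; normal path
]

A typo'd word or a wrongly-typed argument, by contrast, panics before EXAMPLE's body ever runs, so those failures are never interceptible this way.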

Theoretically, UNSPACED could make NULL one of its accepted parameter types. Then from the inside of its implementation, it could raise a definitional error related to it. I'll leave it as an exercise for the reader to think about why not to do that. :slight_smile:
