Why Doesn't `(third [1 2])` Trigger A Range Check Error?

Quick question for clarity. I assume that

compose [ 1 (1 / 0) 3 ]

would be a math error. Why is

compose [ 1 (third [ 1 2 ]) 3 ]

not some kind of a range check error?


Systemically, we consider NULL to be a "soft" form of failure. It serves a signaling role a little like what NONE! tried to do in historical Rebol, but since it's not an ANY-VALUE!, more functions treat it as an error:

>> third [1 2]
== ~null~  ; anti

>> append [a b c] third [1 2]
** PANIC: append requires value argument to not be null

>> compose [1 (third [1 2]) 3]
** PANIC: Cannot use NULL in COMPOSE slots

The theory is that places which require an ANY-VALUE! will error down the line, and that having a lot of constructs that make it easier to react to the "soft failure" is a better tradeoff.

One of those constructs is OPT, which converts NULL to VOID, allowing a seamless opt-out of things like APPEND or COMPOSE:

>> append [a b c] opt third [1 2]
== [a b c]

>> compose [1 (opt third [1 2]) 3]
== [1 3]

So I guess I'll just say that experience has borne out that soft failure is more convenient for functions like THIRD than PANIC'ing. If anyone finds a case where they don't think so, I'd be interested to see it.

Nowadays there's an option on the table for raising definitional errors, which could easily be turned into nulls with try third (...)

>> third [a b]
** Error: Cannot pick 3 of BLOCK! (or somesuch)

>> try third [a b]
== ~null~  ; anti

(I've mentioned that this is different from the limited design of Rebol's historical errors. E.g. if you said attempt [third [a b]] in an R3-Alpha or Red that errored, it would give null back... but so would attempt [thirrd [a b]], because the lack of definitional-ism means it couldn't distinguish errors arising from the direct call from typos or other downstream errors.)

For cases where you would have been trusting a THIRD that returns NULL to trigger downstream errors, this gives better error locality. E.g. append [a b c] third [d e] would blame the THIRD, not the APPEND.

And for cases where you couldn't trust that NULL would be interpreted as an error downstream, it would be more robust. The presence of the TRY also gives readers a clue at the callsite that you actually intended the operation might fail.

This comes down to the behavior of PICK (since THIRD is a specialization of PICK). I was just thinking about that with respect to objects:

>> obj: make object! [x: 10]

>> pick obj 'x
== 10

>> pick obj 'y
** Error: y is not a field of object

>> try pick obj 'y
== ~null~  ; anti

If we raised a definitional error out of pick in this case, then you could try pick and get null. It would conflate with the case where the variable existed and actually held null. (But if that was an issue, you could use EXCEPT or another handler to handle the definitional error specifically.)
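As a sketch of what that discernment might look like (assuming the infix EXCEPT form with a lambda, as used elsewhere in Ren-C), the handler receives the definitional error, while a field that exists and holds null would pass through without triggering it:

>> pick obj 'y except e -> [print "no Y field (distinct from Y holding null)"]
no Y field (distinct from Y holding null)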

When you think about PICK in general beyond just the block case, it does seem like more uniformly giving an error which can be "softened" via TRY would be a good idea.

I'll give it a shot and see what the effects are.


I've been living with this pattern for a year-and-a-half now.

It has the obvious upside of error locality...

...BUT recent changes have made it so that NULL is accepted in many fewer places. If a value is null, you can't even test it with BLOCK?, for instance... unless you OPT it.

So nulls are accepted almost nowhere, besides conditional places... where they are now the only falsey type. So if you're passing a NULL to a conditional slot, why should you need to approve it as "possibly null"? You're testing it!

This means you wind up double-paying for documenting optionality in your source:

append [a b c] opt try pick block 3

The main benefit you're getting is when you separate the operations significantly:

var: pick block 3   ; let's say this makes null
...
... ; time passes
...
append [a b c] var  ; !!! error, null

So in that case, having to put a TRY at the place where you're assigning VAR gets you a bit of locality.
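In other words, the localized version of that pattern would look something like this (same OPT and TRY as in the earlier example, just split across the two sites):

var: try pick block 3   ; TRY documents here that out-of-range is anticipated
...
... ; time passes
...
append [a b c] opt var  ; OPT still needed to opt out of appending a null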

Is The Locality Really Worth It?

As I've asked now-and-again: are nulls so special? What if you get an INTEGER! back when you didn't want an integer, and only notice later? That's even worse--because integers have no safeguard stopping them from being APPENDed, etc.

We might be able to mitigate the loss of locality by making NULLs carry more information about where they were created, making them even more suitable for this soft-failure role.

The Missing Object Field Is A Different Beast

This is different from picking out of blocks, because blocks can't hold null, so there's no conflation by default. For the object case, I think a definitional error you have to react to (e.g. with TRY) is a nice solution.

A devil's advocate for not having to say TRY might say:

  • BLOCK! elements can't be NULL. So we're able to give a NULL antiform back from out-of-range BLOCK! picks that unambiguously indicates the element wasn't there.

  • Almost all function arguments (that aren't specifically conditional tests for NULL) reject NULL... including even type tests like INTEGER?.

  • Hence the ergonomics of "just returning null" win out as convenient and expedient, vs. forcing callers to "defuse" a range check error.

  • The balance of the decision changes when NULL is in-band for the target type (e.g. picking fields out of objects, which can hold any stable antiform). At that point, things that are conceptually "absent" or "out-of-range" need to raise an error.

But I Find Myself Reluctant To Go Back To That

Searching for TRY PICK in the codebase to consider removing it, one of the first examples I found was:

>> winsock: make library! %/C/Windows/System32/wsock32.dll
== &[library! %/C/Windows/System32/wsock32.dll]

>> pick winsock "gethostbyname"
== &[handle!]

>> pick winsock "gethostbynickname"
** Error: Couldn't find "gethostbynickname"
      in &[library! %/C/Windows/System32/wsock32.dll]

>> try pick winsock "gethostbynickname"
== ~null~  ; anti

You see that the ERROR! can be much more informative than NULL.

And it occurs to me that a NULL produced from a TRY could remember the error it defused, in case you wanted to know what it was... this could be offered in some debug modes.

In trying to explain why an out-of-range block pick should succeed while picking a symbol name out of a library should raise an error, I'm drawing a bit of a blank.

I Think TRY PICK Stays...

Like many things, it doesn't really come up as much as you would think. Working code rarely picks out of range things. (Perhaps that was your initial point?)


So a new angle on this is that meta-^variables can give back anything... even unstable antiforms. ERROR!, PACK!, you name it.

This means that try frame.^field changes the rules here.

I think (?) that has to panic if FIELD is not available, because errors are in-band of what you're asking for.... and EXCEPT won't help you discern them.

Not certain this means anything has to change for non-meta-picking. You wouldn't meta-pick an environment variable or meta-pick a function out of a DLL, and I think it would be dumb to say that just because meta-picks have ERROR! in band that you lose the easy handling of errors on non-meta-picks.

Just something to be aware of.

A case that is food for thought here is MAP!.

In MAP! right now, you get an ERROR! antiform back when you pick with a key that's not in the MAP!. You can then TRY that, and it becomes a null.
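For illustration (hedging the exact error message, and assuming MAKE MAP! construction), that behavior would look like:

>> m: make map! [a 1 b 2]

>> pick m 'c
** Error: Key c not found in map (or somesuch)

>> try pick m 'c
== ~null~  ; anti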

But if your key is a ^META then you can get back an ERROR! that's meta-represented in the map.

This is a bit more of a big deal, because with MAP! you pretty much expect to be asking for things that may or may not be in the map. So being able to tell the difference between a meta-represented error that's in the map vs. the absence of anything in the map would likely be important.

I'm really only just now getting experience with the troubles of facilitating meta-representation, so I can't tell how bad this is. Just because you're using meta-representation doesn't mean you're including errors. I've mentioned that (obj.^field: some-erroring-thing) will still evaluate the expression overall to an ERROR!, which means you have to deliberately put a TRY on the outside to get it to not escalate to a panic. So clients who are meta-representing errors may have to go the extra mile and separate the tests for absence/presence (with something like HAS), rather than rely on getting an ERROR! back from a picking operation.

Sometimes I get a bit bogged down in the details and forget how well a lot of things are working, because I focus a lot on the tough parts. :roll_eyes:

Definitional errors are certainly slippery, and this new idea that they are in-band for field selection in objects is definitely a blessing in some places... but a curse in others.

Something about PICK-ing is that it actually gets folded into GET-ing when you use tuples. Consider the difference between:

pick:meta foo 'bar

get $foo.^bar  ; distinct from (get $foo.('^bar)) aka (pick foo '^bar)
get $foo.^('bar)  ; how to use implicit meta with a group!

So GET can be unable to succeed in yet more ways...

  1. Lookup of FOO didn't succeed (with PICK, that would have been a panic during FOO lookup before PICK was even called, but with GET it happens during the GET itself, via the parameter it was passed)

  2. Lookup of FOO succeeded, but you can't pick out of it (maybe it's an INTEGER! or something)

  3. The lookup was valid, it's a member, but it was pure unset

  4. The lookup was valid (e.g. let's say to an OBJECT!), but bar doesn't exist as a member (or it's a BLOCK! but an index was used that was out of range--this is considered the same kind of "almost" situation)

  5. And then... the lookup could be valid, it's a member, and it's an ERROR! antiform.

:roll_eyes:

I feel like having 1-3 be PANICs you had to work around some other way, and 4 be a definitional error you could handle with an EXCEPT or TRY, was a success... and the arrival of 5 is a pretty big thorn.

This is definitely a pain, leading me to empathize with the annoyance that led to the axing of storing active ERROR!s in variables like in Rebol2.

But there's no turning back on this. "Lift the Universe" has too much upside... by eliminating any particular "moment" at which FRAME!s need to perform transformations on arguments.

I'm still struggling a bit with the idea that meta-picks panic on (4), while non-meta picks use the lack of conflation to allow ERROR! as an out-of-band signal. I'm not really concerned about how this handles if you're doing foo.^bar vs. foo.bar literally, but rather if you wrote get var without knowing whether var is going to cue meta-behavior or not.

There are no specific problems with it yet, but it just makes me uncomfortable.