Why Doesn't `(third [1 2])` Trigger A Range Check Error?

I've been living with this pattern for a year-and-a-half now.

It has the obvious upside of error locality...

...BUT recent changes have made it so that NULL is accepted in far fewer places. If a value is null, you can't even test whether it's a BLOCK?, for instance... unless you OPT it first.

So nulls are accepted almost nowhere besides conditional slots... where they are now the only falsey type. And if you're passing a NULL to a conditional slot, why should you have to approve it as "possibly null"? You're testing it!
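To make that concrete, here's a sketch of the accept/reject pattern (the behaviors are as described above, though exact details may vary by build):

value: null

block? value      ; error! type tests like BLOCK? reject NULL outright
block? opt value  ; legal--OPT marks the null as deliberate

if value [print "never runs"]  ; conditional slots take NULL directly, as the falsey case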

This means you wind up double-paying for documenting optionality in your source:

append [a b c] opt try pick block 3
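Breaking that down layer by layer (a sketch, assuming the error-then-TRY behavior discussed here, and OPT's null-to-void conversion):

block: [a b]           ; only two items, so position 3 is out of range

pick block 3           ; raises a definitional error
try pick block 3       ; TRY defuses that error to NULL
opt try pick block 3   ; OPT turns the NULL into a void, which APPEND treats as a no-op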

The main benefit you're getting is when you separate the operations significantly:

var: pick block 3   ; let's say this makes null
...
... ; time passes
...
append [a b c] var  ; !!! error, null

So in that case, having to put a TRY at the place where you're assigning VAR gets you a bit of locality.
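In other words, the locality-preserving version looks like this (same assumptions as the sketch above):

var: try pick block 3   ; TRY documents--right here--that this can be null
...
... ; time passes
...
append [a b c] opt var  ; and OPT lets the null quietly vanish at the use site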

Is The Locality Really Worth It?

As I've asked now and again: are nulls so special? What if you get an INTEGER! back when you didn't want an integer, and only notice much later? That's even worse--because integers have no safeguard stopping them from being APPENDed, etc.
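For instance (a hypothetical scenario--the point is just that nothing flags the wrong value downstream):

var: pick data 3     ; you expected a BLOCK! here, but got an INTEGER!
...
append [a b c] var   ; no error--the stray integer appends just fine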

We might be able to mitigate the loss of locality by making NULLs carry more information about where they were created, making them even more suitable as the result of an out-of-range pick.

The Missing Object Field Is A Different Beast

This is different from picking out of blocks: blocks can't hold null, so there's no conflation by default. Object fields, on the other hand, can hold null... so a null result wouldn't tell you whether the field was missing or merely held null. I think a definitional error you have to react to (e.g. with TRY) is a nice solution here.
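A sketch of the distinction (assuming a field can be set to the NULL antiform, per the "any stable antiform" point in the list below):

obj: make object! [x: null]

pick obj 'x       ; NULL--but the field exists, it just holds null
pick obj 'y       ; no such field... a NULL here would conflate the two cases,
                  ; so it raises a definitional error instead
try pick obj 'y   ; TRY defuses that error to NULL when that's what you want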

A devil's advocate for not having to say TRY might say:

  • BLOCK! elements can't be NULL. So we're able to give a NULL antiform back from out-of-range BLOCK! picks that unambiguously indicates the element wasn't there (see the sketch after this list).

  • Almost all function arguments (that aren't specifically conditional tests for NULL) reject NULL... including even type tests like INTEGER?.

  • Hence the ergonomics of "just returning null" win out as convenient and expedient, vs. forcing callers to "defuse" a range check error.

  • The balance of the decision changes when NULL is in-band for the target type (e.g. picking fields out of objects, which can hold any stable antiform). At that point, things that are conceptually "absent" or "out-of-range" need to raise an error.
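Under that devil's-advocate policy, an out-of-range block pick would just quietly produce NULL (illustrative transcript):

>> pick [a b c] 3
== c

>> pick [a b c] 4
== ~null~  ; anti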

But I Find Myself Reluctant To Go Back To That

Searching for TRY PICK in the codebase to consider removing it, one of the first examples I found was:

>> winsock: make library! %/C/Windows/System32/wsock32.dll
== &[library! %/C/Windows/System32/wsock32.dll]

>> pick winsock "gethostbyname"
== &[handle!]

>> pick winsock "gethostbynickname"
** Error: Couldn't find "gethostbynickname"
      in &[library! %/C/Windows/System32/wsock32.dll]

>> try pick winsock "gethostbynickname"
== ~null~  ; anti

You see that the ERROR! can be much more informative than NULL.

And it occurs to me that a NULL produced from a TRY could remember the error it defused, in case you wanted to know what it was... this could be offered in some debug modes.
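Something like this hypothetical debug-mode session (neither the retention nor a WHY inspector exists--this is purely a sketch):

>> result: try pick winsock "gethostbynickname"
== ~null~  ; anti

>> why result   ; hypothetical: ask a NULL which error it defused
** Error: Couldn't find "gethostbynickname"
      in &[library! %/C/Windows/System32/wsock32.dll]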

In trying to explain why an out-of-range block pick should succeed while picking a symbol name out of a library should raise an error, I'm drawing a bit of a blank.

I Think TRY PICK Stays...

Like many things, it doesn't really come up as much as you would think: working code rarely picks out-of-range things. (Perhaps that was your initial point?)
