There are places in FFI clients that look more or less like this:
    let ptr: make pointer! 0
    if something.field [
        let whatever: blah blah field
        ptr: address of whatever
    ]
    call-ffi-thing-that-takes-pointer ptr
It's kind of inconvenient that 0 pointers aren't conditionally false. Then again, it's also sometimes inconvenient that the INTEGER! 0 can't be tested for as conditionally false... are pointers a different beast?
Having ~null~ Typecheck Distinctly Offers Crucial Advantages
There's something that might seem at first glance to be annoying:
    >> make pointer! 0
    ** Error: Cannot make pointer from 0
MAKE has to return the type of the thing you are MAKE-ing, so it can't return the antiform KEYWORD! of NULL.
BUT this could be a raised error you defuse with TRY. And if it's the only raised error (as opposed to actual failure) you can get exactly the answer you'd want:
    >> try make pointer! 0
    == ~null~ ; anti
This does mean that in type specifications, if a null pointer is legal you'd have to annotate it as such, with [~null~ pointer!] instead of simply [pointer!].
But that's a very good thing. This basically means your interfaces become documented as to whether they accept nulls or not, effectively giving you std::optional / Option() / Maybe. Documenting that communicates powerfully and can give much better error locality, even in an interpreted language.
What are the disadvantages of making NULL the 0 pointer?
There are obvious advantages to being able to easily conditionally test the pointer variable.
The first disadvantage someone might cite is "you lose the protection of being able to tell whether you assigned the pointer or not". NULL is supposed to represent the easy-to-test state of whether a variable has been assigned--friendlier than an unset variable, but still unfriendly in most cases. So let's say you had meant to assign ptr but just forgot to.
I don't think this fear is that compelling. We already have the case where the NULL antiform represents the falsey logic state (the only falsey state), and we don't stress over whether you "forgot to set the logic variable".
The bigger deal is that there are actually some edge cases where 0 pointers are meaningful addresses. For example, dlsym() on POSIX--used for looking up symbols in a library--distinguishes between failing to find a symbol at all and finding a symbol whose address is 0.
So if you wrote:
    something: try pick libc "some-symbol"
Then if it couldn't find the symbol, PICK returns a raised error, which TRY converts to NULL. But then it could have successfully found the symbol at address 0, so that is distinct.
This seems pretty esoteric to me. If you're dealing with one of these situations, you could write:
    let pointer: pick libc "some-symbol" except e -> [
        fail ["Couldn't find some-symbol in libc:" mold e]
    ]
    let address: any [to integer! maybe pointer, 0]
This way, a legitimate NULL returned from the LIBRARY! stays distinct from a raised result. And if you know you're in one of these situations, you are probably doing something hardware-fiddly, and an integer is what you want anyway.
The Advantages Of Making ~null~ the 0 State For Pointers Seem To Outweigh The Disadvantages
I've looked at a fair bit of code, and the edge case of symbols that legitimately reside at address zero in memory doesn't seem compelling enough to rule out the better choice.