"type of antiform" now possible... Should We Use It?

Having run with this for a year now, it seems to have been a good decision. The groundwork is laid so that TYPE OF can correctly answer for "extension types" as well, which lets extension DLLs provide unlimited new fundamental datatypes. (There are some questions about name collisions--like what happens if two extensions use the same name for their new types--but that's not the most burning issue right now.)


As things developed, I decided to narrow the WORD! antiforms to just ~null~ and ~okay~, and let these be the LOGIC! antiform class. Leaving the door open to other WORD! antiforms came to seem less valuable than constraining them.

Historically I'd had reservations about type-related questions like INTEGER? accepting nulls, thinking that NULL should be more "disruptive". If integer? my-null-var came back false, that seemed too tolerant. So I made you say integer? cond my-null-var (COND(ITIONAL) turns NULL into a "veto", forcing a NULL answer out of whatever function it's passed to, without running that function).
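As a sketch of that historical pattern (hypothetical REPL session, assuming the COND behavior just described and the antiform rendering used later in this post):

```
>> my-null-var: null

>> integer? cond my-null-var
== |~null~|  ; antiform (COND vetoed, INTEGER? never ran)

>> integer? cond 1020
== |~okay~|  ; antiform (logic!)
```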

I think my prescriptivism got a bit out of hand here, and saying that the type of null is LOGIC! makes sense for today's world.

But notably, if you aren't really testing for LOGIC! and just want a null-in-null-out answer for TYPE OF, you can still use type of cond.

switch type of cond x [
    null [...]
    integer! [...]
    block! [...]
    panic "Expected null, integer, or block"  ; ~okay~ would trigger this
]

That's usually more likely to be what you want to test for than LOGIC!.

Further pushing back on the prescriptivism, I think that erroring on NULL for tests like INTEGER? hasn't turned out to be all it's cracked up to be. Being an antiform stops NULL from spreading into some places, but it's a stable antiform... so it can be conditionally tested, and things like SPLICE! can be too. So why shouldn't (integer? null) just be null?

Saying that INTEGER? fails on undecayable unstable antiforms, such as VOID! or TRASH!, is likely the better rule.

What About TYPE OF Unstable Antiforms?

It seems "obvious" that type of should decay its argument. So if there's an unstable antiform test it would be something like type* of.

A bit weird today, though, is that ACTION! is an unstable antiform (an evolution of piggy-backing instability through ACTION!-in-PACK!), and it decays to FRAME!. That's an awkward tension, as old tests broke:

>> action! = type of append/
== |~null~|  ; antiform (logic!)

>> frame! = type of append/
== |~okay~|  ; antiform (logic!)

What feels frustrating about this is that you also get this behavior:

>> action? append/
== |~okay~|  ; antiform (logic!)

>> frame? append/
== |~okay~|  ; antiform (logic!)

That's because ACTION? is based on taking a ^META parameter, while FRAME? is not.

Overall I think it's wise to think of ACTION! as unstable, because things like FOR-EACH aren't willing to put unstable antiforms into non-^META variables. That helps when you're doing enumerations and aren't prepared for values that might be ACTION!s... if you wrote for-each ^x [...], you can connect that with needing to use ^x in the loop body.

Yet there's a contradiction: x: func [...] [...] assigns an unstable antiform without you having to say ^x: func [...] [...]. That's an ergonomic compromise... it's true also for x: ~ assigning voids, or x: ~<foo>~ assigning trash. Hence it's only PACK!s that decay in non-meta SET-XXX cases.
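A sketch of those assignment exemptions (hypothetical REPL session; that a pack decays to its first value is my reading of the non-meta SET-XXX rule):

```
>> x: func [] [print "hi"]   ; unstable ACTION! assigned, no ^x: needed

>> x: ~                      ; void (unstable) assigned without ^META
>> x: ~<foo>~                ; trash assigned without ^META

>> x: pack [1 2]             ; only PACK! decays in non-meta assignment
>> x
== 1
```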

Right now that implies the rules for set $x: ... vs. set $x ... are different; the latter has protection against unstable antiforms (to get that protection, it can't decay packs for you--you'd have to decay them yourself).

Assignment exemptions aside, ACTION! decaying to FRAME! seems like a safe compromise--one that's less uncomfortable than it first appears. The ergonomics are good, because words being "active" is a "special" thing that you kind of want to quarantine.

TYPE OF In Any Sophisticated Language Is Tricky

Consider for instance that arrays in C and C++ can act like pointers. But if you ask "is this a pointer?", whether the answer is yes depends on whether you "decay" the array: a non-decayed array says no, a decayed one says yes.

(There are bizarre mechanics, like whether you say decltype(x) vs. decltype((x)), which get you the "raw information" vs. "the type when used as an expression".)

It would be possible to say TYPE OF gave errors on unstable antiforms instead of implicitly decaying them. So maybe type of pack [1 2] panics, but type of decay pack [1 2] would be INTEGER!. But that feels wrong.

Types are a mire, and I still feel like the right answer to doing more pattern-based type-checking hasn't shown itself.