I had a weird thought--one that resembles an old thought--but here goes...
What if this were the meaning of any/even? e.g. what if we turned the interstitial slash interpretation over to function call dialecting as well?
The competing idea here was that slashes would do cascading (composing the functions), e.g.
>> odd?: not/even?/
>> odd? 13
== ~okay~  ; antiform
But when you compare that idea with dialected function calls, it's not all that powerful, and it doesn't come in handy all that often.
If you aren't making a function to store in a variable or pass as a predicate, it looks better as not even? 13... (you'd literally never write not/even? 13 just because you could)
I've also suggested folding this kind of thing under a "specialization by example" operation ("pointfree" or "partialize"), which would completely toast the limited cascading:
>> demo: pointfree [reverse append _ [a b c]]
>> demo [d e f]
== [[a b c] f e d]
In That Light, Don't Slashes Seem To Have a Higher Calling?
It would be very helpful to be able to pass functions as arguments to functions in a clear way that says "hey, this is a function".
I'd tried doing this with a leading slash at one time, e.g. all /even? [2 4 6], but any/even? [2 4 6] makes more sense.
There are also things like an old REDUCE* definition, which was a specialization using OPT to erase nulls:
>> reduce* [1 + 2 null 100 + 200]
== [3 300]
It's much more general as REDUCE/OPT:
>> reduce/opt [1 + 2 null 100 + 200]
== [3 300]
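As a rough Python analogy (names hypothetical; thunks stand in for expressions and None stands in for Ren-C's null), the refinement version is just one flag on the general function rather than a whole separate specialization:

```python
def reduce_(exprs, opt=False):
    """Evaluate each thunk; with opt=True, erase None results (like REDUCE/OPT)."""
    results = [expr() for expr in exprs]
    if opt:
        return [r for r in results if r is not None]
    return results

# analogy for: reduce/opt [1 + 2 null 100 + 200]
print(reduce_([lambda: 1 + 2, lambda: None, lambda: 100 + 200], opt=True))
# [3, 300]
```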
One Loss: Deferred Infix Manipulation
I had forgotten about this benefit:
(not any [a, b, c]) then [print "Without the GROUP! THEN reacts to ANY [...]"]
not/any [a, b, c] then [print "If NOT/ANY composed the functions, this works"]
That makes having the slashed form cascade the functions notably less superfluous.
But I still think predicate-passing jumps out in front as much more useful day-to-day.
Especially because this particular case is covered by ANY/NOT with the predicate scheme!
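To show the two readings side by side in Python terms (hypothetical names, and a simplification--Ren-C's actual ANY returns the first truthy evaluation, with details around falsey returns glossed over here):

```python
def any_(block, predicate=bool):
    """Return the first item whose predicate result is truthy, else None."""
    for item in block:
        if predicate(item):
            return item
    return None

def is_even(n):
    return n % 2 == 0

# predicate-passing reading: any/even? [2 4 6]
print(any_([2, 4, 6], is_even))  # 2

# cascading reading: not/any as NOT composed onto ANY's result
def not_any(block, predicate=bool):
    return not any_(block, predicate)

print(not_any([None, None, None]))  # True
```

Under the predicate scheme, the cascading case falls out for free: passing a NOT-like predicate (the ANY/NOT form) asks "is there any item that fails the test?", so the composition doesn't need its own slash notation.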