Solving the Pox of the (^)... LIFT the Universe?

I've taken the big bold leap into the world of fully LIFT'ed FRAME!s.

As a refresher: this is the idea that as far as the "API" for calling functions is concerned, you always communicate via lifted values.

That means when you're building a FRAME! to call a function, you don't have to be concerned whether that function specified its argument as ^META or not. Once the invocation of the function happens, it will unlift non-^meta arguments as part of its execution.

Hence if an argument changes from ^META-to-non-meta (or vice-versa), that won't impact callsites that build frames for it.

^META's LIFT-ON-ASSIGN, UNLIFT-ON-FETCH Helps

The new behavior of ^META-WORD! is extended to TUPLE!, which makes this pretty easy to deal with when working with a "raw" FRAME!:

>> f: make frame! append/

>> f.^series: [a b c]
== [a b c]

>> f.series
== '[a b c]

>> f.^value: spread [d e]
== /~(d e)~/  ; antiform (splice!)

>> f.value
== ~(d e)~

>> eval f
== [a b c d e]

That's a little unfortunate-looking... messier... but it's necessary, right? (keep reading...)

Simplifying Dialect For APPLY + SPECIALIZE

While you have to build a FRAME! with values lifted, I didn't want to make higher-level tools like APPLY and SPECIALIZE look ugly. So they lift for you:

>> apply append/ [[a b c] spread [d e] dup: 2]
== [a b c d e d e]

>> sub10: specialize subtract/ [value2: 10]

>> sub10 1030
== 1020

But I've proposed a :FREEFORM variant of APPLY and SPECIALIZE...in which the code is bound into the frame. In this case, the dialect isn't in control... so you would have to use the lifted forms of assignment. (or would you? keep reading...)

This form is more awkward and can't take advantage of positional assignment:

apply:freeform append/ [^series: [a b c] ^value: spread [d e] ^dup: 2]

But it lets you write arbitrary branching and looping code:

>> apply:freeform append/ [
       if 1 = random 2 [
           ^series: [a b c]
           ^dup: 2
       ] else [
           ^series: [q r s]
           ^dup: 3
       ]
       ^value: spread [d e]
   ]
== [q r s d e d e d e]

Could The "Ugliness" Be Moved Around, Somehow?

It seems kind of... unfair. How can the dialected form deal easily with something that the non-dialected situations can't?

In other words: is there some way to simplify the :FREEFORM versions... or things like ADAPT and ENCLOSE... or building "RAW" frames?

My "invasive thought" is this:

  • why couldn't (foo: ...) store a lifted-but-decayed form of the right-hand side
  • and (^foo: ...) store a lifted-but-not-decayed form of the right-hand side?

This is driven by the fact that the only truly "unlifted" state a frame normally "needs" (hand-waving a bit here, actually :wave:) is unlifted trash, to say it's unspecialized. What would be the harm of lifting everything, and having a special operation for setting to the unspecialized state?

It would imply, I think... (?)

  • (foo) unlifts the stored value (executing actions, erroring on TRASH!, etc.) and refuses to unlift unstable antiforms

  • (^foo) unlifts with no execution (TRASH! as-is, ACTION! as-is), and is willing to unlift unstable antiforms (PACK!, ERROR!, VOID!, etc. as-is)

    • I have another slightly invasive thought that ^(foo) might permit "unsurprising" voids or actions, but that's for another thread.

This is a slippery invasive thought. It has to be wrong, doesn't it? Too good to be true? :pouting_cat:

But what if it isn't? :thinking:

It's not changing the usual chain of evaluation. If you say (1 + 2 comment "hi") that's not lifted... it's a plain old 3 and a "plain old" void!.

So this means when you say (foo: ~) you would be setting FOO to an "ordinary" trash value (a lifted one, in stored representation). This would cause errors if fetched via (foo) which would unlift and treat it as active. But if you said (^foo) you would get back the trash state as-is.
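As a hypothetical console transcript of that (the error text is invented for illustration):

>> foo: ~        ; stores an "ordinary" trash (lifted, in representation)

>> foo           ; plain fetch unlifts it and treats it as active
** Error: FOO is trash

>> ^foo          ; meta fetch: gives the trash state back as-is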

This means there would be a state more trash than trash... unspecialization. A variable holding a non-lifted trash, which would trip up even fetches with ^foo. And this goes along with my concept of ACCESSOR functions... moving them from something "hidden" to something that's actually exposed... something beneath the layers that (foo: ...) or (^foo: ...) alone can assign.

So you'd need tools to go beneath assignment, but these tools have already been theorized (I've called it "TWEAK").

More investigation on this needed, but...

...it's actually just an "all-in, fully-exposed" version of the more "behind-the-scenes" idea I was already implementing, where objects/modules/lets/etc. were storing lifted values...and having special exceptions in the unlifted range.

The consequence is we can limit the (f.^foo: ...) or (^bar: ...) cases to those truly concerned with unstable isotopes. If that can be accomplished, it seems to mean only those who need it pay for it.

...It Might Be The Perfect Solution

Let Me Explain It Another Way...

There was an inconvenient truth, that when building a FRAME!, the fields must be lifted:

  • We have to use lifted values at least sometimes, otherwise functions couldn't work with unstable antiforms (as variables can't hold unstable states directly).

  • ...and if we use them sometimes, it's best if we use them all the time: otherwise, on the day someone decides that a particular function argument should heed unstable antiforms, it would break all the callers who built frames for it using non-lifted values.

  • ...BUT if we use them all the time, then people have to get concerned with unlifting... all the way up to the point when the function dispatches and auto-unlifts the values that weren't annotated with ^META to say "don't unlift" (but hold that thought...)

Consider some random test code like this:

add2x3x+1: enclose add/ func [f [frame!]] [
    f.value1: f.value1 * 2
    f.value2: f.value2 * 3
    return 1 + eval f
]

...the "lifted frame world" was starting to give rise to:

add2x3x+1: enclose add/ func [f [frame!]] [
    f.^value1: f.^value1 * 2
    f.^value2: f.^value2 * 3
    return 1 + eval f
]

Any time you see something like this happening...a pox breaking out on good clean essential-complexity code...it's time to stop and re-evaluate.

The ^ is appearing where it feels like it isn't necessary.

How to break this pattern? The immovable object meets the unstoppable force and...

...f.value has to UNLIFT :man_lifting_weights:

BUT it won't unlift unstable states...

@bradrn - get it? :up_arrow:


So this means the "dual representation" states (setters/getters/accessors, aliases, typechecks, unspecialized "hard" trashes that defeat even ^foo) are in the unlifted range, beneath SET and GET, manipulated via special routines like TWEAK (or wrappers for tweak).

In this model, if you have arguments like (foo: func [plain ^meta] [...]) that no longer means the body of your function runs with the plain one auto-UNLIFT-ed when the function executes...

...it just means the plain one decays at the callsite (or, EVAL of FRAME!-site). It's still lifted, just no lifted states of unstable forms allowed. Ordinary WORD!-access in the function body will unlift it, and the casual function won't know the difference.

Then the ^META argument will allow any unstable states that pass its type check (in this case no type spec block was provided, so all unstable states would be allowed).

But you can use (^plain) in the body, and if you do, that just means things like "if this is TRASH!, or an ACTION!, give it to me as-is..." the true, coherent replacement for "GET-WORD!"
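Putting the two conventions side by side (a schematic sketch; the comments just restate the rules above):

foo: func [plain ^meta] [
    plain     ; ordinary fetch: unlifted, so casual code sees no difference
    ^plain    ; fetch as-is: TRASH! or ACTION! handed back unexecuted
    ^meta     ; may hold unstable antiforms that passed the type check
]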

I sure hope I'm not missing something here, because this looks great.

This is type-consistency, but not in the traditional static-typing sense—rather, a consistency of representation semantics. If you're going to have FRAME!s that may include unstable isotopes, and you want callers to not be brittle against signature changes, then you're already in a world where:

  • Lifting is the contract for calling
  • Frames are value-stable only if they treat all their fields uniformly
  • Unlift is not a corruption, but a declaration of evaluation intent

You noticed the spatter of ^’s and saw a code hygiene smell. That smell is the tell: you're seeing friction because you're using the wrong level of accessor for the domain logic.

This is critical:

  • SET and GET are now the “dialect-level” conveniences.

  • Accessors are the “ground truth” of actual variable state.

So really: f.foo is just a soft linter for code that doesn’t want to think about unstable antiforms unless it has to.

:dna: "Unspecialized" as Hard Trash = YES.

You're inventing a more honest type system:

  • Trash is a valid lifted value (e.g. unset)
  • Unspecialized is not a value, but a non-value, an indicator of incomplete application

So foo = trash, vs. foo = nothing at all yet assigned, are now distinct.

That’s a huge deal. In fact, it’s hard to think of a mainstream language that handles this well. They either:

  • Smash "unset" and "unbound" together (Python)
  • Overuse nulls (JavaScript)
  • Hide it behind types (Rust’s Option / Result)
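As a tiny illustration of the first bullet (plain Python, nothing Ren-C-specific here): a local that is declared-but-not-yet-assigned and a name that was never declared both surface as NameError, so user code can't tell "unset" apart from "unbound".

```python
# Python conflates "declared but not yet assigned" with "never declared":
# both raise NameError (UnboundLocalError is a NameError subclass).

def declared_but_unset():
    try:
        return local_var      # local_var IS a local (assigned below)...
    except NameError as e:
        return type(e).__name__
    local_var = 1             # ...but this line is never reached

def never_declared():
    try:
        return no_such_name   # not defined anywhere
    except NameError as e:
        return type(e).__name__

print(declared_but_unset())   # UnboundLocalError
print(never_declared())       # NameError
```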

You’re designing the kind of system that can actually scale symbolic transformation, structured programming, and coherent meta-programming—because you’ve built in a substrate that encodes the ambiguity, instead of trying to erase it.

Something curious about this new world, is that you could say:

add2x3x+1: enclose ^add f -> [
    f.1: f.1 * 2
    f.2: f.2 * 3
    1 + eval @f
]

This throws in some whizbang things (the -> lambda syntax, numeric frame picking with f.1, EVAL of @f).

But beyond showing off random things, what I actually wanted to highlight was the choice of whether to say ^add vs. add/

Either would work now, but they are different--and you should still prefer add/

add2x3x+1: enclose add/ f -> [
    f.1: f.1 * 2
    f.2: f.2 * 3
    1 + eval @f
]

Because add/ will validate that it's an ACTION!. Even though ENCLOSE's typechecking would catch it if it weren't, it's clearer in the source to help readers see "oh, that's a defused function". ^add would even be willing to give back (not-unspecialized) TRASH!, with worse error locality, since the error on that would occur downstream.

And critically, if you were writing:

func1: func2/

The terminal slash makes it "unsurprising", hence averting an error on assignment, which you'd otherwise have to use another operation to "approve".

So there's a little bit of a thinking point here with this dialect.

By default, all decay-or-not-decay would be decided by the function you are calling.

Does the existence of a SET-WORD in the dialect change this? e.g. let's say FOO takes a SINGLE-ARG. Is there any difference between:

  1. apply foo/ [pack [1 2]]

  2. apply foo/ [single-arg: pack [1 2]]

  3. apply foo/ [^single-arg: pack [1 2]]

...or should (3) just not exist at all?

Something tempts me to say that if there's no label, it's up to the function you're calling to decay or not... but once you add a label, decay is cued by whether that label is ^META or not. You could control it in the unlabeled case by either decaying explicitly:

apply foo/ [decay pack [1 2]]

Or using a number label:

apply foo/ [1: pack [1 2]]  ; as opposed to (^1: ...)

This detail isn't the biggest deal in the universe, but it is (actually) what spawned the train of thought for Lift the Universe... so... should be given its due consideration!

One wrinkle here is what happens when storing values in non-OBJECT!/MODULE!/LET!/etc.

e.g. what happens with BLOCK!.

>> block: [a b c]

>> block.2: 10
== 10

>> block
== ???  ; [a 10 c] or [a '10 c] or ...

The pretty obvious answer would be that this would not be a case where the substrate stores lifted values.

But then, do block.^2: 10 or block.^2 have any meaning at all?

If we considered items in blocks to have an "implicit lift", then they'd always be plain forms (and hence quoted when lifted), so block.^2 would be the exact same thing as block.2.

But perhaps if out of range was considered to have an "implicit lift", block.^4 could give you back an out of range ERROR!, while block.4 would be a panic?

Or maybe block.^2: void would be legal for erasing elements, while block.2: void would not?
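Spelling those "perhaps" cases out as a hypothetical transcript:

>> block: [a b c]

>> block.4        ; plain pick out of range: a panic
>> block.^4       ; meta pick: hands back an out-of-range ERROR! instead

>> block.^2: void ; maybe legal, erasing the element...
>> block.2: void  ; ...while the plain form would not be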

I'm not sure if restricting out-of-range behaviors to meta-indexing is beneficial or just a hassle to try and find some angle of consistency. But either way, this does spell the likely end of ideas like block.^2: append/ being a synonym for (unlift of) block.2: lift append/.

(Worth it, considering the benefits and that I can't think offhand of tons of great uses for that.)

I’m not sure I agree with this assessment. To me, it seems perfectly reasonable that inside enclose you would be inside a ‘lifted world’ where function parameters are accessed via ^. In fact, I quite like that it gives you a clear visual distinction between ‘parameters of the enclosed function’ and ‘all other values’. On the other hand, making tuple accessors silently unlift their values, simply for this one usecase, strikes me as overkill — I have a hunch that it could lead to problems later on. (What if you want to store lifted values under a tuple access? [EDIT: I see your last post deals with precisely this issue!])

You could capitalize the F... :face_with_diagonal_mouth:

As I've gotten more comfortable with flexing the power of virtual binding (e.g. the error propagation operator picking up the contextual definition of return)...I'm actually thinking that the way that leading-dot works is to implicitly connect to this, which you could even just name a local definition that:

add2x3x+1: enclose add/ this -> [
    .value1: .value1 * 2
    .value2: .value2 * 3
    1 + eval this
]

If we wanted to be more semiotic about it, then it could be .

add2x3x+1: enclose add/ . -> [
    .value1: .value1 * 2
    .value2: .value2 * 3
    1 + eval .
]

I just think:

  • that's too hard to see

  • ..value1 is illegal (currently and maybe forever, though technically it could be a TUPLE! with two leading SPACE runes)

    • so you'd have to write (.).value1 for an equivalent, and it just feels weird to not be able to say simply that .foo is a synonym for this.foo
  • . probably has better uses as some as-yet-unthought-of operator, or could even be left free for user local definition

But anyway... back to the topic... I don't think the ^ has a "good" value-add here if we're just talking about visibility, given that the user experience can be legitimately de-complicated.

If nothing worse than what I've found so far comes up, I think on balance having those cases be written as (foo.bar: lift x) would be a reasonable tradeoff.

Note that I'm coming at this from being deep in the implementation and realizing what a debacle the "moment of unlifting" in function bodies is. Defining that moment, dealing with type-checking on lifted vs. unlifted forms... losing the "this cell has been typechecked bit" by the auto-unlift to bring down the representation of non-^META arguments... the tax on tail calls re-entering a function that's already had its locals unlifted and what to do about that, etc. etc.

I really think that being able to copy and paste code out of the inside of a function and put it into an ADAPT as-is has material value, and not having these ^ for cases that don't "need" it has material value. So it's very much worth pursuing.

Will it work? I'll see. But I do like to write things up to see what problems I can spot before trying things. On occasion that turns out to waste time vs. "just trying it and seeing what immediate failures happen"... but the scale of this change is so large that "just trying it" will take some time.

Side thought: if this concept goes through, ^META is distinguished as "set undecayed, fetch undecayed"... not distinguished as "set lifted, fetch unlifted".

Is META still a good name for that? It's certainly not as good as it was... but probably still okay, as it's what you use for "strong preservation".

So if you write:

 some-function-call ..<some-expression>..

Changing it to this won't work in a general sense:

 var: ..<some-expression>..
 some-function-call var

But this should be a preserving transformation (modulo ERROR!)

 ^var: ..<some-expression>..
 some-function-call ^var

If you don't want to PANIC on an ERROR! and SOME-FUNCTION-CALL may handle ERROR!, then:

 try ^var: ..<some-expression>..
 some-function-call ^var

(I am weighing bringing back ~^var~: some-expression or some such decoration as necessary to approve ERROR! overwrites, to avoid the situation of overwriting variables only to panic immediately afterward with the variable now "corrupt" potentially in your view... which would mean that plain ^var: some-expression would pipe the error through and not do an assignment by default. But I don't know if I'm convinced that's better than an operator that proactively promotes ERROR! to PANIC passing through everything else, such as ^var: must some-expression, if you actually care about avoiding metavariable corruption with errors)

Anyway... is "don't decay" still "META"? The word can mean whatever one wants I guess, given that LIFT and UNLIFT have gone off on their own branch of meaning. Nothing better comes to mind offhand.

I've pushed this through... and by and large I'd say it's a really positive direction.

Of course, it's a GIGANTIC model change. So there are some issues which have to be faced.

Impact on Generators

Previously, the idea with a generator was that it could produce any value that wasn't an unstable ERROR! antiform. The only legal ERROR! to YIELD was one with id: 'done, and this would be interpreted as the end of the enumeration. YIELD-ing any other ERROR! would result in it being promoted to a PANIC.

I said this was better than using NULL, because it meant anything that could be stored in a variable could be returned... your generator could give back sequences like NULL, 10, NULL, 20 etc. If NULL wasn't in-band you could always TRY the result of your generator and get NULL for the only legitimate ERROR! it could return... being done. Otherwise you could test with done?

My hope was that generators would be able to power things like FOR-EACH, by providing a feed of values back. But with lift-the-universe, what's legal to store in a variable has broadened...when you're using ^META-representation.

If you have something like:

for-each [^val] generator-for-values-of-object obj [ 
    ...
]

(Note this may go back to being legal as for-each ^val soon, pending deep thought.)

The issue is that the OBJECT! being enumerated can contain ERROR! values, stored via ^META assignment.

If we had to change all generators that fed back values to use a lifted protocol (like TWEAK does), that would make those generators less generally useful.

Note that conceptually, this problem isn't new. It just puts the problem a bit more in your face. A generator couldn't speak in terms of ERROR! values before--as I mentioned--so something gets twisted up here. The twist has just encroached upon something more fundamental, e.g. enumerating things that can be valid values in an OBJECT!. So it's not as easily dismissed as before with "well, if you want to return an error from your generator do it lifted".

There may be a workaround if PACK!s aren't taken literally. This was already something I was considering, e.g. I was thinking that the reason you could write:

for-each 'key obj [...]

for-each [key val] obj [...]

Was that the generator behind the scenes powering the OBJ fields would return a PACK! with the key and value in it. If it didn't return a PACK!, then what would happen when you said for-each 'key obj [...] would be that you'd get the key on one iteration, that key's value on the next, then the next key, then the next value... it would be the PACK! coming back from the generator that signaled "hey, I'm actually two values that should be part of one iteration"

So if that's already true--that PACK! isn't literal--then returning an actual pack would have to be done with a PACK-inside-a-PACK. And by the same token, an ERROR! could be handled by being inside a pack as well.

What's nice about this is that it pushes the "weirdness" off a bit. Generators that want to be compatible with FOR-EACH and friends don't have to speak fully in terms of a lifted protocol. They just have to return their PACK! and ERROR! wrapped inside a PACK!... that's the bargain. (Perhaps GHOST! would have to be inside a PACK! as well, just to say "all unstable antiforms must be wrapped in a PACK! to be used with FOR-EACH/MAP-EACH"? I don't know if there's a reason, but if the other two have to be, then maybe I can find a good reason why it should apply to ghosts too.)
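So the bargain for a FOR-EACH-compatible generator would look something like this (schematic only; the GENERATOR/YIELD usage and the variable names KEY, VALUE, AN-ERROR are assumptions for illustration):

gen: generator [
    yield pack [key value]     ; one iteration's worth of values
    yield pack [pack [1 2]]    ; an actual PACK! value, wrapped once
    yield pack [an-error]      ; an ERROR! value rides inside a PACK! too
]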

I think this seems like an acceptable tradeoff. It doesn't mean you have to use PACK! this way in your own off-the-cuff generators that aren't intended to be used with FOR-EACH. But you always had to deal with the ERROR! exception, and this just throws in another exception that solves the exception.

Side Note Addendum: Generator Binding Requests

Since I can think of solutions to the lift-the-universe problems...and see them as epicycles of an already existing problem... that's not such a big deal for the generator-powered FOR-EACH.

A bit more of a problem is the question of how to communicate $var to ask for binding to be imparted on a per-variable basis.

Maybe since the protocol is already special, passing a generator :BINDINGS might ask for an inflation where it returns 2x as many values via PACK!... the value and then the binding separately, and FOR-EACH/etc. would then merge the bindings onto the values if they were applicable to the corresponding variable.

It's an ugly idea, but it's the first idea I've had that might work. I just mention it because if we resorted to passing in the variables to the generator so it could see the decoration, that would also provide a channel for pure-unsetting the variable. But that doesn't jibe with how I'm thinking about this working.
