I put together an interesting dialected example of PARSE going one step at a time:
Evaluator Hooking ("RebindableSyntax")
This is an inversion of control that steps away from "PARSE and be done". It's kind of like how @rgchris rebelled against the "ZIP and be done" model of the ZIP dialect.
I think the GENERATOR / YIELDER model works well here. As I point out, you can process your data one rule at a time:
make-parser: lambda [data] [
    make-yielder [rule [block!]] [
        parse data [opt some yield/ rule]
    ]
]
And there you go.
>> parser: make-parser [a b c "d" 1020 "e" 304]
>> parser [across some word!]
== [a b c]
>> parser [one]
== "d"
>> parser [collect some [keep integer! | text!]]
== [1020 304]
>> parser [one]
** Error: enumeration done
>> try parser [one]
== ~null~
TRY suppresses arbitrary mismatch errors, so it's probably not what you want to use for end-detection. Note that you can use done? parser []
to test whether a parse is done, since it's a yielder... the only way an empty block rule wouldn't match would be if the parse had ended (might need to rig that up specially, but it's doable).
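For anyone more at home with conventional coroutines, here's a rough Python analogy of the yielder's inversion of control (all names here are illustrative stand-ins, not Ren-C APIs, and "rules" are just callables rather than dialected blocks): the parse runs as a generator that suspends after each rule, and the caller feeds it one rule at a time via send().

```python
# Rough analogy only: a coroutine that suspends between rules, the way
# MAKE-YIELDER lets PARSE pause and wait for the next rule.  Names are
# hypothetical; this is not how Ren-C implements it.

def make_parser(data):
    def parse():
        pos = 0
        result = None
        while True:
            rule = yield result          # suspend until caller sends a rule
            result, pos = rule(data, pos)

    gen = parse()
    next(gen)                            # prime: advance to the first yield
    return lambda rule: gen.send(rule)   # each call processes one rule

# Sample "rules": each takes (data, pos) and returns (match, new-pos).
def some_words(data, pos):
    out = []
    while pos < len(data) and isinstance(data[pos], str):
        out.append(data[pos])
        pos += 1
    return out, pos

def one(data, pos):
    if pos >= len(data):
        raise IndexError("enumeration done")   # like "** Error: enumeration done"
    return data[pos], pos + 1

parser = make_parser(["a", "b", "c", 1020])
print(parser(some_words))   # ['a', 'b', 'c']
print(parser(one))          # 1020
```

Calling parser(one) once more would raise the "enumeration done" error out of send(), mirroring the transcript above.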
Maybe this PARSER should tolerate non-BLOCK!s too...
>> parser: make-parser [a 1020]
>> parser word!
== a
>> done? parser []
== ~null~ ; anti
>> parser [one]
== 1020
>> done? parser []
== ~okay~ ; anti
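The done?-via-empty-rule trick is like asking a Python iterator "are you exhausted?" without consuming its next element, which you can sketch with itertools.tee (is_done is a hypothetical helper, just for the analogy):

```python
import itertools

def is_done(it):
    """Peek at an iterator: return (done, iterator) without losing the
    next element.  tee() gives a probe copy we can safely advance."""
    it, probe = itertools.tee(it)
    try:
        next(probe)
        return False, it
    except StopIteration:
        return True, it

it = iter([1020])
done, it = is_done(it)      # not done yet
assert not done
assert next(it) == 1020     # the peek didn't consume anything
done, it = is_done(it)      # now exhausted
assert done
```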
This Looks Very Sweet... 
I think this crystallizes that the creation operations need to be named like MAKE-PARSER and MAKE-YIELDER, because you need to be able to name the products (PARSER, YIELDER, GENERATOR...).