WIP: ref(lexer): attempt to not try on every token #7
No description provided.
What do we do with invalid tokens? I think we should make them fail and not even lex.
The question is whether we should lex `>/foo/` as `Token` and leave the parser to deal with it, or just reject it.

OR, as an alternative, we can have another token type that serves this specific purpose: it catches all symbols that are known to be (somewhat) malformed and emits (or not) an error to warn the user. This would make `Token` hold only valid tokens.

^ update to that idea: it is not ideal, because we don't have a clear way to distinguish `Token` from such an error token. `Token` is currently defined semantically as "not other tokens".

In fact, I don't know if `try`ing one huge block is better or worse than `try`ing each parser.

`e18082db1b` to `449b7c8ca7`
`63bd6841e8` to `f9423d4af0`

I think for this lexer, since it has to match Haddock's current markup, we should just default to "other" and not fail or error. But we have the foundation for Haddock 2 markup to be structured and stricter, like Verso, and give nice errors.
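A minimal sketch of the "default to other" idea above, assuming a hand-rolled lexer (the names `Token`, `TokOther`, and `lexOne` are hypothetical, not the PR's actual code): the specific lexers match first, and anything they reject falls into a catch-all token instead of failing, so lexing is total and never errors.

```haskell
-- Sketch only: a lexer that never fails. Unrecognized characters
-- become TokOther ("not other tokens") rather than a lex error.
import Data.Char (isAlphaNum, isSpace)

data Token
  = TokIdent String   -- runs of identifier characters
  | TokSpace String   -- runs of whitespace
  | TokOther String   -- catch-all for everything else
  deriving (Eq, Show)

-- Lex one token; always succeeds on non-empty input.
lexOne :: String -> (Token, String)
lexOne s@(c:_)
  | isAlphaNum c = let (w, rest) = span isAlphaNum s in (TokIdent w, rest)
  | isSpace c    = let (w, rest) = span isSpace s    in (TokSpace w, rest)
lexOne (c:rest)  = (TokOther [c], rest)  -- default to "other", no error
lexOne []        = error "lexOne: empty input"

lexAll :: String -> [Token]
lexAll [] = []
lexAll s  = let (t, rest) = lexOne s in t : lexAll rest

main :: IO ()
main = print (lexAll ">/foo/")
-- [TokOther ">",TokOther "/",TokIdent "foo",TokOther "/"]
```

With this shape, malformed input like `>/foo/` still produces a token stream, and deciding what to do with the `TokOther` pieces is deferred to the parser.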
`f9423d4af0` to `8c666f637c`
`a36ff898e6` to `a64ac93bd9`

Maybe we shouldn't try to match `\(` and `\)` at the lexer level? I don't think it's the lexer's responsibility to handle whether a construct is closed or not; the lexer should only do character grouping.

Update on this: current Haddock uses `try` everywhere too, so we probably shouldn't care that much.

Pull request closed
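As a footnote to the thread, the "lexer should only do character grouping" idea could be sketched like this (hypothetical names, not the PR's code): `\(` and `\)` each lex to standalone tokens with no balancing check, and pairing of open/close is deferred to a later parsing pass.

```haskell
-- Sketch only: the lexer groups characters into tokens and never
-- checks that \( has a matching \); that is the parser's job.
data Tok = TokMathOpen | TokMathClose | TokChunk String
  deriving (Eq, Show)

lexChunks :: String -> [Tok]
lexChunks [] = []
lexChunks ('\\':'(':rest) = TokMathOpen  : lexChunks rest
lexChunks ('\\':')':rest) = TokMathClose : lexChunks rest
lexChunks s =
  let (chunk, rest) = break (== '\\') s
  in case chunk of
       "" -> TokChunk [head s] : lexChunks (tail s)  -- lone backslash
       _  -> TokChunk chunk    : lexChunks rest

-- A later, parser-level pass decides whether opens and closes balance.
balanced :: [Tok] -> Bool
balanced = go (0 :: Int)
  where
    go n []                  = n == 0
    go n (TokMathOpen  : ts) = go (n + 1) ts
    go n (TokMathClose : ts) = n > 0 && go (n - 1) ts
    go n (_            : ts) = go n ts

main :: IO ()
main = do
  print (lexChunks "a \\(x\\) b")
  -- [TokChunk "a ",TokMathOpen,TokChunk "x",TokMathClose,TokChunk " b"]
  print (balanced (lexChunks "a \\(x"))  -- False: caught after lexing
```

An unclosed `\(` still lexes cleanly here; only the `balanced`-style pass (or a real parser) reports it, which keeps the lexer free of `try`-style backtracking for this construct.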