Remove TreeAndSpacing.

A `TokenStream` contains a `Lrc<Vec<(TokenTree, Spacing)>>`. But this is
not quite right. `Spacing` makes sense for `TokenTree::Token`, but does
not make sense for `TokenTree::Delimited`, because a
`TokenTree::Delimited` cannot be joined with another `TokenTree`.

This commit fixes this problem, by adding `Spacing` to `TokenTree::Token`,
changing `TokenStream` to contain a `Lrc<Vec<TokenTree>>`, and removing the
`TreeAndSpacing` typedef.
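
In outline, the representation changes roughly as follows. This is a simplified, self-contained sketch, not the real `rustc_ast` definitions: `Token`, `Span`, `Delimiter`, and `DelimSpan` are stubs, and `Rc` stands in for `Lrc`.
```rust
#![allow(dead_code)]
use std::rc::Rc;

// Stand-ins for the real rustc_ast types.
struct Token;
struct Span;
struct Delimiter;
struct DelimSpan;
enum Spacing { Alone, Joint }

// Before: every tree in the stream was paired with a `Spacing`, even
// `Delimited`, for which spacing is meaningless.
enum OldTokenTree {
    Token(Token),
    Delimited(DelimSpan, Delimiter, OldTokenStream),
}
type TreeAndSpacing = (OldTokenTree, Spacing);
struct OldTokenStream(Rc<Vec<TreeAndSpacing>>);

// After: `Spacing` lives only on `Token`, and `TreeAndSpacing` is gone.
enum TokenTree {
    Token(Token, Spacing),
    Delimited(DelimSpan, Delimiter, TokenStream),
}
struct TokenStream(Rc<Vec<TokenTree>>);

fn main() {}
```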

The commit removes these two impls:
- `impl From<TokenTree> for TokenStream`
- `impl From<TokenTree> for TreeAndSpacing`

These were useful, but also resulted in code with many `.into()` calls
that was hard to read, particularly for anyone not highly familiar with
the relevant types. This commit makes some other changes to compensate:
- `TokenTree::token()` becomes `TokenTree::token_{alone,joint}()`.
- `TokenStream::token_{alone,joint}()` are added.
- `TokenStream::delimited` is added.

This results in things like this:
```rust
TokenTree::token(token::Semi, stmt.span).into()
```
changing to this:
```rust
TokenStream::token_alone(token::Semi, stmt.span)
```
This makes the type of the result, and its spacing, clearer.
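
The new constructors are presumably little more than thin wrappers. The following is a hedged sketch over stand-in types, not the actual `rustc_ast::tokenstream` source, whose functions take the compiler's own `TokenKind`, `Span`, `DelimSpan`, and `Delimiter` types:
```rust
#![allow(dead_code)]
use std::rc::Rc;

// Stand-ins for the real rustc_ast types, as in the earlier sketch.
struct Span;
struct TokenKind;
struct Token { kind: TokenKind, span: Span }
enum Spacing { Alone, Joint }
struct Delimiter;
struct DelimSpan;

enum TokenTree {
    Token(Token, Spacing),
    Delimited(DelimSpan, Delimiter, TokenStream),
}

struct TokenStream(Rc<Vec<TokenTree>>);

impl TokenTree {
    // A single token that is not joined to the following token.
    fn token_alone(kind: TokenKind, span: Span) -> TokenTree {
        TokenTree::Token(Token { kind, span }, Spacing::Alone)
    }
    // A single token joined to the following token.
    fn token_joint(kind: TokenKind, span: Span) -> TokenTree {
        TokenTree::Token(Token { kind, span }, Spacing::Joint)
    }
}

impl TokenStream {
    fn new(trees: Vec<TokenTree>) -> TokenStream {
        TokenStream(Rc::new(trees))
    }
    // One-token streams, replacing the old `TokenTree::token(..).into()` pattern.
    fn token_alone(kind: TokenKind, span: Span) -> TokenStream {
        TokenStream::new(vec![TokenTree::token_alone(kind, span)])
    }
    fn token_joint(kind: TokenKind, span: Span) -> TokenStream {
        TokenStream::new(vec![TokenTree::token_joint(kind, span)])
    }
    // A stream consisting of a single delimited group.
    fn delimited(span: DelimSpan, delim: Delimiter, tts: TokenStream) -> TokenStream {
        TokenStream::new(vec![TokenTree::Delimited(span, delim, tts)])
    }
}

fn main() {
    // Rough analogue of `TokenStream::token_alone(token::Semi, stmt.span)`.
    let _stream = TokenStream::token_alone(TokenKind, Span);
}
```
Having `TokenStream`-level constructors is what removes the `.into()` churn: the call site names both the result type and the spacing directly.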

These changes also simplify `Cursor` and `CursorRef`, because they no longer need to distinguish between `next` and `next_with_spacing`.
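
For illustration only, here is a minimal cursor sketch over stand-in types (not the real `Cursor`/`CursorRef`): once `TokenTree::Token` carries its own `Spacing`, a single `next` method covers what previously needed both `next` and `next_with_spacing`.
```rust
#![allow(dead_code)]

// Stand-ins, as in the sketches above (the `Delimited` variant is elided).
#[derive(Clone)]
struct Token;
#[derive(Clone, Copy)]
enum Spacing { Alone, Joint }
#[derive(Clone)]
enum TokenTree { Token(Token, Spacing) }

// With spacing stored inside `TokenTree::Token`, one accessor is enough;
// the old `next`/`next_with_spacing` split has nothing left to distinguish.
struct Cursor {
    trees: Vec<TokenTree>,
    index: usize,
}

impl Cursor {
    fn next(&mut self) -> Option<TokenTree> {
        let tree = self.trees.get(self.index).cloned();
        if tree.is_some() {
            self.index += 1;
        }
        tree
    }
}

fn main() {}
```
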
Author: Nicholas Nethercote
Date:   2022-07-28 10:31:04 +10:00
Parent: bdf520fd41
Commit: 332dffb1f9
23 changed files with 317 additions and 307 deletions

```diff
@@ -152,7 +152,7 @@ impl<'cx, 'a> Context<'cx, 'a> {
     fn build_panic(&self, expr_str: &str, panic_path: Path) -> P<Expr> {
         let escaped_expr_str = escape_to_fmt(expr_str);
         let initial = [
-            TokenTree::token(
+            TokenTree::token_alone(
                 token::Literal(token::Lit {
                     kind: token::LitKind::Str,
                     symbol: Symbol::intern(&if self.fmt_string.is_empty() {
@@ -167,12 +167,12 @@ impl<'cx, 'a> Context<'cx, 'a> {
                 }),
                 self.span,
             ),
-            TokenTree::token(token::Comma, self.span),
+            TokenTree::token_alone(token::Comma, self.span),
         ];
         let captures = self.capture_decls.iter().flat_map(|cap| {
             [
-                TokenTree::token(token::Ident(cap.ident.name, false), cap.ident.span),
-                TokenTree::token(token::Comma, self.span),
+                TokenTree::token_alone(token::Ident(cap.ident.name, false), cap.ident.span),
+                TokenTree::token_alone(token::Comma, self.span),
             ]
         });
         self.cx.expr(
```