Auto merge of #99887 - nnethercote:rm-TreeAndSpacing, r=petrochenkov

Remove `TreeAndSpacing`.

A `TokenStream` contains a `Lrc<Vec<(TokenTree, Spacing)>>`. But this is
not quite right. `Spacing` makes sense for `TokenTree::Token`, but does
not make sense for `TokenTree::Delimited`, because a
`TokenTree::Delimited` cannot be joined with another `TokenTree`.

This commit fixes this problem, by adding `Spacing` to `TokenTree::Token`,
changing `TokenStream` to contain a `Lrc<Vec<TokenTree>>`, and removing the
`TreeAndSpacing` typedef.
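For orientation, here is a rough sketch of the before/after shape of the data structures. The stub types stand in for the real `rustc_ast` definitions and field details are elided; this is an approximation, not the exact source:

```rust
// Stand-in stubs so the sketch compiles; the real types live in rustc_ast.
type Lrc<T> = std::sync::Arc<T>;
pub struct Token;
pub struct DelimSpan;
pub struct Delimiter;
pub enum Spacing { Alone, Joint }

// Before: spacing was stored next to every tree in the stream, even though it
// only ever meant anything for the `Token` variant.
pub enum OldTokenTree {
    Token(Token),
    Delimited(DelimSpan, Delimiter, OldTokenStream),
}
pub struct OldTokenStream(Lrc<Vec<(OldTokenTree, Spacing)>>);

// After: spacing lives on `TokenTree::Token` itself, and the stream is just a
// shared vector of trees.
pub enum NewTokenTree {
    Token(Token, Spacing),
    Delimited(DelimSpan, Delimiter, NewTokenStream),
}
pub struct NewTokenStream(Lrc<Vec<NewTokenTree>>);
```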

The commit removes these two impls:
- `impl From<TokenTree> for TokenStream`
- `impl From<TokenTree> for TreeAndSpacing`

These were useful, but also resulted in code with many `.into()` calls
that was hard to read, particularly for anyone not highly familiar with
the relevant types. This commit makes some other changes to compensate:
- `TokenTree::token()` becomes `TokenTree::token_{alone,joint}()`.
- `TokenStream::token_{alone,joint}()` are added.
- `TokenStream::delimited` is added.

This results in things like this:
```rust
TokenTree::token(token::Semi, stmt.span).into()
```
changing to this:
```rust
TokenStream::token_alone(token::Semi, stmt.span)
```
This makes the type of the result, and its spacing, clearer.
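The new constructors have roughly the following shapes; the signatures are paraphrased here with placeholder types rather than the exact `rustc_ast` definitions:

```rust
// Placeholder types standing in for rustc_ast's TokenKind, Span, etc.
pub struct TokenKind;
pub struct Span;
pub struct DelimSpan;
pub struct Delimiter;
pub struct TokenTree;
pub struct TokenStream;

impl TokenTree {
    // Build a single token tree carrying Spacing::Alone / Spacing::Joint.
    pub fn token_alone(_kind: TokenKind, _span: Span) -> TokenTree { todo!() }
    pub fn token_joint(_kind: TokenKind, _span: Span) -> TokenTree { todo!() }
}

impl TokenStream {
    // Build a one-token stream directly, replacing the old
    // `TokenTree::token(..).into()` pattern.
    pub fn token_alone(_kind: TokenKind, _span: Span) -> TokenStream { todo!() }
    pub fn token_joint(_kind: TokenKind, _span: Span) -> TokenStream { todo!() }
    // Wrap an inner stream in a delimited group.
    pub fn delimited(_span: DelimSpan, _delim: Delimiter, _tts: TokenStream) -> TokenStream {
        todo!()
    }
}
```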

These changes also simplify `Cursor` and `CursorRef`, because they no longer
need to distinguish between `next` and `next_with_spacing`.
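To illustrate what this enables, here is a hypothetical, self-contained mock (not the real rustc `Cursor`): once spacing travels with the token, a single `next` is enough.

```rust
// Hypothetical mock, not the real rustc types: spacing rides along with the
// token, so the cursor needs only one `next` method.
#[derive(Clone, Copy, Debug)]
enum Spacing { Alone, Joint }

#[derive(Debug)]
enum TokenTree {
    Token(&'static str, Spacing),
    // Delimited groups carry no spacing; there is nothing to join them to.
    #[allow(dead_code)]
    Delimited(Vec<TokenTree>),
}

struct Cursor {
    trees: std::vec::IntoIter<TokenTree>,
}

impl Iterator for Cursor {
    type Item = TokenTree;
    fn next(&mut self) -> Option<TokenTree> {
        // No separate `next_with_spacing`: callers that care about spacing
        // read it from the returned `Token` variant.
        self.trees.next()
    }
}

fn main() {
    let cursor = Cursor {
        trees: vec![
            TokenTree::Token("a", Spacing::Joint),
            TokenTree::Token("+", Spacing::Alone),
        ]
        .into_iter(),
    };
    for tree in cursor {
        if let TokenTree::Token(text, spacing) = tree {
            println!("{text}: {spacing:?}");
        }
    }
}
```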

r? `@petrochenkov`
Committed by bors on 2022-07-30 14:50:05 +00:00 as commit 1202bbaf48
23 changed files with 317 additions and 307 deletions


@@ -4,7 +4,7 @@ use crate::proc_macro_server;
 use rustc_ast as ast;
 use rustc_ast::ptr::P;
 use rustc_ast::token;
-use rustc_ast::tokenstream::{TokenStream, TokenTree};
+use rustc_ast::tokenstream::TokenStream;
 use rustc_data_structures::sync::Lrc;
 use rustc_errors::ErrorGuaranteed;
 use rustc_parse::parser::ForceCollect;
@@ -123,7 +123,7 @@ impl MultiItemModifier for DeriveProcMacro {
 Annotatable::Stmt(stmt) => token::NtStmt(stmt),
 _ => unreachable!(),
 };
-TokenTree::token(token::Interpolated(Lrc::new(nt)), DUMMY_SP).into()
+TokenStream::token_alone(token::Interpolated(Lrc::new(nt)), DUMMY_SP)
 } else {
 item.to_tokens()
 };