
The parser pushes a `TokenType` to `Parser::expected_token_types` on every call to the various `check`/`eat` methods, and clears it on every call to `bump`. Some of those `TokenType` values are full tokens that require cloning and dropping. This is a *lot* of work for something that is only used in error messages, and it accounts for a significant fraction of parsing execution time.

This commit overhauls `TokenType` so that `Parser::expected_token_types` can be implemented as a bitset. This requires changing `TokenType` to a C-style parameterless enum, and adding `TokenTypeSet` which uses a `u128` for the bits. (The new `TokenType` has 105 variants.)

The new types `ExpTokenPair` and `ExpKeywordPair` are now arguments to the `check`/`eat` methods. This is for maximum speed. The elements in the pairs are always statically known; e.g. a `token::BinOp(token::Star)` is always paired with a `TokenType::Star`. So we now compute `TokenType`s in advance and pass them in to `check`/`eat` rather than the current approach of constructing them on insertion into `expected_token_types`.

Values of these pair types can be produced by the new `exp!` macro, which is used at every `check`/`eat` call site. The macro is for convenience, allowing any pair to be generated from a single identifier.

The ident/keyword filtering in `expected_one_of_not_found` is no longer necessary. It was there to account for some sloppiness in `TokenKind`/`TokenType` comparisons.

The existing `TokenType` is moved to a new file `token_type.rs`, and all its new infrastructure is added to that file. There is more boilerplate code than I would like, but I can't see how to make it shorter.
59 lines
1.6 KiB
Rust
//! The compiler code necessary to support the cfg! extension, which expands to
//! a literal `true` or `false` based on whether the given cfg matches the
//! current compilation environment.

use rustc_ast::token;
use rustc_ast::tokenstream::TokenStream;
use rustc_errors::PResult;
use rustc_expand::base::{DummyResult, ExpandResult, ExtCtxt, MacEager, MacroExpanderResult};
use rustc_parse::exp;
use rustc_span::Span;
use {rustc_ast as ast, rustc_attr_parsing as attr};

use crate::errors;

pub(crate) fn expand_cfg(
    cx: &mut ExtCtxt<'_>,
    sp: Span,
    tts: TokenStream,
) -> MacroExpanderResult<'static> {
    let sp = cx.with_def_site_ctxt(sp);

    ExpandResult::Ready(match parse_cfg(cx, sp, tts) {
        Ok(cfg) => {
            let matches_cfg = attr::cfg_matches(
                &cfg,
                &cx.sess,
                cx.current_expansion.lint_node_id,
                Some(cx.ecfg.features),
            );
            MacEager::expr(cx.expr_bool(sp, matches_cfg))
        }
        Err(err) => {
            let guar = err.emit();
            DummyResult::any(sp, guar)
        }
    })
}

fn parse_cfg<'a>(
    cx: &ExtCtxt<'a>,
    span: Span,
    tts: TokenStream,
) -> PResult<'a, ast::MetaItemInner> {
    let mut p = cx.new_parser_from_tts(tts);

    if p.token == token::Eof {
        return Err(cx.dcx().create_err(errors::RequiresCfgPattern { span }));
    }

    let cfg = p.parse_meta_item_inner()?;

    let _ = p.eat(exp!(Comma));

    if !p.eat(exp!(Eof)) {
        return Err(cx.dcx().create_err(errors::OneCfgPattern { span }));
    }

    Ok(cfg)
}