Auto merge of #50473 - petrochenkov:pmapi, r=alexcrichton

Review proc macro API 1.2

cc https://github.com/rust-lang/rust/issues/38356

Summary of applied changes:
- Documentation for proc macro API 1.2 is expanded.
- Renamed APIs: `Term` -> `Ident`, `TokenTree::Term` -> `TokenTree::Ident`, `Op` -> `Punct`, `TokenTree::Op` -> `TokenTree::Punct`, `Op::op` -> `Punct::as_char`.
- Removed APIs: `Ident::as_str`, use `Display` impl for `Ident` instead.
- New APIs (not stabilized in 1.2): `Ident::new_raw` for creating a raw identifier (I'm not sure `new_x` is a very idiomatic name, though).
- Runtime changes:
    - `Punct::new` now ensures that the input `char` is a valid punctuation character in Rust.
    - `Ident::new` ensures that the input `str` is a valid identifier in Rust.
    - Lifetimes in proc macros are now represented as two joint tokens, `Punct('\'', Spacing::Joint)` followed by `Ident("lifetime_name_without_quote")`, similarly to multi-character operators.
- Stabilized APIs: None yet.
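To make the lifetime change concrete, here is a standalone sketch (using mirror types, not the real `proc_macro` API, so it can run outside a macro expansion context) of how a lifetime such as `'static` is now tokenized as a joint `Punct` followed by an `Ident`:

```rust
// Standalone model of the new lifetime tokenization. `Token` and `Spacing`
// are illustrative mirrors of `proc_macro::TokenTree` / `proc_macro::Spacing`.
#[derive(Debug, Clone, PartialEq)]
enum Spacing {
    Alone,
    Joint,
}

#[derive(Debug, Clone, PartialEq)]
enum Token {
    Punct(char, Spacing),
    Ident(String),
}

// Hypothetical helper: split a lifetime into the two tokens described above.
fn tokenize_lifetime(lt: &str) -> Vec<Token> {
    let name = lt.strip_prefix('\'').expect("lifetime must start with `'`");
    vec![
        // The quote is `Joint` because it joins with the following identifier.
        Token::Punct('\'', Spacing::Joint),
        // The identifier carries the lifetime name without the quote.
        Token::Ident(name.to_string()),
    ]
}

fn main() {
    let tokens = tokenize_lifetime("'static");
    assert_eq!(tokens[0], Token::Punct('\'', Spacing::Joint));
    assert_eq!(tokens[1], Token::Ident("static".to_string()));
    println!("{:?}", tokens);
}
```

This mirrors how multi-character operators like `+=` are already delivered as a `Joint` punct followed by another token.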

A bit of motivation for renaming (although it was already stated in the review comments):
- With my compiler frontend glasses on, `Ident` is the single most appropriate name for this thing, *especially* if we are doing input validation on construction. `TokenTree::Ident` effectively wraps `token::Ident` or `ast::Ident + is_raw`; its meaning is "identifier" and it's already named `ident` in declarative macros.
- Regarding `Punct`, the motivation is that `Op` is actively misleading. The thing doesn't mean an operator: it's neither a subset of operators (there is non-operator punctuation in the language) nor a superset (operators can be multi-character while this thing is always a single character). So I named it `Punct` (first proposed in [the original RFC](https://github.com/rust-lang/rfcs/pull/1566), then [by @SimonSapin](https://github.com/rust-lang/rust/issues/38356#issuecomment-276676526)); together with input validation it's now a subset of the ASCII punctuation character category (`u8::is_ascii_punctuation`).
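The subset claim can be checked mechanically. The sketch below replicates the character list that `Punct::new` validates against in this PR (the list itself is taken from the diff; the function name is illustrative, not part of the API) and verifies that every accepted character is ASCII punctuation, while the converse does not hold:

```rust
// Mirrors the LEGAL_CHARS validation list added to `Punct::new` in this PR.
// `is_valid_punct_char` is an illustrative standalone helper, not a real API.
fn is_valid_punct_char(ch: char) -> bool {
    const LEGAL_CHARS: &[char] = &['=', '<', '>', '!', '~', '+', '-', '*', '/', '%', '^',
                                   '&', '|', '@', '.', ',', ';', ':', '#', '$', '?', '\''];
    LEGAL_CHARS.contains(&ch)
}

fn main() {
    // Subset: every character `Punct::new` accepts is ASCII punctuation.
    for ch in "=<>!~+-*/%^&|@.,;:#$?'".chars() {
        assert!(is_valid_punct_char(ch));
        assert!(ch.is_ascii_punctuation());
    }
    // Not equal to the full category: `{` is ASCII punctuation,
    // but brackets are delimiters, not `Punct`s.
    assert!('{'.is_ascii_punctuation());
    assert!(!is_valid_punct_char('{'));
    println!("Punct chars are a strict subset of ASCII punctuation");
}
```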
This commit is contained in:
bors 2018-05-16 11:18:05 +00:00
commit 2a3f5367a2
53 changed files with 595 additions and 210 deletions


@ -11,13 +11,11 @@
//! A support library for macro authors when defining new macros.
//!
//! This library, provided by the standard distribution, provides the types
//! consumed in the interfaces of procedurally defined macro definitions.
//! Currently the primary use of this crate is to provide the ability to define
//! new custom derive modes through `#[proc_macro_derive]`.
//! consumed in the interfaces of procedurally defined macro definitions such as
//! function-like macros `#[proc_macro]`, macro attributes `#[proc_macro_attribute]` and
//! custom derive attributes `#[proc_macro_derive]`.
//!
//! Note that this crate is intentionally very bare-bones currently. The main
//! type, `TokenStream`, only supports `fmt::Display` and `FromStr`
//! implementations, indicating that it can only go to and come from a string.
//! Note that this crate is intentionally bare-bones currently.
//! This functionality is intended to be expanded over time as more surface
//! area for macro authors is stabilized.
//!
@ -55,18 +53,19 @@ use std::str::FromStr;
use syntax::ast;
use syntax::errors::DiagnosticBuilder;
use syntax::parse::{self, token};
use syntax::symbol::Symbol;
use syntax::symbol::{keywords, Symbol};
use syntax::tokenstream;
use syntax::parse::lexer::comments;
use syntax::parse::lexer::{self, comments};
use syntax_pos::{FileMap, Pos, SyntaxContext, FileName};
use syntax_pos::hygiene::Mark;
/// The main type provided by this crate, representing an abstract stream of
/// tokens.
/// tokens, or, more specifically, a sequence of token trees.
/// The type provides interfaces for iterating over those token trees and, conversely,
/// collecting a number of token trees into one stream.
///
/// This is both the input and output of `#[proc_macro_derive]` definitions.
/// Currently it's required to be a list of valid Rust items, but this
/// restriction may be lifted in the future.
/// This is both the input and output of `#[proc_macro]`, `#[proc_macro_attribute]`
/// and `#[proc_macro_derive]` definitions.
///
/// The API of this type is intentionally bare-bones, but it'll be expanded over
/// time!
@ -74,9 +73,9 @@ use syntax_pos::hygiene::Mark;
#[derive(Clone)]
pub struct TokenStream(tokenstream::TokenStream);
#[unstable(feature = "proc_macro", issue = "38356")]
#[stable(feature = "proc_macro_lib", since = "1.15.0")]
impl !Send for TokenStream {}
#[unstable(feature = "proc_macro", issue = "38356")]
#[stable(feature = "proc_macro_lib", since = "1.15.0")]
impl !Sync for TokenStream {}
/// Error returned from `TokenStream::from_str`.
@ -86,13 +85,13 @@ pub struct LexError {
_inner: (),
}
#[unstable(feature = "proc_macro", issue = "38356")]
#[stable(feature = "proc_macro_lib", since = "1.15.0")]
impl !Send for LexError {}
#[unstable(feature = "proc_macro", issue = "38356")]
#[stable(feature = "proc_macro_lib", since = "1.15.0")]
impl !Sync for LexError {}
impl TokenStream {
/// Returns an empty `TokenStream`.
/// Returns an empty `TokenStream` containing no token trees.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn empty() -> TokenStream {
TokenStream(tokenstream::TokenStream::empty())
@ -105,6 +104,12 @@ impl TokenStream {
}
}
/// Attempts to break the string into tokens and parse those tokens into a token stream.
/// May fail for a number of reasons, for example, if the string contains unbalanced delimiters
/// or characters that do not exist in the language.
///
/// NOTE: Some errors may cause panics instead of returning `LexError`. We reserve the right to
/// change these errors into `LexError`s later.
#[stable(feature = "proc_macro_lib", since = "1.15.0")]
impl FromStr for TokenStream {
type Err = LexError;
@ -125,6 +130,9 @@ impl FromStr for TokenStream {
}
}
/// Prints the token stream as a string that is supposed to be losslessly convertible back
/// into the same token stream (modulo spans), except for possibly `TokenTree::Group`s
/// with `Delimiter::None` delimiters and negative numeric literals.
#[stable(feature = "proc_macro_lib", since = "1.15.0")]
impl fmt::Display for TokenStream {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
@ -132,6 +140,7 @@ impl fmt::Display for TokenStream {
}
}
/// Prints the token stream in a form convenient for debugging.
#[stable(feature = "proc_macro_lib", since = "1.15.0")]
impl fmt::Debug for TokenStream {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
@ -140,6 +149,7 @@ impl fmt::Debug for TokenStream {
}
}
/// Creates a token stream containing a single token tree.
#[unstable(feature = "proc_macro", issue = "38356")]
impl From<TokenTree> for TokenStream {
fn from(tree: TokenTree) -> TokenStream {
@ -147,6 +157,7 @@ impl From<TokenTree> for TokenStream {
}
}
/// Collects a number of token trees into a single stream.
#[unstable(feature = "proc_macro", issue = "38356")]
impl iter::FromIterator<TokenTree> for TokenStream {
fn from_iter<I: IntoIterator<Item = TokenTree>>(trees: I) -> Self {
@ -154,7 +165,9 @@ impl iter::FromIterator<TokenTree> for TokenStream {
}
}
#[unstable(feature = "proc_macro", issue = "38356")]
/// A "flattening" operation on token streams that collects token trees
/// from multiple token streams into a single stream.
#[stable(feature = "proc_macro_lib", since = "1.15.0")]
impl iter::FromIterator<TokenStream> for TokenStream {
fn from_iter<I: IntoIterator<Item = TokenStream>>(streams: I) -> Self {
let mut builder = tokenstream::TokenStreamBuilder::new();
@ -165,7 +178,7 @@ impl iter::FromIterator<TokenStream> for TokenStream {
}
}
/// Implementation details for the `TokenTree` type, such as iterators.
/// Public implementation details for the `TokenStream` type, such as iterators.
#[unstable(feature = "proc_macro", issue = "38356")]
pub mod token_stream {
use syntax::tokenstream;
@ -173,7 +186,9 @@ pub mod token_stream {
use {TokenTree, TokenStream, Delimiter};
/// An iterator over `TokenTree`s.
/// An iterator over `TokenStream`'s `TokenTree`s.
/// The iteration is "shallow", i.e. the iterator doesn't recurse into delimited groups,
/// and returns whole groups as token trees.
#[derive(Clone)]
#[unstable(feature = "proc_macro", issue = "38356")]
pub struct IntoIter {
@ -191,6 +206,12 @@ pub mod token_stream {
let next = self.cursor.next_as_stream()?;
Some(TokenTree::from_internal(next, &mut self.stack))
})?;
// HACK: The condition "dummy span + group with empty delimiter" represents an AST
// fragment approximately converted into a token stream. This may happen, for
// example, with inputs to proc macro attributes, including derives. Such "groups"
// need to be flattened during iteration over the stream's token trees.
// Eventually this needs to be removed in favor of keeping original token trees
// and not doing the roundtrip through AST.
if tree.span().0 == DUMMY_SP {
if let TokenTree::Group(ref group) = tree {
if group.delimiter() == Delimiter::None {
@ -217,7 +238,7 @@ pub mod token_stream {
/// `quote!(..)` accepts arbitrary tokens and expands into a `TokenStream` describing the input.
/// For example, `quote!(a + b)` will produce an expression that, when evaluated, constructs
/// the `TokenStream` `[Word("a"), Op('+', Alone), Word("b")]`.
/// the `TokenStream` `[Ident("a"), Punct('+', Alone), Ident("b")]`.
///
/// Unquoting is done with `$`, and works by taking the single next ident as the unquoted term.
/// To quote `$` itself, use `$$`.
@ -268,6 +289,9 @@ impl Span {
}
/// The span of the invocation of the current procedural macro.
/// Identifiers created with this span will be resolved as if they were written
/// directly at the macro call location (call-site hygiene) and other code
/// at the macro call site will be able to refer to them as well.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn call_site() -> Span {
::__internal::with_sess(|(_, mark)| Span(mark.expn_info().unwrap().call_site))
@ -355,6 +379,7 @@ impl Span {
diagnostic_method!(help, Level::Help);
}
/// Prints a span in a form convenient for debugging.
#[unstable(feature = "proc_macro", issue = "38356")]
impl fmt::Debug for Span {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
@ -460,12 +485,12 @@ impl PartialEq<FileName> for SourceFile {
#[unstable(feature = "proc_macro", issue = "38356")]
#[derive(Clone)]
pub enum TokenTree {
/// A delimited tokenstream
/// A token stream surrounded by bracket delimiters.
Group(Group),
/// A unicode identifier
Term(Term),
/// A punctuation character (`+`, `,`, `$`, etc.).
Op(Op),
/// An identifier.
Ident(Ident),
/// A single punctuation character (`+`, `,`, `$`, etc.).
Punct(Punct),
/// A literal character (`'a'`), string (`"hello"`), number (`2.3`), etc.
Literal(Literal),
}
@ -476,14 +501,14 @@ impl !Send for TokenTree {}
impl !Sync for TokenTree {}
impl TokenTree {
/// Returns the span of this token, accessing the `span` method of each of
/// the internal tokens.
/// Returns the span of this tree, delegating to the `span` method of
/// the contained token or a delimited stream.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn span(&self) -> Span {
match *self {
TokenTree::Group(ref t) => t.span(),
TokenTree::Term(ref t) => t.span(),
TokenTree::Op(ref t) => t.span(),
TokenTree::Ident(ref t) => t.span(),
TokenTree::Punct(ref t) => t.span(),
TokenTree::Literal(ref t) => t.span(),
}
}
@ -497,13 +522,14 @@ impl TokenTree {
pub fn set_span(&mut self, span: Span) {
match *self {
TokenTree::Group(ref mut t) => t.set_span(span),
TokenTree::Term(ref mut t) => t.set_span(span),
TokenTree::Op(ref mut t) => t.set_span(span),
TokenTree::Ident(ref mut t) => t.set_span(span),
TokenTree::Punct(ref mut t) => t.set_span(span),
TokenTree::Literal(ref mut t) => t.set_span(span),
}
}
}
/// Prints the token tree in a form convenient for debugging.
#[unstable(feature = "proc_macro", issue = "38356")]
impl fmt::Debug for TokenTree {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
@ -511,8 +537,8 @@ impl fmt::Debug for TokenTree {
// so don't bother with an extra layer of indirection
match *self {
TokenTree::Group(ref tt) => tt.fmt(f),
TokenTree::Term(ref tt) => tt.fmt(f),
TokenTree::Op(ref tt) => tt.fmt(f),
TokenTree::Ident(ref tt) => tt.fmt(f),
TokenTree::Punct(ref tt) => tt.fmt(f),
TokenTree::Literal(ref tt) => tt.fmt(f),
}
}
@ -526,16 +552,16 @@ impl From<Group> for TokenTree {
}
#[unstable(feature = "proc_macro", issue = "38356")]
impl From<Term> for TokenTree {
fn from(g: Term) -> TokenTree {
TokenTree::Term(g)
impl From<Ident> for TokenTree {
fn from(g: Ident) -> TokenTree {
TokenTree::Ident(g)
}
}
#[unstable(feature = "proc_macro", issue = "38356")]
impl From<Op> for TokenTree {
fn from(g: Op) -> TokenTree {
TokenTree::Op(g)
impl From<Punct> for TokenTree {
fn from(g: Punct) -> TokenTree {
TokenTree::Punct(g)
}
}
@ -546,23 +572,24 @@ impl From<Literal> for TokenTree {
}
}
/// Prints the token tree as a string that is supposed to be losslessly convertible back
/// into the same token tree (modulo spans), except for possibly `TokenTree::Group`s
/// with `Delimiter::None` delimiters and negative numeric literals.
#[unstable(feature = "proc_macro", issue = "38356")]
impl fmt::Display for TokenTree {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
TokenTree::Group(ref t) => t.fmt(f),
TokenTree::Term(ref t) => t.fmt(f),
TokenTree::Op(ref t) => t.fmt(f),
TokenTree::Ident(ref t) => t.fmt(f),
TokenTree::Punct(ref t) => t.fmt(f),
TokenTree::Literal(ref t) => t.fmt(f),
}
}
}
/// A delimited token stream
/// A delimited token stream.
///
/// A `Group` internally contains a `TokenStream` which is delimited by a
/// `Delimiter`. Groups represent multiple tokens internally and have a `Span`
/// for the entire stream.
/// A `Group` internally contains a `TokenStream` which is surrounded by `Delimiter`s.
#[derive(Clone, Debug)]
#[unstable(feature = "proc_macro", issue = "38356")]
pub struct Group {
@ -586,12 +613,16 @@ pub enum Delimiter {
Brace,
/// `[ ... ]`
Bracket,
/// An implicit delimiter, e.g. `$var`, where $var is `...`.
/// `Ø ... Ø`
/// An implicit delimiter that may, for example, appear around tokens coming from a
/// "macro variable" `$var`. It is important to preserve operator priorities in cases like
/// `$var * 3` where `$var` is `1 + 2`.
/// Implicit delimiters may not survive the roundtrip of a token stream through a string.
None,
}
impl Group {
/// Creates a new `group` with the given delimiter and token stream.
/// Creates a new `Group` with the given delimiter and token stream.
///
/// This constructor will set the span for this group to
/// `Span::call_site()`. To change the span you can use the `set_span`
@ -639,6 +670,9 @@ impl Group {
}
}
/// Prints the group as a string that should be losslessly convertible back
/// into the same group (modulo spans), except for possibly `TokenTree::Group`s
/// with `Delimiter::None` delimiters.
#[unstable(feature = "proc_macro", issue = "38356")]
impl fmt::Display for Group {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
@ -646,145 +680,181 @@ impl fmt::Display for Group {
}
}
/// An `Op` is an operator like `+` or `-`, and only represents one character.
/// A `Punct` is a single punctuation character like `+`, `-` or `#`.
///
/// Operators like `+=` are represented as two instance of `Op` with different
/// Multicharacter operators like `+=` are represented as two instances of `Punct` with different
/// forms of `Spacing` returned.
#[unstable(feature = "proc_macro", issue = "38356")]
#[derive(Copy, Clone, Debug)]
pub struct Op {
op: char,
#[derive(Clone, Debug)]
pub struct Punct {
ch: char,
spacing: Spacing,
span: Span,
}
#[unstable(feature = "proc_macro", issue = "38356")]
impl !Send for Op {}
impl !Send for Punct {}
#[unstable(feature = "proc_macro", issue = "38356")]
impl !Sync for Op {}
impl !Sync for Punct {}
/// Whether an `Op` is either followed immediately by another `Op` or followed by whitespace.
/// Whether a `Punct` is followed immediately by another `Punct` or
/// followed by another token or whitespace.
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
#[unstable(feature = "proc_macro", issue = "38356")]
pub enum Spacing {
/// e.g. `+` is `Alone` in `+ =`.
/// E.g. `+` is `Alone` in `+ =`, `+ident` or `+()`.
Alone,
/// e.g. `+` is `Joint` in `+=`.
/// E.g. `+` is `Joint` in `+=` or `'#`.
/// Additionally, single quote `'` can join with identifiers to form lifetimes `'ident`.
Joint,
}
impl Op {
/// Creates a new `Op` from the given character and spacing.
impl Punct {
/// Creates a new `Punct` from the given character and spacing.
/// The `ch` argument must be a valid punctuation character permitted by the language,
/// otherwise the function will panic.
///
/// The returned `Op` will have the default span of `Span::call_site()`
/// The returned `Punct` will have the default span of `Span::call_site()`
/// which can be further configured with the `set_span` method below.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn new(op: char, spacing: Spacing) -> Op {
Op {
op: op,
pub fn new(ch: char, spacing: Spacing) -> Punct {
const LEGAL_CHARS: &[char] = &['=', '<', '>', '!', '~', '+', '-', '*', '/', '%', '^',
'&', '|', '@', '.', ',', ';', ':', '#', '$', '?', '\''];
if !LEGAL_CHARS.contains(&ch) {
panic!("unsupported character `{:?}`", ch)
}
Punct {
ch: ch,
spacing: spacing,
span: Span::call_site(),
}
}
/// Returns the character this operation represents, for example `'+'`
/// Returns the value of this punctuation character as `char`.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn op(&self) -> char {
self.op
pub fn as_char(&self) -> char {
self.ch
}
/// Returns the spacing of this operator, indicating whether it's a joint
/// operator with more operators coming next in the token stream or an
/// `Alone` meaning that the operator has ended.
/// Returns the spacing of this punctuation character, indicating whether it's immediately
/// followed by another `Punct` in the token stream, so they can potentially be combined into
/// a multicharacter operator (`Joint`), or it's followed by some other token or whitespace
/// (`Alone`) so the operator has certainly ended.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn spacing(&self) -> Spacing {
self.spacing
}
/// Returns the span for this operator character
/// Returns the span for this punctuation character.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn span(&self) -> Span {
self.span
}
/// Configure the span for this operator's character
/// Configure the span for this punctuation character.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn set_span(&mut self, span: Span) {
self.span = span;
}
}
/// Prints the punctuation character as a string that should be losslessly convertible
/// back into the same character.
#[unstable(feature = "proc_macro", issue = "38356")]
impl fmt::Display for Op {
impl fmt::Display for Punct {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
TokenStream::from(TokenTree::from(self.clone())).fmt(f)
}
}
/// An interned string.
#[derive(Copy, Clone, Debug)]
/// An identifier (`ident`).
#[derive(Clone, Debug)]
#[unstable(feature = "proc_macro", issue = "38356")]
pub struct Term {
pub struct Ident {
sym: Symbol,
span: Span,
is_raw: bool,
}
#[unstable(feature = "proc_macro", issue = "38356")]
impl !Send for Term {}
impl !Send for Ident {}
#[unstable(feature = "proc_macro", issue = "38356")]
impl !Sync for Term {}
impl !Sync for Ident {}
impl Term {
/// Creates a new `Term` with the given `string` as well as the specified
impl Ident {
/// Creates a new `Ident` with the given `string` as well as the specified
/// `span`.
/// The `string` argument must be a valid identifier permitted by the
/// language, otherwise the function will panic.
///
/// Note that `span`, currently in rustc, configures the hygiene information
/// for this identifier. As of this time `Span::call_site()` explicitly
/// opts-in to **non-hygienic** information (aka copy/pasted code) while
/// spans like `Span::def_site()` will opt-in to hygienic information,
/// meaning that code at the call site of the macro can't access this
/// identifier.
/// for this identifier.
///
/// As of this time `Span::call_site()` explicitly opts-in to "call-site" hygiene
/// meaning that identifiers created with this span will be resolved as if they were written
/// directly at the location of the macro call, and other code at the macro call site will be
/// able to refer to them as well.
///
/// Later spans like `Span::def_site()` will allow opting in to "definition-site" hygiene,
/// meaning that identifiers created with this span will be resolved at the location of the
/// macro definition and other code at the macro call site will not be able to refer to them.
///
/// Due to the current importance of hygiene this constructor, unlike those of other
/// tokens, requires a `Span` to be specified at construction.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn new(string: &str, span: Span) -> Term {
Term {
pub fn new(string: &str, span: Span) -> Ident {
if !lexer::is_valid_ident(string) {
panic!("`{:?}` is not a valid identifier", string)
}
Ident {
sym: Symbol::intern(string),
span,
is_raw: false,
}
}
// FIXME: Remove this, do not stabilize
/// Get a reference to the interned string.
/// Same as `Ident::new`, but creates a raw identifier (`r#ident`).
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn as_str(&self) -> &str {
unsafe { &*(&*self.sym.as_str() as *const str) }
pub fn new_raw(string: &str, span: Span) -> Ident {
let mut ident = Ident::new(string, span);
if ident.sym == keywords::Underscore.name() ||
token::is_path_segment_keyword(ast::Ident::with_empty_ctxt(ident.sym)) {
panic!("`{:?}` is not a valid raw identifier", string)
}
ident.is_raw = true;
ident
}
/// Returns the span of this `Term`, encompassing the entire string returned
/// Returns the span of this `Ident`, encompassing the entire identifier.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn span(&self) -> Span {
self.span
}
/// Configures the span of this `Term`, possibly changing hygiene
/// information.
/// Configures the span of this `Ident`, possibly changing its hygiene context.
#[unstable(feature = "proc_macro", issue = "38356")]
pub fn set_span(&mut self, span: Span) {
self.span = span;
}
}
/// Prints the identifier as a string that should be losslessly convertible
/// back into the same identifier.
#[unstable(feature = "proc_macro", issue = "38356")]
impl fmt::Display for Term {
impl fmt::Display for Ident {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
if self.is_raw {
f.write_str("r#")?;
}
self.sym.as_str().fmt(f)
}
}
/// A literal character (`'a'`), string (`"hello"`), a number (`2.3`), etc.
/// A literal string (`"hello"`), byte string (`b"hello"`),
/// character (`'a'`), byte character (`b'a'`), an integer or floating point number
/// with or without a suffix (`1`, `1u8`, `2.3`, `2.3f32`).
/// Boolean literals like `true` and `false` do not belong here; they are `Ident`s.
#[derive(Clone, Debug)]
#[unstable(feature = "proc_macro", issue = "38356")]
pub struct Literal {
@ -805,6 +875,8 @@ macro_rules! suffixed_int_literals {
/// This function will create an integer like `1u32` where the integer
/// value specified is the first part of the token and the integral is
/// also suffixed at the end.
/// Literals created from negative numbers may not survive roundtrips through
/// `TokenStream` or strings and may be broken into two tokens (`-` and positive literal).
///
/// Literals created through this method have the `Span::call_site()`
/// span by default, which can be configured with the `set_span` method
@ -829,6 +901,8 @@ macro_rules! unsuffixed_int_literals {
/// specified on this token, meaning that invocations like
/// `Literal::i8_unsuffixed(1)` are equivalent to
/// `Literal::u32_unsuffixed(1)`.
/// Literals created from negative numbers may not survive roundtrips through
/// `TokenStream` or strings and may be broken into two tokens (`-` and positive literal).
///
/// Literals created through this method have the `Span::call_site()`
/// span by default, which can be configured with the `set_span` method
@ -880,6 +954,8 @@ impl Literal {
/// This constructor is similar to those like `Literal::i8_unsuffixed` where
/// the float's value is emitted directly into the token but no suffix is
/// used, so it may be inferred to be a `f64` later in the compiler.
/// Literals created from negative numbers may not survive roundtrips through
/// `TokenStream` or strings and may be broken into two tokens (`-` and positive literal).
///
/// # Panics
///
@ -903,6 +979,8 @@ impl Literal {
/// specified is the preceding part of the token and `f32` is the suffix of
/// the token. This token will always be inferred to be an `f32` in the
/// compiler.
/// Literals created from negative numbers may not survive roundtrips through
/// `TokenStream` or strings and may be broken into two tokens (`-` and positive literal).
///
/// # Panics
///
@ -925,6 +1003,8 @@ impl Literal {
/// This constructor is similar to those like `Literal::i8_unsuffixed` where
/// the float's value is emitted directly into the token but no suffix is
/// used, so it may be inferred to be a `f64` later in the compiler.
/// Literals created from negative numbers may not survive roundtrips through
/// `TokenStream` or strings and may be broken into two tokens (`-` and positive literal).
///
/// # Panics
///
@ -948,6 +1028,8 @@ impl Literal {
/// specified is the preceding part of the token and `f64` is the suffix of
/// the token. This token will always be inferred to be an `f64` in the
/// compiler.
/// Literals created from negative numbers may not survive roundtrips through
/// `TokenStream` or strings and may be broken into two tokens (`-` and positive literal).
///
/// # Panics
///
@ -1016,6 +1098,8 @@ impl Literal {
}
}
/// Prints the literal as a string that should be losslessly convertible
/// back into the same literal (except for possible rounding for floating point literals).
#[unstable(feature = "proc_macro", issue = "38356")]
impl fmt::Display for Literal {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
@ -1068,15 +1152,15 @@ impl TokenTree {
})
}
macro_rules! op {
($a:expr) => (tt!(Op::new($a, op_kind)));
($a:expr) => (tt!(Punct::new($a, op_kind)));
($a:expr, $b:expr) => ({
stack.push(tt!(Op::new($b, op_kind)));
tt!(Op::new($a, Spacing::Joint))
stack.push(tt!(Punct::new($b, op_kind)));
tt!(Punct::new($a, Spacing::Joint))
});
($a:expr, $b:expr, $c:expr) => ({
stack.push(tt!(Op::new($c, op_kind)));
stack.push(tt!(Op::new($b, Spacing::Joint)));
tt!(Op::new($a, Spacing::Joint))
stack.push(tt!(Punct::new($c, op_kind)));
stack.push(tt!(Punct::new($b, Spacing::Joint)));
tt!(Punct::new($a, Spacing::Joint))
})
}
@ -1127,27 +1211,33 @@ impl TokenTree {
Pound => op!('#'),
Dollar => op!('$'),
Question => op!('?'),
SingleQuote => op!('\''),
Ident(ident, false) | Lifetime(ident) => {
tt!(Term::new(&ident.name.as_str(), Span(span)))
Ident(ident, false) => {
tt!(self::Ident::new(&ident.name.as_str(), Span(span)))
}
Ident(ident, true) => {
tt!(Term::new(&format!("r#{}", ident), Span(span)))
tt!(self::Ident::new_raw(&ident.name.as_str(), Span(span)))
}
Lifetime(ident) => {
let ident = ident.without_first_quote();
stack.push(tt!(self::Ident::new(&ident.name.as_str(), Span(span))));
tt!(Punct::new('\'', Spacing::Joint))
}
Literal(lit, suffix) => tt!(self::Literal { lit, suffix, span: Span(span) }),
DocComment(c) => {
let style = comments::doc_comment_style(&c.as_str());
let stripped = comments::strip_doc_comment_decoration(&c.as_str());
let stream = vec![
tt!(Term::new("doc", Span(span))),
tt!(Op::new('=', Spacing::Alone)),
tt!(self::Ident::new("doc", Span(span))),
tt!(Punct::new('=', Spacing::Alone)),
tt!(self::Literal::string(&stripped)),
].into_iter().collect();
stack.push(tt!(Group::new(Delimiter::Bracket, stream)));
if style == ast::AttrStyle::Inner {
stack.push(tt!(Op::new('!', Spacing::Alone)));
stack.push(tt!(Punct::new('!', Spacing::Alone)));
}
tt!(Op::new('#', Spacing::Alone))
tt!(Punct::new('#', Spacing::Alone))
}
Interpolated(_) => {
@ -1167,26 +1257,16 @@ impl TokenTree {
use syntax::parse::token::*;
use syntax::tokenstream::{TokenTree, Delimited};
let (op, kind, span) = match self {
self::TokenTree::Op(tt) => (tt.op(), tt.spacing(), tt.span()),
let (ch, kind, span) = match self {
self::TokenTree::Punct(tt) => (tt.as_char(), tt.spacing(), tt.span()),
self::TokenTree::Group(tt) => {
return TokenTree::Delimited(tt.span.0, Delimited {
delim: tt.delimiter.to_internal(),
tts: tt.stream.0.into(),
}).into();
},
self::TokenTree::Term(tt) => {
let ident = ast::Ident::new(tt.sym, tt.span.0);
let sym_str = tt.sym.to_string();
let token = if sym_str.starts_with("'") {
Lifetime(ident)
} else if sym_str.starts_with("r#") {
let name = Symbol::intern(&sym_str[2..]);
let ident = ast::Ident::new(name, ident.span);
Ident(ident, true)
} else {
Ident(ident, false)
};
self::TokenTree::Ident(tt) => {
let token = Ident(ast::Ident::new(tt.sym, tt.span.0), tt.is_raw);
return TokenTree::Token(tt.span.0, token).into();
}
self::TokenTree::Literal(self::Literal {
@ -1223,7 +1303,7 @@ impl TokenTree {
}
};
let token = match op {
let token = match ch {
'=' => Eq,
'<' => Lt,
'>' => Gt,
@ -1245,7 +1325,8 @@ impl TokenTree {
'#' => Pound,
'$' => Dollar,
'?' => Question,
_ => panic!("unsupported character {}", op),
'\'' => SingleQuote,
_ => unreachable!(),
};
let tree = TokenTree::Token(span.0, token);
@ -1268,7 +1349,7 @@ impl TokenTree {
#[unstable(feature = "proc_macro_internals", issue = "27812")]
#[doc(hidden)]
pub mod __internal {
pub use quote::{LiteralKind, Quoter, unquote};
pub use quote::{LiteralKind, SpannedSymbol, Quoter, unquote};
use std::cell::Cell;


@ -14,10 +14,11 @@
//! This quasiquoter uses macros 2.0 hygiene to reliably access
//! items from `proc_macro`, to build a `proc_macro::TokenStream`.
use {Delimiter, Literal, Spacing, Span, Term, Op, Group, TokenStream, TokenTree};
use {Delimiter, Literal, Spacing, Span, Ident, Punct, Group, TokenStream, TokenTree};
use syntax::ext::base::{ExtCtxt, ProcMacro};
use syntax::parse::token;
use syntax::symbol::Symbol;
use syntax::tokenstream;
pub struct Quoter;
@ -35,14 +36,14 @@ macro_rules! tt2ts {
}
macro_rules! quote_tok {
(,) => { tt2ts!(Op::new(',', Spacing::Alone)) };
(.) => { tt2ts!(Op::new('.', Spacing::Alone)) };
(:) => { tt2ts!(Op::new(':', Spacing::Alone)) };
(|) => { tt2ts!(Op::new('|', Spacing::Alone)) };
(,) => { tt2ts!(Punct::new(',', Spacing::Alone)) };
(.) => { tt2ts!(Punct::new('.', Spacing::Alone)) };
(:) => { tt2ts!(Punct::new(':', Spacing::Alone)) };
(|) => { tt2ts!(Punct::new('|', Spacing::Alone)) };
(::) => {
[
TokenTree::from(Op::new(':', Spacing::Joint)),
TokenTree::from(Op::new(':', Spacing::Alone)),
TokenTree::from(Punct::new(':', Spacing::Joint)),
TokenTree::from(Punct::new(':', Spacing::Alone)),
].iter()
.cloned()
.map(|mut x| {
@ -51,13 +52,13 @@ macro_rules! quote_tok {
})
.collect::<TokenStream>()
};
(!) => { tt2ts!(Op::new('!', Spacing::Alone)) };
(<) => { tt2ts!(Op::new('<', Spacing::Alone)) };
(>) => { tt2ts!(Op::new('>', Spacing::Alone)) };
(_) => { tt2ts!(Op::new('_', Spacing::Alone)) };
(!) => { tt2ts!(Punct::new('!', Spacing::Alone)) };
(<) => { tt2ts!(Punct::new('<', Spacing::Alone)) };
(>) => { tt2ts!(Punct::new('>', Spacing::Alone)) };
(_) => { tt2ts!(Punct::new('_', Spacing::Alone)) };
(0) => { tt2ts!(Literal::i8_unsuffixed(0)) };
(&) => { tt2ts!(Op::new('&', Spacing::Alone)) };
($i:ident) => { tt2ts!(Term::new(stringify!($i), Span::def_site())) };
(&) => { tt2ts!(Punct::new('&', Spacing::Alone)) };
($i:ident) => { tt2ts!(Ident::new(stringify!($i), Span::def_site())) };
}
macro_rules! quote_tree {
@ -110,15 +111,15 @@ impl Quote for TokenStream {
if after_dollar {
after_dollar = false;
match tree {
-TokenTree::Term(_) => {
+TokenTree::Ident(_) => {
let tree = TokenStream::from(tree);
return Some(quote!(::__internal::unquote(&(unquote tree)),));
}
-TokenTree::Op(ref tt) if tt.op() == '$' => {}
+TokenTree::Punct(ref tt) if tt.as_char() == '$' => {}
_ => panic!("`$` must be followed by an ident or `$` in `quote!`"),
}
-} else if let TokenTree::Op(tt) = tree {
-if tt.op() == '$' {
+} else if let TokenTree::Punct(ref tt) = tree {
+if tt.as_char() == '$' {
after_dollar = true;
return None;
}
@ -143,9 +144,9 @@ impl Quote for TokenStream {
impl Quote for TokenTree {
fn quote(self) -> TokenStream {
match self {
-TokenTree::Op(tt) => quote!(::TokenTree::Op( (quote tt) )),
+TokenTree::Punct(tt) => quote!(::TokenTree::Punct( (quote tt) )),
TokenTree::Group(tt) => quote!(::TokenTree::Group( (quote tt) )),
-TokenTree::Term(tt) => quote!(::TokenTree::Term( (quote tt) )),
+TokenTree::Ident(tt) => quote!(::TokenTree::Ident( (quote tt) )),
TokenTree::Literal(tt) => quote!(::TokenTree::Literal( (quote tt) )),
}
}
@ -175,15 +176,15 @@ impl Quote for Group {
}
}
-impl Quote for Op {
+impl Quote for Punct {
fn quote(self) -> TokenStream {
-quote!(::Op::new((quote self.op()), (quote self.spacing())))
+quote!(::Punct::new((quote self.as_char()), (quote self.spacing())))
}
}
-impl Quote for Term {
+impl Quote for Ident {
fn quote(self) -> TokenStream {
-quote!(::Term::new((quote self.sym.as_str()), (quote self.span())))
+quote!(::Ident::new((quote self.sym.as_str()), (quote self.span())))
}
}
@ -195,14 +196,32 @@ impl Quote for Span {
macro_rules! literals {
($($i:ident),*; $($raw:ident),*) => {
+pub struct SpannedSymbol {
+sym: Symbol,
+span: Span,
+}
+impl SpannedSymbol {
+pub fn new(string: &str, span: Span) -> SpannedSymbol {
+SpannedSymbol { sym: Symbol::intern(string), span }
+}
+}
+impl Quote for SpannedSymbol {
+fn quote(self) -> TokenStream {
+quote!(::__internal::SpannedSymbol::new((quote self.sym.as_str()),
+(quote self.span)))
+}
+}
pub enum LiteralKind {
$($i,)*
$($raw(u16),)*
}
impl LiteralKind {
-pub fn with_contents_and_suffix(self, contents: Term, suffix: Option<Term>)
--> Literal {
+pub fn with_contents_and_suffix(self, contents: SpannedSymbol,
+suffix: Option<SpannedSymbol>) -> Literal {
let sym = contents.sym;
let suffix = suffix.map(|t| t.sym);
match self {
@ -225,13 +244,14 @@ macro_rules! literals {
}
impl Literal {
-fn kind_contents_and_suffix(self) -> (LiteralKind, Term, Option<Term>) {
+fn kind_contents_and_suffix(self) -> (LiteralKind, SpannedSymbol, Option<SpannedSymbol>)
+{
let (kind, contents) = match self.lit {
$(token::Lit::$i(contents) => (LiteralKind::$i, contents),)*
$(token::Lit::$raw(contents, n) => (LiteralKind::$raw(n), contents),)*
};
-let suffix = self.suffix.map(|sym| Term::new(&sym.as_str(), self.span()));
-(kind, Term::new(&contents.as_str(), self.span()), suffix)
+let suffix = self.suffix.map(|sym| SpannedSymbol::new(&sym.as_str(), self.span()));
+(kind, SpannedSymbol::new(&contents.as_str(), self.span()), suffix)
}
}


@ -314,6 +314,7 @@ fn hash_token<'a, 'gcx, W: StableHasherResult>(
token::Token::Pound |
token::Token::Dollar |
token::Token::Question |
+token::Token::SingleQuote |
token::Token::Whitespace |
token::Token::Comment |
token::Token::Eof => {}


@ -353,7 +353,7 @@ impl<'a> Classifier<'a> {
token::Lifetime(..) => Class::Lifetime,
token::Eof | token::Interpolated(..) |
-token::Tilde | token::At | token::DotEq => Class::None,
+token::Tilde | token::At | token::DotEq | token::SingleQuote => Class::None,
};
// Anything that didn't return above is the simple case where we the


@ -711,6 +711,7 @@ fn expr_mk_token(cx: &ExtCtxt, sp: Span, tok: &token::Token) -> P<ast::Expr> {
token::Pound => "Pound",
token::Dollar => "Dollar",
token::Question => "Question",
+token::SingleQuote => "SingleQuote",
token::Eof => "Eof",
token::Whitespace | token::Comment | token::Shebang(_) => {


@ -1770,6 +1770,12 @@ fn ident_continue(c: Option<char>) -> bool {
(c > '\x7f' && c.is_xid_continue())
}
+// The string is a valid identifier or a lifetime identifier.
+pub fn is_valid_ident(s: &str) -> bool {
+let mut chars = s.chars();
+ident_start(chars.next()) && chars.all(|ch| ident_continue(Some(ch)))
+}
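This `is_valid_ident` helper is what backs the new runtime check in `Ident::new`. A self-contained sketch of the same shape, with the caveat that the real `ident_start`/`ident_continue` use the full Unicode XID classes while this version only handles the ASCII subset:

```rust
// ASCII-only approximation of the lexer's ident_start/ident_continue;
// the real predicates also accept XID_Start/XID_Continue characters.
fn ident_start(c: Option<char>) -> bool {
    matches!(c, Some(c) if c == '_' || c.is_ascii_alphabetic())
}

fn ident_continue(c: Option<char>) -> bool {
    matches!(c, Some(c) if c == '_' || c.is_ascii_alphanumeric())
}

// Same structure as the function added to the lexer above:
// first char must be a start char, the rest continue chars.
pub fn is_valid_ident(s: &str) -> bool {
    let mut chars = s.chars();
    ident_start(chars.next()) && chars.all(|ch| ident_continue(Some(ch)))
}

fn main() {
    assert!(is_valid_ident("foo_bar1"));
    assert!(!is_valid_ident(""));     // empty: chars.next() is None
    assert!(!is_valid_ident("1abc")); // digits cannot start an ident
    assert!(!is_valid_ident("*"));    // the case the new tests reject
}
```

Note that because `chars.next()` returns `None` for an empty string, the empty string is rejected without a separate length check.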
#[cfg(test)]
mod tests {
use super::*;


@ -210,6 +210,8 @@ pub enum Token {
Pound,
Dollar,
Question,
+/// Used by proc macros for representing lifetimes, not generated by lexer right now.
+SingleQuote,
/// An opening delimiter, eg. `{`
OpenDelim(DelimToken),
/// A closing delimiter, eg. `}`
@ -513,6 +515,10 @@ impl Token {
Colon => ModSep,
_ => return None,
},
+SingleQuote => match joint {
+Ident(ident, false) => Lifetime(ident),
+_ => return None,
+},
Le | EqEq | Ne | Ge | AndAnd | OrOr | Tilde | BinOpEq(..) | At | DotDotDot | DotEq |
DotDotEq | Comma | Semi | ModSep | RArrow | LArrow | FatArrow | Pound | Dollar |


@ -224,6 +224,7 @@ pub fn token_to_string(tok: &Token) -> String {
token::Pound => "#".to_string(),
token::Dollar => "$".to_string(),
token::Question => "?".to_string(),
+token::SingleQuote => "'".to_string(),
/* Literals */
token::Literal(lit, suf) => {


@ -53,7 +53,7 @@ pub fn bar(attr: TokenStream, input: TokenStream) -> TokenStream {
fn assert_inline(slice: &mut &[TokenTree]) {
match &slice[0] {
-TokenTree::Op(tt) => assert_eq!(tt.op(), '#'),
+TokenTree::Punct(tt) => assert_eq!(tt.as_char(), '#'),
_ => panic!("expected '#' char"),
}
match &slice[1] {
@ -65,8 +65,8 @@ fn assert_inline(slice: &mut &[TokenTree]) {
fn assert_doc(slice: &mut &[TokenTree]) {
match &slice[0] {
-TokenTree::Op(tt) => {
-assert_eq!(tt.op(), '#');
+TokenTree::Punct(tt) => {
+assert_eq!(tt.as_char(), '#');
assert_eq!(tt.spacing(), Spacing::Alone);
}
_ => panic!("expected #"),
@ -86,12 +86,12 @@ fn assert_doc(slice: &mut &[TokenTree]) {
}
match &tokens[0] {
-TokenTree::Term(tt) => assert_eq!("doc", &*tt.to_string()),
+TokenTree::Ident(tt) => assert_eq!("doc", &*tt.to_string()),
_ => panic!("expected `doc`"),
}
match &tokens[1] {
-TokenTree::Op(tt) => {
-assert_eq!(tt.op(), '=');
+TokenTree::Punct(tt) => {
+assert_eq!(tt.as_char(), '=');
assert_eq!(tt.spacing(), Spacing::Alone);
}
_ => panic!("expected equals"),
@ -106,7 +106,7 @@ fn assert_doc(slice: &mut &[TokenTree]) {
fn assert_invoc(slice: &mut &[TokenTree]) {
match &slice[0] {
-TokenTree::Op(tt) => assert_eq!(tt.op(), '#'),
+TokenTree::Punct(tt) => assert_eq!(tt.as_char(), '#'),
_ => panic!("expected '#' char"),
}
match &slice[1] {
@ -118,11 +118,11 @@ fn assert_invoc(slice: &mut &[TokenTree]) {
fn assert_foo(slice: &mut &[TokenTree]) {
match &slice[0] {
-TokenTree::Term(tt) => assert_eq!(&*tt.to_string(), "fn"),
+TokenTree::Ident(tt) => assert_eq!(&*tt.to_string(), "fn"),
_ => panic!("expected fn"),
}
match &slice[1] {
-TokenTree::Term(tt) => assert_eq!(&*tt.to_string(), "foo"),
+TokenTree::Ident(tt) => assert_eq!(&*tt.to_string(), "foo"),
_ => panic!("expected foo"),
}
match &slice[2] {
@ -148,8 +148,8 @@ fn fold_tree(input: TokenTree) -> TokenTree {
TokenTree::Group(b) => {
TokenTree::Group(Group::new(b.delimiter(), fold_stream(b.stream())))
}
-TokenTree::Op(b) => TokenTree::Op(b),
-TokenTree::Term(a) => TokenTree::Term(a),
+TokenTree::Punct(b) => TokenTree::Punct(b),
+TokenTree::Ident(a) => TokenTree::Ident(a),
TokenTree::Literal(a) => {
if a.to_string() != "\"foo\"" {
TokenTree::Literal(a)


@ -11,7 +11,6 @@
// force-host
// no-prefer-dynamic
-#![feature(proc_macro, proc_macro_lib)]
#![crate_type = "proc-macro"]
extern crate proc_macro;


@ -11,8 +11,6 @@
// aux-build:issue_38586.rs
// ignore-stage1
-#![feature(proc_macro)]
#[macro_use]
extern crate issue_38586;


@ -11,7 +11,7 @@
// aux-build:bang_proc_macro2.rs
// ignore-stage1
-#![feature(proc_macro, proc_macro_non_items)]
+#![feature(use_extern_macros, proc_macro_non_items)]
#![allow(unused_macros)]
extern crate bang_proc_macro2;


@ -10,7 +10,7 @@
// aux-build:bang_proc_macro.rs
-#![feature(proc_macro, proc_macro_non_items)]
+#![feature(proc_macro_non_items)]
#[macro_use]
extern crate bang_proc_macro;


@ -10,7 +10,7 @@
// aux-build:proc-macro-gates.rs
-#![feature(proc_macro, stmt_expr_attributes)]
+#![feature(use_extern_macros, stmt_expr_attributes)]
extern crate proc_macro_gates as foo;


@ -33,7 +33,7 @@ pub fn cond(input: TokenStream) -> TokenStream {
panic!("Invalid macro usage in cond: {}", cond);
}
let is_else = match test {
-TokenTree::Term(word) => &*word.to_string() == "else",
+TokenTree::Ident(ref word) => &*word.to_string() == "else",
_ => false,
};
conds.push(if is_else || input.peek().is_none() {


@ -11,7 +11,7 @@
// no-prefer-dynamic
#![crate_type = "proc-macro"]
-#![feature(proc_macro, proc_macro_lib, proc_macro_non_items)]
+#![feature(proc_macro, proc_macro_non_items)]
extern crate proc_macro;


@ -11,7 +11,7 @@
// no-prefer-dynamic
#![crate_type = "proc-macro"]
-#![feature(proc_macro, proc_macro_lib, proc_macro_non_items)]
+#![feature(proc_macro, proc_macro_non_items)]
extern crate proc_macro;


@ -11,7 +11,7 @@
// aux-build:cond_plugin.rs
// ignore-stage1
-#![feature(proc_macro, proc_macro_non_items)]
+#![feature(use_extern_macros, proc_macro_non_items)]
extern crate cond_plugin;


@ -13,7 +13,7 @@
// aux-build:hello_macro.rs
// ignore-stage1
-#![feature(proc_macro, proc_macro_non_items)]
+#![feature(use_extern_macros, proc_macro_non_items)]
extern crate hello_macro;


@ -28,7 +28,7 @@ fn count_compound_ops_helper(input: TokenStream) -> u32 {
let mut count = 0;
for token in input {
match &token {
-TokenTree::Op(tt) if tt.spacing() == Spacing::Alone => {
+TokenTree::Punct(tt) if tt.spacing() == Spacing::Alone => {
count += 1;
}
TokenTree::Group(tt) => {


@ -9,7 +9,7 @@
// except according to those terms.
// no-prefer-dynamic
#![feature(proc_macro)]
#![crate_type = "proc-macro"]
extern crate proc_macro;


@ -8,7 +8,7 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(proc_macro)]
+#![feature(use_extern_macros)]
extern crate hygiene_example_codegen;


@ -0,0 +1,36 @@
// Copyright 2018 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// no-prefer-dynamic
#![feature(proc_macro)]
#![crate_type = "proc-macro"]
extern crate proc_macro;
use proc_macro::*;
#[proc_macro]
pub fn lifetimes_bang(input: TokenStream) -> TokenStream {
// Roundtrip through token trees
input.into_iter().collect()
}
#[proc_macro_attribute]
pub fn lifetimes_attr(_: TokenStream, input: TokenStream) -> TokenStream {
// Roundtrip through AST
input
}
#[proc_macro_derive(Lifetimes)]
pub fn lifetimes_derive(input: TokenStream) -> TokenStream {
// Roundtrip through a string
format!("mod m {{ {} }}", input).parse().unwrap()
}


@ -38,14 +38,14 @@ fn assert_eq(a: TokenStream, b: TokenStream) {
assert_eq!(a.delimiter(), b.delimiter());
assert_eq(a.stream(), b.stream());
}
-(TokenTree::Op(a), TokenTree::Op(b)) => {
-assert_eq!(a.op(), b.op());
+(TokenTree::Punct(a), TokenTree::Punct(b)) => {
+assert_eq!(a.as_char(), b.as_char());
assert_eq!(a.spacing(), b.spacing());
}
(TokenTree::Literal(a), TokenTree::Literal(b)) => {
assert_eq!(a.to_string(), b.to_string());
}
-(TokenTree::Term(a), TokenTree::Term(b)) => {
+(TokenTree::Ident(a), TokenTree::Ident(b)) => {
assert_eq!(a.to_string(), b.to_string());
}
(a, b) => panic!("{:?} != {:?}", a, b),


@ -11,7 +11,7 @@
// aux-build:bang-macro.rs
// ignore-stage1
-#![feature(proc_macro, proc_macro_non_items)]
+#![feature(use_extern_macros, proc_macro_non_items)]
extern crate bang_macro;
use bang_macro::rewrite;


@ -11,7 +11,7 @@
// aux-build:count_compound_ops.rs
// ignore-stage1
-#![feature(proc_macro, proc_macro_non_items)]
+#![feature(use_extern_macros, proc_macro_non_items)]
extern crate count_compound_ops;
use count_compound_ops::count_compound_ops;


@ -11,7 +11,7 @@
// aux-build:derive-attr-cfg.rs
// ignore-stage1
-#![feature(proc_macro)]
+#![feature(use_extern_macros)]
extern crate derive_attr_cfg;
use derive_attr_cfg::Foo;


@ -12,7 +12,7 @@
// aux-build:hygiene_example.rs
// ignore-stage1
-#![feature(proc_macro, proc_macro_non_items)]
+#![feature(use_extern_macros, proc_macro_non_items)]
extern crate hygiene_example;
use hygiene_example::hello;


@ -11,7 +11,7 @@
// aux-build:issue-39889.rs
// ignore-stage1
-#![feature(proc_macro)]
+#![feature(use_extern_macros)]
#![allow(unused)]
extern crate issue_39889;


@ -11,7 +11,7 @@
// aux-build:issue-40001-plugin.rs
// ignore-stage1
-#![feature(proc_macro, plugin)]
+#![feature(plugin)]
#![plugin(issue_40001_plugin)]
#[whitelisted_attr]


@ -0,0 +1,36 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// aux-build:lifetimes.rs
// ignore-stage1
#![feature(proc_macro)]
extern crate lifetimes;
use lifetimes::*;
lifetimes_bang! {
fn bang<'a>() -> &'a u8 { &0 }
}
#[lifetimes_attr]
fn attr<'a>() -> &'a u8 { &1 }
#[derive(Lifetimes)]
pub struct Lifetimes<'a> {
pub field: &'a u8,
}
fn main() {
assert_eq!(bang::<'static>(), &0);
assert_eq!(attr::<'static>(), &1);
let l1 = Lifetimes { field: &0 };
let l2 = m::Lifetimes { field: &1 };
}


@ -11,7 +11,7 @@
// aux-build:negative-token.rs
// ignore-stage1
-#![feature(proc_macro, proc_macro_non_items)]
+#![feature(proc_macro_non_items)]
extern crate negative_token;


@ -13,7 +13,7 @@
// ignore-pretty
-#![feature(proc_macro)]
+#![feature(use_extern_macros)]
#[macro_use]
extern crate span_test_macros;


@ -0,0 +1,38 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// force-host
// no-prefer-dynamic
#![feature(proc_macro)]
#![crate_type = "proc-macro"]
extern crate proc_macro;
use proc_macro::*;
#[proc_macro]
pub fn invalid_punct(_: TokenStream) -> TokenStream {
TokenTree::from(Punct::new('`', Spacing::Alone)).into()
}
#[proc_macro]
pub fn invalid_ident(_: TokenStream) -> TokenStream {
TokenTree::from(Ident::new("*", Span::call_site())).into()
}
#[proc_macro]
pub fn invalid_raw_ident(_: TokenStream) -> TokenStream {
TokenTree::from(Ident::new_raw("self", Span::call_site())).into()
}
#[proc_macro]
pub fn lexer_failure(_: TokenStream) -> TokenStream {
"a b ) c".parse().expect("parsing failed without panic")
}
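The `invalid_punct!` macro above exercises the new validation in `Punct::new`: as the PR summary states, the accepted characters are a subset of ASCII punctuation (`u8::is_ascii_punctuation`). A rough standalone sketch of such a check; the exact excluded set here (delimiters, quotes, backtick, underscore) is an assumption for illustration, not the authoritative list from the `proc_macro` source:

```rust
// Approximate predicate for what Punct::new might accept: ASCII
// punctuation, minus characters that have dedicated token kinds.
// The exact exclusion list is an assumption, not the real one.
fn is_valid_punct_char(ch: char) -> bool {
    ch.is_ascii_punctuation()
        && !matches!(ch, '(' | ')' | '[' | ']' | '{' | '}' | '"' | '`' | '_')
}

fn main() {
    assert!(is_valid_punct_char('#'));
    assert!(is_valid_punct_char('\'')); // `'` is allowed: lifetimes use it
    assert!(!is_valid_punct_char('`')); // the case invalid_punct! panics on
    assert!(!is_valid_punct_char('(')); // delimiters belong to Group
    assert!(!is_valid_punct_char('_')); // `_` is an identifier token
}
```

Characters outside the set cause `Punct::new` to panic at macro expansion time, which is what the `unsupported character` error in the companion test asserts.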


@ -0,0 +1,30 @@
// Copyright 2018 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// no-prefer-dynamic
#![feature(proc_macro)]
#![crate_type = "proc-macro"]
extern crate proc_macro;
use proc_macro::*;
#[proc_macro]
pub fn single_quote_alone(_: TokenStream) -> TokenStream {
// `&'a u8`, but the `'` token is not joint
let trees: Vec<TokenTree> = vec![
Punct::new('&', Spacing::Alone).into(),
Punct::new('\'', Spacing::Alone).into(),
Ident::new("a", Span::call_site()).into(),
Ident::new("u8", Span::call_site()).into(),
];
trees.into_iter().collect()
}


@ -11,7 +11,6 @@
// no-prefer-dynamic
#![crate_type = "proc-macro"]
#![feature(proc_macro, proc_macro_lib)]
extern crate proc_macro;


@ -11,8 +11,6 @@
// aux-build:plugin.rs
// ignore-stage1
-#![feature(proc_macro)]
#[macro_use] extern crate plugin;
#[derive(Foo, Bar)] //~ ERROR proc-macro derive panicked


@ -1,5 +1,5 @@
error: proc-macro derive panicked
---> $DIR/issue-36935.rs:18:15
+--> $DIR/issue-36935.rs:16:15
|
LL | #[derive(Foo, Bar)] //~ ERROR proc-macro derive panicked
| ^^^


@ -0,0 +1,16 @@
// Copyright 2018 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// aux-build:invalid-punct-ident.rs
#[macro_use]
extern crate invalid_punct_ident;
invalid_punct!(); //~ ERROR proc macro panicked


@ -0,0 +1,10 @@
error: proc macro panicked
--> $DIR/invalid-punct-ident-1.rs:16:1
|
LL | invalid_punct!(); //~ ERROR proc macro panicked
| ^^^^^^^^^^^^^^^^^
|
= help: message: unsupported character `'`'`
error: aborting due to previous error


@ -0,0 +1,16 @@
// Copyright 2018 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// aux-build:invalid-punct-ident.rs
#[macro_use]
extern crate invalid_punct_ident;
invalid_ident!(); //~ ERROR proc macro panicked


@ -0,0 +1,10 @@
error: proc macro panicked
--> $DIR/invalid-punct-ident-2.rs:16:1
|
LL | invalid_ident!(); //~ ERROR proc macro panicked
| ^^^^^^^^^^^^^^^^^
|
= help: message: `"*"` is not a valid identifier
error: aborting due to previous error


@ -0,0 +1,16 @@
// Copyright 2018 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// aux-build:invalid-punct-ident.rs
#[macro_use]
extern crate invalid_punct_ident;
invalid_raw_ident!(); //~ ERROR proc macro panicked


@ -0,0 +1,10 @@
error: proc macro panicked
--> $DIR/invalid-punct-ident-3.rs:16:1
|
LL | invalid_raw_ident!(); //~ ERROR proc macro panicked
| ^^^^^^^^^^^^^^^^^^^^^
|
= help: message: `"self"` is not a valid raw identifier
error: aborting due to previous error


@ -0,0 +1,17 @@
// Copyright 2018 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// aux-build:invalid-punct-ident.rs
#[macro_use]
extern crate invalid_punct_ident;
lexer_failure!(); //~ ERROR proc macro panicked
//~| ERROR unexpected close delimiter: `)`


@ -0,0 +1,14 @@
error: unexpected close delimiter: `)`
--> $DIR/invalid-punct-ident-4.rs:16:1
|
LL | lexer_failure!(); //~ ERROR proc macro panicked
| ^^^^^^^^^^^^^^^^^
error: proc macro panicked
--> $DIR/invalid-punct-ident-4.rs:16:1
|
LL | lexer_failure!(); //~ ERROR proc macro panicked
| ^^^^^^^^^^^^^^^^^
error: aborting due to 2 previous errors


@ -0,0 +1,19 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// aux-build:lifetimes.rs
#![feature(proc_macro, proc_macro_non_items)]
extern crate lifetimes;
use lifetimes::*;
type A = single_quote_alone!(); //~ ERROR expected type, found `'`


@ -0,0 +1,8 @@
error: expected type, found `'`
--> $DIR/lifetimes.rs:19:10
|
LL | type A = single_quote_alone!(); //~ ERROR expected type, found `'`
| ^^^^^^^^^^^^^^^^^^^^^
error: aborting due to previous error


@ -27,8 +27,8 @@ fn parse(input: TokenStream) -> Result<(), Diagnostic> {
.help("input must be: `===`"))
}
-if let TokenTree::Op(tt) = tree {
-if tt.op() == '=' {
+if let TokenTree::Punct(ref tt) = tree {
+if tt.as_char() == '=' {
count += 1;
last_span = span;
continue


@ -11,7 +11,7 @@
// aux-build:parent-source-spans.rs
// ignore-stage1
-#![feature(proc_macro, decl_macro, proc_macro_non_items)]
+#![feature(use_extern_macros, decl_macro, proc_macro_non_items)]
extern crate parent_source_spans;


@ -11,7 +11,7 @@
// aux-build:three-equals.rs
// ignore-stage1
-#![feature(proc_macro, proc_macro_non_items)]
+#![feature(use_extern_macros, proc_macro_non_items)]
extern crate three_equals;


@ -14,7 +14,6 @@
// aux-build:bang_proc_macro.rs
-#![feature(proc_macro)]
#![allow(unused_macros)]
#[macro_use]
extern crate derive_foo;


@ -1,59 +1,59 @@
error: cannot find derive macro `FooWithLongNan` in this scope
---> $DIR/resolve-error.rs:37:10
+--> $DIR/resolve-error.rs:36:10
|
LL | #[derive(FooWithLongNan)]
| ^^^^^^^^^^^^^^ help: try: `FooWithLongName`
error: cannot find attribute macro `attr_proc_macra` in this scope
---> $DIR/resolve-error.rs:41:3
+--> $DIR/resolve-error.rs:40:3
|
LL | #[attr_proc_macra]
| ^^^^^^^^^^^^^^^ help: try: `attr_proc_macro`
error: cannot find attribute macro `FooWithLongNan` in this scope
---> $DIR/resolve-error.rs:45:3
+--> $DIR/resolve-error.rs:44:3
|
LL | #[FooWithLongNan]
| ^^^^^^^^^^^^^^
error: cannot find derive macro `Dlone` in this scope
---> $DIR/resolve-error.rs:49:10
+--> $DIR/resolve-error.rs:48:10
|
LL | #[derive(Dlone)]
| ^^^^^ help: try: `Clone`
error: cannot find derive macro `Dlona` in this scope
---> $DIR/resolve-error.rs:53:10
+--> $DIR/resolve-error.rs:52:10
|
LL | #[derive(Dlona)]
| ^^^^^ help: try: `Clona`
error: cannot find derive macro `attr_proc_macra` in this scope
---> $DIR/resolve-error.rs:57:10
+--> $DIR/resolve-error.rs:56:10
|
LL | #[derive(attr_proc_macra)]
| ^^^^^^^^^^^^^^^
error: cannot find macro `FooWithLongNama!` in this scope
---> $DIR/resolve-error.rs:62:5
+--> $DIR/resolve-error.rs:61:5
|
LL | FooWithLongNama!();
| ^^^^^^^^^^^^^^^ help: you could try the macro: `FooWithLongNam`
error: cannot find macro `attr_proc_macra!` in this scope
---> $DIR/resolve-error.rs:65:5
+--> $DIR/resolve-error.rs:64:5
|
LL | attr_proc_macra!();
| ^^^^^^^^^^^^^^^ help: you could try the macro: `attr_proc_mac`
error: cannot find macro `Dlona!` in this scope
---> $DIR/resolve-error.rs:68:5
+--> $DIR/resolve-error.rs:67:5
|
LL | Dlona!();
| ^^^^^
error: cannot find macro `bang_proc_macrp!` in this scope
---> $DIR/resolve-error.rs:71:5
+--> $DIR/resolve-error.rs:70:5
|
LL | bang_proc_macrp!();
| ^^^^^^^^^^^^^^^ help: you could try the macro: `bang_proc_macro`