Rollup merge of #53496 - matthiaskrgr:codespell_08_2018, r=varkor
Fix typos found by codespell.
commit b5519db323
99 changed files with 130 additions and 130 deletions
@@ -32,7 +32,7 @@ shift
 export CFLAGS="-fPIC $CFLAGS"
-# FIXME: remove the patch when upate to 1.1.20
+# FIXME: remove the patch when updating to 1.1.20
 MUSL=musl-1.1.19
 # may have been downloaded in a previous run
@@ -34,7 +34,7 @@ minimum. It also includes exercises!
 # Use Rust
-Once you've gotten familliar with the language, these resources can help you
+Once you've gotten familiar with the language, these resources can help you
 when you're actually using it day-to-day.
 ## The Standard Library
@@ -153,7 +153,7 @@ This option allows you to put extra data in each output filename.
 This flag lets you control how many threads are used when doing
 code generation.
-Increasing paralellism may speed up compile times, but may also
+Increasing parallelism may speed up compile times, but may also
 produce slower code.
 ## remark
@@ -56,7 +56,7 @@ mod m {
 pub struct S(u8);
 fn f() {
-// this is trying to use S from the 'use' line, but becuase the `u8` is
+// this is trying to use S from the 'use' line, but because the `u8` is
 // not pub, it is private
 ::S;
 }
@@ -103,7 +103,7 @@ This warning can always be fixed by removing the unused pattern in the
 ## mutable-transmutes
-This lint catches transmuting from `&T` to `&mut T` becuase it is undefined
+This lint catches transmuting from `&T` to `&mut T` because it is undefined
 behavior. Some example code that triggers this lint:
 ```rust,ignore
@@ -1,6 +1,6 @@
 # Unstable features
-Rustdoc is under active developement, and like the Rust compiler, some features are only available
+Rustdoc is under active development, and like the Rust compiler, some features are only available
 on the nightly releases. Some of these are new and need some more testing before they're able to get
 released to the world at large, and some of them are tied to features in the Rust compiler that are
 themselves unstable. Several features here require a matching `#![feature(...)]` attribute to
@@ -6,12 +6,12 @@ The tracking issue for this feature is: [#44493]
 ------------------------
 The `infer_outlives_requirements` feature indicates that certain
-outlives requirements can be infered by the compiler rather than
+outlives requirements can be inferred by the compiler rather than
 stating them explicitly.
 For example, currently generic struct definitions that contain
 references, require where-clauses of the form T: 'a. By using
-this feature the outlives predicates will be infered, although
+this feature the outlives predicates will be inferred, although
 they may still be written explicitly.
 ```rust,ignore (pseudo-Rust)
@@ -6,7 +6,7 @@ The tracking issue for this feature is: [#44493]
 ------------------------
 The `infer_static_outlives_requirements` feature indicates that certain
-`'static` outlives requirements can be infered by the compiler rather than
+`'static` outlives requirements can be inferred by the compiler rather than
 stating them explicitly.
 Note: It is an accompanying feature to `infer_outlives_requirements`,
@@ -14,7 +14,7 @@ which must be enabled to infer outlives requirements.
 For example, currently generic struct definitions that contain
 references, require where-clauses of the form T: 'static. By using
-this feature the outlives predicates will be infered, although
+this feature the outlives predicates will be inferred, although
 they may still be written explicitly.
 ```rust,ignore (pseudo-Rust)
@@ -8,7 +8,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
-#![unstable(feature = "raw_vec_internals", reason = "implemention detail", issue = "0")]
+#![unstable(feature = "raw_vec_internals", reason = "implementation detail", issue = "0")]
 #![doc(hidden)]
 use core::cmp;
@@ -27,7 +27,7 @@ use task::{Context, Poll};
 /// - The `Future` trait is currently not object safe: The `Future::poll`
 /// method makes uses the arbitrary self types feature and traits in which
 /// this feature is used are currently not object safe due to current compiler
-/// limitations. (See tracking issue for arbitray self types for more
+/// limitations. (See tracking issue for arbitrary self types for more
 /// information #44874)
 pub struct LocalFutureObj<'a, T> {
 ptr: *mut (),
@@ -102,7 +102,7 @@ impl<'a, T> Drop for LocalFutureObj<'a, T> {
 /// - The `Future` trait is currently not object safe: The `Future::poll`
 /// method makes uses the arbitrary self types feature and traits in which
 /// this feature is used are currently not object safe due to current compiler
-/// limitations. (See tracking issue for arbitray self types for more
+/// limitations. (See tracking issue for arbitrary self types for more
 /// information #44874)
 pub struct FutureObj<'a, T>(LocalFutureObj<'a, T>);
@@ -2291,7 +2291,7 @@ impl<T: ?Sized> *mut T {
 ///
 /// If we ever decide to make it possible to call the intrinsic with `a` that is not a
 /// power-of-two, it will probably be more prudent to just change to a naive implementation rather
-/// than trying to adapt this to accomodate that change.
+/// than trying to adapt this to accommodate that change.
 ///
 /// Any questions go to @nagisa.
 #[lang="align_offset"]
@@ -1680,7 +1680,7 @@ impl<T> [T] {
 }
 }
-/// Function to calculate lenghts of the middle and trailing slice for `align_to{,_mut}`.
+/// Function to calculate lengths of the middle and trailing slice for `align_to{,_mut}`.
 fn align_to_offsets<U>(&self) -> (usize, usize) {
 // What we gonna do about `rest` is figure out what multiple of `U`s we can put in a
 // lowest number of `T`s. And how many `T`s we need for each such "multiple".
@@ -1740,7 +1740,7 @@ impl<T> [T] {
 (us_len, ts_len)
 }
-/// Transmute the slice to a slice of another type, ensuring aligment of the types is
+/// Transmute the slice to a slice of another type, ensuring alignment of the types is
 /// maintained.
 ///
 /// This method splits the slice into three distinct slices: prefix, correctly aligned middle
@@ -1793,7 +1793,7 @@ impl<T> [T] {
 }
 }
-/// Transmute the slice to a slice of another type, ensuring aligment of the types is
+/// Transmute the slice to a slice of another type, ensuring alignment of the types is
 /// maintained.
 ///
 /// This method splits the slice into three distinct slices: prefix, correctly aligned middle
@@ -154,7 +154,7 @@ pub struct Parser<'a> {
 style: Option<usize>,
 /// How many newlines have been seen in the string so far, to adjust the error spans
 seen_newlines: usize,
-/// Start and end byte offset of every successfuly parsed argument
+/// Start and end byte offset of every successfully parsed argument
 pub arg_places: Vec<(usize, usize)>,
 }
@@ -142,7 +142,7 @@ mod imp {
 #[repr(C)]
 pub struct _ThrowInfo {
-pub attribues: c_uint,
+pub attributes: c_uint,
 pub pnfnUnwind: imp::ptr_t,
 pub pForwardCompat: imp::ptr_t,
 pub pCatchableTypeArray: imp::ptr_t,
@@ -178,7 +178,7 @@ pub struct _TypeDescriptor {
 }
 static mut THROW_INFO: _ThrowInfo = _ThrowInfo {
-attribues: 0,
+attributes: 0,
 pnfnUnwind: ptr!(0),
 pForwardCompat: ptr!(0),
 pCatchableTypeArray: ptr!(0),
@@ -12,7 +12,7 @@
 //!
 //! This library, provided by the standard distribution, provides the types
 //! consumed in the interfaces of procedurally defined macro definitions such as
-//! function-like macros `#[proc_macro]`, macro attribures `#[proc_macro_attribute]` and
+//! function-like macros `#[proc_macro]`, macro attributes `#[proc_macro_attribute]` and
 //! custom derive attributes`#[proc_macro_derive]`.
 //!
 //! Note that this crate is intentionally bare-bones currently.
@@ -49,7 +49,7 @@ pub mod query_result;
 mod substitute;
 /// A "canonicalized" type `V` is one where all free inference
-/// variables have been rewriten to "canonical vars". These are
+/// variables have been rewritten to "canonical vars". These are
 /// numbered starting from 0 in order of first appearance.
 #[derive(Copy, Clone, Debug, PartialEq, Eq, Hash, RustcDecodable, RustcEncodable)]
 pub struct Canonical<'gcx, V> {
@@ -561,7 +561,7 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
 value.push_highlighted("<");
 }
-// Output the lifetimes fot the first type
+// Output the lifetimes for the first type
 let lifetimes = sub.regions()
 .map(|lifetime| {
 let s = lifetime.to_string();
@@ -527,7 +527,7 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
 * we're not careful, it will succeed.
 *
 * The reason is that when we walk through the subtyping
-* algorith, we begin by replacing `'a` with a skolemized
+* algorithm, we begin by replacing `'a` with a skolemized
 * variable `'1`. We then have `fn(_#0t) <: fn(&'1 int)`. This
 * can be made true by unifying `_#0t` with `&'1 int`. In the
 * process, we create a fresh variable for the skolemized
@@ -68,7 +68,7 @@ impl<'tcx> ConstValue<'tcx> {
 /// A `Value` represents a single self-contained Rust value.
 ///
-/// A `Value` can either refer to a block of memory inside an allocation (`ByRef`) or to a primitve
+/// A `Value` can either refer to a block of memory inside an allocation (`ByRef`) or to a primitive
 /// value held directly, outside of any allocation (`Scalar`). For `ByRef`-values, we remember
 /// whether the pointer is supposed to be aligned or not (also see Place).
 ///
@@ -927,11 +927,11 @@ pub enum TerminatorKind<'tcx> {
 /// Drop(P, goto BB1, unwind BB2)
 /// }
 /// BB1 {
-/// // P is now unitialized
+/// // P is now uninitialized
 /// P <- V
 /// }
 /// BB2 {
-/// // P is now unitialized -- its dtor panicked
+/// // P is now uninitialized -- its dtor panicked
 /// P <- V
 /// }
 /// ```
@@ -171,7 +171,7 @@ impl<'a, 'tcx> Postorder<'a, 'tcx> {
 // (A, [C])]
 //
 // Now that the top of the stack has no successors we can traverse, each item will
-// be popped off during iteration until we get back to `A`. This yeilds [E, D, B].
+// be popped off during iteration until we get back to `A`. This yields [E, D, B].
 //
 // When we yield `B` and call `traverse_successor`, we push `C` to the stack, but
 // since we've already visited `E`, that child isn't added to the stack. The last
@@ -264,12 +264,12 @@ impl<'a, 'tcx> AutoTraitFinder<'a, 'tcx> {
 // The core logic responsible for computing the bounds for our synthesized impl.
 //
 // To calculate the bounds, we call SelectionContext.select in a loop. Like FulfillmentContext,
-// we recursively select the nested obligations of predicates we encounter. However, whenver we
+// we recursively select the nested obligations of predicates we encounter. However, whenever we
 // encounter an UnimplementedError involving a type parameter, we add it to our ParamEnv. Since
 // our goal is to determine when a particular type implements an auto trait, Unimplemented
 // errors tell us what conditions need to be met.
 //
-// This method ends up working somewhat similary to FulfillmentContext, but with a few key
+// This method ends up working somewhat similarly to FulfillmentContext, but with a few key
 // differences. FulfillmentContext works under the assumption that it's dealing with concrete
 // user code. According, it considers all possible ways that a Predicate could be met - which
 // isn't always what we want for a synthesized impl. For example, given the predicate 'T:
@@ -289,11 +289,11 @@ impl<'a, 'tcx> AutoTraitFinder<'a, 'tcx> {
 // we'll pick up any nested bounds, without ever inferring that 'T: IntoIterator' needs to
 // hold.
 //
-// One additonal consideration is supertrait bounds. Normally, a ParamEnv is only ever
+// One additional consideration is supertrait bounds. Normally, a ParamEnv is only ever
 // consutrcted once for a given type. As part of the construction process, the ParamEnv will
 // have any supertrait bounds normalized - e.g. if we have a type 'struct Foo<T: Copy>', the
 // ParamEnv will contain 'T: Copy' and 'T: Clone', since 'Copy: Clone'. When we construct our
-// own ParamEnv, we need to do this outselves, through traits::elaborate_predicates, or else
+// own ParamEnv, we need to do this ourselves, through traits::elaborate_predicates, or else
 // SelectionContext will choke on the missing predicates. However, this should never show up in
 // the final synthesized generics: we don't want our generated docs page to contain something
 // like 'T: Copy + Clone', as that's redundant. Therefore, we keep track of a separate
@@ -652,7 +652,7 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
 }
 // If this error is due to `!: Trait` not implemented but `(): Trait` is
-// implemented, and fallback has occured, then it could be due to a
+// implemented, and fallback has occurred, then it could be due to a
 // variable that used to fallback to `()` now falling back to `!`. Issue a
 // note informing about the change in behaviour.
 if trait_predicate.skip_binder().self_ty().is_never()
@@ -82,7 +82,7 @@ impl<'cx, 'gcx, 'tcx> At<'cx, 'gcx, 'tcx> {
 // Errors and ambiuity in dropck occur in two cases:
 // - unresolved inference variables at the end of typeck
 // - non well-formed types where projections cannot be resolved
-// Either of these should hvae created an error before.
+// Either of these should have created an error before.
 tcx.sess
 .delay_span_bug(span, "dtorck encountered internal error");
 return InferOk {
@@ -892,7 +892,7 @@ pub struct GlobalCtxt<'tcx> {
 pub(crate) queries: query::Queries<'tcx>,
-// Records the free variables refrenced by every closure
+// Records the free variables referenced by every closure
 // expression. Do not track deps for this, just recompute it from
 // scratch every time.
 freevars: FxHashMap<DefId, Lrc<Vec<hir::Freevar>>>,
@@ -1503,7 +1503,7 @@ impl UniverseIndex {
 /// Creates a universe index from the given integer. Not to be
 /// used lightly lest you pick a bad value. But sometimes we
-/// convert universe indicies into integers and back for various
+/// convert universe indices into integers and back for various
 /// reasons.
 pub fn from_u32(index: u32) -> Self {
 UniverseIndex(index)
@@ -262,7 +262,7 @@ where
 }
 }
-// Visit the explict waiters which use condvars and are resumable
+// Visit the explicit waiters which use condvars and are resumable
 for (i, waiter) in query.latch.info.lock().waiters.iter().enumerate() {
 if let Some(ref waiter_query) = waiter.query {
 if visit(waiter.span, waiter_query.clone()).is_some() {
@@ -328,7 +328,7 @@ struct AssemblerCommand {
 /// Additional resources used by optimize_and_codegen (not module specific)
 #[derive(Clone)]
 pub struct CodegenContext {
-// Resouces needed when running LTO
+// Resources needed when running LTO
 pub time_passes: bool,
 pub lto: Lto,
 pub no_landing_pads: bool,
@@ -596,7 +596,7 @@ unsafe fn optimize(cgcx: &CodegenContext,
 -C passes=name-anon-globals to the compiler command line.");
 } else {
 bug!("We are using thin LTO buffers without running the NameAnonGlobals pass. \
-This will likely cause errors in LLVM and shoud never happen.");
+This will likely cause errors in LLVM and should never happen.");
 }
 }
 }
@@ -656,7 +656,7 @@ impl FunctionCx<'a, 'll, 'tcx> {
 llargs.push(b);
 return;
 }
-_ => bug!("codegen_argument: {:?} invalid for pair arugment", op)
+_ => bug!("codegen_argument: {:?} invalid for pair argument", op)
 }
 } else if arg.is_unsized_indirect() {
 match op.val {
@@ -26,7 +26,7 @@
 //!
 //! `MTLock` is a mutex which disappears if cfg!(parallel_queries) is false.
 //!
-//! `MTRef` is a immutable refernce if cfg!(parallel_queries), and an mutable reference otherwise.
+//! `MTRef` is a immutable reference if cfg!(parallel_queries), and an mutable reference otherwise.
 //!
 //! `rustc_erase_owner!` erases a OwningRef owner into Erased or Erased + Send + Sync
 //! depending on the value of cfg!(parallel_queries).
@@ -432,7 +432,7 @@ impl<T> Once<T> {
 /// closures may concurrently be computing a value which the inner value should take.
 /// Only one of these closures are used to actually initialize the value.
 /// If some other closure already set the value, we assert that it our closure computed
-/// a value equal to the value aready set and then
+/// a value equal to the value already set and then
 /// we return the value our closure computed wrapped in a `Option`.
 /// If our closure set the value, `None` is returned.
 /// If the value is already initialized, the closure is not called and `None` is returned.
@@ -889,7 +889,7 @@ impl<'a, 'tcx> LateLintPass<'a, 'tcx> for UnconditionalRecursion {
 // NB. this has an edge case with non-returning statements,
 // like `loop {}` or `panic!()`: control flow never reaches
 // the exit node through these, so one can have a function
-// that never actually calls itselfs but is still picked up by
+// that never actually calls itself but is still picked up by
 // this lint:
 //
 // fn f(cond: bool) {
@@ -486,7 +486,7 @@ impl<'a, 'tcx> ImproperCTypesVisitor<'a, 'tcx> {
 // Protect against infinite recursion, for example
 // `struct S(*mut S);`.
 // FIXME: A recursion limit is necessary as well, for irregular
-// recusive types.
+// recursive types.
 if !cache.insert(ty) {
 return FfiSafe;
 }
@@ -207,7 +207,7 @@ impl<'a, 'tcx> Collector<'a, 'tcx> {
 }
 }
-// Update kind and, optionally, the name of all native libaries
+// Update kind and, optionally, the name of all native libraries
 // (there may be more than one) with the specified name.
 for &(ref name, ref new_name, kind) in &self.tcx.sess.opts.libs {
 let mut found = false;
@@ -541,7 +541,7 @@ impl<'cg, 'cx, 'tcx, 'gcx> InvalidationGenerator<'cg, 'cx, 'tcx, 'gcx> {
 // unique or mutable borrows are invalidated by writes.
 // Reservations count as writes since we need to check
 // that activating the borrow will be OK
-// TOOD(bob_twinkles) is this actually the right thing to do?
+// FIXME(bob_twinkles) is this actually the right thing to do?
 this.generate_invalidates(borrow_index, context.loc);
 }
 }
@@ -783,7 +783,7 @@ impl<'a, 'gcx, 'tcx> TypeChecker<'a, 'gcx, 'tcx> {
 /// predicates, or otherwise uses the inference context, executes
 /// `op` and then executes all the further obligations that `op`
 /// returns. This will yield a set of outlives constraints amongst
-/// regions which are extracted and stored as having occured at
+/// regions which are extracted and stored as having occurred at
 /// `locations`.
 ///
 /// **Any `rustc::infer` operations that might generate region
@@ -83,7 +83,7 @@ fn place_components_conflict<'gcx, 'tcx>(
 // Our invariant is, that at each step of the iteration:
 // - If we didn't run out of access to match, our borrow and access are comparable
 // and either equal or disjoint.
-// - If we did run out of accesss, the borrow can access a part of it.
+// - If we did run out of access, the borrow can access a part of it.
 loop {
 // loop invariant: borrow_c is always either equal to access_c or disjoint from it.
 if let Some(borrow_c) = borrow_components.next() {
@@ -605,7 +605,7 @@ pub trait BitDenotation: BitwiseOperator {
 /// `sets.on_entry` to that local clone into `statement_effect` and
 /// `terminator_effect`).
 ///
-/// When its false, no local clone is constucted; instead a
+/// When it's false, no local clone is constructed; instead a
 /// reference directly into `on_entry` is passed along via
 /// `sets.on_entry` instead, which represents the flow state at
 /// the block's start, not necessarily the state immediately prior
@@ -462,7 +462,7 @@ impl<'a, 'mir, 'tcx: 'mir, M: Machine<'mir, 'tcx>> EvalContext<'a, 'mir, 'tcx, M
 self.tcx.normalize_erasing_regions(ty::ParamEnv::reveal_all(), substituted)
 }
-/// Return the size and aligment of the value at the given type.
+/// Return the size and alignment of the value at the given type.
 /// Note that the value does not matter if the type is sized. For unsized types,
 /// the value has to be a fat pointer, and we only care about the "extra" data in it.
 pub fn size_and_align_of_dst(
@@ -599,7 +599,7 @@ impl<'a, 'mir, 'tcx, M: Machine<'mir, 'tcx>> Memory<'a, 'mir, 'tcx, M> {
 Some(MemoryKind::Stack) => {},
 }
 if let Some(mut alloc) = alloc {
-// ensure llvm knows not to put this into immutable memroy
+// ensure llvm knows not to put this into immutable memory
 alloc.runtime_mutability = mutability;
 let alloc = self.tcx.intern_const_alloc(alloc);
 self.tcx.alloc_map.lock().set_id_memory(alloc_id, alloc);
@@ -704,7 +704,7 @@ impl<'a, 'tcx> MutVisitor<'tcx> for Integrator<'a, 'tcx> {
 *unwind = Some(self.update_target(tgt));
 } else if !self.in_cleanup_block {
 // Unless this drop is in a cleanup block, add an unwind edge to
-// the orignal call's cleanup block
+// the original call's cleanup block
 *unwind = self.cleanup_block;
 }
 }
@@ -716,7 +716,7 @@ impl<'a, 'tcx> MutVisitor<'tcx> for Integrator<'a, 'tcx> {
 *cleanup = Some(self.update_target(tgt));
 } else if !self.in_cleanup_block {
 // Unless this call is in a cleanup block, add an unwind edge to
-// the orignal call's cleanup block
+// the original call's cleanup block
 *cleanup = self.cleanup_block;
 }
 }
@@ -726,7 +726,7 @@ impl<'a, 'tcx> MutVisitor<'tcx> for Integrator<'a, 'tcx> {
 *cleanup = Some(self.update_target(tgt));
 } else if !self.in_cleanup_block {
 // Unless this assert is in a cleanup block, add an unwind edge to
-// the orignal call's cleanup block
+// the original call's cleanup block
 *cleanup = self.cleanup_block;
 }
 }
@@ -302,7 +302,7 @@ impl<'a, 'tcx> Promoter<'a, 'tcx> {
 let ref mut statement = blocks[loc.block].statements[loc.statement_index];
 match statement.kind {
 StatementKind::Assign(_, Rvalue::Ref(_, _, ref mut place)) => {
-// Find the underlying local for this (necessarilly interior) borrow.
+// Find the underlying local for this (necessarily interior) borrow.
 // HACK(eddyb) using a recursive function because of mutable borrows.
 fn interior_base<'a, 'tcx>(place: &'a mut Place<'tcx>)
 -> &'a mut Place<'tcx> {
@@ -190,7 +190,7 @@ impl MirPass for RestoreSubsliceArrayMoveOut {
 let local_use = &visitor.locals_use[*local];
 let opt_index_and_place = Self::try_get_item_source(local_use, mir);
 // each local should be used twice:
-// in assign and in aggregate statments
+// in assign and in aggregate statements
 if local_use.use_count == 2 && opt_index_and_place.is_some() {
 let (index, src_place) = opt_index_and_place.unwrap();
 return Some((local_use, index, src_place));
@@ -231,15 +231,15 @@ impl RestoreSubsliceArrayMoveOut {
 if opt_size.is_some() && items.iter().all(
 |l| l.is_some() && l.unwrap().2 == opt_src_place.unwrap()) {
-let indicies: Vec<_> = items.iter().map(|x| x.unwrap().1).collect();
-for i in 1..indicies.len() {
-if indicies[i - 1] + 1 != indicies[i] {
+let indices: Vec<_> = items.iter().map(|x| x.unwrap().1).collect();
+for i in 1..indices.len() {
+if indices[i - 1] + 1 != indices[i] {
 return;
 }
 }
-let min = *indicies.first().unwrap();
-let max = *indicies.last().unwrap();
+let min = *indices.first().unwrap();
+let max = *indices.last().unwrap();
 for item in items {
 let locals_use = item.unwrap().0;
@@ -459,7 +459,7 @@ fn write_scope_tree(
 let indent = depth * INDENT.len();
 let children = match scope_tree.get(&parent) {
-Some(childs) => childs,
+Some(children) => children,
 None => return Ok(()),
 };
@@ -201,7 +201,7 @@ fn resolve_struct_error<'sess, 'a>(resolver: &'sess Resolver,
 if let Some(impl_span) = maybe_impl_defid.map_or(None,
 |def_id| resolver.definitions.opt_span(def_id)) {
 err.span_label(reduce_impl_span_to_impl_keyword(cm, impl_span),
-"`Self` type implicitely declared here, on the `impl`");
+"`Self` type implicitly declared here, on the `impl`");
 }
 },
 Def::TyParam(typaram_defid) => {
@@ -81,7 +81,7 @@ fn dropck_outlives<'tcx>(
 // into the types of its fields `(B, Vec<A>)`. These will get
 // pushed onto the stack. Eventually, expanding `Vec<A>` will
 // lead to us trying to push `A` a second time -- to prevent
-// infinite recusion, we notice that `A` was already pushed
+// infinite recursion, we notice that `A` was already pushed
 // once and stop.
 let mut ty_stack = vec![(for_ty, 0)];
@@ -121,7 +121,7 @@ pub fn resolve_interior<'a, 'gcx, 'tcx>(fcx: &'a FnCtxt<'a, 'gcx, 'tcx>,
 // Replace all regions inside the generator interior with late bound regions
 // Note that each region slot in the types gets a new fresh late bound region,
 // which means that none of the regions inside relate to any other, even if
-// typeck had previously found contraints that would cause them to be related.
+// typeck had previously found constraints that would cause them to be related.
 let mut counter = 0;
 let type_list = fcx.tcx.fold_regions(&type_list, &mut false, |_, current_depth| {
 counter += 1;
@@ -876,7 +876,7 @@ fn typeck_tables_of<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
 // backwards compatibility. This makes fallback a stronger type hint than a cast coercion.
 fcx.check_casts();
-// Closure and generater analysis may run after fallback
+// Closure and generator analysis may run after fallback
 // because they don't constrain other type variables.
 fcx.closure_analyze(body);
 assert!(fcx.deferred_call_resolutions.borrow().is_empty());
@@ -2329,7 +2329,7 @@ impl<'a, 'gcx, 'tcx> FnCtxt<'a, 'gcx, 'tcx> {
 // unconstrained floats with f64.
 // Fallback becomes very dubious if we have encountered type-checking errors.
 // In that case, fallback to TyError.
-// The return value indicates whether fallback has occured.
+// The return value indicates whether fallback has occurred.
 fn fallback_if_possible(&self, ty: Ty<'tcx>) -> bool {
 use rustc::ty::error::UnconstrainedNumeric::Neither;
 use rustc::ty::error::UnconstrainedNumeric::{UnconstrainedInt, UnconstrainedFloat};
@@ -1284,7 +1284,7 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
 // how all the types get adjusted.)
 match ref_kind {
 ty::ImmBorrow => {
-// The reference being reborrowed is a sharable ref of
+// The reference being reborrowed is a shareable ref of
 // type `&'a T`. In this case, it doesn't matter where we
 // *found* the `&T` pointer, the memory it references will
 // be valid and immutable for `'a`. So we can stop here.
@@ -516,7 +516,7 @@ impl<'cx, 'gcx, 'tcx> WritebackCx<'cx, 'gcx, 'tcx> {
 }
 fn visit_node_id(&mut self, span: Span, hir_id: hir::HirId) {
-// Export associated path extensions and method resultions.
+// Export associated path extensions and method resolutions.
 if let Some(def) = self.fcx
 .tables
 .borrow_mut()
@@ -152,7 +152,7 @@ fn enforce_impl_params_are_constrained<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
 // }
 // ```
 //
-// In a concession to backwards compatbility, we continue to
+// In a concession to backwards compatibility, we continue to
 // permit those, so long as the lifetimes aren't used in
 // associated types. I believe this is sound, because lifetimes
 // used elsewhere are not projected back out.
@@ -824,7 +824,7 @@ impl<'a, 'tcx, 'rcx, 'cstore> AutoTraitFinder<'a, 'tcx, 'rcx, 'cstore> {
 // In fact, the iteration of an FxHashMap can even vary between platforms,
 // since FxHasher has different behavior for 32-bit and 64-bit platforms.
 //
-// Obviously, it's extremely undesireable for documentation rendering
+// Obviously, it's extremely undesirable for documentation rendering
 // to be depndent on the platform it's run on. Apart from being confusing
 // to end users, it makes writing tests much more difficult, as predicates
 // can appear in any order in the final result.
@@ -836,7 +836,7 @@ impl<'a, 'tcx, 'rcx, 'cstore> AutoTraitFinder<'a, 'tcx, 'rcx, 'cstore> {
 // predicates and bounds, however, we ensure that for a given codebase, all
 // auto-trait impls always render in exactly the same way.
 //
-// Using the Debug impementation for sorting prevents us from needing to
+// Using the Debug implementation for sorting prevents us from needing to
 // write quite a bit of almost entirely useless code (e.g. how should two
 // Types be sorted relative to each other). It also allows us to solve the
 // problem for both WherePredicates and GenericBounds at the same time. This
@@ -31,7 +31,7 @@ pub enum Cfg {
 True,
 /// Denies all configurations.
 False,
-/// A generic configration option, e.g. `test` or `target_os = "linux"`.
+/// A generic configuration option, e.g. `test` or `target_os = "linux"`.
 Cfg(Symbol, Option<Symbol>),
 /// Negate a configuration requirement, i.e. `not(x)`.
 Not(Box<Cfg>),
@@ -315,7 +315,7 @@ pub struct Cache {
 // the access levels from crateanalysis.
 pub access_levels: Arc<AccessLevels<DefId>>,
-/// The version of the crate being documented, if given fron the `--crate-version` flag.
+/// The version of the crate being documented, if given from the `--crate-version` flag.
 pub crate_version: Option<String>,
 // Private fields only used when initially crawling a crate to build a cache
@@ -88,7 +88,7 @@ where
 /// This function acquires exclusive access to the task context.
 ///
 /// Panics if no task has been set or if the task context has already been
-/// retrived by a surrounding call to get_task_cx.
+/// retrieved by a surrounding call to get_task_cx.
 pub fn get_task_cx<F, R>(f: F) -> R
 where
 F: FnOnce(&mut task::Context) -> R
@@ -889,7 +889,7 @@ impl<W: Write> Write for LineWriter<W> {
 // Find the last newline character in the buffer provided. If found then
 // we're going to write all the data up to that point and then flush,
-// otherewise we just write the whole block to the underlying writer.
+// otherwise we just write the whole block to the underlying writer.
 let i = match memchr::memrchr(b'\n', buf) {
 Some(i) => i,
 None => return self.inner.write(buf),
@@ -57,7 +57,7 @@ pub fn memrchr(needle: u8, haystack: &[u8]) -> Option<usize> {
 #[cfg(test)]
 mod tests {
-// test the implementations for the current plattform
+// test the implementations for the current platform
 use super::{memchr, memrchr};
 #[test]
@@ -176,7 +176,7 @@ impl Once {
 /// happens-before relation between the closure and code executing after the
 /// return).
 ///
-/// If the given closure recusively invokes `call_once` on the same `Once`
+/// If the given closure recursively invokes `call_once` on the same `Once`
 /// instance the exact behavior is not specified, allowed outcomes are
 /// a panic or a deadlock.
 ///
@@ -857,7 +857,7 @@ pub fn copy(from: &Path, to: &Path) -> io::Result<u64> {
 use sync::atomic::{AtomicBool, Ordering};
 // Kernel prior to 4.5 don't have copy_file_range
-// We store the availability in a global to avoid unneccessary syscalls
+// We store the availability in a global to avoid unnecessary syscalls
 static HAS_COPY_FILE_RANGE: AtomicBool = AtomicBool::new(true);
 unsafe fn copy_file_range(
@@ -35,7 +35,7 @@ use libc::SOCK_CLOEXEC;
 #[cfg(not(target_os = "linux"))]
 const SOCK_CLOEXEC: c_int = 0;
-// Another conditional contant for name resolution: Macos et iOS use
+// Another conditional constant for name resolution: Macos et iOS use
 // SO_NOSIGPIPE as a setsockopt flag to disable SIGPIPE emission on socket.
 // Other platforms do otherwise.
 #[cfg(target_vendor = "apple")]
@@ -321,7 +321,7 @@ impl<'a, 'b> MacroExpander<'a, 'b> {
 // we'll be able to immediately resolve most of imported macros.
 self.resolve_imports();
-// Resolve paths in all invocations and produce ouput expanded fragments for them, but
+// Resolve paths in all invocations and produce output expanded fragments for them, but
 // do not insert them into our input AST fragment yet, only store in `expanded_fragments`.
 // The output fragments also go through expansion recursively until no invocations are left.
 // Unresolved macros produce dummy outputs as a recovery measure.
@@ -637,7 +637,7 @@ pub fn parse(
 // A queue of possible matcher positions. We initialize it with the matcher position in which
 // the "dot" is before the first token of the first token tree in `ms`. `inner_parse_loop` then
-// processes all of these possible matcher positions and produces posible next positions into
+// processes all of these possible matcher positions and produces possible next positions into
 // `next_items`. After some post-processing, the contents of `next_items` replenish `cur_items`
 // and we start over again.
 //
@@ -726,7 +726,7 @@ pub fn parse(
 ),
 );
 }
-// If there are no posible next positions AND we aren't waiting for the black-box parser,
+// If there are no possible next positions AND we aren't waiting for the black-box parser,
 // then their is a syntax error.
 else if bb_items.is_empty() && next_items.is_empty() {
 return Failure(parser.span, parser.token);
@@ -323,7 +323,7 @@ where
 }
 // `tree` is followed by an `ident`. This could be `$meta_var` or the `$crate` special
-// metavariable that names the crate of the invokation.
+// metavariable that names the crate of the invocation.
 Some(tokenstream::TokenTree::Token(ident_span, ref token)) if token.is_ident() => {
 let (ident, is_raw) = token.ident().unwrap();
 let span = ident_span.with_lo(span.lo());
@@ -3535,8 +3535,8 @@ impl<'a> Parser<'a> {
 if arm_start_lines.lines[0].end_col == expr_lines.lines[0].end_col
 && expr_lines.lines.len() == 2
 && self.token == token::FatArrow => {
-// We check wether there's any trailing code in the parse span, if there
-// isn't, we very likely have the following:
+// We check whether there's any trailing code in the parse span,
+// if there isn't, we very likely have the following:
 //
 // X | &Y => "y"
 // | -- - missing comma
@@ -3934,7 +3934,7 @@ impl<'a> Parser<'a> {
 }
 /// A wrapper around `parse_pat` with some special error handling for the
-/// "top-level" patterns in a match arm, `for` loop, `let`, &c. (in contast
+/// "top-level" patterns in a match arm, `for` loop, `let`, &c. (in contrast
 /// to subpatterns within such).
 fn parse_top_level_pat(&mut self) -> PResult<'a, P<Pat>> {
 let pat = self.parse_pat()?;
@@ -4322,7 +4322,7 @@ impl<'a> Parser<'a> {
 // If `break_on_semi` is `Break`, then we will stop consuming tokens after
 // finding (and consuming) a `;` outside of `{}` or `[]` (note that this is
 // approximate - it can mean we break too early due to macros, but that
-// shoud only lead to sub-optimal recovery, not inaccurate parsing).
+// should only lead to sub-optimal recovery, not inaccurate parsing).
 //
 // If `break_on_block` is `Break`, then we will stop consuming tokens
 // after finding (and consuming) a brace-delimited block.
@@ -4887,7 +4887,7 @@ impl<'a> Parser<'a> {
 fn parse_generic_bounds_common(&mut self, allow_plus: bool) -> PResult<'a, GenericBounds> {
 let mut bounds = Vec::new();
 loop {
-// This needs to be syncronized with `Token::can_begin_bound`.
+// This needs to be synchronized with `Token::can_begin_bound`.
 let is_bound_start = self.check_path() || self.check_lifetime() ||
 self.check(&token::Question) ||
 self.check_keyword(keywords::For) ||
@@ -117,7 +117,7 @@ struct Context<'a, 'b: 'a> {
 invalid_refs: Vec<(usize, usize)>,
 /// Spans of all the formatting arguments, in order.
 arg_spans: Vec<Span>,
-/// Wether this formatting string is a literal or it comes from a macro.
+/// Whether this formatting string is a literal or it comes from a macro.
 is_literal: bool,
 }
@@ -586,7 +586,7 @@ impl InternedString {
 });
 // This is safe because the interner keeps string alive until it is dropped.
 // We can access it because we know the interner is still alive since we use a
-// scoped thread local to access it, and it was alive at the begining of this scope
+// scoped thread local to access it, and it was alive at the beginning of this scope
 unsafe { f(&*str) }
 }
@@ -28,7 +28,7 @@ fn start(_: isize, _: *const *const u8) -> isize {
 let _: (char, u32) = Trait::without_default_impl(0);
 // Currently, no object code is generated for trait methods with default
-// implemenations, unless they are actually called from somewhere. Therefore
+// implementations, unless they are actually called from somewhere. Therefore
 // we cannot import the implementations and have to create our own inline.
 //~ MONO_ITEM fn cgu_export_trait_method::Trait[0]::with_default_impl[0]<u32>
 let _ = Trait::with_default_impl(0u32);
@@ -9,7 +9,7 @@
 // except according to those terms.
 // This crate attempts to enumerate the various scenarios for how a
-// type can define fields and methods with various visiblities and
+// type can define fields and methods with various visibilities and
 // stabilities.
 //
 // The basic stability pattern in this file has four cases:
@@ -23,7 +23,7 @@
 //
 // However, since stability attributes can only be observed in
 // cross-crate linkage scenarios, there is little reason to take the
-// cross-product (4 stability cases * 4 visiblity cases), because the
+// cross-product (4 stability cases * 4 visibility cases), because the
 // first three visibility cases cannot be accessed outside this crate,
 // and therefore stability is only relevant when the visibility is pub
 // to the whole universe.
@@ -8,7 +8,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
-// error-pattern:runned an unexported test
+// error-pattern:ran an unexported test
 // compile-flags:--test
 // check-stdout
@@ -17,6 +17,6 @@ mod m {
 #[test]
 fn unexported() {
-panic!("runned an unexported test");
+panic!("ran an unexported test");
 }
 }
@@ -1,6 +1,6 @@
 -include ../tools.mk
-# Test that hir-tree output doens't crash and includes
+# Test that hir-tree output doesn't crash and includes
 # the string constant we would expect to see.
 all:
@@ -1,6 +1,6 @@
 -include ../tools.mk
-# Test that hir-tree output doens't crash and includes
+# Test that hir-tree output doesn't crash and includes
 # the string constant we would expect to see.
 all:
@@ -3,7 +3,7 @@
 LOG := $(TMPDIR)/log.txt
 # This test builds a shared object, then an executable that links it as a native
-# rust library (constrast to an rlib). The shared library and executable both
+# rust library (contrast to an rlib). The shared library and executable both
 # are compiled with address sanitizer, and we assert that a fault in the cdylib
 # is correctly detected.
@@ -3,7 +3,7 @@
 LOG := $(TMPDIR)/log.txt
 # This test builds a shared object, then an executable that links it as a native
-# rust library (constrast to an rlib). The shared library and executable both
+# rust library (contrast to an rlib). The shared library and executable both
 # are compiled with address sanitizer, and we assert that a fault in the dylib
 # is correctly detected.
@@ -385,7 +385,7 @@ pub fn main() {
 // RwLock (since we can grab the child pointers in read-only
 // mode), but we cannot lock a std::sync::Mutex to guard reading
 // from each node via the same pattern, since once you hit the
-// cycle, you'll be trying to acquring the same lock twice.
+// cycle, you'll be trying to acquiring the same lock twice.
 // (We deal with this by exiting the traversal early if try_lock fails.)
 // Cycle 12: { arc0 -> (arc1, arc2), arc1 -> (), arc2 -> arc0 }, refcells
@@ -310,7 +310,7 @@ fn test_order() {
 }
 fn test_once() {
-// Make sure each argument are evaluted only once even though it may be
+// Make sure each argument are evaluated only once even though it may be
 // formatted multiple times
 fn foo() -> isize {
 static mut FOO: isize = 0;
@@ -27,7 +27,7 @@ pub fn main() {
 }
 match 'c' {
 'a'...'z' => {}
-_ => panic!("should suppport char ranges")
+_ => panic!("should support char ranges")
 }
 match -3_isize {
 -7...5 => {}
@@ -26,7 +26,7 @@ type TypeD = TypeA<'static>;
 // trailing comma on lifetime bounds
 type TypeE = TypeA<'static,>;
-// normal type arugment
+// normal type argument
 type TypeF<T> = Box<T>;
 // type argument with trailing comma
@@ -11,7 +11,7 @@
 // Issue 33903:
 // Built-in indexing should be used even when the index is not
 // trivially an integer
-// Only built-in indexing can be used in constant expresssions
+// Only built-in indexing can be used in constant expressions
 const FOO: i32 = [12, 34][0 + 1];
@@ -8,7 +8,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
-// Test that we are able to reinitilize box with moved referent
+// Test that we are able to reinitialize box with moved referent
 #![feature(nll)]
 static mut ORDER: [usize; 3] = [0, 0, 0];
 static mut INDEX: usize = 0;
@@ -34,7 +34,7 @@ pub fn main() {
 }
 match 'c' {
 'a'..='z' => {}
-_ => panic!("should suppport char ranges")
+_ => panic!("should support char ranges")
 }
 match -3 {
 -7..=5 => {}
@@ -9,7 +9,7 @@
 // except according to those terms.
 // Regression test for #23698: The reassignment checker only cared
-// about the last assigment in a match arm body
+// about the last assignment in a match arm body
 // Use an extra function to make sure no extra assignments
 // are introduced by macros in the match statement
@@ -26,7 +26,7 @@ fn sanity_check_size<T: Copy>(one: T) {
 fn main() {
 // This can fail if rustc and LLVM disagree on the size of a type.
-// In this case, `Option<Packed<(&(), u32)>>` was erronously not
+// In this case, `Option<Packed<(&(), u32)>>` was erroneously not
 // marked as packed despite needing alignment `1` and containing
 // its `&()` discriminant, which has alignment larger than `1`.
 sanity_check_size((Some(Packed((&(), 0))), true));
@@ -33,7 +33,7 @@ extern fn send_signal() {
 fn main() {
 unsafe {
-// Install signal hander that runs on alternate signal stack.
+// Install signal handler that runs on alternate signal stack.
 let mut action: sigaction = std::mem::zeroed();
 action.sa_flags = (SA_ONSTACK | SA_SIGINFO) as _;
 action.sa_sigaction = signal_handler as sighandler_t;
@@ -10,7 +10,7 @@
 #![crate_name = "qwop"]
-/// (writen on a spider's web) Some Macro
+/// (written on a spider's web) Some Macro
 #[macro_export]
 macro_rules! some_macro {
 () => {
@@ -8,7 +8,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
-// Check that the user gets an errror if they omit a binding from an
+// Check that the user gets an error if they omit a binding from an
 // object type.
 pub trait Foo {
@@ -19,7 +19,7 @@
 // revisions: ast migrate nll
 // Since we are testing nll (and migration) explicitly as a separate
-// revisions, dont worry about the --compare-mode=nll on this test.
+// revisions, don't worry about the --compare-mode=nll on this test.
 // ignore-compare-mode-nll
@@ -8,7 +8,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
-// We need to opt inot the `!` feature in order to trigger the
+// We need to opt into the `!` feature in order to trigger the
 // requirement that this is testing.
 #![feature(never_type)]
@@ -24,7 +24,7 @@ error[E0401]: can't use type parameters from outer function
 --> $DIR/E0401.rs:32:25
 |
 LL | impl<T> Iterator for A<T> {
-| ---- `Self` type implicitely declared here, on the `impl`
+| ---- `Self` type implicitly declared here, on the `impl`
 ...
 LL | fn helper(sel: &Self) -> u8 { //~ ERROR E0401
 | ------ ^^^^ use of type variable from outer function
@@ -8,7 +8,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
-// `#[macro_export] macro_rules` that doen't originate from macro expansions can be placed
+// `#[macro_export] macro_rules` that doesn't originate from macro expansions can be placed
 // into the root module soon enough to act as usual items and shadow globs and preludes.
 #![feature(decl_macro)]
@@ -8,7 +8,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
-// Checks lexical scopes cannot see through normal module boundries
+// Checks lexical scopes cannot see through normal module boundaries
 fn f() {
 fn g() {}
@@ -24,7 +24,7 @@
 // run-pass
-// This test has structs and functions that are by definiton unusable
+// This test has structs and functions that are by definition unusable
 // all over the place, so just go ahead and allow dead_code
 #![allow(dead_code)]
@@ -8,7 +8,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
-// Confirm that we don't accidently divide or mod by zero in llvm_type
+// Confirm that we don't accidentally divide or mod by zero in llvm_type
 // compile-pass
@@ -10,7 +10,7 @@
 #![feature(label_break_value)]
-// These are forbidden occurences of label-break-value
+// These are forbidden occurrences of label-break-value
 fn labeled_unsafe() {
 unsafe 'b: {} //~ ERROR expected one of `extern`, `fn`, or `{`
@@ -9,8 +9,8 @@
 // except according to those terms.
 // FIXME: Change to UI Test
-// Check notes are placed on an assignment that can actually precede the current assigmnent
-// Don't emmit a first assignment for assignment in a loop.
+// Check notes are placed on an assignment that can actually precede the current assignment
+// Don't emit a first assignment for assignment in a loop.
 // compile-flags: -Zborrowck=compare
@@ -35,7 +35,7 @@ use lint_unused_extern_crate2::foo as bar;
 use other::*;
 mod foo {
-// Test that this is unused even though an earler `extern crate` is used.
+// Test that this is unused even though an earlier `extern crate` is used.
 extern crate lint_unused_extern_crate2; //~ ERROR unused extern crate
 }
@@ -11,7 +11,7 @@
 #![feature(generic_associated_types)]
 //FIXME(#44265): The lifetime shadowing and type parameter shadowing
-// should cause an error. Now it compiles (errorneously) and this will be addressed
+// should cause an error. Now it compiles (erroneously) and this will be addressed
 // by a future PR. Then remove the following:
 // compile-pass
@@ -12,7 +12,7 @@
 #![deny(rust_2018_compatibility)]
-// Don't make a suggestion for a raw identifer replacement unless raw
+// Don't make a suggestion for a raw identifier replacement unless raw
 // identifiers are enabled.
 fn main() {
@@ -9,7 +9,7 @@
 // except according to those terms.
 // Test that we DO NOT warn when lifetime name is used multiple
-// argments, or more than once in a single argument.
+// arguments, or more than once in a single argument.
 //
 // compile-pass
@@ -8,7 +8,7 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
-// ignore-arm stdcall isn't suppported
+// ignore-arm stdcall isn't supported
 fn baz(f: extern "stdcall" fn(usize, ...)) {
 //~^ ERROR: variadic function must have C or cdecl calling convention
@@ -8,8 +8,8 @@
 // option. This file may not be copied, modified, or distributed
 // except according to those terms.
-// ignore-arm stdcall isn't suppported
-// ignore-aarch64 stdcall isn't suppported
+// ignore-arm stdcall isn't supported
+// ignore-aarch64 stdcall isn't supported
 extern "stdcall" {
 fn printf(_: *const u8, ...); //~ ERROR: variadic function must have C or cdecl calling
@@ -10,7 +10,7 @@
 // Test that we can quantify lifetimes outside a constraint (i.e., including
 // the self type) in a where clause. Specifically, test that implementing for a
-// specific lifetime is not enough to satisify the `for<'a> ...` constraint, which
+// specific lifetime is not enough to satisfy the `for<'a> ...` constraint, which
 // should require *all* lifetimes.
 static X: &'static u32 = &42;