2020-06-14 14:53:36 +02:00
|
|
|
//! Note: most of the tests relevant to this file can be found (at the time of writing) in
|
|
|
|
//! src/tests/ui/pattern/usefulness.
|
|
|
|
//!
|
|
|
|
//! This file includes the logic for exhaustiveness and usefulness checking for
|
|
|
|
//! pattern-matching. Specifically, given a list of patterns for a type, we can
|
|
|
|
//! tell whether:
|
2020-06-30 10:56:10 +02:00
|
|
|
//! (a) the patterns cover every possible constructor for the type (exhaustiveness)
|
|
|
|
//! (b) each pattern is necessary (usefulness)
|
2020-06-14 14:53:36 +02:00
|
|
|
//!
|
|
|
|
//! The algorithm implemented here is a modified version of the one described in:
|
|
|
|
//! http://moscova.inria.fr/~maranget/papers/warn/index.html
|
|
|
|
//! However, to save future implementors from reading the original paper, we
|
|
|
|
//! summarise the algorithm here to hopefully save time and be a little clearer
|
|
|
|
//! (without being so rigorous).
|
|
|
|
//!
|
|
|
|
//! # Premise
|
|
|
|
//!
|
|
|
|
//! The core of the algorithm revolves about a "usefulness" check. In particular, we
|
|
|
|
//! are trying to compute a predicate `U(P, p)` where `P` is a list of patterns (we refer to this as
|
|
|
|
//! a matrix). `U(P, p)` represents whether, given an existing list of patterns
|
|
|
|
//! `P_1 ..= P_m`, adding a new pattern `p` will be "useful" (that is, cover previously-
|
|
|
|
//! uncovered values of the type).
|
|
|
|
//!
|
|
|
|
//! If we have this predicate, then we can easily compute both exhaustiveness of an
|
|
|
|
//! entire set of patterns and the individual usefulness of each one.
|
|
|
|
//! (a) the set of patterns is exhaustive iff `U(P, _)` is false (i.e., adding a wildcard
|
|
|
|
//! match doesn't increase the number of values we're matching)
|
|
|
|
//! (b) a pattern `P_i` is not useful if `U(P[0..=(i-1), P_i)` is false (i.e., adding a
|
|
|
|
//! pattern to those that have come before it doesn't increase the number of values
|
|
|
|
//! we're matching).
|
|
|
|
//!
|
|
|
|
//! # Core concept
|
|
|
|
//!
|
|
|
|
//! The idea that powers everything that is done in this file is the following: a value is made
|
|
|
|
//! from a constructor applied to some fields. Examples of constructors are `Some`, `None`, `(,)`
|
|
|
|
//! (the 2-tuple constructor), `Foo {..}` (the constructor for a struct `Foo`), and `2` (the
|
|
|
|
//! constructor for the number `2`). Fields are just a (possibly empty) list of values.
|
|
|
|
//!
|
|
|
|
//! Some of the constructors listed above might feel weird: `None` and `2` don't take any
|
|
|
|
//! arguments. This is part of what makes constructors so general: we will consider plain values
|
|
|
|
//! like numbers and string literals to be constructors that take no arguments, also called "0-ary
|
|
|
|
//! constructors"; they are the simplest case of constructors. This allows us to see any value as
|
|
|
|
//! made up from a tree of constructors, each having a given number of children. For example:
|
|
|
|
//! `(None, Ok(0))` is made from 4 different constructors.
|
|
|
|
//!
|
|
|
|
//! This idea can be extended to patterns: a pattern captures a set of possible values, and we can
|
|
|
|
//! describe this set using constructors. For example, `Err(_)` captures all values of the type
|
|
|
|
//! `Result<T, E>` that start with the `Err` constructor (for some choice of `T` and `E`). The
|
|
|
|
//! wildcard `_` captures all values of the given type starting with any of the constructors for
|
|
|
|
//! that type.
|
|
|
|
//!
|
|
|
|
//! We use this to compute whether different patterns might capture a same value. Do the patterns
|
|
|
|
//! `Ok("foo")` and `Err(_)` capture a common value? The answer is no, because the first pattern
|
|
|
|
//! captures only values starting with the `Ok` constructor and the second only values starting
|
|
|
|
//! with the `Err` constructor. Do the patterns `Some(42)` and `Some(1..10)` intersect? They might,
|
|
|
|
//! since they both capture values starting with `Some`. To be certain, we need to dig under the
|
|
|
|
//! `Some` constructor and continue asking the question. This is the main idea behind the
|
|
|
|
//! exhaustiveness algorithm: by looking at patterns constructor-by-constructor, we can efficiently
|
|
|
|
//! figure out if some new pattern might capture a value that hadn't been captured by previous
|
|
|
|
//! patterns.
|
|
|
|
//!
|
|
|
|
//! Constructors are represented by the `Constructor` enum, and its fields by the `Fields` enum.
|
|
|
|
//! Most of the complexity of this file resides in transforming between patterns and
|
|
|
|
//! (`Constructor`, `Fields`) pairs, handling all the special cases correctly.
|
|
|
|
//!
|
|
|
|
//! Caveat: this constructors/fields distinction doesn't quite cover every Rust value. For example
|
|
|
|
//! a value of type `Rc<u64>` doesn't fit this idea very well, nor do various other things.
|
|
|
|
//! However, this idea covers most of the cases that are relevant to exhaustiveness checking.
|
|
|
|
//!
|
|
|
|
//!
|
|
|
|
//! # Algorithm
|
|
|
|
//!
|
|
|
|
//! Recall that `U(P, p)` represents whether, given an existing list of patterns (aka matrix) `P`,
|
|
|
|
//! adding a new pattern `p` will cover previously-uncovered values of the type.
|
|
|
|
//! During the course of the algorithm, the rows of the matrix won't just be individual patterns,
|
|
|
|
//! but rather partially-deconstructed patterns in the form of a list of fields. The paper
|
|
|
|
//! calls those pattern-vectors, and we will call them pattern-stacks. The same holds for the
|
|
|
|
//! new pattern `p`.
|
|
|
|
//!
|
|
|
|
//! For example, say we have the following:
|
2020-10-18 21:54:10 -07:00
|
|
|
//!
|
2020-06-14 14:53:36 +02:00
|
|
|
//! ```
|
2020-10-18 21:54:10 -07:00
|
|
|
//! // x: (Option<bool>, Result<()>)
|
|
|
|
//! match x {
|
|
|
|
//! (Some(true), _) => {}
|
|
|
|
//! (None, Err(())) => {}
|
|
|
|
//! (None, Err(_)) => {}
|
|
|
|
//! }
|
2020-06-14 14:53:36 +02:00
|
|
|
//! ```
|
2020-10-18 21:54:10 -07:00
|
|
|
//!
|
2020-06-14 14:53:36 +02:00
|
|
|
//! Here, the matrix `P` starts as:
|
2020-10-18 21:54:10 -07:00
|
|
|
//!
|
|
|
|
//! ```
|
2020-06-14 14:53:36 +02:00
|
|
|
//! [
|
|
|
|
//! [(Some(true), _)],
|
|
|
|
//! [(None, Err(()))],
|
|
|
|
//! [(None, Err(_))],
|
|
|
|
//! ]
|
2020-10-18 21:54:10 -07:00
|
|
|
//! ```
|
|
|
|
//!
|
2020-06-14 14:53:36 +02:00
|
|
|
//! We can tell it's not exhaustive, because `U(P, _)` is true (we're not covering
|
|
|
|
//! `[(Some(false), _)]`, for instance). In addition, row 3 is not useful, because
|
|
|
|
//! all the values it covers are already covered by row 2.
|
|
|
|
//!
|
|
|
|
//! A list of patterns can be thought of as a stack, because we are mainly interested in the top of
|
|
|
|
//! the stack at any given point, and we can pop or apply constructors to get new pattern-stacks.
|
|
|
|
//! To match the paper, the top of the stack is at the beginning / on the left.
|
|
|
|
//!
|
|
|
|
//! There are two important operations on pattern-stacks necessary to understand the algorithm:
|
2020-06-30 10:56:10 +02:00
|
|
|
//!
|
|
|
|
//! 1. We can pop a given constructor off the top of a stack. This operation is called
|
|
|
|
//! `specialize`, and is denoted `S(c, p)` where `c` is a constructor (like `Some` or
|
|
|
|
//! `None`) and `p` a pattern-stack.
|
|
|
|
//! If the pattern on top of the stack can cover `c`, this removes the constructor and
|
|
|
|
//! pushes its arguments onto the stack. It also expands OR-patterns into distinct patterns.
|
|
|
|
//! Otherwise the pattern-stack is discarded.
|
|
|
|
//! This essentially filters those pattern-stacks whose top covers the constructor `c` and
|
|
|
|
//! discards the others.
|
|
|
|
//!
|
|
|
|
//! For example, the first pattern above initially gives a stack `[(Some(true), _)]`. If we
|
|
|
|
//! pop the tuple constructor, we are left with `[Some(true), _]`, and if we then pop the
|
|
|
|
//! `Some` constructor we get `[true, _]`. If we had popped `None` instead, we would get
|
|
|
|
//! nothing back.
|
|
|
|
//!
|
|
|
|
//! This returns zero or more new pattern-stacks, as follows. We look at the pattern `p_1`
|
|
|
|
//! on top of the stack, and we have four cases:
|
|
|
|
//! 1.1. `p_1 = c(r_1, .., r_a)`, i.e. the top of the stack has constructor `c`. We
|
|
|
|
//! push onto the stack the arguments of this constructor, and return the result:
|
|
|
|
//! r_1, .., r_a, p_2, .., p_n
|
|
|
|
//! 1.2. `p_1 = c'(r_1, .., r_a')` where `c ≠ c'`. We discard the current stack and
|
|
|
|
//! return nothing.
|
|
|
|
//! 1.3. `p_1 = _`. We push onto the stack as many wildcards as the constructor `c` has
|
|
|
|
//! arguments (its arity), and return the resulting stack:
|
|
|
|
//! _, .., _, p_2, .., p_n
|
|
|
|
//! 1.4. `p_1 = r_1 | r_2`. We expand the OR-pattern and then recurse on each resulting
|
|
|
|
//! stack:
|
|
|
|
//! S(c, (r_1, p_2, .., p_n))
|
|
|
|
//! S(c, (r_2, p_2, .., p_n))
|
|
|
|
//!
|
|
|
|
//! 2. We can pop a wildcard off the top of the stack. This is called `D(p)`, where `p` is
|
|
|
|
//! a pattern-stack.
|
|
|
|
//! This is used when we know there are missing constructor cases, but there might be
|
|
|
|
//! existing wildcard patterns, so to check the usefulness of the matrix, we have to check
|
|
|
|
//! all its *other* components.
|
|
|
|
//!
|
|
|
|
//! It is computed as follows. We look at the pattern `p_1` on top of the stack,
|
|
|
|
//! and we have three cases:
|
2020-09-21 20:29:12 +09:00
|
|
|
//! 2.1. `p_1 = c(r_1, .., r_a)`. We discard the current stack and return nothing.
|
|
|
|
//! 2.2. `p_1 = _`. We return the rest of the stack:
|
2020-06-30 10:56:10 +02:00
|
|
|
//! p_2, .., p_n
|
2020-09-21 20:29:12 +09:00
|
|
|
//! 2.3. `p_1 = r_1 | r_2`. We expand the OR-pattern and then recurse on each resulting
|
2020-06-30 10:56:10 +02:00
|
|
|
//! stack.
|
|
|
|
//! D((r_1, p_2, .., p_n))
|
|
|
|
//! D((r_2, p_2, .., p_n))
|
|
|
|
//!
|
|
|
|
//! Note that the OR-patterns are not always used directly in Rust, but are used to derive the
|
|
|
|
//! exhaustive integer matching rules, so they're written here for posterity.
|
2020-06-14 14:53:36 +02:00
|
|
|
//!
|
|
|
|
//! Both those operations extend straightforwardly to a list or pattern-stacks, i.e. a matrix, by
|
|
|
|
//! working row-by-row. Popping a constructor ends up keeping only the matrix rows that start with
|
|
|
|
//! the given constructor, and popping a wildcard keeps those rows that start with a wildcard.
|
|
|
|
//!
|
|
|
|
//!
|
|
|
|
//! The algorithm for computing `U`
|
|
|
|
//! -------------------------------
|
|
|
|
//! The algorithm is inductive (on the number of columns: i.e., components of tuple patterns).
|
|
|
|
//! That means we're going to check the components from left-to-right, so the algorithm
|
|
|
|
//! operates principally on the first component of the matrix and new pattern-stack `p`.
|
|
|
|
//! This algorithm is realised in the `is_useful` function.
|
|
|
|
//!
|
|
|
|
//! Base case. (`n = 0`, i.e., an empty tuple pattern)
|
|
|
|
//! - If `P` already contains an empty pattern (i.e., if the number of patterns `m > 0`),
|
|
|
|
//! then `U(P, p)` is false.
|
|
|
|
//! - Otherwise, `P` must be empty, so `U(P, p)` is true.
|
|
|
|
//!
|
|
|
|
//! Inductive step. (`n > 0`, i.e., whether there's at least one column
|
|
|
|
//! [which may then be expanded into further columns later])
|
2020-06-30 10:56:10 +02:00
|
|
|
//! We're going to match on the top of the new pattern-stack, `p_1`.
|
|
|
|
//! - If `p_1 == c(r_1, .., r_a)`, i.e. we have a constructor pattern.
|
|
|
|
//! Then, the usefulness of `p_1` can be reduced to whether it is useful when
|
|
|
|
//! we ignore all the patterns in the first column of `P` that involve other constructors.
|
|
|
|
//! This is where `S(c, P)` comes in:
|
|
|
|
//! `U(P, p) := U(S(c, P), S(c, p))`
|
|
|
|
//! This special case is handled in `is_useful_specialized`.
|
|
|
|
//!
|
|
|
|
//! For example, if `P` is:
|
2020-10-18 21:54:10 -07:00
|
|
|
//!
|
|
|
|
//! ```
|
2020-06-30 10:56:10 +02:00
|
|
|
//! [
|
2020-10-18 21:54:10 -07:00
|
|
|
//! [Some(true), _],
|
|
|
|
//! [None, 0],
|
2020-06-30 10:56:10 +02:00
|
|
|
//! ]
|
2020-10-18 21:54:10 -07:00
|
|
|
//! ```
|
|
|
|
//!
|
2020-06-30 10:56:10 +02:00
|
|
|
//! and `p` is [Some(false), 0], then we don't care about row 2 since we know `p` only
|
|
|
|
//! matches values that row 2 doesn't. For row 1 however, we need to dig into the
|
|
|
|
//! arguments of `Some` to know whether some new value is covered. So we compute
|
|
|
|
//! `U([[true, _]], [false, 0])`.
|
|
|
|
//!
|
|
|
|
//! - If `p_1 == _`, then we look at the list of constructors that appear in the first
|
|
|
|
//! component of the rows of `P`:
|
|
|
|
//! + If there are some constructors that aren't present, then we might think that the
|
|
|
|
//! wildcard `_` is useful, since it covers those constructors that weren't covered
|
|
|
|
//! before.
|
|
|
|
//! That's almost correct, but only works if there were no wildcards in those first
|
|
|
|
//! components. So we need to check that `p` is useful with respect to the rows that
|
|
|
|
//! start with a wildcard, if there are any. This is where `D` comes in:
|
|
|
|
//! `U(P, p) := U(D(P), D(p))`
|
|
|
|
//!
|
|
|
|
//! For example, if `P` is:
|
2020-10-18 21:54:10 -07:00
|
|
|
//!
|
|
|
|
//! ```
|
2020-06-30 10:56:10 +02:00
|
|
|
//! [
|
|
|
|
//! [_, true, _],
|
|
|
|
//! [None, false, 1],
|
|
|
|
//! ]
|
2020-10-18 21:54:10 -07:00
|
|
|
//! ```
|
|
|
|
//!
|
2020-06-30 10:56:10 +02:00
|
|
|
//! and `p` is [_, false, _], the `Some` constructor doesn't appear in `P`. So if we
|
|
|
|
//! only had row 2, we'd know that `p` is useful. However row 1 starts with a
|
|
|
|
//! wildcard, so we need to check whether `U([[true, _]], [false, 1])`.
|
|
|
|
//!
|
|
|
|
//! + Otherwise, all possible constructors (for the relevant type) are present. In this
|
|
|
|
//! case we must check whether the wildcard pattern covers any unmatched value. For
|
|
|
|
//! that, we can think of the `_` pattern as a big OR-pattern that covers all
|
|
|
|
//! possible constructors. For `Option`, that would mean `_ = None | Some(_)` for
|
|
|
|
//! example. The wildcard pattern is useful in this case if it is useful when
|
|
|
|
//! specialized to one of the possible constructors. So we compute:
|
|
|
|
//! `U(P, p) := ∃(k ϵ constructors) U(S(k, P), S(k, p))`
|
|
|
|
//!
|
|
|
|
//! For example, if `P` is:
|
2020-10-18 21:54:10 -07:00
|
|
|
//!
|
|
|
|
//! ```
|
2020-06-30 10:56:10 +02:00
|
|
|
//! [
|
|
|
|
//! [Some(true), _],
|
|
|
|
//! [None, false],
|
|
|
|
//! ]
|
2020-10-18 21:54:10 -07:00
|
|
|
//! ```
|
|
|
|
//!
|
2020-06-30 10:56:10 +02:00
|
|
|
//! and `p` is [_, false], both `None` and `Some` constructors appear in the first
|
|
|
|
//! components of `P`. We will therefore try popping both constructors in turn: we
|
|
|
|
//! compute `U([[true, _]], [_, false])` for the `Some` constructor, and `U([[false]],
|
|
|
|
//! [false])` for the `None` constructor. The first case returns true, so we know that
|
|
|
|
//! `p` is useful for `P`. Indeed, it matches `[Some(false), _]` that wasn't matched
|
|
|
|
//! before.
|
|
|
|
//!
|
|
|
|
//! - If `p_1 == r_1 | r_2`, then the usefulness depends on each `r_i` separately:
|
|
|
|
//! `U(P, p) := U(P, (r_1, p_2, .., p_n))
|
|
|
|
//! || U(P, (r_2, p_2, .., p_n))`
|
2020-06-14 14:53:36 +02:00
|
|
|
//!
|
|
|
|
//! Modifications to the algorithm
|
|
|
|
//! ------------------------------
|
|
|
|
//! The algorithm in the paper doesn't cover some of the special cases that arise in Rust, for
|
|
|
|
//! example uninhabited types and variable-length slice patterns. These are drawn attention to
|
|
|
|
//! throughout the code below. I'll make a quick note here about how exhaustive integer matching is
|
|
|
|
//! accounted for, though.
|
|
|
|
//!
|
|
|
|
//! Exhaustive integer matching
|
|
|
|
//! ---------------------------
|
|
|
|
//! An integer type can be thought of as a (huge) sum type: 1 | 2 | 3 | ...
|
|
|
|
//! So to support exhaustive integer matching, we can make use of the logic in the paper for
|
|
|
|
//! OR-patterns. However, we obviously can't just treat ranges x..=y as individual sums, because
|
|
|
|
//! they are likely gigantic. So we instead treat ranges as constructors of the integers. This means
|
|
|
|
//! that we have a constructor *of* constructors (the integers themselves). We then need to work
|
|
|
|
//! through all the inductive step rules above, deriving how the ranges would be treated as
|
|
|
|
//! OR-patterns, and making sure that they're treated in the same way even when they're ranges.
|
|
|
|
//! There are really only four special cases here:
|
|
|
|
//! - When we match on a constructor that's actually a range, we have to treat it as if we would
|
|
|
|
//! an OR-pattern.
|
|
|
|
//! + It turns out that we can simply extend the case for single-value patterns in
|
|
|
|
//! `specialize` to either be *equal* to a value constructor, or *contained within* a range
|
|
|
|
//! constructor.
|
|
|
|
//! + When the pattern itself is a range, you just want to tell whether any of the values in
|
|
|
|
//! the pattern range coincide with values in the constructor range, which is precisely
|
|
|
|
//! intersection.
|
|
|
|
//! Since when encountering a range pattern for a value constructor, we also use inclusion, it
|
|
|
|
//! means that whenever the constructor is a value/range and the pattern is also a value/range,
|
|
|
|
//! we can simply use intersection to test usefulness.
|
|
|
|
//! - When we're testing for usefulness of a pattern and the pattern's first component is a
|
|
|
|
//! wildcard.
|
|
|
|
//! + If all the constructors appear in the matrix, we have a slight complication. By default,
|
|
|
|
//! the behaviour (i.e., a disjunction over specialised matrices for each constructor) is
|
|
|
|
//! invalid, because we want a disjunction over every *integer* in each range, not just a
|
|
|
|
//! disjunction over every range. This is a bit more tricky to deal with: essentially we need
|
|
|
|
//! to form equivalence classes of subranges of the constructor range for which the behaviour
|
|
|
|
//! of the matrix `P` and new pattern `p` are the same. This is described in more
|
2020-10-25 21:59:59 +00:00
|
|
|
//! detail in `Constructor::split`.
|
2020-06-14 14:53:36 +02:00
|
|
|
//! + If some constructors are missing from the matrix, it turns out we don't need to do
|
|
|
|
//! anything special (because we know none of the integers are actually wildcards: i.e., we
|
|
|
|
//! can't span wildcards using ranges).
|
2016-09-24 18:24:34 +03:00
|
|
|
use self::Constructor::*;
|
2019-11-17 15:54:44 +00:00
|
|
|
use self::SliceKind::*;
|
2016-09-24 18:24:34 +03:00
|
|
|
use self::Usefulness::*;
|
|
|
|
use self::WitnessPreference::*;
|
|
|
|
|
2020-01-06 07:03:46 +01:00
|
|
|
use rustc_data_structures::captures::Captures;
|
2020-09-19 22:00:10 +09:00
|
|
|
use rustc_data_structures::fx::{FxHashMap, FxHashSet};
|
2019-09-26 05:38:33 +00:00
|
|
|
use rustc_index::vec::Idx;
|
2016-09-26 02:53:26 +03:00
|
|
|
|
2019-09-21 13:49:14 +02:00
|
|
|
use super::{compare_const_vals, PatternFoldable, PatternFolder};
|
2019-09-26 18:51:59 +01:00
|
|
|
use super::{FieldPat, Pat, PatKind, PatRange};
|
2016-09-24 18:24:34 +03:00
|
|
|
|
2020-06-02 20:19:49 +03:00
|
|
|
use rustc_arena::TypedArena;
|
2020-03-29 17:19:48 +02:00
|
|
|
use rustc_attr::{SignedInt, UnsignedInt};
|
|
|
|
use rustc_hir::def_id::DefId;
|
|
|
|
use rustc_hir::{HirId, RangeEnd};
|
2020-10-01 09:24:44 +02:00
|
|
|
use rustc_middle::mir::interpret::{truncate, ConstValue};
|
2020-03-29 16:41:09 +02:00
|
|
|
use rustc_middle::mir::Field;
|
2020-03-31 18:16:47 +02:00
|
|
|
use rustc_middle::ty::layout::IntegerExt;
|
2020-05-09 13:46:05 +01:00
|
|
|
use rustc_middle::ty::{self, Const, Ty, TyCtxt};
|
2020-03-11 12:49:08 +01:00
|
|
|
use rustc_session::lint;
|
2019-12-31 20:15:40 +03:00
|
|
|
use rustc_span::{Span, DUMMY_SP};
|
2020-03-31 18:16:47 +02:00
|
|
|
use rustc_target::abi::{Integer, Size, VariantIdx};
|
2016-09-24 18:24:34 +03:00
|
|
|
|
2019-09-21 13:49:14 +02:00
|
|
|
use smallvec::{smallvec, SmallVec};
|
|
|
|
use std::cmp::{self, max, min, Ordering};
|
2016-09-26 02:53:26 +03:00
|
|
|
use std::fmt;
|
2018-04-01 13:48:15 +09:00
|
|
|
use std::iter::{FromIterator, IntoIterator};
|
2018-06-22 23:52:56 +01:00
|
|
|
use std::ops::RangeInclusive;
|
2016-09-24 18:24:34 +03:00
|
|
|
|
2020-10-01 09:24:44 +02:00
|
|
|
crate fn expand_pattern<'tcx>(pat: Pat<'tcx>) -> Pat<'tcx> {
|
|
|
|
LiteralExpander.fold_pattern(&pat)
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
2020-10-01 09:24:44 +02:00
|
|
|
struct LiteralExpander;
|
2018-12-05 18:31:49 +01:00
|
|
|
|
2020-10-01 09:24:44 +02:00
|
|
|
impl<'tcx> PatternFolder<'tcx> for LiteralExpander {
|
2019-09-26 18:51:59 +01:00
|
|
|
fn fold_pattern(&mut self, pat: &Pat<'tcx>) -> Pat<'tcx> {
|
2020-08-03 00:49:11 +02:00
|
|
|
debug!("fold_pattern {:?} {:?} {:?}", pat, pat.ty.kind(), pat.kind);
|
|
|
|
match (pat.ty.kind(), &*pat.kind) {
|
2019-09-21 13:49:14 +02:00
|
|
|
(_, &PatKind::Binding { subpattern: Some(ref s), .. }) => s.fold_with(self),
|
2019-11-17 23:05:50 +00:00
|
|
|
(_, &PatKind::AscribeUserType { subpattern: ref s, .. }) => s.fold_with(self),
|
2019-09-21 13:49:14 +02:00
|
|
|
_ => pat.super_fold_with(self),
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2019-09-26 18:51:59 +01:00
|
|
|
impl<'tcx> Pat<'tcx> {
|
2019-12-04 16:26:30 +00:00
|
|
|
pub(super) fn is_wildcard(&self) -> bool {
|
2016-09-26 02:53:26 +03:00
|
|
|
match *self.kind {
|
2019-09-21 13:49:14 +02:00
|
|
|
PatKind::Binding { subpattern: None, .. } | PatKind::Wild => true,
|
|
|
|
_ => false,
|
2016-09-26 02:53:26 +03:00
|
|
|
}
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2019-11-01 15:44:58 +00:00
|
|
|
/// A row of a matrix. Rows of len 1 are very common, which is why `SmallVec[_; 2]`
|
|
|
|
/// works well.
|
2020-09-22 14:24:55 +09:00
|
|
|
#[derive(Debug, Clone, PartialEq)]
|
2020-01-05 15:46:44 +00:00
|
|
|
crate struct PatStack<'p, 'tcx>(SmallVec<[&'p Pat<'tcx>; 2]>);
|
2019-11-01 15:44:58 +00:00
|
|
|
|
|
|
|
impl<'p, 'tcx> PatStack<'p, 'tcx> {
|
2020-01-05 15:46:44 +00:00
|
|
|
crate fn from_pattern(pat: &'p Pat<'tcx>) -> Self {
|
2019-11-01 15:44:58 +00:00
|
|
|
PatStack(smallvec![pat])
|
|
|
|
}
|
|
|
|
|
|
|
|
fn from_vec(vec: SmallVec<[&'p Pat<'tcx>; 2]>) -> Self {
|
|
|
|
PatStack(vec)
|
|
|
|
}
|
|
|
|
|
|
|
|
fn from_slice(s: &[&'p Pat<'tcx>]) -> Self {
|
|
|
|
PatStack(SmallVec::from_slice(s))
|
|
|
|
}
|
|
|
|
|
|
|
|
fn is_empty(&self) -> bool {
|
|
|
|
self.0.is_empty()
|
|
|
|
}
|
|
|
|
|
|
|
|
fn len(&self) -> usize {
|
|
|
|
self.0.len()
|
|
|
|
}
|
|
|
|
|
|
|
|
fn head(&self) -> &'p Pat<'tcx> {
|
|
|
|
self.0[0]
|
|
|
|
}
|
|
|
|
|
|
|
|
fn to_tail(&self) -> Self {
|
|
|
|
PatStack::from_slice(&self.0[1..])
|
|
|
|
}
|
|
|
|
|
|
|
|
fn iter(&self) -> impl Iterator<Item = &Pat<'tcx>> {
|
2020-02-29 13:14:52 +01:00
|
|
|
self.0.iter().copied()
|
2019-11-01 15:44:58 +00:00
|
|
|
}
|
2019-11-01 16:33:34 +00:00
|
|
|
|
2019-11-21 18:45:28 +00:00
|
|
|
// If the first pattern is an or-pattern, expand this pattern. Otherwise, return `None`.
|
2019-11-25 18:23:09 +00:00
|
|
|
fn expand_or_pat(&self) -> Option<Vec<Self>> {
|
2019-11-21 18:45:28 +00:00
|
|
|
if self.is_empty() {
|
|
|
|
None
|
|
|
|
} else if let PatKind::Or { pats } = &*self.head().kind {
|
|
|
|
Some(
|
|
|
|
pats.iter()
|
|
|
|
.map(|pat| {
|
|
|
|
let mut new_patstack = PatStack::from_pattern(pat);
|
|
|
|
new_patstack.0.extend_from_slice(&self.0[1..]);
|
|
|
|
new_patstack
|
|
|
|
})
|
|
|
|
.collect(),
|
|
|
|
)
|
|
|
|
} else {
|
|
|
|
None
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2019-11-01 16:33:34 +00:00
|
|
|
/// This computes `D(self)`. See top of the file for explanations.
|
|
|
|
fn specialize_wildcard(&self) -> Option<Self> {
|
|
|
|
if self.head().is_wildcard() { Some(self.to_tail()) } else { None }
|
|
|
|
}
|
|
|
|
|
|
|
|
/// This computes `S(constructor, self)`. See top of the file for explanations.
|
2020-10-25 23:03:15 +00:00
|
|
|
///
|
|
|
|
/// This is the main specialization step. It expands the pattern
|
|
|
|
/// into `arity` patterns based on the constructor. For most patterns, the step is trivial,
|
|
|
|
/// for instance tuple patterns are flattened and box patterns expand into their inner pattern.
|
|
|
|
/// Returns `None` if the pattern does not have the given constructor.
|
|
|
|
///
|
|
|
|
/// OTOH, slice patterns with a subslice pattern (tail @ ..) can be expanded into multiple
|
|
|
|
/// different patterns.
|
|
|
|
/// Structure patterns with a partial wild pattern (Foo { a: 42, .. }) have their missing
|
|
|
|
/// fields filled with wild patterns.
|
|
|
|
///
|
|
|
|
/// This is roughly the inverse of `Constructor::apply`.
|
2019-11-28 13:03:02 +00:00
|
|
|
fn specialize_constructor(
|
2019-11-01 16:33:34 +00:00
|
|
|
&self,
|
2020-10-25 21:59:59 +00:00
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
2020-10-25 23:03:15 +00:00
|
|
|
ctor: &Constructor<'tcx>,
|
2020-05-09 11:32:54 +01:00
|
|
|
ctor_wild_subpatterns: &Fields<'p, 'tcx>,
|
2020-10-18 13:48:54 +01:00
|
|
|
is_my_head_ctor: bool,
|
2019-11-28 13:03:02 +00:00
|
|
|
) -> Option<PatStack<'p, 'tcx>> {
|
2020-10-25 23:03:15 +00:00
|
|
|
// We return `None` if `ctor` is not covered by `self.head()`. If `ctor` is known to be
|
|
|
|
// derived from `self.head()`, or if `self.head()` is a wildcard, then we don't need to
|
|
|
|
// check; otherwise, we compute the constructor of `self.head()` and check for constructor
|
|
|
|
// inclusion.
|
|
|
|
// Note that this shortcut is also necessary for correctness: a pattern should always be
|
|
|
|
// specializable with its own constructor, even in cases where we refuse to inspect values like
|
|
|
|
// opaque constants.
|
|
|
|
if !self.head().is_wildcard() && !is_my_head_ctor {
|
|
|
|
// `unwrap` is safe because `pat` is not a wildcard.
|
|
|
|
let head_ctor = pat_constructor(cx.tcx, cx.param_env, self.head()).unwrap();
|
|
|
|
if !ctor.is_covered_by(cx, &head_ctor, self.head().ty) {
|
|
|
|
return None;
|
|
|
|
}
|
|
|
|
}
|
|
|
|
let new_fields = ctor_wild_subpatterns.replace_with_pattern_arguments(self.head());
|
|
|
|
|
|
|
|
debug!(
|
|
|
|
"specialize_constructor({:#?}, {:#?}, {:#?}) = {:#?}",
|
2020-10-18 13:48:54 +01:00
|
|
|
self.head(),
|
2020-10-25 23:03:15 +00:00
|
|
|
ctor,
|
2020-10-18 13:48:54 +01:00
|
|
|
ctor_wild_subpatterns,
|
2020-10-25 23:03:15 +00:00
|
|
|
new_fields
|
|
|
|
);
|
|
|
|
|
|
|
|
// We pop the head pattern and push the new fields extracted from the arguments of
|
|
|
|
// `self.head()`.
|
2020-05-09 12:00:34 +01:00
|
|
|
Some(new_fields.push_on_patstack(&self.0[1..]))
|
2019-11-01 16:33:34 +00:00
|
|
|
}
|
2019-11-01 15:44:58 +00:00
|
|
|
}
|
|
|
|
|
|
|
|
impl<'p, 'tcx> Default for PatStack<'p, 'tcx> {
|
|
|
|
fn default() -> Self {
|
|
|
|
PatStack(smallvec![])
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
impl<'p, 'tcx> FromIterator<&'p Pat<'tcx>> for PatStack<'p, 'tcx> {
|
|
|
|
fn from_iter<T>(iter: T) -> Self
|
|
|
|
where
|
|
|
|
T: IntoIterator<Item = &'p Pat<'tcx>>,
|
|
|
|
{
|
|
|
|
PatStack(iter.into_iter().collect())
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2020-09-19 22:00:10 +09:00
|
|
|
/// Depending on the match patterns, the specialization process might be able to use a fast path.
|
|
|
|
/// Tracks whether we can use the fast path and the lookup table needed in those cases.
|
2020-09-22 14:24:55 +09:00
|
|
|
#[derive(Clone, Debug, PartialEq)]
|
2020-09-19 22:00:10 +09:00
|
|
|
enum SpecializationCache {
|
|
|
|
/// Patterns consist of only enum variants.
|
2020-09-21 20:29:12 +09:00
|
|
|
/// Variant patterns does not intersect with each other (in contrast to range patterns),
|
|
|
|
/// so it is possible to precompute the result of `Matrix::specialize_constructor` at a
|
|
|
|
/// lower computational complexity.
|
|
|
|
/// `lookup` is responsible for holding the precomputed result of
|
|
|
|
/// `Matrix::specialize_constructor`, while `wilds` is used for two purposes: the first one is
|
|
|
|
/// the precomputed result of `Matrix::specialize_wildcard`, and the second is to be used as a
|
|
|
|
/// fallback for `Matrix::specialize_constructor` when it tries to apply a constructor that
|
|
|
|
/// has not been seen in the `Matrix`. See `update_cache` for further explanations.
|
2020-09-19 22:00:10 +09:00
|
|
|
Variants { lookup: FxHashMap<DefId, SmallVec<[usize; 1]>>, wilds: SmallVec<[usize; 1]> },
|
|
|
|
/// Does not belong to the cases above, use the slow path.
|
|
|
|
Incompatible,
|
|
|
|
}
|
|
|
|
|
2019-11-01 15:44:58 +00:00
|
|
|
/// A 2D matrix.
|
2020-09-22 14:24:55 +09:00
|
|
|
#[derive(Clone, PartialEq)]
|
2020-09-19 22:00:10 +09:00
|
|
|
crate struct Matrix<'p, 'tcx> {
|
|
|
|
patterns: Vec<PatStack<'p, 'tcx>>,
|
|
|
|
cache: SpecializationCache,
|
|
|
|
}
|
2016-09-24 18:24:34 +03:00
|
|
|
|
2018-11-28 13:38:46 +11:00
|
|
|
impl<'p, 'tcx> Matrix<'p, 'tcx> {
|
2020-01-05 15:46:44 +00:00
|
|
|
crate fn empty() -> Self {
|
2020-09-21 20:29:12 +09:00
|
|
|
// Use `SpecializationCache::Incompatible` as a placeholder; we will initialize it on the
|
|
|
|
// first call to `push`. See the first half of `update_cache`.
|
2020-09-19 22:00:10 +09:00
|
|
|
Matrix { patterns: vec![], cache: SpecializationCache::Incompatible }
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
2019-11-21 18:45:28 +00:00
|
|
|
/// Pushes a new row to the matrix. If the row starts with an or-pattern, this expands it.
|
2020-01-05 15:46:44 +00:00
|
|
|
crate fn push(&mut self, row: PatStack<'p, 'tcx>) {
|
2019-11-21 18:45:28 +00:00
|
|
|
if let Some(rows) = row.expand_or_pat() {
|
2020-03-10 16:20:47 +01:00
|
|
|
for row in rows {
|
|
|
|
// We recursively expand the or-patterns of the new rows.
|
|
|
|
// This is necessary as we might have `0 | (1 | 2)` or e.g., `x @ 0 | x @ (1 | 2)`.
|
|
|
|
self.push(row)
|
|
|
|
}
|
2019-11-21 18:45:28 +00:00
|
|
|
} else {
|
2020-09-21 20:29:12 +09:00
|
|
|
self.patterns.push(row);
|
|
|
|
self.update_cache(self.patterns.len() - 1);
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
fn update_cache(&mut self, idx: usize) {
|
|
|
|
let row = &self.patterns[idx];
|
|
|
|
// We don't know which kind of cache could be used until we see the first row; therefore an
|
|
|
|
// empty `Matrix` is initialized with `SpecializationCache::Empty`, then the cache is
|
|
|
|
// assigned the appropriate variant below on the first call to `push`.
|
|
|
|
if self.patterns.is_empty() {
|
|
|
|
self.cache = if row.is_empty() {
|
|
|
|
SpecializationCache::Incompatible
|
|
|
|
} else {
|
|
|
|
match *row.head().kind {
|
|
|
|
PatKind::Variant { .. } => SpecializationCache::Variants {
|
|
|
|
lookup: FxHashMap::default(),
|
|
|
|
wilds: SmallVec::new(),
|
|
|
|
},
|
|
|
|
// Note: If the first pattern is a wildcard, then all patterns after that is not
|
|
|
|
// useful. The check is simple enough so we treat it as the same as unsupported
|
|
|
|
// patterns.
|
|
|
|
_ => SpecializationCache::Incompatible,
|
|
|
|
}
|
|
|
|
};
|
|
|
|
}
|
|
|
|
// Update the cache.
|
|
|
|
match &mut self.cache {
|
|
|
|
SpecializationCache::Variants { ref mut lookup, ref mut wilds } => {
|
|
|
|
let head = row.head();
|
|
|
|
match *head.kind {
|
|
|
|
_ if head.is_wildcard() => {
|
|
|
|
// Per rule 1.3 in the top-level comments, a wildcard pattern is included in
|
|
|
|
// the result of `specialize_constructor` for *any* `Constructor`.
|
|
|
|
// We push the wildcard pattern to the precomputed result for constructors
|
|
|
|
// that we have seen before; results for constructors we have not yet seen
|
|
|
|
// defaults to `wilds`, which is updated right below.
|
|
|
|
for (_, v) in lookup.iter_mut() {
|
|
|
|
v.push(idx);
|
2020-09-19 22:00:10 +09:00
|
|
|
}
|
2020-09-21 20:29:12 +09:00
|
|
|
// Per rule 2.1 and 2.2 in the top-level comments, only wildcard patterns
|
|
|
|
// are included in the result of `specialize_wildcard`.
|
|
|
|
// What we do here is to track the wildcards we have seen; so in addition to
|
|
|
|
// acting as the precomputed result of `specialize_wildcard`, `wilds` also
|
|
|
|
// serves as the default value of `specialize_constructor` for constructors
|
|
|
|
// that are not in `lookup`.
|
|
|
|
wilds.push(idx);
|
|
|
|
}
|
|
|
|
PatKind::Variant { adt_def, variant_index, .. } => {
|
|
|
|
// Handle the cases of rule 1.1 and 1.2 in the top-level comments.
|
|
|
|
// A variant pattern can only be included in the results of
|
|
|
|
// `specialize_constructor` for a particular constructor, therefore we are
|
|
|
|
// using a HashMap to track that.
|
|
|
|
lookup
|
|
|
|
.entry(adt_def.variants[variant_index].def_id)
|
|
|
|
// Default to `wilds` for absent keys. See above for an explanation.
|
|
|
|
.or_insert_with(|| wilds.clone())
|
|
|
|
.push(idx);
|
|
|
|
}
|
|
|
|
_ => {
|
|
|
|
self.cache = SpecializationCache::Incompatible;
|
2020-09-19 22:00:10 +09:00
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
2020-09-21 20:29:12 +09:00
|
|
|
SpecializationCache::Incompatible => {}
|
2019-11-21 18:45:28 +00:00
|
|
|
}
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
2019-11-01 16:33:34 +00:00
|
|
|
|
2019-09-23 16:07:23 +02:00
|
|
|
/// Iterate over the first component of each row
|
|
|
|
fn heads<'a>(&'a self) -> impl Iterator<Item = &'a Pat<'tcx>> + Captures<'p> {
|
2020-09-19 22:00:10 +09:00
|
|
|
self.patterns.iter().map(|r| r.head())
|
2019-09-23 16:07:23 +02:00
|
|
|
}
|
|
|
|
|
2019-11-01 16:33:34 +00:00
|
|
|
/// This computes `D(self)`. See top of the file for explanations.
|
|
|
|
fn specialize_wildcard(&self) -> Self {
|
2020-09-19 22:00:10 +09:00
|
|
|
match &self.cache {
|
|
|
|
SpecializationCache::Variants { wilds, .. } => {
|
2020-09-22 14:24:55 +09:00
|
|
|
let result =
|
|
|
|
wilds.iter().filter_map(|&i| self.patterns[i].specialize_wildcard()).collect();
|
|
|
|
// When debug assertions are enabled, check the results against the "slow path"
|
|
|
|
// result.
|
|
|
|
debug_assert_eq!(
|
|
|
|
result,
|
|
|
|
Self {
|
|
|
|
patterns: self.patterns.clone(),
|
|
|
|
cache: SpecializationCache::Incompatible
|
|
|
|
}
|
|
|
|
.specialize_wildcard()
|
|
|
|
);
|
|
|
|
result
|
2020-09-19 22:00:10 +09:00
|
|
|
}
|
|
|
|
SpecializationCache::Incompatible => {
|
|
|
|
self.patterns.iter().filter_map(|r| r.specialize_wildcard()).collect()
|
|
|
|
}
|
|
|
|
}
|
2019-11-01 16:33:34 +00:00
|
|
|
}
|
|
|
|
|
|
|
|
/// This computes `S(constructor, self)`. See top of the file for explanations.
|
2019-11-28 13:03:02 +00:00
|
|
|
fn specialize_constructor(
|
2019-11-01 16:33:34 +00:00
|
|
|
&self,
|
2020-10-25 21:59:59 +00:00
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
2019-11-01 16:33:34 +00:00
|
|
|
constructor: &Constructor<'tcx>,
|
2020-05-09 11:32:54 +01:00
|
|
|
ctor_wild_subpatterns: &Fields<'p, 'tcx>,
|
2019-11-28 13:03:02 +00:00
|
|
|
) -> Matrix<'p, 'tcx> {
|
2020-09-19 22:00:10 +09:00
|
|
|
match &self.cache {
|
|
|
|
SpecializationCache::Variants { lookup, wilds } => {
|
2020-09-22 14:24:55 +09:00
|
|
|
let result: Self = if let Constructor::Variant(id) = constructor {
|
2020-09-19 22:00:10 +09:00
|
|
|
lookup
|
|
|
|
.get(id)
|
2020-09-21 20:29:12 +09:00
|
|
|
// Default to `wilds` for absent keys. See `update_cache` for an explanation.
|
2020-09-19 22:00:10 +09:00
|
|
|
.unwrap_or(&wilds)
|
|
|
|
.iter()
|
|
|
|
.filter_map(|&i| {
|
|
|
|
self.patterns[i].specialize_constructor(
|
|
|
|
cx,
|
|
|
|
constructor,
|
|
|
|
ctor_wild_subpatterns,
|
2020-10-18 13:48:54 +01:00
|
|
|
false,
|
2020-09-19 22:00:10 +09:00
|
|
|
)
|
|
|
|
})
|
|
|
|
.collect()
|
|
|
|
} else {
|
|
|
|
unreachable!()
|
2020-09-22 14:24:55 +09:00
|
|
|
};
|
|
|
|
// When debug assertions are enabled, check the results against the "slow path"
|
|
|
|
// result.
|
|
|
|
debug_assert_eq!(
|
|
|
|
result,
|
|
|
|
Matrix {
|
|
|
|
patterns: self.patterns.clone(),
|
|
|
|
cache: SpecializationCache::Incompatible
|
|
|
|
}
|
|
|
|
.specialize_constructor(
|
|
|
|
cx,
|
|
|
|
constructor,
|
|
|
|
ctor_wild_subpatterns
|
|
|
|
)
|
|
|
|
);
|
|
|
|
result
|
2020-09-19 22:00:10 +09:00
|
|
|
}
|
|
|
|
SpecializationCache::Incompatible => self
|
|
|
|
.patterns
|
|
|
|
.iter()
|
2020-10-18 13:48:54 +01:00
|
|
|
.filter_map(|r| {
|
|
|
|
r.specialize_constructor(cx, constructor, ctor_wild_subpatterns, false)
|
|
|
|
})
|
2020-09-19 22:00:10 +09:00
|
|
|
.collect(),
|
|
|
|
}
|
2019-11-01 16:33:34 +00:00
|
|
|
}
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
|
|
|
/// Pretty-printer for matrices of patterns, example:
|
2020-05-01 22:28:15 +02:00
|
|
|
///
|
|
|
|
/// ```text
|
2019-09-26 20:47:05 +02:00
|
|
|
/// +++++++++++++++++++++++++++++
|
|
|
|
/// + _ + [] +
|
|
|
|
/// +++++++++++++++++++++++++++++
|
|
|
|
/// + true + [First] +
|
|
|
|
/// +++++++++++++++++++++++++++++
|
|
|
|
/// + true + [Second(true)] +
|
|
|
|
/// +++++++++++++++++++++++++++++
|
|
|
|
/// + false + [_] +
|
|
|
|
/// +++++++++++++++++++++++++++++
|
|
|
|
/// + _ + [_, _, tail @ ..] +
|
|
|
|
/// +++++++++++++++++++++++++++++
|
2020-10-17 20:11:30 +01:00
|
|
|
/// ```
|
2018-11-28 13:38:46 +11:00
|
|
|
impl<'p, 'tcx> fmt::Debug for Matrix<'p, 'tcx> {
|
2019-02-08 06:28:15 +09:00
|
|
|
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
|
2016-09-24 18:24:34 +03:00
|
|
|
write!(f, "\n")?;
|
|
|
|
|
2020-09-19 22:00:10 +09:00
|
|
|
let Matrix { patterns: m, .. } = self;
|
2019-09-21 13:49:14 +02:00
|
|
|
let pretty_printed_matrix: Vec<Vec<String>> =
|
|
|
|
m.iter().map(|row| row.iter().map(|pat| format!("{:?}", pat)).collect()).collect();
|
2016-09-24 18:24:34 +03:00
|
|
|
|
|
|
|
let column_count = m.iter().map(|row| row.len()).max().unwrap_or(0);
|
|
|
|
assert!(m.iter().all(|row| row.len() == column_count));
|
2019-09-21 13:49:14 +02:00
|
|
|
let column_widths: Vec<usize> = (0..column_count)
|
|
|
|
.map(|col| pretty_printed_matrix.iter().map(|row| row[col].len()).max().unwrap_or(0))
|
|
|
|
.collect();
|
2016-09-24 18:24:34 +03:00
|
|
|
|
|
|
|
let total_width = column_widths.iter().cloned().sum::<usize>() + column_count * 3 + 1;
|
2018-04-01 13:48:15 +09:00
|
|
|
let br = "+".repeat(total_width);
|
2016-09-24 18:24:34 +03:00
|
|
|
write!(f, "{}\n", br)?;
|
|
|
|
for row in pretty_printed_matrix {
|
|
|
|
write!(f, "+")?;
|
|
|
|
for (column, pat_str) in row.into_iter().enumerate() {
|
|
|
|
write!(f, " ")?;
|
|
|
|
write!(f, "{:1$}", pat_str, column_widths[column])?;
|
|
|
|
write!(f, " +")?;
|
|
|
|
}
|
|
|
|
write!(f, "\n")?;
|
|
|
|
write!(f, "{}\n", br)?;
|
|
|
|
}
|
|
|
|
Ok(())
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2019-11-01 15:44:58 +00:00
|
|
|
impl<'p, 'tcx> FromIterator<PatStack<'p, 'tcx>> for Matrix<'p, 'tcx> {
|
2018-11-28 13:38:46 +11:00
|
|
|
fn from_iter<T>(iter: T) -> Self
|
2019-09-21 13:49:14 +02:00
|
|
|
where
|
2019-11-01 15:44:58 +00:00
|
|
|
T: IntoIterator<Item = PatStack<'p, 'tcx>>,
|
2016-09-24 18:24:34 +03:00
|
|
|
{
|
2019-11-21 18:45:28 +00:00
|
|
|
let mut matrix = Matrix::empty();
|
|
|
|
for x in iter {
|
|
|
|
// Using `push` ensures we correctly expand or-patterns.
|
|
|
|
matrix.push(x);
|
|
|
|
}
|
|
|
|
matrix
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2020-01-05 15:46:44 +00:00
|
|
|
crate struct MatchCheckCtxt<'a, 'tcx> {
|
|
|
|
crate tcx: TyCtxt<'tcx>,
|
2016-12-29 17:08:33 +08:00
|
|
|
/// The module in which the match occurs. This is necessary for
|
2016-11-29 15:10:26 +08:00
|
|
|
/// checking inhabited-ness of types because whether a type is (visibly)
|
|
|
|
/// inhabited can depend on whether it was defined in the current module or
|
2019-02-08 14:53:55 +01:00
|
|
|
/// not. E.g., `struct Foo { _private: ! }` cannot be seen to be empty
|
2016-12-29 17:08:33 +08:00
|
|
|
/// outside it's module and should not be matchable with an empty match
|
|
|
|
/// statement.
|
2020-01-05 15:46:44 +00:00
|
|
|
crate module: DefId,
|
2020-05-08 07:42:49 +10:00
|
|
|
crate param_env: ty::ParamEnv<'tcx>,
|
2020-01-05 15:46:44 +00:00
|
|
|
crate pattern_arena: &'a TypedArena<Pat<'tcx>>,
|
2016-09-26 02:53:26 +03:00
|
|
|
}
|
|
|
|
|
|
|
|
impl<'a, 'tcx> MatchCheckCtxt<'a, 'tcx> {
|
2017-02-20 19:18:31 +02:00
|
|
|
fn is_uninhabited(&self, ty: Ty<'tcx>) -> bool {
|
2018-01-21 16:44:41 +08:00
|
|
|
if self.tcx.features().exhaustive_patterns {
|
2020-03-16 18:51:55 +01:00
|
|
|
self.tcx.is_ty_uninhabited_from(self.module, ty, self.param_env)
|
2017-02-20 19:18:31 +02:00
|
|
|
} else {
|
|
|
|
false
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2020-05-09 13:36:48 +01:00
|
|
|
/// Returns whether the given type is an enum from another crate declared `#[non_exhaustive]`.
|
2020-01-05 15:46:44 +00:00
|
|
|
crate fn is_foreign_non_exhaustive_enum(&self, ty: Ty<'tcx>) -> bool {
|
2020-08-03 00:49:11 +02:00
|
|
|
match ty.kind() {
|
2019-11-30 16:00:44 +00:00
|
|
|
ty::Adt(def, ..) => {
|
|
|
|
def.is_enum() && def.is_variant_list_non_exhaustive() && !def.did.is_local()
|
|
|
|
}
|
|
|
|
_ => false,
|
|
|
|
}
|
|
|
|
}
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
2019-11-17 17:33:39 +00:00
|
|
|
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
|
2019-11-17 15:54:44 +00:00
|
|
|
enum SliceKind {
|
2019-11-17 17:33:39 +00:00
|
|
|
/// Patterns of length `n` (`[x, y]`).
|
2019-11-17 15:54:44 +00:00
|
|
|
FixedLen(u64),
|
2019-12-20 11:54:27 +01:00
|
|
|
/// Patterns using the `..` notation (`[x, .., y]`).
|
|
|
|
/// Captures any array constructor of `length >= i + j`.
|
|
|
|
/// In the case where `array_len` is `Some(_)`,
|
|
|
|
/// this indicates that we only care about the first `i` and the last `j` values of the array,
|
|
|
|
/// and everything in between is a wildcard `_`.
|
2019-11-17 15:54:44 +00:00
|
|
|
VarLen(u64, u64),
|
|
|
|
}
|
|
|
|
|
2019-11-17 19:08:01 +00:00
|
|
|
impl SliceKind {
|
2019-11-17 17:33:39 +00:00
|
|
|
fn arity(self) -> u64 {
|
2019-11-17 19:08:01 +00:00
|
|
|
match self {
|
2019-11-17 15:54:44 +00:00
|
|
|
FixedLen(length) => length,
|
|
|
|
VarLen(prefix, suffix) => prefix + suffix,
|
|
|
|
}
|
|
|
|
}
|
2019-11-17 17:33:39 +00:00
|
|
|
|
|
|
|
/// Whether this pattern includes patterns of length `other_len`.
|
|
|
|
fn covers_length(self, other_len: u64) -> bool {
|
2019-11-17 19:08:01 +00:00
|
|
|
match self {
|
2019-11-17 17:33:39 +00:00
|
|
|
FixedLen(len) => len == other_len,
|
|
|
|
VarLen(prefix, suffix) => prefix + suffix <= other_len,
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
/// Returns a collection of slices that spans the values covered by `self`, subtracted by the
|
|
|
|
/// values covered by `other`: i.e., `self \ other` (in set notation).
|
|
|
|
fn subtract(self, other: Self) -> SmallVec<[Self; 1]> {
|
|
|
|
// Remember, `VarLen(i, j)` covers the union of `FixedLen` from `i + j` to infinity.
|
|
|
|
// Naming: we remove the "neg" constructors from the "pos" ones.
|
2019-11-17 19:08:01 +00:00
|
|
|
match self {
|
2019-11-17 17:33:39 +00:00
|
|
|
FixedLen(pos_len) => {
|
|
|
|
if other.covers_length(pos_len) {
|
|
|
|
smallvec![]
|
|
|
|
} else {
|
|
|
|
smallvec![self]
|
|
|
|
}
|
|
|
|
}
|
|
|
|
VarLen(pos_prefix, pos_suffix) => {
|
|
|
|
let pos_len = pos_prefix + pos_suffix;
|
2019-11-17 19:08:01 +00:00
|
|
|
match other {
|
2019-11-17 17:33:39 +00:00
|
|
|
FixedLen(neg_len) => {
|
|
|
|
if neg_len < pos_len {
|
|
|
|
smallvec![self]
|
|
|
|
} else {
|
|
|
|
(pos_len..neg_len)
|
|
|
|
.map(FixedLen)
|
|
|
|
// We know that `neg_len + 1 >= pos_len >= pos_suffix`.
|
|
|
|
.chain(Some(VarLen(neg_len + 1 - pos_suffix, pos_suffix)))
|
|
|
|
.collect()
|
|
|
|
}
|
|
|
|
}
|
|
|
|
VarLen(neg_prefix, neg_suffix) => {
|
|
|
|
let neg_len = neg_prefix + neg_suffix;
|
|
|
|
if neg_len <= pos_len {
|
|
|
|
smallvec![]
|
|
|
|
} else {
|
2019-11-17 19:08:01 +00:00
|
|
|
(pos_len..neg_len).map(FixedLen).collect()
|
2019-11-17 17:33:39 +00:00
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
2019-11-17 15:54:44 +00:00
|
|
|
}
|
|
|
|
|
2019-11-17 19:08:01 +00:00
|
|
|
/// A constructor for array and slice patterns.
|
|
|
|
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
|
|
|
|
struct Slice {
|
|
|
|
/// `None` if the matched value is a slice, `Some(n)` if it is an array of size `n`.
|
|
|
|
array_len: Option<u64>,
|
|
|
|
/// The kind of pattern it is: fixed-length `[x, y]` or variable length `[x, .., y]`.
|
|
|
|
kind: SliceKind,
|
|
|
|
}
|
|
|
|
|
|
|
|
impl Slice {
|
|
|
|
/// Returns what patterns this constructor covers: either fixed-length patterns or
|
|
|
|
/// variable-length patterns.
|
|
|
|
fn pattern_kind(self) -> SliceKind {
|
|
|
|
match self {
|
|
|
|
Slice { array_len: Some(len), kind: VarLen(prefix, suffix) }
|
|
|
|
if prefix + suffix == len =>
|
|
|
|
{
|
|
|
|
FixedLen(len)
|
|
|
|
}
|
|
|
|
_ => self.kind,
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
/// Returns what values this constructor covers: either values of only one given length, or
|
|
|
|
/// values of length above a given length.
|
|
|
|
/// This is different from `pattern_kind()` because in some cases the pattern only takes into
|
|
|
|
/// account a subset of the entries of the array, but still only captures values of a given
|
|
|
|
/// length.
|
|
|
|
fn value_kind(self) -> SliceKind {
|
|
|
|
match self {
|
|
|
|
Slice { array_len: Some(len), kind: VarLen(_, _) } => FixedLen(len),
|
|
|
|
_ => self.kind,
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
fn arity(self) -> u64 {
|
|
|
|
self.pattern_kind().arity()
|
|
|
|
}
|
2020-10-25 21:59:59 +00:00
|
|
|
|
|
|
|
/// The exhaustiveness-checking paper does not include any details on
|
|
|
|
/// checking variable-length slice patterns. However, they are matched
|
|
|
|
/// by an infinite collection of fixed-length array patterns.
|
|
|
|
///
|
|
|
|
/// Checking the infinite set directly would take an infinite amount
|
|
|
|
/// of time. However, it turns out that for each finite set of
|
|
|
|
/// patterns `P`, all sufficiently large array lengths are equivalent:
|
|
|
|
///
|
|
|
|
/// Each slice `s` with a "sufficiently-large" length `l ≥ L` that applies
|
|
|
|
/// to exactly the subset `Pₜ` of `P` can be transformed to a slice
|
|
|
|
/// `sₘ` for each sufficiently-large length `m` that applies to exactly
|
|
|
|
/// the same subset of `P`.
|
|
|
|
///
|
|
|
|
/// Because of that, each witness for reachability-checking from one
|
|
|
|
/// of the sufficiently-large lengths can be transformed to an
|
|
|
|
/// equally-valid witness from any other length, so we only have
|
|
|
|
/// to check slice lengths from the "minimal sufficiently-large length"
|
|
|
|
/// and below.
|
|
|
|
///
|
|
|
|
/// Note that the fact that there is a *single* `sₘ` for each `m`
|
|
|
|
/// not depending on the specific pattern in `P` is important: if
|
|
|
|
/// you look at the pair of patterns
|
|
|
|
/// `[true, ..]`
|
|
|
|
/// `[.., false]`
|
|
|
|
/// Then any slice of length ≥1 that matches one of these two
|
|
|
|
/// patterns can be trivially turned to a slice of any
|
|
|
|
/// other length ≥1 that matches them and vice-versa - for
|
|
|
|
/// but the slice from length 2 `[false, true]` that matches neither
|
|
|
|
/// of these patterns can't be turned to a slice from length 1 that
|
|
|
|
/// matches neither of these patterns, so we have to consider
|
|
|
|
/// slices from length 2 there.
|
|
|
|
///
|
|
|
|
/// Now, to see that that length exists and find it, observe that slice
|
|
|
|
/// patterns are either "fixed-length" patterns (`[_, _, _]`) or
|
|
|
|
/// "variable-length" patterns (`[_, .., _]`).
|
|
|
|
///
|
|
|
|
/// For fixed-length patterns, all slices with lengths *longer* than
|
|
|
|
/// the pattern's length have the same outcome (of not matching), so
|
|
|
|
/// as long as `L` is greater than the pattern's length we can pick
|
|
|
|
/// any `sₘ` from that length and get the same result.
|
|
|
|
///
|
|
|
|
/// For variable-length patterns, the situation is more complicated,
|
|
|
|
/// because as seen above the precise value of `sₘ` matters.
|
|
|
|
///
|
|
|
|
/// However, for each variable-length pattern `p` with a prefix of length
|
|
|
|
/// `plₚ` and suffix of length `slₚ`, only the first `plₚ` and the last
|
|
|
|
/// `slₚ` elements are examined.
|
|
|
|
///
|
|
|
|
/// Therefore, as long as `L` is positive (to avoid concerns about empty
|
|
|
|
/// types), all elements after the maximum prefix length and before
|
|
|
|
/// the maximum suffix length are not examined by any variable-length
|
|
|
|
/// pattern, and therefore can be added/removed without affecting
|
|
|
|
/// them - creating equivalent patterns from any sufficiently-large
|
|
|
|
/// length.
|
|
|
|
///
|
|
|
|
/// Of course, if fixed-length patterns exist, we must be sure
|
|
|
|
/// that our length is large enough to miss them all, so
|
|
|
|
/// we can pick `L = max(max(FIXED_LEN)+1, max(PREFIX_LEN) + max(SUFFIX_LEN))`
|
|
|
|
///
|
|
|
|
/// for example, with the above pair of patterns, all elements
|
|
|
|
/// but the first and last can be added/removed, so any
|
|
|
|
/// witness of length ≥2 (say, `[false, false, true]`) can be
|
|
|
|
/// turned to a witness from any other length ≥2.
|
|
|
|
fn split<'p, 'tcx>(
|
|
|
|
self,
|
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
|
|
|
matrix: &Matrix<'p, 'tcx>,
|
|
|
|
) -> SmallVec<[Constructor<'tcx>; 1]> {
|
|
|
|
let (array_len, self_prefix, self_suffix) = match self {
|
|
|
|
Slice { array_len, kind: VarLen(self_prefix, self_suffix) } => {
|
|
|
|
(array_len, self_prefix, self_suffix)
|
|
|
|
}
|
|
|
|
_ => return smallvec![Slice(self)],
|
|
|
|
};
|
|
|
|
|
|
|
|
let head_ctors =
|
|
|
|
matrix.heads().filter_map(|pat| pat_constructor(cx.tcx, cx.param_env, pat));
|
|
|
|
|
|
|
|
let mut max_prefix_len = self_prefix;
|
|
|
|
let mut max_suffix_len = self_suffix;
|
|
|
|
let mut max_fixed_len = 0;
|
|
|
|
|
|
|
|
for ctor in head_ctors {
|
|
|
|
if let Slice(slice) = ctor {
|
|
|
|
match slice.pattern_kind() {
|
|
|
|
FixedLen(len) => {
|
|
|
|
max_fixed_len = cmp::max(max_fixed_len, len);
|
|
|
|
}
|
|
|
|
VarLen(prefix, suffix) => {
|
|
|
|
max_prefix_len = cmp::max(max_prefix_len, prefix);
|
|
|
|
max_suffix_len = cmp::max(max_suffix_len, suffix);
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
// For diagnostics, we keep the prefix and suffix lengths separate, so in the case
|
|
|
|
// where `max_fixed_len + 1` is the largest, we adapt `max_prefix_len` accordingly,
|
|
|
|
// so that `L = max_prefix_len + max_suffix_len`.
|
|
|
|
if max_fixed_len + 1 >= max_prefix_len + max_suffix_len {
|
|
|
|
// The subtraction can't overflow thanks to the above check.
|
|
|
|
// The new `max_prefix_len` is also guaranteed to be larger than its previous
|
|
|
|
// value.
|
|
|
|
max_prefix_len = max_fixed_len + 1 - max_suffix_len;
|
|
|
|
}
|
|
|
|
|
|
|
|
match array_len {
|
|
|
|
Some(len) => {
|
|
|
|
let kind = if max_prefix_len + max_suffix_len < len {
|
|
|
|
VarLen(max_prefix_len, max_suffix_len)
|
|
|
|
} else {
|
|
|
|
FixedLen(len)
|
|
|
|
};
|
|
|
|
smallvec![Slice(Slice { array_len, kind })]
|
|
|
|
}
|
|
|
|
None => {
|
|
|
|
// `ctor` originally covered the range `(self_prefix +
|
|
|
|
// self_suffix..infinity)`. We now split it into two: lengths smaller than
|
|
|
|
// `max_prefix_len + max_suffix_len` are treated independently as
|
|
|
|
// fixed-lengths slices, and lengths above are captured by a final VarLen
|
|
|
|
// constructor.
|
|
|
|
let smaller_lengths =
|
|
|
|
(self_prefix + self_suffix..max_prefix_len + max_suffix_len).map(FixedLen);
|
|
|
|
let final_slice = VarLen(max_prefix_len, max_suffix_len);
|
|
|
|
smaller_lengths
|
|
|
|
.chain(Some(final_slice))
|
|
|
|
.map(|kind| Slice { array_len, kind })
|
|
|
|
.map(Slice)
|
|
|
|
.collect()
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
2019-11-17 19:08:01 +00:00
|
|
|
}
|
|
|
|
|
2020-05-09 11:32:54 +01:00
|
|
|
/// A value can be decomposed into a constructor applied to some fields. This struct represents
|
|
|
|
/// the constructor. See also `Fields`.
|
2020-05-09 12:46:42 +01:00
|
|
|
///
|
|
|
|
/// `pat_constructor` retrieves the constructor corresponding to a pattern.
|
2020-10-25 23:03:15 +00:00
|
|
|
/// `specialize_constructor` returns the list of fields corresponding to a pattern, given a
|
2020-05-09 12:46:42 +01:00
|
|
|
/// constructor. `Constructor::apply` reconstructs the pattern from a pair of `Constructor` and
|
|
|
|
/// `Fields`.
|
2019-11-09 21:11:59 +00:00
|
|
|
#[derive(Clone, Debug, PartialEq)]
|
2019-03-24 12:09:44 +03:00
|
|
|
enum Constructor<'tcx> {
|
2020-05-09 11:32:54 +01:00
|
|
|
/// The constructor for patterns that have a single constructor, like tuples, struct patterns
|
|
|
|
/// and fixed-length arrays.
|
2016-09-24 18:24:34 +03:00
|
|
|
Single,
|
|
|
|
/// Enum variants.
|
|
|
|
Variant(DefId),
|
2019-11-15 16:39:31 +00:00
|
|
|
/// Ranges of integer literal values (`2`, `2..=5` or `2..5`).
|
|
|
|
IntRange(IntRange<'tcx>),
|
2019-11-15 17:00:38 +00:00
|
|
|
/// Ranges of floating-point literal values (`2.0..=5.2`).
|
|
|
|
FloatRange(&'tcx ty::Const<'tcx>, &'tcx ty::Const<'tcx>, RangeEnd),
|
2020-10-17 23:18:05 +01:00
|
|
|
/// String literals. Strings are not quite the same as `&[u8]` so we treat them separately.
|
|
|
|
Str(&'tcx ty::Const<'tcx>),
|
2019-11-17 15:54:44 +00:00
|
|
|
/// Array and slice patterns.
|
2019-11-17 17:33:39 +00:00
|
|
|
Slice(Slice),
|
2020-10-18 13:48:54 +01:00
|
|
|
/// Constants that must not be matched structurally. They are treated as black
|
|
|
|
/// boxes for the purposes of exhaustiveness: we must not inspect them, and they
|
|
|
|
/// don't count towards making a match exhaustive.
|
|
|
|
Opaque,
|
2019-11-12 12:44:00 +00:00
|
|
|
/// Fake extra constructor for enums that aren't allowed to be matched exhaustively.
|
|
|
|
NonExhaustive,
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
2017-02-15 15:00:20 +02:00
|
|
|
impl<'tcx> Constructor<'tcx> {
|
2020-10-18 13:48:54 +01:00
|
|
|
fn variant_index_for_adt(&self, adt: &'tcx ty::AdtDef) -> VariantIdx {
|
2020-01-05 15:46:44 +00:00
|
|
|
match *self {
|
|
|
|
Variant(id) => adt.variant_index_with_id(id),
|
2019-08-29 16:06:44 -07:00
|
|
|
Single => {
|
2017-11-18 20:24:54 +02:00
|
|
|
assert!(!adt.is_enum());
|
2018-11-01 19:03:38 +01:00
|
|
|
VariantIdx::new(0)
|
2016-09-26 02:53:26 +03:00
|
|
|
}
|
2019-09-21 13:49:14 +02:00
|
|
|
_ => bug!("bad constructor {:?} for adt {:?}", self, adt),
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
}
|
2019-08-29 16:06:44 -07:00
|
|
|
|
2019-10-27 16:58:04 +00:00
|
|
|
// Returns the set of constructors covered by `self` but not by
|
|
|
|
// anything in `other_ctors`.
|
2019-11-09 21:01:26 +00:00
|
|
|
fn subtract_ctors(&self, other_ctors: &Vec<Constructor<'tcx>>) -> Vec<Constructor<'tcx>> {
|
2019-12-03 16:15:25 +00:00
|
|
|
if other_ctors.is_empty() {
|
|
|
|
return vec![self.clone()];
|
|
|
|
}
|
|
|
|
|
2019-11-09 13:35:04 +00:00
|
|
|
match self {
|
2019-11-03 18:23:01 +00:00
|
|
|
// Those constructors can only match themselves.
|
2020-10-18 13:48:54 +01:00
|
|
|
Single | Variant(_) | Str(..) | FloatRange(..) => {
|
2019-11-09 21:46:46 +00:00
|
|
|
if other_ctors.iter().any(|c| c == self) { vec![] } else { vec![self.clone()] }
|
2019-10-27 16:58:04 +00:00
|
|
|
}
|
2019-11-17 19:24:48 +00:00
|
|
|
&Slice(slice) => {
|
|
|
|
let mut other_slices = other_ctors
|
|
|
|
.iter()
|
|
|
|
.filter_map(|c: &Constructor<'_>| match c {
|
|
|
|
Slice(slice) => Some(*slice),
|
|
|
|
_ => bug!("bad slice pattern constructor {:?}", c),
|
|
|
|
})
|
|
|
|
.map(Slice::value_kind);
|
|
|
|
|
|
|
|
match slice.value_kind() {
|
|
|
|
FixedLen(self_len) => {
|
|
|
|
if other_slices.any(|other_slice| other_slice.covers_length(self_len)) {
|
|
|
|
vec![]
|
|
|
|
} else {
|
|
|
|
vec![Slice(slice)]
|
2019-11-17 17:33:39 +00:00
|
|
|
}
|
2019-11-17 15:54:44 +00:00
|
|
|
}
|
2019-11-17 19:24:48 +00:00
|
|
|
kind @ VarLen(..) => {
|
|
|
|
let mut remaining_slices = vec![kind];
|
|
|
|
|
|
|
|
// For each used slice, subtract from the current set of slices.
|
|
|
|
for other_slice in other_slices {
|
|
|
|
remaining_slices = remaining_slices
|
|
|
|
.into_iter()
|
|
|
|
.flat_map(|remaining_slice| remaining_slice.subtract(other_slice))
|
|
|
|
.collect();
|
|
|
|
|
|
|
|
// If the constructors that have been considered so far already cover
|
|
|
|
// the entire range of `self`, no need to look at more constructors.
|
|
|
|
if remaining_slices.is_empty() {
|
|
|
|
break;
|
|
|
|
}
|
|
|
|
}
|
2019-11-05 17:57:55 +00:00
|
|
|
|
2019-11-17 19:24:48 +00:00
|
|
|
remaining_slices
|
|
|
|
.into_iter()
|
|
|
|
.map(|kind| Slice { array_len: slice.array_len, kind })
|
|
|
|
.map(Slice)
|
|
|
|
.collect()
|
|
|
|
}
|
2019-11-05 17:57:55 +00:00
|
|
|
}
|
2019-11-17 19:24:48 +00:00
|
|
|
}
|
2019-11-09 13:35:04 +00:00
|
|
|
IntRange(self_range) => {
|
|
|
|
let mut remaining_ranges = vec![self_range.clone()];
|
2019-11-09 21:01:26 +00:00
|
|
|
for other_ctor in other_ctors {
|
|
|
|
if let IntRange(other_range) = other_ctor {
|
|
|
|
if other_range == self_range {
|
|
|
|
// If the `self` range appears directly in a `match` arm, we can
|
|
|
|
// eliminate it straight away.
|
|
|
|
remaining_ranges = vec![];
|
|
|
|
} else {
|
2020-03-06 12:13:55 +01:00
|
|
|
// Otherwise explicitly compute the remaining ranges.
|
2019-11-09 21:01:26 +00:00
|
|
|
remaining_ranges = other_range.subtract_from(remaining_ranges);
|
|
|
|
}
|
2019-11-09 09:43:15 +00:00
|
|
|
|
2019-11-09 21:01:26 +00:00
|
|
|
// If the ranges that have been considered so far already cover the entire
|
|
|
|
// range of values, we can return early.
|
|
|
|
if remaining_ranges.is_empty() {
|
|
|
|
break;
|
|
|
|
}
|
2019-11-03 18:23:01 +00:00
|
|
|
}
|
2019-11-09 13:35:04 +00:00
|
|
|
}
|
2019-11-03 18:23:01 +00:00
|
|
|
|
2019-11-15 17:00:38 +00:00
|
|
|
// Convert the ranges back into constructors.
|
2019-11-09 13:35:04 +00:00
|
|
|
remaining_ranges.into_iter().map(IntRange).collect()
|
|
|
|
}
|
2019-11-12 12:44:00 +00:00
|
|
|
// This constructor is never covered by anything else
|
|
|
|
NonExhaustive => vec![NonExhaustive],
|
2020-10-18 13:48:54 +01:00
|
|
|
Opaque => bug!("unexpected opaque ctor {:?} found in all_ctors", self),
|
2019-10-27 16:58:04 +00:00
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2020-10-25 21:59:59 +00:00
|
|
|
/// Some constructors (namely IntRange and Slice) actually stand for a set of actual
|
|
|
|
/// constructors (integers and fixed-sized slices). When specializing for these
|
|
|
|
/// constructors, we want to be specialising for the actual underlying constructors.
|
|
|
|
/// Naively, we would simply return the list of constructors they correspond to. We instead are
|
|
|
|
/// more clever: if there are constructors that we know will behave the same wrt the current
|
|
|
|
/// matrix, we keep them grouped. For example, all slices of a sufficiently large length
|
|
|
|
/// will either be all useful or all non-useful with a given matrix.
|
|
|
|
///
|
|
|
|
/// See the branches for details on how the splitting is done.
|
|
|
|
///
|
|
|
|
/// This function may discard some irrelevant constructors if this preserves behavior and
|
|
|
|
/// diagnostics. Eg. for the `_` case, we ignore the constructors already present in the
|
|
|
|
/// matrix, unless all of them are.
|
|
|
|
///
|
|
|
|
/// `hir_id` is `None` when we're evaluating the wildcard pattern. In that case we do not want
|
|
|
|
/// to lint for overlapping ranges.
|
|
|
|
fn split<'p>(
|
|
|
|
self,
|
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
|
|
|
pcx: PatCtxt<'tcx>,
|
|
|
|
matrix: &Matrix<'p, 'tcx>,
|
|
|
|
hir_id: Option<HirId>,
|
|
|
|
) -> SmallVec<[Self; 1]> {
|
|
|
|
debug!("Constructor::split({:#?}, {:#?})", self, matrix);
|
|
|
|
|
|
|
|
match self {
|
|
|
|
// Fast-track if the range is trivial. In particular, we don't do the overlapping
|
|
|
|
// ranges check.
|
|
|
|
IntRange(ctor_range)
|
|
|
|
if ctor_range.treat_exhaustively(cx.tcx) && !ctor_range.is_singleton() =>
|
|
|
|
{
|
|
|
|
ctor_range.split(cx, pcx, matrix, hir_id)
|
|
|
|
}
|
|
|
|
Slice(slice @ Slice { kind: VarLen(..), .. }) => slice.split(cx, matrix),
|
|
|
|
// Any other constructor can be used unchanged.
|
|
|
|
_ => smallvec![self],
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2020-10-25 22:51:50 +00:00
|
|
|
/// Returns whether `self` is covered by `other`, ie whether `self` is a subset of `other`. For
|
|
|
|
/// the simple cases, this is simply checking for equality. For the "grouped" constructors,
|
|
|
|
/// this checks for inclusion.
|
|
|
|
fn is_covered_by<'p>(
|
|
|
|
&self,
|
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
|
|
|
other: &Constructor<'tcx>,
|
|
|
|
ty: Ty<'tcx>,
|
|
|
|
) -> bool {
|
|
|
|
match (self, other) {
|
|
|
|
(Single, Single) => true,
|
|
|
|
(Variant(self_id), Variant(other_id)) => self_id == other_id,
|
|
|
|
|
|
|
|
(IntRange(self_range), IntRange(other_range)) => {
|
|
|
|
if self_range.intersection(cx.tcx, other_range).is_some() {
|
|
|
|
// Constructor splitting should ensure that all intersections we encounter
|
|
|
|
// are actually inclusions.
|
|
|
|
assert!(self_range.is_subrange(other_range));
|
|
|
|
true
|
|
|
|
} else {
|
|
|
|
false
|
|
|
|
}
|
|
|
|
}
|
|
|
|
(
|
|
|
|
FloatRange(self_from, self_to, self_end),
|
|
|
|
FloatRange(other_from, other_to, other_end),
|
|
|
|
) => {
|
|
|
|
match (
|
|
|
|
compare_const_vals(cx.tcx, self_to, other_to, cx.param_env, ty),
|
|
|
|
compare_const_vals(cx.tcx, self_from, other_from, cx.param_env, ty),
|
|
|
|
) {
|
|
|
|
(Some(to), Some(from)) => {
|
|
|
|
(from == Ordering::Greater || from == Ordering::Equal)
|
|
|
|
&& (to == Ordering::Less
|
|
|
|
|| (other_end == self_end && to == Ordering::Equal))
|
|
|
|
}
|
|
|
|
_ => false,
|
|
|
|
}
|
|
|
|
}
|
|
|
|
(Str(self_val), Str(other_val)) => {
|
|
|
|
// FIXME: there's probably a more direct way of comparing for equality
|
|
|
|
match compare_const_vals(cx.tcx, self_val, other_val, cx.param_env, ty) {
|
|
|
|
Some(comparison) => comparison == Ordering::Equal,
|
|
|
|
None => false,
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
(Slice(self_slice), Slice(other_slice)) => {
|
|
|
|
other_slice.pattern_kind().covers_length(self_slice.arity())
|
|
|
|
}
|
|
|
|
|
|
|
|
// We are trying to inspect an opaque constant. Thus we skip the row.
|
|
|
|
(Opaque, _) | (_, Opaque) => false,
|
|
|
|
// Only a wildcard pattern can match the special extra constructor.
|
|
|
|
(NonExhaustive, _) => false,
|
|
|
|
|
|
|
|
_ => bug!("trying to compare incompatible constructors {:?} and {:?}", self, other),
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2019-09-23 17:36:42 +02:00
|
|
|
/// Apply a constructor to a list of patterns, yielding a new pattern. `pats`
|
|
|
|
/// must have as many elements as this constructor's arity.
|
|
|
|
///
|
2020-10-25 23:03:15 +00:00
|
|
|
/// This is roughly the inverse of `specialize_constructor`.
|
2019-11-07 12:09:05 +00:00
|
|
|
///
|
2019-09-23 17:36:42 +02:00
|
|
|
/// Examples:
|
2019-10-27 17:14:43 +00:00
|
|
|
/// `self`: `Constructor::Single`
|
|
|
|
/// `ty`: `(u32, u32, u32)`
|
|
|
|
/// `pats`: `[10, 20, _]`
|
|
|
|
/// returns `(10, 20, _)`
|
2019-09-23 17:36:42 +02:00
|
|
|
///
|
2019-10-27 17:14:43 +00:00
|
|
|
/// `self`: `Constructor::Variant(Option::Some)`
|
|
|
|
/// `ty`: `Option<bool>`
|
|
|
|
/// `pats`: `[false]`
|
|
|
|
/// returns `Some(false)`
|
2020-05-09 12:46:42 +01:00
|
|
|
fn apply<'p>(
|
2019-09-23 17:36:42 +02:00
|
|
|
&self,
|
2020-05-09 12:46:42 +01:00
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
2019-09-23 17:36:42 +02:00
|
|
|
ty: Ty<'tcx>,
|
2020-05-09 12:46:42 +01:00
|
|
|
fields: Fields<'p, 'tcx>,
|
2019-09-23 17:36:42 +02:00
|
|
|
) -> Pat<'tcx> {
|
2020-05-09 13:36:48 +01:00
|
|
|
let mut subpatterns = fields.all_patterns();
|
2019-11-12 10:36:56 +00:00
|
|
|
|
|
|
|
let pat = match self {
|
2020-08-03 00:49:11 +02:00
|
|
|
Single | Variant(_) => match ty.kind() {
|
2019-11-12 10:36:56 +00:00
|
|
|
ty::Adt(..) | ty::Tuple(..) => {
|
|
|
|
let subpatterns = subpatterns
|
|
|
|
.enumerate()
|
|
|
|
.map(|(i, p)| FieldPat { field: Field::new(i), pattern: p })
|
|
|
|
.collect();
|
|
|
|
|
2020-08-03 00:49:11 +02:00
|
|
|
if let ty::Adt(adt, substs) = ty.kind() {
|
2019-11-12 10:36:56 +00:00
|
|
|
if adt.is_enum() {
|
|
|
|
PatKind::Variant {
|
|
|
|
adt_def: adt,
|
|
|
|
substs,
|
2020-10-18 13:48:54 +01:00
|
|
|
variant_index: self.variant_index_for_adt(adt),
|
2019-11-12 10:36:56 +00:00
|
|
|
subpatterns,
|
|
|
|
}
|
|
|
|
} else {
|
|
|
|
PatKind::Leaf { subpatterns }
|
2019-09-23 17:36:42 +02:00
|
|
|
}
|
|
|
|
} else {
|
2019-10-27 17:14:43 +00:00
|
|
|
PatKind::Leaf { subpatterns }
|
2019-09-23 17:36:42 +02:00
|
|
|
}
|
2019-11-03 23:11:04 +00:00
|
|
|
}
|
2020-03-03 01:19:00 +01:00
|
|
|
ty::Ref(..) => PatKind::Deref { subpattern: subpatterns.next().unwrap() },
|
2019-11-12 10:36:56 +00:00
|
|
|
ty::Slice(_) | ty::Array(..) => bug!("bad slice pattern {:?} {:?}", self, ty),
|
2019-09-23 17:36:42 +02:00
|
|
|
_ => PatKind::Wild,
|
|
|
|
},
|
2019-11-17 17:33:39 +00:00
|
|
|
Slice(slice) => match slice.pattern_kind() {
|
|
|
|
FixedLen(_) => {
|
|
|
|
PatKind::Slice { prefix: subpatterns.collect(), slice: None, suffix: vec![] }
|
|
|
|
}
|
|
|
|
VarLen(prefix, _) => {
|
2019-11-17 17:48:45 +00:00
|
|
|
let mut prefix: Vec<_> = subpatterns.by_ref().take(prefix as usize).collect();
|
|
|
|
if slice.array_len.is_some() {
|
|
|
|
// Improves diagnostics a bit: if the type is a known-size array, instead
|
|
|
|
// of reporting `[x, _, .., _, y]`, we prefer to report `[x, .., y]`.
|
|
|
|
// This is incorrect if the size is not known, since `[_, ..]` captures
|
|
|
|
// arrays of lengths `>= 1` whereas `[..]` captures any length.
|
|
|
|
while !prefix.is_empty() && prefix.last().unwrap().is_wildcard() {
|
|
|
|
prefix.pop();
|
|
|
|
}
|
|
|
|
}
|
2019-11-18 15:47:51 +00:00
|
|
|
let suffix: Vec<_> = if slice.array_len.is_some() {
|
|
|
|
// Same as above.
|
|
|
|
subpatterns.skip_while(Pat::is_wildcard).collect()
|
|
|
|
} else {
|
|
|
|
subpatterns.collect()
|
|
|
|
};
|
2019-11-17 17:33:39 +00:00
|
|
|
let wild = Pat::wildcard_from_ty(ty);
|
|
|
|
PatKind::Slice { prefix, slice: Some(wild), suffix }
|
|
|
|
}
|
|
|
|
},
|
2020-10-17 23:18:05 +01:00
|
|
|
&Str(value) => PatKind::Constant { value },
|
2019-11-15 17:00:38 +00:00
|
|
|
&FloatRange(lo, hi, end) => PatKind::Range(PatRange { lo, hi, end }),
|
|
|
|
IntRange(range) => return range.to_pat(cx.tcx),
|
2019-11-12 12:44:00 +00:00
|
|
|
NonExhaustive => PatKind::Wild,
|
2020-10-18 13:48:54 +01:00
|
|
|
Opaque => bug!("we should not try to apply an opaque constructor {:?}", self),
|
2019-09-23 17:36:42 +02:00
|
|
|
};
|
|
|
|
|
|
|
|
Pat { ty, span: DUMMY_SP, kind: Box::new(pat) }
|
|
|
|
}
|
2019-09-23 17:37:42 +02:00
|
|
|
|
|
|
|
/// Like `apply`, but where all the subpatterns are wildcards `_`.
|
|
|
|
fn apply_wildcards<'a>(&self, cx: &MatchCheckCtxt<'a, 'tcx>, ty: Ty<'tcx>) -> Pat<'tcx> {
|
2020-05-09 12:46:42 +01:00
|
|
|
self.apply(cx, ty, Fields::wildcards(cx, self, ty))
|
2019-09-23 17:37:42 +02:00
|
|
|
}
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
2020-05-18 16:12:01 +01:00
|
|
|
/// Some fields need to be explicitly hidden away in certain cases; see the comment above the
|
2020-05-17 14:01:28 +01:00
|
|
|
/// `Fields` struct. This struct represents such a potentially-hidden field. When a field is hidden
|
|
|
|
/// we still keep its type around.
|
2020-05-09 13:36:48 +01:00
|
|
|
#[derive(Debug, Copy, Clone)]
|
|
|
|
enum FilteredField<'p, 'tcx> {
|
|
|
|
Kept(&'p Pat<'tcx>),
|
|
|
|
Hidden(Ty<'tcx>),
|
|
|
|
}
|
|
|
|
|
|
|
|
impl<'p, 'tcx> FilteredField<'p, 'tcx> {
|
|
|
|
fn kept(self) -> Option<&'p Pat<'tcx>> {
|
|
|
|
match self {
|
|
|
|
FilteredField::Kept(p) => Some(p),
|
|
|
|
FilteredField::Hidden(_) => None,
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
fn to_pattern(self) -> Pat<'tcx> {
|
|
|
|
match self {
|
|
|
|
FilteredField::Kept(p) => p.clone(),
|
|
|
|
FilteredField::Hidden(ty) => Pat::wildcard_from_ty(ty),
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2020-05-09 11:32:54 +01:00
|
|
|
/// A value can be decomposed into a constructor applied to some fields. This struct represents
|
|
|
|
/// those fields, generalized to allow patterns in each field. See also `Constructor`.
|
2020-05-09 13:36:48 +01:00
|
|
|
///
|
|
|
|
/// If a private or `non_exhaustive` field is uninhabited, the code mustn't observe that it is
|
|
|
|
/// uninhabited. For that, we filter these fields out of the matrix. This is subtle because we
|
2020-05-09 19:00:37 +01:00
|
|
|
/// still need to have those fields back when going to/from a `Pat`. Most of this is handled
|
|
|
|
/// automatically in `Fields`, but when constructing or deconstructing `Fields` you need to be
|
|
|
|
/// careful. As a rule, when going to/from the matrix, use the filtered field list; when going
|
|
|
|
/// to/from `Pat`, use the full field list.
|
|
|
|
/// This filtering is uncommon in practice, because uninhabited fields are rarely used, so we avoid
|
|
|
|
/// it when possible to preserve performance.
|
2020-05-09 11:32:54 +01:00
|
|
|
#[derive(Debug, Clone)]
|
|
|
|
enum Fields<'p, 'tcx> {
|
2020-05-09 13:36:48 +01:00
|
|
|
/// Lists of patterns that don't contain any filtered fields.
|
2020-05-17 14:01:28 +01:00
|
|
|
/// `Slice` and `Vec` behave the same; the difference is only to avoid allocating and
|
|
|
|
/// triple-dereferences when possible. Frankly this is premature optimization, I (Nadrieril)
|
|
|
|
/// have not measured if it really made a difference.
|
2020-05-09 11:32:54 +01:00
|
|
|
Slice(&'p [Pat<'tcx>]),
|
2020-05-09 12:00:34 +01:00
|
|
|
Vec(SmallVec<[&'p Pat<'tcx>; 2]>),
|
2020-05-23 18:49:38 +01:00
|
|
|
/// Patterns where some of the fields need to be hidden. `kept_count` caches the number of
|
|
|
|
/// non-hidden fields.
|
2020-05-17 14:01:28 +01:00
|
|
|
Filtered {
|
|
|
|
fields: SmallVec<[FilteredField<'p, 'tcx>; 2]>,
|
2020-05-23 18:49:38 +01:00
|
|
|
kept_count: usize,
|
2020-05-17 14:01:28 +01:00
|
|
|
},
|
2020-05-09 11:32:54 +01:00
|
|
|
}
|
|
|
|
|
|
|
|
impl<'p, 'tcx> Fields<'p, 'tcx> {
|
2020-05-09 12:00:34 +01:00
|
|
|
fn empty() -> Self {
|
|
|
|
Fields::Slice(&[])
|
|
|
|
}
|
|
|
|
|
|
|
|
/// Construct a new `Fields` from the given pattern. Must not be used if the pattern is a field
|
|
|
|
/// of a struct/tuple/variant.
|
|
|
|
fn from_single_pattern(pat: &'p Pat<'tcx>) -> Self {
|
|
|
|
Fields::Slice(std::slice::from_ref(pat))
|
|
|
|
}
|
|
|
|
|
2020-05-09 12:46:42 +01:00
|
|
|
/// Convenience; internal use.
|
|
|
|
fn wildcards_from_tys(
|
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
|
|
|
tys: impl IntoIterator<Item = Ty<'tcx>>,
|
|
|
|
) -> Self {
|
|
|
|
let wilds = tys.into_iter().map(Pat::wildcard_from_ty);
|
|
|
|
let pats = cx.pattern_arena.alloc_from_iter(wilds);
|
|
|
|
Fields::Slice(pats)
|
|
|
|
}
|
|
|
|
|
2020-05-09 11:32:54 +01:00
|
|
|
/// Creates a new list of wildcard fields for a given constructor.
|
|
|
|
fn wildcards(
|
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
|
|
|
constructor: &Constructor<'tcx>,
|
|
|
|
ty: Ty<'tcx>,
|
|
|
|
) -> Self {
|
2020-05-09 12:46:42 +01:00
|
|
|
let wildcard_from_ty = |ty| &*cx.pattern_arena.alloc(Pat::wildcard_from_ty(ty));
|
|
|
|
|
2020-05-23 13:11:28 +01:00
|
|
|
let ret = match constructor {
|
2020-08-03 00:49:11 +02:00
|
|
|
Single | Variant(_) => match ty.kind() {
|
2020-05-09 12:46:42 +01:00
|
|
|
ty::Tuple(ref fs) => {
|
|
|
|
Fields::wildcards_from_tys(cx, fs.into_iter().map(|ty| ty.expect_ty()))
|
|
|
|
}
|
|
|
|
ty::Ref(_, rty, _) => Fields::from_single_pattern(wildcard_from_ty(rty)),
|
|
|
|
ty::Adt(adt, substs) => {
|
|
|
|
if adt.is_box() {
|
|
|
|
// Use T as the sub pattern type of Box<T>.
|
|
|
|
Fields::from_single_pattern(wildcard_from_ty(substs.type_at(0)))
|
|
|
|
} else {
|
2020-10-18 13:48:54 +01:00
|
|
|
let variant = &adt.variants[constructor.variant_index_for_adt(adt)];
|
2020-05-09 13:36:48 +01:00
|
|
|
// Whether we must not match the fields of this variant exhaustively.
|
|
|
|
let is_non_exhaustive =
|
|
|
|
variant.is_field_list_non_exhaustive() && !adt.did.is_local();
|
|
|
|
let field_tys = variant.fields.iter().map(|field| field.ty(cx.tcx, substs));
|
|
|
|
// In the following cases, we don't need to filter out any fields. This is
|
|
|
|
// the vast majority of real cases, since uninhabited fields are uncommon.
|
|
|
|
let has_no_hidden_fields = (adt.is_enum() && !is_non_exhaustive)
|
|
|
|
|| !field_tys.clone().any(|ty| cx.is_uninhabited(ty));
|
|
|
|
|
|
|
|
if has_no_hidden_fields {
|
|
|
|
Fields::wildcards_from_tys(cx, field_tys)
|
|
|
|
} else {
|
2020-05-23 18:49:38 +01:00
|
|
|
let mut kept_count = 0;
|
2020-05-09 13:36:48 +01:00
|
|
|
let fields = variant
|
|
|
|
.fields
|
|
|
|
.iter()
|
|
|
|
.map(|field| {
|
|
|
|
let ty = field.ty(cx.tcx, substs);
|
|
|
|
let is_visible = adt.is_enum()
|
|
|
|
|| field.vis.is_accessible_from(cx.module, cx.tcx);
|
|
|
|
let is_uninhabited = cx.is_uninhabited(ty);
|
|
|
|
|
|
|
|
// In the cases of either a `#[non_exhaustive]` field list
|
|
|
|
// or a non-public field, we hide uninhabited fields in
|
|
|
|
// order not to reveal the uninhabitedness of the whole
|
|
|
|
// variant.
|
|
|
|
if is_uninhabited && (!is_visible || is_non_exhaustive) {
|
|
|
|
FilteredField::Hidden(ty)
|
|
|
|
} else {
|
2020-05-23 18:49:38 +01:00
|
|
|
kept_count += 1;
|
2020-05-09 13:36:48 +01:00
|
|
|
FilteredField::Kept(wildcard_from_ty(ty))
|
|
|
|
}
|
|
|
|
})
|
|
|
|
.collect();
|
2020-05-23 18:49:38 +01:00
|
|
|
Fields::Filtered { fields, kept_count }
|
2020-05-09 13:36:48 +01:00
|
|
|
}
|
2020-05-09 12:46:42 +01:00
|
|
|
}
|
|
|
|
}
|
|
|
|
_ => Fields::empty(),
|
|
|
|
},
|
2020-08-03 00:49:11 +02:00
|
|
|
Slice(slice) => match *ty.kind() {
|
2020-05-09 12:46:42 +01:00
|
|
|
ty::Slice(ty) | ty::Array(ty, _) => {
|
|
|
|
let arity = slice.arity();
|
|
|
|
Fields::wildcards_from_tys(cx, (0..arity).map(|_| ty))
|
|
|
|
}
|
|
|
|
_ => bug!("bad slice pattern {:?} {:?}", constructor, ty),
|
|
|
|
},
|
2020-10-18 13:48:54 +01:00
|
|
|
Str(..) | FloatRange(..) | IntRange(..) | NonExhaustive | Opaque => Fields::empty(),
|
2020-05-23 13:11:28 +01:00
|
|
|
};
|
|
|
|
debug!("Fields::wildcards({:?}, {:?}) = {:#?}", constructor, ty, ret);
|
|
|
|
ret
|
2020-05-09 11:32:54 +01:00
|
|
|
}
|
|
|
|
|
2020-05-23 18:49:38 +01:00
|
|
|
/// Returns the number of patterns from the viewpoint of match-checking, i.e. excluding hidden
|
|
|
|
/// fields. This is what we want in most cases in this file, the only exception being
|
|
|
|
/// conversion to/from `Pat`.
|
2020-05-09 11:32:54 +01:00
|
|
|
fn len(&self) -> usize {
|
|
|
|
match self {
|
|
|
|
Fields::Slice(pats) => pats.len(),
|
2020-05-09 12:00:34 +01:00
|
|
|
Fields::Vec(pats) => pats.len(),
|
2020-05-23 18:49:38 +01:00
|
|
|
Fields::Filtered { kept_count, .. } => *kept_count,
|
2020-05-09 11:32:54 +01:00
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2020-05-09 13:36:48 +01:00
|
|
|
/// Returns the complete list of patterns, including hidden fields.
|
|
|
|
fn all_patterns(self) -> impl Iterator<Item = Pat<'tcx>> {
|
|
|
|
let pats: SmallVec<[_; 2]> = match self {
|
|
|
|
Fields::Slice(pats) => pats.iter().cloned().collect(),
|
|
|
|
Fields::Vec(pats) => pats.into_iter().cloned().collect(),
|
2020-05-17 14:58:26 +01:00
|
|
|
Fields::Filtered { fields, .. } => {
|
2020-05-09 13:36:48 +01:00
|
|
|
// We don't skip any fields here.
|
|
|
|
fields.into_iter().map(|p| p.to_pattern()).collect()
|
|
|
|
}
|
2020-05-09 12:00:34 +01:00
|
|
|
};
|
|
|
|
pats.into_iter()
|
|
|
|
}
|
|
|
|
|
2020-05-17 14:01:28 +01:00
|
|
|
/// Overrides some of the fields with the provided patterns. Exactly like
|
|
|
|
/// `replace_fields_indexed`, except that it takes `FieldPat`s as input.
|
2020-05-09 12:46:42 +01:00
|
|
|
fn replace_with_fieldpats(
|
|
|
|
&self,
|
|
|
|
new_pats: impl IntoIterator<Item = &'p FieldPat<'tcx>>,
|
|
|
|
) -> Self {
|
|
|
|
self.replace_fields_indexed(
|
2020-05-09 13:36:48 +01:00
|
|
|
new_pats.into_iter().map(|pat| (pat.field.index(), &pat.pattern)),
|
2020-05-09 12:46:42 +01:00
|
|
|
)
|
|
|
|
}
|
|
|
|
|
2020-05-17 14:01:28 +01:00
|
|
|
/// Overrides some of the fields with the provided patterns. This is used when a pattern
|
|
|
|
/// defines some fields but not all, for example `Foo { field1: Some(_), .. }`: here we start with a
|
|
|
|
/// `Fields` that is just one wildcard per field of the `Foo` struct, and override the entry
|
|
|
|
/// corresponding to `field1` with the pattern `Some(_)`. This is also used for slice patterns
|
|
|
|
/// for the same reason.
|
2020-05-09 12:46:42 +01:00
|
|
|
fn replace_fields_indexed(
|
|
|
|
&self,
|
|
|
|
new_pats: impl IntoIterator<Item = (usize, &'p Pat<'tcx>)>,
|
|
|
|
) -> Self {
|
|
|
|
let mut fields = self.clone();
|
|
|
|
if let Fields::Slice(pats) = fields {
|
|
|
|
fields = Fields::Vec(pats.iter().collect());
|
|
|
|
}
|
|
|
|
|
|
|
|
match &mut fields {
|
|
|
|
Fields::Vec(pats) => {
|
|
|
|
for (i, pat) in new_pats {
|
|
|
|
pats[i] = pat
|
|
|
|
}
|
|
|
|
}
|
2020-05-17 14:58:26 +01:00
|
|
|
Fields::Filtered { fields, .. } => {
|
2020-05-09 13:36:48 +01:00
|
|
|
for (i, pat) in new_pats {
|
|
|
|
if let FilteredField::Kept(p) = &mut fields[i] {
|
|
|
|
*p = pat
|
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
2020-05-09 12:46:42 +01:00
|
|
|
Fields::Slice(_) => unreachable!(),
|
|
|
|
}
|
|
|
|
fields
|
|
|
|
}
|
|
|
|
|
|
|
|
/// Replaces contained fields with the given filtered list of patterns, e.g. taken from the
|
|
|
|
/// matrix. There must be `len()` patterns in `pats`.
|
|
|
|
fn replace_fields(
|
|
|
|
&self,
|
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
|
|
|
pats: impl IntoIterator<Item = Pat<'tcx>>,
|
|
|
|
) -> Self {
|
|
|
|
let pats: &[_] = cx.pattern_arena.alloc_from_iter(pats);
|
2020-05-09 13:36:48 +01:00
|
|
|
|
|
|
|
match self {
|
2020-05-23 18:49:38 +01:00
|
|
|
Fields::Filtered { fields, kept_count } => {
|
2020-05-09 13:36:48 +01:00
|
|
|
let mut pats = pats.iter();
|
|
|
|
let mut fields = fields.clone();
|
|
|
|
for f in &mut fields {
|
|
|
|
if let FilteredField::Kept(p) = f {
|
|
|
|
// We take one input pattern for each `Kept` field, in order.
|
|
|
|
*p = pats.next().unwrap();
|
|
|
|
}
|
|
|
|
}
|
2020-05-23 18:49:38 +01:00
|
|
|
Fields::Filtered { fields, kept_count: *kept_count }
|
2020-05-09 13:36:48 +01:00
|
|
|
}
|
|
|
|
_ => Fields::Slice(pats),
|
|
|
|
}
|
2020-05-09 12:46:42 +01:00
|
|
|
}
|
|
|
|
|
2020-10-25 22:51:50 +00:00
|
|
|
/// Replaces contained fields with the arguments of the given pattern. Only use on a pattern
|
|
|
|
/// that is compatible with the constructor used to build `self`.
|
|
|
|
/// This is meant to be used on the result of `Fields::wildcards()`. The idea is that
|
|
|
|
/// `wildcards` constructs a list of fields where all entries are wildcards, and the pattern
|
|
|
|
/// provided to this function fills some of the fields with non-wildcards.
|
|
|
|
/// In the following example `Fields::wildcards` would return `[_, _, _, _]`. If we call
|
|
|
|
/// `replace_with_pattern_arguments` on it with the pattern, the result will be `[Some(0), _,
|
|
|
|
/// _, _]`.
|
|
|
|
/// ```rust
|
|
|
|
/// let x: [Option<u8>; 4] = foo();
|
|
|
|
/// match x {
|
|
|
|
/// [Some(0), ..] => {}
|
|
|
|
/// }
|
|
|
|
/// ```
|
|
|
|
fn replace_with_pattern_arguments(&self, pat: &'p Pat<'tcx>) -> Self {
|
|
|
|
match pat.kind.as_ref() {
|
|
|
|
PatKind::Deref { subpattern } => Self::from_single_pattern(subpattern),
|
|
|
|
PatKind::Leaf { subpatterns } | PatKind::Variant { subpatterns, .. } => {
|
|
|
|
self.replace_with_fieldpats(subpatterns)
|
|
|
|
}
|
|
|
|
PatKind::Array { prefix, suffix, .. } | PatKind::Slice { prefix, suffix, .. } => {
|
|
|
|
// Number of subpatterns for the constructor
|
|
|
|
let ctor_arity = self.len();
|
|
|
|
|
|
|
|
// Replace the prefix and the suffix with the given patterns, leaving wildcards in
|
|
|
|
// the middle if there was a subslice pattern `..`.
|
|
|
|
let prefix = prefix.iter().enumerate();
|
|
|
|
let suffix =
|
|
|
|
suffix.iter().enumerate().map(|(i, p)| (ctor_arity - suffix.len() + i, p));
|
|
|
|
self.replace_fields_indexed(prefix.chain(suffix))
|
|
|
|
}
|
|
|
|
_ => self.clone(),
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2020-05-09 12:00:34 +01:00
|
|
|
fn push_on_patstack(self, stack: &[&'p Pat<'tcx>]) -> PatStack<'p, 'tcx> {
|
|
|
|
let pats: SmallVec<_> = match self {
|
|
|
|
Fields::Slice(pats) => pats.iter().chain(stack.iter().copied()).collect(),
|
|
|
|
Fields::Vec(mut pats) => {
|
|
|
|
pats.extend_from_slice(stack);
|
|
|
|
pats
|
|
|
|
}
|
2020-05-17 14:58:26 +01:00
|
|
|
Fields::Filtered { fields, .. } => {
|
2020-05-09 13:36:48 +01:00
|
|
|
// We skip hidden fields here
|
|
|
|
fields.into_iter().filter_map(|p| p.kept()).chain(stack.iter().copied()).collect()
|
|
|
|
}
|
2020-05-09 12:00:34 +01:00
|
|
|
};
|
|
|
|
PatStack::from_vec(pats)
|
2020-05-09 11:32:54 +01:00
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2018-06-22 23:52:56 +01:00
|
|
|
#[derive(Clone, Debug)]
|
2020-07-02 21:03:59 +01:00
|
|
|
crate enum Usefulness<'tcx> {
|
2019-11-28 16:56:45 +00:00
|
|
|
/// Carries a list of unreachable subpatterns. Used only in the presence of or-patterns.
|
2020-07-02 21:03:59 +01:00
|
|
|
Useful(Vec<Span>),
|
2019-11-28 16:56:45 +00:00
|
|
|
/// Carries a list of witnesses of non-exhaustiveness.
|
2017-01-01 20:57:21 +02:00
|
|
|
UsefulWithWitness(Vec<Witness<'tcx>>),
|
2019-09-21 13:49:14 +02:00
|
|
|
NotUseful,
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
2020-07-02 21:03:59 +01:00
|
|
|
impl<'tcx> Usefulness<'tcx> {
|
2019-10-27 17:07:05 +00:00
|
|
|
fn new_useful(preference: WitnessPreference) -> Self {
|
|
|
|
match preference {
|
|
|
|
ConstructWitness => UsefulWithWitness(vec![Witness(vec![])]),
|
2019-11-28 16:56:45 +00:00
|
|
|
LeaveOutWitness => Useful(vec![]),
|
2019-10-27 17:07:05 +00:00
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2017-01-01 20:57:21 +02:00
|
|
|
fn is_useful(&self) -> bool {
|
|
|
|
match *self {
|
|
|
|
NotUseful => false,
|
2019-09-21 13:49:14 +02:00
|
|
|
_ => true,
|
2017-01-01 20:57:21 +02:00
|
|
|
}
|
|
|
|
}
|
2019-10-27 17:07:05 +00:00
|
|
|
|
2020-07-02 21:03:59 +01:00
|
|
|
fn apply_constructor<'p>(
|
2019-10-27 17:07:05 +00:00
|
|
|
self,
|
2020-05-09 12:46:42 +01:00
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
2019-10-27 17:07:05 +00:00
|
|
|
ctor: &Constructor<'tcx>,
|
|
|
|
ty: Ty<'tcx>,
|
2020-05-09 12:46:42 +01:00
|
|
|
ctor_wild_subpatterns: &Fields<'p, 'tcx>,
|
2019-10-27 17:07:05 +00:00
|
|
|
) -> Self {
|
|
|
|
match self {
|
|
|
|
UsefulWithWitness(witnesses) => UsefulWithWitness(
|
|
|
|
witnesses
|
|
|
|
.into_iter()
|
2020-05-09 12:46:42 +01:00
|
|
|
.map(|witness| witness.apply_constructor(cx, &ctor, ty, ctor_wild_subpatterns))
|
2019-10-27 17:07:05 +00:00
|
|
|
.collect(),
|
|
|
|
),
|
|
|
|
x => x,
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
fn apply_wildcard(self, ty: Ty<'tcx>) -> Self {
|
|
|
|
match self {
|
|
|
|
UsefulWithWitness(witnesses) => {
|
2019-11-06 15:06:57 +00:00
|
|
|
let wild = Pat::wildcard_from_ty(ty);
|
2019-10-27 17:07:05 +00:00
|
|
|
UsefulWithWitness(
|
|
|
|
witnesses
|
|
|
|
.into_iter()
|
|
|
|
.map(|mut witness| {
|
|
|
|
witness.0.push(wild.clone());
|
|
|
|
witness
|
|
|
|
})
|
|
|
|
.collect(),
|
|
|
|
)
|
|
|
|
}
|
|
|
|
x => x,
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
fn apply_missing_ctors(
|
|
|
|
self,
|
|
|
|
cx: &MatchCheckCtxt<'_, 'tcx>,
|
|
|
|
ty: Ty<'tcx>,
|
|
|
|
missing_ctors: &MissingConstructors<'tcx>,
|
|
|
|
) -> Self {
|
|
|
|
match self {
|
|
|
|
UsefulWithWitness(witnesses) => {
|
|
|
|
let new_patterns: Vec<_> =
|
|
|
|
missing_ctors.iter().map(|ctor| ctor.apply_wildcards(cx, ty)).collect();
|
|
|
|
// Add the new patterns to each witness
|
|
|
|
UsefulWithWitness(
|
|
|
|
witnesses
|
|
|
|
.into_iter()
|
|
|
|
.flat_map(|witness| {
|
|
|
|
new_patterns.iter().map(move |pat| {
|
|
|
|
let mut witness = witness.clone();
|
|
|
|
witness.0.push(pat.clone());
|
|
|
|
witness
|
|
|
|
})
|
|
|
|
})
|
|
|
|
.collect(),
|
|
|
|
)
|
|
|
|
}
|
|
|
|
x => x,
|
2017-01-01 20:57:21 +02:00
|
|
|
}
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2018-06-22 23:52:56 +01:00
|
|
|
#[derive(Copy, Clone, Debug)]
|
2020-01-05 15:46:44 +00:00
|
|
|
crate enum WitnessPreference {
|
2016-09-24 18:24:34 +03:00
|
|
|
ConstructWitness,
|
2019-09-21 13:49:14 +02:00
|
|
|
LeaveOutWitness,
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
2016-09-24 20:45:59 +03:00
|
|
|
#[derive(Copy, Clone, Debug)]
|
2019-09-26 18:45:10 +01:00
|
|
|
struct PatCtxt<'tcx> {
|
2016-09-24 20:45:59 +03:00
|
|
|
ty: Ty<'tcx>,
|
2019-08-29 16:06:44 -07:00
|
|
|
span: Span,
|
2016-09-24 20:45:59 +03:00
|
|
|
}
|
|
|
|
|
2018-08-12 11:43:42 +01:00
|
|
|
/// A witness of non-exhaustiveness for error reporting, represented
|
|
|
|
/// as a list of patterns (in reverse order of construction) with
|
|
|
|
/// wildcards inside to represent elements that can take any inhabitant
|
|
|
|
/// of the type as a value.
|
|
|
|
///
|
|
|
|
/// A witness against a list of patterns should have the same types
|
|
|
|
/// and length as the pattern matched against. Because Rust `match`
|
|
|
|
/// is always against a single pattern, at the end the witness will
|
|
|
|
/// have length 1, but in the middle of the algorithm, it can contain
|
|
|
|
/// multiple patterns.
|
|
|
|
///
|
|
|
|
/// For example, if we are constructing a witness for the match against
|
2020-10-18 21:54:10 -07:00
|
|
|
///
|
2018-08-12 11:43:42 +01:00
|
|
|
/// ```
|
|
|
|
/// struct Pair(Option<(u32, u32)>, bool);
|
|
|
|
///
|
|
|
|
/// match (p: Pair) {
|
|
|
|
/// Pair(None, _) => {}
|
|
|
|
/// Pair(_, false) => {}
|
|
|
|
/// }
|
|
|
|
/// ```
|
|
|
|
///
|
|
|
|
/// We'll perform the following steps:
|
|
|
|
/// 1. Start with an empty witness
|
|
|
|
/// `Witness(vec![])`
|
|
|
|
/// 2. Push a witness `Some(_)` against the `None`
|
|
|
|
/// `Witness(vec![Some(_)])`
|
|
|
|
/// 3. Push a witness `true` against the `false`
|
|
|
|
/// `Witness(vec![Some(_), true])`
|
|
|
|
/// 4. Apply the `Pair` constructor to the witnesses
|
|
|
|
/// `Witness(vec![Pair(Some(_), true)])`
|
|
|
|
///
|
|
|
|
/// The final `Pair(Some(_), true)` is then the resulting witness.
|
2018-06-22 23:52:56 +01:00
|
|
|
#[derive(Clone, Debug)]
|
2020-01-05 15:46:44 +00:00
|
|
|
crate struct Witness<'tcx>(Vec<Pat<'tcx>>);
|
2016-09-24 18:24:34 +03:00
|
|
|
|
2017-01-01 20:57:21 +02:00
|
|
|
impl<'tcx> Witness<'tcx> {
|
2020-01-05 15:46:44 +00:00
|
|
|
crate fn single_pattern(self) -> Pat<'tcx> {
|
2016-09-24 20:45:59 +03:00
|
|
|
assert_eq!(self.0.len(), 1);
|
2019-09-09 16:44:06 +02:00
|
|
|
self.0.into_iter().next().unwrap()
|
2016-09-24 20:45:59 +03:00
|
|
|
}
|
2016-09-24 18:24:34 +03:00
|
|
|
|
2016-09-24 20:45:59 +03:00
|
|
|
/// Constructs a partial witness for a pattern given a list of
|
|
|
|
/// patterns expanded by the specialization step.
|
|
|
|
///
|
|
|
|
/// When a pattern P is discovered to be useful, this function is used bottom-up
|
2018-11-27 02:59:49 +00:00
|
|
|
/// to reconstruct a complete witness, e.g., a pattern P' that covers a subset
|
2016-09-24 20:45:59 +03:00
|
|
|
/// of values, V, where each value in that set is not covered by any previously
|
|
|
|
/// used patterns and is covered by the pattern P'. Examples:
|
|
|
|
///
|
|
|
|
/// left_ty: tuple of 3 elements
|
|
|
|
/// pats: [10, 20, _] => (10, 20, _)
|
|
|
|
///
|
|
|
|
/// left_ty: struct X { a: (bool, &'static str), b: usize}
|
|
|
|
/// pats: [(false, "foo"), 42] => X { a: (false, "foo"), b: 42 }
|
2020-05-09 12:46:42 +01:00
|
|
|
fn apply_constructor<'p>(
|
2016-09-24 20:45:59 +03:00
|
|
|
mut self,
|
2020-05-09 12:46:42 +01:00
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
2017-02-15 15:00:20 +02:00
|
|
|
ctor: &Constructor<'tcx>,
|
2019-09-21 13:49:14 +02:00
|
|
|
ty: Ty<'tcx>,
|
2020-05-09 12:46:42 +01:00
|
|
|
ctor_wild_subpatterns: &Fields<'p, 'tcx>,
|
2019-09-21 13:49:14 +02:00
|
|
|
) -> Self {
|
2016-09-24 20:45:59 +03:00
|
|
|
let pat = {
|
2020-05-09 12:46:42 +01:00
|
|
|
let len = self.0.len();
|
|
|
|
let arity = ctor_wild_subpatterns.len();
|
|
|
|
let pats = self.0.drain((len - arity)..).rev();
|
|
|
|
let fields = ctor_wild_subpatterns.replace_fields(cx, pats);
|
|
|
|
ctor.apply(cx, ty, fields)
|
2016-09-24 20:45:59 +03:00
|
|
|
};
|
2016-09-24 18:24:34 +03:00
|
|
|
|
2019-09-23 17:36:42 +02:00
|
|
|
self.0.push(pat);
|
2016-09-24 20:45:59 +03:00
|
|
|
|
|
|
|
self
|
|
|
|
}
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
|
|
|
/// This determines the set of all possible constructors of a pattern matching
|
|
|
|
/// values of type `left_ty`. For vectors, this would normally be an infinite set
|
2016-12-30 10:51:32 -05:00
|
|
|
/// but is instead bounded by the maximum fixed length of slice patterns in
|
|
|
|
/// the column of patterns being analyzed.
|
2016-10-26 22:38:22 +03:00
|
|
|
///
|
2019-02-18 18:34:42 +00:00
|
|
|
/// We make sure to omit constructors that are statically impossible. E.g., for
|
|
|
|
/// `Option<!>`, we do not include `Some(_)` in the returned list of constructors.
|
2019-12-04 16:06:31 +00:00
|
|
|
/// Invariant: this returns an empty `Vec` if and only if the type is uninhabited (as determined by
|
|
|
|
/// `cx.is_uninhabited()`).
|
2019-06-16 12:41:24 +03:00
|
|
|
fn all_constructors<'a, 'tcx>(
|
2020-10-25 21:59:59 +00:00
|
|
|
cx: &MatchCheckCtxt<'a, 'tcx>,
|
2019-09-26 18:45:10 +01:00
|
|
|
pcx: PatCtxt<'tcx>,
|
2019-06-16 12:41:24 +03:00
|
|
|
) -> Vec<Constructor<'tcx>> {
|
2016-12-01 01:12:03 +08:00
|
|
|
debug!("all_constructors({:?})", pcx.ty);
|
2019-11-15 16:39:31 +00:00
|
|
|
let make_range = |start, end| {
|
|
|
|
IntRange(
|
2019-11-09 21:42:02 +00:00
|
|
|
// `unwrap()` is ok because we know the type is an integer.
|
2019-11-15 16:39:31 +00:00
|
|
|
IntRange::from_range(cx.tcx, start, end, pcx.ty, &RangeEnd::Included, pcx.span)
|
|
|
|
.unwrap(),
|
|
|
|
)
|
|
|
|
};
|
2020-08-03 00:49:11 +02:00
|
|
|
match *pcx.ty.kind() {
|
2020-10-02 14:19:52 +02:00
|
|
|
ty::Bool => vec![make_range(0, 1)],
|
2019-03-26 00:13:09 +01:00
|
|
|
ty::Array(ref sub_ty, len) if len.try_eval_usize(cx.tcx, cx.param_env).is_some() => {
|
|
|
|
let len = len.eval_usize(cx.tcx, cx.param_env);
|
2019-11-16 16:05:32 +00:00
|
|
|
if len != 0 && cx.is_uninhabited(sub_ty) {
|
|
|
|
vec![]
|
|
|
|
} else {
|
2019-11-17 17:33:39 +00:00
|
|
|
vec![Slice(Slice { array_len: Some(len), kind: VarLen(0, 0) })]
|
2019-11-16 16:05:32 +00:00
|
|
|
}
|
2016-11-29 15:10:26 +08:00
|
|
|
}
|
2017-08-07 08:08:53 +03:00
|
|
|
// Treat arrays of a constant but unknown length like slices.
|
2019-09-21 13:49:14 +02:00
|
|
|
ty::Array(ref sub_ty, _) | ty::Slice(ref sub_ty) => {
|
2019-11-17 17:33:39 +00:00
|
|
|
let kind = if cx.is_uninhabited(sub_ty) { FixedLen(0) } else { VarLen(0, 0) };
|
|
|
|
vec![Slice(Slice { array_len: None, kind })]
|
2017-08-07 08:08:53 +03:00
|
|
|
}
|
2019-11-12 14:51:59 +00:00
|
|
|
ty::Adt(def, substs) if def.is_enum() => {
|
2019-11-30 14:36:41 +00:00
|
|
|
let ctors: Vec<_> = if cx.tcx.features().exhaustive_patterns {
|
|
|
|
// If `exhaustive_patterns` is enabled, we exclude variants known to be
|
|
|
|
// uninhabited.
|
|
|
|
def.variants
|
|
|
|
.iter()
|
|
|
|
.filter(|v| {
|
2020-03-16 18:51:55 +01:00
|
|
|
!v.uninhabited_from(cx.tcx, substs, def.adt_kind(), cx.param_env)
|
2019-11-12 14:51:59 +00:00
|
|
|
.contains(cx.tcx, cx.module)
|
2019-11-30 14:36:41 +00:00
|
|
|
})
|
|
|
|
.map(|v| Variant(v.def_id))
|
|
|
|
.collect()
|
|
|
|
} else {
|
|
|
|
def.variants.iter().map(|v| Variant(v.def_id)).collect()
|
|
|
|
};
|
|
|
|
|
2019-11-12 14:51:59 +00:00
|
|
|
// If the enum is declared as `#[non_exhaustive]`, we treat it as if it had an
|
2019-11-30 14:36:41 +00:00
|
|
|
// additional "unknown" constructor.
|
|
|
|
// There is no point in enumerating all possible variants, because the user can't
|
|
|
|
// actually match against them all themselves. So we always return only the fictitious
|
|
|
|
// constructor.
|
|
|
|
// E.g., in an example like:
|
2020-10-18 21:54:10 -07:00
|
|
|
//
|
2019-11-30 14:36:41 +00:00
|
|
|
// ```
|
|
|
|
// let err: io::ErrorKind = ...;
|
|
|
|
// match err {
|
|
|
|
// io::ErrorKind::NotFound => {},
|
|
|
|
// }
|
|
|
|
// ```
|
2020-10-18 21:54:10 -07:00
|
|
|
//
|
2019-11-30 14:36:41 +00:00
|
|
|
// we don't want to show every possible IO error, but instead have only `_` as the
|
|
|
|
// witness.
|
2019-11-30 16:00:44 +00:00
|
|
|
let is_declared_nonexhaustive = cx.is_foreign_non_exhaustive_enum(pcx.ty);
|
2019-11-12 14:51:59 +00:00
|
|
|
|
2019-11-30 15:41:27 +00:00
|
|
|
// If `exhaustive_patterns` is disabled and our scrutinee is an empty enum, we treat it
|
|
|
|
// as though it had an "unknown" constructor to avoid exposing its emptyness. Note that
|
|
|
|
// an empty match will still be considered exhaustive because that case is handled
|
|
|
|
// separately in `check_match`.
|
|
|
|
let is_secretly_empty =
|
|
|
|
def.variants.is_empty() && !cx.tcx.features().exhaustive_patterns;
|
|
|
|
|
|
|
|
if is_secretly_empty || is_declared_nonexhaustive { vec![NonExhaustive] } else { ctors }
|
2019-11-12 14:51:59 +00:00
|
|
|
}
|
2018-11-30 10:40:59 +00:00
|
|
|
ty::Char => {
|
2018-05-21 20:40:38 +01:00
|
|
|
vec![
|
|
|
|
// The valid Unicode Scalar Value ranges.
|
2019-11-07 18:06:48 +00:00
|
|
|
make_range('\u{0000}' as u128, '\u{D7FF}' as u128),
|
|
|
|
make_range('\u{E000}' as u128, '\u{10FFFF}' as u128),
|
2018-05-21 20:40:38 +01:00
|
|
|
]
|
2018-05-19 15:26:07 +01:00
|
|
|
}
|
2019-11-12 14:51:59 +00:00
|
|
|
ty::Int(_) | ty::Uint(_)
|
|
|
|
if pcx.ty.is_ptr_sized_integral()
|
|
|
|
&& !cx.tcx.features().precise_pointer_size_matching =>
|
|
|
|
{
|
|
|
|
// `usize`/`isize` are not allowed to be matched exhaustively unless the
|
|
|
|
// `precise_pointer_size_matching` feature is enabled. So we treat those types like
|
|
|
|
// `#[non_exhaustive]` enums by returning a special unmatcheable constructor.
|
|
|
|
vec![NonExhaustive]
|
|
|
|
}
|
2018-11-30 10:40:59 +00:00
|
|
|
ty::Int(ity) => {
|
2018-11-03 22:57:53 +02:00
|
|
|
let bits = Integer::from_attr(&cx.tcx, SignedInt(ity)).size().bits() as u128;
|
2018-05-23 21:16:07 +01:00
|
|
|
let min = 1u128 << (bits - 1);
|
2019-03-16 01:58:48 +09:00
|
|
|
let max = min - 1;
|
2019-11-07 18:06:48 +00:00
|
|
|
vec![make_range(min, max)]
|
2018-05-19 21:57:25 +01:00
|
|
|
}
|
2018-11-30 10:40:59 +00:00
|
|
|
ty::Uint(uty) => {
|
2019-03-15 21:48:05 +09:00
|
|
|
let size = Integer::from_attr(&cx.tcx, UnsignedInt(uty)).size();
|
2020-06-02 07:59:11 +00:00
|
|
|
let max = truncate(u128::MAX, size);
|
2019-11-07 18:06:48 +00:00
|
|
|
vec![make_range(0, max)]
|
2018-05-19 21:57:25 +01:00
|
|
|
}
|
2016-11-29 15:10:26 +08:00
|
|
|
_ => {
|
2017-02-20 19:18:31 +02:00
|
|
|
if cx.is_uninhabited(pcx.ty) {
|
2016-11-29 15:10:26 +08:00
|
|
|
vec![]
|
|
|
|
} else {
|
|
|
|
vec![Single]
|
|
|
|
}
|
|
|
|
}
|
2019-11-12 12:44:00 +00:00
|
|
|
}
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
2018-05-19 23:19:29 +01:00
|
|
|
/// An inclusive interval, used for precise integer exhaustiveness checking.
|
2018-05-24 10:48:23 +01:00
|
|
|
/// `IntRange`s always store a contiguous range. This means that values are
|
|
|
|
/// encoded such that `0` encodes the minimum value for the integer,
|
|
|
|
/// regardless of the signedness.
|
2019-06-14 12:24:38 -05:00
|
|
|
/// For example, the pattern `-128..=127i8` is encoded as `0..=255`.
|
2018-05-23 21:16:07 +01:00
|
|
|
/// This makes comparisons and arithmetic on interval endpoints much more
|
2018-05-28 20:53:20 +01:00
|
|
|
/// straightforward. See `signed_bias` for details.
|
2018-08-20 23:16:15 +01:00
|
|
|
///
|
|
|
|
/// `IntRange` is never used to encode an empty range or a "range" that wraps
|
2018-11-27 02:59:49 +00:00
|
|
|
/// around the (offset) space: i.e., `range.lo <= range.hi`.
|
2019-08-29 16:06:44 -07:00
|
|
|
#[derive(Clone, Debug)]
|
2018-05-24 10:48:23 +01:00
|
|
|
struct IntRange<'tcx> {
|
2020-01-05 15:46:44 +00:00
|
|
|
range: RangeInclusive<u128>,
|
|
|
|
ty: Ty<'tcx>,
|
|
|
|
span: Span,
|
2018-05-19 23:19:29 +01:00
|
|
|
}
|
|
|
|
|
2018-05-24 10:48:23 +01:00
|
|
|
impl<'tcx> IntRange<'tcx> {
|
2019-10-04 13:16:37 +10:00
|
|
|
#[inline]
|
|
|
|
fn is_integral(ty: Ty<'_>) -> bool {
|
2020-08-03 00:49:11 +02:00
|
|
|
match ty.kind() {
|
2020-10-02 14:19:52 +02:00
|
|
|
ty::Char | ty::Int(_) | ty::Uint(_) | ty::Bool => true,
|
2019-10-04 13:16:37 +10:00
|
|
|
_ => false,
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2019-11-09 13:35:04 +00:00
|
|
|
fn is_singleton(&self) -> bool {
|
|
|
|
self.range.start() == self.range.end()
|
|
|
|
}
|
|
|
|
|
2019-11-09 20:59:10 +00:00
|
|
|
fn boundaries(&self) -> (u128, u128) {
|
|
|
|
(*self.range.start(), *self.range.end())
|
|
|
|
}
|
|
|
|
|
2019-11-12 13:23:27 +00:00
|
|
|
/// Don't treat `usize`/`isize` exhaustively unless the `precise_pointer_size_matching` feature
|
|
|
|
/// is enabled.
|
2019-11-09 21:06:41 +00:00
|
|
|
fn treat_exhaustively(&self, tcx: TyCtxt<'tcx>) -> bool {
|
|
|
|
!self.ty.is_ptr_sized_integral() || tcx.features().precise_pointer_size_matching
|
2019-11-07 17:29:56 +00:00
|
|
|
}
|
|
|
|
|
2019-10-04 14:29:20 +10:00
|
|
|
#[inline]
|
|
|
|
fn integral_size_and_signed_bias(tcx: TyCtxt<'tcx>, ty: Ty<'_>) -> Option<(Size, u128)> {
|
2020-08-03 00:49:11 +02:00
|
|
|
match *ty.kind() {
|
2020-10-02 14:19:52 +02:00
|
|
|
ty::Bool => Some((Size::from_bytes(1), 0)),
|
2019-10-04 14:29:20 +10:00
|
|
|
ty::Char => Some((Size::from_bytes(4), 0)),
|
|
|
|
ty::Int(ity) => {
|
|
|
|
let size = Integer::from_attr(&tcx, SignedInt(ity)).size();
|
|
|
|
Some((size, 1u128 << (size.bits() as u128 - 1)))
|
|
|
|
}
|
|
|
|
ty::Uint(uty) => Some((Integer::from_attr(&tcx, UnsignedInt(uty)).size(), 0)),
|
|
|
|
_ => None,
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2019-10-04 13:16:37 +10:00
|
|
|
#[inline]
|
|
|
|
fn from_const(
|
|
|
|
tcx: TyCtxt<'tcx>,
|
|
|
|
param_env: ty::ParamEnv<'tcx>,
|
|
|
|
value: &Const<'tcx>,
|
2019-08-29 16:06:44 -07:00
|
|
|
span: Span,
|
2019-10-04 13:16:37 +10:00
|
|
|
) -> Option<IntRange<'tcx>> {
|
2019-10-04 14:29:20 +10:00
|
|
|
if let Some((target_size, bias)) = Self::integral_size_and_signed_bias(tcx, value.ty) {
|
2019-10-04 13:16:37 +10:00
|
|
|
let ty = value.ty;
|
2020-02-28 09:15:04 +01:00
|
|
|
let val = (|| {
|
|
|
|
if let ty::ConstKind::Value(ConstValue::Scalar(scalar)) = value.val {
|
|
|
|
// For this specific pattern we can skip a lot of effort and go
|
|
|
|
// straight to the result, after doing a bit of checking. (We
|
|
|
|
// could remove this branch and just fall through, which
|
|
|
|
// is more general but much slower.)
|
|
|
|
if let Ok(bits) = scalar.to_bits_or_ptr(target_size, &tcx) {
|
|
|
|
return Some(bits);
|
|
|
|
}
|
|
|
|
}
|
|
|
|
// This is a more general form of the previous case.
|
|
|
|
value.try_eval_bits(tcx, param_env, ty)
|
|
|
|
})()?;
|
2019-10-04 14:29:20 +10:00
|
|
|
let val = val ^ bias;
|
2019-08-29 16:06:44 -07:00
|
|
|
Some(IntRange { range: val..=val, ty, span })
|
2019-10-04 13:16:37 +10:00
|
|
|
} else {
|
|
|
|
None
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
#[inline]
|
|
|
|
fn from_range(
|
|
|
|
tcx: TyCtxt<'tcx>,
|
|
|
|
lo: u128,
|
|
|
|
hi: u128,
|
|
|
|
ty: Ty<'tcx>,
|
|
|
|
end: &RangeEnd,
|
2019-08-29 16:06:44 -07:00
|
|
|
span: Span,
|
2019-10-04 13:16:37 +10:00
|
|
|
) -> Option<IntRange<'tcx>> {
|
|
|
|
if Self::is_integral(ty) {
|
|
|
|
// Perform a shift if the underlying types are signed,
|
|
|
|
// which makes the interval arithmetic simpler.
|
|
|
|
let bias = IntRange::signed_bias(tcx, ty);
|
|
|
|
let (lo, hi) = (lo ^ bias, hi ^ bias);
|
2019-11-09 21:42:02 +00:00
|
|
|
let offset = (*end == RangeEnd::Excluded) as u128;
|
|
|
|
if lo > hi || (lo == hi && *end == RangeEnd::Excluded) {
|
2019-11-15 17:00:38 +00:00
|
|
|
// This should have been caught earlier by E0030.
|
2019-11-09 21:42:02 +00:00
|
|
|
bug!("malformed range pattern: {}..={}", lo, (hi - offset));
|
2019-10-04 13:16:37 +10:00
|
|
|
}
|
2019-11-09 21:42:02 +00:00
|
|
|
Some(IntRange { range: lo..=(hi - offset), ty, span })
|
2019-10-04 13:16:37 +10:00
|
|
|
} else {
|
|
|
|
None
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2019-03-26 00:13:09 +01:00
|
|
|
fn from_pat(
|
|
|
|
tcx: TyCtxt<'tcx>,
|
|
|
|
param_env: ty::ParamEnv<'tcx>,
|
2019-11-07 18:37:10 +00:00
|
|
|
pat: &Pat<'tcx>,
|
2019-03-26 00:13:09 +01:00
|
|
|
) -> Option<IntRange<'tcx>> {
|
2020-09-22 12:41:30 -07:00
|
|
|
// This MUST be kept in sync with `pat_constructor`.
|
|
|
|
match *pat.kind {
|
|
|
|
PatKind::AscribeUserType { .. } => bug!(), // Handled by `expand_pattern`
|
|
|
|
PatKind::Or { .. } => bug!("Or-pattern should have been expanded earlier on."),
|
|
|
|
|
|
|
|
PatKind::Binding { .. }
|
|
|
|
| PatKind::Wild
|
|
|
|
| PatKind::Leaf { .. }
|
|
|
|
| PatKind::Deref { .. }
|
|
|
|
| PatKind::Variant { .. }
|
|
|
|
| PatKind::Array { .. }
|
|
|
|
| PatKind::Slice { .. } => None,
|
|
|
|
|
|
|
|
PatKind::Constant { value } => Self::from_const(tcx, param_env, value, pat.span),
|
|
|
|
|
|
|
|
PatKind::Range(PatRange { lo, hi, end }) => {
|
|
|
|
let ty = lo.ty;
|
|
|
|
Self::from_range(
|
|
|
|
tcx,
|
|
|
|
lo.eval_bits(tcx, param_env, lo.ty),
|
|
|
|
hi.eval_bits(tcx, param_env, hi.ty),
|
|
|
|
ty,
|
|
|
|
&end,
|
|
|
|
pat.span,
|
|
|
|
)
|
|
|
|
}
|
2019-11-09 21:01:26 +00:00
|
|
|
}
|
2018-08-14 12:45:26 +01:00
|
|
|
}
|
|
|
|
|
|
|
|
// The return value of `signed_bias` should be XORed with an endpoint to encode/decode it.
|
2019-06-14 00:48:52 +03:00
|
|
|
fn signed_bias(tcx: TyCtxt<'tcx>, ty: Ty<'tcx>) -> u128 {
|
2020-08-03 00:49:11 +02:00
|
|
|
match *ty.kind() {
|
2018-08-22 11:54:46 +01:00
|
|
|
ty::Int(ity) => {
|
2018-11-03 22:57:53 +02:00
|
|
|
let bits = Integer::from_attr(&tcx, SignedInt(ity)).size().bits() as u128;
|
2018-05-24 13:30:21 +01:00
|
|
|
1u128 << (bits - 1)
|
2018-05-20 01:54:22 +01:00
|
|
|
}
|
2019-09-21 13:49:14 +02:00
|
|
|
_ => 0,
|
2018-05-20 01:54:22 +01:00
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2019-02-08 14:53:55 +01:00
|
|
|
/// Returns a collection of ranges that spans the values covered by `ranges`, subtracted
|
2018-11-27 02:59:49 +00:00
|
|
|
/// by the values covered by `self`: i.e., `ranges \ self` (in set notation).
|
2019-11-09 20:59:10 +00:00
|
|
|
fn subtract_from(&self, ranges: Vec<IntRange<'tcx>>) -> Vec<IntRange<'tcx>> {
|
2018-05-19 23:19:29 +01:00
|
|
|
let mut remaining_ranges = vec![];
|
2018-08-14 12:45:26 +01:00
|
|
|
let ty = self.ty;
|
2019-11-07 20:03:50 +00:00
|
|
|
let span = self.span;
|
2019-11-09 20:59:10 +00:00
|
|
|
let (lo, hi) = self.boundaries();
|
2018-05-28 20:53:20 +01:00
|
|
|
for subrange in ranges {
|
2019-11-09 10:03:18 +00:00
|
|
|
let (subrange_lo, subrange_hi) = subrange.range.into_inner();
|
2019-09-21 13:49:14 +02:00
|
|
|
if lo > subrange_hi || subrange_lo > hi {
|
2018-05-19 23:19:29 +01:00
|
|
|
// The pattern doesn't intersect with the subrange at all,
|
|
|
|
// so the subrange remains untouched.
|
2019-11-09 10:03:18 +00:00
|
|
|
remaining_ranges.push(IntRange { range: subrange_lo..=subrange_hi, ty, span });
|
2018-05-19 23:19:29 +01:00
|
|
|
} else {
|
2018-05-28 20:53:20 +01:00
|
|
|
if lo > subrange_lo {
|
2018-05-21 21:46:12 +01:00
|
|
|
// The pattern intersects an upper section of the
|
|
|
|
// subrange, so a lower section will remain.
|
2019-11-09 10:03:18 +00:00
|
|
|
remaining_ranges.push(IntRange { range: subrange_lo..=(lo - 1), ty, span });
|
2018-05-21 21:46:12 +01:00
|
|
|
}
|
2018-05-28 20:53:20 +01:00
|
|
|
if hi < subrange_hi {
|
2018-05-21 21:46:12 +01:00
|
|
|
// The pattern intersects a lower section of the
|
|
|
|
// subrange, so an upper section will remain.
|
2019-11-09 10:03:18 +00:00
|
|
|
remaining_ranges.push(IntRange { range: (hi + 1)..=subrange_hi, ty, span });
|
2018-05-21 21:46:12 +01:00
|
|
|
}
|
2018-05-19 23:19:29 +01:00
|
|
|
}
|
|
|
|
}
|
2018-05-28 20:53:20 +01:00
|
|
|
remaining_ranges
|
2018-05-19 23:19:29 +01:00
|
|
|
}
|
2018-08-14 12:45:26 +01:00
|
|
|
|
2019-11-12 13:03:05 +00:00
|
|
|
fn is_subrange(&self, other: &Self) -> bool {
|
|
|
|
other.range.start() <= self.range.start() && self.range.end() <= other.range.end()
|
|
|
|
}
|
|
|
|
|
2019-11-07 19:46:14 +00:00
|
|
|
fn intersection(&self, tcx: TyCtxt<'tcx>, other: &Self) -> Option<Self> {
|
2018-08-14 12:45:26 +01:00
|
|
|
let ty = self.ty;
|
2019-11-09 20:59:10 +00:00
|
|
|
let (lo, hi) = self.boundaries();
|
|
|
|
let (other_lo, other_hi) = other.boundaries();
|
2019-11-09 21:06:41 +00:00
|
|
|
if self.treat_exhaustively(tcx) {
|
2019-11-07 19:46:14 +00:00
|
|
|
if lo <= other_hi && other_lo <= hi {
|
|
|
|
let span = other.span;
|
|
|
|
Some(IntRange { range: max(lo, other_lo)..=min(hi, other_hi), ty, span })
|
|
|
|
} else {
|
|
|
|
None
|
|
|
|
}
|
2018-08-14 12:45:26 +01:00
|
|
|
} else {
|
2019-11-12 12:47:34 +01:00
|
|
|
// If the range should not be treated exhaustively, fallback to checking for inclusion.
|
2019-11-12 13:03:05 +00:00
|
|
|
if self.is_subrange(other) { Some(self.clone()) } else { None }
|
2018-08-14 12:45:26 +01:00
|
|
|
}
|
|
|
|
}
|
2019-10-06 21:47:01 -07:00
|
|
|
|
|
|
|
fn suspicious_intersection(&self, other: &Self) -> bool {
|
|
|
|
// `false` in the following cases:
|
2019-10-16 12:22:23 -07:00
|
|
|
// 1 ---- // 1 ---------- // 1 ---- // 1 ----
|
|
|
|
// 2 ---------- // 2 ---- // 2 ---- // 2 ----
|
2019-10-06 21:47:01 -07:00
|
|
|
//
|
2019-10-16 12:22:23 -07:00
|
|
|
// The following are currently `false`, but could be `true` in the future (#64007):
|
|
|
|
// 1 --------- // 1 ---------
|
|
|
|
// 2 ---------- // 2 ----------
|
2019-10-06 21:47:01 -07:00
|
|
|
//
|
|
|
|
// `true` in the following cases:
|
2019-10-16 12:22:23 -07:00
|
|
|
// 1 ------- // 1 -------
|
|
|
|
// 2 -------- // 2 -------
|
2019-11-09 20:59:10 +00:00
|
|
|
let (lo, hi) = self.boundaries();
|
|
|
|
let (other_lo, other_hi) = other.boundaries();
|
2020-01-23 00:42:35 -05:00
|
|
|
lo == other_hi || hi == other_lo
|
2019-10-06 21:47:01 -07:00
|
|
|
}
|
2019-11-09 10:31:53 +00:00
|
|
|
|
2019-11-09 12:01:47 +00:00
|
|
|
fn to_pat(&self, tcx: TyCtxt<'tcx>) -> Pat<'tcx> {
|
2019-11-09 20:59:10 +00:00
|
|
|
let (lo, hi) = self.boundaries();
|
2019-11-09 10:31:53 +00:00
|
|
|
|
|
|
|
let bias = IntRange::signed_bias(tcx, self.ty);
|
|
|
|
let (lo, hi) = (lo ^ bias, hi ^ bias);
|
|
|
|
|
|
|
|
let ty = ty::ParamEnv::empty().and(self.ty);
|
|
|
|
let lo_const = ty::Const::from_bits(tcx, lo, ty);
|
|
|
|
let hi_const = ty::Const::from_bits(tcx, hi, ty);
|
|
|
|
|
2019-11-09 12:01:47 +00:00
|
|
|
let kind = if lo == hi {
|
|
|
|
PatKind::Constant { value: lo_const }
|
2019-11-09 10:31:53 +00:00
|
|
|
} else {
|
2019-11-09 12:01:47 +00:00
|
|
|
PatKind::Range(PatRange { lo: lo_const, hi: hi_const, end: RangeEnd::Included })
|
|
|
|
};
|
|
|
|
|
|
|
|
// This is a brand new pattern, so we don't reuse `self.span`.
|
|
|
|
Pat { ty: self.ty, span: DUMMY_SP, kind: Box::new(kind) }
|
2019-11-09 10:31:53 +00:00
|
|
|
}
|
2020-10-25 21:59:59 +00:00
|
|
|
|
|
|
|
/// For exhaustive integer matching, some constructors are grouped within other constructors
|
|
|
|
/// (namely integer typed values are grouped within ranges). However, when specialising these
|
|
|
|
/// constructors, we want to be specialising for the underlying constructors (the integers), not
|
|
|
|
/// the groups (the ranges). Thus we need to split the groups up. Splitting them up naïvely would
|
|
|
|
/// mean creating a separate constructor for every single value in the range, which is clearly
|
|
|
|
/// impractical. However, observe that for some ranges of integers, the specialisation will be
|
|
|
|
/// identical across all values in that range (i.e., there are equivalence classes of ranges of
|
|
|
|
/// constructors based on their `U(S(c, P), S(c, p))` outcome). These classes are grouped by
|
|
|
|
/// the patterns that apply to them (in the matrix `P`). We can split the range whenever the
|
|
|
|
/// patterns that apply to that range (specifically: the patterns that *intersect* with that range)
|
|
|
|
/// change.
|
|
|
|
/// Our solution, therefore, is to split the range constructor into subranges at every single point
|
|
|
|
/// the group of intersecting patterns changes (using the method described below).
|
|
|
|
/// And voilà! We're testing precisely those ranges that we need to, without any exhaustive matching
|
|
|
|
/// on actual integers. The nice thing about this is that the number of subranges is linear in the
|
|
|
|
/// number of rows in the matrix (i.e., the number of cases in the `match` statement), so we don't
|
|
|
|
/// need to be worried about matching over gargantuan ranges.
|
|
|
|
///
|
|
|
|
/// Essentially, given the first column of a matrix representing ranges, looking like the following:
|
|
|
|
///
|
|
|
|
/// |------| |----------| |-------| ||
|
|
|
|
/// |-------| |-------| |----| ||
|
|
|
|
/// |---------|
|
|
|
|
///
|
|
|
|
/// We split the ranges up into equivalence classes so the ranges are no longer overlapping:
|
|
|
|
///
|
|
|
|
/// |--|--|||-||||--||---|||-------| |-|||| ||
|
|
|
|
///
|
|
|
|
/// The logic for determining how to split the ranges is fairly straightforward: we calculate
|
|
|
|
/// boundaries for each interval range, sort them, then create constructors for each new interval
|
|
|
|
/// between every pair of boundary points. (This essentially sums up to performing the intuitive
|
|
|
|
/// merging operation depicted above.)
|
|
|
|
fn split<'p>(
|
|
|
|
self,
|
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
|
|
|
pcx: PatCtxt<'tcx>,
|
|
|
|
matrix: &Matrix<'p, 'tcx>,
|
|
|
|
hir_id: Option<HirId>,
|
|
|
|
) -> SmallVec<[Constructor<'tcx>; 1]> {
|
|
|
|
let ty = pcx.ty;
|
|
|
|
|
|
|
|
/// Represents a border between 2 integers. Because the intervals spanning borders
|
|
|
|
/// must be able to cover every integer, we need to be able to represent
|
|
|
|
/// 2^128 + 1 such borders.
|
|
|
|
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Debug)]
|
|
|
|
enum Border {
|
|
|
|
JustBefore(u128),
|
|
|
|
AfterMax,
|
|
|
|
}
|
|
|
|
|
|
|
|
// A function for extracting the borders of an integer interval.
|
|
|
|
fn range_borders(r: IntRange<'_>) -> impl Iterator<Item = Border> {
|
|
|
|
let (lo, hi) = r.range.into_inner();
|
|
|
|
let from = Border::JustBefore(lo);
|
|
|
|
let to = match hi.checked_add(1) {
|
|
|
|
Some(m) => Border::JustBefore(m),
|
|
|
|
None => Border::AfterMax,
|
|
|
|
};
|
|
|
|
vec![from, to].into_iter()
|
|
|
|
}
|
|
|
|
|
|
|
|
// Collect the span and range of all the intersecting ranges to lint on likely
|
|
|
|
// incorrect range patterns. (#63987)
|
|
|
|
let mut overlaps = vec![];
|
|
|
|
// `borders` is the set of borders between equivalence classes: each equivalence
|
|
|
|
// class lies between 2 borders.
|
|
|
|
let row_borders = matrix
|
|
|
|
.patterns
|
|
|
|
.iter()
|
|
|
|
.flat_map(|row| {
|
|
|
|
IntRange::from_pat(cx.tcx, cx.param_env, row.head()).map(|r| (r, row.len()))
|
|
|
|
})
|
|
|
|
.flat_map(|(range, row_len)| {
|
|
|
|
let intersection = self.intersection(cx.tcx, &range);
|
|
|
|
let should_lint = self.suspicious_intersection(&range);
|
|
|
|
if let (Some(range), 1, true) = (&intersection, row_len, should_lint) {
|
|
|
|
// FIXME: for now, only check for overlapping ranges on simple range
|
|
|
|
// patterns. Otherwise with the current logic the following is detected
|
|
|
|
// as overlapping:
|
|
|
|
// match (10u8, true) {
|
|
|
|
// (0 ..= 125, false) => {}
|
|
|
|
// (126 ..= 255, false) => {}
|
|
|
|
// (0 ..= 255, true) => {}
|
|
|
|
// }
|
|
|
|
overlaps.push(range.clone());
|
|
|
|
}
|
|
|
|
intersection
|
|
|
|
})
|
|
|
|
.flat_map(range_borders);
|
|
|
|
let self_borders = range_borders(self.clone());
|
|
|
|
let mut borders: Vec<_> = row_borders.chain(self_borders).collect();
|
|
|
|
borders.sort_unstable();
|
|
|
|
|
|
|
|
self.lint_overlapping_patterns(cx.tcx, hir_id, ty, overlaps);
|
|
|
|
|
|
|
|
// We're going to iterate through every adjacent pair of borders, making sure that
|
|
|
|
// each represents an interval of nonnegative length, and convert each such
|
|
|
|
// interval into a constructor.
|
|
|
|
borders
|
|
|
|
.array_windows()
|
|
|
|
.filter_map(|&pair| match pair {
|
|
|
|
[Border::JustBefore(n), Border::JustBefore(m)] => {
|
|
|
|
if n < m {
|
|
|
|
Some(n..=(m - 1))
|
|
|
|
} else {
|
|
|
|
None
|
|
|
|
}
|
|
|
|
}
|
|
|
|
[Border::JustBefore(n), Border::AfterMax] => Some(n..=u128::MAX),
|
|
|
|
[Border::AfterMax, _] => None,
|
|
|
|
})
|
|
|
|
.map(|range| IntRange { range, ty, span: pcx.span })
|
|
|
|
.map(IntRange)
|
|
|
|
.collect()
|
|
|
|
}
|
|
|
|
|
|
|
|
fn lint_overlapping_patterns(
|
|
|
|
self,
|
|
|
|
tcx: TyCtxt<'tcx>,
|
|
|
|
hir_id: Option<HirId>,
|
|
|
|
ty: Ty<'tcx>,
|
|
|
|
overlaps: Vec<IntRange<'tcx>>,
|
|
|
|
) {
|
|
|
|
if let (true, Some(hir_id)) = (!overlaps.is_empty(), hir_id) {
|
|
|
|
tcx.struct_span_lint_hir(
|
|
|
|
lint::builtin::OVERLAPPING_PATTERNS,
|
|
|
|
hir_id,
|
|
|
|
self.span,
|
|
|
|
|lint| {
|
|
|
|
let mut err = lint.build("multiple patterns covering the same range");
|
|
|
|
err.span_label(self.span, "overlapping patterns");
|
|
|
|
for int_range in overlaps {
|
|
|
|
// Use the real type for user display of the ranges:
|
|
|
|
err.span_label(
|
|
|
|
int_range.span,
|
|
|
|
&format!(
|
|
|
|
"this range overlaps on `{}`",
|
|
|
|
IntRange { range: int_range.range, ty, span: DUMMY_SP }.to_pat(tcx),
|
|
|
|
),
|
|
|
|
);
|
|
|
|
}
|
|
|
|
err.emit();
|
|
|
|
},
|
|
|
|
);
|
|
|
|
}
|
|
|
|
}
|
2018-05-19 23:19:29 +01:00
|
|
|
}
|
|
|
|
|
2019-11-12 13:23:27 +00:00
|
|
|
/// Ignore spans when comparing, they don't carry semantic information as they are only for lints.
|
2019-11-09 10:03:18 +00:00
|
|
|
impl<'tcx> std::cmp::PartialEq for IntRange<'tcx> {
|
|
|
|
fn eq(&self, other: &Self) -> bool {
|
|
|
|
self.range == other.range && self.ty == other.ty
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
2019-10-27 16:36:50 +00:00
|
|
|
// A struct to compute a set of constructors equivalent to `all_ctors \ used_ctors`.
|
|
|
|
struct MissingConstructors<'tcx> {
|
|
|
|
all_ctors: Vec<Constructor<'tcx>>,
|
|
|
|
used_ctors: Vec<Constructor<'tcx>>,
|
2018-10-18 09:11:11 +11:00
|
|
|
}
|
|
|
|
|
2019-10-27 16:36:50 +00:00
|
|
|
impl<'tcx> MissingConstructors<'tcx> {
|
2019-11-09 21:01:26 +00:00
|
|
|
fn new(all_ctors: Vec<Constructor<'tcx>>, used_ctors: Vec<Constructor<'tcx>>) -> Self {
|
|
|
|
MissingConstructors { all_ctors, used_ctors }
|
2019-10-27 16:36:50 +00:00
|
|
|
}
|
2018-10-18 09:11:11 +11:00
|
|
|
|
2019-10-27 16:36:50 +00:00
|
|
|
fn into_inner(self) -> (Vec<Constructor<'tcx>>, Vec<Constructor<'tcx>>) {
|
|
|
|
(self.all_ctors, self.used_ctors)
|
|
|
|
}
|
2018-10-18 09:11:11 +11:00
|
|
|
|
2019-10-27 16:36:50 +00:00
|
|
|
fn is_empty(&self) -> bool {
|
|
|
|
self.iter().next().is_none()
|
|
|
|
}
|
|
|
|
/// Whether this contains all the constructors for the given type or only a
|
|
|
|
/// subset.
|
|
|
|
fn all_ctors_are_missing(&self) -> bool {
|
|
|
|
self.used_ctors.is_empty()
|
|
|
|
}
|
2018-08-19 23:10:18 +01:00
|
|
|
|
2019-10-27 16:36:50 +00:00
|
|
|
/// Iterate over all_ctors \ used_ctors
|
|
|
|
fn iter<'a>(&'a self) -> impl Iterator<Item = Constructor<'tcx>> + Captures<'a> {
|
2019-11-09 21:01:26 +00:00
|
|
|
self.all_ctors.iter().flat_map(move |req_ctor| req_ctor.subtract_ctors(&self.used_ctors))
|
2018-08-19 23:10:18 +01:00
|
|
|
}
|
2019-10-27 16:36:50 +00:00
|
|
|
}
|
2018-08-19 23:10:18 +01:00
|
|
|
|
2019-10-27 16:36:50 +00:00
|
|
|
impl<'tcx> fmt::Debug for MissingConstructors<'tcx> {
|
|
|
|
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
|
|
|
|
let ctors: Vec<_> = self.iter().collect();
|
|
|
|
write!(f, "{:?}", ctors)
|
2018-10-18 09:11:11 +11:00
|
|
|
}
|
2018-08-19 23:10:18 +01:00
|
|
|
}
|
|
|
|
|
2019-02-08 14:53:55 +01:00
|
|
|
/// Algorithm from http://moscova.inria.fr/~maranget/papers/warn/index.html.
|
2016-12-01 01:12:03 +08:00
|
|
|
/// The algorithm from the paper has been modified to correctly handle empty
|
|
|
|
/// types. The changes are:
|
|
|
|
/// (0) We don't exit early if the pattern matrix has zero rows. We just
|
|
|
|
/// continue to recurse over columns.
|
|
|
|
/// (1) all_constructors will only return constructors that are statically
|
2019-02-08 14:53:55 +01:00
|
|
|
/// possible. E.g., it will only return `Ok` for `Result<T, !>`.
|
2016-09-26 02:53:26 +03:00
|
|
|
///
|
2017-12-25 18:14:50 +02:00
|
|
|
/// This finds whether a (row) vector `v` of patterns is 'useful' in relation
|
2018-01-13 23:41:11 +02:00
|
|
|
/// to a set of such vectors `m` - this is defined as there being a set of
|
|
|
|
/// inputs that will match `v` but not any of the sets in `m`.
|
2017-12-25 18:14:50 +02:00
|
|
|
///
|
2020-05-09 13:46:05 +01:00
|
|
|
/// All the patterns at each column of the `matrix ++ v` matrix must have the same type.
|
2016-09-26 02:53:26 +03:00
|
|
|
///
|
|
|
|
/// This is used both for reachability checking (if a pattern isn't useful in
|
|
|
|
/// relation to preceding patterns, it is not reachable) and exhaustiveness
|
|
|
|
/// checking (if a wildcard pattern is useful in relation to a matrix, the
|
|
|
|
/// matrix isn't exhaustive).
|
2020-03-25 20:07:01 -03:00
|
|
|
///
|
|
|
|
/// `is_under_guard` is used to inform if the pattern has a guard. If it
|
|
|
|
/// has one it must not be inserted into the matrix. This shouldn't be
|
|
|
|
/// relied on for soundness.
|
2020-01-05 15:46:44 +00:00
|
|
|
crate fn is_useful<'p, 'tcx>(
|
2020-10-25 21:59:59 +00:00
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
2019-06-16 12:41:24 +03:00
|
|
|
matrix: &Matrix<'p, 'tcx>,
|
2019-11-28 13:03:02 +00:00
|
|
|
v: &PatStack<'p, 'tcx>,
|
2019-09-23 17:44:24 +02:00
|
|
|
witness_preference: WitnessPreference,
|
2019-08-29 16:06:44 -07:00
|
|
|
hir_id: HirId,
|
2020-03-25 20:07:01 -03:00
|
|
|
is_under_guard: bool,
|
2019-12-03 16:15:25 +00:00
|
|
|
is_top_level: bool,
|
2020-07-02 21:03:59 +01:00
|
|
|
) -> Usefulness<'tcx> {
|
2020-09-19 22:00:10 +09:00
|
|
|
let Matrix { patterns: rows, .. } = matrix;
|
2018-01-16 09:24:38 +01:00
|
|
|
debug!("is_useful({:#?}, {:#?})", matrix, v);
|
2016-09-24 20:45:59 +03:00
|
|
|
|
2016-11-28 18:38:27 +08:00
|
|
|
// The base case. We are pattern-matching on () and the return value is
|
|
|
|
// based on whether our matrix has a row or not.
|
|
|
|
// NOTE: This could potentially be optimized by checking rows.is_empty()
|
|
|
|
// first and then, if v is non-empty, the return value is based on whether
|
|
|
|
// the type of the tuple we're checking is inhabited or not.
|
|
|
|
if v.is_empty() {
|
|
|
|
return if rows.is_empty() {
|
2019-10-27 17:07:05 +00:00
|
|
|
Usefulness::new_useful(witness_preference)
|
2016-12-01 11:56:55 +08:00
|
|
|
} else {
|
2016-11-28 18:38:27 +08:00
|
|
|
NotUseful
|
2019-09-21 13:49:14 +02:00
|
|
|
};
|
2016-11-28 18:38:27 +08:00
|
|
|
};
|
|
|
|
|
2016-10-26 22:38:22 +03:00
|
|
|
assert!(rows.iter().all(|r| r.len() == v.len()));
|
2016-11-05 13:32:35 +02:00
|
|
|
|
2019-11-21 18:45:28 +00:00
|
|
|
// If the first pattern is an or-pattern, expand it.
|
|
|
|
if let Some(vs) = v.expand_or_pat() {
|
2019-11-28 16:56:45 +00:00
|
|
|
// We need to push the already-seen patterns into the matrix in order to detect redundant
|
|
|
|
// branches like `Some(_) | Some(0)`. We also keep track of the unreachable subpatterns.
|
|
|
|
let mut matrix = matrix.clone();
|
2020-07-02 21:49:58 +01:00
|
|
|
// `Vec` of all the unreachable branches of the current or-pattern.
|
|
|
|
let mut unreachable_branches = Vec::new();
|
|
|
|
// Subpatterns that are unreachable from all branches. E.g. in the following case, the last
|
|
|
|
// `true` is unreachable only from one branch, so it is overall reachable.
|
2020-10-18 21:54:10 -07:00
|
|
|
//
|
2020-07-02 21:49:58 +01:00
|
|
|
// ```
|
|
|
|
// match (true, true) {
|
|
|
|
// (true, true) => {}
|
|
|
|
// (false | true, false | true) => {}
|
|
|
|
// }
|
|
|
|
// ```
|
|
|
|
let mut unreachable_subpats = FxHashSet::default();
|
|
|
|
// Whether any branch at all is useful.
|
2019-11-28 16:56:45 +00:00
|
|
|
let mut any_is_useful = false;
|
2020-07-02 21:49:58 +01:00
|
|
|
|
2019-11-28 16:56:45 +00:00
|
|
|
for v in vs {
|
2020-03-25 20:07:01 -03:00
|
|
|
let res = is_useful(cx, &matrix, &v, witness_preference, hir_id, is_under_guard, false);
|
2019-11-28 16:56:45 +00:00
|
|
|
match res {
|
|
|
|
Useful(pats) => {
|
2020-07-02 21:49:58 +01:00
|
|
|
if !any_is_useful {
|
|
|
|
any_is_useful = true;
|
|
|
|
// Initialize with the first set of unreachable subpatterns encountered.
|
|
|
|
unreachable_subpats = pats.into_iter().collect();
|
|
|
|
} else {
|
|
|
|
// Keep the patterns unreachable from both this and previous branches.
|
|
|
|
unreachable_subpats =
|
|
|
|
pats.into_iter().filter(|p| unreachable_subpats.contains(p)).collect();
|
|
|
|
}
|
2019-11-28 16:56:45 +00:00
|
|
|
}
|
2020-07-02 21:49:58 +01:00
|
|
|
NotUseful => unreachable_branches.push(v.head().span),
|
2019-11-28 16:56:45 +00:00
|
|
|
UsefulWithWitness(_) => {
|
|
|
|
bug!("Encountered or-pat in `v` during exhaustiveness checking")
|
|
|
|
}
|
|
|
|
}
|
2020-03-25 20:07:01 -03:00
|
|
|
// If pattern has a guard don't add it to the matrix
|
|
|
|
if !is_under_guard {
|
|
|
|
matrix.push(v);
|
|
|
|
}
|
2019-11-28 16:56:45 +00:00
|
|
|
}
|
2020-07-02 21:49:58 +01:00
|
|
|
if any_is_useful {
|
|
|
|
// Collect all the unreachable patterns.
|
|
|
|
unreachable_branches.extend(unreachable_subpats);
|
|
|
|
return Useful(unreachable_branches);
|
|
|
|
} else {
|
|
|
|
return NotUseful;
|
|
|
|
}
|
2019-11-21 18:45:28 +00:00
|
|
|
}
|
|
|
|
|
2020-05-23 18:59:27 +01:00
|
|
|
// FIXME(Nadrieril): Hack to work around type normalization issues (see #72476).
|
|
|
|
let ty = matrix.heads().next().map(|r| r.ty).unwrap_or(v.head().ty);
|
|
|
|
let pcx = PatCtxt { ty, span: v.head().span };
|
2016-09-24 18:24:34 +03:00
|
|
|
|
2019-11-01 15:44:58 +00:00
|
|
|
debug!("is_useful_expand_first_col: pcx={:#?}, expanding {:#?}", pcx, v.head());
|
2016-09-24 18:24:34 +03:00
|
|
|
|
2020-05-23 13:11:28 +01:00
|
|
|
let ret = if let Some(constructor) = pat_constructor(cx.tcx, cx.param_env, v.head()) {
|
2019-11-05 16:16:47 +00:00
|
|
|
debug!("is_useful - expanding constructor: {:#?}", constructor);
|
2020-10-25 21:59:59 +00:00
|
|
|
constructor
|
|
|
|
.split(cx, pcx, matrix, Some(hir_id))
|
|
|
|
.into_iter()
|
|
|
|
.map(|c| {
|
|
|
|
is_useful_specialized(
|
|
|
|
cx,
|
|
|
|
matrix,
|
|
|
|
v,
|
|
|
|
c,
|
|
|
|
pcx.ty,
|
|
|
|
witness_preference,
|
|
|
|
hir_id,
|
|
|
|
is_under_guard,
|
|
|
|
)
|
|
|
|
})
|
|
|
|
.find(|result| result.is_useful())
|
|
|
|
.unwrap_or(NotUseful)
|
2016-09-24 20:45:59 +03:00
|
|
|
} else {
|
|
|
|
debug!("is_useful - expanding wildcard");
|
2016-12-01 11:37:03 +08:00
|
|
|
|
2019-09-23 16:07:23 +02:00
|
|
|
let used_ctors: Vec<Constructor<'_>> =
|
2019-11-07 18:37:10 +00:00
|
|
|
matrix.heads().filter_map(|p| pat_constructor(cx.tcx, cx.param_env, p)).collect();
|
2020-05-23 13:11:28 +01:00
|
|
|
debug!("is_useful_used_ctors = {:#?}", used_ctors);
|
2018-05-19 23:19:29 +01:00
|
|
|
// `all_ctors` are all the constructors for the given type, which
|
|
|
|
// should all be represented (or caught with the wild pattern `_`).
|
2018-05-29 12:47:39 +01:00
|
|
|
let all_ctors = all_constructors(cx, pcx);
|
2020-05-23 13:11:28 +01:00
|
|
|
debug!("is_useful_all_ctors = {:#?}", all_ctors);
|
2018-05-19 15:26:07 +01:00
|
|
|
|
2016-12-01 11:37:03 +08:00
|
|
|
// `missing_ctors` is the set of constructors from the same type as the
|
|
|
|
// first column of `matrix` that are matched only by wildcard patterns
|
|
|
|
// from the first column.
|
|
|
|
//
|
|
|
|
// Therefore, if there is some pattern that is unmatched by `matrix`,
|
|
|
|
// it will still be unmatched if the first constructor is replaced by
|
|
|
|
// any of the constructors in `missing_ctors`
|
2017-02-20 19:18:31 +02:00
|
|
|
|
2019-11-12 12:44:00 +00:00
|
|
|
// Missing constructors are those that are not matched by any non-wildcard patterns in the
|
|
|
|
// current column. We only fully construct them on-demand, because they're rarely used and
|
|
|
|
// can be big.
|
2019-11-09 21:01:26 +00:00
|
|
|
let missing_ctors = MissingConstructors::new(all_ctors, used_ctors);
|
2017-02-20 19:18:31 +02:00
|
|
|
|
2020-05-23 13:11:28 +01:00
|
|
|
debug!("is_useful_missing_ctors.empty()={:#?}", missing_ctors.is_empty(),);
|
2017-11-03 19:17:54 +00:00
|
|
|
|
2019-11-12 12:44:00 +00:00
|
|
|
if missing_ctors.is_empty() {
|
2019-10-27 16:36:50 +00:00
|
|
|
let (all_ctors, _) = missing_ctors.into_inner();
|
2020-10-25 21:59:59 +00:00
|
|
|
all_ctors
|
2019-11-05 17:57:55 +00:00
|
|
|
.into_iter()
|
2020-10-25 21:59:59 +00:00
|
|
|
.flat_map(|ctor| ctor.split(cx, pcx, matrix, None))
|
2019-11-05 17:57:55 +00:00
|
|
|
.map(|c| {
|
2020-03-25 20:07:01 -03:00
|
|
|
is_useful_specialized(
|
|
|
|
cx,
|
|
|
|
matrix,
|
|
|
|
v,
|
|
|
|
c,
|
|
|
|
pcx.ty,
|
|
|
|
witness_preference,
|
|
|
|
hir_id,
|
|
|
|
is_under_guard,
|
|
|
|
)
|
2019-11-05 17:57:55 +00:00
|
|
|
})
|
|
|
|
.find(|result| result.is_useful())
|
|
|
|
.unwrap_or(NotUseful)
|
2016-09-24 18:24:34 +03:00
|
|
|
} else {
|
2019-11-01 16:33:34 +00:00
|
|
|
let matrix = matrix.specialize_wildcard();
|
|
|
|
let v = v.to_tail();
|
2020-03-25 20:07:01 -03:00
|
|
|
let usefulness =
|
|
|
|
is_useful(cx, &matrix, &v, witness_preference, hir_id, is_under_guard, false);
|
2019-10-27 17:07:05 +00:00
|
|
|
|
|
|
|
// In this case, there's at least one "free"
|
|
|
|
// constructor that is only matched against by
|
|
|
|
// wildcard patterns.
|
|
|
|
//
|
|
|
|
// There are 2 ways we can report a witness here.
|
|
|
|
// Commonly, we can report all the "free"
|
|
|
|
// constructors as witnesses, e.g., if we have:
|
|
|
|
//
|
|
|
|
// ```
|
|
|
|
// enum Direction { N, S, E, W }
|
|
|
|
// let Direction::N = ...;
|
|
|
|
// ```
|
|
|
|
//
|
|
|
|
// we can report 3 witnesses: `S`, `E`, and `W`.
|
|
|
|
//
|
2019-11-12 12:44:00 +00:00
|
|
|
// However, there is a case where we don't want
|
2019-10-27 17:07:05 +00:00
|
|
|
// to do this and instead report a single `_` witness:
|
2019-11-12 12:44:00 +00:00
|
|
|
// if the user didn't actually specify a constructor
|
2019-10-27 17:07:05 +00:00
|
|
|
// in this arm, e.g., in
|
2020-10-18 21:54:10 -07:00
|
|
|
//
|
2019-10-27 17:07:05 +00:00
|
|
|
// ```
|
|
|
|
// let x: (Direction, Direction, bool) = ...;
|
|
|
|
// let (_, _, false) = x;
|
|
|
|
// ```
|
2020-10-18 21:54:10 -07:00
|
|
|
//
|
2019-10-27 17:07:05 +00:00
|
|
|
// we don't want to show all 16 possible witnesses
|
|
|
|
// `(<direction-1>, <direction-2>, true)` - we are
|
|
|
|
// satisfied with `(_, _, true)`. In this case,
|
|
|
|
// `used_ctors` is empty.
|
2019-12-03 16:15:25 +00:00
|
|
|
// The exception is: if we are at the top-level, for example in an empty match, we
|
2019-12-11 18:17:58 +00:00
|
|
|
// sometimes prefer reporting the list of constructors instead of just `_`.
|
|
|
|
let report_ctors_rather_than_wildcard = is_top_level && !IntRange::is_integral(pcx.ty);
|
|
|
|
if missing_ctors.all_ctors_are_missing() && !report_ctors_rather_than_wildcard {
|
2019-10-27 17:07:05 +00:00
|
|
|
// All constructors are unused. Add a wild pattern
|
|
|
|
// rather than each individual constructor.
|
|
|
|
usefulness.apply_wildcard(pcx.ty)
|
|
|
|
} else {
|
|
|
|
// Construct for each missing constructor a "wild" version of this
|
|
|
|
// constructor, that matches everything that can be built with
|
|
|
|
// it. For example, if `ctor` is a `Constructor::Variant` for
|
|
|
|
// `Option::Some`, we get the pattern `Some(_)`.
|
|
|
|
usefulness.apply_missing_ctors(cx, pcx.ty, &missing_ctors)
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
}
|
2020-05-23 13:11:28 +01:00
|
|
|
};
|
|
|
|
debug!("is_useful::returns({:#?}, {:#?}) = {:?}", matrix, v, ret);
|
|
|
|
ret
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
2018-11-27 02:59:49 +00:00
|
|
|
/// A shorthand for the `U(S(c, P), S(c, q))` operation from the paper. I.e., `is_useful` applied
|
2018-08-14 12:45:26 +01:00
|
|
|
/// to the specialised version of both the pattern matrix `P` and the new pattern `q`.
|
2019-11-28 13:03:02 +00:00
|
|
|
fn is_useful_specialized<'p, 'tcx>(
|
2020-10-25 21:59:59 +00:00
|
|
|
cx: &MatchCheckCtxt<'p, 'tcx>,
|
2019-11-01 16:33:34 +00:00
|
|
|
matrix: &Matrix<'p, 'tcx>,
|
2019-11-28 13:03:02 +00:00
|
|
|
v: &PatStack<'p, 'tcx>,
|
2017-02-15 15:00:20 +02:00
|
|
|
ctor: Constructor<'tcx>,
|
2020-05-09 12:46:42 +01:00
|
|
|
ty: Ty<'tcx>,
|
2019-09-23 17:44:24 +02:00
|
|
|
witness_preference: WitnessPreference,
|
2019-08-29 16:06:44 -07:00
|
|
|
hir_id: HirId,
|
2020-03-25 20:07:01 -03:00
|
|
|
is_under_guard: bool,
|
2020-07-02 21:03:59 +01:00
|
|
|
) -> Usefulness<'tcx> {
|
2020-05-09 12:46:42 +01:00
|
|
|
debug!("is_useful_specialized({:#?}, {:#?}, {:?})", v, ctor, ty);
|
2019-09-23 16:07:23 +02:00
|
|
|
|
2020-05-09 12:46:42 +01:00
|
|
|
// We cache the result of `Fields::wildcards` because it is used a lot.
|
|
|
|
let ctor_wild_subpatterns = Fields::wildcards(cx, &ctor, ty);
|
2020-05-09 11:32:54 +01:00
|
|
|
let matrix = matrix.specialize_constructor(cx, &ctor, &ctor_wild_subpatterns);
|
2020-10-18 13:48:54 +01:00
|
|
|
v.specialize_constructor(cx, &ctor, &ctor_wild_subpatterns, true)
|
2020-03-25 20:07:01 -03:00
|
|
|
.map(|v| is_useful(cx, &matrix, &v, witness_preference, hir_id, is_under_guard, false))
|
2020-05-09 12:46:42 +01:00
|
|
|
.map(|u| u.apply_constructor(cx, &ctor, ty, &ctor_wild_subpatterns))
|
2019-10-27 17:07:05 +00:00
|
|
|
.unwrap_or(NotUseful)
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
|
2019-11-05 16:16:47 +00:00
|
|
|
/// Determines the constructor that the given pattern can be specialized to.
|
2019-02-08 14:53:55 +01:00
|
|
|
/// Returns `None` in case of a catch-all, which can't be specialized.
|
2019-11-05 16:16:47 +00:00
|
|
|
fn pat_constructor<'tcx>(
|
2019-11-07 18:37:10 +00:00
|
|
|
tcx: TyCtxt<'tcx>,
|
|
|
|
param_env: ty::ParamEnv<'tcx>,
|
2019-08-29 16:06:44 -07:00
|
|
|
pat: &Pat<'tcx>,
|
2019-11-05 16:16:47 +00:00
|
|
|
) -> Option<Constructor<'tcx>> {
|
2020-09-22 12:41:30 -07:00
|
|
|
// This MUST be kept in sync with `IntRange::from_pat`.
|
2016-09-26 02:53:26 +03:00
|
|
|
match *pat.kind {
|
2019-11-17 23:05:50 +00:00
|
|
|
PatKind::AscribeUserType { .. } => bug!(), // Handled by `expand_pattern`
|
2019-09-26 18:42:24 +01:00
|
|
|
PatKind::Binding { .. } | PatKind::Wild => None,
|
2019-11-05 16:16:47 +00:00
|
|
|
PatKind::Leaf { .. } | PatKind::Deref { .. } => Some(Single),
|
2019-09-26 18:42:24 +01:00
|
|
|
PatKind::Variant { adt_def, variant_index, .. } => {
|
2019-11-05 16:16:47 +00:00
|
|
|
Some(Variant(adt_def.variants[variant_index].def_id))
|
2018-08-13 16:23:14 +01:00
|
|
|
}
|
2019-11-09 13:35:04 +00:00
|
|
|
PatKind::Constant { value } => {
|
|
|
|
if let Some(int_range) = IntRange::from_const(tcx, param_env, value, pat.span) {
|
|
|
|
Some(IntRange(int_range))
|
|
|
|
} else {
|
2020-10-17 23:12:28 +01:00
|
|
|
match value.ty.kind() {
|
|
|
|
ty::Float(_) => Some(FloatRange(value, value, RangeEnd::Included)),
|
2020-10-17 23:18:05 +01:00
|
|
|
ty::Ref(_, t, _) if t.is_str() => Some(Str(value)),
|
2020-10-18 17:05:19 +01:00
|
|
|
// All constants that can be structurally matched have already been expanded
|
|
|
|
// into the corresponding `Pat`s by `const_to_pat`. Constants that remain are
|
|
|
|
// opaque.
|
2020-10-18 13:48:54 +01:00
|
|
|
_ => Some(Opaque),
|
2019-11-17 22:43:06 +00:00
|
|
|
}
|
2019-11-09 13:35:04 +00:00
|
|
|
}
|
|
|
|
}
|
|
|
|
PatKind::Range(PatRange { lo, hi, end }) => {
|
|
|
|
let ty = lo.ty;
|
2019-11-09 21:09:12 +00:00
|
|
|
if let Some(int_range) = IntRange::from_range(
|
|
|
|
tcx,
|
|
|
|
lo.eval_bits(tcx, param_env, lo.ty),
|
|
|
|
hi.eval_bits(tcx, param_env, hi.ty),
|
|
|
|
ty,
|
|
|
|
&end,
|
|
|
|
pat.span,
|
|
|
|
) {
|
2019-11-09 13:35:04 +00:00
|
|
|
Some(IntRange(int_range))
|
|
|
|
} else {
|
2019-11-15 17:00:38 +00:00
|
|
|
Some(FloatRange(lo, hi, end))
|
2019-11-09 13:35:04 +00:00
|
|
|
}
|
|
|
|
}
|
2019-11-17 15:54:44 +00:00
|
|
|
PatKind::Array { ref prefix, ref slice, ref suffix }
|
|
|
|
| PatKind::Slice { ref prefix, ref slice, ref suffix } => {
|
2020-08-03 00:49:11 +02:00
|
|
|
let array_len = match pat.ty.kind() {
|
2019-11-17 15:54:44 +00:00
|
|
|
ty::Array(_, length) => Some(length.eval_usize(tcx, param_env)),
|
|
|
|
ty::Slice(_) => None,
|
|
|
|
_ => span_bug!(pat.span, "bad ty {:?} for slice pattern", pat.ty),
|
2019-11-16 16:05:32 +00:00
|
|
|
};
|
|
|
|
let prefix = prefix.len() as u64;
|
|
|
|
let suffix = suffix.len() as u64;
|
2019-11-17 15:54:44 +00:00
|
|
|
let kind =
|
|
|
|
if slice.is_some() { VarLen(prefix, suffix) } else { FixedLen(prefix + suffix) };
|
2019-11-17 17:33:39 +00:00
|
|
|
Some(Slice(Slice { array_len, kind }))
|
2016-09-26 02:53:26 +03:00
|
|
|
}
|
2019-11-30 13:35:46 +00:00
|
|
|
PatKind::Or { .. } => bug!("Or-pattern should have been expanded earlier on."),
|
2016-09-24 18:24:34 +03:00
|
|
|
}
|
|
|
|
}
|