
[librustdoc] Reform lang string token splitting

Only split doctest lang strings on `,`, ` `, and `\t`. Additionally, to
preserve backwards compatibility with pandoc-style langstrings, strip a
surrounding `{}`, and remove leading `.`s from each token.

Prior to this change, doctest lang strings were split on all
non-alphanumeric characters except `-` or `_`, which limited future
extensions to doctest lang string tokens, for example using `=` for
key-value tokens.

This is a breaking change, although it is not expected to be disruptive,
because lang strings using separators other than `,` and ` ` are not
very common.
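
For illustration, here is a minimal sketch of the splitting rules described above. This is not the code added by this commit; the `tokens` helper and its signature are assumptions made for the example.

```rust
/// Sketch of the lang string splitting rules described in the commit
/// message. Illustration only; not the implementation in librustdoc.
fn tokens(lang_string: &str) -> Vec<&str> {
    let trimmed = lang_string.trim();

    // Pandoc compatibility: strip one surrounding `{}` pair, if present.
    let inner = trimmed
        .strip_prefix('{')
        .and_then(|s| s.strip_suffix('}'))
        .unwrap_or(trimmed);

    inner
        // Split only on `,`, ` `, and `\t`.
        .split(|c: char| c == ',' || c == ' ' || c == '\t')
        // Pandoc compatibility: drop a leading `.` from each token.
        .map(|token| token.strip_prefix('.').unwrap_or(token))
        .filter(|token| !token.is_empty())
        .collect()
}

fn main() {
    // Pandoc-style lang strings keep working.
    assert_eq!(tokens("{.rust .ignore}"), ["rust", "ignore"]);
    // `(` no longer separates tokens, so this is one (unrecognized) token...
    assert_eq!(tokens("ignore(cross-crate-imports)"), ["ignore(cross-crate-imports)"]);
    // ...while a space keeps the `ignore` directive recognizable.
    assert_eq!(tokens("ignore (cross-crate-imports)"), ["ignore", "(cross-crate-imports)"]);
}
```

This also shows why the hunks below add a space after `ignore`: with `(` no longer acting as a separator, `ignore(cross-crate-imports)` would otherwise be read as a single, unrecognized token.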
Casey Rodarmor 2020-10-26 19:08:42 -07:00
parent e9920ef774
commit 66f4883308
GPG key ID: 556186B153EC6FE0
6 changed files with 84 additions and 20 deletions


@@ -10,7 +10,7 @@
 //! fixpoint solution to your dataflow problem, or implement the `ResultsVisitor` interface and use
 //! `visit_results`. The following example uses the `ResultsCursor` approach.
 //!
-//! ```ignore(cross-crate-imports)
+//! ```ignore (cross-crate-imports)
 //! use rustc_mir::dataflow::Analysis; // Makes `into_engine` available.
 //!
 //! fn do_my_analysis(tcx: TyCtxt<'tcx>, body: &mir::Body<'tcx>) {
@@ -211,7 +211,7 @@ pub trait Analysis<'tcx>: AnalysisDomain<'tcx> {
 /// default impl and the one for all `A: GenKillAnalysis` will do the right thing.
 /// Its purpose is to enable method chaining like so:
 ///
-/// ```ignore(cross-crate-imports)
+/// ```ignore (cross-crate-imports)
 /// let results = MyAnalysis::new(tcx, body)
 ///     .into_engine(tcx, body, def_id)
 ///     .iterate_to_fixpoint()