Make 'alloc' and 'std' default features; Make serialisations variably sized again; Refactor derive implementations; Completely rework streams; Fix tuple deserialisation; Encode 'FixedString' in UTF-8; Remove methods 'from_chars' and 'set_len' from 'FixedString'; Rename 'as_slice' and 'as_mut_slice' methods in 'FixedString' to 'as_str' and 'as_mut_str'; Add methods 'as_bytes', 'push_str', 'chars', 'capacity', and 'char_indices' to 'FixedString'; Rework 'FixedString' traits; Remove 'FixedIter'; Update lints; Add methods 'set_len' and 'set_len_unchecked' to 'Buffer'; Elaborate docs; Update readme; Do not require 'Serialise' for 'Deserialise'; Rename 'SERIALISED_SIZE' in 'Serialise' to 'MAX_SERIALISED_SIZE'; Use streams in 'Serialise' and 'Deserialise'; Drop 'Serialise' requirement for 'Buffer'; Add methods 'with_capacity' and 'capacity' to 'Buffer';

This commit is contained in:
Gabriel Bjørnager Jensen 2024-08-31 12:55:15 +02:00
parent 3b29e72624
commit a872a5c245
24 changed files with 1215 additions and 1040 deletions

View file

@@ -3,6 +3,29 @@
 This is the changelog of bzipper.
 See `"README.md"` for more information.
 
+## 0.7.0
+
+* Make `alloc` and `std` default features
+* Make serialisations variably sized again
+* Refactor derive implementations
+* Completely rework streams
+* Fix tuple deserialisation
+* Encode `FixedString` in UTF-8
+* Remove methods `from_chars` and `set_len` from `FixedString`
+* Rename `as_slice` and `as_mut_slice` methods in `FixedString` to `as_str` and `as_mut_str`
+* Add methods `as_bytes`, `push_str`, `chars`, `capacity`, and `char_indices` to `FixedString`
+* Rework `FixedString` traits
+* Remove `FixedIter`
+* Update lints
+* Add methods `set_len` and `set_len_unchecked` to `Buffer`
+* Elaborate docs
+* Update readme
+* Do not require `Serialise` for `Deserialise`
+* Rename `SERIALISED_SIZE` in `Serialise` to `MAX_SERIALISED_SIZE`
+* Use streams in `Serialise` and `Deserialise`
+* Drop `Serialise` requirement for `Buffer`
+* Add methods `with_capacity` and `capacity` to `Buffer`
+
 ## 0.6.2
 
 * Fix `Deserialise` derive for unit variants
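Several of the 0.7.0 entries above revolve around serialisations being variably sized again: a value may now occupy fewer bytes than its `MAX_SERIALISED_SIZE`. As an illustrative, std-only sketch (not bzipper's actual implementation), the sign-byte layout that the `Option` implementation in this commit uses shows how the encoded length can vary under a fixed upper bound:

```rust
// Sketch of a sign-byte scheme for `Option<u32>`: a boolean sign
// byte, followed by the big-endian value when one is present.
// The maximum serialised size is therefore 1 + 4 = 5 bytes.
fn serialise_option(value: Option<u32>, buf: &mut Vec<u8>) {
    match value {
        None => buf.push(0x00),
        Some(v) => {
            buf.push(0x01);
            buf.extend_from_slice(&v.to_be_bytes()); // big endian, per the README
        }
    }
}

fn main() {
    let mut buf = Vec::new();

    serialise_option(None, &mut buf);
    assert_eq!(buf, [0x00]); // only one byte used

    buf.clear();
    serialise_option(Some(0x0402), &mut buf);
    assert_eq!(buf, [0x01, 0x00, 0x00, 0x04, 0x02]); // at most five bytes
}
```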

View file

@@ -101,7 +101,6 @@ option_as_ref_cloned = "warn"
 option_if_let_else = "warn"
 option_option = "deny"
 or_fun_call = "deny"
-panic_in_result_fn = "deny"
 path_buf_push_overwrite = "deny"
 pattern_type_mismatch = "deny"
 ptr_as_ptr = "forbid"
@@ -122,7 +121,6 @@ return_self_not_must_use = "deny"
 same_functions_in_if_condition = "deny"
 same_name_method = "deny"
 self_named_module_files = "deny"
-semicolon_outside_block = "warn"
 single_char_pattern = "warn"
 str_split_at_newline = "warn"
 string_lit_as_bytes = "deny"

View file

@@ -2,7 +2,7 @@
 [bzipper](https://crates.io/crates/bzipper/) is a binary (de)serialiser for the Rust language.
 
-Contrary to [Serde](https://crates.io/crates/serde/)/[Bincode](https://crates.io/crates/bincode/), the goal of bzipper is to serialise with a known size constraint.
+In contrast to [Serde](https://crates.io/crates/serde/)/[Bincode](https://crates.io/crates/bincode/), the primary goal of bzipper is to serialise with a known size constraint.
 Therefore, this crate may be more suited for networking or other cases where a fixed-sized buffer is needed.
 
 Keep in mind that this project is still work-in-progress.
@@ -20,15 +20,15 @@ For strings, the `FixedString` type is also provided.
 ## Usage
 
-This crate revolves around the `Serialise` and `Deserialise` traits, both of which are commonly used in conjunction with streams (more specifically, s-streams and d-streams).
+This crate revolves around the `Serialise` and `Deserialise` traits, both of which use *streams* – or more specifically – s-streams and d-streams.
 Many core types come implemented with bzipper, including primitives as well as some standard library types such as `Option` and `Result`.
 
-It is recommended in most cases to just derive these traits for custom types (enumerations and structures only).
-Here, each field is chained in declaration order:
+It is recommended in most cases to just derive these two traits for custom types (although this is only supported with enumerations and structures).
+Here, each field is *chained* according to declaration order:
 
-```rs
-use bzipper::{Deserialise, Serialise};
+```rust
+use bzipper::{Buffer, Deserialise, Serialise};
 
 #[derive(Debug, Deserialise, PartialEq, Serialise)]
 struct IoRegister {
     value: u16,
 }
 
-let mut buf: [u8; IoRegister::SERIALISED_SIZE] = Default::default();
-IoRegister { addr: 0x04000000, value: 0x0402 }.serialise(&mut buf).unwrap();
+let mut buf = Buffer::new();
+buf.write(IoRegister { addr: 0x04000000, value: 0x0402 }).unwrap();
+
+assert_eq!(buf.len(), 0x6);
 assert_eq!(buf, [0x04, 0x00, 0x00, 0x00, 0x04, 0x02]);
 
-assert_eq!(IoRegister::deserialise(&buf).unwrap(), IoRegister { addr: 0x04000000, value: 0x0402 });
+assert_eq!(buf.read().unwrap(), IoRegister { addr: 0x04000000, value: 0x0402 });
 ```
 
 ### Serialisation
 
-To serialise an object implementing `Serialise`, simply allocate a buffer for the serialisation.
-The required size of any given serialisation is specified by the `SERIALISED_SIZE` constant:
+To serialise an object implementing `Serialise`, simply allocate a buffer for the serialisation and wrap it in an s-stream (*serialisation stream*) with the `Sstream` type.
 
-```rs
-use bzipper::Serialise;
+```rust
+use bzipper::{Serialise, Sstream};
 
-let mut buf: [u8; char::SERIALISED_SIZE] = Default::default();
+let mut buf = [Default::default(); char::MAX_SERIALISED_SIZE];
+let mut stream = Sstream::new(&mut buf);
 
-'Ж'.serialise(&mut buf).unwrap();
-assert_eq!(buf, [0x00, 0x00, 0x04, 0x16]);
+'Ж'.serialise(&mut stream).unwrap();
+assert_eq!(stream, [0x00, 0x00, 0x04, 0x16]);
 ```
 
-The only special requirement of the `serialise` method is that the provided byte slice has an element count of exactly `SERIALISED_SIZE`.
+The maximum size of any given serialisation is specified by the `MAX_SERIALISED_SIZE` constant.
-We can also use streams to *chain* multiple elements together:
+We can also use streams to chain multiple elements together:
 
-```rs
-use bzipper::Serialise;
+```rust
+use bzipper::{Serialise, Sstream};
 
-let mut buf: [u8; char::SERIALISED_SIZE * 5] = Default::default();
-let mut stream = bzipper::Sstream::new(&mut buf);
+let mut buf = [Default::default(); char::MAX_SERIALISED_SIZE * 0x5];
+let mut stream = Sstream::new(&mut buf);
 
-stream.append(&'ل');
-stream.append(&'ا');
-stream.append(&'م');
-stream.append(&'د');
-stream.append(&'ا');
+// Note: For serialising multiple characters, the
+// `FixedString` type is usually preferred.
+
+'ل'.serialise(&mut stream).unwrap();
+'ا'.serialise(&mut stream).unwrap();
+'م'.serialise(&mut stream).unwrap();
+'د'.serialise(&mut stream).unwrap();
+'ا'.serialise(&mut stream).unwrap();
 
-assert_eq!(buf, [0x00, 0x00, 0x06, 0x44, 0x00, 0x00, 0x06, 0x27, 0x00, 0x00, 0x06, 0x45, 0x00, 0x00, 0x06, 0x2F, 0x00, 0x00, 0x06, 0x27]);
+assert_eq!(buf, [
+    0x00, 0x00, 0x06, 0x44, 0x00, 0x00, 0x06, 0x27,
+    0x00, 0x00, 0x06, 0x45, 0x00, 0x00, 0x06, 0x2F,
+    0x00, 0x00, 0x06, 0x27
+]);
 ```
 
 When serialising primitives, the resulting byte stream is in big endian (a.k.a. network endian).
@@ -82,25 +92,33 @@ It is recommended for implementors to adhere to this convention as well.
 
 ### Deserialisation
 
-Deserialisation works with an almost identical syntax to serialisation.
-To deserialise a buffer, simply call the `deserialise` method:
+Deserialisation works with a similar syntax to serialisation.
+D-streams (*deserialisation streams*) use the `Dstream` type and are constructed in a manner similar to s-streams.
+To deserialise a buffer, simply call the `deserialise` method with the stream:
 
-```rs
-use bzipper::Deserialise;
+```rust
+use bzipper::{Deserialise, Dstream};
 
 let data = [0x45, 0x54];
-assert_eq!(<u16>::deserialise(&data).unwrap(), 0x4554);
+let stream = Dstream::new(&data);
+
+assert_eq!(u16::deserialise(&stream).unwrap(), 0x4554);
 ```
 
-Just like with serialisations, the `Dstream` can be used to deserialise chained elements:
+And just like s-streams, d-streams can also be used to handle chaining:
 
-```rs
-use bzipper::Deserialise;
+```rust
+use bzipper::{Deserialise, Dstream};
 
 let data = [0x45, 0x54];
-let stream = bzipper::Dstream::new(&data);
+let stream = Dstream::new(&data);
 
-assert_eq!(stream.take::<u8>().unwrap(), 0x45);
-assert_eq!(stream.take::<u8>().unwrap(), 0x54);
+assert_eq!(u8::deserialise(&stream).unwrap(), 0x45);
+assert_eq!(u8::deserialise(&stream).unwrap(), 0x54);
+
+// The data can also be deserialised as a tuple (up
+// to twelve elements).
+let stream = Dstream::new(&data);
+assert_eq!(<(u8, u8)>::deserialise(&stream).unwrap(), (0x45, 0x54));
 ```
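The primitive examples in this README all follow the big-endian (network-endian) convention. As an illustrative, std-only sketch independent of bzipper, Rust's `to_be_bytes`/`from_be_bytes` reproduce the exact byte sequences shown above:

```rust
fn main() {
    // The `u16` deserialisation above: [0x45, 0x54] => 0x4554.
    assert_eq!(u16::from_be_bytes([0x45, 0x54]), 0x4554);

    // A `char` is encoded as its code point in a big-endian `u32`:
    // 'Ж' is U+0416, giving [0x00, 0x00, 0x04, 0x16].
    assert_eq!(('Ж' as u32).to_be_bytes(), [0x00, 0x00, 0x04, 0x16]);

    // Round-tripping recovers the original scalar value.
    let code = u32::from_be_bytes([0x00, 0x00, 0x04, 0x16]);
    assert_eq!(char::from_u32(code), Some('Ж'));
}
```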

View file

@@ -1,6 +1,6 @@
 [package]
 name = "bzipper"
-version = "0.6.2"
+version = "0.7.0"
 edition = "2021"
 rust-version = "1.81"
 documentation = "https://docs.rs/bzipper/"
@@ -16,11 +16,13 @@ license.workspace = true
 all-features = true
 
 [features]
+default = ["alloc", "std"]
 alloc = []
 std = []
 
 [dependencies]
-bzipper_macros = { path = "../bzipper_macros", version = "0.6.2"}
+bzipper_macros = { path = "../bzipper_macros", version = "0.7.0"}
 
 [lints]
 workspace = true

View file

@@ -22,112 +22,201 @@
 #[cfg(test)]
 mod test;
 
-use crate::{Deserialise, Result, Serialise};
+use crate::{Deserialise, Dstream, Result, Serialise, Sstream};
 
 use alloc::vec;
 use alloc::boxed::Box;
+use core::borrow::Borrow;
 use core::fmt::{Debug, Formatter};
 use core::marker::PhantomData;
 use core::ops::{Deref, DerefMut};
 
+// We cannot use arrays for the `Buffer` type as
+// that would require `generic_const_exprs`.
+
 /// Typed (de)serialisation buffer.
 ///
 /// This structure is intended as a lightweight wrapper around byte buffers for specific (de)serialisations of specific types.
 ///
-/// The methods [`write`](Self::write) and [`read`](Self::read) can be used to interpret the internal buffer.
+/// The methods [`write`](Self::write) and [`read`](Self::read) can be used to handle the internal buffer.
 /// Other methods exist for accessing the internal buffer directly.
 ///
 /// # Examples
 ///
 /// Create a buffer for holding a `Request` enumeration:
 ///
-/// ```
+/// ```rust
 /// use bzipper::{Buffer, FixedString, Serialise};
 ///
 /// #[derive(Serialise)]
 /// enum Request {
-///     Join { username: FixedString<0x10> },
+///     Join { username: FixedString<0x40> },
 ///
-///     Quit { username: FixedString<0x10> },
+///     Quit { username: FixedString<0x40> },
 ///
-///     SendMessage { message: FixedString<0x20> },
+///     SendMessage { message: FixedString<0x80> },
 /// }
 ///
 /// use Request::*;
 ///
 /// let join_request = Join { username: FixedString::try_from("epsiloneridani").unwrap() };
 ///
-/// let mut buf = Buffer::<Request>::new();
-/// buf.write(&join_request);
+/// let mut buf = Buffer::new();
+/// buf.write(join_request);
 ///
 /// // Do something with the buffer...
 /// ```
 #[cfg_attr(doc, doc(cfg(feature = "alloc")))]
 #[derive(Clone, Eq, PartialEq)]
-pub struct Buffer<T: Serialise> {
+pub struct Buffer<T> {
     buf: Box<[u8]>,
+    len: usize,
 
     _phanton: PhantomData<T>
 }
-impl<T: Serialise> Buffer<T> {
-    /// Allocates a new buffer suitable for (de)serialisation.
+impl<T> Buffer<T> {
+    /// Allocates a new buffer suitable for serialisation.
+    ///
+    /// The given capacity should be large enough to hold any expected serialisation of `T`.
+    /// Therefore, if `T` implements [`Serialise`], it is recommended to use [`new`](Self::new) instead, which is equivalent to passing [`MAX_SERIALISED_SIZE`](Serialise::MAX_SERIALISED_SIZE) to this function:
+    #[inline]
     #[must_use]
-    pub fn new() -> Self { Self { buf: vec![0x00; T::SERIALISED_SIZE].into(), _phanton: PhantomData } }
+    pub fn with_capacity(len: usize) -> Self {
+        Self {
+            buf: vec![0x00; len].into(),
+            len: 0x0,
+
+            _phanton: PhantomData,
+        }
+    }
 
-    /// Serialises into the contained buffer.
-    #[inline(always)]
-    pub fn write(&mut self, value: &T) -> Result<()> { value.serialise(&mut self.buf) }
+    /// Sets the length of the used buffer.
+    ///
+    /// The provided size is checked before being written.
+    /// For the same operation *without* checks, see [`set_len_unchecked`](Self::set_len_unchecked).
+    ///
+    /// # Panics
+    ///
+    /// The provided size must not be greater than the buffer's capacity.
+    /// If this is the case, however, this method will panic.
+    #[inline(always)]
+    pub fn set_len(&mut self, len: usize) {
+        assert!(len <= self.capacity(), "cannot extend buffer beyond capacity");
+
+        self.len = len;
+    }
 
-    /// Retrieves a pointer to the first byte.
+    /// Sets the length of the used buffer without checks.
+    ///
+    /// The validity of the provided size is **not** checked before being written.
+    /// For the same operation *with* checks, see [`set_len`](Self::set_len).
+    ///
+    /// # Safety
+    ///
+    /// If the value of `len` is greater than the buffer's capacity, behaviour is undefined.
+    #[inline(always)]
+    pub unsafe fn set_len_unchecked(&mut self, len: usize) { self.len = len }
+
+    /// Retrieves a pointer to the first byte of the internal buffer.
     #[inline(always)]
     #[must_use]
     pub const fn as_ptr(&self) -> *const u8 { self.buf.as_ptr() }
 
-    /// Retrieves a mutable pointer to the first byte.
+    /// Retrieves a mutable pointer to the first byte of the internal buffer.
     #[inline(always)]
     #[must_use]
     pub fn as_mut_ptr(&mut self) -> *mut u8 { self.buf.as_mut_ptr() }
 
     /// Gets a slice of the internal buffer.
+    ///
+    /// The returned slice will only include the used part of the buffer (as specified by [`len`](Self::len)).
     #[inline(always)]
     #[must_use]
     pub const fn as_slice(&self) -> &[u8] { unsafe { core::slice::from_raw_parts(self.as_ptr(), self.len()) } }
 
     /// Gets a mutable slice of the internal buffer.
+    ///
+    /// In contrast to [`as_slice`](Self::as_slice), this method returns a slice of the **entire** internal buffer.
+    ///
+    /// If the returned reference is written through, the new buffer length -- if different -- should be set using [`set_len`](Self::set_len).
     #[inline(always)]
     #[must_use]
     pub fn as_mut_slice(&mut self) -> &mut [u8] { &mut self.buf }
 
     /// Gets the length of the buffer.
-    ///
-    /// This is defined as (and therefore always equal to) the value of [SERIALISED_SIZE](Serialise::SERIALISED_SIZE) as specified by `T`.
     #[allow(clippy::len_without_is_empty)]
     #[inline(always)]
     #[must_use]
-    pub const fn len(&self) -> usize { T::SERIALISED_SIZE }
+    pub const fn len(&self) -> usize { self.len }
+
+    /// Gets the capacity of the buffer.
+    ///
+    /// If the buffer was constructed using [`new`](Self::new), this value is exactly the same as [`MAX_SERIALISED_SIZE`](Serialise::MAX_SERIALISED_SIZE).
+    #[inline(always)]
+    #[must_use]
+    pub const fn capacity(&self) -> usize { self.buf.len() }
+}
+
+impl<T: Serialise> Buffer<T> {
+    /// Allocates a new buffer suitable for serialisation.
+    ///
+    /// The capacity of the internal buffer is set so that any serialisation of `T` may be stored.
+    ///
+    /// This is equivalent to calling [`with_capacity`](Self::with_capacity) with [`MAX_SERIALISED_SIZE`](Serialise::MAX_SERIALISED_SIZE).
+    #[inline(always)]
+    #[must_use]
+    pub fn new() -> Self { Self::with_capacity(T::MAX_SERIALISED_SIZE) }
+
+    /// Serialises into the contained buffer.
+    ///
+    /// # Errors
+    ///
+    /// Any error that occurs during serialisation is passed on and returned from this method.
+    ///
+    /// # Panics
+    ///
+    /// If the amount of bytes read by [`serialise`](Serialise::serialise) is greater than that specified by [`MAX_SERIALISED_SIZE`](Serialise::MAX_SERIALISED_SIZE), this method panics.
+    ///
+    /// In reality, however, this error can only be detected if the buffer's capacity is set to a value greater than `MAX_SERIALISED_SIZE` to begin with (e.g. using [`with_capacity`](Self::with_capacity)).
+    #[inline(always)]
+    pub fn write<U: Borrow<T>>(&mut self, value: U) -> Result<()> {
+        let mut stream = Sstream::new(&mut self.buf);
+        value.borrow().serialise(&mut stream)?;
+
+        assert!(stream.len() <= T::MAX_SERIALISED_SIZE);
+        self.len = stream.len();
+
+        Ok(())
+    }
 }
 
 impl<T: Deserialise> Buffer<T> {
     /// Deserialises from the contained buffer.
+    ///
+    /// # Errors
+    ///
+    /// Any error that occurs during deserialisation is passed on and returned from this method.
     #[inline(always)]
-    pub fn read(&self) -> Result<T> { T::deserialise(&self.buf) }
+    pub fn read(&self) -> Result<T> {
+        // We should only pass the used part of the buffer
+        // to `deserialise`.
+        let stream = Dstream::new(&self.buf[0x0..self.len()]);
+
+        let value = Deserialise::deserialise(&stream)?;
+        Ok(value)
+    }
 }
-impl<T: Serialise> AsMut<[u8]> for Buffer<T> {
+impl<T> AsMut<[u8]> for Buffer<T> {
     #[inline(always)]
     fn as_mut(&mut self) -> &mut [u8] { self.as_mut_slice() }
 }
 
-impl<T: Serialise> AsRef<[u8]> for Buffer<T> {
+impl<T> AsRef<[u8]> for Buffer<T> {
     #[inline(always)]
     fn as_ref(&self) -> &[u8] { self.as_slice() }
 }
 
-impl<T: Serialise> Debug for Buffer<T> {
+impl<T> Debug for Buffer<T> {
     #[inline(always)]
     fn fmt(&self, f: &mut Formatter) -> core::fmt::Result { write!(f, "{:?}", self.as_slice()) }
 }
@@ -137,19 +226,24 @@ impl<T: Serialise> Default for Buffer<T> {
     fn default() -> Self { Self::new() }
 }
 
-impl<T: Serialise> Deref for Buffer<T> {
+impl<T> Deref for Buffer<T> {
     type Target = [u8];
 
     #[inline(always)]
     fn deref(&self) -> &Self::Target { self.as_slice() }
 }
 
-impl<T: Serialise> DerefMut for Buffer<T> {
+impl<T> DerefMut for Buffer<T> {
     #[inline(always)]
     fn deref_mut(&mut self) -> &mut Self::Target { self.as_mut_slice() }
 }
 
-impl<T: Serialise> PartialEq<&[u8]> for Buffer<T> {
+impl<T> PartialEq<&[u8]> for Buffer<T> {
     #[inline(always)]
     fn eq(&self, other: &&[u8]) -> bool { self.as_slice() == *other }
 }
+
+impl<T, const N: usize> PartialEq<[u8; N]> for Buffer<T> {
+    #[inline(always)]
+    fn eq(&self, other: &[u8; N]) -> bool { self.as_slice() == other.as_slice() }
+}

View file

@@ -25,11 +25,11 @@ use crate::{Buffer, Error};
 fn test_buffer() {
     let mut buf = Buffer::<char>::new();
 
-    buf.write(&'\u{1F44D}').unwrap();
+    buf.write('\u{1F44D}').unwrap();
     assert_eq!(buf, [0x00, 0x01, 0xF4, 0x4D].as_slice());
 
     buf.as_mut_slice().copy_from_slice(&[0x00, 0x00, 0xD8, 0x00]);
-    assert!(matches!(buf.read(), Err(Error::InvalidCodePoint { value: 0xD800 })));
+    assert!(matches!(buf.read(), Err(Error::InvalidCodePoint(0xD800))));
 
     buf.as_mut_slice().copy_from_slice(&[0x00, 0x00, 0xFF, 0x3A]);
     assert_eq!(buf.read().unwrap(), '\u{FF3A}');
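The `InvalidCodePoint` case exercised by this test follows directly from `char`'s validity rules: surrogate code points such as U+D800 are not Unicode scalar values and can never be a `char`. A std-only sketch of the same checks, independent of bzipper:

```rust
fn main() {
    // 0xD800 is a surrogate half, so it is not a valid `char` --
    // this is why the test expects an invalid-code-point error.
    assert!(char::from_u32(0xD800).is_none());

    // 0xFF3A is a valid code point and round-trips fine.
    assert_eq!(char::from_u32(0xFF3A), Some('\u{FF3A}'));
}
```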

View file

@@ -31,15 +31,12 @@ use core::num::NonZero;
 mod tuple;
 
-/// Types capable of being deserialised.
-///
-/// This trait requires [`Serialise`] also being implemented as it relies on the [`SERIALISED_SIZE`](crate::Serialise::SERIALISED_SIZE) constant.
-pub trait Deserialise: Serialise + Sized {
-    /// Deserialises a slice into an object.
+/// Denotes a type capable of deserialisation.
+pub trait Deserialise: Sized {
+    /// Deserialises an object from the given d-stream.
     ///
-    /// This function must **never** take more bytes than specified by [`SERIALISED_SIZE`](crate::Serialise::SERIALISED_SIZE).
+    /// This method must **never** read more bytes than specified by [`MAX_SERIALISED_SIZE`](crate::Serialise::MAX_SERIALISED_SIZE) (if [`Serialise`] is defined, that is).
     /// Doing so is considered a logic error.
-    /// Likewise, providing more than this amount is also disfavoured.
     ///
     /// # Errors
     ///
@@ -47,22 +44,20 @@ pub trait Deserialise: Serialise + Sized {
     ///
     /// # Panics
     ///
-    /// This method will usually panic if the provided slice has a length *less* than the value of `SERIALISED_SIZE`.
+    /// This method will usually panic if the provided slice has a length *less* than the value of `MAX_SERIALISED_SIZE`.
     /// Official implementations of this trait (including those that are derived) always panic in debug mode if the provided slice has a length that is different at all.
-    fn deserialise(data: &[u8]) -> Result<Self>;
+    fn deserialise(stream: &Dstream) -> Result<Self>;
 }
 
 macro_rules! impl_numeric {
     ($ty:ty) => {
         impl ::bzipper::Deserialise for $ty {
-            fn deserialise(data: &[u8]) -> ::bzipper::Result<Self> {
-                ::core::debug_assert_eq!(data.len(), <Self as ::bzipper::Serialise>::SERIALISED_SIZE);
-
-                const SIZE: usize = ::core::mem::size_of::<$ty>();
-
-                let data = data
-                    .get(0x0..SIZE)
-                    .ok_or(::bzipper::Error::EndOfStream { req: SIZE, rem: data.len() })?
+            #[inline]
+            fn deserialise(stream: &Dstream) -> ::bzipper::Result<Self> {
+                let data = stream
+                    .read(Self::MAX_SERIALISED_SIZE)
+                    .unwrap()
+                    //.ok_or(::bzipper::Error::EndOfStream { req: Self::MAX_SERIALISED_SIZE, rem: data.len() })?
                     .try_into()
                     .unwrap();
@@ -75,34 +70,29 @@ macro_rules! impl_numeric {
 macro_rules! impl_non_zero {
     ($ty:ty) => {
         impl ::bzipper::Deserialise for NonZero<$ty> {
-            fn deserialise(data: &[u8]) -> ::bzipper::Result<Self> {
-                ::core::debug_assert_eq!(data.len(), <Self as ::bzipper::Serialise>::SERIALISED_SIZE);
-
-                let value = <$ty as ::bzipper::Deserialise>::deserialise(data)?;
-
-                NonZero::new(value)
-                    .ok_or(Error::NullInteger)
+            #[inline]
+            fn deserialise(stream: &Dstream) -> ::bzipper::Result<Self> {
+                let value = <$ty as ::bzipper::Deserialise>::deserialise(stream)?;
+
+                let value = NonZero::new(value)
+                    .ok_or(Error::NullInteger)?;
+
+                Ok(value)
             }
         }
     };
 }
 
-impl<T, const N: usize> Deserialise for [T; N]
-where
-    T: Deserialise {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
+impl<T: Deserialise, const N: usize> Deserialise for [T; N] {
+    #[inline]
+    fn deserialise(stream: &Dstream) -> Result<Self> {
         // Initialise the array incrementally.
         let mut buf: [MaybeUninit<T>; N] = unsafe { MaybeUninit::uninit().assume_init() };
-        let mut pos = 0x0;
 
         for item in &mut buf {
-            let range = pos..pos + T::SERIALISED_SIZE;
-
-            pos = range.end;
-
-            item.write(Deserialise::deserialise(&data[range])?);
+            let value = T::deserialise(stream)?;
+            item.write(value);
         }
 
         // This should be safe as `MaybeUninit<T>` is
@@ -118,83 +108,80 @@ where
 }
 impl Deserialise for bool {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        let value = u8::deserialise(data)?;
+    #[inline]
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let value = u8::deserialise(stream)?;
 
         match value {
             0x00 => Ok(false),
             0x01 => Ok(true),
-            _ => Err(Error::InvalidBoolean { value })
+            _ => Err(Error::InvalidBoolean(value))
         }
     }
 }
 
 impl Deserialise for char {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        let value = u32::deserialise(data)?;
-
-        Self::from_u32(value)
-            .ok_or(Error::InvalidCodePoint { value })
+    #[inline]
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let value = u32::deserialise(stream)?;
+
+        let value = value
+            .try_into()
+            .map_err(|_| Error::InvalidCodePoint(value))?;
+
+        Ok(value)
     }
 }
 
 impl Deserialise for Infallible {
     #[allow(clippy::panic_in_result_fn)]
     #[inline(always)]
-    fn deserialise(_data: &[u8]) -> Result<Self> { panic!("cannot deserialise `Infallible` as it cannot be serialised to begin with") }
+    fn deserialise(_stream: &Dstream) -> Result<Self> { panic!("cannot deserialise `Infallible` as it cannot be serialised to begin with") }
 }
 
 impl Deserialise for isize {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        let value = i32::deserialise(data)?
-            .try_into().expect("unable to convert from `i32` to `isize`");
+    #[inline]
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let value = i32::deserialise(stream)?;
+
+        let value = value
+            .try_into()
+            .expect("unable to convert from `i32` to `isize`");
 
         Ok(value)
     }
 }
 impl<T: Deserialise> Deserialise for Option<T> {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        let stream = Dstream::new(data);
-
-        let sign = stream.take::<bool>()?;
-
-        if sign {
-            Ok(Some(stream.take::<T>()?))
+    #[allow(clippy::if_then_some_else_none)]
+    #[inline]
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let sign = bool::deserialise(stream)?;
+
+        let value = if sign {
+            Some(T::deserialise(stream)?)
         } else {
-            Ok(None)
-        }
+            None
+        };
+
+        Ok(value)
     }
 }
 
 impl<T> Deserialise for PhantomData<T> {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        Ok(Self)
-    }
+    #[inline(always)]
+    fn deserialise(_stream: &Dstream) -> Result<Self> { Ok(Self) }
 }
 
 impl<T: Deserialise, E: Deserialise> Deserialise for core::result::Result<T, E> {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        let stream = Dstream::new(data);
-
-        let sign = stream.take::<bool>()?;
+    #[inline]
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let sign = bool::deserialise(stream)?;
 
         let value = if sign {
-            Err(stream.take::<E>()?)
+            Err(E::deserialise(stream)?)
         } else {
-            Ok(stream.take::<T>()?)
+            Ok(T::deserialise(stream)?)
         };
 
         Ok(value)
@@ -202,15 +189,18 @@ impl<T: Deserialise, E: Deserialise> Deserialise for core::result::Result<T, E>
     }
 }
 
 impl Deserialise for () {
-    fn deserialise(_data: &[u8]) -> Result<Self> { Ok(()) }
+    #[inline(always)]
+    fn deserialise(_stream: &Dstream) -> Result<Self> { Ok(()) }
 }
 
 impl Deserialise for usize {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        let value = u32::deserialise(data)?
-            .try_into().expect("unable to convert from `u32` to `usize`");
+    #[inline]
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let value = u32::deserialise(stream)?;
+
+        let value = value
+            .try_into()
+            .expect("must be able to convert from `u32` to `usize`");
 
         Ok(value)
     }

View file

@@ -19,7 +19,9 @@
 // er General Public License along with bzipper. If
 // not, see <https://www.gnu.org/licenses/>.
 
-use crate::{Deserialise, Serialise};
+use core::char;
+
+use crate::{Deserialise, Dstream, Serialise};
 
 #[test]
 fn test() {
@@ -46,9 +48,10 @@ fn test() {
     ($ty:ty: $data:expr => $value:expr) => {{
         use ::bzipper::{Deserialise, Serialise};
 
-        let buf: [u8; <$ty as Serialise>::SERIALISED_SIZE] = $data;
+        let mut buf: [u8; <$ty as Serialise>::MAX_SERIALISED_SIZE] = $data;
+        let stream = Dstream::new(&mut buf);
 
-        let left = <$ty as Deserialise>::deserialise(&buf).unwrap();
+        let left = <$ty as Deserialise>::deserialise(&stream).unwrap();
         let right = $value;
 
         assert_eq!(left, right);
@@ -80,6 +83,8 @@ fn test() {
         0xBF, 0x4F, 0xAF, 0x5F, 0x9F, 0x6F, 0x8F, 0x7F,
     ] => 0xFF_0F_EF_1F_DF_2F_CF_3F_BF_4F_AF_5F_9F_6F_8F_7F);
 
+    test!(char: [0x00, 0x00, 0xFF, 0xFD] => char::REPLACEMENT_CHARACTER);
+
     test!([char; 0x5]: [
         0x00, 0x00, 0x03, 0xBB, 0x00, 0x00, 0x03, 0x91,
         0x00, 0x00, 0x03, 0xBC, 0x00, 0x00, 0x03, 0x94,

View file

@@ -19,17 +19,17 @@
 // er General Public License along with bzipper. If
 // not, see <https://www.gnu.org/licenses/>.
 
-use crate::{Deserialise, Result, Serialise};
+use crate::{Deserialise, Dstream, Result};
 
 impl<T0> Deserialise for (T0, )
 where
     T0: Deserialise, {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        Ok((
-            Deserialise::deserialise(data)?,
-        ))
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let value = (
+            Deserialise::deserialise(stream)?,
+        );
+
+        Ok(value)
     }
 }
 
@@ -37,13 +37,13 @@ impl<T0, T1> Deserialise for (T0, T1)
 where
     T0: Deserialise,
     T1: Deserialise, {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        Ok((
-            Deserialise::deserialise(data)?,
-            Deserialise::deserialise(data)?,
-        ))
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let value = (
+            Deserialise::deserialise(stream)?,
+            Deserialise::deserialise(stream)?,
+        );
+
+        Ok(value)
     }
 }
 
@@ -52,14 +52,14 @@ where
     T0: Deserialise,
     T1: Deserialise,
     T2: Deserialise, {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        Ok((
-            Deserialise::deserialise(data)?,
-            Deserialise::deserialise(data)?,
-            Deserialise::deserialise(data)?,
-        ))
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let value = (
+            Deserialise::deserialise(stream)?,
+            Deserialise::deserialise(stream)?,
+            Deserialise::deserialise(stream)?,
+        );
+
+        Ok(value)
     }
 }
 
@@ -69,15 +69,15 @@ where
     T1: Deserialise,
     T2: Deserialise,
     T3: Deserialise, {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
-
-        Ok((
-            Deserialise::deserialise(data)?,
-            Deserialise::deserialise(data)?,
-            Deserialise::deserialise(data)?,
-            Deserialise::deserialise(data)?,
-        ))
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let value = (
+            Deserialise::deserialise(stream)?,
+            Deserialise::deserialise(stream)?,
+            Deserialise::deserialise(stream)?,
+            Deserialise::deserialise(stream)?,
+        );
+
+        Ok(value)
     }
 }
 
@@ -88,16 +88,16 @@ where
     T2: Deserialise,
     T3: Deserialise,
     T4: Deserialise, {
-    fn deserialise(data: &[u8]) -> Result<Self> {
-        debug_assert_eq!(data.len(), Self::SERIALISED_SIZE);
+    fn deserialise(stream: &Dstream) -> Result<Self> {
+        let value = (
+            Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
);
Ok(( Ok(value)
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
))
} }
} }
@ -109,17 +109,17 @@ where
T3: Deserialise, T3: Deserialise,
T4: Deserialise, T4: Deserialise,
T5: Deserialise, { T5: Deserialise, {
fn deserialise(data: &[u8]) -> Result<Self> { fn deserialise(stream: &Dstream) -> Result<Self> {
debug_assert_eq!(data.len(), Self::SERIALISED_SIZE); let value = (
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
);
Ok(( Ok(value)
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
))
} }
} }
@ -132,18 +132,18 @@ where
T4: Deserialise, T4: Deserialise,
T5: Deserialise, T5: Deserialise,
T6: Deserialise, { T6: Deserialise, {
fn deserialise(data: &[u8]) -> Result<Self> { fn deserialise(stream: &Dstream) -> Result<Self> {
debug_assert_eq!(data.len(), Self::SERIALISED_SIZE); let value = (
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
);
Ok(( Ok(value)
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
))
} }
} }
@ -157,19 +157,19 @@ where
T5: Deserialise, T5: Deserialise,
T6: Deserialise, T6: Deserialise,
T7: Deserialise, { T7: Deserialise, {
fn deserialise(data: &[u8]) -> Result<Self> { fn deserialise(stream: &Dstream) -> Result<Self> {
debug_assert_eq!(data.len(), Self::SERIALISED_SIZE); let value = (
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
);
Ok(( Ok(value)
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
))
} }
} }
@ -184,20 +184,20 @@ where
T6: Deserialise, T6: Deserialise,
T7: Deserialise, T7: Deserialise,
T8: Deserialise, { T8: Deserialise, {
fn deserialise(data: &[u8]) -> Result<Self> { fn deserialise(stream: &Dstream) -> Result<Self> {
debug_assert_eq!(data.len(), Self::SERIALISED_SIZE); let value = (
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
);
Ok(( Ok(value)
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
))
} }
} }
@ -213,21 +213,21 @@ where
T7: Deserialise, T7: Deserialise,
T8: Deserialise, T8: Deserialise,
T9: Deserialise, { T9: Deserialise, {
fn deserialise(data: &[u8]) -> Result<Self> { fn deserialise(stream: &Dstream) -> Result<Self> {
debug_assert_eq!(data.len(), Self::SERIALISED_SIZE); let value = (
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
);
Ok(( Ok(value)
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
))
} }
} }
@ -244,22 +244,22 @@ where
T8: Deserialise, T8: Deserialise,
T9: Deserialise, T9: Deserialise,
T10: Deserialise, { T10: Deserialise, {
fn deserialise(data: &[u8]) -> Result<Self> { fn deserialise(stream: &Dstream) -> Result<Self> {
debug_assert_eq!(data.len(), Self::SERIALISED_SIZE); let value = (
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
);
Ok(( Ok(value)
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
))
} }
} }
@ -277,22 +277,22 @@ where
T9: Deserialise, T9: Deserialise,
T10: Deserialise, T10: Deserialise,
T11: Deserialise, { T11: Deserialise, {
fn deserialise(data: &[u8]) -> Result<Self> { fn deserialise(stream: &Dstream) -> Result<Self> {
debug_assert_eq!(data.len(), Self::SERIALISED_SIZE); let value = (
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
Deserialise::deserialise(stream)?,
);
Ok(( Ok(value)
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
Deserialise::deserialise(data)?,
))
} }
} }
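Each tuple field above is decoded by a further call to `Deserialise::deserialise(stream)?` on the same stream, so every field starts where the previous one stopped. A minimal sketch of that sequential-consumption pattern (the `take` helper is hypothetical, not bzipper's API):

```rust
// Sequential decoding: each field takes its bytes from the front
// of the remaining input and leaves the rest for the next field.
fn take<'a>(data: &mut &'a [u8], count: usize) -> Option<&'a [u8]> {
    if data.len() < count { return None; }

    let (head, tail) = data.split_at(count);
    *data = tail;

    Some(head)
}

fn main() {
    let mut input: &[u8] = &[0x00, 0x2A, 0x01, 0x00];

    // Two "fields" of a 2-tuple, read back to back.
    let a = u16::from_be_bytes(take(&mut input, 0x2).unwrap().try_into().unwrap());
    let b = u16::from_be_bytes(take(&mut input, 0x2).unwrap().try_into().unwrap());

    assert_eq!((a, b), (0x002A, 0x0100));
    assert!(input.is_empty());
}
```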


@ -19,16 +19,17 @@
// er General Public License along with bzipper. If // er General Public License along with bzipper. If
// not, see <https://www.gnu.org/licenses/>. // not, see <https://www.gnu.org/licenses/>.
use crate::{Deserialise, Error, Result}; use crate::{Error, Result};
use core::cell::Cell; use core::cell::Cell;
use core::fmt::{Debug, Formatter};
/// Byte stream for deserialisation. /// Byte stream suitable for deserialisation.
/// ///
/// This type borrows a slice, keeping track internally of the used bytes. /// This type borrows a buffer, keeping track internally of the used bytes.
pub struct Dstream<'a> { pub struct Dstream<'a> {
data: &'a [u8], pub(in crate) data: &'a [u8],
pos: Cell<usize>, pub(in crate) pos: Cell<usize>,
} }
impl<'a> Dstream<'a> { impl<'a> Dstream<'a> {
@ -37,22 +38,84 @@ impl<'a> Dstream<'a> {
#[must_use] #[must_use]
pub const fn new(data: &'a [u8]) -> Self { Self { data, pos: Cell::new(0x0) } } pub const fn new(data: &'a [u8]) -> Self { Self { data, pos: Cell::new(0x0) } }
/// Deserialises an object from the stream. /// Takes (borrows) raw bytes from the stream.
///
/// # Errors
///
/// If the stream doesn't hold at least the amount of bytes specified by [`SERIALISED_SIZE`](crate::Serialise::SERIALISED_SIZE), an [`EndOfStream`](Error::EndOfStream) error is returned.
#[inline] #[inline]
pub fn take<T: Deserialise>(&self) -> Result<T> { pub fn read(&self, count: usize) -> Result<&[u8]> {
let rem = self.data.len() - self.pos.get(); let rem = self.data.len() - self.pos.get();
let req = T::SERIALISED_SIZE; let req = count;
if rem < req { return Err(Error::EndOfStream { req, rem }) }; if rem < req { return Err(Error::EndOfStream { req, rem }) }
let start = self.pos.get(); let start = self.pos.get();
let stop = start + req; let stop = start + req;
self.pos.set(stop); self.pos.set(stop);
T::deserialise(&self.data[start..stop])
let data = &self.data[start..stop];
Ok(data)
} }
/// Gets a pointer to the first byte in the stream.
#[inline(always)]
#[must_use]
pub const fn as_ptr(&self) -> *const u8 { self.data.as_ptr() }
/// Gets a slice of the stream.
#[inline(always)]
#[must_use]
pub const fn as_slice(&self) -> &[u8] {
let ptr = self.as_ptr();
let len = self.len();
unsafe { core::slice::from_raw_parts(ptr, len) }
}
/// Gets the length of the stream.
#[inline(always)]
#[must_use]
pub const fn len(&self) -> usize { unsafe { self.pos.as_ptr().read() } }
/// Tests if the stream is empty.
///
/// If no deserialisations have been made at the time of calling, this method returns `true`.
#[inline(always)]
#[must_use]
pub const fn is_empty(&self) -> bool { self.len() == 0x0 }
/// Tests if the stream is full.
///
/// Note that zero-sized types such as [`()`](unit) can still be deserialised from this stream.
#[inline(always)]
#[must_use]
pub const fn is_full(&self) -> bool { self.len() == self.data.len() }
}
impl Debug for Dstream<'_> {
#[inline(always)]
fn fmt(&self, f: &mut Formatter) -> core::fmt::Result { Debug::fmt(self.as_slice(), f) }
}
impl<'a> From<&'a [u8]> for Dstream<'a> {
#[inline(always)]
fn from(value: &'a [u8]) -> Self { Self::new(value) }
}
impl<'a> From<&'a mut [u8]> for Dstream<'a> {
#[inline(always)]
fn from(value: &'a mut [u8]) -> Self { Self::new(value) }
}
impl PartialEq for Dstream<'_> {
#[inline(always)]
fn eq(&self, other: &Self) -> bool { self.as_slice() == other.as_slice() }
}
impl PartialEq<&[u8]> for Dstream<'_> {
#[inline(always)]
fn eq(&self, other: &&[u8]) -> bool { self.as_slice() == *other }
}
impl<const N: usize> PartialEq<[u8; N]> for Dstream<'_> {
#[inline(always)]
fn eq(&self, other: &[u8; N]) -> bool { self.as_slice() == other.as_slice() }
} }
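The reworked `read` method is essentially a bounds-checked cursor advance over a borrowed buffer, with `Cell` providing the interior mutability that lets reads go through `&self`. A self-contained sketch of that mechanism (the `Cursor` type is hypothetical, not the actual `Dstream`):

```rust
use std::cell::Cell;

// A borrowed buffer plus an interior-mutable cursor, so that
// reads can advance the position through a shared reference.
struct Cursor<'a> {
    data: &'a [u8],
    pos:  Cell<usize>,
}

impl<'a> Cursor<'a> {
    // Bounds-checked read: on failure, report the requested and
    // remaining byte counts, as `EndOfStream { req, rem }` does.
    fn read(&self, count: usize) -> Result<&[u8], (usize, usize)> {
        let rem = self.data.len() - self.pos.get();
        if rem < count { return Err((count, rem)); }

        let start = self.pos.get();
        let stop  = start + count;

        self.pos.set(stop);
        Ok(&self.data[start..stop])
    }

    // Number of bytes consumed so far, mirroring the stream's `len`.
    fn consumed(&self) -> usize { self.pos.get() }
}

fn main() {
    let buf = [0xFF, 0x00, 0xAA];
    let cursor = Cursor { data: &buf, pos: Cell::new(0x0) };

    assert_eq!(cursor.read(0x2).unwrap(), &[0xFF, 0x00]);
    assert_eq!(cursor.consumed(), 0x2);

    // Over-reading fails without advancing the cursor.
    assert_eq!(cursor.read(0x2), Err((0x2, 0x1)));
    assert_eq!(cursor.consumed(), 0x2);
}
```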


@ -28,47 +28,53 @@ use alloc::boxed::Box;
/// Mapping of [`core::result::Result`]. /// Mapping of [`core::result::Result`].
pub type Result<T> = core::result::Result<T, Error>; pub type Result<T> = core::result::Result<T, Error>;
/// (De)serialisation failures. /// bzipper errors.
/// ///
/// These variants are used when deserialisation fails. /// These variants are used when deserialisation fails.
/// Serialisations are assumed infallible. /// Serialisations are assumed infallible.
#[derive(Debug)] #[derive(Debug)]
pub enum Error { pub enum Error {
/// An array could not hold the requested amount of elements. /// An array could not hold the requested amount of elements.
ArrayTooShort { req: usize, len: usize }, ArrayTooShort {
/// The required amount of elements.
req: usize,
/// The actual length of the array.
len: usize,
},
/// A string encountered an invalid UTF-8 sequence. /// A string encountered an invalid UTF-8 sequence.
BadString { source: Utf8Error }, BadString { source: Utf8Error },
/// An implementor-defined error. /// An unspecified (de)serialisation error.
/// ///
/// This is mainly useful if none of the predefined errors are appropriate. /// This is mainly useful if none of the predefined errors are appropriate.
#[cfg(feature = "alloc")] #[cfg(feature = "alloc")]
#[cfg_attr(doc, doc(cfg(feature = "alloc")))] #[cfg_attr(doc, doc(cfg(feature = "alloc")))]
CustomError { source: Box<dyn core::error::Error> }, CustomError(Box<dyn core::error::Error>),
/// Bytes were requested on an empty stream. /// Bytes were requested on an empty stream.
EndOfStream { req: usize, rem: usize }, EndOfStream { req: usize, rem: usize },
/// A boolean encountered a value outside `0` and `1`. /// A boolean encountered a value outside `0` and `1`.
InvalidBoolean { value: u8 }, InvalidBoolean(u8),
/// An invalid code point was encountered. /// An invalid code point was encountered.
/// ///
/// This includes surrogate points in the inclusive range `U+D800` to `U+DFFF`, as well as values larger than `U+10FFFF`. /// This includes surrogate points in the inclusive range `U+D800` to `U+DFFF`, as well as values larger than `U+10FFFF`.
InvalidCodePoint { value: u32 }, InvalidCodePoint(u32),
/// An invalid enumeration discriminant was provided. /// An invalid enumeration discriminant was provided.
InvalidDiscriminant { value: u32 }, InvalidDiscriminant(u32),
/// An `isize` value couldn't fit into `16` bits. /// An `isize` value couldn't fit into `32` bits.
IsizeOutOfRange { value: isize }, IsizeOutOfRange(isize),
/// A non-zero integer encountered the value `0`. /// A non-zero integer encountered the value `0`.
NullInteger, NullInteger,
/// A `usize` value couldn't fit into `16` bits. /// A `usize` value couldn't fit into `32` bits.
UsizeOutOfRange { value: usize }, UsizeOutOfRange(usize),
} }
impl Display for Error { impl Display for Error {
@ -83,28 +89,28 @@ impl Display for Error {
=> write!(f, "unable to parse utf8: \"{source}\""), => write!(f, "unable to parse utf8: \"{source}\""),
#[cfg(feature = "alloc")] #[cfg(feature = "alloc")]
CustomError { ref source } CustomError(ref source)
=> write!(f, "{source}"), => write!(f, "{source}"),
EndOfStream { req, rem } EndOfStream { req, rem }
=> write!(f, "({req}) byte(s) were requested but only ({rem}) byte(s) were left"), => write!(f, "({req}) byte(s) were requested but only ({rem}) byte(s) were left"),
InvalidBoolean { value } InvalidBoolean(value)
=> write!(f, "expected boolean but got {value:#02X}"), => write!(f, "expected boolean but got {value:#02X}"),
InvalidCodePoint { value } InvalidCodePoint(value)
=> write!(f, "code point U+{value:04X} is not valid"), => write!(f, "code point U+{value:04X} is not valid"),
InvalidDiscriminant { value } InvalidDiscriminant(value)
=> write!(f, "discriminant ({value}) is not valid for the given enumeration"), => write!(f, "discriminant ({value}) is not valid for the given enumeration"),
IsizeOutOfRange { value } IsizeOutOfRange(value)
=> write!(f, "signed size value ({value}) cannot be serialised: must be in the range ({}) to ({})", i16::MIN, i16::MAX), => write!(f, "signed size value ({value}) cannot be serialised: must be in the range ({}) to ({})", i16::MIN, i16::MAX),
NullInteger NullInteger
=> write!(f, "expected non-zero integer but got (0)"), => write!(f, "expected non-zero integer but got (0)"),
UsizeOutOfRange { value } UsizeOutOfRange(value)
=> write!(f, "unsigned size value ({value}) cannot be serialised: must be at most ({})", u16::MAX), => write!(f, "unsigned size value ({value}) cannot be serialised: must be at most ({})", u16::MAX),
} }
} }
@ -118,7 +124,7 @@ impl core::error::Error for Error {
BadString { ref source } => Some(source), BadString { ref source } => Some(source),
#[cfg(feature = "alloc")] #[cfg(feature = "alloc")]
CustomError { ref source } => Some(source.as_ref()), CustomError(ref source) => Some(source.as_ref()),
_ => None, _ => None,
} }
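The refactor turns single-field struct variants into tuple variants, which also simplifies the `Display` match arms. A reduced sketch of the pattern (the `DecodeError` enum is hypothetical, not bzipper's full `Error`):

```rust
use std::fmt::{self, Display, Formatter};

// Single-field variants as tuple variants; multi-field
// variants keep named fields.
#[derive(Debug)]
enum DecodeError {
    InvalidBoolean(u8),
    EndOfStream { req: usize, rem: usize },
}

impl Display for DecodeError {
    fn fmt(&self, f: &mut Formatter) -> fmt::Result {
        use DecodeError::*;

        match *self {
            InvalidBoolean(value)
            => write!(f, "expected boolean but got {value:#04X}"),

            EndOfStream { req, rem }
            => write!(f, "({req}) byte(s) were requested but only ({rem}) byte(s) were left"),
        }
    }
}

fn main() {
    assert_eq!(
        DecodeError::InvalidBoolean(0x02).to_string(),
        "expected boolean but got 0x02",
    );

    assert_eq!(
        DecodeError::EndOfStream { req: 0x2, rem: 0x1 }.to_string(),
        "(2) byte(s) were requested but only (1) byte(s) were left",
    );
}
```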


@ -1,46 +0,0 @@
// Copyright 2024 Gabriel Bjørnager Jensen.
//
// This file is part of bzipper.
//
// bzipper is free software: you can redistribute
// it and/or modify it under the terms of the GNU
// Lesser General Public License as published by
// the Free Software Foundation, either version 3
// of the License, or (at your option) any later
// version.
//
// bzipper is distributed in the hope that it will
// be useful, but WITHOUT ANY WARRANTY; without
// even the implied warranty of MERCHANTABILITY or
// FITNESS FOR A PARTICULAR PURPOSE. See the GNU
// Lesser General Public License for more details.
//
// You should have received a copy of the GNU Less-
// er General Public License along with bzipper. If
// not, see <https://www.gnu.org/licenses/>.
use core::mem::MaybeUninit;
/// Iterator to a fixed vector.
///
/// This type is used by the [`FixedString`](crate::FixedString) type for iterating over an owned string.
#[must_use]
pub struct FixedIter<T, const N: usize> {
pub(in crate) buf: [MaybeUninit<T>; N],
pub(in crate) pos: usize,
pub(in crate) len: usize,
}
impl<T, const N: usize> Iterator for FixedIter<T, N> {
type Item = T;
fn next(&mut self) -> Option<Self::Item> {
if self.pos >= self.len { return None };
let item = unsafe { self.buf[self.pos].assume_init_read() };
self.pos += 0x1;
Some(item)
}
}


@ -22,39 +22,55 @@
#[cfg(test)] #[cfg(test)]
mod test; mod test;
use crate::{Deserialise, Error, FixedIter, Serialise}; use crate::{
Deserialise,
Dstream,
Error,
Serialise,
Sstream,
};
use core::borrow::{Borrow, BorrowMut};
use core::cmp::Ordering; use core::cmp::Ordering;
use core::fmt::{Debug, Display, Formatter}; use core::fmt::{Debug, Display, Formatter};
use core::mem::MaybeUninit; use core::hash::{Hash, Hasher};
use core::ops::{Deref, DerefMut, Index, IndexMut}; use core::ops::{Add, AddAssign, Deref, DerefMut, Index, IndexMut};
use core::slice::SliceIndex; use core::slice::SliceIndex;
use core::str::FromStr; use core::str::{Chars, CharIndices, FromStr};
#[cfg(feature = "alloc")] #[cfg(feature = "alloc")]
use alloc::string::{String, ToString}; use alloc::string::String;
/// Owned string with maximum size. #[cfg(feature = "std")]
use std::ffi::OsStr;
#[cfg(feature = "std")]
use std::net::ToSocketAddrs;
#[cfg(feature = "std")]
use std::path::Path;
/// Stack-allocated string with maximum size.
/// ///
/// This is in contrast to [String] -- which has no size limit in practice -- and [str], which is unsized. /// This is in contrast to [String] -- which has no size limit in practice -- and [str], which is unsized.
/// ///
/// The string itself is encoded in UTF-8 for interoperability with Rust's standard string facilities, as well as for memory concerns.
///
/// Keep in mind that the size limit specified by `N` denotes *bytes* and not *characters* -- i.e. a value of `8` may translate to between two and eight characters, depending on their codepoints.
///
/// # Examples /// # Examples
/// ///
/// All instances of this type have the same size if the value of `N` is also the same. /// All instances of this type have the same size if the value of `N` is also the same.
/// This size can be found through
///
/// `size_of::<char>() * N + size_of::<usize>()`.
///
/// Therefore, the following four strings have -- despite their different contents -- the same total size. /// Therefore, the following four strings have -- despite their different contents -- the same total size.
/// ///
/// ``` /// ```rust
/// use bzipper::FixedString; /// use bzipper::FixedString;
/// use std::str::FromStr; /// use std::str::FromStr;
/// ///
/// let str0 = FixedString::<0xF>::new(); // Empty string. /// let str0 = FixedString::<0x40>::new(); // Empty string.
/// let str1 = FixedString::<0xF>::from_str("Hello there!"); /// let str1 = FixedString::<0x40>::from_str("Hello there!").unwrap();
/// let str2 = FixedString::<0xF>::from_str("أنا من أوروپا"); /// let str2 = FixedString::<0x40>::from_str("أنا من أوروپا").unwrap();
/// let str3 = FixedString::<0xF>::from_str("COGITO ERGO SUM"); /// let str3 = FixedString::<0x40>::from_str("COGITO ERGO SUM").unwrap();
/// ///
/// assert_eq!(size_of_val(&str0), size_of_val(&str1)); /// assert_eq!(size_of_val(&str0), size_of_val(&str1));
/// assert_eq!(size_of_val(&str0), size_of_val(&str2)); /// assert_eq!(size_of_val(&str0), size_of_val(&str2));
@ -64,10 +80,10 @@ use alloc::string::{String, ToString};
/// assert_eq!(size_of_val(&str2), size_of_val(&str3)); /// assert_eq!(size_of_val(&str2), size_of_val(&str3));
/// ``` /// ```
/// ///
/// These three strings can---by extension, in theory---also interchange their contents between each other. /// These three strings can -- by extension, in theory -- also interchange their contents between each other.
#[derive(Clone, Deserialise, Serialise)] #[derive(Clone)]
pub struct FixedString<const N: usize> { pub struct FixedString<const N: usize> {
buf: [char; N], buf: [u8; N],
len: usize, len: usize,
} }
@ -81,62 +97,78 @@ impl<const N: usize> FixedString<N> {
/// The constructed string will have a null length. /// The constructed string will have a null length.
/// All characters inside the internal buffer are instanced as `U+0000 NULL`. /// All characters inside the internal buffer are instanced as `U+0000 NULL`.
/// ///
/// For constructing a string with an already defined buffer, see [`from_chars`](Self::from_chars) and [`from_raw_parts`](Self::from_raw_parts). /// For constructing a string with an already defined buffer, see [`from_raw_parts`](Self::from_raw_parts) and [`from_str`](Self::from_str).
#[inline(always)] #[inline(always)]
#[must_use] #[must_use]
pub const fn new() -> Self { Self { buf: ['\0'; N], len: 0x0 } } pub const fn new() -> Self { Self { buf: [0x00; N], len: 0x0 } }
/// Consumes the buffer into a fixed string. /// Constructs a new, fixed-size string from raw parts.
/// ///
/// The internal length is set to `N`. /// The provided parts are not tested in any way.
/// For a similar function but with an explicit size, see [`from_raw_parts`](Self::from_raw_parts).
#[inline(always)]
#[must_use]
pub const fn from_chars(buf: [char; N]) -> Self { Self { buf, len: N } }
/// Constructs a fixed string from raw parts.
#[inline(always)]
#[must_use]
pub const fn from_raw_parts(buf: [char; N], len: usize) -> Self { Self { buf, len } }
/// Deconstructs a fixed string into its raw parts.
#[inline(always)]
#[must_use]
pub const fn into_raw_parts(self) -> ([char; N], usize) { (self.buf, self.len) }
/// Gets a pointer to the first character.
#[inline(always)]
#[must_use]
pub const fn as_ptr(&self) -> *const char { self.buf.as_ptr() }
/// Gets a mutable pointer to the first character.
/// ///
/// This function can only be marked as `const` when `const_mut_refs` is implemented. /// # Safety
/// See tracking issue [`#57349`](https://github.com/rust-lang/rust/issues/57349/) for more information. ///
/// The value of `len` may not exceed that of `N`.
/// Additionally, the octets in `buf` (from index zero up to the value of `len`) must be valid UTF-8 codepoints.
///
/// If any of these requirements are violated, behaviour is undefined.
#[inline(always)] #[inline(always)]
#[must_use] #[must_use]
pub fn as_mut_ptr(&mut self) -> *mut char { self.buf.as_mut_ptr() } pub const unsafe fn from_raw_parts(buf: [u8; N], len: usize) -> Self { Self { buf, len } }
/// Borrows the string as a character slice. /// Destructs the provided string into its raw parts.
///
/// The returned values are valid to pass on to [`from_raw_parts`](Self::from_raw_parts).
///
/// The returned byte array is guaranteed to be fully initialised.
/// However, only octets up to an index of [`len`](Self::len) are also guaranteed to be valid UTF-8 codepoints.
#[inline(always)]
#[must_use]
pub const fn into_raw_parts(self) -> ([u8; N], usize) { (self.buf, self.len) }
/// Gets a pointer to the first octet.
#[inline(always)]
#[must_use]
pub const fn as_ptr(&self) -> *const u8 { self.buf.as_ptr() }
// This function can only be marked as `const` when
// `const_mut_refs` is implemented. See tracking
// issue #57349 for more information.
/// Gets a mutable pointer to the first octet.
///
#[inline(always)]
#[must_use]
pub fn as_mut_ptr(&mut self) -> *mut u8 { self.buf.as_mut_ptr() }
/// Borrows the string as a byte slice.
/// ///
/// The range of the returned slice only includes characters that are "used." /// The range of the returned slice only includes characters that are "used."
/// For borrowing the entire internal buffer, see [`as_mut_slice`](Self::as_mut_slice).
#[inline(always)] #[inline(always)]
#[must_use] #[must_use]
pub const fn as_slice(&self) -> &[char] { pub const fn as_bytes(&self) -> &[u8] {
// We need to use `from_raw_parts` to mark this // We need to use `from_raw_parts` to mark this
// function `const`. // function `const`.
unsafe { core::slice::from_raw_parts(self.as_ptr(), self.len()) } unsafe { core::slice::from_raw_parts(self.as_ptr(), self.len()) }
} }
/// Mutably borrows the string as a character slice. /// Borrows the string as a string slice.
/// ///
/// The range of the returned slice includes the entire internal buffer. /// The range of the returned slice only includes characters that are "used."
/// For borrowing only the "used" characters, see [`as_slice`](Self::as_slice).
#[inline(always)] #[inline(always)]
#[must_use] #[must_use]
pub fn as_mut_slice(&mut self) -> &mut [char] { &mut self.buf[0x0..self.len] } pub const fn as_str(&self) -> &str { unsafe { core::str::from_utf8_unchecked(self.as_bytes()) } }
/// Mutably borrows the string as a string slice.
///
/// The range of the returned slice only includes characters that are "used."
#[inline(always)]
#[must_use]
pub fn as_mut_str(&mut self) -> &mut str {
let range = 0x0..self.len();
unsafe { core::str::from_utf8_unchecked_mut(&mut self.buf[range]) }
}
/// Returns the length of the string. /// Returns the length of the string.
/// ///
@ -145,28 +177,32 @@ impl<const N: usize> FixedString<N> {
#[must_use] #[must_use]
pub const fn len(&self) -> usize { self.len } pub const fn len(&self) -> usize { self.len }
/// Checks if the string is empty, i.e. `self.len() == 0x0`. /// Checks if the string is empty, i.e. no characters are contained.
#[inline(always)] #[inline(always)]
#[must_use] #[must_use]
pub const fn is_empty(&self) -> bool { self.len() == 0x0 } pub const fn is_empty(&self) -> bool { self.len() == 0x0 }
/// Checks if the string is full, i.e. `self.len() == N`. /// Checks if the string is full, i.e. it cannot hold any more characters.
#[inline(always)] #[inline(always)]
#[must_use] #[must_use]
pub const fn is_full(&self) -> bool { self.len() == N } pub const fn is_full(&self) -> bool { self.len() == N }
/// Sets the internal length. /// Returns the total capacity of the string.
/// ///
/// The length is compared with `N` to guarantee that bounds are honoured. /// This is defined as being exactly the value of `N`.
///
/// # Panics
///
/// This method panics if the value of `len` is greater than that of `N`.
#[inline(always)] #[inline(always)]
pub fn set_len(&mut self, len: usize) { #[must_use]
assert!(self.len <= N, "cannot set length longer than the fixed size"); pub const fn capacity(&self) -> usize { N }
self.len = len;
} /// Gets a substring of the string.
#[inline(always)]
#[must_use]
pub fn get<I: SliceIndex<str>>(&self, index: I) -> Option<&I::Output> { self.as_str().get(index) }
/// Gets a mutable substring of the string.
#[inline(always)]
#[must_use]
pub fn get_mut<I: SliceIndex<str>>(&mut self, index: I) -> Option<&mut I::Output> { self.as_mut_str().get_mut(index) }
/// Pushes a character into the string. /// Pushes a character into the string.
/// ///
@ -174,13 +210,34 @@ impl<const N: usize> FixedString<N> {
/// ///
/// # Panics /// # Panics
/// ///
/// If the string cannot hold any more character (i.e. it is full), this method will panic. /// If the string cannot hold the provided character *after* encoding, this method will panic.
#[inline(always)] #[inline(always)]
pub fn push(&mut self, c: char) { pub fn push(&mut self, c: char) {
assert!(!self.is_full(), "cannot push character to full string"); let mut buf = [0x00; 0x4];
let s = c.encode_utf8(&mut buf);
self.buf[self.len] = c; self.push_str(s);
self.len += 0x1; }
/// Pushes a string slice into the string.
///
/// The internal length is updated accordingly.
///
/// # Panics
///
/// If the string cannot hold the provided slice, this method will panic.
#[inline(always)]
pub fn push_str(&mut self, s: &str) {
let rem = self.buf.len() - self.len;
let req = s.len();
assert!(rem >= req, "cannot push string beyond fixed length");
let start = self.len;
let stop = start + req;
let buf = &mut self.buf[start..stop];
buf.copy_from_slice(s.as_bytes());

self.len = stop;
} }
/// Pops a character from the string. /// Pops a character from the string.
@ -188,38 +245,76 @@ impl<const N: usize> FixedString<N> {
/// The internal length is updated accordingly. /// The internal length is updated accordingly.
/// ///
/// If no characters are left (i.e. the string is empty), an instance of [`None`] is returned. /// If no characters are left (i.e. the string is empty), an instance of [`None`] is returned.
///
/// **Note that this method is currently unimplemented.**
#[deprecated = "temporarily unimplemented"]
#[inline(always)] #[inline(always)]
pub fn pop(&mut self) -> Option<char> { pub fn pop(&mut self) -> Option<char> { todo!() }
self.len
.checked_sub(0x1)
.map(|len| {
let c = self.buf[self.len];
self.len = len;
c /// Returns an iterator of the string's characters.
}) #[inline(always)]
pub fn chars(&self) -> Chars { self.as_str().chars() }
/// Returns an iterator of the string's characters along with their positions.
#[inline(always)]
pub fn char_indices(&self) -> CharIndices { self.as_str().char_indices() }
}
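With the switch to UTF-8, `push` and `push_str` measure capacity in encoded *bytes* rather than characters. A self-contained sketch of that byte-oriented copy (the `FixedStr` type is hypothetical, not the real `FixedString`):

```rust
// A fixed byte buffer plus a used-length field, with pushes
// measured in encoded UTF-8 bytes.
struct FixedStr<const N: usize> {
    buf: [u8; N],
    len: usize,
}

impl<const N: usize> FixedStr<N> {
    const fn new() -> Self { Self { buf: [0x00; N], len: 0x0 } }

    fn push_str(&mut self, s: &str) {
        let rem = N - self.len;
        assert!(rem >= s.len(), "cannot push string beyond fixed length");

        let stop = self.len + s.len();
        self.buf[self.len..stop].copy_from_slice(s.as_bytes());

        self.len = stop; // advance the used length
    }

    fn as_str(&self) -> &str {
        // Bytes are only ever copied in from `&str` data, so the
        // used range is always valid UTF-8.
        core::str::from_utf8(&self.buf[..self.len]).unwrap()
    }
}

fn main() {
    let mut s = FixedStr::<0x10>::new();

    s.push_str("naïve");
    assert_eq!(s.as_str(), "naïve");

    // Five characters, but six bytes: 'ï' encodes to two octets.
    assert_eq!(s.len, 0x6);
}
```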
impl<const N: usize> Add<&str> for FixedString<N> {
type Output = Self;
fn add(mut self, rhs: &str) -> Self::Output {
self.push_str(rhs);
self
} }
} }
impl<const N: usize> AddAssign<&str> for FixedString<N> {
	fn add_assign(&mut self, rhs: &str) { self.push_str(rhs) }
}

impl<const N: usize> AsMut<str> for FixedString<N> {
	#[inline(always)]
	fn as_mut(&mut self) -> &mut str { self.as_mut_str() }
}

#[cfg(feature = "std")]
#[cfg_attr(doc, doc(cfg(feature = "std")))]
impl<const N: usize> AsRef<OsStr> for FixedString<N> {
	#[inline(always)]
	fn as_ref(&self) -> &OsStr { self.as_str().as_ref() }
}

#[cfg(feature = "std")]
#[cfg_attr(doc, doc(cfg(feature = "std")))]
impl<const N: usize> AsRef<Path> for FixedString<N> {
	#[inline(always)]
	fn as_ref(&self) -> &Path { self.as_str().as_ref() }
}

impl<const N: usize> AsRef<str> for FixedString<N> {
	#[inline(always)]
	fn as_ref(&self) -> &str { self.as_str() }
}

impl<const N: usize> AsRef<[u8]> for FixedString<N> {
	#[inline(always)]
	fn as_ref(&self) -> &[u8] { self.as_bytes() }
}

impl<const N: usize> Borrow<str> for FixedString<N> {
	#[inline(always)]
	fn borrow(&self) -> &str { self.as_str() }
}

impl<const N: usize> BorrowMut<str> for FixedString<N> {
	#[inline(always)]
	fn borrow_mut(&mut self) -> &mut str { self.as_mut_str() }
}
impl<const N: usize> Debug for FixedString<N> {
	#[inline]
	fn fmt(&self, f: &mut Formatter) -> core::fmt::Result { Debug::fmt(self.as_str(), f) }
}
impl<const N: usize> Default for FixedString<N> {
@@ -227,159 +322,129 @@ impl<const N: usize> Default for FixedString<N> {
	fn default() -> Self { Self { buf: [Default::default(); N], len: 0x0 } }
}
impl<const N: usize> Deref for FixedString<N> {
	type Target = str;

	#[inline(always)]
	fn deref(&self) -> &Self::Target { self.as_str() }
}

impl<const N: usize> DerefMut for FixedString<N> {
	#[inline(always)]
	fn deref_mut(&mut self) -> &mut Self::Target { self.as_mut_str() }
}
impl<const N: usize> Deserialise for FixedString<N> {
	#[inline]
	fn deserialise(stream: &Dstream) -> Result<Self, Error> {
		let len = Deserialise::deserialise(stream)?;
		if len > N { return Err(Error::ArrayTooShort { req: len, len: N }) };

		let bytes = stream.read(len)?;
		let s = core::str::from_utf8(bytes)
			.map_err(|e| Error::BadString { source: e })?;

		Self::from_str(s)
	}
}
impl<const N: usize> Display for FixedString<N> {
	#[inline]
	fn fmt(&self, f: &mut Formatter) -> core::fmt::Result { Display::fmt(self.as_str(), f) }
}

impl<const N: usize> Eq for FixedString<N> { }
impl<const N: usize> FromStr for FixedString<N> {
	type Err = Error;

	#[inline]
	fn from_str(s: &str) -> Result<Self, Self::Err> {
		let len = s.len();
		if len > N { return Err(Error::ArrayTooShort { req: len, len: N }) };

		let mut buf = [0x00; N];
		unsafe { core::ptr::copy_nonoverlapping(s.as_ptr(), buf.as_mut_ptr(), len) };

		// The remaining bytes are already initialised to
		// null.

		Ok(Self { buf, len })
	}
}
impl<const N: usize> Hash for FixedString<N> {
	#[inline(always)]
	fn hash<H: Hasher>(&self, state: &mut H) { self.as_str().hash(state) }
}

impl<I: SliceIndex<str>, const N: usize> Index<I> for FixedString<N> {
	type Output = I::Output;

	#[inline(always)]
	fn index(&self, index: I) -> &Self::Output { self.get(index).unwrap() }
}

impl<I: SliceIndex<str>, const N: usize> IndexMut<I> for FixedString<N> {
	#[inline(always)]
	fn index_mut(&mut self, index: I) -> &mut Self::Output { self.get_mut(index).unwrap() }
}
impl<const N: usize> Ord for FixedString<N> {
	#[inline(always)]
	fn cmp(&self, other: &Self) -> Ordering { self.as_str().cmp(other.as_str()) }
}

impl<const N: usize, const M: usize> PartialEq<FixedString<M>> for FixedString<N> {
	#[inline(always)]
	fn eq(&self, other: &FixedString<M>) -> bool { self.as_str() == other.as_str() }
}

impl<const N: usize> PartialEq<&str> for FixedString<N> {
	#[inline(always)]
	fn eq(&self, other: &&str) -> bool { self.as_str() == *other }
}

impl<const N: usize, const M: usize> PartialOrd<FixedString<M>> for FixedString<N> {
	#[inline(always)]
	fn partial_cmp(&self, other: &FixedString<M>) -> Option<Ordering> { self.as_str().partial_cmp(other.as_str()) }
}

impl<const N: usize> PartialOrd<&str> for FixedString<N> {
	#[inline(always)]
	fn partial_cmp(&self, other: &&str) -> Option<Ordering> { self.as_str().partial_cmp(*other) }
}

impl<const N: usize> Serialise for FixedString<N> {
	const MAX_SERIALISED_SIZE: usize = N + usize::MAX_SERIALISED_SIZE;

	fn serialise(&self, stream: &mut Sstream) -> Result<(), Error> {
		self.len().serialise(stream)?;
		stream.write(self.as_bytes())?;

		Ok(())
	}
}

#[cfg(feature = "std")]
#[cfg_attr(doc, doc(cfg(feature = "std")))]
impl<const N: usize> ToSocketAddrs for FixedString<N> {
	type Iter = <str as ToSocketAddrs>::Iter;

	#[inline(always)]
	fn to_socket_addrs(&self) -> std::io::Result<Self::Iter> { self.as_str().to_socket_addrs() }
}

impl<const N: usize> TryFrom<char> for FixedString<N> {
	type Error = <Self as FromStr>::Err;

	#[inline(always)]
	fn try_from(value: char) -> Result<Self, Self::Error> {
		let mut buf = [0x00; 0x4];
		let s = value.encode_utf8(&mut buf);

		s.parse()
	}
}
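The `FixedString` serialisation above writes a length prefix followed by the raw UTF-8 bytes. A standalone sketch of that wire format, assuming the length is encoded as a big-endian `u32` (as the crate does for `usize`) and using an illustrative `Vec<u8>` in place of an s-stream:

```rust
// Sketch of the length-prefixed UTF-8 wire format:
// 4-byte big-endian length, then the string's bytes.
// `serialise_str` is a hypothetical helper, not crate API.
fn serialise_str(s: &str, buf: &mut Vec<u8>) {
    buf.extend_from_slice(&(s.len() as u32).to_be_bytes());
    buf.extend_from_slice(s.as_bytes());
}

fn main() {
    let mut buf = Vec::new();
    serialise_str("A", &mut buf);
    // One byte of payload, so the prefix is 0x00000001.
    assert_eq!(buf, [0x00, 0x00, 0x00, 0x01, 0x41]);
}
```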
@@ -391,6 +456,7 @@ impl<const N: usize> TryFrom<&str> for FixedString<N> {
}

#[cfg(feature = "alloc")]
#[cfg_attr(doc, doc(cfg(feature = "alloc")))]
impl<const N: usize> TryFrom<String> for FixedString<N> {
	type Error = <Self as FromStr>::Err;

@@ -398,8 +464,17 @@ impl<const N: usize> TryFrom<String> for FixedString<N> {
	fn try_from(value: String) -> Result<Self, Self::Error> { Self::from_str(&value) }
}
/// Converts the fixed-size string into a dynamic string.
///
/// The capacity of the resulting [`String`] object is equal to the value of `N`.
#[cfg(feature = "alloc")]
#[cfg_attr(doc, doc(cfg(feature = "alloc")))]
impl<const N: usize> From<FixedString<N>> for String {
	#[inline(always)]
	fn from(value: FixedString<N>) -> Self {
		let mut s = Self::with_capacity(N);
		s.push_str(value.as_str());

		s
	}
}
View file
@@ -25,9 +25,9 @@ use core::cmp::Ordering;

#[test]
fn test_fixed_string() {
	let str0 = FixedString::<0x0C>::try_from("Hello there!").unwrap();
	let str1 = FixedString::<0x12>::try_from("MEIN_GRO\u{1E9E}_GOTT").unwrap();
	let str2 = FixedString::<0x05>::try_from("Hello").unwrap();

	assert_eq!(str0.partial_cmp(&str0), Some(Ordering::Equal));
	assert_eq!(str0.partial_cmp(&str1), Some(Ordering::Less));
View file
@@ -23,7 +23,7 @@

//! Binary (de)serialisation.
//!
//! In contrast to [Serde](https://crates.io/crates/serde/)/[Bincode](https://crates.io/crates/bincode/), the primary goal of bzipper is to serialise with a known size constraint.
//! Therefore, this crate may be more suited for networking or other cases where a fixed-sized buffer is needed.
//!
//! Keep in mind that this project is still work-in-progress.
@@ -41,15 +41,15 @@
//!
//! # Usage
//!
//! This crate revolves around the [`Serialise`] and [`Deserialise`] traits, both of which use *streams* -- or more specifically -- [s-streams](Sstream) and [d-streams](Dstream).
//!
//! Many core types come implemented with bzipper, including primitives as well as some standard library types such as [`Option`] and [`Result`](core::result::Result).
//!
//! It is recommended in most cases to just derive these two traits for custom types (although this is only supported with enumerations and structures).
//! Here, each field is *chained* according to declaration order:
//!
//! ```
//! use bzipper::{Buffer, Deserialise, Serialise};
//!
//! #[derive(Debug, Deserialise, PartialEq, Serialise)]
//! struct IoRegister {
@@ -57,45 +57,55 @@
//!     value: u16,
//! }
//!
//! let mut buf = Buffer::new();
//! buf.write(IoRegister { addr: 0x04000000, value: 0x0402 }).unwrap();
//!
//! assert_eq!(buf.len(), 0x6);
//! assert_eq!(buf, [0x04, 0x00, 0x00, 0x00, 0x04, 0x02]);
//!
//! assert_eq!(buf.read().unwrap(), IoRegister { addr: 0x04000000, value: 0x0402 });
//! ```
//!
//! ## Serialisation
//!
//! To serialise an object implementing `Serialise`, simply allocate a buffer for the serialisation and wrap it in an s-stream (*serialisation stream*) with the [`Sstream`] type.
//!
//! ```
//! use bzipper::{Serialise, Sstream};
//!
//! let mut buf = [Default::default(); char::MAX_SERIALISED_SIZE];
//! let mut stream = Sstream::new(&mut buf);
//!
//! 'Ж'.serialise(&mut stream).unwrap();
//!
//! assert_eq!(stream, [0x00, 0x00, 0x04, 0x16]);
//! ```
//!
//! The maximum size of any given serialisation is specified by the [`MAX_SERIALISED_SIZE`](Serialise::MAX_SERIALISED_SIZE) constant.
//!
//! We can also use streams to chain multiple elements together:
//!
//! ```
//! use bzipper::{Serialise, Sstream};
//!
//! let mut buf = [Default::default(); char::MAX_SERIALISED_SIZE * 0x5];
//! let mut stream = Sstream::new(&mut buf);
//!
//! // Note: For serialising multiple characters, the
//! // `FixedString` type is usually preferred.
//!
//! 'ل'.serialise(&mut stream).unwrap();
//! 'ا'.serialise(&mut stream).unwrap();
//! 'م'.serialise(&mut stream).unwrap();
//! 'د'.serialise(&mut stream).unwrap();
//! 'ا'.serialise(&mut stream).unwrap();
//!
//! assert_eq!(buf, [
//!     0x00, 0x00, 0x06, 0x44, 0x00, 0x00, 0x06, 0x27,
//!     0x00, 0x00, 0x06, 0x45, 0x00, 0x00, 0x06, 0x2F,
//!     0x00, 0x00, 0x06, 0x27
//! ]);
//! ```
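The big-endian encoding the docs describe can be sketched without the crate at all: a `char` is widened to `u32` and written as four network-order bytes, and chaining is just repeated appends to the same buffer. `serialise_char` is a hypothetical helper, not bzipper's API:

```rust
// Standalone sketch of the documented wire format for
// `char`: widen to `u32`, emit big-endian bytes.
fn serialise_char(c: char, buf: &mut Vec<u8>) {
    buf.extend_from_slice(&u32::from(c).to_be_bytes());
}

fn main() {
    let mut buf = Vec::new();

    // 'Ж' is U+0416, so the stream is [0x00, 0x00, 0x04, 0x16].
    serialise_char('Ж', &mut buf);
    assert_eq!(buf, [0x00, 0x00, 0x04, 0x16]);

    // Chaining: a second character simply follows the first.
    serialise_char('ل', &mut buf); // U+0644
    assert_eq!(&buf[0x4..], [0x00, 0x00, 0x06, 0x44]);
}
```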
//!
//! When serialising primitives, the resulting byte stream is in big endian (a.k.a. network endian).
@@ -103,27 +113,35 @@
//!
//! ## Deserialisation
//!
//! Deserialisation works with a similar syntax to serialisation.
//!
//! D-streams (*deserialisation streams*) use the [`Dstream`] type and are constructed in a manner similar to s-streams.
//! To deserialise a buffer, simply call the [`deserialise`](Deserialise::deserialise) method with the stream:
//!
//! ```
//! use bzipper::{Deserialise, Dstream};
//!
//! let data = [0x45, 0x54];
//! let stream = Dstream::new(&data);
//! assert_eq!(u16::deserialise(&stream).unwrap(), 0x4554);
//! ```
//!
//! And just like s-streams, d-streams can also be used to handle chaining:
//!
//! ```
//! use bzipper::{Deserialise, Dstream};
//!
//! let data = [0x45, 0x54];
//! let stream = Dstream::new(&data);
//!
//! assert_eq!(u8::deserialise(&stream).unwrap(), 0x45);
//! assert_eq!(u8::deserialise(&stream).unwrap(), 0x54);
//!
//! // The data can also be deserialised as a tuple (up
//! // to twelve elements).
//!
//! let stream = Dstream::new(&data);
//! assert_eq!(<(u8, u8)>::deserialise(&stream).unwrap(), (0x45, 0x54));
//! ```
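The d-stream idea — consuming big-endian bytes from the front of a slice — can be approximated with plain std Rust. This is a hedged sketch; `take_u16` is an illustrative function, not the crate's API:

```rust
// Sketch of stream-style deserialisation: each read
// consumes bytes from the front of the slice, so
// successive calls naturally chain.
fn take_u16(data: &mut &[u8]) -> u16 {
    let (head, tail) = data.split_at(0x2);
    *data = tail; // advance the "stream" past the consumed bytes
    u16::from_be_bytes([head[0x0], head[0x1]])
}

fn main() {
    let mut data: &[u8] = &[0x45, 0x54];
    assert_eq!(take_u16(&mut data), 0x4554);
    assert!(data.is_empty());
}
```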
#![no_std]

@@ -139,8 +157,6 @@ extern crate alloc;
extern crate std;

/// Implements [`Deserialise`] for the provided type.
#[doc(inline)]
pub use bzipper_macros::Deserialise;
@@ -151,7 +167,7 @@ pub use bzipper_macros::Deserialise;

/// For structures, each element is chained in **order of declaration.**
/// For example, the following struct will serialise its field `foo` before `bar`:
///
/// ```rust
/// use bzipper::Serialise;
///
/// #[derive(Serialise)]
@@ -161,9 +177,9 @@ pub use bzipper_macros::Deserialise;
/// }
/// ```
///
/// Should the structure's declaration change, then all previous derived serialisations must be considered void.
///
/// The value of [`MAX_SERIALISED_SIZE`](Serialise::MAX_SERIALISED_SIZE) is set to the combined value of all fields.
///
/// If the structure is a unit structure (i.e. it has *no* fields), it is serialised equivalently to the [unit] type.
///
@@ -176,12 +192,12 @@ pub use bzipper_macros::Deserialise;
/// Variants with fields are serialised exactly like structures.
/// That is, each field is chained in order of declaration.
///
/// Each variant has its own value of `MAX_SERIALISED_SIZE`, and the largest of these values is chosen as the value of the enumeration's own `MAX_SERIALISED_SIZE`.
///
/// # Unions
///
/// Unions cannot derive `Serialise` due to the uncertainty of their contents.
/// The trait should therefore be implemented manually for such types.
#[doc(inline)]
pub use bzipper_macros::Serialise;
@@ -196,7 +212,6 @@ pub(in crate) use use_mod;
use_mod!(pub deserialise);
use_mod!(pub dstream);
use_mod!(pub error);
use_mod!(pub fixed_string);
use_mod!(pub serialise);
use_mod!(pub sstream);
View file
@@ -24,16 +24,20 @@ mod test;

use crate::{Error, Result, Sstream};

use core::{convert::Infallible, hint::unreachable_unchecked, marker::PhantomData};

mod tuple;
/// Denotes a type capable of serialisation.
///
/// It is recommended to simply derive this trait for custom types.
/// It can, however, also be manually implemented:
///
/// ```rust
/// // Manual implementation of custom type. This im-
/// // plementation is equivalent to what would have
/// // been derived.
///
/// use bzipper::{Result, Serialise, Sstream};
///
/// struct Foo {
@@ -42,60 +46,46 @@ mod tuple;
/// }
///
/// impl Serialise for Foo {
///     const MAX_SERIALISED_SIZE: usize = u16::MAX_SERIALISED_SIZE + f32::MAX_SERIALISED_SIZE;
///
///     fn serialise(&self, stream: &mut Sstream) -> Result<()> {
///         // Serialise fields using chaining.
///
///         self.bar.serialise(stream)?;
///         self.baz.serialise(stream)?;
///
///         Ok(())
///     }
/// }
/// ```
///
/// Implementors of this trait should make sure that [`MAX_SERIALISED_SIZE`](Self::MAX_SERIALISED_SIZE) is properly defined.
/// This value indicates the definitively largest size of any serialisation of `Self`.
pub trait Serialise: Sized {
	/// The maximum amount of bytes that can result from a serialisation.
	///
	/// Implementors of this trait should make sure that no serialisation (or deserialisation) uses more than the amount specified by this constant.
	const MAX_SERIALISED_SIZE: usize;

	/// Serialises `self` into the given s-stream.
	///
	/// This method must **never** write more bytes than specified by [`MAX_SERIALISED_SIZE`](Self::MAX_SERIALISED_SIZE).
	/// Doing so is considered a logic error.
	///
	/// # Errors
	///
	/// If serialisation fails, e.g. by an unencodable value being provided, an error is returned.
	fn serialise(&self, stream: &mut Sstream) -> Result<()>;
}
macro_rules! impl_numeric {
	($ty:ty) => {
		impl ::bzipper::Serialise for $ty {
			const MAX_SERIALISED_SIZE: usize = size_of::<$ty>();

			#[inline]
			fn serialise(&self, stream: &mut Sstream) -> Result<()> {
				stream.write(&self.to_be_bytes())?;

				Ok(())
			}
@@ -106,101 +96,82 @@ macro_rules! impl_numeric {

macro_rules! impl_non_zero {
	($ty:ty) => {
		impl ::bzipper::Serialise for ::core::num::NonZero<$ty> {
			const MAX_SERIALISED_SIZE: usize = ::core::mem::size_of::<$ty>();

			#[inline(always)]
			fn serialise(&self, stream: &mut Sstream) -> Result<()> { self.get().serialise(stream) }
		}
	};
}
impl<T: Serialise, const N: usize> Serialise for [T; N] {
	const MAX_SERIALISED_SIZE: usize = T::MAX_SERIALISED_SIZE * N;

	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
		for v in self { v.serialise(stream)? }

		Ok(())
	}
}

impl Serialise for bool {
	const MAX_SERIALISED_SIZE: usize = u8::MAX_SERIALISED_SIZE;

	#[inline(always)]
	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
		u8::from(*self).serialise(stream)
	}
}

impl Serialise for char {
	const MAX_SERIALISED_SIZE: usize = u32::MAX_SERIALISED_SIZE;

	#[inline(always)]
	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
		u32::from(*self).serialise(stream)
	}
}

// Especially useful for `Result<T, Infallible>`.
// *If* that is even needed, of course.
impl Serialise for Infallible {
	const MAX_SERIALISED_SIZE: usize = 0x0;

	#[inline(always)]
	fn serialise(&self, _stream: &mut Sstream) -> Result<()> { unsafe { unreachable_unchecked() } }
}

impl Serialise for isize {
	const MAX_SERIALISED_SIZE: usize = i32::MAX_SERIALISED_SIZE;

	#[inline]
	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
		let value = i32::try_from(*self)
			.map_err(|_| Error::IsizeOutOfRange(*self))?;

		value.serialise(stream)
	}
}

impl<T: Serialise> Serialise for Option<T> {
	const MAX_SERIALISED_SIZE: usize = bool::MAX_SERIALISED_SIZE + T::MAX_SERIALISED_SIZE;

	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
		// The first element is of type `bool` and is
		// called the "sign." It signifies whether there is
		// a following element or not.

		match *self {
			None => {
				false.serialise(stream)?;

				// No need to zero-fill.
			},

			Some(ref v) => {
				true.serialise(stream)?;
				v.serialise(stream)?;
			},
		};

@@ -209,37 +180,30 @@ impl<T: Serialise> Serialise for Option<T> {
}
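The "sign" encoding used above for `Option` can be sketched in isolation: one boolean byte records presence, followed by the payload only when one exists. The `Vec<u8>` buffer and `serialise_option_u8` helper are illustrative, not crate API:

```rust
// Sketch of the sign-prefixed `Option` encoding:
// 0x00 for `None`, 0x01 followed by the payload for
// `Some`.
fn serialise_option_u8(value: Option<u8>, buf: &mut Vec<u8>) {
    match value {
        None => buf.push(0x00), // sign only; no payload follows
        Some(v) => {
            buf.push(0x01); // sign: a value follows
            buf.push(v);
        },
    }
}

fn main() {
    let mut buf = Vec::new();
    serialise_option_u8(None, &mut buf);
    serialise_option_u8(Some(0x2A), &mut buf);
    assert_eq!(buf, [0x00, 0x01, 0x2A]);
}
```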
impl<T> Serialise for PhantomData<T> {
	const MAX_SERIALISED_SIZE: usize = size_of::<Self>();

	#[inline(always)]
	fn serialise(&self, _stream: &mut Sstream) -> Result<()> { Ok(()) }
}
impl<T, E> Serialise for core::result::Result<T, E>
where
	T: Serialise,
	E: Serialise, {
	const MAX_SERIALISED_SIZE: usize = bool::MAX_SERIALISED_SIZE + if size_of::<T>() > size_of::<E>() { size_of::<T>() } else { size_of::<E>() };

	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
		// Remember the discriminant.

		match *self {
			Ok(ref v) => {
				false.serialise(stream)?;
				v.serialise(stream)?;
			},

			Err(ref e) => {
				true.serialise(stream)?;
				e.serialise(stream)?;
			},
		};

@@ -248,26 +212,20 @@ where
}
impl Serialise for () { impl Serialise for () {
const SERIALISED_SIZE: usize = size_of::<Self>(); const MAX_SERIALISED_SIZE: usize = 0x0;
#[inline(always)] #[inline(always)]
fn serialise(&self, buf: &mut [u8]) -> Result<()> { fn serialise(&self, _stream: &mut Sstream) -> Result<()> { Ok(()) }
debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
Ok(())
}
} }
impl Serialise for usize { impl Serialise for usize {
const SERIALISED_SIZE: Self = u32::SERIALISED_SIZE; const MAX_SERIALISED_SIZE: Self = u32::MAX_SERIALISED_SIZE;
fn serialise(&self, buf: &mut [u8]) -> Result<()> {
debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
fn serialise(&self, stream: &mut Sstream) -> Result<()> {
let value = u32::try_from(*self) let value = u32::try_from(*self)
.map_err(|_| Error::UsizeOutOfRange { value: *self })?; .map_err(|_| Error::UsizeOutOfRange(*self))?;
value.serialise(buf) value.serialise(stream)
} }
} }
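The `usize` implementation above narrows the value to `u32` before encoding, so serialisations keep the same size regardless of the platform's pointer width. A minimal stand-alone sketch of that range check (the function name `serialise_usize` and the `String` error are hypothetical, not bzipper's API; big-endian byte order matches the test vectors elsewhere in this commit):

```rust
// Hypothetical stand-alone sketch: narrow a `usize` to `u32` before
// encoding, so the serialisation is portable across pointer widths.
fn serialise_usize(value: usize) -> Result<[u8; 4], String> {
    let narrowed = u32::try_from(value)
        .map_err(|_| format!("usize out of range: {value}"))?;

    // Big-endian, matching the fixed-width integer encodings in the tests.
    Ok(narrowed.to_be_bytes())
}

fn main() {
    assert_eq!(serialise_usize(0x45_45).unwrap(), [0x00, 0x00, 0x45, 0x45]);

    // On 64-bit targets, values above `u32::MAX` must be rejected.
    if usize::BITS > 32 {
        assert!(serialise_usize(usize::MAX).is_err());
    }
}
```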


@@ -18,7 +18,7 @@
 // er General Public License along with bzipper. If
 // not, see <https://www.gnu.org/licenses/>.

-use crate::{FixedString, Serialise};
+use crate::{FixedString, Serialise, Sstream};

 #[test]
 fn test_serialise() {
@@ -32,19 +32,19 @@ fn test_serialise() {
 		Teacher { initials: [char; 0x3] },
 	}

-	assert_eq!(Foo::SERIALISED_SIZE, 0x4);
-	assert_eq!(Bar::SERIALISED_SIZE, 0x10);
+	assert_eq!(Foo::MAX_SERIALISED_SIZE, 0x4);
+	assert_eq!(Bar::MAX_SERIALISED_SIZE, 0x10);

 	macro_rules! test {
 		($ty:ty: $value:expr => $data:expr) => {{
 			use ::bzipper::Serialise;

-			let data: [u8; <$ty as Serialise>::SERIALISED_SIZE] = $data;
+			let mut buf = [0x00; <$ty as Serialise>::MAX_SERIALISED_SIZE];
+			let mut stream = Sstream::new(&mut buf);

-			let mut buf = [0x00; <$ty as Serialise>::SERIALISED_SIZE];
-			<$ty as Serialise>::serialise(&mut $value, &mut buf).unwrap();
+			<$ty as Serialise>::serialise(&mut $value, &mut stream).unwrap();

-			assert_eq!(buf, data);
+			assert_eq!(stream, $data);
 		}};
 	}

@@ -63,14 +63,11 @@ fn test_serialise() {
 		0x83, 0x2E, 0x3C, 0x2C, 0x84, 0x10, 0x58, 0x1A,
 	]);

-	test!(FixedString::<0x1>: FixedString::try_from("A").unwrap() => [0x00, 0x00, 0x00, 0x41, 0x00, 0x00, 0x00, 0x01]);
+	test!(FixedString::<0x1>: FixedString::try_from("A").unwrap() => [0x00, 0x00, 0x00, 0x01, 0x41]);

-	test!(FixedString::<0x9>: FixedString::try_from("l\u{00F8}gma\u{00F0}ur").unwrap() => [
-		0x00, 0x00, 0x00, 0x6C, 0x00, 0x00, 0x00, 0xF8,
-		0x00, 0x00, 0x00, 0x67, 0x00, 0x00, 0x00, 0x6D,
-		0x00, 0x00, 0x00, 0x61, 0x00, 0x00, 0x00, 0xF0,
-		0x00, 0x00, 0x00, 0x75, 0x00, 0x00, 0x00, 0x72,
-		0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x08,
-	]);
+	test!(FixedString::<0x24>: FixedString::try_from("l\u{00F8}gma\u{00F0}ur").unwrap() => [
+		0x00, 0x00, 0x00, 0x0A, 0x6C, 0xC3, 0xB8, 0x67,
+		0x6D, 0x61, 0xC3, 0xB0, 0x75, 0x72,
+	]);

 	test!([char; 0x5]: ['\u{03B4}', '\u{0190}', '\u{03BB}', '\u{03A4}', '\u{03B1}'] => [
@@ -79,7 +76,7 @@ fn test_serialise() {
 		0x00, 0x00, 0x03, 0xB1,
 	]);

-	test!(Result::<u16, char>: Ok(0x45_45) => [0x00, 0x45, 0x45, 0x00, 0x00]);
+	test!(Result::<u16, char>: Ok(0x45_45) => [0x00, 0x45, 0x45]);
 	test!(Result::<u16, char>: Err(char::REPLACEMENT_CHARACTER) => [0x01, 0x00, 0x00, 0xFF, 0xFD]);

 	test!(Option<()>: None => [0x00]);
@@ -87,15 +84,9 @@ fn test_serialise() {
 	test!(Foo: Foo('\u{FDF2}') => [0x00, 0x00, 0xFD, 0xF2]);

-	test!(Bar: Bar::Unit => [
-		0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
-		0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
-	]);
+	test!(Bar: Bar::Unit => [0x00, 0x00, 0x00, 0x00]);

-	test!(Bar: Bar::Pretty(true) => [
-		0x00, 0x00, 0x00, 0x01, 0x01, 0x00, 0x00, 0x00,
-		0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
-	]);
+	test!(Bar: Bar::Pretty(true) => [0x00, 0x00, 0x00, 0x01, 0x01]);

 	test!(Bar: Bar::Teacher { initials: ['T', 'L', '\0'] } => [
 		0x00, 0x00, 0x00, 0x02, 0x00, 0x00, 0x00, 0x54,


@@ -24,15 +24,11 @@ use crate::{Result, Serialise, Sstream};

 impl<T0> Serialise for (T0, )
 where
 	T0: Serialise, {
-	const SERIALISED_SIZE: usize =
-		T0::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		T0::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;

 		Ok(())
 	}
@@ -42,17 +38,13 @@ impl<T0, T1> Serialise for (T0, T1)
 where
 	T0: Serialise,
 	T1: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;

 		Ok(())
 	}
@@ -63,19 +55,15 @@ where
 	T0: Serialise,
 	T1: Serialise,
 	T2: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE
-		+ T2::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE
+		+ T2::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
-		stream.append(&self.2)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;
+		self.2.serialise(stream)?;

 		Ok(())
 	}
@@ -87,21 +75,17 @@ where
 	T1: Serialise,
 	T2: Serialise,
 	T3: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE
-		+ T2::SERIALISED_SIZE
-		+ T3::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE
+		+ T2::MAX_SERIALISED_SIZE
+		+ T3::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
-		stream.append(&self.2)?;
-		stream.append(&self.3)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;
+		self.2.serialise(stream)?;
+		self.3.serialise(stream)?;

 		Ok(())
 	}
@@ -114,23 +98,19 @@ where
 	T2: Serialise,
 	T3: Serialise,
 	T4: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE
-		+ T2::SERIALISED_SIZE
-		+ T3::SERIALISED_SIZE
-		+ T4::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE
+		+ T2::MAX_SERIALISED_SIZE
+		+ T3::MAX_SERIALISED_SIZE
+		+ T4::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
-		stream.append(&self.2)?;
-		stream.append(&self.3)?;
-		stream.append(&self.4)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;
+		self.2.serialise(stream)?;
+		self.3.serialise(stream)?;
+		self.4.serialise(stream)?;

 		Ok(())
 	}
@@ -144,25 +124,21 @@ where
 	T3: Serialise,
 	T4: Serialise,
 	T5: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE
-		+ T2::SERIALISED_SIZE
-		+ T3::SERIALISED_SIZE
-		+ T4::SERIALISED_SIZE
-		+ T5::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE
+		+ T2::MAX_SERIALISED_SIZE
+		+ T3::MAX_SERIALISED_SIZE
+		+ T4::MAX_SERIALISED_SIZE
+		+ T5::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
-		stream.append(&self.2)?;
-		stream.append(&self.3)?;
-		stream.append(&self.4)?;
-		stream.append(&self.5)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;
+		self.2.serialise(stream)?;
+		self.3.serialise(stream)?;
+		self.4.serialise(stream)?;
+		self.5.serialise(stream)?;

 		Ok(())
 	}
@@ -177,27 +153,23 @@ where
 	T4: Serialise,
 	T5: Serialise,
 	T6: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE
-		+ T2::SERIALISED_SIZE
-		+ T3::SERIALISED_SIZE
-		+ T4::SERIALISED_SIZE
-		+ T5::SERIALISED_SIZE
-		+ T6::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE
+		+ T2::MAX_SERIALISED_SIZE
+		+ T3::MAX_SERIALISED_SIZE
+		+ T4::MAX_SERIALISED_SIZE
+		+ T5::MAX_SERIALISED_SIZE
+		+ T6::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
-		stream.append(&self.2)?;
-		stream.append(&self.3)?;
-		stream.append(&self.4)?;
-		stream.append(&self.5)?;
-		stream.append(&self.6)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;
+		self.2.serialise(stream)?;
+		self.3.serialise(stream)?;
+		self.4.serialise(stream)?;
+		self.5.serialise(stream)?;
+		self.6.serialise(stream)?;

 		Ok(())
 	}
@@ -213,29 +185,25 @@ where
 	T5: Serialise,
 	T6: Serialise,
 	T7: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE
-		+ T2::SERIALISED_SIZE
-		+ T3::SERIALISED_SIZE
-		+ T4::SERIALISED_SIZE
-		+ T5::SERIALISED_SIZE
-		+ T6::SERIALISED_SIZE
-		+ T7::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE
+		+ T2::MAX_SERIALISED_SIZE
+		+ T3::MAX_SERIALISED_SIZE
+		+ T4::MAX_SERIALISED_SIZE
+		+ T5::MAX_SERIALISED_SIZE
+		+ T6::MAX_SERIALISED_SIZE
+		+ T7::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
-		stream.append(&self.2)?;
-		stream.append(&self.3)?;
-		stream.append(&self.4)?;
-		stream.append(&self.5)?;
-		stream.append(&self.6)?;
-		stream.append(&self.7)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;
+		self.2.serialise(stream)?;
+		self.3.serialise(stream)?;
+		self.4.serialise(stream)?;
+		self.5.serialise(stream)?;
+		self.6.serialise(stream)?;
+		self.7.serialise(stream)?;

 		Ok(())
 	}
@@ -252,31 +220,27 @@ where
 	T6: Serialise,
 	T7: Serialise,
 	T8: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE
-		+ T2::SERIALISED_SIZE
-		+ T3::SERIALISED_SIZE
-		+ T4::SERIALISED_SIZE
-		+ T5::SERIALISED_SIZE
-		+ T6::SERIALISED_SIZE
-		+ T7::SERIALISED_SIZE
-		+ T8::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE
+		+ T2::MAX_SERIALISED_SIZE
+		+ T3::MAX_SERIALISED_SIZE
+		+ T4::MAX_SERIALISED_SIZE
+		+ T5::MAX_SERIALISED_SIZE
+		+ T6::MAX_SERIALISED_SIZE
+		+ T7::MAX_SERIALISED_SIZE
+		+ T8::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
-		stream.append(&self.2)?;
-		stream.append(&self.3)?;
-		stream.append(&self.4)?;
-		stream.append(&self.5)?;
-		stream.append(&self.6)?;
-		stream.append(&self.7)?;
-		stream.append(&self.8)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;
+		self.2.serialise(stream)?;
+		self.3.serialise(stream)?;
+		self.4.serialise(stream)?;
+		self.5.serialise(stream)?;
+		self.6.serialise(stream)?;
+		self.7.serialise(stream)?;
+		self.8.serialise(stream)?;

 		Ok(())
 	}
@@ -294,33 +258,29 @@ where
 	T7: Serialise,
 	T8: Serialise,
 	T9: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE
-		+ T2::SERIALISED_SIZE
-		+ T3::SERIALISED_SIZE
-		+ T4::SERIALISED_SIZE
-		+ T5::SERIALISED_SIZE
-		+ T6::SERIALISED_SIZE
-		+ T7::SERIALISED_SIZE
-		+ T8::SERIALISED_SIZE
-		+ T9::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE
+		+ T2::MAX_SERIALISED_SIZE
+		+ T3::MAX_SERIALISED_SIZE
+		+ T4::MAX_SERIALISED_SIZE
+		+ T5::MAX_SERIALISED_SIZE
+		+ T6::MAX_SERIALISED_SIZE
+		+ T7::MAX_SERIALISED_SIZE
+		+ T8::MAX_SERIALISED_SIZE
+		+ T9::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
-		stream.append(&self.2)?;
-		stream.append(&self.3)?;
-		stream.append(&self.4)?;
-		stream.append(&self.5)?;
-		stream.append(&self.6)?;
-		stream.append(&self.7)?;
-		stream.append(&self.8)?;
-		stream.append(&self.9)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;
+		self.2.serialise(stream)?;
+		self.3.serialise(stream)?;
+		self.4.serialise(stream)?;
+		self.5.serialise(stream)?;
+		self.6.serialise(stream)?;
+		self.7.serialise(stream)?;
+		self.8.serialise(stream)?;
+		self.9.serialise(stream)?;

 		Ok(())
 	}
@@ -339,35 +299,31 @@ where
 	T8: Serialise,
 	T9: Serialise,
 	T10: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE
-		+ T2::SERIALISED_SIZE
-		+ T3::SERIALISED_SIZE
-		+ T4::SERIALISED_SIZE
-		+ T5::SERIALISED_SIZE
-		+ T6::SERIALISED_SIZE
-		+ T7::SERIALISED_SIZE
-		+ T8::SERIALISED_SIZE
-		+ T9::SERIALISED_SIZE
-		+ T10::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE
+		+ T2::MAX_SERIALISED_SIZE
+		+ T3::MAX_SERIALISED_SIZE
+		+ T4::MAX_SERIALISED_SIZE
+		+ T5::MAX_SERIALISED_SIZE
+		+ T6::MAX_SERIALISED_SIZE
+		+ T7::MAX_SERIALISED_SIZE
+		+ T8::MAX_SERIALISED_SIZE
+		+ T9::MAX_SERIALISED_SIZE
+		+ T10::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
-		stream.append(&self.2)?;
-		stream.append(&self.3)?;
-		stream.append(&self.4)?;
-		stream.append(&self.5)?;
-		stream.append(&self.6)?;
-		stream.append(&self.7)?;
-		stream.append(&self.8)?;
-		stream.append(&self.9)?;
-		stream.append(&self.10)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;
+		self.2.serialise(stream)?;
+		self.3.serialise(stream)?;
+		self.4.serialise(stream)?;
+		self.5.serialise(stream)?;
+		self.6.serialise(stream)?;
+		self.7.serialise(stream)?;
+		self.8.serialise(stream)?;
+		self.9.serialise(stream)?;
+		self.10.serialise(stream)?;

 		Ok(())
 	}
@@ -387,37 +343,33 @@ where
 	T9: Serialise,
 	T10: Serialise,
 	T11: Serialise, {
-	const SERIALISED_SIZE: usize =
-		  T0::SERIALISED_SIZE
-		+ T1::SERIALISED_SIZE
-		+ T2::SERIALISED_SIZE
-		+ T3::SERIALISED_SIZE
-		+ T4::SERIALISED_SIZE
-		+ T5::SERIALISED_SIZE
-		+ T6::SERIALISED_SIZE
-		+ T7::SERIALISED_SIZE
-		+ T8::SERIALISED_SIZE
-		+ T9::SERIALISED_SIZE
-		+ T10::SERIALISED_SIZE
-		+ T11::SERIALISED_SIZE;
+	const MAX_SERIALISED_SIZE: usize =
+		  T0::MAX_SERIALISED_SIZE
+		+ T1::MAX_SERIALISED_SIZE
+		+ T2::MAX_SERIALISED_SIZE
+		+ T3::MAX_SERIALISED_SIZE
+		+ T4::MAX_SERIALISED_SIZE
+		+ T5::MAX_SERIALISED_SIZE
+		+ T6::MAX_SERIALISED_SIZE
+		+ T7::MAX_SERIALISED_SIZE
+		+ T8::MAX_SERIALISED_SIZE
+		+ T9::MAX_SERIALISED_SIZE
+		+ T10::MAX_SERIALISED_SIZE
+		+ T11::MAX_SERIALISED_SIZE;

-	fn serialise(&self, buf: &mut [u8]) -> Result<()> {
-		debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-		let mut stream = Sstream::new(buf);
-
-		stream.append(&self.0)?;
-		stream.append(&self.1)?;
-		stream.append(&self.2)?;
-		stream.append(&self.3)?;
-		stream.append(&self.4)?;
-		stream.append(&self.5)?;
-		stream.append(&self.6)?;
-		stream.append(&self.7)?;
-		stream.append(&self.8)?;
-		stream.append(&self.9)?;
-		stream.append(&self.10)?;
-		stream.append(&self.11)?;
+	fn serialise(&self, stream: &mut Sstream) -> Result<()> {
+		self.0.serialise(stream)?;
+		self.1.serialise(stream)?;
+		self.2.serialise(stream)?;
+		self.3.serialise(stream)?;
+		self.4.serialise(stream)?;
+		self.5.serialise(stream)?;
+		self.6.serialise(stream)?;
+		self.7.serialise(stream)?;
+		self.8.serialise(stream)?;
+		self.9.serialise(stream)?;
+		self.10.serialise(stream)?;
+		self.11.serialise(stream)?;

 		Ok(())
 	}


@@ -19,16 +19,17 @@
 // er General Public License along with bzipper. If
 // not, see <https://www.gnu.org/licenses/>.

-use crate::{Error, Result, Serialise};
+use crate::{Dstream, Error, Result};

 use core::cell::Cell;
+use core::fmt::{Debug, Formatter};

-/// Byte stream for deserialisation.
+/// Byte stream suitable for serialisation.
 ///
-/// This type borrows a slice, keeping track internally of the used bytes.
+/// This type mutably borrows a buffer, keeping track internally of the used bytes.
 pub struct Sstream<'a> {
-	buf: &'a mut [u8],
-	pos: Cell<usize>,
+	pub(in crate) buf: &'a mut [u8],
+	pub(in crate) pos: Cell<usize>,
 }

 impl<'a> Sstream<'a> {
@@ -37,22 +38,86 @@ impl<'a> Sstream<'a> {
 	#[must_use]
 	pub fn new(buf: &'a mut [u8]) -> Self { Self { buf, pos: Cell::new(0x0) } }

-	/// Extends the stream by appending a new serialisation.
+	/// Appends raw bytes to the stream.
 	///
 	/// # Errors
 	///
 	/// If the stream cannot hold any arbitrary serialisation of `T`, an [`EndOfStream`](Error::EndOfStream) instance is returned.
 	#[inline]
-	pub fn append<T: Serialise>(&mut self, value: &T) -> Result<()> {
+	pub fn write(&mut self, bytes: &[u8]) -> Result<()> {
 		let rem = self.buf.len() - self.pos.get();
-		let req = T::SERIALISED_SIZE;
+		let req = bytes.len();

-		if rem < req { return Err(Error::EndOfStream { req, rem }) };
+		if rem < req { return Err(Error::EndOfStream { req, rem }) }

 		let start = self.pos.get();
 		let stop  = start + req;

 		self.pos.set(stop);
-		value.serialise(&mut self.buf[start..stop])
+
+		let buf = &mut self.buf[start..stop];
+		buf.copy_from_slice(bytes);
+
+		Ok(())
 	}
+
+	/// Gets a pointer to the first byte in the stream.
+	#[inline(always)]
+	#[must_use]
+	pub const fn as_ptr(&self) -> *const u8 { self.buf.as_ptr() }
+
+	/// Gets an immutable slice of the stream.
+	#[inline(always)]
+	#[must_use]
+	pub const fn as_slice(&self) -> &[u8] {
+		let ptr = self.as_ptr();
+		let len = self.len();
+
+		unsafe { core::slice::from_raw_parts(ptr, len) }
+	}
+
+	/// Gets the length of the stream.
+	#[inline(always)]
+	#[must_use]
+	pub const fn len(&self) -> usize { unsafe { self.pos.as_ptr().read() } }
+
+	/// Tests if the stream is empty.
+	///
+	/// If no serialisations have been made so far, this method returns `true`.
+	#[inline(always)]
+	#[must_use]
+	pub const fn is_empty(&self) -> bool { self.len() == 0x0 }
+
+	/// Tests if the stream is full.
+	///
+	/// Note that zero-sized types such as [`()`](unit) can still be serialised into this stream.
+	#[inline(always)]
+	#[must_use]
+	pub const fn is_full(&self) -> bool { self.len() == self.buf.len() }
+}
+
+impl Debug for Sstream<'_> {
+	#[inline(always)]
+	fn fmt(&self, f: &mut Formatter) -> core::fmt::Result { Debug::fmt(self.as_slice(), f) }
+}
+
+impl<'a> From<&'a mut [u8]> for Sstream<'a> {
+	#[inline(always)]
+	fn from(value: &'a mut [u8]) -> Self { Self::new(value) }
+}
+
+impl PartialEq for Sstream<'_> {
+	#[inline(always)]
+	fn eq(&self, other: &Self) -> bool { self.as_slice() == other.as_slice() }
+}
+
+impl PartialEq<&[u8]> for Sstream<'_> {
+	#[inline(always)]
+	fn eq(&self, other: &&[u8]) -> bool { self.as_slice() == *other }
+}
+
+impl<const N: usize> PartialEq<[u8; N]> for Sstream<'_> {
+	#[inline(always)]
+	fn eq(&self, other: &[u8; N]) -> bool { self.as_slice() == other.as_slice() }
+}
+
+impl<'a> From<Sstream<'a>> for Dstream<'a> {
+	#[inline(always)]
+	fn from(value: Sstream<'a>) -> Self { Self { data: value.buf, pos: value.pos } }
 }
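The core of the reworked `Sstream` is the bounds check in `write`: a write that would overrun the borrowed buffer is rejected before any byte is copied. A stand-alone sketch of that pattern (the type `MiniSstream` and its tuple error are hypothetical stand-ins, not the crate's actual `Sstream` and `Error::EndOfStream`):

```rust
// Hypothetical mimic of the bounds check in `Sstream::write`: the stream
// refuses any write that would overrun its backing buffer.
struct MiniSstream<'a> {
    buf: &'a mut [u8],
    pos: usize,
}

impl<'a> MiniSstream<'a> {
    fn new(buf: &'a mut [u8]) -> Self { Self { buf, pos: 0x0 } }

    fn write(&mut self, bytes: &[u8]) -> Result<(), (usize, usize)> {
        let rem = self.buf.len() - self.pos;
        let req = bytes.len();

        // Stands in for `Error::EndOfStream { req, rem }`.
        if rem < req { return Err((req, rem)) }

        let start = self.pos;
        let stop  = start + req;

        self.pos = stop;
        self.buf[start..stop].copy_from_slice(bytes);

        Ok(())
    }

    // Only the written prefix of the buffer, like `Sstream::as_slice`.
    fn as_slice(&self) -> &[u8] { &self.buf[..self.pos] }
}

fn main() {
    let mut buf = [0x00; 0x4];
    let mut stream = MiniSstream::new(&mut buf);

    stream.write(&[0x01, 0x02]).unwrap();
    assert_eq!(stream.as_slice(), [0x01, 0x02]);

    // Only two bytes remain, so a three-byte write must fail.
    assert_eq!(stream.write(&[0x03, 0x04, 0x05]), Err((0x3, 0x2)));
}
```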


@@ -1,6 +1,6 @@
 [package]
 name = "bzipper_macros"
-version = "0.6.2"
+version = "0.7.0"
 edition = "2021"

 documentation = "https://docs.rs/bzipper_macros/"


@@ -38,36 +38,30 @@ pub fn deserialise_enum(data: &DataEnum) -> TokenStream {
 		let mut chain_commands = Punctuated::<TokenStream, Token![,]>::new();

 		for field in &variant.fields {
-			let field_ty = &field.ty;
-
 			let command = field.ident
 				.as_ref()
 				.map_or_else(
-					|| quote! { stream.take::<#field_ty>()? },
-					|field_name| quote! { #field_name: stream.take::<#field_ty>()? }
+					|| quote! { Deserialise::deserialise(stream)? },
+					|field_name| quote! { #field_name: Deserialise::deserialise(stream)? }
				);

 			chain_commands.push(command);
 		}

-		let block = match variant.fields {
+		let value = match variant.fields {
 			Fields::Named( ..) => quote! { Self::#variant_name { #chain_commands } },
 			Fields::Unnamed(..) => quote! { Self::#variant_name(#chain_commands) },
 			Fields::Unit       => quote! { Self::#variant_name },
 		};

-		match_arms.push(quote! { #discriminant => #block });
+		match_arms.push(quote! { #discriminant => #value });
 	}

-	match_arms.push(quote! { value => return Err(::bzipper::Error::InvalidDiscriminant { value }) });
+	match_arms.push(quote! { value => return Err(::bzipper::Error::InvalidDiscriminant(value)) });

 	quote! {
-		fn deserialise(data: &[u8]) -> ::bzipper::Result<Self> {
-			::core::debug_assert_eq!(data.len(), <Self as ::bzipper::Serialise>::SERIALISED_SIZE);
-
-			let mut stream = ::bzipper::Dstream::new(data);
-
-			let value = match (stream.take::<u32>()?) { #match_arms };
+		fn deserialise(stream: &::bzipper::Dstream) -> ::bzipper::Result<Self> {
+			let value = match (<u32 as ::bzipper::Deserialise>::deserialise(stream)?) { #match_arms };

 			Ok(value)
 		}
 	}


@@ -26,52 +26,35 @@ use syn::punctuated::Punctuated;
 #[must_use]
 pub fn deserialise_struct(data: &DataStruct) -> TokenStream {
-	if let Fields::Named(..) = data.fields {
-		let mut chain_commands = Punctuated::<TokenStream, Token![,]>::new();
-
-		for field in &data.fields {
-			let name = field.ident.as_ref().unwrap();
-			let ty   = &field.ty;
-
-			chain_commands.push(quote! { #name: stream.take::<#ty>()? });
-		}
-
-		quote! {
-			fn deserialise(data: &[u8]) -> ::bzipper::Result<Self> {
-				::core::debug_assert_eq!(data.len(), <Self as ::bzipper::Serialise>::SERIALISED_SIZE);
-
-				let stream = ::bzipper::Dstream::new(data);
-				Ok(Self { #chain_commands })
-			}
-		}
-	} else if let Fields::Unnamed(..) = data.fields {
-		let mut chain_commands = Punctuated::<TokenStream, Token![,]>::new();
-
-		for field in &data.fields {
-			let ty = &field.ty;
-
-			chain_commands.push(quote! { stream.take::<#ty>()? });
-		}
-
-		quote! {
-			fn deserialise(data: &[u8]) -> ::bzipper::Result<Self> {
-				::core::debug_assert_eq!(data.len(), <Self as ::bzipper::Serialise>::SERIALISED_SIZE);
-
-				let stream = ::bzipper::Dstream::new(data);
-				Ok(Self(#chain_commands))
-			}
-		}
-	} else {
-		// Fields::Unit
+	if matches!(data.fields, Fields::Unit) {
 		quote! {
 			#[inline(always)]
-			fn deserialise(data: &[u8]) -> ::bzipper::Result<Self> {
-				::core::debug_assert_eq!(data.len(), <Self as ::bzipper::Serialise>::SERIALISED_SIZE);
-
-				Ok(Self)
-			}
+			fn deserialise(_stream: &::bzipper::Dstream) -> ::bzipper::Result<Self> { Ok(Self) }
+		}
+	} else {
+		let mut chain_commands = Punctuated::<TokenStream, Token![,]>::new();
+
+		for field in &data.fields {
+			let command = field.ident
+				.as_ref()
+				.map_or_else(
+					|| quote! { Deserialise::deserialise(stream)? },
+					|field_name| quote! { #field_name: Deserialise::deserialise(stream)? }
+				);
+
+			chain_commands.push(command);
+		}
+
+		let value = if let Fields::Named(..) = data.fields {
+			quote! { Self { #chain_commands } }
+		} else {
+			quote! { Self(#chain_commands) }
+		};
+
+		quote! {
+			fn deserialise(stream: &::bzipper::Dstream) -> ::bzipper::Result<Self> {
+				let value = #value;
+
+				Ok(value)
+			}
 		}
 	}
 }


@@ -38,15 +38,15 @@ pub fn serialise_enum(data: &DataEnum) -> TokenStream {
 		let variant_name = &variant.ident;

 		let discriminant = u32::try_from(index)
-			.expect("enumeration discriminants must be representable in `u32`");
+			.expect("enumeration discriminants must be representable as `u32`");

 		// Discriminant size:
-		serialised_size.push(quote! { <u32 as ::bzipper::Serialise>::SERIALISED_SIZE });
+		serialised_size.push(quote! { <u32 as ::bzipper::Serialise>::MAX_SERIALISED_SIZE });

 		let mut captures       = Punctuated::<Capture, Token![,]>::new();
 		let mut chain_commands = Punctuated::<TokenStream, Token![;]>::new();

-		chain_commands.push(quote! { stream.append(&#discriminant)? });
+		chain_commands.push(quote! { #discriminant.serialise(stream)? });

 		for (index, field) in variant.fields.iter().enumerate() {
 			let field_ty = &field.ty;
@@ -55,14 +55,14 @@ pub fn serialise_enum(data: &DataEnum) -> TokenStream {
 				.as_ref()
 				.map_or_else(|| Ident::new(&format!("v{index}"), Span::call_site()), Clone::clone);

-			serialised_size.push(quote! { <#field_ty as ::bzipper::Serialise>::SERIALISED_SIZE });
+			serialised_size.push(quote! { <#field_ty as ::bzipper::Serialise>::MAX_SERIALISED_SIZE });

 			captures.push(Capture {
 				ref_token: Token![ref](Span::call_site()),
 				ident:     field_name.clone(),
 			});

-			chain_commands.push(quote! { stream.append(#field_name)? });
+			chain_commands.push(quote! { #field_name.serialise(stream)? });
 		}

 		chain_commands.push_punct(Token![;](Span::call_site()));
@@ -90,14 +90,11 @@ pub fn serialise_enum(data: &DataEnum) -> TokenStream {
 	size_tests.push(quote! { { core::unreachable!(); } });

 	quote! {
-		const SERIALISED_SIZE: usize = const { #size_tests };
+		const MAX_SERIALISED_SIZE: usize = const { #size_tests };

-		fn serialise(&self, buf: &mut [u8]) -> ::bzipper::Result<()> {
-			::core::debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-			let mut stream = ::bzipper::Sstream::new(buf);
-
+		fn serialise(&self, stream: &mut ::bzipper::Sstream) -> ::bzipper::Result<()> {
 			match (*self) { #match_arms }

 			Ok(())
 		}
 	}


@@ -33,14 +33,10 @@ use syn::{
 pub fn serialise_struct(data: &DataStruct) -> TokenStream {
 	if matches!(data.fields, Fields::Unit) {
 		quote! {
-			const SERIALISED_SIZE: usize = 0x0;
+			const MAX_SERIALISED_SIZE: usize = 0x0;

 			#[inline(always)]
-			fn serialise(&self, buf: &mut [u8]) -> ::bzipper::Result<()> {
-				::core::debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-				Ok(())
-			}
+			fn serialise(&self, stream: &mut ::bzipper::Sstream) -> ::bzipper::Result<()> { Ok(()) }
 		}
 	} else {
 		let mut serialised_size = Punctuated::<TokenStream, Token![+]>::new();
@@ -53,21 +49,17 @@ pub fn serialise_struct(data: &DataStruct) -> TokenStream {
 				.as_ref()
 				.map_or_else(|| Index::from(index).to_token_stream(), ToTokens::to_token_stream);

-			serialised_size.push(quote! { <#ty as ::bzipper::Serialise>::SERIALISED_SIZE });
+			serialised_size.push(quote! { <#ty as ::bzipper::Serialise>::MAX_SERIALISED_SIZE });

-			chain_commands.push(quote! { stream.append(&self.#name)? });
+			chain_commands.push(quote! { self.#name.serialise(stream)? });
 		}

 		chain_commands.push_punct(Token![;](Span::call_site()));

 		quote! {
-			const SERIALISED_SIZE: usize = #serialised_size;
+			const MAX_SERIALISED_SIZE: usize = #serialised_size;

-			fn serialise(&self, buf: &mut [u8]) -> ::bzipper::Result<()> {
-				::core::debug_assert_eq!(buf.len(), Self::SERIALISED_SIZE);
-
-				let mut stream = ::bzipper::Sstream::new(buf);
-
+			fn serialise(&self, stream: &mut ::bzipper::Sstream) -> ::bzipper::Result<()> {
 				#chain_commands

 				Ok(())