volatile stuff

Lokathor 2018-12-10 17:01:21 -07:00
parent 6a3d022d6d
commit 8b313a3edb
2 changed files with 133 additions and 10 deletions


@@ -120,14 +120,22 @@ add more things:
inner field. This would add a lot of line noise, so we'll just always have our
newtypes be `pub`.
* Allowing for generic newtypes, which might sound silly but that we'll actually
see an example of soon enough. To do this you might _think_ that we can change
the `:ident` declarations to `:ty`, but since we're declaring a fresh type rather
than naming an existing one, the macro has to capture the new name as an
`:ident`. The way you get around this is with a proc-macro, which is a lot more
powerful but which also requires that you write the proc-macro in an entirely
separate crate that gets compiled first. We don't need that much power, so for
our examples we'll go with the macro_rules version and just do it by hand in the
few cases where we need a generic newtype (there's a quick sketch of this right
after the list).
* Allowing for `Deref` and `DerefMut`, which usually defeats the point of doing
the newtype, but maybe sometimes it's the right thing. If you were going
for the full industrial strength version with a proc-macro and all, you might
want to make that part of your optional add-ons as well, the same way you might
want optional `From`. You'd probably want `From` to be "on by default" and
`Deref`/`DerefMut` to be "off by default", but whatever.
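
To make that `:ident` limitation concrete, here's a minimal sketch of the sort of
`macro_rules` newtype macro we're talking about (the exact input grammar of the
book's `newtype!` macro may differ), plus the hand-written fallback for the
generic case:

```rust
// Sketch only: the new type's name must be captured as an `:ident`, because a
// struct declaration needs a plain identifier for its name, so a fragment
// captured as `:ty` can't go in that position.
macro_rules! newtype {
  ($(#[$attr:meta])* $new_name:ident, $old_type:ty) => {
    $(#[$attr])*
    #[repr(transparent)]
    pub struct $new_name(pub $old_type);
  };
}

newtype! {
  /// A 16-bit color value.
  Color, u16
}

// The macro can't declare a generic newtype such as `Wrapper<T>` for us, so in
// the few places we need one we just write it out by hand:
#[repr(transparent)]
pub struct Wrapper<T>(pub T);
```
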
**As a reminder:** `macro_rules` macros have to appear _before_ they're invoked
in your source, so the `newtype` macro will always have to be at the very top of
your file, or if you put it in a module within your project you'll need to
declare the module before anything that uses it.


@@ -1 +1,116 @@
# Volatile Destination
There's a reasonable chance that you've never heard of `volatile` before, so
what's that? Well, it's a slightly overloaded term, but basically it means "get
your grubby mitts off my stuff you over-eager compiler".
## Volatile Memory
The first, and most common, form of volatile thing is volatile memory. Volatile
memory can change without your program changing it, usually because it's not a
location in RAM, but instead some special location that represents an actual
hardware device, or part of a hardware device perhaps. The compiler doesn't know
what's going on in this situation, but when the program is actually run and the
CPU gets an instruction to read or write from that location, instead of just
accessing some place in RAM like with normal memory, it accesses whatever bit of
hardware and does _something_. The details of that something depend on the
hardware, but what's important is that we need to actually, definitely execute
that read or write instruction.

This is like the opposite of how normal memory works. Normally when the compiler
sees us write values into variables and read values from variables, it's free to
optimize those expressions and eliminate some of the reads and writes if it can,
and generally try to save us time. Maybe it even knows some stuff about the data
dependencies in our expressions and so it does some of the reads or writes out
of order from what the source says, because the compiler knows that it won't
actually make a difference to the operation of the program. A good and helpful
friend, that compiler.

Volatile memory works almost the exact opposite way. With volatile memory we
need the compiler to _definitely_ emit an instruction to do a read or write and
they need to happen _exactly_ in the order that we say to do it. Each volatile
read or write might have any sort of unknown side effect that the compiler
doesn't know about and it shouldn't try to be clever about it. Just do what we
say, please.

In Rust, we don't mark volatile things as being a separate type of thing,
instead we use normal raw pointers and then call the
[read_volatile](https://doc.rust-lang.org/core/ptr/fn.read_volatile.html) and
[write_volatile](https://doc.rust-lang.org/core/ptr/fn.write_volatile.html)
functions (also available as methods, if you like), which then delegate to the
LLVM
[volatile_load](https://doc.rust-lang.org/core/intrinsics/fn.volatile_load.html)
and
[volatile_store](https://doc.rust-lang.org/core/intrinsics/fn.volatile_store.html)
intrinsics. In C and C++ you can tag a pointer as being volatile and then any
normal read and write with it becomes the volatile version, but in Rust we have
to remember to use the correct alternate function instead.
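
As a quick sketch of what that looks like in practice (the address used here,
`0x0400_0000`, is the GBA's display control register, picked purely for
illustration; any other memory mapped address works the same way):

```rust
// A raw pointer to a bit of memory mapped hardware.
const DISPCNT: *mut u16 = 0x0400_0000 as *mut u16;

fn demo() {
  unsafe {
    // A plain `*DISPCNT = 0;` is a normal write that the optimizer is allowed
    // to merge, reorder, or delete. The volatile versions must actually happen,
    // in the order written.
    DISPCNT.write_volatile(0);
    let current = DISPCNT.read_volatile();
    DISPCNT.write_volatile(current | 0b1_0000_0000); // turn on some bit
  }
}
```
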
I'm told by the experts that this makes for a cleaner and saner design from a
_language design_ perspective, but it really kinda screws us when doing low
level code. References, both mutable and shared, aren't volatile, so they
compile into normal reads and writes. This means we can't do anything we'd
normally do in Rust that utilizes references of any kind. Volatile blocks of
memory can't use normal `.iter()` or `.iter_mut()` based iteration (which give
`&T` or `&mut T`), and they also can't use normal `Index` and `IndexMut` sugar
like `a + x[i]` or `x[i] = 7`.

Unlike with normal raw pointers, this pain point never goes away. There's no way
to abstract over the difference with Rust as it exists now; you'd need to
actually adjust the core language by adding an additional pointer type (`*vol
T`) and possibly a reference type to go with it (`&vol T`) to get the right
semantics. And then you'd need an `IndexVol` trait, and you'd need
`.iter_vol()`, and so on for every other little thing. It would be a lot of
work, and the Rust developers just aren't interested in doing all that for such
a limited portion of their user population. We'll just have to deal with not
having any syntax sugar.

But no syntax sugar doesn't mean we can't at least do a little work for
ourselves. Enter the `VolatilePtr<T>` type, which is a newtype over a `*mut T`:
```rust
#[derive(Debug, Clone, Copy, Hash, PartialEq, Eq, PartialOrd, Ord)]
#[repr(transparent)]
pub struct VolatilePtr<T>(*mut T);
```
Obviously we'll need some methods to go with it. The basic operations are
reading and writing, of course:
```rust
impl<T> VolatilePtr<T> {
  /// Performs a `read_volatile`.
  pub unsafe fn read(&self) -> T {
    self.0.read_volatile()
  }

  /// Performs a `write_volatile`.
  pub unsafe fn write(&self, data: T) {
    self.0.write_volatile(data);
  }
}
```
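
A quick usage sketch (this assumes we're inside the module that defines
`VolatilePtr`, since the inner field isn't `pub`, and it borrows the GBA's
display control register address again purely as an example):

```rust
fn toggle_forced_blank() {
  let dispcnt = VolatilePtr(0x0400_0000 as *mut u16);
  unsafe {
    // A read-modify-write where both accesses definitely touch the hardware.
    let old = dispcnt.read();
    dispcnt.write(old ^ (1 << 7)); // bit 7 of DISPCNT is "forced blank"
  }
}
```
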
And we want a way to jump around when we do have volatile memory that's in
blocks. For this there's both
[offset](https://doc.rust-lang.org/std/primitive.pointer.html#method.offset) and
[wrapping_offset](https://doc.rust-lang.org/std/primitive.pointer.html#method.wrapping_offset).
The difference is that `offset` optimizes better, but also it can be Undefined
Behavior if the result is not "in bounds or one byte past the end of the same
allocated object". I asked [ubsan](https://github.com/ubsan) (who is the expert
that you should always listen to on matters like this) what that means for us,
and the answer was that you _can_ use an `offset` in statically memory mapped
situations like this as long as you don't use it to jump to the address of
something that Rust itself allocated at some point. Unfortunately, the downside
to using `offset` instead of `wrapping_offset` is that with `offset`, it's
Undefined Behavior _simply to calculate the out of bounds result_, and with
`wrapping_offset` it's not Undefined Behavior until you _use_ the out of bounds
result.
```rust
  /// Performs a normal `offset`.
  pub unsafe fn offset(self, count: isize) -> Self {
    VolatilePtr(self.0.offset(count))
  }
```
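
And a sketch of using `offset` to walk a block of volatile memory (the block
here is the GBA's background palette, 256 `u16` entries starting at
`0x0500_0000`, used purely as an example, and again this assumes we're in the
module that defines `VolatilePtr`):

```rust
fn fill_bg_palette(color: u16) {
  let base = VolatilePtr(0x0500_0000 as *mut u16);
  for i in 0..256 {
    // Each write is volatile, so none of them can be skipped or merged into a
    // single memset style operation by the optimizer.
    unsafe { base.offset(i).write(color) }
  }
}
```
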
## Volatile ASM