Update the DPI module docs (#1349)

* Update the DPI module docs

* Fix HiDpiFactorChanged doc link

* Incorporate lokathor and icefox feedback

* Adjust documented desktop resolution range

* X11 is one of the reasons I use Windows

* Address DPI generics and float->int rounding

* Revise DPI value statement to better reflect best practices

* Address some of freya's feedback

* phrasing

* Rephrase X11 DPI stuff
Osspial 2020-01-04 19:19:17 -05:00
parent 28b82fb9aa
commit 9b122c3804
2 changed files with 96 additions and 76 deletions

View file

@@ -1,80 +1,106 @@
//! UI scaling is important, so read the docs for this module if you don't want to be confused.
//!
//! ## Why should I care about UI scaling?
//!
//! Modern computer screens don't have a consistent relationship between resolution and size.
//! 1920x1080 is a common resolution for both desktop and mobile screens, despite mobile screens
//! normally being less than a quarter the size of their desktop counterparts. What's more, neither
//! desktop nor mobile screens have consistent resolutions within their own size classes - common
//! mobile screens range from below 720p to above 1440p, and desktop screens range from 720p to 5K
//! and beyond.
//!
//! Given that, it's a mistake to assume that 2D content will only be displayed on screens with
//! a consistent pixel density. If you were to render a 96-pixel-square image on a 1080p screen,
//! then render the same image on a similarly-sized 4K screen, the 4K rendition would take up only
//! about a quarter of the physical space it did on the 1080p screen. That issue is especially
//! problematic with text rendering, where quarter-sized text becomes a significant legibility
//! problem.
//!
//! Failure to account for the scale factor can create a significantly degraded user experience.
//! Most notably, it can make users feel like they have bad eyesight, which will potentially cause
//! them to think about growing elderly, resulting in them having an existential crisis. Once users
//! enter that state, they will no longer be focused on your application.
//!
//! ## How should I handle it?
//!
//! The solution to this problem is to account for the device's *scale factor*. The scale factor is
//! the factor UI elements should be scaled by to be consistent with the rest of the user's system -
//! for example, a button that's normally 50 pixels across would be 100 pixels across on a device
//! with a scale factor of `2.0`, or 75 pixels across with a scale factor of `1.5`.
//!
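//! As a rough illustration of that arithmetic (the `to_physical_px` helper below is purely
//! hypothetical and not part of winit's API):
//!
//! ```
//! // Scale a logical length to physical pixels, rounding to the nearest pixel.
//! fn to_physical_px(logical_px: f64, scale_factor: f64) -> u32 {
//!     (logical_px * scale_factor).round() as u32
//! }
//!
//! assert_eq!(to_physical_px(50.0, 2.0), 100);
//! assert_eq!(to_physical_px(50.0, 1.5), 75);
//! ```
//!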
//! The scale factor correlates with, but has no direct relationship to, the screen's actual DPI
//! (dots per inch). Operating systems used to define the scale factor in terms of the screen's
//! approximate DPI (at the time, 72 pixels per inch), but [Microsoft decided to report that the DPI
//! was roughly 1/3 bigger than the screen's actual DPI (so, 96 pixels per inch) in order to make
//! text more legible][microsoft_dpi]. As a result, the exact DPI as defined by the OS doesn't carry
//! a whole lot of weight when designing cross-platform UIs. Scaled pixels should generally be used
//! as the base unit for on-screen UI measurement, instead of DPI-dependent units such as
//! [points][points] or [picas][picas].
//!
//! ### Position and Size types
//!
//! Winit's `Physical(Position|Size)` types correspond with the actual pixels on the device, and the
//! `Logical(Position|Size)` types correspond to the physical pixels divided by the scale factor.
//! All of Winit's functions return physical types, but can take either logical or physical
//! coordinates as input, allowing you to use the most convenient coordinate system for your
//! particular application.
//!
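//! For instance, a logical-to-physical conversion looks roughly like this (the `1.5` scale factor
//! is just an assumed example value):
//!
//! ```
//! use winit::dpi::{LogicalSize, PhysicalSize};
//!
//! // Suppose the window is on a display with a 150% scale setting.
//! let scale_factor = 1.5;
//!
//! // A size expressed in logical pixels...
//! let logical: LogicalSize<f64> = LogicalSize::new(400.0, 200.0);
//! // ...converted to the actual device pixels used for rendering.
//! let physical: PhysicalSize<u32> = logical.to_physical(scale_factor);
//! assert_eq!(physical, PhysicalSize::new(600, 300));
//! ```
//!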
//! Winit's position and size types are generic over their exact pixel type, `P`, to allow the
//! API to have integer precision where appropriate (e.g. most window manipulation functions) and
//! floating precision when necessary (e.g. logical sizes for fractional scale factors and touch
//! input). If `P` is a floating-point type, please do not cast the values with `as {int}`. Doing so
//! will truncate the fractional part of the float, rather than properly round to the nearest
//! integer. Use the provided `cast` function or `From`/`Into` conversions, which handle the
//! rounding properly. Note that precision loss will still occur when rounding from a float to an
//! int, although rounding lessens the problem.
//!
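//! A quick sketch of the difference (the coordinate values are arbitrary, and this assumes the
//! `cast` method described above):
//!
//! ```
//! use winit::dpi::PhysicalPosition;
//!
//! let precise: PhysicalPosition<f64> = PhysicalPosition::new(10.9, 20.7);
//!
//! // `as` truncates: nearly a full pixel of error on each axis.
//! assert_eq!((precise.x as i32, precise.y as i32), (10, 20));
//!
//! // `cast` rounds to the nearest integer instead.
//! let rounded: PhysicalPosition<i32> = precise.cast();
//! assert_eq!(rounded, PhysicalPosition::new(11, 21));
//! ```
//!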
//! Your GPU has no awareness of the concept of logical pixels, and unless you like wasting pixel density, your //! ### Events
//! framebuffer's size should be in physical pixels.
//! //!
//! `winit` will send [`Resized`](crate::event::WindowEvent::Resized) events whenever a window's logical size //! Winit will dispatch a [`DpiChanged`](crate::event::WindowEvent::DpiChanged)
//! changes, and [`DpiChanged`](crate::event::WindowEvent::DpiChanged) events //! event whenever a window's scale factor has changed. This can happen if the user drags their
//! whenever the scale factor changes. Receiving either of these events means that the physical size of your window has //! window from a standard-resolution monitor to a high-DPI monitor, or if the user changes their
//! changed, and you should recompute it using the latest values you received for each. If the logical size and the //! DPI settings. This gives you a chance to rescale your application's UI elements and adjust how
//! scale factor change simultaneously, `winit` will send both events together; thus, it's recommended to buffer //! the platform changes the window's size to reflect the new scale factor. If a window hasn't
//! these events and process them at the end of the queue. //! received a [`DpiChanged`](crate::event::WindowEvent::DpiChanged) event,
//! then its scale factor is `1.0`.
//! //!
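//! A bare-bones sketch of listening for that event (window setup and the rest of the event
//! handling are elided, and the exact structure will vary by application):
//!
//! ```no_run
//! use winit::event::{Event, WindowEvent};
//! use winit::event_loop::{ControlFlow, EventLoop};
//! use winit::window::WindowBuilder;
//!
//! let event_loop = EventLoop::new();
//! let _window = WindowBuilder::new().build(&event_loop).unwrap();
//!
//! event_loop.run(move |event, _, control_flow| {
//!     *control_flow = ControlFlow::Wait;
//!     if let Event::WindowEvent { event, .. } = event {
//!         match event {
//!             WindowEvent::DpiChanged { scale_factor, new_inner_size } => {
//!                 // Rescale UI elements using `scale_factor`. `new_inner_size` starts out as
//!                 // the size suggested by the OS, and may be overwritten to request another size.
//!                 println!("scale factor changed to {}", scale_factor);
//!                 let _ = new_inner_size;
//!             }
//!             WindowEvent::CloseRequested => *control_flow = ControlFlow::Exit,
//!             _ => (),
//!         }
//!     }
//! });
//! ```
//!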
//! ## How is the scale factor calculated?
//!
//! Scale factor is calculated differently on different platforms:
//!
//! - **Windows:** On Windows 8 and 10, per-monitor scaling is readily configured by users from the
//! display settings. While users are free to select any option they want, they're only given a
//! selection of "nice" scale factors, i.e. 1.0, 1.25, 1.5, and so on. On Windows 7, the scale
//! factor is global and changing it requires logging out. See [this article][windows_1] for
//! technical details.
//! - **macOS:** "retina displays" have a scale factor of 2.0. Otherwise, the scale factor is 1.0.
//! Intermediate scale factors are never used. It's possible to make any display use that 2.0
//! scale factor via the command line.
//! - **X11:** Many man-hours have been spent trying to figure out how to handle DPI in X11. Winit
//! currently uses a three-pronged approach:
//! + Use the value in the `WINIT_X11_SCALE_FACTOR` environment variable, if present.
//! + If not present, use the value set in `Xft.dpi` in Xresources.
//! + Otherwise, calculate the scale factor based on the millimeter monitor dimensions provided by XRandR.
//!
//! If `WINIT_X11_SCALE_FACTOR` is set to `randr`, it'll ignore the `Xft.dpi` field and use the
//! XRandR scaling method. Generally speaking, you should try to configure the standard system
//! variables to do what you want before resorting to `WINIT_X11_SCALE_FACTOR`.
//! - **Wayland:** On Wayland, scale factors are set per-screen by the server, and are always
//! integers (most often 1 or 2).
//! - **iOS:** Scale factors are set by Apple to the value that best suits the device, and range
//! from `1.0` to `3.0`. See [this article][apple_1] and [this article][apple_2] for more
//! information.
//! - **Android:** Scale factors are set by the manufacturer to the value that best suits the
//! device, and range from `1.0` to `4.0`. See [this article][android_1] for more information.
//! - **Web:** The scale factor is the ratio between CSS pixels and the physical device pixels.
//!
//! [microsoft_dpi]: https://blogs.msdn.microsoft.com/fontblog/2005/11/08/where-does-96-dpi-come-from-in-windows/
//! [points]: https://en.wikipedia.org/wiki/Point_(typography)
//! [picas]: https://en.wikipedia.org/wiki/Pica_(typography)
//! [windows_1]: https://docs.microsoft.com/en-us/windows/win32/hidpi/high-dpi-desktop-application-development-on-windows
//! [apple_1]: https://developer.apple.com/library/archive/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/Displays/Displays.html
//! [apple_2]: https://developer.apple.com/design/human-interface-guidelines/macos/icons-and-images/image-size-and-resolution/
//! [android_1]: https://developer.android.com/training/multiscreen/screendensities
pub trait Pixel: Copy + Into<f64> {
    fn from_f64(f: f64) -> Self;
@@ -136,9 +162,9 @@ pub fn validate_scale_factor(dpi_factor: f64) -> bool {
/// A position represented in logical pixels.
///
/// The position is stored as floats, so please be careful. Casting floats to integers truncates the
/// fractional part, which can cause noticeable issues. To help with that, an `Into<(i32, i32)>`
/// implementation is provided which does the rounding for you.
#[derive(Debug, Copy, Clone, PartialEq)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct LogicalPosition<P> {
@@ -204,10 +230,6 @@ impl<P: Pixel, X: Pixel> Into<[X; 2]> for LogicalPosition<P> {
}

/// A position represented in physical pixels.
#[derive(Debug, Copy, Clone, PartialEq)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct PhysicalPosition<P> {
@@ -273,10 +295,6 @@ impl<P: Pixel, X: Pixel> Into<[X; 2]> for PhysicalPosition<P> {
}

/// A size represented in logical pixels.
#[derive(Debug, Copy, Clone, PartialEq)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct LogicalSize<P> {
@@ -400,6 +418,7 @@ impl<P: Pixel, X: Pixel> Into<[X; 2]> for PhysicalSize<P> {
    }
}

/// A size that's either physical or logical.
#[derive(Debug, Copy, Clone, PartialEq)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub enum Size {
@@ -441,6 +460,7 @@ impl<P: Pixel> From<LogicalSize<P>> for Size {
    }
}

/// A position that's either physical or logical.
#[derive(Debug, Copy, Clone, PartialEq)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub enum Position {

View file

@@ -304,7 +304,7 @@ pub enum WindowEvent<'a> {
    /// is pointed to by the `new_inner_size` reference. By default, this will contain the size suggested
    /// by the OS, but it can be changed to any value.
    ///
    /// For more information about DPI in general, see the [`dpi`](crate::dpi) module.
    DpiChanged {
        scale_factor: f64,
        new_inner_size: &'a mut PhysicalSize<u32>,