Tidy up multitouch code: remove double spaces after full stop

Weirdest thing I've seen in my life.
This commit is contained in:
Emil Ernerfeldt 2021-05-08 22:49:40 +02:00
parent 268ddca161
commit 04d9ce227b
6 changed files with 31 additions and 31 deletions

View File

@@ -18,7 +18,7 @@ NOTE: [`eframe`](eframe/CHANGELOG.md), [`egui_web`](egui_web/CHANGELOG.md) and [
* Add `Response::on_disabled_hover_text` to show tooltip for disabled widgets.
* Zoom input: ctrl-scroll and (on `egui_web`) trackpad-pinch gesture.
* Support for raw [multi touch](https://github.com/emilk/egui/pull/306) events,
- enabling zoom, rotate, and more.  Works with `egui_web` on mobile devices,
+ enabling zoom, rotate, and more. Works with `egui_web` on mobile devices,
and should work with `egui_glium` for certain touch devices/screens.
* Add (optional) compatability with [mint](https://docs.rs/mint)

View File

@@ -138,7 +138,7 @@ pub enum Event {
/// Hashed device identifier (if available; may be zero).
/// Can be used to separate touches from different devices.
device_id: TouchDeviceId,
- /// Unique identifier of a finger/pen.  Value is stable from touch down
+ /// Unique identifier of a finger/pen. Value is stable from touch down
/// to lift-up
id: TouchId,
phase: TouchPhase,
@@ -327,7 +327,7 @@ pub struct TouchId(pub u64);
pub enum TouchPhase {
/// User just placed a touch point on the touch surface
Start,
- /// User moves a touch point along the surface.  This event is also sent when
+ /// User moves a touch point along the surface. This event is also sent when
/// any attributes (position, force, ...) of the touch point change.
Move,
/// User lifted the finger or pen from the surface, or slid off the edge of

View File

@@ -136,7 +136,7 @@ impl InputState {
#[inline(always)]
pub fn zoom_delta(&self) -> f32 {
// If a multi touch gesture is detected, it measures the exact and linear proportions of
- // the distances of the finger tips.  It is therefore potentially more accurate than
+ // the distances of the finger tips. It is therefore potentially more accurate than
// `raw.zoom_delta` which is based on the `ctrl-scroll` event which, in turn, may be
// synthesized from an original touch gesture.
self.multi_touch()
@@ -209,7 +209,7 @@ impl InputState {
self.physical_pixel_size()
}
- /// Returns details about the currently ongoing multi-touch gesture, if any.  Note that this
+ /// Returns details about the currently ongoing multi-touch gesture, if any. Note that this
/// method returns `None` for single-touch gestures (click, drag, …).
///
/// ```
@@ -225,8 +225,8 @@ impl InputState {
/// ```
///
/// By far not all touch devices are supported, and the details depend on the `egui`
- /// integration backend you are using.  `egui_web` supports multi touch for most mobile
- /// devices, but not for a `Trackpad` on `MacOS`, for example.  The backend has to be able to
+ /// integration backend you are using. `egui_web` supports multi touch for most mobile
+ /// devices, but not for a `Trackpad` on `MacOS`, for example. The backend has to be able to
/// capture native touch events, but many browsers seem to pass such events only for touch
/// _screens_, but not touch _pads._
///

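The `zoom_delta` comment in the hunks above describes measuring gesture zoom as the relative change of the average finger-tip distances between the previous and current frame. A minimal, self-contained sketch of that computation (illustrative only; `Pos2` here is a local stand-in, not egui's actual type, and the function names are hypothetical):

```rust
/// Stand-in for egui's `Pos2` (illustrative only).
#[derive(Clone, Copy)]
struct Pos2 {
    x: f32,
    y: f32,
}

/// Centroid of all touch positions.
fn centroid(touches: &[Pos2]) -> Pos2 {
    let n = touches.len() as f32;
    let (sx, sy) = touches
        .iter()
        .fold((0.0, 0.0), |(sx, sy), p| (sx + p.x, sy + p.y));
    Pos2 { x: sx / n, y: sy / n }
}

/// Average distance of the finger tips from their centroid.
fn avg_distance_from_center(touches: &[Pos2]) -> f32 {
    let c = centroid(touches);
    touches
        .iter()
        .map(|p| ((p.x - c.x).powi(2) + (p.y - c.y).powi(2)).sqrt())
        .sum::<f32>()
        / touches.len() as f32
}

/// Relative zoom between two frames; `1.0` means the fingers did not move.
fn zoom_delta(previous: &[Pos2], current: &[Pos2]) -> f32 {
    avg_distance_from_center(current) / avg_distance_from_center(previous)
}

fn main() {
    // Two fingers move from 10 units apart to 20 units apart: factor 2.
    let prev = [Pos2 { x: 0.0, y: 0.0 }, Pos2 { x: 10.0, y: 0.0 }];
    let cur = [Pos2 { x: -5.0, y: 0.0 }, Pos2 { x: 15.0, y: 0.0 }];
    println!("zoom_delta = {}", zoom_delta(&prev, &cur));
}
```

Because the ratio compares distances directly, it stays exact and linear, whereas a synthesized ctrl-scroll event may quantize or rescale the gesture.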
View File

@@ -13,24 +13,24 @@ pub struct MultiTouchInfo {
pub start_time: f64,
/// Position of the pointer at the time the gesture started.
pub start_pos: Pos2,
- /// Number of touches (fingers) on the surface.  Value is ≥ 2 since for a single touch no
+ /// Number of touches (fingers) on the surface. Value is ≥ 2 since for a single touch no
/// `MultiTouchInfo` is created.
pub num_touches: usize,
- /// Zoom factor (Pinch or Zoom).  Moving fingers closer together or further appart will change
- /// this value.  This is a relative value, comparing the average distances of the fingers in
- /// the current and previous frame.  If the fingers did not move since the previous frame,
+ /// Zoom factor (Pinch or Zoom). Moving fingers closer together or further appart will change
+ /// this value. This is a relative value, comparing the average distances of the fingers in
+ /// the current and previous frame. If the fingers did not move since the previous frame,
/// this value is `1.0`.
pub zoom_delta: f32,
- /// Rotation in radians.  Moving fingers around each other will change this value.  This is a
+ /// Rotation in radians. Moving fingers around each other will change this value. This is a
/// relative value, comparing the orientation of fingers in the current frame with the previous
- /// frame.  If all fingers are resting, this value is `0.0`.
+ /// frame. If all fingers are resting, this value is `0.0`.
pub rotation_delta: f32,
/// Relative movement (comparing previous frame and current frame) of the average position of
- /// all touch points.  Without movement this value is `Vec2::ZERO`.
+ /// all touch points. Without movement this value is `Vec2::ZERO`.
///
/// Note that this may not necessarily be measured in screen points (although it _will_ be for
- /// most mobile devices).  In general (depending on the touch device), touch coordinates cannot
- /// be directly mapped to the screen.  A touch always is considered to start at the position of
+ /// most mobile devices). In general (depending on the touch device), touch coordinates cannot
+ /// be directly mapped to the screen. A touch always is considered to start at the position of
/// the pointer, but touch movement is always measured in the units delivered by the device,
/// and may depend on hardware and system settings.
pub translation_delta: Vec2,
@@ -48,12 +48,12 @@ pub struct MultiTouchInfo {
/// The current state (for a specific touch device) of touch events and gestures.
#[derive(Clone)]
pub(crate) struct TouchState {
- /// Technical identifier of the touch device.  This is used to identify relevant touch events
+ /// Technical identifier of the touch device. This is used to identify relevant touch events
/// for this `TouchState` instance.
device_id: TouchDeviceId,
/// Active touches, if any.
///
- /// TouchId is the unique identifier of the touch.  It is valid as long as the finger/pen touches the surface.  The
+ /// TouchId is the unique identifier of the touch. It is valid as long as the finger/pen touches the surface. The
/// next touch will receive a new unique ID.
///
/// Refer to [`ActiveTouch`].
@@ -80,7 +80,7 @@ struct DynGestureState {
heading: f32,
}
- /// Describes an individual touch (finger or digitizer) on the touch surface.  Instances exist as
+ /// Describes an individual touch (finger or digitizer) on the touch surface. Instances exist as
/// long as the finger/pen touches the surface.
#[derive(Clone, Copy, Debug)]
struct ActiveTouch {
@@ -136,7 +136,7 @@ impl TouchState {
self.update_gesture(time, pointer_pos);
if added_or_removed_touches {
- // Adding or removing fingers makes the average values "jump".  We better forget
+ // Adding or removing fingers makes the average values "jump". We better forget
// about the previous values, and don't create delta information for this frame:
if let Some(ref mut state) = &mut self.gesture_state {
state.previous = None;
@@ -219,12 +219,12 @@ impl TouchState {
// Calculate the direction from the first touch to the center position.
// This is not the perfect way of calculating the direction if more than two fingers
// are involved, but as long as all fingers rotate more or less at the same angular
- // velocity, the shortcomings of this method will not be noticed.  One can see the
+ // velocity, the shortcomings of this method will not be noticed. One can see the
// issues though, when touching with three or more fingers, and moving only one of them
- // (it takes two hands to do this in a controlled manner).  A better technique would be
+ // (it takes two hands to do this in a controlled manner). A better technique would be
// to store the current and previous directions (with reference to the center) for each
// touch individually, and then calculate the average of all individual changes in
- // direction.  But this approach cannot be implemented locally in this method, making
+ // direction. But this approach cannot be implemented locally in this method, making
// everything a bit more complicated.
let first_touch = self.active_touches.values().next().unwrap();
state.heading = (state.avg_pos - first_touch.pos).angle();

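The rotation heuristic described in the comments above (the heading is the direction from the first touch point to the center of all touches, and the per-frame rotation is the change of that heading) can be sketched as follows. These are hypothetical free functions for illustration, not `TouchState`'s actual methods:

```rust
/// Angle (in radians) of the vector from the first touch to the gesture center,
/// mirroring `(state.avg_pos - first_touch.pos).angle()` in the hunk above.
fn heading(first_touch: (f32, f32), center: (f32, f32)) -> f32 {
    (center.1 - first_touch.1).atan2(center.0 - first_touch.0)
}

/// Per-frame rotation: how much the heading changed since the previous frame.
fn rotation_delta(previous_heading: f32, current_heading: f32) -> f32 {
    current_heading - previous_heading
}

fn main() {
    // The center moves from "east of the first touch" to "north of it":
    let h_prev = heading((0.0, 0.0), (1.0, 0.0)); // 0 rad
    let h_cur = heading((0.0, 0.0), (0.0, 1.0)); // pi/2 rad
    println!("rotated by {} rad", rotation_delta(h_prev, h_cur));
}
```

As the comment notes, tracking only one reference direction is an approximation; averaging per-touch heading changes would be more robust but needs state per touch.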
View File

@@ -52,12 +52,12 @@ impl super::View for ZoomRotate {
ui.separator();
ui.label("Try touch gestures Pinch/Stretch, Rotation, and Pressure with 2+ fingers.");
Frame::dark_canvas(ui.style()).show(ui, |ui| {
- // Note that we use `Sense::drag()` although we do not use any pointer events.  With
+ // Note that we use `Sense::drag()` although we do not use any pointer events. With
// the current implementation, the fact that a touch event of two or more fingers is
// recognized, does not mean that the pointer events are suppressed, which are always
- // generated for the first finger.  Therefore, if we do not explicitly consume pointer
+ // generated for the first finger. Therefore, if we do not explicitly consume pointer
// events, the window will move around, not only when dragged with a single finger, but
- // also when a two-finger touch is active.  I guess this problem can only be cleanly
+ // also when a two-finger touch is active. I guess this problem can only be cleanly
// solved when the synthetic pointer events are created by egui, and not by the
// backend.
@@ -122,9 +122,9 @@ impl super::View for ZoomRotate {
to_screen.scale() * arrow_direction,
Stroke::new(stroke_width, color),
);
- // Paints a circle at the origin of the arrow.  The size and opacity of the circle
+ // Paints a circle at the origin of the arrow. The size and opacity of the circle
// depend on the current velocity, and the circle is translated in the opposite
- // direction of the movement, so it follows the origin's movement.  Constant factors
+ // direction of the movement, so it follows the origin's movement. Constant factors
// have been determined by trial and error.
let speed = self.smoothed_velocity.length();
painter.circle_filled(

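A demo like the one above consumes per-frame deltas from `MultiTouchInfo`: `zoom_delta` is a relative factor and `rotation_delta` a relative angle, so an application accumulates them multiplicatively and additively, respectively. A hedged sketch of that accumulation (illustrative names, not the demo's actual fields):

```rust
/// Accumulates per-frame multi-touch deltas into an absolute transform.
struct GestureTransform {
    zoom: f32,     // accumulated scale factor
    rotation: f32, // accumulated rotation in radians
}

impl GestureTransform {
    fn new() -> Self {
        Self { zoom: 1.0, rotation: 0.0 }
    }

    /// Folds one frame's relative deltas into the absolute transform:
    /// zoom deltas compose multiplicatively, rotation deltas additively.
    fn apply(&mut self, zoom_delta: f32, rotation_delta: f32) {
        self.zoom *= zoom_delta;
        self.rotation += rotation_delta;
    }
}

fn main() {
    let mut t = GestureTransform::new();
    t.apply(2.0, 0.1); // e.g. the deltas reported for frame 1
    t.apply(0.5, 0.1); // frame 2 undoes the zoom and rotates further
    println!("zoom = {}, rotation = {}", t.zoom, t.rotation);
}
```

Starting from the identity (`zoom = 1.0`, `rotation = 0.0`) means frames in which the fingers rest leave the transform unchanged.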
View File

@@ -114,12 +114,12 @@ pub fn button_from_mouse_event(event: &web_sys::MouseEvent) -> Option<egui::Poin
}
}
- /// A single touch is translated to a pointer movement.  When a second touch is added, the pointer
- /// should not jump to a different position.  Therefore, we do not calculate the average position
+ /// A single touch is translated to a pointer movement. When a second touch is added, the pointer
+ /// should not jump to a different position. Therefore, we do not calculate the average position
/// of all touches, but we keep using the same touch as long as it is available.
///
/// `touch_id_for_pos` is the `TouchId` of the `Touch` we previously used to determine the
- /// pointer position.
+ /// pointer position.
pub fn pos_from_touch_event(
canvas_id: &str,
event: &web_sys::TouchEvent,
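The doc comment above describes a "sticky" pointer: keep deriving the pointer position from the same touch for as long as it is on the surface, and only fall back to another touch once it is gone. A self-contained sketch of that rule (hypothetical; the real `egui_web` code reads a `web_sys::TouchEvent` rather than a map):

```rust
use std::collections::BTreeMap;

type TouchId = u64;

/// Returns the pointer position for the current set of touches, remembering in
/// `touch_id_for_pos` which touch the pointer is following.
fn pointer_touch(
    touches: &BTreeMap<TouchId, (f32, f32)>,
    touch_id_for_pos: &mut Option<TouchId>,
) -> Option<(f32, f32)> {
    // Keep the previously used touch if it is still on the surface:
    if let Some(id) = *touch_id_for_pos {
        if let Some(&pos) = touches.get(&id) {
            return Some(pos);
        }
    }
    // Otherwise, switch to any remaining touch (here: the lowest id):
    let (&id, &pos) = touches.iter().next()?;
    *touch_id_for_pos = Some(id);
    Some(pos)
}

fn main() {
    let mut remembered: Option<TouchId> = None;
    let mut touches = BTreeMap::new();
    touches.insert(7, (10.0, 10.0));
    let _ = pointer_touch(&touches, &mut remembered); // pointer follows touch 7
    touches.insert(3, (0.0, 0.0)); // a second finger is added
    // The pointer must not jump to the new finger (or to an average):
    println!("{:?}", pointer_touch(&touches, &mut remembered));
}
```

Averaging all touches instead would make the pointer jump the instant a second finger lands, which is exactly what the comment warns against.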