Gesture Surface

New in version 1.9.0.

Warning

This is experimental and subject to change as long as this warning notice is present.

See kivy/examples/demo/multistroke/main.py for a complete application example.
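The following is a minimal, hypothetical usage sketch (not the bundled demo): it uses a GestureSurface as the root widget and prints the number of strokes when a gesture is considered complete. The handler name on_complete is our own, not part of the API:

    from kivy.app import App
    from kivy.uix.gesturesurface import GestureSurface


    class GestureDemoApp(App):
        def build(self):
            # Track touches, draw them in random colors and show the bbox
            surface = GestureSurface(use_random_color=True, draw_bbox=True)
            surface.bind(on_gesture_complete=self.on_complete)
            return surface

        def on_complete(self, surface, gesture):
            # `gesture` is a GestureContainer; get_vectors() returns the
            # strokes in a format kivy.multistroke.Recognizer accepts.
            print('gesture complete: %d stroke(s)' % len(gesture.get_vectors()))


    if __name__ == '__main__':
        GestureDemoApp().run()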

class kivy.uix.gesturesurface.GestureContainer(touch, **kwargs)[source]

Bases: kivy.event.EventDispatcher

Container object that stores information about a gesture. It has various properties that are updated by GestureSurface as drawing progresses.

Arguments:
touch

Touch object (as received by on_touch_down) used to initialize the gesture container. Required.

Properties:
active

Set to False once the gesture is complete (it meets the max_strokes setting or the GestureSurface.temporal_window has expired)

active is a BooleanProperty

active_strokes

Number of strokes currently active in the gesture, i.e. concurrent touches associated with this gesture.

active_strokes is a NumericProperty

max_strokes

Max number of strokes allowed in the gesture. This is set by GestureSurface.max_strokes but can be overridden per gesture, for example from an on_gesture_start handler (see the sketch after this entry).

max_strokes is a NumericProperty
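A hedged sketch of overriding the limit per gesture, assuming surface is an existing GestureSurface instance; the value 4 is an arbitrary example:

    def allow_more_strokes(surface, gesture):
        # `gesture` is the newly started GestureContainer; only this
        # gesture is affected, the surface-wide setting is unchanged.
        gesture.max_strokes = 4

    surface.bind(on_gesture_start=allow_more_strokes)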

was_merged

Indicates that this gesture has been merged with another gesture and should be considered discarded.

was_merged is a BooleanProperty

bbox

Dictionary with keys minx, miny, maxx, maxy. Represents the size of the gesture bounding box.

bbox is a DictProperty

width

Represents the width of the gesture.

width is a NumericProperty

height

Represents the height of the gesture.

height is a NumericProperty

accept_stroke(count=1)[source]

Returns True if this container can accept count new strokes

add_stroke(touch, line)[source]

Associate a list of points with a touch.uid; the line itself is created by the caller, but subsequent move/up events look it up via this container. This is done to avoid problems during a merge.

complete_stroke()[source]

Called on touch up events to keep track of how many strokes are active in the gesture (we only want to dispatch the event when the last stroke in the gesture is released)

get_vectors(**kwargs)[source]

Return strokes in a format that is acceptable for kivy.multistroke.Recognizer as a gesture candidate or template. The result is cached automatically; the cache is invalidated at the start and end of a stroke and if update_bbox is called. If you are going to analyze a gesture mid-stroke, you may need to set the no_cache argument to True.
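A small sketch of reading the vectors mid-stroke, assuming gesture is a GestureContainer obtained from one of the surface events:

    # Outside of stroke start/end the cached result may be stale,
    # so bypass the cache explicitly.
    strokes = gesture.get_vectors(no_cache=True)
    for stroke in strokes:
        print('stroke with %d point(s)' % len(stroke))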

handles(touch)[source]

Returns True if this container handles the given touch

single_points_test()[source]

Returns True if the gesture consists only of single-point strokes; in this case it must be discarded, or an exception will be raised

update_bbox(touch)[source]

Update gesture bbox from a touch coordinate

class kivy.uix.gesturesurface.GestureSurface(**kwargs)[source]

Bases: kivy.uix.floatlayout.FloatLayout

Simple gesture surface to track/draw touch movements. Typically used to gather user input suitable for kivy.multistroke.Recognizer.

Properties:
temporal_window

Time to wait after the last touch_up event before attempting to recognize the gesture. If you set this to 0, the on_gesture_complete event is not fired unless the max_strokes condition is met.

temporal_window is a NumericProperty and defaults to 2.0

max_strokes

Max number of strokes in a single gesture; if this is reached, recognition will start immediately on the final touch_up event. If this is set to 0, the on_gesture_complete event is not fired unless the temporal_window expires.

max_strokes is a NumericProperty and defaults to 2.0

bbox_margin

Bounding box margin for detecting gesture collisions, in pixels.

bbox_margin is a NumericProperty and defaults to 30

draw_timeout

Number of seconds to keep lines/bbox on canvas after the on_gesture_complete event is fired. If this is set to 0, gestures are immediately removed from the surface when complete.

draw_timeout is a NumericProperty and defaults to 3.0

color

Color used to draw the gesture, as an RGBA list. This option has no effect if use_random_color is True.

color is a ColorProperty and defaults to [1, 1, 1, 1] (white)

Changed in version 2.0.0: Changed from ListProperty to ColorProperty.

use_random_color

Set to True to pick a random color for each gesture; if you do this, color is ignored.

use_random_color is a BooleanProperty and defaults to False

line_width

Line width used for tracing touches on the surface. Set to 0 if you only want to detect gestures without drawing anything. If you use 1.0, OpenGL GL_LINE is used for drawing; values > 1 will use an internal drawing method based on triangles (less efficient), see kivy.graphics.

line_width is a NumericProperty and defaults to 2

draw_bbox

Set to True if you want to draw a bounding box behind gestures. This only works if line_width >= 1.

draw_bbox is a BooleanProperty and defaults to True

bbox_alpha

Opacity of the bounding box if draw_bbox is True.

bbox_alpha is a NumericProperty and defaults to 0.1
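The properties above can be passed to the constructor or assigned later; the values in this sketch are arbitrary examples:

    from kivy.uix.gesturesurface import GestureSurface

    surface = GestureSurface(
        temporal_window=1.0,   # recognize 1s after the last touch_up
        max_strokes=3,         # or immediately once a gesture has 3 strokes
        line_width=2,
        use_random_color=True,
        draw_bbox=True,
        bbox_alpha=0.1,
    )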

Events:
on_gesture_start GestureContainer

Fired when a new gesture is initiated on the surface, i.e. the first on_touch_down that does not collide with an existing gesture on the surface.

on_gesture_extend GestureContainer

Fired when a touch_down event occurs within an existing gesture.

on_gesture_merge GestureContainer, GestureContainer

Fired when two gestures collide and get merged to one gesture. The first argument is the gesture that has been merged (no longer valid); the second is the combined (resulting) gesture.

on_gesture_complete GestureContainer

Fired when a set of strokes is considered a complete gesture; this happens when the temporal_window expires or max_strokes is reached. Typically you will bind to this event and use the GestureContainer's get_vectors() method to match against your gesture database (see the sketch after this event list).

on_gesture_cleanup GestureContainer

Fired draw_timeout seconds after on_gesture_complete. Immediately before this event, the gesture is removed from the canvas (if line_width > 0 or draw_bbox is True) and from the internal gesture list.

on_gesture_discard GestureContainer

Fired when a gesture does not meet the minimum size requirements for recognition (width/height < 5, or it consists only of single-point strokes).
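Below is a hedged sketch of the typical on_gesture_complete workflow, assuming surface is a GestureSurface instance and that the Recognizer database gdb is populated elsewhere (e.g. with gdb.add_gesture()); see kivy.multistroke for details:

    from kivy.multistroke import Recognizer

    gdb = Recognizer()  # populate with gdb.add_gesture(...) elsewhere

    def on_complete(surface, gesture):
        # get_vectors() returns the strokes in the candidate format
        # expected by Recognizer.recognize()
        gdb.recognize(gesture.get_vectors())

    def on_search_complete(gdb, pt):
        best = pt.best
        if best['name'] is not None:
            print('best match: %s (score %f)' % (best['name'], best['score']))

    gdb.bind(on_search_complete=on_search_complete)
    surface.bind(on_gesture_complete=on_complete)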

find_colliding_gesture(touch)[source]

Checks if a touch x/y collides with the bounding box of an existing gesture. If so, returns it; otherwise returns None.

get_gesture(touch)[source]

Returns GestureContainer associated with given touch

init_gesture(touch)[source]

Create a new gesture from a touch, i.e. it is the first touch on the surface, or it was not close enough to any existing gesture (yet)

merge_gestures(g, other)[source]

Merges two gestures together; the older one is retained and the newer one has its GestureContainer.was_merged flag set.

on_touch_down(touch)[source]

When a new touch is registered, the first thing we do is to test if it collides with the bounding box of another known gesture. If so, it is assumed to be part of that gesture.

on_touch_move(touch)[source]

When a touch moves, we add a point to the line on the canvas so the path is updated. We must also check if the new point collides with the bounding box of another gesture - if so, they should be merged.

on_touch_up(touch)[source]

Receive a touch up event. The touch is in parent coordinates.

See on_touch_down() for more information.