1) Currently developing a standard HTML5 website.
Within this site, there are various clickable elements nested within multiple layers of divs.
HAML represents this structure as follows:
%body
  #root
    #content
    #header
    #footer
One particular div, #content, contains clickable buttons, draggable elements, and video players that use mouse events such as click, mousemove, mouseenter, etc.
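For context, a minimal sketch of the kind of mouse-event wiring assumed inside #content; the button selector and the drag helper are illustrative, not from the original post:

```javascript
// Illustrative wiring for the mouse events #content depends on.
// dragDelta is the pure piece: the offset between the mousedown
// point and the current mousemove position.
function dragDelta(startX, startY, curX, curY) {
  return { dx: curX - startX, dy: curY - startY };
}

if (typeof document !== 'undefined') {
  const content = document.getElementById('content');

  content.addEventListener('click', (e) => {
    if (e.target.closest('button')) {
      // handle the button click here
    }
  });

  let dragStart = null;
  content.addEventListener('mousedown', (e) => {
    dragStart = { x: e.clientX, y: e.clientY };
  });
  content.addEventListener('mousemove', (e) => {
    if (!dragStart) return;
    const delta = dragDelta(dragStart.x, dragStart.y, e.clientX, e.clientY);
    // apply delta to the dragged element's position
  });
  document.addEventListener('mouseup', () => { dragStart = null; });
}
```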
2) Introducing touch gesture detection.
Everything runs smoothly until a layer (div) is added specifically to recognize touch gestures.
This results in the following structure:
%body
  #root
    #content
    #header
    #footer
  #touch-area
The #touch-area, which spans the window, is designed to interpret user touch gestures using events such as touchstart, touchmove, and touchend.
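A minimal sketch of the kind of gesture recognition meant here; the swipe/tap classification and the 30 px threshold are assumptions, since the post does not say which gestures are detected:

```javascript
// Sketch of gesture detection on the #touch-area overlay. classifySwipe
// is pure: it compares the start and end touch points.
function classifySwipe(start, end, minDist = 30) {
  const dx = end.x - start.x;
  const dy = end.y - start.y;
  if (Math.max(Math.abs(dx), Math.abs(dy)) < minDist) return 'tap';
  return Math.abs(dx) > Math.abs(dy)
    ? (dx > 0 ? 'swipe-right' : 'swipe-left')
    : (dy > 0 ? 'swipe-down' : 'swipe-up');
}

// Browser wiring: record the first touch point, classify on touchend.
if (typeof document !== 'undefined') {
  const area = document.getElementById('touch-area');
  let start = null;
  area.addEventListener('touchstart', (e) => {
    start = { x: e.touches[0].clientX, y: e.touches[0].clientY };
  });
  area.addEventListener('touchend', (e) => {
    if (!start) return;
    const t = e.changedTouches[0];
    console.log(classifySwipe(start, { x: t.clientX, y: t.clientY }));
    start = null;
  });
}
```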
3) Issue arises with controlling mouse events.
Because #touch-area sits on top, it captures all mouse events, so they never reach the elements underneath. Various attempts have been made to resolve this:
3.1) CSS approach:
Applying pointer-events: none to #touch-area lets mouse events pass through, but it also stops the layer from receiving touch events.
3.2) Exploring JavaScript solutions:
Approaches like "Click through div" add a click listener to the top div, temporarily hide the div so the event can be re-triggered on the element underneath, then show it again. Drawbacks:
- Relies on jQuery
- Requires explicitly listing which events are allowed to pass through #touch-area
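A jQuery-free sketch of that hide-and-re-dispatch idea; the helper name, the injected callbacks, and the use of document.elementFromPoint are my additions:

```javascript
// Hide the overlay, hit-test the point underneath, restore the overlay,
// and hand the event to whatever was found. pick/send are injected so
// the DOM-specific parts stay swappable.
function forwardEventThrough(overlay, clientX, clientY, pick, send) {
  overlay.style.display = 'none';        // take overlay out of hit-testing
  const target = pick(clientX, clientY); // e.g. document.elementFromPoint
  overlay.style.display = '';            // restore the overlay
  if (target && target !== overlay) send(target);
  return target;
}

// Browser wiring: forward clicks landing on #touch-area to the page below.
if (typeof document !== 'undefined') {
  const area = document.getElementById('touch-area');
  area.addEventListener('click', (e) => {
    forwardEventThrough(
      area, e.clientX, e.clientY,
      (x, y) => document.elementFromPoint(x, y),
      (target) => target.dispatchEvent(new MouseEvent('click', e))
    );
  });
}
```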
3.3) Focus on mousemove:
Researching HTML5 touchscreen behavior reveals the following event order for a touch:
1. touchstart 2. touchmove 3. touchend 4. mouseover 5. mousemove 6. mousedown 7. mouseup 8. click
The idea: listen for mousemove on #touch-area, keep the div hidden initially so the underlying elements receive all events, then run a timeout routine that reveals the div after a delay so it can catch touch events. However, a swipe on the touchscreen first triggers mousemove events before touchstart, disrupting the plan.
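For reference, the attempted routine might be sketched as follows; the delay value, function names, and injected show/hide callbacks are assumptions, not from the original attempt:

```javascript
// Sketch of the timed hide/reveal routine described above. The overlay
// starts hidden; when a touch is suspected, it is revealed to catch the
// gesture and hidden again after a delay. The 100 ms value is assumed.
const REVEAL_DELAY_MS = 100;

function makeOverlayToggler(show, hide, schedule = setTimeout) {
  hide(); // start hidden so mouse events reach the buttons underneath
  return function onTouchSuspected() {
    show();                          // reveal #touch-area to catch the gesture
    schedule(hide, REVEAL_DELAY_MS); // get out of the way again shortly after
  };
}
```

Because a swipe fires mousemove while the overlay is still hidden, the reveal comes too late, which is exactly the failure described above.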
4) Seeking assistance from the community.
How can a top layer effectively catch touch gestures without interfering with underlying clickable buttons?