HP Forums
[newRPL] What do we use touch for? - Printable Version

+- HP Forums (https://www.hpmuseum.org/forum)
+-- Forum: Not HP Calculators (/forum-7.html)
+--- Forum: Not quite HP Calculators - but related (/forum-8.html)
+--- Thread: [newRPL] What do we use touch for? (/thread-17702.html)



[newRPL] What do we use touch for? - Claudio L. - 11-12-2021 04:25 PM

We finally have functioning touchscreen drivers on the Prime G1 port.

First question that comes up is: now what?

* The obvious is to press the menu items directly on the screen.

We could also use it to manipulate the stack, but how exactly? We are open to proposals.

I'll throw a bone to start the brainstorming:
* Dragging up and down on the stack would simply scroll to see more elements
* Press and hold would start the interactive stack mode and "grab" an element; maybe we could then drag and drop it to a new location in the stack?

What about removing an element in the middle of the stack? That typically takes a few keystrokes of the interactive stack, maybe it can be done with touch quicker? How?


RE: [newRPL] What do we use touch for? - Stevetuc - 11-12-2021 05:16 PM

(11-12-2021 04:25 PM)Claudio L. Wrote:  We finally have functioning touchscreen drivers on the Prime G1 port.

[..]

What about removing an element in the middle of the stack? That typically takes a few keystrokes of the interactive stack, maybe it can be done with touch quicker? How?

Flick left or right, like deleting an email?
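A flick could map straight onto the classic RPL idiom for removing level n, namely "n ROLL DROP". A minimal sketch in C over a hypothetical array-backed stack (index 0 holds level 1; the struct and names are illustrative, not newRPL internals):

```c
#define STACK_MAX 64

/* Hypothetical array-backed stack: data[0] is stack level 1. */
typedef struct {
    int data[STACK_MAX];
    int depth;
} stack_t;

/* Remove the element at stack level n (1 = top of stack).
   Equivalent to "n ROLL DROP"; returns 0 on success, -1 if n is
   out of range. */
int stack_remove_level(stack_t *s, int n)
{
    if (n < 1 || n > s->depth)
        return -1;
    /* shift levels n+1..depth down by one, overwriting level n */
    for (int i = n - 1; i < s->depth - 1; i++)
        s->data[i] = s->data[i + 1];
    s->depth--;
    return 0;
}
```

So a flick on the row displaying level n would just call the removal with that n, no interactive-stack round trip needed.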


RE: [newRPL] What do we use touch for? - toml_12953 - 11-12-2021 07:08 PM

(11-12-2021 04:25 PM)Claudio L. Wrote:  We finally have functioning touchscreen drivers on the Prime G1 port.

First question that comes up is: now what?

* The obvious is to press the menu items directly on the screen.

We could also use it to manipulate the stack, but how exactly? We are open to proposals.

I'll throw a bone to start the brainstorming:
* Dragging up and down on the stack would simply scroll to see more elements
* Press and hold would start the interactive stack mode and "grab" an element; maybe we could then drag and drop it to a new location in the stack?

What about removing an element in the middle of the stack? That typically takes a few keystrokes of the interactive stack, maybe it can be done with touch quicker? How?

How about being able to drag and drop stack elements to rearrange the stack? Or to select a line?


RE: [newRPL] What do we use touch for? - JoJo1973 - 11-16-2021 06:58 PM

Ok, I'll throw my ideas in the arena.

1) Double tap on level 1: open editor with level 1
2) Upward swipe beginning from bottom of screen: activate interactive stack
3) Horizontal swipe on level 1: SWAP if left-to-right, DROP if right-to-left. BONUS: a "left-handed" system flag reverses the directions of the swipes
4) One finger double tapping in the middle of the screen: swap menu 1 and menu 2
5) Two finger double tapping in the middle of the screen: hide menu 2
6) One finger double tapping on menu 1 or menu 2 area: that menu becomes the active menu
7) Horizontal swipe (or double tap) in status area: autocompletion if available
8) In editor and interactive stack: one finger hold selects a token or a level, hold and swipe selects a range of text/levels, plus some gestures for cut/copy/paste.
9) In interactive stack SWAP and DROP as in 3), ROLL/ROLLD as two finger horizontal swipes, double tap is EDIT (available for all levels), two finger double tap is PICK
10) Downward swipe from top of screen: app switcher
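A binding scheme like this boils down to a (screen zone, gesture) lookup table, which would make the assignments easy to tweak or to mirror for a "left-handed" flag. A minimal sketch in C; every enum and action name below is hypothetical, none of it is an existing newRPL API:

```c
#include <stddef.h>

/* Illustrative gesture and zone identifiers -- assumptions only. */
typedef enum {
    GESTURE_DOUBLE_TAP,
    GESTURE_SWIPE_LEFT,
    GESTURE_SWIPE_RIGHT,
    GESTURE_COUNT
} gesture_t;

typedef enum {
    ZONE_LEVEL1,   /* stack level 1 row */
    ZONE_MENU,     /* menu 1 / menu 2 area */
    ZONE_STATUS,   /* status area */
    ZONE_COUNT
} zone_t;

/* One action name per (zone, gesture) pair; NULL means unbound. */
static const char *bindings[ZONE_COUNT][GESTURE_COUNT] = {
    [ZONE_LEVEL1] = {
        [GESTURE_DOUBLE_TAP]  = "EDIT",
        [GESTURE_SWIPE_RIGHT] = "SWAP",
        [GESTURE_SWIPE_LEFT]  = "DROP",
    },
    [ZONE_STATUS] = {
        [GESTURE_SWIPE_RIGHT] = "ACCEPT_COMPLETION",
    },
};

const char *lookup_action(zone_t z, gesture_t g)
{
    return bindings[z][g];  /* NULL if nothing is bound */
}
```

Reversing the swipe directions for the left-handed flag would then just mean swapping the LEFT/RIGHT columns of the table at startup instead of scattering if-statements through the gesture handler.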


RE: [newRPL] What do we use touch for? - JoJo1973 - 11-16-2021 07:04 PM

11) On screen keyboard?


RE: [newRPL] What do we use touch for? - JoJo1973 - 11-16-2021 11:13 PM

(11-16-2021 06:58 PM)JoJo1973 Wrote:  7) Horizontal swipe (or double tap) in status area: autocompletion if available

7) In status area: swipe up/down browses autocompletions; swipe right accepts, swipe left deletes partially entered token


RE: [newRPL] What do we use touch for? - Claudio L. - 11-18-2021 03:14 PM

Lots of interesting ideas!
I guess what I need to work on first is some kind of framework to detect these gestures. For example, how do you tell a swipe that starts outside the screen from a press right at the edge of the screen followed by a drag?
I'm going to try to distinguish them by the speed of the finger when it enters the touchscreen (if you're swiping in from outside the edge, your finger is already at speed, while if you touch and then move, I should be able to see the finger accelerate), but I'm not sure how well that will work in real-world scenarios.
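The entry-speed test could be as simple as comparing the distance covered between the first two touch samples against a speed threshold. A rough sketch in C; the sample struct, edge margin, and threshold values are assumptions for illustration, not tuned numbers:

```c
#include <stdbool.h>

/* One raw touch report: panel coordinates plus a millisecond timestamp.
   Layout is an assumption, not a real driver structure. */
typedef struct { int x, y, t_ms; } touch_sample_t;

#define EDGE_MARGIN_PX    8     /* "near the edge" band, assumed */
#define SWIPE_SPEED_PX_MS 1.0   /* entry speed above this => came from off-screen */

/* True if the touch began inside the edge band of a width x height panel. */
bool starts_near_edge(touch_sample_t s, int width, int height)
{
    return s.x < EDGE_MARGIN_PX || s.y < EDGE_MARGIN_PX ||
           s.x >= width  - EDGE_MARGIN_PX ||
           s.y >= height - EDGE_MARGIN_PX;
}

/* True if the first two samples suggest the finger was already moving
   when it entered the panel, i.e. a swipe from outside the screen.
   Compares squared speed to avoid a sqrt() call in the driver path. */
bool entered_at_speed(touch_sample_t a, touch_sample_t b)
{
    int dt = b.t_ms - a.t_ms;
    if (dt <= 0)
        return false;
    double dx = b.x - a.x, dy = b.y - a.y;
    double speed_sq = (dx * dx + dy * dy) / ((double)dt * dt);
    return speed_sq > SWIPE_SPEED_PX_MS * SWIPE_SPEED_PX_MS;
}
```

An edge swipe would then be "starts_near_edge AND entered_at_speed", while a slow touch in the same band falls through to the ordinary press-and-drag path.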


RE: [newRPL] What do we use touch for? - Han - 11-20-2021 05:00 PM

(11-18-2021 03:14 PM)Claudio L. Wrote:  Lots of interesting ideas!
I guess what I need to work on first is some kind of framework to detect these gestures. For example, how do you tell a swipe that starts outside the screen from a press right at the edge of the screen followed by a drag?
I'm going to try to distinguish them by the speed of the finger when it enters the touchscreen (if you're swiping in from outside the edge, your finger is already at speed, while if you touch and then move, I should be able to see the finger accelerate), but I'm not sure how well that will work in real-world scenarios.

The way the MOUSE command works on the Prime is that it tracks the position, the relative change in position, and whether the event is a "mouse-down" or "mouse-up" event. I think the Prime can track up to 2 fingers. So position, change in position, and the time between a tap and release should provide enough information to tell swiping in from outside the screen apart from pressing right at the edge and then dragging. As soon as your finger enters the touchscreen area, it's a mouse-down event, whether you're tapping or already in the process of swiping. If no mouse-up event occurs, you're still in a "drag/swipe" state. Since some people naturally swipe slowly while others swipe more quickly, I suspect acceleration will not be of much use beyond determining scroll speed.
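This down/move/up model can be sketched as a tiny state machine that classifies by travel distance and press duration rather than acceleration. All names and thresholds below are assumptions for illustration, not the Prime's actual MOUSE interface:

```c
#include <stdlib.h>

#define TAP_MAX_TRAVEL_PX 10    /* assumed: more movement => drag */
#define TAP_MAX_MS        250   /* assumed: longer press => not a tap */

typedef enum { EV_DOWN, EV_MOVE, EV_UP } ev_type_t;
typedef struct { ev_type_t type; int x, y, t_ms; } touch_ev_t;

typedef enum { G_NONE, G_PENDING, G_TAP, G_DRAG } gstate_t;

typedef struct {
    gstate_t state;
    int x0, y0, t0;     /* where and when the touch went down */
} tracker_t;

void tracker_feed(tracker_t *tr, touch_ev_t ev)
{
    switch (ev.type) {
    case EV_DOWN:                   /* finger entered the panel */
        tr->state = G_PENDING;
        tr->x0 = ev.x; tr->y0 = ev.y; tr->t0 = ev.t_ms;
        break;
    case EV_MOVE:                   /* still down: check travel */
        if (tr->state == G_PENDING &&
            (abs(ev.x - tr->x0) > TAP_MAX_TRAVEL_PX ||
             abs(ev.y - tr->y0) > TAP_MAX_TRAVEL_PX))
            tr->state = G_DRAG;     /* moved too far to be a tap */
        break;
    case EV_UP:                     /* finger lifted: decide */
        if (tr->state == G_PENDING && ev.t_ms - tr->t0 <= TAP_MAX_MS)
            tr->state = G_TAP;      /* short press, little travel */
        else if (tr->state == G_PENDING)
            tr->state = G_NONE;     /* long press, not classified here */
        break;
    }
}
```

While no EV_UP has arrived the tracker is still mid-gesture, matching the "if no mouse-up event occurs, you're still in a drag/swipe state" observation; acceleration never enters the decision.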