I posted this in Programming; feel free to delete that thread, I think this forum is more appropriate.
I would like to implement a multitouch 2D interface using the Area and Multitouch plugITs. The Area would send cursor coordinates to the 'control' action of an FPS or Walkthrough plugIT to drive player movement, while the navigation plugIT itself stays disabled so the user's thumbs don't control navigation when they are not over the Area. Another button would enable full-screen navigation and hide the Area.
Something like this image:
My first question is: how do I format the .png so the image contents do not move when the user puts their mouse/finger over the Area? When I do a mouseover, the contents of the image shift up along the y axis.
I've given up on the idea of showing a 2D UI at app startup. What I think I will do is show a simple scene at startup with the widget, so a new user can immediately interact with a 3D scene: something with a plane, physics, a skybox, and one or two meshes.
I'm uploading a scene that shows the Area behavior issue; I will post it to the thread.
Wait, I see it's in the plugIT code: on mouseover it moves the bitmap. I will figure out how to give the user the option to enable or disable this function (since I want the Area to remain in one state).
Last edited by hebdemnobad (17-Nov-2014 17:45:32)
The Area plugIT takes a picture with 2 states, empty and full.
It triggers the event only when the 2nd state is filled.
So you could use the Button plugIT instead.
If you want this behavior you can also set a picture with the same 2 states.
I see... so let's say I have two pictures. One will have a pre- or post-render callback that takes the xy coordinates of one of the multitouch cursors hovering over it and processes those coordinates into the format
x y z
ax ay az
so that the data can be sent to the FPS plugIT 'control' action.
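Building that two-line payload is plain string formatting; a minimal sketch (Python just to illustrate the arithmetic, since the plugITs themselves are Scol; the translation-then-rotation line order matches the link examples later in the thread):

def format_control(x, y, z, ax, ay, az):
    # Two-line payload for the fps plugIT 'control' action:
    # line 1 "x y z" = translation, line 2 "ax ay az" = rotation.
    return "%s %s %s\n%s %s %s" % (x, y, z, ax, ay, az)

# e.g. walk forward, no rotation; prints "0 0 1" then "0 0 0":
print(format_control(0, 0, 1, 0, 0, 0))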
I've looked through the Picture and Area plugITs and they do not appear to have callbacks that process the presence and location of cursors.
What I'm thinking of doing is taking the Picture plugIT and making it into a walk forward/backward controller.
Then I would take another and make it into a look up/look down/turn left/turn right controller.
Hmm. First I thought this was a programming question, then a plugIT question; I think it's gone back to a programming question.
1. I think I have two steps to take here. The first is retrieving the location of a cursor on an SO3BitmapWidget in a pre- or post-render callback.
2. The second is doing the same with two or more cursors (and ignoring all but two of them).
So where in the API would I go to achieve step 1? There is the cursor API in the Scol core API, but it only gets a cursor location based on an ObjWin, not an SO3BitmapWidget. Am I stuck with having to calculate the cursor position by determining where the SO3BitmapWidget is in relation to the mainWindow's ObjWin component, or is there some way to get the cursor position on the widget itself directly, without having to find out where the widget sits inside the ObjWin it is living in? (The subtraction involved is sketched after this post.)
Or perhaps the simplest thing is to combine the code of the Multitouch and Picture plugITs in a single plugIT with a single struct?
And of course there may be a very easy way of doing what I seek to do with plugITs that I haven't seen yet.
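For step 1, the window-to-widget conversion is one subtraction per axis; a minimal sketch of that arithmetic (Python for illustration; the widget origin and size would have to come from wherever the widget is laid out, so those inputs are assumed):

def widget_local_pos(cursor_win, widget_origin, widget_size):
    # cursor_win: (x, y) cursor position in ObjWin coords (0 0 = top left)
    # widget_origin: (x, y) of the SO3BitmapWidget inside that ObjWin
    # widget_size: (w, h) of the widget in pixels
    lx = cursor_win[0] - widget_origin[0]
    ly = cursor_win[1] - widget_origin[1]
    if 0 <= lx < widget_size[0] and 0 <= ly < widget_size[1]:
        return (lx, ly)   # cursor is over the widget
    return None           # cursor is outside the widget

# cursor at (150, 400) in the window, widget at (100, 350), 200x100 px:
print(widget_local_pos((150, 400), (100, 350), (200, 100)))  # (50, 50)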
Last edited by hebdemnobad (17-Nov-2014 19:22:36)
I'm going to abandon this project (using a widget to control the player; it's too steep a learning curve at the moment). Instead I will add functionality to my app by using a Picture widget to overlay the scene with visual instructions on how to use FPS or Examine navigation, which the user can toggle on or off.
Last edited by hebdemnobad (17-Nov-2014 20:37:00)
Hmm, I should update Button, Picture, Area, Examine view... with multi touch points.
Will look at this tomorrow.
You achieve in minutes what I struggle over for hours... it's a bit like Interstellar, the movie I saw last night with my son...
A pinch gesture to zoom in/out on the Examine view could be useful also.
That works by default on my tablet, if I don't use a navigation plugIT.
Done.
The Area and Button plugITs now manage multi touch.
The button was harder than I thought ^^ there are a lot of cases to manage, and more to keep the same behavior with the mouse.
So the button now has "pushed" and "released" events when you push it.
If you move while pushing on it, the "pushed move" event sends a direction ratio based on the button size (can be used to make a joystick feature).
If you release the move outside the button, the left click is not triggered, so you have to push and release on the button to get the click event.
Setups updated at the same address as before.
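That "direction ratio" can be pictured as normalizing the drag offset by half the button size, joystick-style; a rough sketch of the idea, not the actual plugIT code (Python for illustration):

def direction_ratio(touch, center, size):
    # touch/center in window pixels, size = (width, height) of the button
    rx = (touch[0] - center[0]) / (size[0] / 2.0)
    ry = (touch[1] - center[1]) / (size[1] / 2.0)
    clamp = lambda v: max(-1.0, min(1.0, v))   # keep each axis in [-1, 1]
    return clamp(rx), clamp(ry)

# finger 40 px right of and 20 px above the center of an 80x80 button:
print(direction_ratio((140, 80), (100, 100), (80, 80)))  # (1.0, -0.5)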
Amazing, thanks! You are a programming mammoth!
Last edited by hebdemnobad (18-Nov-2014 14:55:34)
I've taken a look at carea.pkg. How would I be able to extract the xy coordinates of the members of the touchpoint list before every frame? I assume this is something I could do in the cbUpdateCursor function?
Do the x, y, vx, and vy arguments in the function declaration below represent the location and velocity of each touchpoint on the area, or are those figures for the viewport as a whole?
fun cbUpdateCursor(inst, viewstr, id, x, y, vx, vy, areastr)=
Thx!
Coordinates are in window coords, so 0 0 is the top left of the window.
vx vy are not really velocity, just the relative move.
Why do you need the finger coords here?
The code in the Multitouch plugIT should be easier, also.
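So per touchpoint id you only need to subtract the area's origin; a sketch of keeping an id -> local-position table across updates (Python for illustration; only the cbUpdateCursor arguments come from carea.pkg, everything else is assumed):

def update_touchpoints(touchpoints, cursor_id, x, y, area_pos, area_size):
    # touchpoints: dict of cursor_id -> (lx, ly), refreshed every update
    # x, y: window coords (0 0 = top left); vx, vy would be the deltas
    lx, ly = x - area_pos[0], y - area_pos[1]
    if 0 <= lx < area_size[0] and 0 <= ly < area_size[1]:
        touchpoints[cursor_id] = (lx, ly)      # finger is on the area
    else:
        touchpoints.pop(cursor_id, None)       # finger left the area
    return touchpoints

pts = {}
print(update_touchpoints(pts, 0, 130, 420, (100, 400), (200, 100)))  # {0: (30, 20)}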
This is what I am thinking. You have FPS navigation, and you have two Area plugITs, one on the lower left of the screen, the other on the lower right. As a user you are holding the tablet, or phone (if Windows ever makes one), between your hands and are navigating with your thumbs.
If you can extract the y position of the touchpoint on the left area, relative to the area, as each frame is rendered (where the top of the area is [whatever, 0] and the bottom is [whatever, height of area]), you can use that coordinate and do operations on it. In my plan, the y position of the touchpoint on the left would determine whether a user is moving forward or backward, with the speed determined by the offset from the horizontal center of the area.
As for the area on the right, the [x, y] coordinates at each rendered frame would determine the amount of rotation of the camera, with the speed of rotation determined by the offset from the horizontal and vertical centers.
If all it comes down to is getting the xy coordinates of the touchpoints on an area, I can figure it out from there. Is there a way I can do this by using the Multitouch and Area plugITs together, without coding?
If I do have to code, what I have to do is determine the relationship of the x position on the window to the x position on the area, and likewise for y. If that is what you would be doing anyway, I think I can handle it and report back with my results. (It's just arithmetic; I still remember that from school, thankfully.)
A keyboard for my tablet should arrive tomorrow, which will make testing faster.
Once the position and speed numbers are extracted, they can be processed into the two-line "x y z \n ax ay az" format that can be sent to the FPS 'control' action.
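A sketch of that whole plan end to end (Python, arithmetic only; the axis assignments and scaling are assumptions, and in practice the values would want clamping and a dead zone):

def thumb_controls(left_y, right_xy, left_h, right_w, right_h):
    # left_y: touch y on the left area (None if no finger), 0 = top
    # right_xy: (x, y) touch on the right area (None if no finger)
    move = turn = pitch = 0.0
    if left_y is not None:
        # offset from the vertical middle of the left area -> speed and sign
        move = (left_h / 2.0 - left_y) / (left_h / 2.0)
    if right_xy is not None:
        turn = (right_xy[0] - right_w / 2.0) / (right_w / 2.0)
        pitch = (right_xy[1] - right_h / 2.0) / (right_h / 2.0)
    # two-line 'control' payload: "x y z" then "ax ay az"
    return "0 0 %.2f\n%.2f %.2f 0" % (move, pitch, turn)

# left thumb near the top of a 100 px tall area, right thumb at (150, 60):
print(thumb_controls(20, (150, 60), 100, 200, 100))  # "0 0 0.60" / "0.20 0.50 0"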
This is what I did in the Button plugIT; forget the Area plugIT, it is not what you need.
If you move while pushing on it, the "pushed move" event sends a direction ratio based on the button size (can be used to make a joystick feature).
You can use these values with the 'control' action of the FPS.
Thx, I will look at that.
Hello arkeon, I just tested the latest beta on my tablet. The button does control the FPS camera.
However, it's not exactly what I'm trying to implement (although I have around 90% of that figured out; will post in Programming).
This is how the button behaves on the tablet:
drag up and release (or drag finger out of button area) > camera moves down along the world y axis until you touch the button again
drag down and release (or drag finger out of button area) > camera moves up along the world y axis until you touch the button again
drag left and release (or drag finger out of button area) > camera translates left along the world x axis until you touch the button again
drag right and release (or drag finger out of button area) > camera translates right along the world x axis until you touch the button again
If that's what you meant to do, it works just right; it controls the x and y vectors of the camera shell.
button.pushed move -> nav.control
and in link parameter:
$1 0 $2
0 0 0

button.released -> nav.control
0 0 0

button rot.pushed move -> nav.control
and in link parameter:
0 0 0
$2 $1 0

button rot.released -> nav.control
0 0 0
0 0 0
0 0 0
Thx arkeon... what do the $ symbols mean?
And is this plugIT posted to Redmine (I think you worked on it after the latest beta you bundled)? The Button plugIT running on my tablet, which I downloaded yesterday, doesn't have the rot.pushed or rot.released events.
This is an instance name example: "button rot" is the second Button instance, used for rotation.
$1 means: the nth parameter from the event.
"pushed move" returns X Y, so to set X 0 Y you write:
$1 0 $2
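Seen as template expansion, the link parameter works like this toy sketch (Python; the event values here are invented):

def expand(template, params):
    # replace $1, $2, ... with the event's 1-based parameters
    for i, value in enumerate(params, start=1):
        template = template.replace("$%d" % i, str(value))
    return template

# "pushed move" emits X Y; the movement button maps them to "X 0 Y":
print(expand("$1 0 $2", [3, -7]))   # 3 0 -7
# the rotation instance puts them on the rotation line as "Y X 0":
print(expand("$2 $1 0", [3, -7]))   # -7 3 0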
I see, so:
$1 maps the x value from "pushed move" (parameter 1; I guess they are 1-based) to itself, but the $ allows the amount to vary, to be positive or negative, according to the value set in fx.
$2 maps the fy quantity to the z vector, and it also allows the amount of the fy quantity to influence forward/backward movement.
So each button can influence two parameters, I see.
The only thing I would like to do is allow movement to take place while the touchpoint is on the button and has not yet been released or exited the button area. Is there a callback that keeps track of where the touchpoints are on the screen?