Information and exchanges on the Scol technology
Yes, it's running a customised Android 5.1 on an Intel Atom CPU. One device connection should suffice at the moment.
They're pretty cool and light, and the marker tracking done with OS3D seems to be working fine from what I could tell.
On a different note, is it possible for OS3D to access other sensors on the Android device separately, such as gyro or accelerometer?
Gyro would be good for head gestures to control the content on the glasses, e.g. scrolling through menus etc.
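To make the head-gesture idea concrete, here is a minimal, hypothetical sketch of how gyro readings could drive menu scrolling: threshold the pitch angular velocity (rad/s) to turn nods into discrete scroll events. How (or whether) OS3D exposes the raw gyro is exactly the open question above, so the sample stream and the function name here are assumptions, not the plugIT's API.

```python
# Hypothetical head-gesture detector: classify a stream of gyro
# pitch-rate samples (rad/s) into scroll events by thresholding.
# Actual sensor access on the glasses is device/plugIT-specific.

def detect_gestures(pitch_rates, threshold=1.5):
    """Emit one event per crossing above/below +-threshold."""
    events = []
    armed = True  # re-arm only after the rate settles near zero
    for rate in pitch_rates:
        if armed and rate > threshold:
            events.append("scroll_down")   # quick nod down
            armed = False
        elif armed and rate < -threshold:
            events.append("scroll_up")     # quick tilt up
            armed = False
        elif abs(rate) < 0.2:
            armed = True                   # head is still again
    return events

# Example stream: a nod down, a pause, then a tilt up
samples = [0.0, 0.4, 2.1, 2.8, 0.6, 0.1, -0.3, -2.0, -0.1]
print(detect_gestures(samples))  # ['scroll_down', 'scroll_up']
```

The re-arming step matters: without it, a single nod would fire several events while the rate stays above the threshold.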
That looks impressive!!
I'm working on an AR application for dental use. I'm using Epson BT-300 smartglasses and need to link them to a series of peripherals and sensors over Bluetooth (preferably) or Wi-Fi.
Not having a coding background, after a short affair with Unity I have found OS3D easier to use and more intuitive for a non-programmer.
Looking forward to seeing the update
That would be great. Thank you very much!
I assume there's no workaround at the moment?
Hi Arkeon,
Thanks for this; however, I'm having difficulty isolating yaw, pitch, and roll into separate variables.
Basically, the app shows a cube indicating the orientation of the device, zeroed on touch. To zero it, I need to read the device orientation at the moment of the touch, and then orient the cube with the real-time device orientation minus the orientation recorded at the touch.
I have done something similar in Unity, but its orientation readings are unstable; I have found OS3D far more repeatable.
Any thoughts or advice on how I can do this?
Cheers,
Paul
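The "zero on touch" step described above can be sketched with quaternion algebra: store the orientation at the touch, then show conj(reference) * current, which is the identity at the touch pose and tracks the device afterwards. This is an illustrative sketch, not OS3D code; the (w, x, y, z) layout and the helper names are assumptions.

```python
# Sketch of "zero on touch" using unit quaternions (w, x, y, z).
# On touch, store the device orientation as `reference`; afterwards,
# drive the cube with conj(reference) * current, so the cube reads
# zero at the touch pose.

def q_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def q_mul(a, b):
    # Hamilton product of two quaternions
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def relative_orientation(reference, current):
    """Rotation from the touch pose to the current pose."""
    return q_mul(q_conj(reference), current)

# If the device has not moved since the touch, the result is
# (numerically close to) the identity quaternion (1, 0, 0, 0):
ref = (0.7071, 0.0, 0.7071, 0.0)  # an arbitrary unit quaternion
print(relative_orientation(ref, ref))
```

Subtracting Euler angles component-wise does not work in general (rotations do not commute), which is why the quaternion form is the safer way to express "orientation minus the recorded orientation".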
Hi
A newbie question here: how can one get the components (yaw, pitch, and roll) from the orientation sensor plugIT?
I'm trying to build a spirit-level application on Android that is zeroed on touch.
Many thanks,
Paul
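If the plugIT outputs a quaternion rather than the three angles, yaw/pitch/roll can be recovered with the standard Z-Y-X (aerospace) conversion. A minimal sketch, assuming a (w, x, y, z) quaternion; the plugIT's own axis and sign conventions may differ, so treat the mapping as an assumption to verify on the device.

```python
import math

# Split an orientation quaternion (w, x, y, z) into yaw, pitch, roll
# in degrees, using the Z-Y-X (aerospace) Euler convention.

def to_ypr(q):
    w, x, y, z = q
    yaw   = math.atan2(2*(w*z + x*y), 1 - 2*(y*y + z*z))
    # clamp to avoid domain errors from floating-point noise
    pitch = math.asin(max(-1.0, min(1.0, 2*(w*y - z*x))))
    roll  = math.atan2(2*(w*x + y*z), 1 - 2*(x*x + y*y))
    return tuple(math.degrees(a) for a in (yaw, pitch, roll))

# Identity orientation -> all three components are zero
print(to_ypr((1.0, 0.0, 0.0, 0.0)))
```

For the spirit level, pitch and roll are the two components to display; yaw (heading) drifts without a magnetometer reference and is usually ignored for levelling.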