As the Intel RealSense SDK evolves, here is what we can expect in the near future.
My exposure to HTML 5 and JavaScript
I started learning HTML 5 and JavaScript a few months back and was struck by how easy it is to implement cross-platform apps; the opportunity is endless. Intel has already handed over a tool for developing cross-platform apps with HTML 5 and JavaScript: the Intel XDK. I target Android applications, but anyone can use it to build for other platforms. Exploring the different templates in the Intel XDK, I found the Crosswalk framework the best, as I could bring WebGL apps to the Android platform. Hence the voyage started.
The New Intel RealSense SDK
As the new Intel RealSense SDK beta will soon be released, the next-gen SDK will let us build Intel RealSense apps for tablets, PCs, Ultrabooks, and 2 in 1s, each with its own version of the Intel RealSense camera embedded in it. Just imagine how good it will be to develop apps for these form factors: the reach of these platforms will be immense, and the result will be more vibrant and innovative human interaction.
The next Intel RealSense SDK beta will come with features that we can use in different parts of our apps.
As JavaScript will be fully supported, chances are that in the near future we will be able to integrate different JavaScript libraries using Visual Studio.
Supported features the SDK will add include:
i) Input Device Manager
ii) Multi-Mode Support
iii) Power Management
iv) Extensible Framework
v) Privacy Notification Tool
What Lies In Store for HTML 5 JavaScript Developers
Here is a brief overview of how we can develop web-based Unity Web Player apps or interactive websites.
More updates in the future could possibly include Windows 8.1 Metro support (soon to be released), so here is an opportunity for HTML5 JavaScript developers to build apps using the Intel RealSense SDK, integrating different WebGL components and JavaScript libraries and incorporating creatively coded apps with a touch of Intel RealSense. This is definitely exciting!
What can happen in the near future, and how might we create connected HTML 5 JavaScript and Intel RealSense applications? Here's a brief overview.
What will happen in the future?
It's safe to predict that the Intel RealSense SDK will be extended with a JavaScript library, perhaps named RealSense.js, which we will include like any other supported JS library in the index.html page.
We will then have to initialize the session in a <script> tag, release it before quitting, and implement the gesture logic within another function. With this we will be able to bring in and integrate different JavaScript libraries such as:
i) Three.js
ii) Processing.js
iii) Paper.js
iv) Physics.js
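To make the speculation concrete, here is a minimal sketch of what that page logic might look like. Everything below is an assumption: no RealSense.js library has shipped, so the session object is a stand-in that only illustrates the lifecycle described above (initialize the session, run the gesture logic while it is open, release it before quitting).

```javascript
// Speculative sketch only: "RealSense.js" does not exist yet, so this
// factory is a stand-in illustrating the init/use/release lifecycle.
function createSession() {
  let open = false;
  return {
    init() {                 // would acquire the camera pipeline
      open = true;
      return open;
    },
    onGesture(label) {       // the gesture logic lives in its own function
      return open ? `handling ${label}` : "session closed";
    },
    release() {              // must run before the page quits
      open = false;
      return open;
    },
  };
}

// Usage pattern inside a <script> tag:
const session = createSession();
session.init();                                   // at page load
const result = session.onGesture("swipe_left");   // inside the app loop
session.release();                                // on window unload
```

The point of the sketch is the ordering, not the names: whatever API finally ships, the init call belongs at page load and the release call belongs in the unload path.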
Imagine implementing these JavaScript libraries within your apps; it will give a great feel and will leverage the Intel RealSense SDK features to the fullest.
Most of these libraries are open source, and we have a lot of examples to explore. Below, I give an overview of how we can use the Intel RealSense SDK with these JS libraries.
Our primary target for the app would be to look for the functions that handle mouse movements and associate these with gesture events.
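In JavaScript terms, this refactor can be sketched as a dispatch table: the actions a mouse handler used to trigger are keyed by gesture label instead. The gesture labels and action names below are made-up placeholders, not constants from any shipped SDK.

```javascript
// Sketch of the mouse-to-gesture refactor in JavaScript. The gesture
// labels and action names are hypothetical placeholders.

// Actions that were previously bound to mouse/keyboard handlers:
const actions = {
  removeForce:  () => "force removed",
  toggleClear:  () => "clear toggled",
  dragForce:    () => "force follows hand",
  releaseForce: () => "force released",
};

// The refactor: each gesture label maps to one of those actions.
const gestureBindings = {
  swipe_left: "removeForce",
  swipe_up:   "toggleClear",
  swipe_down: "dragForce",
  circle:     "releaseForce",
};

function onGesture(label) {
  const action = actions[gestureBindings[label]];
  return action ? action() : null;   // unknown gestures are ignored
}
```

With this pattern the original actions stay untouched; only the binding table changes as new gestures become available.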
Here is an example of how to do it for an existing Processing (.pde) app, changing mouse movements to gesture movements. The excerpt below shows the original mouse- and key-handling functions:
void keyPressed(){
  if(key == ' '){   // toggles force rendering (the key literal was empty in the original excerpt; space assumed)
    renderForces = !renderForces;
  } else if(key == DELETE){
    clear = !clear;
  } else if(key == TAB){
    mouseAttract = !mouseAttract;
    mouseForce.setAttract(mouseAttract);
  } else {
    float mass = 0.0;
    boolean attract = mouseAttract;
    switch(key){
      case '1': mass = 100.0; break;
      case '2': mass = 200.0; break;
      case '3': mass = 300.0; break;
      case '4': mass = 400.0; break;
      case '5': mass = 500.0; break;
      case '6': mass = 600.0; break;
    }
    if(mass > 0) addForce(new PVector(mouseX, mouseY), mass, attract, true);
  }
}

void mousePressed(){
  if(keyPressed){
    if(key == BACKSPACE){
      PVector mousePos = new PVector(mouseX, mouseY);
      for(int i = forces.size()-1; i >= 0; i--){
        Force f = (Force) forces.get(i);
        if(circleIntercept(mousePos, f.pos, 5)){
          forces.remove(i);
          break;
        }
      }
    } else if(key == CODED){
      if(keyCode == CONTROL){
        PVector mousePos = new PVector(mouseX, mouseY);
        for(int i = forces.size()-1; i >= 0; i--){
          Force f = (Force) forces.get(i);
          if(circleIntercept(mousePos, f.pos, 5)){
            f.forceOn = !f.forceOn;
            break;
          }
        }
      }
    }
  } else {
    mouseForce.pos.set(mouseX, mouseY, 0);
    mouseForce.forceOn = true;
  }
}

void mouseDragged(){
  mouseForce.pos.set(mouseX, mouseY, 0);
  mouseForce.forceOn = true;
}

void mouseReleased(){
  mouseForce.forceOn = false;
}
The corresponding code shows how we do it with gesture movements: the mouse-handling code is the target, and we convert it to gesture detection and hand-driven movement.
import intel.pcsdk.*;

float[] mHandPos = new float[4];
PXCUPipeline session;
PXCMGesture.GeoNode hand = new PXCMGesture.GeoNode();
ArrayList particles;
ArrayList forces;
PImage buffer;
PImage loadedImg;
float G = 1;
boolean clear = true;
boolean renderForces = false;
Force mouseForce;              // special Force for the mouse
boolean mouseAttract = false;
int gState = -1;

void getGesture() {
  PXCMGesture.Gesture gest = new PXCMGesture.Gesture();
  if(session.QueryGesture(PXCMGesture.GeoNode.LABEL_BODY_HAND_PRIMARY, gest)) {
    if(gest.active) {
      if(gest.label == PXCMGesture.Gesture.LABEL_NAV_SWIPE_LEFT)  gState = 0;
      if(gest.label == PXCMGesture.Gesture.LABEL_NAV_SWIPE_RIGHT) gState = 1;
      if(gest.label == PXCMGesture.Gesture.LABEL_NAV_SWIPE_UP)    gState = 2;
      if(gest.label == PXCMGesture.Gesture.LABEL_NAV_SWIPE_DOWN)  gState = 3;
      if(gest.label == PXCMGesture.Gesture.LABEL_HAND_CIRCLE)     gState = 4;
    }
  }
}

void drawGesture() {
  pushMatrix();
  translate(320, 0);
  // image(irImage, 0, 0);
  float rad = 10;
  pushStyle();
  noStroke();
  ellipseMode(RADIUS);
  switch(gState) {
    case 0: {   // swipe left: remove a force under the hand, else drive the cursor force
      // mass = 100.0;
      PVector mousePos = new PVector(mHandPos[0], mHandPos[1]);
      for(int i = forces.size()-1; i >= 0; i--){
        Force f = (Force) forces.get(i);
        if(circleIntercept(mousePos, f.pos, 5)){
          forces.remove(i);
        } else {
          mouseForce.pos.set(mHandPos[0], mHandPos[1], 0);
          mouseForce.forceOn = true;
        }
        // fill(255,0,0);
        // ellipse(320-rad,120,rad,rad);
        break;
      }
    }
    case 1: {
      // renderForces = !renderForces;
      break;
    }
    case 2: {   // swipe up: toggle clearing
      clear = !clear;
      break;
    }
    case 3: {   // swipe down: hand position drives the force
      mouseForce.pos.set(mHandPos[0], mHandPos[1], 0);
      mouseForce.forceOn = true;
      // fill(255,255,0);
      // ellipse(160,240-rad,rad,rad);
      break;
    }
    case 4: {   // circle gesture: release the force
      mouseForce.forceOn = false;
      break;
    }
  }
  fill(255);
  fill(0);
  popStyle();
  popMatrix();
}

void setup(){
  size(500, 500);
  session = new PXCUPipeline(this);
  if(!session.Init(PXCUPipeline.GESTURE)) {
    println("Failed to initialize the PXCUPipeline!!!");
    exit();
  }
}
// (the rest of the sketch is omitted in the original excerpt)
The previous example shows how we can replace mouse movements with gesture movements, and where the modifications go.
What we can expect are cool new ways of interacting with cross-platform Intel RealSense apps built with HTML 5 and JavaScript for different form factors. If a full-featured JavaScript library for the SDK is released, we can easily integrate it with the Intel XDK and make great apps for different platforms. The future of Intel RealSense technology is very promising, and there is a lot to explore.