Showing posts with label AIR. Show all posts

Saturday, December 10, 2011

FaceOSC and AIR working example

I got a couple of questions about how to make an AIR app listen for FaceOSC, as shown in these two previous posts, so I've put together a basic 'one class' working example in Flash Builder.



What do you need?

FaceOSC (mac only, already included in my download project in the faceOSC folder)
Osculator (mac only)
This AS3 Tuio library, which makes it easier to listen for the OSC messages from FaceOSC (also already included in the src folder)

Actually you don't need Osculator to make the FaceOSC-to-AIR connection itself work, but you do need it to open the FaceOSC-Osculator.oscd file (in the osculator folder) that comes with FaceOSC, so you can see all the OSC addresses that Kyle used for the facial gestures and expressions. Note that you cannot run the Osculator file and the AIR app simultaneously, as they use the same port.

I also exported the project as an AIR app, which you can find in the air folder. A certificate is included, the password is 1234. Basically, you just need to open up FaceOSC and publish the Flash Builder project, or run the AIR app. You can download the example project here.
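For reference, the skeleton of such a 'one class' listener could look roughly like this. This is a sketch, not the exact class from the download: the package paths are assumptions based on the org.tuio build of the AS3 Tuio library, so check the src folder for the real ones; the addresses shown are two of the ones FaceOSC sends (the .oscd file lists them all).

```actionscript
package
{
    import flash.display.Sprite;
    // Package paths below are assumptions; match them to the version
    // of the AS3 Tuio library bundled in the src folder.
    import org.tuio.connectors.UDPConnector;
    import org.tuio.osc.IOSCListener;
    import org.tuio.osc.OSCManager;
    import org.tuio.osc.OSCMessage;

    public class FaceOSCExample extends Sprite implements IOSCListener
    {
        private var oscManager:OSCManager;

        public function FaceOSCExample()
        {
            // FaceOSC sends on port 8338 by default
            oscManager = new OSCManager(new UDPConnector("0.0.0.0", 8338, true), null, false);
            oscManager.addMsgListener(this);
            oscManager.start();
        }

        // Required by IOSCListener; called for every incoming OSC message
        public function acceptOSCMessage(oscmsg:OSCMessage):void
        {
            switch (oscmsg.address)
            {
                case "/pose/orientation":
                    // oscmsg.arguments holds the head rotation values
                    break;
                case "/gesture/mouth/height":
                    // oscmsg.arguments[0] is how far the mouth is open
                    break;
            }
        }
    }
}
```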

Tuesday, September 27, 2011

Chanter: brainwaves, EEG, meditation and Flash

I've released a desktop app called Chanter. It's a tool for people who'd like to improve their meditation or mindfulness skills, which gives instant audio feedback about your state of meditation. In short, the app plays a mantra-like audio loop, and changes the pitch of the loop according to the meditation level of the user.

You need a NeuroSky headset, the MindWave or MindSet, to make this work. The headset registers brainwaves - EEG - which are sent to the computer wirelessly via a dongle that comes with the headset, plus a small software program that you need to install.

Flash
Seantheflexguy was the first developer who made me aware of this exciting hardware, and he also wrote an open source ActionScript 3.0 API, making it simple to hook up the headset to a Flash or AIR project. Actually I'm not using that API, as NeuroSky also provides some brief and clear instructions on making Flash listen for messages from the headset.
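Those instructions boil down to opening a plain socket to the ThinkGear Connector that the headset software installs, and reading the JSON it streams back. Here's a rough sketch; the port 13854 and the JSON config string are the defaults from NeuroSky's ThinkGear Socket Protocol documentation, so double-check them against your version.

```actionscript
import flash.events.Event;
import flash.events.ProgressEvent;
import flash.net.Socket;

var socket:Socket = new Socket();
socket.addEventListener(Event.CONNECT, onConnect);
socket.addEventListener(ProgressEvent.SOCKET_DATA, onData);
// 13854 is the ThinkGear Connector's default local port
socket.connect("127.0.0.1", 13854);

function onConnect(e:Event):void
{
    // ask the connector for JSON output, without the raw EEG samples
    socket.writeUTFBytes('{"enableRawOutput": false, "format": "Json"}');
    socket.flush();
}

function onData(e:ProgressEvent):void
{
    var packet:String = socket.readUTFBytes(socket.bytesAvailable);
    // each line is a JSON object; the eSense values arrive as e.g.
    // {"eSense":{"attention":53,"meditation":61},...}
    // parse the string and read out the meditation value here
}
```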

Brainwaves and level of meditation
The messages returned by the NeuroSky software are not only the delta, theta, alpha, beta and gamma bands of the EEG, but also two values that are based on these bands: concentration and meditation. Obviously, Chanter listens to the meditation value, although concentration and meditation are not opposites; the 'ideal' state of mind would be a value of 100 for both categories.



Why Chanter?
There are quite a few apps in the NeuroSky store, most of them simple games with an emphasis on mind exercise. The one that - in my opinion - stands out is another meditation tool called Dagaz. What struck me though is that all available apps are based on the idea of visual interaction; in other words, you need to look at your computer screen while experiencing the tool or playing the game. My next thought was: wouldn't it be nice to build a meditation tool based on audio only, so that you don't have to look at your screen when meditating - a tool that even allows you to close your eyes? This idea resulted in Chanter.

A promotional website can be found here, and the app is available in the NeuroSky webstore. Below are a couple of screenshots of the app.




Pitching the mantra loop

The code for changing the tone of the mantra mp3 that the user selects is derived from a class by (surprise, surprise) Andre Michelle. It's not real pitch shifting, which would mean maintaining the same tempo, but it suffices for the short loops that I'm using.
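The resampling idea behind this can be sketched as follows. To be clear, this is my own simplified illustration, not Andre Michelle's class: a real implementation extracts blocks of samples and interpolates between them instead of calling extract() once per sample, and the mapping of meditation value to rate is just an example.

```actionscript
import flash.events.SampleDataEvent;
import flash.media.Sound;
import flash.utils.ByteArray;

var mp3:Sound;           // the loaded mantra loop (assumed loaded elsewhere)
var position:Number = 0; // fractional read position, in samples
var rate:Number = 1.0;   // 1.0 = original pitch; e.g. map meditation 0-100 to 0.5-1.5

var output:Sound = new Sound();
output.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
output.play();

function onSampleData(e:SampleDataEvent):void
{
    var source:ByteArray = new ByteArray();
    for (var i:int = 0; i < 8192; i++)
    {
        // read one stereo sample at the (floored) current position
        source.length = 0;
        mp3.extract(source, 1, int(position));
        source.position = 0;
        e.data.writeFloat(source.readFloat()); // left channel
        e.data.writeFloat(source.readFloat()); // right channel

        // stepping faster or slower than 1.0 raises or lowers the pitch,
        // and changes the tempo along with it
        position += rate;

        // Sound.length is in milliseconds; at 44.1 kHz that's length * 44.1 samples
        if (position >= mp3.length * 44.1) position = 0; // loop the mantra
    }
}
```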

Monday, September 19, 2011

Installing Adobe AIR 3 SDK in Flash Builder 4.5.1 on Mac OS

To install and use a new AIR SDK, you need to overlay it on an existing Flex SDK. This is a bit trickier on a Mac than on Windows, because - as far as I know - the Finder still replaces the entire contents of a folder when you copy over it, instead of merging the two.

The following steps illustrate how to install the AIR SDK using the terminal.



UPDATE April 4, 2012
I haven't done it myself yet, but the steps below also seem to work for overlaying the AIR 3.2 SDK on top of the Flex 4.6 SDK.

UPDATE October 5, 2011
I wrote this post during the pre-release period of Flash Player 11 and AIR 3. Now that they are both officially released, I have changed the links to the official release URLs, but I haven't updated all the screenshots!

1.
Close Flash Builder or Eclipse.

2.
Look up your current SDK. I was using 4.5.1, which on my MacBook is located here:

/Applications/Adobe Flash Builder 4.5/sdks/


3.
Copy the entire current SDK folder and rename the copy; I named it AIR3SDK.

4.
Download the AIR 3 SDK here, and save this file inside your newly created AIR3SDK folder.

5.
Open the terminal (Applications>Utilities>Terminal.app), and cd to the root of the AIR3SDK folder.

You can also just type cd, add a space, drag the SDK folder into your terminal window
username$ cd /Applications/Adobe\ Flash\ Builder\ 4.5/sdks/AIR3SDK
and hit enter.

6.
Now that you're in the root of the new Flex SDK folder, type or copy/paste

tar jxvf AdobeAIRSDK.tbz2
or the name of the file you downloaded, if it's a different version,

and hit enter. This extracts the archive you've downloaded, automatically overwriting any duplicate files. Here's a (NOT UPDATED) screenshot of my terminal for steps 5 and 6:



7.
In the root of the AIR3SDK folder, open flex-sdk-description.xml and change the content of the name node from Flex 4.5.1 to AIR3SDK.



8.
Still inside the AIR3SDK folder, go to

/frameworks/libs/player/

and create a folder called 11.0.
Now download PlayerGlobal (.swc) here, save it in the 11.0 folder and rename it to playerglobal.swc.
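Condensed into terminal commands, steps 3 to 8 look roughly like this. The paths assume the default Flash Builder 4.5 install location and the file names from the steps above; adjust them to match what you actually downloaded.

```shell
cd "/Applications/Adobe Flash Builder 4.5/sdks"

# step 3: duplicate the current SDK folder
cp -R 4.5.1 AIR3SDK

# steps 4-6: after saving the AIR SDK archive inside the copy, unpack it,
# overwriting any duplicate files
cd AIR3SDK
tar jxvf AdobeAIRSDK.tbz2

# step 7 (editing flex-sdk-description.xml) is done in a text editor

# step 8: make room for the Flash Player 11 playerglobal.swc, then save
# the downloaded PlayerGlobal (.swc) there as playerglobal.swc
mkdir -p frameworks/libs/player/11.0
```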

9.
Open Flash Builder or Eclipse and start a new project. In the opening popup, click on Configure Flex SDK's, click on Add, click Browse to browse to the AIR3SDK folder, and open it. Now you can set this SDK version as default SDK or select it in the dropdown list when you're back in the opening popup.



AIR 3 has some cool new features such as Captive Runtime, which bundles the AIR runtime with your app, so that the user doesn't have to download it separately; something you could already do with AIR for iOS, but which is new for Android and desktop.

Another new feature is extending AIR with Native Extensions, meaning that you can integrate native code from the target device into your app; one of the possible results is a major performance boost.

Monday, July 25, 2011

FaceWriter, typing text with your face - FaceOSC & Flash (Air)

Here's another video that shows an Adobe AIR app I've been working on. It listens for the data from FaceOSC, a great tool by Kyle McDonald; read more about it in my previous post.

This AIR app enables you to type text by moving your head and using some facial gestures.
It's quite basic for now, allowing you to add text, add spaces, do a backspace and clear the entire textfield. Functionality could be added of course, such as saving the final text to a local file or e-mailing it.



The hardest thing is to keep it all stable and precise, as you tend to change the orientation of your head quite easily. This would require a more advanced way of calibrating, and some functionality that lets you re-calibrate once in a while.
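One possible approach to re-calibrating, sketched in AS3. This is a hypothetical illustration, not the code used in the video; the dead-zone value of 0.05 is an arbitrary example.

```actionscript
// Store the current head orientation as the neutral pose whenever the
// user triggers a calibration, and measure all movement relative to it.
var neutralPitch:Number = 0;
var neutralRoll:Number = 0;

function calibrate(pitch:Number, roll:Number):void
{
    neutralPitch = pitch;
    neutralRoll = roll;
}

function onOrientation(pitch:Number, roll:Number):void
{
    var relPitch:Number = pitch - neutralPitch;
    var relRoll:Number = roll - neutralRoll;

    // ignore tiny movements around the neutral pose (dead zone),
    // so small involuntary head drift doesn't trigger anything
    if (Math.abs(relPitch) < 0.05) relPitch = 0;
    if (Math.abs(relRoll) < 0.05) relRoll = 0;

    // use relPitch / relRoll to drive the typing cursor here
}
```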

Anyway, I'm not sure if this could be beneficial to anyone - I was thinking of people with spinal cord injury, but I'm far from familiar with that area. If you have any ideas, please drop me a line.

UPDATE 11-12-2011: I've put together a basic working example of an AIR project that listens for FaceOSC messages. You can find it here.

Monday, July 11, 2011

Face gestures, FaceOSC and Flash

The coolest thing I saw last week was a video by Kyle McDonald, who released a brilliant tool called FaceOSC, which tracks facial gestures. FaceOSC is based on FaceTracker by Jason Saragih.

With the help of this open source AS3 Tuio library I made an AIR project listen for the OSC messages sent by FaceOSC. OSC stands for Open Sound Control, although this example has nothing to do with sound.



The code to receive the OSC messages is quite simple. First, create a new OSCManager, add a listener, and start the manager. I found out that I had to use port 8338 by opening the Osculator file that comes with FaceOSC.
// listen on port 8338, the port FaceOSC sends to
var oscManager:OSCManager = new OSCManager(new UDPConnector("0.0.0.0", 8338, true), null, false);
oscManager.addMsgListener(this);
oscManager.start();

Then, in a function called acceptOSCMessage, check what kinds of OSC packets are coming in by checking the message address:
public function acceptOSCMessage(oscmsg:OSCMessage):void
{
    // To see the different addresses (i.e. references to the gesture values):
    // trace("address: " + oscmsg.address);

    if (oscmsg.address == "/pose/orientation")
    {
        //roll = oscmsg.arguments[2] * rollFactor;
        //speed = oscmsg.arguments[0] * speedFactor;
        //your code here
    }
}

To make this last function work, your class needs to implement the IOSCListener interface.

The 3D scene is from the book Papervision3D Essentials.

UPDATE 11-12-2011: I've put together a basic working example, which you can find here.

Thursday, February 25, 2010

Remotely controlling a desktop AIR app with an Android phone

AIR 2.0 has support for UDP sockets, which allows you to send and receive messages using the User Datagram Protocol. This means that you can communicate with other applications without the need for an extra server.

I used RemoteDroid for this demo, which is an Android app that turns your Android phone into a remote control, using your own wireless network. Out of the box, the app gives remote access to native applications on your computer and simulates the mouse - moving it and pressing the left and right buttons - but you also need to download a server application on your computer to make that work.

However, since AIR 2.0 supports UDP, you don't need an extra server to make your Android phone communicate with an AIR app. In this video, I made the AIR app listen for UDP messages sent by the Android app, to control a paperplane in a Papervision3D scene. So basically, you can turn your Android phone into a game controller.
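The AIR side of such a setup can be sketched with the DatagramSocket class introduced in AIR 2.0. The port number here is a placeholder, not the one RemoteDroid actually uses; bind to whatever port the phone app sends to.

```actionscript
import flash.events.DatagramSocketDataEvent;
import flash.net.DatagramSocket;
import flash.utils.ByteArray;

var socket:DatagramSocket = new DatagramSocket();
socket.addEventListener(DatagramSocketDataEvent.DATA, onData);
socket.bind(51010);  // example port; match the port the phone sends to
socket.receive();    // start listening for incoming packets

function onData(e:DatagramSocketDataEvent):void
{
    // e.data is a ByteArray holding the packet sent by the phone;
    // how you parse it depends on the app's message format, e.g.
    // touch coordinates, button presses and accelerometer values
    var bytes:ByteArray = e.data;
}
```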

RemoteDroid has just been open sourced, which gave me a chance to take a look at the source code of the Android app. I added the accelerometer data of the phone to it (locally, I didn't contact the owner of the project yet), and in the video you can see how I:
- control the speed of the paperplane by pitching the phone
- steer the paperplane by rolling the phone
- zoom in and out by pressing the left 'button' on the phone's touch screen
- change the camera mode to a random perspective by pressing the right 'button' on the touch screen
- change the camera perspective to the Third Person, First Person and default perspective by pressing the 1, 2 and 3 keyboard keys respectively



The Papervision3D scene in the video stems from chapter 6 of the Papervision3D Essentials book.

Of course, AIR isn't the only environment that supports UDP. For instance, I also managed to control a Unity file this way, so another video will follow soon.