As mentioned in my previous post, my next experiment was going to be connecting the iPhone to a Flash application. So here it is:
iPhone 2 Flash from nuthinking on Vimeo.
I managed to connect to the iPhone a couple of weeks ago, but I wanted to create a decent demo to better show the technical achievement. Multi-touch interactions still have a long way to go, and so far it's hard to see them being useful outside of editing contexts and, of course, multi-user experiences. In this experiment the user can only modify the size and orientation of a carousel and its scrollbar; nothing more, apart of course from being able to scroll the carousel. I'm definitely interested in seeing how multi-touch interactions will be applied to standard UI components, as I also showed in my previous experiments.
To create a bridge from the iPhone to a Flash application I needed an iPhone application that connects to a socket server, which the Flash application then receives data from. Having already tried out the efficiency of Multitouch Framework, I used its iPhone application and its framework for the Mac application; I then imported some classes from OpenFrameworks' addons to create the socket server (the same ones I used in one of my previous experiments), and that's it. I would like to share the most interesting parts of the process, though.
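To give an idea of what the Flash side of the bridge has to do, here is a minimal sketch in TypeScript. The actual wire format of Multitouch Framework isn't shown in this post, so the `"fingerId,x,y"` line format and the names below are my own assumptions, not the framework's API:

```typescript
// Hypothetical packet format: one newline-delimited line per touch event,
// "fingerId,x,y" with coordinates normalized to 0..1. This is an assumed
// format for illustration, not the real Multitouch Framework protocol.
interface TouchPacket {
  fingerId: number;
  x: number;
  y: number;
}

// Parse one line received from the socket; return null on malformed input
// so the client can simply skip bad packets.
function parseTouchPacket(line: string): TouchPacket | null {
  const parts = line.trim().split(",");
  if (parts.length !== 3) return null;
  const [fingerId, x, y] = parts.map(Number);
  if ([fingerId, x, y].some(Number.isNaN)) return null;
  return { fingerId, x, y };
}
```

In Flash itself the equivalent would sit behind a `flash.net.Socket` listener, with each parsed packet dispatched to the multi-touch controller.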
As you can see, once a finger starts to move, a delay starts growing. This is because on every finger movement the iPhone sends a packet of information to the application over the network, and these packets are sent so frequently that the network starts accumulating them. To solve a problem like this, you should avoid sending information about small movements. I implemented this in my Mac application, but unfortunately the problem seems to be at the root: the iPhone application. It appears to send all the information to the framework without filtering it; the framework, in fact, returns all the events in basically the same way you would get them in an iPhone application. It would be nice to have a configurable tolerance for small movements, so that you can optimize the communication as I did in my Mac app. The only annoying part is that you have to identify each finger yourself, which doesn't happen by default with the iPhone SDK, but it's very easy to do anyway.
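The tolerance filter I describe above can be sketched like this (the class and method names are my own, not taken from my Mac app's code): an event is only forwarded when it has moved far enough from the last position that was actually sent for that finger.

```typescript
// Sketch of a per-finger movement tolerance filter: move events whose
// distance from the last *forwarded* position is below the threshold are
// dropped, so the network isn't flooded with tiny movements.
class MoveFilter {
  private last = new Map<number, { x: number; y: number }>();

  constructor(private tolerance: number) {}

  // Returns true if this event should be sent over the network.
  shouldSend(fingerId: number, x: number, y: number): boolean {
    const prev = this.last.get(fingerId);
    if (prev) {
      const dx = x - prev.x;
      const dy = y - prev.y;
      // Below tolerance: drop the event and keep the old reference point.
      if (Math.sqrt(dx * dx + dy * dy) < this.tolerance) return false;
    }
    this.last.set(fingerId, { x, y });
    return true;
  }

  // Call when a finger lifts, so its id can be reused cleanly.
  release(fingerId: number): void {
    this.last.delete(fingerId);
  }
}
```

Note that this is exactly why finger identification matters: without a stable id per finger there is no "last position" to measure the tolerance against.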
Another thing I wanted to mention: although the idea of using Multitouch Framework was to test multi-touch applications quickly, using the iPhone merely as an input device, I had to find an even more portable way to test. In fact, the Mac I usually use to write ActionScript doesn't have Leopard, and thus isn't the one I use for Mac development (with Objective-C 2.0), so I wanted to be able to test without anything beyond the usual input a Flash application can receive. The solution was pretty easy and combined perfectly with the controllers I was writing: I emulated a finger press with a mouse double click, which creates a touch view that I can then drag, or remove with another double click. The application can easily switch between the socket multi-touch controller and the mouse-emulated one. Here is a capture of this modality:
Multitouch testing in Flash from nuthinking on Vimeo.
When I started testing the iPhone as an input device for network-connected applications, I instantly saw as obstacles the fact that the iPhone can only track 5 fingers and that its surface is very small compared to common multi-touch surfaces. At the moment my emulated fingers can be dragged only one at a time, but it should be pretty easy to group some of them and move them together; in fact, as shown in my previous experiments, some interactions might require it, so it could be pretty handy to add this feature.
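The grouping feature could look something like this (again a hedged sketch of an idea I haven't built yet, with hypothetical names): dragging any member of a group applies the same offset to every touch in it, so multi-finger gestures like a two-finger pan can be emulated with a single mouse.

```typescript
// Sketch of grouped emulated fingers: moving one member of the group
// moves all the others by the same delta.
interface Point {
  x: number;
  y: number;
}

class TouchGroup {
  private touches = new Map<number, Point>();

  add(id: number, p: Point): void {
    this.touches.set(id, { ...p });
  }

  // Drag member `id` to (x, y); every other member follows by the same
  // offset, preserving the group's shape.
  dragMember(id: number, x: number, y: number): void {
    const lead = this.touches.get(id);
    if (!lead) return;
    const dx = x - lead.x;
    const dy = y - lead.y;
    for (const p of this.touches.values()) {
      p.x += dx;
      p.y += dy;
    }
  }

  get(id: number): Point | undefined {
    return this.touches.get(id);
  }
}
```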
What's next? Multitouch Framework is evolving quite rapidly, so I can't rule out experimenting more with it soon, but at the moment I'm trying to stay focused on a tilt-based iPhone game I've started designing and developing.
Stay tuned then!