Inspired by this detailed article by Carl D. Worth I began experimenting with stroke recognition on the iPhone. Unfortunately, the sources for xstroke are hard to find nowadays and unmaintained. I did eventually track them down, but I did not want to port all that X11 code, so I decided to start from scratch with a small feasibility study, which I want to show you here.
I created the project “KrikelKrakel” (German for scribbling) on BitBucket.
The most interesting class to look at is KrikelKrakelView. It inherits from UIView and does all the tracking and recognition. Gestures are recognized when the touches end. The area where the touches took place is divided into a 3×3 grid of nine cells, and the path the finger took is then described as a sequence of cell ids. Have a look at the article mentioned earlier for the details.
One can register as a delegate to get called on different occasions:
- (void) willDrawGesture;
This is called from inside the touchesBegan:withEvent: method.
- (void) didLearnNewGesture:(NSString*)text;
When a gesture has not been recognized, the user is presented with an alert box where they can enter a letter or some longer text; the delegate is then notified with the newly learned text.
- (void) didRecognizeGesture:(NSString*)text;
When a gesture has been recognized, this method is called and the stored letter/text is delivered in the text parameter.
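Adopting the delegate could look roughly like this. This is only a sketch: the protocol name KrikelKrakelViewDelegate and the exact adoption syntax are my assumptions, so check KrikelKrakelView.h for the actual declarations:

```objc
// Sketch of a view controller acting as the gesture delegate.
// The protocol name is assumed; the three methods are the ones
// listed above.
@interface MyViewController : UIViewController <KrikelKrakelViewDelegate>
@end

@implementation MyViewController

- (void)willDrawGesture {
    // The finger just went down, e.g. clear a status label here.
}

- (void)didLearnNewGesture:(NSString *)text {
    NSLog(@"learned new gesture for: %@", text);
}

- (void)didRecognizeGesture:(NSString *)text {
    NSLog(@"recognized: %@", text);
}

@end
```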
The learned gestures are stored in the application's Documents directory under the name “strokes.dict”. If that file does not exist on first start, the bundled strokes.dict is used as the initial version.
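This first-start behaviour amounts to the usual copy-the-bundled-file pattern; a sketch of how it could be done in Objective-C (the actual loading code in the project may well differ):

```objc
// Sketch: use strokes.dict from Documents if present, otherwise
// seed it with the copy shipped in the app bundle.
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                      NSUserDomainMask, YES)
                  objectAtIndex:0];
NSString *path = [docs stringByAppendingPathComponent:@"strokes.dict"];

if (![[NSFileManager defaultManager] fileExistsAtPath:path]) {
    NSString *bundled = [[NSBundle mainBundle] pathForResource:@"strokes"
                                                        ofType:@"dict"];
    [[NSFileManager defaultManager] copyItemAtPath:bundled
                                            toPath:path
                                             error:NULL];
}

NSDictionary *strokes = [NSDictionary dictionaryWithContentsOfFile:path];
```

Writing the learned gestures back to the same Documents path keeps the bundled file untouched while letting the dictionary grow per user.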
See a demo video here.