We have finished the basic settings part of our app, which allows users to: start or stop the customized unlock service, choose the type of pattern they want to create, draw a grid pattern, confirm it, save it, and choose the app they want to open.
The UI of the first page has been improved.
Users need to confirm the pattern twice before it is saved.
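The double-confirmation step boils down to an exact comparison between the two drawings. A minimal sketch, assuming the pattern is stored as the ordered list of grid-cell indices the user touched (the class and method names here are hypothetical, not taken from our codebase):

```java
import java.util.Arrays;
import java.util.List;

public class PatternConfirm {
    // Returns true only when the second drawing exactly matches the first,
    // in the same order; only then is the pattern saved.
    static boolean matches(List<Integer> first, List<Integer> second) {
        return first.equals(second);
    }

    public static void main(String[] args) {
        List<Integer> draw1 = Arrays.asList(0, 1, 2, 5, 8);
        List<Integer> draw2 = Arrays.asList(0, 1, 2, 5, 8);
        List<Integer> wrong = Arrays.asList(0, 3, 6, 7, 8);
        System.out.println(matches(draw1, draw2)); // true  -> save the pattern
        System.out.println(matches(draw1, wrong)); // false -> ask the user to redraw
    }
}
```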
Only apps other than system images are listed, and icons have been added to the list.
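Filtering out system images comes down to checking the system flag on each installed application. The sketch below models this without Android dependencies: `AppEntry` is a hypothetical stand-in for `android.content.pm.ApplicationInfo`, and `FLAG_SYSTEM` mirrors the value of `ApplicationInfo.FLAG_SYSTEM` (0x1); on a device the same loop would run over `PackageManager.getInstalledApplications(0)`.

```java
import java.util.ArrayList;
import java.util.List;

public class AppFilter {
    static final int FLAG_SYSTEM = 0x1; // same value as ApplicationInfo.FLAG_SYSTEM

    static class AppEntry {
        final String packageName;
        final int flags;
        AppEntry(String packageName, int flags) {
            this.packageName = packageName;
            this.flags = flags;
        }
    }

    // Keep only user-installed apps, i.e. those without the system flag set.
    static List<AppEntry> userApps(List<AppEntry> all) {
        List<AppEntry> out = new ArrayList<>();
        for (AppEntry a : all) {
            if ((a.flags & FLAG_SYSTEM) == 0) {
                out.add(a);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<AppEntry> all = new ArrayList<>();
        all.add(new AppEntry("com.example.game", 0));
        all.add(new AppEntry("com.android.settings", FLAG_SYSTEM));
        for (AppEntry a : userApps(all)) {
            System.out.println(a.packageName); // only the user-installed app remains
        }
    }
}
```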
The whiteboard has been developed to provide input for the analyzer and the learning process.
The analyzer interface has been developed to accept this input and return the list of apps to display.
Add another activity that lets users view and modify the created pattern-action pairs.
Improve the UI to make it more attractive.
Design an icon for our app.
Start implementing the machine learning algorithm for pattern recognition.
For the main user interface, design several modes that can be switched by gestures, such as shaking detected by GravitiDetector.
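One common way to detect a shake, which a detector like GravitiDetector could use, is to flag any accelerometer reading whose magnitude deviates far enough from gravity (about 9.81 m/s^2). The sketch below is framework-free; the 4.0 m/s^2 margin is an assumed placeholder, and on Android the inputs would come from `SensorEvent.values`:

```java
public class ShakeDetector {
    static final double GRAVITY = 9.81;      // m/s^2
    static final double SHAKE_MARGIN = 4.0;  // assumed tunable margin, m/s^2

    // x, y, z are raw accelerometer readings in m/s^2.
    static boolean isShake(double x, double y, double z) {
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        // At rest the magnitude is close to gravity; a shake pushes it
        // well above or below that baseline.
        return Math.abs(magnitude - GRAVITY) > SHAKE_MARGIN;
    }

    public static void main(String[] args) {
        System.out.println(isShake(0.1, 9.8, 0.2));   // resting phone -> false
        System.out.println(isShake(12.0, 15.0, 3.0)); // vigorous shake -> true
    }
}
```

A production detector would also require several such spikes within a short window, to avoid switching modes on a single bump.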
In this iteration, we will try to implement the voice unlock function. We can accomplish this in Matlab, but porting it to Java may take some time. Besides this, we will also try other features such as image unlock, e.g., drawing an "A" to open a specific app; one of our members is currently working on this.
In this iteration, we will focus on debugging the app. We also need to tune parameters such as the voice unlock threshold: if the threshold is set too high, users may often fail to pass the test; if it is too low, the app is a failure, as it is no longer a "lock".
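The threshold trade-off above can be made concrete with a toy decision function. Here `score` is an assumed similarity between the spoken sample and the enrolled voice template (higher means more similar), and all numeric values are illustrative placeholders, not measured results:

```java
public class VoiceUnlock {
    // Unlock when the similarity score reaches the tunable threshold.
    static boolean unlock(double score, double threshold) {
        return score >= threshold;
    }

    public static void main(String[] args) {
        double ownerScore = 0.82;    // genuine user; real speech never matches perfectly
        double impostorScore = 0.55; // someone else's voice

        // Too high: secure, but the owner is rejected (false reject).
        System.out.println(unlock(ownerScore, 0.90));    // false
        // Too low: the owner passes, but so does the impostor (false accept).
        System.out.println(unlock(impostorScore, 0.50)); // true
        // A middle value separates the two in this toy example.
        System.out.println(unlock(ownerScore, 0.70));    // true
        System.out.println(unlock(impostorScore, 0.70)); // false
    }
}
```

Tuning amounts to picking the threshold that best balances the false-reject rate against the false-accept rate on recorded test samples.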