I’ve always had a soft spot for Flashsticks since buying some of their French language learning Post-it notes: the order came with a postcard that had clearly been passed around their office, personalised and hand signed. A lovely touch.
Flashsticks, if you’re unaware, provides printed Post-it notes for language learners. Each note carries the keyword, a phonetic pronunciation guide, the English translation, the part of speech and a little picture. They’re designed to be stuck around your surroundings to aid memory – see them enough times and the words stick.
I’ve stuck them around the house next to objects they are linked to – here’s the kids’ school calendar:
One of their key USPs is the blending of digital resources into these physical assets. Using the Flashsticks app you can scan a Post-it note and be shown a short video of someone modelling the pronunciation. This video from their team shows it in action, albeit for the English version.
This is a solid idea: the app augments the physical objects in a beneficial way by adding something they cannot do on their own – have a voice. And the physical object is arguably superior to a digital version in that you can stick the notes around the house, which means you see the words every day and they are reinforced.
However, I do find it a great shame that they went to all this trouble and limited the video to a few seconds of someone speaking the word. If you’re going to the trouble of making the videos, and the user is going to the trouble of scanning them, why not offer something slightly more in depth? An example sentence showing the word in context, with natural pronunciation, would be an obvious extension.
Although I’m a big fan of the stickers (my wife less so – she wants them off the walls, but then she already knows French), I felt a little disappointed by the augmented video content and haven’t found myself scanning as intended.
Last week, though, I was pushed an email saying there was a new version of the app out with a brand-new feature – the ability to take a photo of anything, have the app scan the image and then return what it is in your target language.
This works with brilliant results as can be seen in this series of images…
The final one was particularly impressive. It’s hard to see under the overlay, but I deliberately took a photo with a lot of noise: cables, pens, plugs, a stapler. And it picked out my headset – not a regular headset, but a folding pair.
The technology behind this is quite astonishing. The scanning is a little slow, taking a good thirty seconds to return a result, but it has a real wow factor. People in the office were suitably impressed.
Is it a gimmick or is it something that could have long-term use? As is often the case, the times when this would be most useful are when travelling abroad in situations where you would not have online access – getting the name of a vegetable in the supermarket, needing to describe an object to someone, or just wanting to know a word out of curiosity. But the scanning requires an internet connection, so in exactly those situations it’s useless.
This would be great fun with language students though. A teacher could set loose a group of students to go around the school environment taking photos of things they don’t know in their target language. The best results could be easily shared. I’d like to try this activity out in class.
The major barrier to this, though, is that after five scans you need to make an in-app purchase for more. This is OK in principle – they need to cover their costs – but any in-app purchase is hard in a class environment, whether students are using their own devices or class sets.
To see if this is more than just a gimmick, I would want to know the lifetime usage by doing some cohort analysis: how many users are still opening the app one week after first use, after two weeks, after four weeks, and so on?
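As a rough illustration of what I mean by cohort analysis, here is a minimal Python sketch. It assumes a hypothetical event log of (user, date-opened) pairs – the data and names are made up for the example, not anything Flashsticks actually exposes:

```python
from datetime import date

# Hypothetical app-open log: (user_id, date the app was opened).
events = [
    ("alice", date(2016, 3, 1)), ("alice", date(2016, 3, 8)),
    ("alice", date(2016, 3, 22)),
    ("bob", date(2016, 3, 2)), ("bob", date(2016, 3, 9)),
    ("carol", date(2016, 3, 3)),
]

# Each user's first open defines the start of their personal cohort clock.
first_open = {}
for user, day in events:
    if user not in first_open or day < first_open[user]:
        first_open[user] = day

def retention(events, first_open, week):
    """Fraction of all users who opened the app during the given
    week number after their own first open (week 0 = first week)."""
    active = {
        user for user, day in events
        if (day - first_open[user]).days // 7 == week
    }
    return len(active) / len(first_open)

for w in (1, 2, 4):
    print(f"week {w}: {retention(events, first_open, w):.2f}")
```

With the toy data above, two of three users come back in week one and none later – the kind of drop-off curve that would tell you whether the scanner is a one-time novelty or a habit.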
The Flashsticks Post-it notes can be bought from http://flashsticks.com/ where you can get the app too.