Hands-on with the Narrative Clip wearable lifelogging camera: disappointing

In early 2012 it seemed like everyone had jumped on the Kickstarter bandwagon. At TechCrunch Disrupt San Francisco 2012 I was introduced to a project called Memoto, a miniature camera small enough to clip onto your clothing that automatically takes photos of your day-to-day life – lifelogging.

At the time it seemed like a diamond in the rough (remember, this was before Google Glass was announced). It blew past its $50,000 goal, raising over $550,000 from backers. Fast forward to today: after a year of production delays and a company rebrand, the Narrative Clip has finally started shipping.


The design

The Narrative Clip takes the “wear” in wearables quite literally. Inside a plastic case about half the size of a business card is a 5MP camera, a GPS, an accelerometer, a magnetometer and 8GB of memory. It’s a good example of minimalist Scandinavian design and engineering, and it weighs less than a pack of gum.

A small pinhole opening on the front is the camera lens. A metallic clip at the back allows the device to grip onto most pieces of clothing – shirts, pockets, hats and anything with an edge.

Since there are no buttons on the device, a 4-light LED indicates battery life and a microUSB port (with a rubber cover) handles charging and syncing. The front surface is touch sensitive, so you can double tap to force it to take a photo and show battery life. In lieu of a power button, placing the Clip face down or in a totally dark place (like a bag) puts it in sleep mode.

Although you’re supposed to wear a wearable, the Clip can also stand vertically on any of its four sides, and the user guide actually suggests a neat secondary use: setting the camera up as a time-lapse tool to capture, say, the clouds outside a window.


The camera

The Clip automatically takes a photo every 30 seconds (the ability to change the interval is coming in a future firmware update). Needless to say, the camera in a lifelogging wearable is pretty important, but unfortunately the camera in the Clip leaves a lot to be desired.

The most significant issue is the field of view. At just 70 degrees, the photos capture roughly half of what the human eye sees (approximately 120 degrees). The camera on Google Glass, on the other hand, is fantastic with its wide-angle lens (there’s no official FOV, but I estimate around 100–120 degrees).


One redeeming factor of the camera is the ability to automatically correct the tilt of the photo. Due to the flexible design of the Clip, in a lot of scenarios it will be attached to clothing at an angle and so take photos at an angle. Of course, photos at an angle don’t make for stunning pictures.

Thanks to the sensors built into the device, each photo carries metadata which the Narrative cloud servers process to adjust the rotation of each photo so it is more consistently levelled with the horizon.
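Narrative hasn’t published how its tilt correction works, but the basic idea can be sketched from the accelerometer alone: while the wearer is roughly still, the accelerometer reading is dominated by gravity, so the roll angle off the horizon falls out of a single `atan2`. The function name and sample reading below are my own illustration, not Narrative’s code.

```python
import math

# Hypothetical sketch: estimate how far the Clip was tilted off the
# horizon from the gravity vector in a photo's accelerometer metadata.
def roll_from_gravity(ax: float, ay: float) -> float:
    """Roll angle in degrees; ax/ay are the accelerometer's X/Y axes
    (any consistent unit) while the device is roughly stationary."""
    return math.degrees(math.atan2(ax, ay))

# A Clip pinned 45 degrees clockwise reads near-equal gravity on both
# axes; rotating the photo by the negative of this angle levels it.
tilt = roll_from_gravity(0.707, 0.707)
correction = -tilt
```

The cloud can afford to do this per photo because the correction is just a rotation by `correction` degrees at processing time.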

The app

The Clip is actually pretty useless without two companion apps: an uploader on a computer and a viewer on the phone.


The uploader program for Windows and OS X allows the Clip to sync its photos to the computer, the cloud or a combination of the two.

The Narrative Cloud is free for all users in the first year ($9/month after that). Due to the number and size of the photos you would take day-to-day, the cloud is really the only reasonable option if you intend to keep all your captures. The Narrative Cloud is also required for the photo post-processing features like tilt correction, date grouping and location grouping.

Users with slow or limited upload bandwidth are not going to enjoy the fact that you could be uploading gigabytes of photos every few days.
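Some back-of-envelope arithmetic shows why. The per-photo size and hours of wear below are my own assumptions, not Narrative’s figures:

```python
# One photo every 30 seconds, ~12 waking hours of wear per day,
# assuming roughly 1 MB per 5MP JPEG.
photos_per_day = 12 * 60 * 60 // 30      # 1,440 photos
mb_per_day = photos_per_day * 1          # ~1.4 GB per day
gb_per_week = mb_per_day * 7 / 1024      # ~10 GB per week
```

Even at half that wear time, you’re still pushing gigabytes up every week.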


The mobile app, available only on iPhone and Android at the moment, provides the ability to view the post-processed and grouped photos stored in the cloud.

The cloud service tries its best to automatically select a number of photos in each group that it thinks are interesting and represent key “moments” of the day. Whilst this trims browsing hundreds of photos down to just a handful, the algorithm is a bit hit and miss.

Each moment (trimmed or in full) can also be played back like a time-lapse, but unfortunately there’s no way to export a video or animated GIF of this. You can share individual photos to Facebook, Instagram, Twitter or email.

The experience

Wearing a Narrative Clip is somewhere between a watch like the Pebble and a headset like Glass. It’s not as obvious as a display and camera on your head, but someone directly in front of you will definitely still notice a little black box around your neck.

I believe the arguments against wearables’ ability to discreetly take photos and video are undermined by the fact that video-recording glasses and pens are already readily available on the market, at much more affordable prices and with far greater discreetness. However, I do recognise that people behave differently when they are aware they are being recorded, which changes the dynamics of social gatherings.

For that reason I think the Narrative Clip is actually a worse offender than Google Glass, since the Clip is entirely passive. As long as it is worn, it is always recording at a set interval without any interaction or control, until it is put in a bag or placed face down. A device like Glass, on the other hand, is rarely “on” and only takes photos on command.

The funny thing is that I actually felt uncomfortable wearing the Clip myself.


If you’re willing to upload gigabytes of nondescript photos day after day with the anticipation that you might have captured an interesting Kodak moment while out and about, the Narrative Clip is the gadget to check out.

Wearable technologies have come a long way since 2012, and at $279 the Narrative Clip is disappointing. Its simple lifelogging functionality only scratches the surface of what I have come to expect of a device to be worn day-in, day-out.

Windows Phone Store: these aren’t the Facebook apps you’re looking for

I don’t know what the hell is going on over at the Windows Phone Store, but I believe the following screenshots encapsulate everything that is wrong with Microsoft’s app strategy and approval process for the Store.

Blatant artwork theft, copyright/trademark infringement and deceptive conduct by “developers” who spam the Store with website-wrapper apps. (A problem Microsoft has only itself to blame for, having submitted many unauthorised website-wrapper apps itself on behalf of popular brands.)

The following screenshots were taken on the 15th of March 2014 at 10pm, searching for “Facebook” on the Australian Windows Phone Store. The first result, with 4-and-a-half stars and over a thousand ratings, is not the official Facebook app. In fact, none of the results from the first to the 49th are the official Facebook app (except the second Messenger app, which is official but not the app we’re looking for).



The circled app is the official app. Very obvious, right?

The same ridiculous results are also displayed on the phone.


Thankfully the U.S. Windows Phone Store does not seem to exhibit the same ranking problem for the official app, though it’s still full of filth from result three onwards.

I can’t begin to imagine the experience of a customer who has just walked out of an Australian mobile phone shop with a new Windows Phone device and tries to download Facebook from the Store.

I suspect there’s some serious app review manipulation at play here, probably involving bots, artificially bumping malicious apps (and possibly demoting the official app).

With all the Microsoft management musical chairs that’s been happening over the past few months, I’m beginning to wonder if anyone is actually still in charge of the Windows Phone Store.

[tl;dr] Titanfall on Xbox One: impressions

I must have missed a memo – I don’t understand the hype behind Titanfall. My theory is a combination of: it’s a new franchise in the popular online shooter genre, it’s an exclusive “near-launch” title for Xbox One, it’s developed by a company made up of former Infinity Ward veterans and everybody loves giant mech robots. Did I miss anything else?

Nevertheless, it’s the game that everyone’s buzzing about, and so far it has scored pretty highly with critics (86 on Metacritic at the time of writing).

Here are some of my quick impressions. Caveat: I admit to being a reluctant console gamer (I still prefer PC games).

  • It’s pretty much a cross between Battlefield, Halo and Call of Duty. If you’ve played any of those (who hasn’t?), you’ll feel right at home.
  • Since the entire game is multiplayer-only (even the “campaign”), you’ll get used to waiting in a lobby for teammates or an opposing team.
  • Rounds are short and usually skewed towards one team, so you’ll either feel like a champion or get continuously beaten down. A neat consolation prize for the losing team is the race to an evacuation ship at the end of a round, which grants bonus points.
  • The controls, maps and characters are simple and straightforward. Wall-running and double jumps are interesting but don’t change the core mechanics of gameplay (lots of shooting and running).
  • The levelling system and unlocks are consistent with most online shooters. New weapons and titans keep things fresh without throwing the balance out the door.
  • The titans are fun while they last: enemy titans always seem invincible, while my titans always die easily. You’ll never get tired of watching a titan drop from the sky.
  • Although it runs pretty smoothly, the graphics are merely acceptable for a next-generation game. After all, it’s based on the Source engine – but then you don’t spend a lot of time standing still to enjoy the scenery.
  • The little story you do get from the “campaign”, via mission reports and voiceovers, is pretty dry and generic space-colony uprising fluff.
  • The network ping/latency for Australian players will be a bit high, but it’s not unplayable (unless you’re a competitive type). It’s rumoured Australian servers will come. EDIT: Australian servers are coming March 14.

In conclusion, if you have an itch for an online shooter on Xbox One, Titanfall is as good as any and you’ll have a lot of fun. Otherwise, I’m not sure it (or any other game, for that matter) could live up to its hype.

EDIT: The Titanfall Reviewer’s Guide supplied by EA is a great resource for understanding the game.

Sunsetting MetroTwit: all good things must come to an end

Cross-posting this from the MetroTwit blog since this project was a big part of my life since 2010. I’m proud to have built an app that so many people enjoyed using day-in day-out. It was an amazing learning experience about WPF, .NET desktop apps and shipping a consumer application (before the Windows Store).

I want to personally thank everyone in the developer communities and Microsoft developer evangelists who helped us get across many challenging technical hurdles and made it possible for us to deliver a great Twitter experience on Windows for as long as we could.

We are saddened to announce the end-of-life of the MetroTwit for Desktop and MetroTwit for Windows 8 apps effective immediately.

Due to the “access token limit” Twitter has imposed since August 2012, we are preemptively sunsetting MetroTwit; this technical limitation of Twitter’s API may prevent existing users from accessing the app after the limit has been crossed.

Effective immediately, we will be removing the MetroTwit for Desktop installer and MetroTwit for Windows 8 Store listing to ensure the app remains usable by all current users.

If you’re a current MetroTwit user, we apologise for the inconvenience, but don’t worry: the apps you love to use will continue to work. However, we will not be supporting the apps or releasing any major new features or updates.

We’re extremely proud to have worked on MetroTwit and want to thank the over 400,000 Twitter users who used MetroTwit over the past 4 years and have helped shape and support it.

A very special thanks to our MetroTwit Loop beta group who have been our exceedingly enthusiastic supporters and have let us know both the good and bad about MetroTwit since our first beta version.

None of us could have ever imagined that a humble Photoshop mockup would become something as popular and acclaimed as MetroTwit. Not without its challenges and struggles, we’re proud to have worked on this app and its many updates.

Once again, thank you all.

The MetroTwit team
David Golden, Winston Pang, Long Zheng

Making directions better on Google Glass and having fun with the Mirror API

Getting map directions is easily one of the best features and use-cases for Google Glass. Seeing turn-by-turn directions at the corner of your eye when you’re out and about is one of the simple pleasures of wearing a computer on your head.

Unfortunately the only way Google provides to start navigation is speech recognition, which fails more often than it works. Even though Glass’s speech recognition works well enough for simple queries like “Pizza Hut” or “62 King St”, it stumbles on more complicated place names and addresses (especially with an Australian accent). Of course, there’s also the problem of sounding like a crazy person yelling addresses on the street.

Needless to say, this problem has been frustrating me for weeks, and because I had so much fun developing my first Google Glass app, I knew I could solve this one too.

The solution had to involve typing, but you can’t type on Glass. So the next best thing was to type in a browser or on your phone, then send the address to Glass, much like Chrome to Phone. Thankfully, the Glass Mirror API allows you to send content with a geolocation latitude/longitude and a “NAVIGATE” action for this exact purpose.

So over the Valentine’s Day weekend, I decided there was no better way to spend a romantic evening than with the Mirror API, PHP, SQL Azure and the Google Maps API. After a few hours of trial and error, Map2Glass.com was born.


It’s a simple website that lets you log in with a Google Glass account and opens a map view with an autocomplete search box at the top. Google Maps’ v3 API makes this almost too easy. A “Send to Glass” button then takes the latitude and longitude of a pinned address (along with some other metadata), formats it into a Glass timeline card and sends it to the Mirror API. Once received on Glass, a simple tap begins navigation to the embedded location.
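For the curious, the timeline item that makes this work is quite small. The sketch below is in Python rather than the PHP I used, and only builds the JSON payload; the field names (a `location` with coordinates and a `menuItems` entry with the built-in `NAVIGATE` action) come from the Mirror API’s timeline item schema, while the function name and example coordinates are purely illustrative. The actual insert is an authorised POST of this payload to the Mirror API’s timeline endpoint.

```python
def build_navigation_card(place_name: str, lat: float, lng: float) -> dict:
    """Mirror API timeline item that offers a 'Get directions' action."""
    return {
        "text": place_name,                    # text shown on the Glass card
        "location": {                          # destination to navigate to
            "latitude": lat,
            "longitude": lng,
        },
        "menuItems": [
            {"action": "NAVIGATE"},            # built-in navigate action on tap
        ],
        "notification": {"level": "DEFAULT"},  # chime so the wearer notices
    }

# Example card for a pinned address (coordinates are illustrative)
card = build_navigation_card("Sydney Opera House", -33.8568, 151.2153)
```

The NAVIGATE menu item is what turns a plain card into the tap-to-navigate experience described above.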

I threw the code on Windows Azure Web Sites, bought a domain and started spreading it around. On a post in the Google+ community of Glass Explorers, I got a comment that was very fitting for Valentine’s Day and made it all worthwhile.


What this “phone-to-Glass” workflow has taught me is that even though I strongly believe wearable computing is the future, simple and precise tasks like typing can be perfectly complementary to the wearable experience.