1. On MyFitnessPal

    About a year ago I hurt my back, and it’s taken me this long to get back to being active.  I did 16 weeks of intensive physical therapy and have kept up with my exercises, but I still gained weight.  

    A little over 2 weeks ago I decided to lose that weight, and to use technology to do so.  After calling on my network for suggestions, I settled on buying a Garmin Vivofit and using MyFitnessPal to track my calories in.

    While I would really like the Vivofit to sync with MFP automatically, I’ve been able to manually add my steps as a “Steps Adjustment” cardio workout every night.  My strategy is to use MFP to track calories in and try to hit the goal it sets for me, keeping my walking calories as a safety net: if I go over my goal, my negative walking calories can still put me back under.  Now that I’m going back to the gym I’m stationary biking at least twice a week, and the Vivofit has really kept me motivated not to be sedentary.
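    The safety-net arithmetic is simple enough to sketch.  All the numbers below are hypothetical examples, not my actual MFP settings:

```python
# Hypothetical numbers; MFP's goal and the Vivofit's burn figures vary per person.
daily_goal = 1800        # the daily calorie goal MFP sets
calories_eaten = 1950    # logged food for the day: 150 over goal
walking_calories = 320   # calories burned, from the Vivofit step count

# Subtracting the walking "safety net" puts the day back under goal.
net_calories = calories_eaten - walking_calories
print(net_calories)               # 1630
print(daily_goal - net_calories)  # 170 calories to spare
```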

    The best part about all of this is that it’s working!  After 2 weeks I’ve lost the predicted 2 pounds, and that progress is easily visualized in MFP’s progress window.  What I don’t see is any kind of visualization of my nutrition, so I decided to make my own.  

    It took me about 30 seconds to find the My Fitness Pal Data Downloader Chrome extension and use it to extract my MFP data as a csv.  After that I imported the data into a Google Doc, converted it to percentages of my daily allowance, and created this graph:
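    The conversion step is easy to sketch.  Here’s a rough version with assumed column names and daily allowances (the real headers in the extension’s export, and the allowances MFP uses, may differ):

```python
import csv
import io

# Assumed daily allowances; the values MFP uses for your profile may differ.
ALLOWANCES = {"Sodium": 2300, "Sugar": 90, "Fat": 65}

# A stand-in for the extension's CSV export, with assumed headers.
SAMPLE = """Date,Sodium,Sugar,Fat
2014-06-01,2530,45,70
2014-06-02,1840,30,55
"""

def to_percentages(reader):
    """Convert each day's nutrient totals to percent of the daily allowance."""
    return [
        {k: round(100 * float(row[k]) / ALLOWANCES[k]) for k in ALLOWANCES}
        for row in reader
    ]

rows = to_percentages(csv.DictReader(io.StringIO(SAMPLE)))
print(rows[0])  # {'Sodium': 110, 'Sugar': 50, 'Fat': 108}
```

    Anything over 100 is a day I went over my allowance, which is exactly what the graph makes visible at a glance.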


    It’s great!  At a glance I can tell a few things about my nutrition: I eat very little sugar and trans fat, my sodium is often high, I occasionally go over on fat and cholesterol, and pretty much everything else is in line.

    MyFitnessPal, I’d love to see you implement something like this, or at least make it easier for us to extract our data!  

    PS: I’m a freelance software engineer and could make this happen for you.


  2. vrAse Week One

    One week in and I’m still loving my vrAse.  I’ve been really putting it through its paces trying to determine the viability of the tech and the ecosystem and so far I’ve been pretty impressed.  


    (Photo credit: Javi Cruz)


    Essentially, the vrAse is a box with lenses in it that you put your phone in and strap to your face.  The device itself is well conceived, with no extraneous bits or things to screw up.  Putting it on and taking it off is exactly as easy as it would be with a pair of ski goggles.  I’ve been using mine mostly with my Bluetooth headset and haven’t had any issues with the strap.  


    The only issue I’ve come across so far is that with extended wear it becomes uncomfortably heavy on my nose.  There doesn’t seem to be any padding in the nose area, which is the source of this problem.  There’s a place for an extra strap to be connected that goes over the wearer’s head (like a mining lamp), and I think getting a strap in there may help alleviate this issue.  Other than my aching nose, I’ve found it to be a pleasant experience; the device connects to my face firmly and doesn’t wobble or move around much once it’s there.


    Head Tracking

    In a word: excellent.  I have been nothing but impressed by the quality and consistency of the head-tracking in pretty much all of the apps I’ve tried so far.  It seems that using a camera in tandem with a gyro and accelerometer is a winning combination for low-power, low-latency tracking on the cheap (not to mention it leaves the door open for awesome augmented reality experiences).  


    Graphics

    This is probably the weakest point of the technology right now, and it’s not even much of an issue.  The 3D graphics currently available on Android devices are somewhere around the quality expected of a PlayStation 2.  While this is not spectacular, and people tend to react to it, it’s not a showstopper.  Plenty of PS2 games were immersive and fun to play, and with the smoothness of the head-tracking I mentioned earlier, it’s easy to forgive a little jagginess and really get into the environment you’re simulating.  


    Content

    There are already a handful of split-screen VR Android apps available (sometimes tagged SBS, for side-by-side).  Considering that these devices haven’t even really been released to the mainstream yet, I’m pretty impressed with the quality.  I’ll be following up this post with another just about content, so hang onto your eyeballs.

    Development

    Durovis Dive seems to be the dominant force in the developer community; they have a Unity SDK with a head-tracking plugin that most of the Android apps right now seem to be using.  I downloaded the SDK but haven’t really dug in yet (more to come on this later).  vrAse has their own SDK that I’m assessing before I start in on my first VR development, but so far I haven’t seen any buy-in on it.

    The Future

    I can’t stress enough that this tech could really be The Future.  The intersection of massive availability and constant competitive development makes mobile handsets an appealing platform.  There’s something truly wonderful about taking your phone out of your pocket, sliding it into the goggles, and slapping it on your face.  That this tech is within reach of so much of the world’s population seals it.



  3. This is the future.  The incredible, silly-looking, mind-blowing, life-changing future.  It’s called vrAse, and I am lucky enough to have received a developer unit, so I’ll be demoing it at meetups and beginning to develop some cool apps for it on Android.  Check back here (and @eric_neuman) for updates as usual.

    The future’s so bright, I gotta wear VR-goggles.



  5. Quirky Issues

    I just submitted my second idea to the invention service Quirky and it was not without difficulty.  I’d like to call attention to a few UX issues the site has that could easily be corrected to improve the experience.  

    Image Upload

    The image upload process at Quirky is pretty bad, but it really doesn’t have to be.  They seem to be using dropzone.js or something similar to handle the upload interface, and it works great.  You can drag and drop your files, and you immediately get a thumbnail in the browser.  Unfortunately, when you hit the submit button, your idea page is filled with broken images.  


    When this happened to me, I immediately deleted my submission because I didn’t want my idea to look broken.  Broken images are never good, but Quirky’s flow actually makes them worse.  The main ‘Invent’ page of Quirky shows the most recent ideas, but ideas come in quickly, so yours will only be there for a few minutes.  Since this is seemingly the only time your submission will be visible to other casual users, it is an extremely important time to shine.  If my idea has broken images, it loses credibility at that incredibly important moment.  

    Not cool, guys.  

    After contacting support and being abandoned for a few days, I tried again with mock data and images.  Eventually I discovered that a few minutes after submission, the images I uploaded suddenly appear, fully functional and unbroken.  This means Quirky is doing some asynchronous processing of the images between upload and availability, but has not taken the time to show viewers a ‘Processing…’ message.  It’s like the engineers are pretending that the processing is instantaneous and hoping no one will notice, when in fact it takes several minutes.  

    It’s an easy problem to fix, too.  All Quirky’s engineers need to do is write a quick conditional that checks whether the image is done processing; if it’s not, show an image containing a processing indicator.  That’s it.  As a stopgap measure, they could at least publish somewhere that images may take a few minutes to show up.  
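    Here’s a sketch of that conditional, using a hypothetical image record with a status field set by the async job (these names are illustrative, not Quirky’s actual schema):

```python
# Hypothetical image record; the "status" field would be set by the
# asynchronous processing job when it finishes.
PLACEHOLDER_URL = "/static/processing-indicator.png"

def image_url(image):
    """Serve a 'Processing...' placeholder until the async job finishes."""
    if image["status"] != "processed":
        return PLACEHOLDER_URL
    return image["url"]

print(image_url({"status": "pending", "url": None}))
# /static/processing-indicator.png
print(image_url({"status": "processed", "url": "/uploads/idea-42.jpg"}))
# /uploads/idea-42.jpg
```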

    Social Sharing


    When you share an idea link on Facebook or Twitter, this boring Quirky symbol comes up instead of the actual image uploaded with the idea.  This makes no sense whatsoever.  The image is gray, neutral, and always the same: literally everything you want to avoid if you’re trying to engage social viewers.  Additionally, it confuses my friends to see me posting about my idea with a totally unrelated picture of a lightbulb attached.  Quirky wants its users to bring outsiders in to vote, so why have they done this?  I really don’t know.  They appear to have implemented the og:image tag, but it’s not working.  


    These two issues, while seemingly minor, have made my Quirky experience less than stellar, and given how little work it would take, it’d be nice to see them fixed up.  


  6. My girlfriend hates Valentine’s Day, but I like it, and we’re both tremendous nerds, so I started a tradition: every day leading up to Feb 14 I created a custom meme of one of our favorite properties.  I just started working on this year’s, so I decided to share the load from 2013.  



    Yes, they are supposed to be terrible.


  7. Python-Carteblanche

    I finally entered the world of open-source software!  Last week I published a tiny package called Carteblanche to integrate and simplify the process of generating conditional-permission-based menus.  

    This project is largely an experimental platform for me to take aim at some of the more ideological aspects of REST.  Specifically, I find that verbs are a very natural way for engineers to build things the way their users think about them.  Much more to come on this when I publish the 0.5.0 release next week.  
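    To make the idea concrete, here’s a toy illustration of verb-based, permission-aware menus.  This is deliberately not Carteblanche’s actual API (see the repo for that), just the general shape of the concept:

```python
# A toy sketch of permission-aware "verbs"; all names here are illustrative.
class Verb:
    def __init__(self, name, url, is_available):
        self.name = name                  # the action, e.g. "edit"
        self.url = url                    # where the action lives
        self.is_available = is_available  # permission check: user -> bool

def menu(verbs, user):
    """Build a menu containing only the verbs this user may perform."""
    return [v.name for v in verbs if v.is_available(user)]

verbs = [
    Verb("view", "/posts/1", lambda u: True),
    Verb("edit", "/posts/1/edit", lambda u: u == "author"),
    Verb("delete", "/posts/1/delete", lambda u: u == "author"),
]

print(menu(verbs, "author"))   # ['view', 'edit', 'delete']
print(menu(verbs, "visitor"))  # ['view']
```

    The point is that the menu falls out of asking each verb who may perform it, rather than scattering permission checks across templates.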

    Till then, please check it out here: https://github.com/neuman/python-carteblanche 

    Feedback welcome here or on github!


  8. Help Me Make Amazon’s Android Streaming Embargo Painful For Them

    I’ve been pretty irritated by Amazon’s embargo of Prime streaming on Android devices.  We know they already have the Android app, because the Kindle IS an Android device, and they even brought it to iOS, which is just salt in the wound.  What this says to me is that they are going out of their way to screw us out of a service we pay just as much for, in hopes of forcing us to buy Kindles.  This doesn’t even really make sense, because there’s no Kindle phone, so it literally isn’t even an option for many of us. 

    I sent this message to Amazon Prime streaming support, with the [ blanks ] filled in, obviously.  

    I would like a discount on my prime service due to the lack of support for mobile streaming on my device. I have a [ Your device ] and I cannot stream video. The advertised service for amazon prime streaming video promises mobile streaming but I am unable to do so due to the Amazon embargo on most of the world’s mobile devices. It seems that I am being charged the same amount as people who can do mobile streaming (kindles, IOS), but I’m not getting the same service. Please reduce my monthly fee accordingly. 

    Thank you,
    —[ YOUR NAME ]
    What I got back surprised me!

    I’m sorry for your disappointment about not being able to stream Amazon Instant Videos on Android mobile devices. 

    Currently, Amazon does not support Amazon/Prime Instant Videos on Android devices. 

    We are working to expand our Amazon Instant Video service to a broader selection of devices in the future. 

    When Amazon Instant Video is made available on a new device, it appears on our Compatible Devices page: 


    We’ve received many requests from the customers to release an Android app for Instant Video streaming and our development team is working in this regard and we hope to make Amazon Instant Video available on Android devices in the near future. 

    I’ve also forwarded your feedback to our Amazon Instant Video Development team. Customer feedback like yours is very important in helping us continue to improve the experience of using our digital video service. It is an important part of upcoming developments in our Amazon Instant Videos service. I appreciate that you wrote about this so that I can point out the increasing demand for it. As this involves many teams and individuals, I’m unable to predict the current time-line. 

    Meanwhile, as an exception and as a compensation for the inconvenience caused, I’ve issued a 25% refund i.e. $20 on your Prime membership. Refunds typically process within 2-3 business days and appear as a credit on your statement. You’ll receive an automatic confirmation e-mail when the refund is processed. 

    I hope this helps. Thank you for choosing Amazon.com.
    Best regards,

    Pretty cool!  This is not the first time I’ve asked about streaming on Android, but it was the first time I asked for recompense, and it worked.  

    Everyone, please send Amazon this message, and share!

    Here’s the link to send the message.  You must be logged in to your account to use it.  

    Thank you everyone for the overwhelming response!  I’m very interested to hear what people are getting back from Amazon.  Please tweet at me @eric_neuman or use the hashtag #AmazonAndroidEmbargo to let me know.  Keep it up, everyone, and we just may get our app!

  9. Sneakopump: Self-Lacing Shoe

    It’s almost 2015; where are our futuristic self-lacing shoes?

    The Problem

    Humans have struggled with footwear comfort since the first person stuffed some leaves inside a piece of leather and strapped it to their feet. Even today, shoes need to be fitted, tied, tightened, and they still come undone at the worst moments.

    Wouldn’t it be great if there was a sneaker that kept itself perfectly adjusted throughout the day?

    The Solution

    My idea is to put a pump in the heel of a shoe that pushes air into a series of artificial muscles inside the shoe every time you take a step.  Rather than cocooning your foot inside sweaty, leak-prone air bladders like the Reebok Pump, this shoe cinches tight like a laced shoe, providing a traditional sneaker feel.

    The cinch would be provided by pneumatic artificial muscles, which have their pressure maintained by an adjustable pressure valve.  Every time the wearer takes a step, the pump pushes air into the valve, which lets the right amount escape to keep the cinching force constant.

    The end result is a shoe that tightens around your foot after a few steps and stays perfectly snug all day long.

    Check out the design over at Quirky and vote!


  10. Here’s my talk from SciPy 2013 on the Roadmap to a Sentience Stack.  It’s basically me trying to convince a room full of machine learning and AI experts that a project like this is feasible, relevant, and important.