Internet of Nothings without Friction

A couple of weeks ago, I went to NLJUG’s Internet of Things developer day.  I got volunteered to go, but the food was good, the beer was free, and I got to ride a train, so aside from the train ride and the fact that I didn’t have any beers, it was ok 🙂

I think it was, what, 14-15 years ago that I was first introduced to the Internet of Things, or pervasive/ubiquitous/ambient computing as it was called back then.  Even then I made fun of it, because who in their right mind would want all their things on the internet?

At the conference, I kept asking that same question.  Well, not out loud, but I kept looking to see whether anybody would answer it.  One of the first keynotes did in fact touch upon it, but mostly in the sense that it was an important question one should ask oneself.

Most of the talks I attended avoided the question.  They were about how to deal with security in the IoT, how to deploy it, or about the Raspberry Pi and Arduino because they are small and cheap (never mind that they are also sucky and that I can set up a virtual machine that’s 100 times more powerful for less).  Everybody was talking about imagined technical difficulties in setting up some imagined Internet of Things that nobody wants, and nobody could explain why they were providing non-solutions to non-problems.

Sure, if you’re a techie it might be great that you spent 3 weeks setting up your Arduino/Raspberry Pi so it can warn you when you leave home with the lights on, or switch off your doorbell at night.  A computer scientist is a person who is willing to spend just a few weeks automating a task that is likely to save them several minutes down the line.  Real people don’t want to do that.


Apple didn’t invent the smartphone.  The first iPhone was not even a smartphone by most accounts.  But it did one thing right: it made what little it could do easy to do.  It really did just work.  Techies didn’t like it – I myself stuck to my Nokia N-series until near the end of the lifespan of the 3GS.  Damn, my N95 was a great phone.  The N95 8GB was a joy for a long time.  The N96 was ambitious, but overly so.  The N97 was great on paper – I waited for it with anticipation for ages.  The N900 had some of the coolest features not even matched by phones today: completely integrated phone and messaging apps that allowed me to communicate with a person using MSN Messenger, text messages, Skype messages, POTS calls, Skype calls, and IP telephony from a single contact, in a single interface.  It was great.  Oh yeah, and my N900 couldn’t make calls for the first 6 months because the software was missing that feature.  The battery lasted 6 hours.  It had a, for the time, great 5 MP camera, but it could not send picture messages because the software never got around to it, unless you installed some command-line crap and felt like starting a service on your phone manually.

While the N-series was technically vastly superior to the iPhone for a long time, it had a bunch of features nobody could use.  I have a PhD in computer science, and I never managed to set up IP telephony (I could probably have figured it out, but I never cared enough to bother).  The software looked like it was made by 100 small companies.  Because it probably was.  Nothing worked together, everything was a chore.  When I got my first iPhone, I missed a lot of features, like the step counter I had with Nokia Sports Tracker – my go-to guide for telling how late I got home after a night in town.  But neither did I have to carefully give the GPS time to find a fix before attempting navigation, nor did I have to wait for a web page to load before interacting with it – the Nokia would gladly let me interact too early, inevitably leading to a crash.

Apple also didn’t invent the tablet with the iPad.  They didn’t invent the MP3 player with the iPod, and they didn’t invent the graphical user interface (nor did Xerox) with the Lisa/Mac.  But by Britney, did they dominate the respective markets (well, not really with the Mac)!  Sure, those things were available before.  They were also vastly technologically superior.  But they took too much effort to use.  Their design was not frictionless.

The Internet of Things is not about putting things on the internet.  We already have that.  For some things it’s great, for some things it’s just dumb.  I have an internet-connected scale that automatically uploads my weight to an online service.  Big fucking woo.  But in doing so, it reduced the friction of using a scale.  I’ve been keeping track of my weight for a long time.  For 12 years, I’ve tried to do regular weigh-ins, noted down my weight and kept track of it.  Sure, it only takes 10 seconds to step onto the scale, 2 minutes to log into my computer, 1 minute to start up Excel, and 20 seconds to fill in the new measurement – but most people would have given up long ago, and so did I, for long periods.  With my internet-connected scale, I just step on it; it measures my weight, uploads it whenever convenient, and keeps track of it forever.  That’s super convenient!  30 seconds to measure my weight, generate a random number it calls body fat percentage, and measure my heart rate.  It entertains me with statistics about how many steps I walked the day before and the weather forecast for today so I can dress accordingly (shorts – always shorts).  Putting my scale on the internet reduces the friction of doing regular weigh-ins, and now, instead of doing at most 1 weigh-in a day, I do between 2 and 3 (morning, evening and after exercise).

I also have a bunch of Philips Hue light bulbs.  Those are light bulbs that participate in a Zigbee mesh network to talk to an internet-enabled hub.  That’s nerd-speak for “instead of using something as inconvenient as a light switch, I can use an app on my phone or the internet to turn on my lights.”  They are gimmicky as fuck, but have the advantage of a very open API.  This means that third parties have started making all kinds of crazy apps for them.  The first were the obvious ones (disco lights synchronized to music), but there are also things like blinking the lights whenever an e-mail arrives or switching the lights to blue whenever it starts raining. It is always raining. The best integration is with the service IFTTT, which makes it possible to connect various services using simple “if this happens, then do that” rules (hence the name, IF This Then That). It’s fricking brilliant that it is possible to set up a simple rule that switches on the light when I get home or when the sun sets. That has the potential to reduce the friction of switching on the light: it’s just on when I need it. Sure, a sensor can do much the same, but it only switches on the light when I’ve already hit my foot on the sidewalk, not when I’m 100 meters from home, and it does not know when the sun sets. Unfortunately, IFTTT cannot combine the two conditions, so I have to choose between the lights switching on at sunset even when I’m not home, or not switching on at sunset when I am home. I cannot tell it to switch on the lights when I arrive home after dark and also switch them on when I’m already home and it gets dark.

I have an internet-enabled alarm clock. It’s the Withings Aura. It’s this super-neat thing with a million sensors, a speaker, a lamp, and software that doesn’t do half of what it promises, a tenth of what it should do, and a percent of what it could do. It can sense when I’m in my bed, and it can sense temperature, my pulse, and the light level. It uses this to determine whether I’m asleep or not. It can combine this with my step counter to generate this spiffy weekly activity chart. It is easy to see that I sleep fairly regularly (but not much), do some physical work in the morning and evening (getting to and from work), did a bit of activity on Saturday morning, went on a longer bike ride Saturday afternoon, and pretty much vegged out on Sunday.

[Screenshot: weekly activity chart from the Withings app]

The Aura just got integration with the Nest, so it can also turn heating down while I sleep and turn it back on when I wake up.  That’s frictionless energy saving.  But why can’t Aura switch on my lights in the morning if it is dark?  It has all the information available (I’m awake, my alarm is about to sound, and it is still dark in my bedroom).  Why can’t it switch off the lights when I go to bed?

During the developer day, I attended a talk by Kai Kreuzer, the founder of OpenHAB, the open Home Automation Bus. It’s an open source project that is supposed to allow me to do all this. It is a programmable bus which allegedly integrates with all of these things and allows me to write much more advanced rules than IFTTT.  It is essentially an Enterprise Service Bus, but for gadgets.  (One of the talks even tried using the ESB framework I use at work, Camel, for IoT-like stuff.)  I decided to give it a go, downloaded the alpha version of OpenHAB 2 and tried setting it up.  Here’s the overview screen I’ve set up:

[Screenshot: OpenHAB overview screen]

Note, this is all live data; I actually do have precisely two lights switched on (the desk lamp and the ceiling lamp in my office).  I can switch off all the lights at home, or switch the lights in the staircase on or off (I can also control them individually, which is why there’s a count and both an on and an off switch).  It shows the current date because why the fuck not, and also the outside temperature (fetched from Yahoo Weather).  In principle it could also fetch the sun-up and sun-down times, but I haven’t gotten around to setting that up just yet.
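
In principle, the sun times should just be a matter of an extra pair of items fed by OpenHAB’s Astro binding.  A minimal sketch, assuming the binding is installed and an astro:sun:local thing is configured – and assuming the channel names haven’t changed or broken in the alpha I’m running:

[raw]
DateTime Sunrise_Time "Sunrise [%1$tH:%1$tM]" { channel="astro:sun:local:rise#start" }
DateTime Sunset_Time  "Sunset [%1$tH:%1$tM]"  { channel="astro:sun:local:set#start" }
[/raw]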

If I click or tap on the first floor, I get this overview:

[Screenshot: first-floor overview in OpenHAB]

I can see the individual lamps and their intensity, and I can dim them or change their color.  I can control a whole room at a time.  If I click on the outside temperature on the home page, I instead get more weather information:

[Screenshot: weather details in OpenHAB]

In principle I could get a full forecast here.  The missing picture is supposed to show the temperature tracked over time, but that doesn’t work in the alpha version I’m playing with.  It even shows my weight, because that’s weather-related information somehow.  We can see I should lose 5ºC/s.

I can add rules doing complex computations and making decisions; the humidex above is computed from the temperature and humidity whenever they change.  Similarly, I can connect with my phone for proximity information and with the sunset/sunrise information to make my lights switch on when the sun sets and I’m home, or when I get home after sunset.  And switch off the lights when I leave home.  Heck, add some door/window sensors, and my phone (or sound system) can notify me if I leave without closing all the windows.
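
As a sketch of what that could look like – assuming a hypothetical Presence_Phone switch fed by some phone-presence detection and a hypothetical Daylight switch derived from the sunrise/sunset times, neither of which I have actually set up yet – the rules would be roughly:

[raw]
rule "Lights on when home after dark"
when
    Item Presence_Phone changed to ON or
    Item Daylight changed to OFF
then
    // only act when both hold: I am home and it is dark
    if (Presence_Phone.state == ON && Daylight.state == OFF) {
        sendCommand(Lights, ON)
    }
end

rule "Lights off when I leave"
when
    Item Presence_Phone changed to OFF
then
    sendCommand(Lights, OFF)
end
[/raw]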

OpenHAB has one big problem: it is open source as fuck.  It breaks all the time.  Sure, I’m using the alpha version of version 2, but it is about as user-friendly as a broken version of ed on VMS.  If you get that, you’re old or a mega-nerd.  The real reason I cannot show a forecast in the above picture is that this feature has not been implemented yet in this version.  The chart is broken because the interface to the database used for storing temperature data doesn’t work – version 2 was a complete rewrite that broke compatibility with all the components made for previous versions.  Because why not.  Sure, they are fixing it, and you’re welcome to fix it yourself because it’s open source.  This is frictionless only in the same way getting butt-fucked by a cactus wrapped in barbed wire – no lube – is.  This is part of the configuration used for setting up the view of the first floor; first, the item definitions:

[raw]
Group All
Group gFF "First Floor" <firstfloor> (All)
Group:Switch:OR(ON, OFF) Staircase "Staircase [(%d)]" (All)
Group:Switch:OR(ON, OFF) Lights "All Lights [(%d)]" (All)
Group:Switch:OR(ON, OFF) FF_Corridor "Hallway [(%d)]" <corridor> (gFF, Staircase)
Group:Switch:OR(ON, OFF) FF_Purple "Purple Room [(%d)]" <office> (gFF)
Group:Switch:OR(ON, OFF) FF_Blue "Blue Room [(%d)]" <bedroom> (gFF)
Color FF_Light_PurpleRoom_Ceiling "Ceiling" <slider> (FF_Purple, Lights) { channel="hue:LCT001:0017880aadfa:3:color" }
Color FF_Light_Purpleroom_Desk "Desk" <slider> (FF_Purple, Lights) { channel="hue:LCT003:0017880aadfa:4:color" }
Color FF_Light_BlueRoom_Ceiling "Ceiling" <slider> (FF_Blue, Lights) { channel="hue:LCT001:0017880aadfa:1:color" }
[/raw]

Then the sitemap setting up the menus:

[raw]
Text item=gFF label="First Floor" icon="firstfloor" {
    Frame label="Hallway" {
        Switch item=Staircase mappings=[ON="All On", OFF="All Off"]
    }
    Frame label="Purple Room" {
        Switch item=FF_Purple mappings=[ON="All On", OFF="All Off"]
        Colorpicker item=FF_Light_PurpleRoom_Ceiling label="Ceiling"
        Colorpicker item=FF_Light_Purpleroom_Desk label="Desk"
    }
    Frame label="Blue Room" {
        Switch item=FF_Blue mappings=[ON="All On", OFF="All Off"]
        Colorpicker item=FF_Light_BlueRoom_Ceiling label="Ceiling"
    }
}
[/raw]

And here’s the rule that computes the humidex from the temperature and humidity:

[raw]
rule "Compute humidex"
when
    Item Weather_Temperature changed or
    Item Weather_Humidity changed
then
    var Number T = Weather_Temperature.state as DecimalType
    var Number H = Weather_Humidity.state as DecimalType
    // vapour pressure (hPa) from temperature and relative humidity
    var Number x = 7.5 * T / (237.7 + T)
    var Number e = 6.112 * Math::pow(10, x.doubleValue) * H / 100
    // humidex = temperature + 5/9 * (vapour pressure - 10)
    var Number humidex = T + (new Double(5) / new Double(9)) * (e - 10)
    postUpdate(Weather_Humidex, humidex)
end
[/raw]

So, I’m sure we can agree that this is not low-friction, and not something sane people would do? It’s tinker toys for nerds. And that’s ok! In order to progress, we need to have the working technology in place so the humanities can start making things nice. But it doesn’t work! That’s the problem.

During the developer day, I did actually ask one question. I noted that for a long time, people have been making internet-connected things like my scale or my lamps. Now, everybody is making platforms. There’s OpenHAB (which is being made into the meta-platform Eclipse SmartHome). My scale has its own Withings Healthmate, Apple introduced HomeKit, Google has Works with Nest, there’s SmartThings, Belkin WeMo, Wink, etc, etc. Before, we had hundreds of isolated things that couldn’t talk to each other. Now we have hundreds of platforms that cannot talk to each other. Like fricking academia: when a field gets too crowded, somebody decides to put “meta” in front of some existing field and by magic it’s a new field. I’m not innocent here; I made Britney Suite, an animation framework, and ASAP, a state-space analysis platform.  Each company does this because it wants to be the unifying standard everybody uses.  Very rarely is the market maker also the market dominator (case in point: ICQ was the market maker for instant messaging, but AIM and MSN were the big ones during the IM heyday; MySpace was the market maker for social networks, but now even the pedophiles have moved on to Facebook).

What we need to make IoT a thing is three things:

  1. We need to make things able to talk to each other – all things, or at least all the things that matter.
  2. Then we need to experiment with this and figure out what is actually cool.
  3. And finally we need somebody to cut away the crap and make the remainder frictionless.

OpenHAB seems like a great idea. The problem is just that it doesn’t work. It cannot get data from my Philips Hue Tap switches. Using the Philips software, I can program them to control my lights, but only by assigning a “scene” to each button. This means that button two switches on the light in the current zone, button three in the current room, button four in the next room, and button one – the big one – switches the entire zone off. A button cannot, for example, toggle the light in the current zone or room, or work as a dimmer.  I could in principle code all of this using OpenHAB – at least according to Kai’s demo in Utrecht.  Except OpenHAB doesn’t support the Philips Tap.  Nor does it support reading CO2 levels and temperature from my scale (even though it registers this information), or getting anything out of my Aura.  It’s not just the fault of OpenHAB or open source – a lot of these APIs are just not open.

The Philips Hue lights are a great example.  On their own they are gimmicky as fuck, but thanks to their open API people have built amazing things.  We need more like this.  We need good products like this.  We need critical mass behind one standard, and people need to stop making yet more open platforms, integrations, frameworks and what have you.

Only when all my things, or at least enough of my things, can talk together can I start experimenting.  I would love to make my lights automatically switch on when I move into a room.  Make the lights in the hallway to the toilet automatically run at 20% power when I get out of bed to take a leak – all the information is already available in my sensors: my Aura knows when my alarm rings and when I’m in bed, and a motion sensor can know where I am.  Whether it’s dark or not can be computed from the sunset time or from the already existing light sensors.  When I go to bed, I switch on the light in my bedroom, start my Aura bed cycle, and switch off the other lights.  Why can’t this all happen automatically when the Aura detects I’m in bed?  I would love to log exactly when every light bulb is on and at what intensity.  This would be great for knowing where a lower-wattage bulb might save energy, how many hours those ultra-expensive LED bulbs really last, or just for getting a handle on my patterns.  Philips (and the Aura) uses red tones as relaxing light and blue tones for energizing – perhaps lights should automatically be blue in the morning and red at night?  Dim the lights to accustom my eyes to darker rooms and save power.  Do all of this intelligently and automatically?  My cell phone already knows where I am at all times.  Have it show me my grocery list when I’m shopping, automatically add chickpeas to the list when I empty the bag, switch on the slow cooker 4 hours before I am expected home and turn it down if I get delayed.  Sound an alarm (or switch the stove off) if I leave it on by mistake when leaving home.  Have my doorbell send me a message on my phone instead of making a sound, and allow me to let people in by unlocking my smart lock regardless of where I am.  Start my Roomba when I leave home and have it text me when it inevitably gets stuck under my couch.  Have it hide in its dock like a well-beaten child when I get home.  Make my to-do list move lawn mowing from Sunday to Friday if it’s going to rain Saturday and Sunday (it’s the Netherlands, so it is also going to rain Friday).
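
None of this exists yet, mostly because the Aura and the other sensors don’t expose their data, but the hallway rule itself would be almost embarrassingly simple.  A sketch, with hypothetical items Aura_InBed, Daylight and a Hallway_Light dimmer that nothing currently provides:

[raw]
rule "Night light for bathroom trips"
when
    Item Aura_InBed changed from ON to OFF
then
    // only bother when it is actually dark
    if (Daylight.state == OFF) {
        sendCommand(Hallway_Light, 20)   // 20%: enough to see, not enough to wake me up
    }
end
[/raw]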

Maybe it’s just gimmicky.  A lot of it is.  But if it’s automatic and right, it’s frictionless.  It may only make my life 0.01% better, but if it happens 100 times a day, that’s 1% better (or 2.7% if you believe in the theory of compound interest).  I think that’s what the IoT is about.  Lots of little conveniences which all contribute positively without you noticing.  There’s a theory, which I could look up but ain’t going to, stating basically that the quality of your day is determined by the number of little victories vs. little defeats you get every day.  Waking up to a 120 W fluorescent lamp hammering into your face is not going on the positive list, but an alarm that wakes you with slowly increasing music and light, in sync with your sleeping pattern, just might.  Neither will make or break your day, but either will nudge you in its own direction.  But it has to be exactly right all the time, because while I may not notice the 99 times it does everything right, I am going to notice the one time it does something that is not right.  And that one time is going to cancel out all the positives.

And that’s where the humanities come in.  We need non-technical people to make this easy to set up and to provide sensible defaults (because nobody except nerds is actually going to customize these systems).  The systems should not be user-friendly.  They should be frictionless.  You should never notice them unless you want to.  There are times I want to switch on the light outside my pattern, and a switch should just do that.  One tap for an automatically computed color and brightness, two for interrogation at the Gulag.

It’s great that Apple and Google are getting interested in the field.  They might be big enough to kill off all the competition.  Some will stay, like Philips, which has never tried to make a platform but only to provide some of the elements, and some will need to go, like Wink, which is known by nobody and just sells a box that integrates a handful of pre-selected technologies in a not-too-flexible manner.

Then it’s time for Google to shine.  They are brilliant techies and have taken punch-the-monkey banner ads to a profitability where they can throw money at 100 retarded projects and see which two stick.  For example, see Google Glass, their self-driving car, their abundant use of “drones,” and hundreds of other projects, very few of which we’ll remember in two years’ time.

Finally, it’s Apple’s turn.  They won’t be first, they won’t be the ones with the most features.  But chances are the few features they have will be done right.

That’s what we should aim for: answering the why.  Of course security and deployment are important and need to be an integrated part of the design, but the pressing question right now is not how to install certificates in my light bulb, but why on earth I would give it an IP address in the first place.
