Wednesday, January 28, 2015

Why Are You So Hard To Get A Hold Of?

This is another post in my ongoing series of regular responses I give to people. Meaning when I get asked something so much, I craft blog posts that live on, and I reply to emails, tweets, etc. with a quick link to my standardized responses.

One I get less frequently, but still often enough to warrant a response, is: “Why are you so hard to get a hold of?”

To which the answer is, "I’m not". I have a phone number that is very public, I have three email addresses all going into the same inbox, and a highly active Twitter, LinkedIn, Facebook, and Github presence. If you are having trouble getting a hold of me, it is because you are not using the right channels, or potentially the right frequency.

First, I don’t talk on the phone. I schedule meetings, increasingly only on Thursdays (regularly for partners, etc.), where I talk on Skype, Google Hangouts, and occasionally the phone. When I talk on these channels, I can do nothing else. I can’t multi-task. I am present. If I did this all the time, I wouldn’t be the API Evangelist—I’d be that phone talker guy.

Second, I respond well to quick, concise emails, tweets, wall posts, and Github issues. The shorter and more concise, the better. This is what I mean by frequency: if you send me a long-winded email, there is a good chance it will be weeks before I respond, if I respond at all. Sorry, I just don’t have the bandwidth for that frequency—I use short, precise signals.

I do not have a problem with someone being a “phone person”, but I’m not one, sorry. In my experience, people who require lots of phone calls also require lots of meetings, and often shift in their needs, because nothing is anchored to a specific outline, document, or set of project requirements. Personally I try to avoid these types of personalities, because they have proven to be some of the least efficient, and most demanding, relationships in my professional life.

Please don't take this message the wrong way; I'm trying to help you be as successful as you can in making the right connection.


There Is A Good Chance That I Will Be Remembered For What You Did, Because I Told The Story

My friend Matthew Reinbold (@libel_vox) wrote a great piece on his blog titled Storytelling and The Developer’s Need To Communicate, reflecting on an un-conference session I did last July at API-Craft in Detroit. Thanks for the great thoughts on storytelling, Matt. It is something that is super infectious, and it reminded me of a related story, which I hope continues to emphasize the importance of storytelling in the API space.

Another one of my friends I thoroughly enjoy swapping stories with at API conferences, and in the dark corners of bars around the world, is Mike Amundsen (@mamund). Now I may have the name wrong, but one time Mike told me a story about how John von Neumann (correct me if I’m wrong, Mike) is known for a lot of ideas that he didn’t necessarily come up with on his own. He was just such a prolific thinker and storyteller, which allowed him to process other people’s ideas, then publish a paper on the subject before anyone else could. Some people would see this as stealing ideas, but one can also argue that he was just better at storytelling.

While I have developed many of my own ideas over the years, much of what I write about is extracted from what others are up to across the API space. I have made an entire career out of paying attention to what technologists are doing, and telling a (hopefully) compelling story about what I see happening, and how it fits into the bigger API picture. As a result, people often associate certain stories, topics, or concepts to me, when in reality I am just the messenger—something that will also play out in the larger history, told in coming years.

I’m not that old, but I’m old enough to understand how the layers of history lay down, and I have spent a lot of time considering how to craft stories that don’t just get read, but get retold, and have a much better chance of being included in the larger history. As Matthew Reinbold points out, all developers should consider the importance of storytelling in what they do. You don’t have to be a master storyteller, or a super successful blogger, but your ideas will be much better formed if storytelling is part of your regular routine, and the chances you will be remembered for what you did increase with each story that you tell.


Tuesday, January 27, 2015

Cybersecurity, Bad Behavior, and The US Leading By Example

As I listened to the State of the Union speech the other day, and stewed on the topic for a few days after, I couldn’t help but see the future of our nation’s cybersecurity policy through the same lens as I view our historic foreign policy. In my opinion, we’ve spent many years behaving very badly around the world, resulting in many people who do not like us.

Through our CIA, military, and general foreign policy, we’ve generated much of the hatred towards the west that has resulted in terrorism even being a thing. Sure, it would still exist even if we had behaved better, but we’ve definitely fanned the flames until it became the full-fledged, never-ending, profitable war it is today. This same narrative will play out in the cybersecurity story.

For the foreseeable future, we will be inundated with stories of how badly behaved Russia, China, and other world actors are on the Internet, but it will be through our own bad behavior that we fan the flames of cyberwarfare around the world. Ultimately I will be reading every story of cybersecurity in the future while also looking in the collective US mirror.


Thursday, January 22, 2015

I Judge API Providers On How Much Value They Give Back vs. What They Extract

There are a number of data points I evaluate people and companies on while monitoring the API space, but if I had to distill my evaluation of companies down to one thing, it would be based upon how much value they give back to the community vs. how much they extract.

You see, some companies are really good about providing value to the community beyond just their products and services. This is done in many ways, including the open sourcing of tools, the creation of valuable resources like white papers and videos, or just being active in sharing the story behind what they do.

Then there are companies who seem to be masters at extracting value from developers, and the wider API community, without ever really giving back. These companies tend to focus specifically on their products and services, and rarely share their code, knowledge, or other resources with the wider API space.

I’m not going to name specific examples of this in action, but after four years of operating in the space it is becoming easier to spot which camp a company exists in--you know who you are. I understand companies have to make money, but I’m totally judging companies across the API space based upon how much value they give the community vs. how much they extract during the course of their operations.


Wednesday, January 21, 2015

When Will My Router Have Docker Containers By Default?

This is something I’m working on building manually, but when will the wireless router in my home or business have Docker container support by default? I want to be able to deploy applications and APIs, either publicly or privately, right on my own doorway to the Internet.

This would take more work than just adding storage, compute, and Docker support at the router level. To enable it there would have to be changes at the network level, something I’m not sure telco and cable providers are willing to support. I’ll be researching this as a concept over the next couple of months, so if you know of any ready-to-go solutions, let me know.

It seems like enabling a local layer for Docker deployment would make sense, and help move us towards a more programmable web, where notifications, messaging, storage, and other common elements of our digital lives can live locally. It seems like it would be a valuable aggregation point as the Internet of Things heats up.

I could envision webhook management, and other Internet of Things household automation, living in this local, containerized layer of our online worlds. This is just a thought. I’ve done no research to flesh this idea out, which is why it is here on Kin Lane. If you know of anything, feel free to send information my way.


Machine Readable Format For Migrating Concepts From Dreams Into The Real World

Obviously I’m working with APIs.json and Swagger a little too much, because it has really started to affect my dreams. Last night I had a dream where I was working with a university research team to define a machine readable format for migrating concepts from the dream world into the physical world.

I’m not sure I want this machine readable, but regardless it was a fun dream, and I wasn’t worried about this in the dream, so I guess it is ok. In the dream I was able to go to sleep and dream about a concept, then wake up and apply the same concept in my regular day via my iPhone. It allowed me to pick and choose from a notebook of things I had experienced in my dreams, and then apply in my daily life as I chose.

This post lives in the grey space between my fictional storytelling, and my API Evangelist storytelling, so I’ll leave it here on Kin Lane. If you are planning a startup in this area, let me know. ;-)


Thursday, January 8, 2015

Internet Of Things Security And Privacy Will Always Begin With Asking If We Should Do This At All

As I read and listen to all of the Internet of Things stories coming out of CES, I’m happy to be hearing discussions around privacy and security come out of the event. I feel better about IoT security and privacy when I hear things like this, but ultimately I am left with overwhelming concern about the quantity of IoT devices.

There are many layers to securing IoT devices, and protecting the privacy of IoT users, but I can't help but think that Internet of Things security and privacy will always begin by asking ourselves if we should be doing this at all. Do we need this object connected to the Internet? Are we truly benefiting from having this item enabled with cloud connectivity?

I'm going to try and keep up with tracking the API layer being rolled out in support of IoT devices, but I am not sure I will be able to keep up with the number of devices, and the massive amount of hype around products and services. At some point I may have to tap out, and focus on specific aspects of IoT connectivity, around what I consider the politics of APIs.


Wednesday, January 7, 2015

Information Sharing And Collaboration In Government With The 18F Jekyll Hub Template

I’m a big fan of Jekyll based sites. The entire API Evangelist network runs as over 100 little Jekyll sites, within Github repositories, via Github Pages. This is more than just website technology for me; this is my workspace. When you come across a half-finished listing of contacts, or building blocks for a particular industry, or possibly a story that isn't fully edited—this is all because you are wandering through my API industry workshop. (pardon the dust)

Over the holidays, my girlfriend Audrey Watters (@audreywatters) completed her migration of Hack Education and her personal blog, Audrey Watters, to a Jekyll based mode of operation. Read her own thoughts about the newfound freedom Jekyll is giving her over her content, data, workflow, and the publishing of her projects—she is pretty excited.

Like APIs, a Jekyll approach to projects is way more than the technology. It is hard to articulate to folks the freedom, collaboration, flexibility, and transparency it has the potential to introduce. It is something you have to experience, and see in action, before you can fully understand, but I also have to acknowledge that the transparency introduced by this way of working will not be for everyone.

I originally learned what I know about Jekyll from watching leaders in the federal government space, most specifically Development Seed, and round one Presidential Innovation Fellow, and now full-time Githubber, Ben Balter (@BenBalter). Continuing this trend, it makes me happy to see 18F, out of the GSA, providing the 18F Hub, “a Jekyll-based documentation platform that aims to help development teams organize and easily share their information, and to enable easy exploration of the connections between team members, projects, and skill sets.” The 18F Hub is similar to the Developer Hub templates that 18F published, but I think it holds a lot of potential in helping on-board a non-developer audience to the concepts of Jekyll and Github—hopefully making the social coding platform a little less intimidating.

I do not think Jekyll and Github are for everyone. I’m not in the business of evangelizing one platform to rule them all, but I do think Jekyll itself, whether you run on Github, Amazon S3, Dropbox, or your own hosting or internal network environment, is a powerful tool for any project. I’m eager to keep an eye on which agencies put the 18F Jekyll templates to use, because it will signal for me that there are other healthy things going on at those agencies.


Tuesday, January 6, 2015

Playing Around With Jekyll Job APIs To Manage My Github Pages

I’m playing around with a concept right now that I’m calling "Jekyll jobs". As you may know, all of my websites use Jekyll, and run on Github Pages. Currently I have over 100 separate repos, and managing the content and data across these repos can get complex.

I use a standardized approach I call “hacker storytelling” for publishing each of my projects, so I have a handful of things I need to update, ranging from the look of a site, to changes across all Jekyll posts or pages. To help me keep things orderly and efficient, I’m considering a lightweight, API driven jobs framework to help me manage.

I am looking to make many of these “jobs” available to my girlfriend as well, allowing her to target specific resources, with specific jobs. Some of the jobs I’ve outlined are:

  • Link Management - Help me catalog, and manage the health of, links that are used across all blog posts. A lot of links change, go bad, or suffer any number of other illnesses.
  • Image Management - Help me catalog, optimize, and manage images that are used in my blog posts. I’m pretty good about manually doing a lot of this, but I sure could use help.
  • HTML Management - Sometimes HTML code gets ugly, either because I wrote it and didn’t give it the attention it needed, or possibly because it was generated out of another system. Either way there is cleanup and maintenance to do from time to time.
  • Content Management - After I write a post I like to constantly re-evaluate tagging, indexing, and providing additional links to related content.
  • Content Indexing - My search across all of my Jekyll driven sites is not the best, and I’d like a way to index all, or specific, collections, and serve up a simple search API, maybe using Elasticsearch or something.

As more of my world runs as small, modular Jekyll projects, I need a way to run jobs against them, and designing APIs that do what I need, using the Github API to work with my Jekyll site content, makes sense. I’m thinking I will just pass a Github user and repo name as parameters to each Jekyll job API, and have it run a specific task against the _posts folder in the Jekyll install.
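To make that pattern concrete, here is a rough sketch of the repo side of one of these jobs, assuming an unauthenticated call to the public Github contents API; the function names are my own placeholders:

```python
import json
from urllib.request import Request, urlopen

GITHUB_API = ""

def post_names(contents):
    """Pick the post file names out of a Github contents API response."""
    return [entry["name"] for entry in contents if entry["type"] == "file"]

def list_posts(user, repo, path="_posts"):
    """Fetch the Jekyll posts in a repo via the Github contents API."""
    url = f"{GITHUB_API}/repos/{user}/{repo}/contents/{path}"
    request = Request(url, headers={"Accept": "application/vnd.github+json"})
    with urlopen(request) as response:
        return post_names(json.load(response))

def run_job(job, user, repo):
    """Run a job function against every post in a repo's _posts folder."""
    return {name: job(user, repo, name) for name in list_posts(user, repo)}
```

Each job API would just take the user and repo as parameters, call run_job with its own task, and stay completely decoupled from the project it operates on.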

Since I’m designing these Jekyll jobs as APIs, I can run each one as an API request, and keep the job logic separate from each project. I’ll get a couple of these set up, then blog more about the pros and cons of this approach. Who knows, it may not be what I need to get it done.


The Rebooting Of WordPress With Just Page, Blog, Image, Link, and Comment APIs

I’m in the process of moving from a custom version of my website and blog manager to a newer version. Back in 2011 I wrote my own custom CMS, as I migrated Audrey and myself off WordPress, to deliver more security (by obscurity) into our world. As I look to continue the evolution of my CMS, I’m leaving everything behind, just launching APIs, and working from there to build exactly the user interface I will need to manage my world.

Even though I had moved my blog(s) off WordPress three years ago, there was still some serious WordPress residue on everything. Many earlier blog posts have very WordPress-esque HTML, and the graphical template I used was originally a WordPress theme, so there was HTML, comments, and many other fingerprints of the early WP template in there.

As I work through this process, I think of WordPress, and how they were considering shipping a full API with the version 4.1 release. I don’t see any evidence of it in the release, so I can only assume they pushed it back. I don’t blame them; after talking with them about the challenges they face, I can imagine it is taking more work than anyone can imagine.

I can’t help but think about a WordPress reboot. In my world, I hate legacy code and technical debt. I am very willing to just throw everything away, and start over—except there is one small difference: I’m not a platform with 65 million users.

However, let’s imagine you could! Just reboot WordPress by launching five simple APIs:

  • Pages
  • Posts
  • Links
  • Images
  • Comments

Then let the ecosystem build everything else. Create the first public and admin UIs, then go from there. Use the brand, the momentum, and the community to reboot, and redefine, the popular CMS platform. I think in just a couple of years, you’d see WordPress looking something like SalesForce or Heroku.
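To show how small that core could be, here is a rough sketch of the five resources in the list above as one in-memory store behind a CRUD surface; everything beyond the resource names is my own placeholder:

```python
# The five core resources, each just a list of records behind a CRUD surface.
RESOURCES = ("pages", "posts", "links", "images", "comments")

store = {resource: [] for resource in RESOURCES}

def create(resource, record):
    """POST /{resource} -- add a record and hand back its id."""
    records = store[resource]
    record = dict(record, id=len(records) + 1)
    records.append(record)
    return record

def index(resource):
    """GET /{resource} -- list every record."""
    return list(store[resource])

def show(resource, record_id):
    """GET /{resource}/{id} -- fetch one record, or None if it does not exist."""
    return next((r for r in store[resource] if r["id"] == record_id), None)
```

Every public or admin UI would then just be a client of these five resources, which is the whole point of the reboot.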

For me personally, I like the freedom that comes with using APIs. It makes it easy to decouple legacy code, and evolve small, or even large, parts of what I do. It is another way in which I am very fortunate to do what I do for a living. I think back over my career and all the legacy code bases I’ve had to herd around like cattle, and I am much happier in my current world of APIs.


Friday, January 2, 2015

My Unique Visitors and Page Views For API Evangelist Between Google And CloudFlare

I’ve been running all of my websites using CloudFlare since this last Thanksgiving weekend. I pointed the name-servers for all of my primary domains to CloudFlare, and I use them to manage my DNS, and other related operations of my primary websites.

I’m intrigued by the reporting at the DNS level provided by CloudFlare, compared to the reporting at the page level provided by Google Analytics. I’ve had Google Analytics installed on all of my websites since I first launched, and use it to establish the estimates for the daily and monthly visitors to my websites—beyond that I really don’t care much about these analytics.

Regardless, I think it is interesting to look at the CloudFlare numbers for the last 30 days:

  • Regular Traffic: 112,241
  • Crawlers/Bots: 55,540
  • Threats: 1,697
  • Unique Visitors: 34,501
  • Page Views: 169,478

Then look at the Google Analytics numbers for the last 30 days:

  • Sessions: 22,569
  • Users: 17,880
  • Page Views: 38,949

Ultimately you can only compare the CloudFlare unique visitors and the Google Analytics users—these are the only two numbers that are comparable, in my opinion. I don’t think CloudFlare removes crawlers/bots from page views, something I’m assuming Google does by default—rendering page views a very different beast for each service.
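One quick way to see how different the two services' numbers are is to work out page views per visitor from the figures above:

```python
# CloudFlare reports page views against unique visitors.
cloudflare_pages_per_visitor = 169478 / 34501

# Google Analytics reports page views against users.
google_pages_per_visitor = 38949 / 17880

# CloudFlare sees more than twice the pages per visitor, most likely
# because crawlers and bots are still counted in its page views.
print(round(cloudflare_pages_per_visitor, 2))  # about 4.9
print(round(google_pages_per_visitor, 2))      # about 2.2
```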

I take away two things from this. 1) How meaningless data points are, unless you believe in them. 2) How data points can differ from provider to provider, and at different levels of your architecture. If you ask me what my page views are for API Evangelist, what do I say? You didn’t ask me whether it was my CloudFlare or my Google Analytics page views!

When I think about the millions of hours spent in Google Analytics dashboards across numerous industries, and the companies I’ve seen spending millions in Adwords for their advertising, all based upon numbers derived from this constructed reality, that we’ve collectively bought into—I am blown away.


Tuesday, December 30, 2014

This Reflects What It Felt Like For Me To Work In Washington D.C. Each Day

I was looking through President Obama's 2014: A Year in Photos today, and while many of the pictures evoke emotion for me, this one in particular really summed up the very short time I spent in Washington D.C. as a Presidential Innovation Fellow.

The number one lesson I walked away with from my time in Washington D.C. was a respect for the scope that exists in DC. Everything is big. Everything is under scrutiny. Everything is operating at a scale I had never experienced before. If you think you get it, and have never worked there--you know nothing.

I respect anyone who can actually get ANYTHING done in this environment--knowing this, I understand that my role is purely from the outside-in. I'm not saying everyone there has the best possible motives, but you have to respect anyone getting ANYTHING done in an environment where everything you do is being so heavily scrutinized.


Please Provide Me With More Information Before We Speak On The Phone

As an independent operator, I have to be very thoughtful about how I spend my time. With this in mind, it is helpful for me to have a standard response that I can give to people who make requests for phone conversations.

If we do not have a prior relationship, or a referral from someone I know well, the chances I’ll just jump on a call are slim. Please provide me with as much information on what you are up to, in as short and concise a way as you possibly can.

I’m just looking for a title, an executive summary, and some supporting links to prime the pump. I’m happy to make time, but I need some way to make sure what you need is a fit for you, and for me. I get a lot of folks who don’t quite understand what I do, and if I responded to every request, I'd be on the phone all day--thus I wouldn't be the API Evangelist anymore. ;-(

I appreciate your understanding. Additionally, I find this request helps people articulate their ideas and needs better, making the time we do spend on the phone much more productive for both of us. I look forward to hearing more about your idea!



Tuesday, December 23, 2014

When A Developer Does Not Understand What API Evangelist Is

I do not expect everyone to immediately know who I am, or fully understand my mission behind API Evangelist. However, I do find it interesting when people have entirely skewed views of who I am and what I do, and then, after they meet me, make a 180 degree shift in their perception of API Evangelist.

This post is about one recent encounter I had with a developer at an event. This very nice developer has worked in the API sector for a while, is very knowledgeable about APIs, and is very aware of who I am and my presence as the API Evangelist, either from co-workers or the larger web. When I first talked to him in a larger group, someone asked, "Do you know Kin?" To which they replied that yes, they were aware of who I was, but we had never met, and they didn't seem very interested in a deeper introduction or conversation. They had clearly made up their mind about who I was, and what it is that I do, and sent out signals that I was not much more than a blogger. This doesn't happen all the time, but regularly enough that I feel compelled to write about it.

Spanning a couple of days, this developer was in various group conversations I participated in, and at no point did they seem interested in engaging me, acting very disinterested, and walking away several times. Now I really have no way of knowing how they felt, or if there was something else at play, but I've experienced enough to know these developers are really smart, and often feel what I do isn't technical enough to rise to the occasion—I know this because many developers have told me this flat out, but in this particular case that hadn't happened.

What did happen is that after about seven of these types of engagements, this developer heard me talking about my vision around the Oracle vs. Google case, and my larger vision for API discovery across the Internet, and at some point during this conversation their energy towards me shifted entirely, becoming much friendlier and more engaging. After this conversation, they sought me out for further conversations, followed me on Twitter, and worked really hard to initiate discussion with me in several other areas.

The message here is that you really shouldn't make assumptions about people in the space until you've done your homework, or quite possibly met them in person. This is something that I think developers are very poor at. I experience this online regularly, and offline less frequently. Someone lands on my site, reads one post, maybe two, and makes some pretty radical assumptions about who I am, and what I do, based upon this limited understanding. I can see how my title of “API Evangelist” might seem superficial to the untrained eye, but once you get to know me you will understand how passionate I am about APIs, something I hope will be contagious, or at least help you understand more about me, and my cause.


Thursday, December 18, 2014

When Apps You Love Lose Their Utility

With the latest version of Evernote, I’m beginning to look for the next tool for managing my notes. I live and breathe in my Evernote. I am writing this post in there. I depend on the easy note taking via my laptop, mobile phone and tablet. Evernote is the heartbeat of my writing, and I write everything from email, to blog posts, and white papers in my Evernote, then publish to the appropriate channels once ready.

The last version changed the layout, added chat, and added recommendations for outside related sources, to name a few of the most prominent changes I'm stumbling over. Some repetitive tasks that were one click before now take two or three clicks, making the organization of my writing much more difficult. The introduction of chat is not only useless to me, it actually invades my private writing sanctuary, and just bothers me every time I see the button at the top.

As I evaluate what it will take to migrate from the platform, I find I am unable to get an API key; it just throws an error every time I ask for one. I submitted a ticket, and will publish a video of the problems I was facing at some point. I exported a notebook as HTML, just to see what migration from the interface was possible, and the amount of garbage in the HTML is going to create a lot of extra work for me when converting.
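Some of that cleanup could be scripted. Here is a minimal sketch, assuming the garbage is mostly inline style attributes and the span wrappers they leave behind, which is my guess at what the export produces:

```python
import re

def strip_export_garbage(html):
    """Strip inline styles and unwrap bare <span> tags from exported HTML."""
    # Drop every inline style attribute.
    html = re.sub(r'\s+style="[^"]*"', "", html)
    # Unwrap <span> tags that no longer carry any attributes.
    html = re.sub(r"<span>(.*?)</span>", r"\1", html, flags=re.DOTALL)
    # Collapse the runs of blank lines left behind.
    return re.sub(r"\n{3,}", "\n\n", html)
```

Anything more structural than that would still need hand editing, but a pass like this would shrink the conversion work considerably.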

We all seem to be infatuated with the constant march forward of technology, and it is something I am able to find harmony with; making sure my content, data, and other assets are well defined and portable is an important part of this. I know Evernote has a wider audience to please than just me, but I’ve been a paying customer since 2011.

It makes me sad, but moving on has become easier than ever for me, and I don't have time to dwell on break-ups--I just move on.


The Future Internet Will Consist Of Many Affinity Networks

As much as I wish for the Internet to remain the open, accessible, neutral, distributed platform it has been since birth, I’m often faced with the reality that net neutrality will lose, in the grip of the powers that be. You see, the AT&Ts, Verizons, Comcasts, and other powerful corporate actors of the world do not want it to be open; they want to be able to meter anything people want on the Internet, maximize revenue, and mimic the existing power flows that exist in the regular world.

I feel like the Internet as it occurred was an accident, something these corporate giants didn't notice until the Internet was going full tilt, and had become part of the mainstream consciousness. Now that they have their sights set squarely on generating revenue from the Internet, things will change. Some of these evolutionary events will be high profile shifts, while most will happen silently, put into legislation without anyone noticing, or occurring behind boardroom doors the public doesn't have access to.

We have the technology to work around this; we just need the will, and ultimately I believe in humans, and the power they wield in working around the roadblocks and challenges put in front of us. The AT&Ts, Verizons, and Comcasts of the world will be busy building their fast lanes, charging for access on both ends, ensuring their partners' content and data are visible, and making sure every possible dime is squeezed out of customers. As technologists, we need to continue building out our version of the Internet, using mesh networks, and other emerging alternative network technologies.

While the motivation for large corporations will be money, and they will build networks to meet this vision, our motivation will be based upon the affinity we have with our family, friends, and professional networks. We will need to build out nodes to support our agricultural networks, music communities, and the numerous other levels in which we share affinity. We need to encourage our networks to become network nodes, and ensure our packets, bits and bytes traverse only these networks, unencumbered by the corporate traffic cops that will be spread around the globe in coming years.

Just as Tim Berners-Lee, and the other grandfathers and grandmothers of the Internet did, we will have to innovate, and work hard to develop the next generation Internet. One that evades the gaze of our corporate Sauron, and stays one or two steps ahead of the corporate interests that may think they are good, but do not have the collective Internet, or world, in mind.


Wednesday, December 3, 2014

Sorry I Do Not Want To Take Your Survey

I get a number of emails, instant messages, and other channel requests to take surveys, to all of which I say no thank you. I don’t take surveys. I don’t care what you offer me, I hate the format, and I refuse to take part. Surveys remind me of taking tests in school or the corporate work environment, where people were trying to measure my knowledge, performance, or otherwise.

I’m sure that surveys work in many environments, where you can incentivize folks to fill out your form. I’m sure some of the answers are even what you are looking for, but I’m sure there are many other folks like me who do not give a shit about jumping through hoops to answer questions, even if it is for an important cause.

In 2014 I can't help but think there are better ways of doing surveys, and wish someone would come along and do something about it. I don't mind being asked questions, but in my busy day I do not have time, or interest in filling out a long questionnaire. Maybe you could do your survey over the course of a couple days or weeks, via Twitter or other social media channels.

Seems to me that you could easily target a group of individuals via social media, populate a list of questions you are looking to have answered, then time the questions so they get asked in a simple, conversational manner that is not intrusive, and doesn't disrupt my world. I'd be happy to answer your questions (well not always), but for many companies, brands, and interesting topics I'd be happy to help in a way that fits better with my flow.


Tuesday, December 2, 2014

No Mojo For Writing On The Road

I’m sure some of you are happy when I go on the road, because the number of blog posts I publish goes down significantly. I get a lot of folks who jokingly complain about the volume I write, and state it is difficult for them to keep up sometimes. Not sure how I can help you with this one, maybe use better read-it-later tools, or ask your boss to carve out reading time at work on a regular basis. ;-)

When I am on the road I find it very difficult to find the "mojo" to write. Every time I come home from a trip I will have numerous Evernote entries derived from random ideas I've had while traveling. There is no shortage of ideas while I roam the world, but the ability to actually think an idea through, flesh it out, and tell the full story is really, really difficult for me when I'm in motion. Many managers I've had over the years considered writing to be an “easy” task—"just write those 500 words about that topic, it is just words in a certain order, right?" #nottrue

There are some posts I write that don’t take much energy, but many posts I need to write from a place that I just can't always access when traveling: worried about making my flight, finding a taxi, networking over dinner, or just plain too tired to even have a deep thought. This is why, when I come home, you will find a serious uptick in the number of blog posts I publish, because I'm finally able to settle into my "happy place", where there are endless amounts of “mojo” to tap when it comes to telling API stories.

This makes me happy to once again be home, and is why I will continue to reduce the amount I travel in 2015, because I feel like my writing is much more important to my own well-being, something I hope contributes to the overall health of the API space.


Monday, December 1, 2014

Flipping Through The Online Channels Each Day

I was given my own color TV when I got a Commodore 64 for Christmas in 1980-something. Ever since then, I've had a noisy backchannel in my life. I don’t just like a secondary channel, I often need it to be able to focus on the primary channel--it is just how I operate.

As I reach 20 years on the Internet, I can't help but compare my current reality with my previous lives engaging with technology. I remember first getting satellite access in, I think, 1982 or 1983. It was a big-ass-dish-out-in-the-yard setup, not the little-ass-dish-on-the-roof you have now.

Anyways, I've spent the last 30 years flipping through the channels of my television, ham radio, dial-up Internet, DSL, broadband, and cellular connections. Which channel do I find what I’m looking for on? Which frequency actually reaches me? Which channel is the most relevant for the time period? There are a lot of questions, and only static answers—I demand dynamic. No real-time. No predictive. D) All The Above.

I monitor thousands of frequencies across about 10 channels, each day. I pay attention to RSS, Twitter, Github, Email, and other lesser traveled channels, looking for the signals that matter. The more I experience, the stronger the patterns I look for come into focus. Some things catch my eye, other things do not.

I’m difficult to reach, unless you know the channels. My daughter knows how to ping me on multiple channels, and various frequencies, and my mother does not. I try to manage the balance across the channels, and frequencies, but if I can't find the right modulation—it ends up being random.

How will we ever find harmony in the channels through which we receive our information? How will we ever know the proper channels to reach those who matter? Our reality is physical, but our minds are moving digital, and we can't keep up. How do we identify the right algorithm for flipping through the channels each day, and maintain the sanity needed to keep moving forward?


Wednesday, November 26, 2014

Reclaim Your Domain - GoDaddy Time!!

I cannot think of a better example of my RECLAIM YOUR DOMAIN work than migrating the last domains from my GoDaddy account. I’ll try not to bore you to death with the GoDaddy story, but it is the single largest point of joy, and suffering, in the development of my digital self that I can think of.

I have been a GoDaddy customer for over ten years. Around 2000 I discovered the registrar, which offered significantly cheaper domain registration, giving me a domain for about $10 a year, where previously I was paying $35 a year, per domain. At one point, I had over 100 domains registered there for my own projects, and my business.

Over the last 10 years, GoDaddy managed to get "spammier" in their approach to doing business, constantly changing their interface around the single feature I was using the platform for—domain registration. I didn’t need your other shit, I was only there for your cheap domain registration, and DNS services--I could do the rest myself.

Along the way, it became clear GoDaddy was more than just spammy, they were also pretty sexist and racist. GoDaddy represents what is wrong with the World Wide Web, demonstrating a dominant vision of the WWW that is the equivalent of Times Square advertising in New York. Nobody should have to endure GoDaddy just to be able to own a website.

I moved most of my domains off of GoDaddy a couple years ago, but I’ve hesitated on the more complex migrations of the last few. This holiday season I’m thankful for the ability to reclaim my domain(s) from GoDaddy, migrate them to a new registrar, and set up CloudFlare to help me manage, and defend, my domains.

I’m not against making money on the Internet, but please, if you do business on the web, offer more value than GoDaddy does, and show some class when it comes to selling your products; don’t just vomit in my face with your website, and your services. I’m a big supporter of the Internet, and the World Wide Web, but I also believe we can do better, which is why I finished reclaiming my domain(s) from GoDaddy today. #GoodBye


Sunday, November 16, 2014

Public Front-end And Private Back-end For My Sites On Github

I moved my public presence onto Github almost two years ago, running API Evangelist as 90+ Github repositories, driving each portion of my presence using Github Pages, HTML, CSS, JavaScript, and JSON. Each of my research sites runs as a public website using Github Pages, in the gh-pages branch of its repository.

Over the last year I've been playing with how to use the master branch for some projects as private content and data storage. Github repositories are cool like that: with a private repository, the gh-pages branch still gets published as a public website, while the master branch stays private. If I don't want some content, JSON data, or files to be public, I just put them in the master branch for the research project.

All of this is possible because of Github, the social coding platform. Github provides a cloud platform on top of the open source version control system Git. Using Github I can create repositories that act as storage solutions for each of my research projects. I count 78 repositories currently making up the API Evangelist network, but I see over 100 repositories in my Github profile. Most of my repositories are publicly available, which is free with Github, but a growing number are private, and my monthly bill for Github has increased each year.

Github Pages
Github provides a simple service called Github Pages, which transforms Github repositories from code management folders into public websites. With a three step wizard, any Github repository can have a new Github Pages (gh-pages) branch, immediately giving each repository a public face. Each Github Pages site is extremely simple, only running HTML, Markdown, CSS, and JavaScript, but that's ok, it is just the way I like it.

Jekyll is a simple, blog-aware, static site generator perfect for personal, project, or organization sites. Jekyll gives you the ability to manage the look and feel of your site, along with simple management of pages, and a blog. The rest is up to you. There are plenty of folks who are pushing the boundaries of what Jekyll can do, comparable to the evolution of the WordPress platform, but in a much more static way--Jekyll is the Hollywood front for all of my public websites.

Github API
Along with Git repositories, Github Pages, and the other social features, Github provides an API which gives you access to users, repositories, files, and pretty much every other aspect of the Github platform. This means that each one of my research projects, and their public websites, has an API. I use the Github API to publish all of the blog posts, pages, and JSON data files that support my research, across 78+ projects.

OAuth.io
OAuth.io is the fastest, and easiest way to integrate the oAuth of popular platforms into your apps. This is no joke, with just a few lines of code, you can put the oAuth of Facebook, Twitter, Github, and other platforms to use on any website. I use OAuth.io for putting Github to work as the identity and access layer of my network. After you setup your OAuth.io account, setup an OAuth.io and Github application, and connect them together, all I do is grab my app key, and publish this JavaScript:
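A minimal sketch of that JavaScript, assuming the OAuth.io browser SDK is loaded on the page (the app key is a placeholder, and the function name is mine, not my actual production code):

```javascript
// Register the OAuth.io public app key, then pop open the Github
// authorization window; OAuth.io hands back an access token on approval.
function connectWithGithub(appKey) {
  OAuth.initialize(appKey);
  OAuth.popup('github', function (err, result) {
    if (err) {
      console.log('Github authorization failed: ' + err);
      return;
    }
    // Reload the page with the token in the query string, so the
    // site's header handler can pick it up on every page.
    window.location = '?oAuth_Token=' + result.access_token;
  });
}
```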

In the header of my websites, I add a little handler that keeps an eye out for the incoming oAuth_Token; OAuth.io and Github handle everything else.

If the oAuth_Token is available, using JavaScript I rewrite any navigation, and other key links, appending oAuth_Token to the link, so that it will travel with the user from page to page, alerting the website that the user is authenticated.
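That link rewriting boils down to two small helpers, sketched here with hypothetical names (my actual pages loop over the DOM's anchor tags and run each href through something like appendToken):

```javascript
// Pull the oAuth_Token out of the query string, if it is there.
function getTokenFromQuery(queryString) {
  var match = /[?&]oAuth_Token=([^&]+)/.exec(queryString);
  return match ? match[1] : null;
}

// Append the token to a link, so it travels with the user to the next page.
function appendToken(href, token) {
  var separator = href.indexOf('?') === -1 ? '?' : '&';
  return href + separator + 'oAuth_Token=' + token;
}
```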

Next I use Github.js, a JavaScript wrapper for the Github API. Using the oAuth_Token I get from OAuth.io, I can now make actual API calls with the Github.js wrapper against the private master branch of this repository, that is, if you have access. There is one more layer of security before you can actually get access to anything.

Using Github.js I try to grab the contents of a file in the private master branch, and if the authenticated Github user is added as a collaborator on the repository, the call is successful; if not, I deny the user access to any content, data, or files in the project's master repo. This is the line between public and private, or front-end and back-end, on my websites. Using Github identity as authentication to my private world, I am able to control who has access, using standard Github organization and user management.
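That gatekeeping step can be sketched like this (the repo setup is shown in comments, the file path is just an example, and the exact Github.js calls may differ slightly from what runs on my sites):

```javascript
// Typical setup with Github.js, using the token from OAuth.io:
//   var github = new Github({ token: oAuthToken, auth: 'oauth' });
//   var repo = github.getRepo('some-user', 'some-research-project');
// repo.read() only succeeds if the authenticated user is a collaborator
// on the repository; the error is the signal to fall back to the
// default public view.
function loadPrivateContent(repo, path, onResult) {
  repo.read('master', path, function (err, data) {
    if (err) {
      onResult(null); // not a collaborator, deny access
    } else {
      onResult(JSON.parse(data)); // collaborator, show the private JSON
    }
  });
}
```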

Currently I’m using this layer to decide whether or not to access and display data from JSON files, and to control access to PDF files located in the master branch. You can see this in action via my messaging research into email, SMS, or MMS. If you are listed on each of the project's repos as a collaborator, and logged in via Github, you can see the building blocks for each research area, and the current PDF version of the research report--otherwise you just get the default information.

As with all of my work, this is a work in progress. This step in my evolution is all about using Github as a paywall for my research, where other tools I’ve built, like my CSV Converter and Data On Github applications, are using the master branch as an application store; even if you aren't logged in the app will still run, you just can’t save anything. Next I’m going to play with this approach as part of my API broker research, exploring different ways of aggregating APIs, and managing key access using the private master branch of each repository.


Monday, November 3, 2014

Voting With My User Account: Shutting Down My Quora Profile

In the digital age, being a user of any service is an endorsement, a vote on whether an online service is good or bad. Just as we do in the real world, shopping at stores that reflect our views, we need to make sure to do the same online.

Each time I am frustrated with a service because of their disregard for privacy or security, their heavy-handed terms of service, their exploitation of user data, or D) all the above, I’m going to blog about it. If, after blogging about it, things do not change over time, I will weigh the pros and cons of the service, and vote with my user account.

Up this week is Quora. I wrote about Quora’s lack of an API a while back, and have ranted on Twitter several times, and after reading my friend Eric Mill’s (@konklone) post "Quora Keeps the World's Knowledge For Itself", I've decided to shutter my Quora account once and for all. I like the idea of Quora, but without an API, data portability, and a willingness to share the knowledge it is harvesting, I just can't support such a service.

I feel that even having my name attached to the service in the form of a dormant account is unacceptable, let alone actually publishing any of my ideas and thoughts there. I’d much rather publish my questions and answers to my own blog, and to services I use that give me access to, and ownership of, my own intellectual exhaust. If Quora ever changes their ways I may consider coming back, but as of today I’ve deleted my account.


Saturday, October 25, 2014

Cutting Back On My Traveling And Speaking

I’m going to be cutting back on traveling and speaking in the future, and if you are one of the folks who got an email from me, declining an engagement--I’m sorry. I traveled a crazy amount in September, and put on @APIStrat in Chicago at the end of the month. I got sick during the downtime afterwards, and I just went to Stockholm to speak, where I got sick again—only just fully recovering as of yesterday.

I feel like my most important work, which is my research, and secondarily my storytelling, suffers greatly when I travel, and takes an even greater hit when I’m sick. When it comes to making an impact, and reaching the widest possible audience, I’d say my writing is the most important. Only a handful of people come up to me and say “great talk at that API event”, where I get a steady flow of people who thank me for my writing, and tell me about the impact it has made on them.

With this in mind, I'm going to dramatically cut back on accepting speaking engagements, and attending conferences. There are a handful of events I’m addicted to like Gluecon and Defrag that I don't think I will ever stop going to, and of course my own conference API Strategy & Practice will continue, but beyond that I’m going to need a damn good reason to go to an event.

I’m sorry for being kind of a jerk on this front, but the API space is moving really quickly, and I feel like I can’t afford to miss a beat when it comes to my monitoring, research, coding, and writing. The ROI on traveling and speaking just isn't there like it is with my writing—which is the most nourishing thing I have when it comes to keeping this crazy train moving forward.


Clarification On The Cease And Desist I Got From @Pluralsight

I wanted to clarify a tweet from yesterday, where I said that "I just got a cease and desist to take down a showcase I did on @pluralsight API training demos. Won't make that mistake again"—if you aren't familiar with them, Pluralsight is “the largest online #ITAdmin, #Developer & creative training library on the planet". Tweets are so bad for getting all the details across, and I wanted to add that I actually agree with the request Pluralsight was making, just not their approach.

What happened was, a link had come across my monitoring, about a Pluralsight training video on APIs, which I curated, and added the link to my API Design research. The problem was that the link was to a 3rd party site, not actually to Pluralsight. The cease & desist email I got was asking me to remove the link to the 3rd party website—which makes sense, and I’m happy to oblige.

What I do not agree with is the approach by Pluralsight in using cease and desist to get me to address the problem. It is a particularly shitty first impression to leave on someone, especially someone who runs one of the top sites for learning about a fast growing space for developers—APIs. As you can tell by this post, I will not ever link to Pluralsight again, and beyond this post you won't catch me talking about them again in stories, white papers, or at my conferences, meetups, and workshops.

Pluralsight is just using an anti-piracy service (which I won't showcase), and I think it is a safe bet to say they aren't really aware of what is going on. I’m sure they have a pretty bad piracy problem, but I’m thinking a blind cease & desist campaign might not be the best approach. Maybe a first email saying, “hey! you have a link we don't like, would you consider swapping it out with a valid link?”, would work better.

Anyhoo, we'll file this post under “rant”. I hope your anti-marketing strategy works out for you Pluralsight. I’m really happy to be in a space where I generate content that I can encourage the widest possible distribution, and not have to police websites like Pluralsight does.