Monday, August 25, 2014

Check Out This Lineup Of Speakers At Defrag, Including Myself

Check out this lineup of the first round of speakers at Defrag this year! I recently got an email from the event queen Kimberly Norlin, and I'm pretty stoked about some of the people I will be sharing the stage with in Colorado this year.

The Norlins rock when it comes to bringing together some of the smartest folks in the space to Broomfield, CO each year.

My presentation will be about containers and APIs, borrowing from John Sheehan, when he said "containers will do for APIs what APIs do for companies". It is a pretty prescient comment, but not surprising from someone like John, who is on the leading edge with his API integration startup Runscope.

Make sure to get registered for Defrag. If you aren't familiar with what Defrag is all about, read one of my earlier posts about the Defrag experience--in short, Defrag is where all the tech leaders come together in Colorado once a year, to discuss what is going on in the space, and drink a lot of beer!



from http://ift.tt/1vJ4o53

6,482 Datasets Available Across 22 Federal Agencies In Data.json Files

It has been a few months since I ran any of my federal government data.json harvesting, so I picked my work back up, and will be doing more around the datasets that federal agencies have been making available, and telling the stories across my network.

I'm still surprised at how many people are unaware that 22 of the top federal agencies have data inventories of their public data assets, available in the root of their domain as a data.json file. This means you can go to many agency websites (http://ift.tt/1zweO8m) and find a machine readable list of that agency's current inventory of public datasets.
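If you want to poke at one of these inventories yourself, here is a minimal sketch of how you might pull one down and summarize it, assuming the agency still serves data.json at the root of its domain (NASA is used purely as an illustration):

```python
# A sketch of pulling a federal agency data.json inventory.
# NASA's domain is illustrative; any of the 22 agencies listed
# below should work, assuming the file is still being served.
import json
import urllib.request

def fetch_inventory(domain):
    url = "https://%s/data.json" % domain
    with urllib.request.urlopen(url) as response:
        return json.load(response)

inventory = fetch_inventory("www.nasa.gov")

# Early data.json files were a plain JSON array of dataset objects;
# later versions wrap the array in a "dataset" key, so handle both.
if isinstance(inventory, dict):
    datasets = inventory.get("dataset", [])
else:
    datasets = inventory

print("%d datasets" % len(datasets))
for item in datasets[:5]:
    print("-", item.get("title", "(untitled)"))
```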

I currently know of 22 federal agencies who have published data.json files:

Consumer Financial Protection Bureau
Department of Agriculture (USDA)
Department of Defense (DOD)
Department of Energy (DOE)
Department of Justice (DOJ)
Department of State
Department of the Treasury
Department of Transportation (DOT)
Department of Veterans Affairs (VA)
Environmental Protection Agency (EPA)
General Services Administration (GSA)
Institute of Museum and Library Services (IMLS)
Millennium Challenge Corporation (MCC)
National Aeronautics and Space Administration (NASA)
National Archives and Records Administration (NARA)
National Institute of Standards and Technology (NIST)
National Science Foundation (NSF)
National Transportation Safety Board (NTSB)
Nuclear Regulatory Commission (NRC)
Office of Personnel Management (OPM)
Social Security Administration (SSA)
United States Agency for International Development (USAID)

You can click on the logo or name, and view the full data.json files. You can also visit my Federal Agency Dataset Adoption work to see all of the datasets listed for each agency. There is still one bug I've noticed in the adoption process, so don't adopt anything quite yet.

The goal of this is just to highlight, again, that there is a wealth of open data resources just waiting for all of us open gov hackers to take advantage of, and work to make sense of. Federal agencies need our help, so get involved, there is a lot of work to be done.



from http://ift.tt/1lt8jma

Wednesday, August 20, 2014

My Information Consumption Now Driven By Companies And People, Not Just Feeds

While I still worry about the health of RSS, I feel like my overall information consumption has significantly evolved in some very meaningful ways since Google decided to shut down Google Reader. A year before the shutdown of Google Reader, I had developed my own curation and monitoring system, which included the ability to pull RSS feeds, so I was already decommissioning Google Reader before it was officially abandoned.

In my monitoring system I can add three types of entries:

  • Entities - Businesses, organizations, government agencies, or any other non-human entity.
  • Individuals - Basically my individual CRM system for everyone I know, and don’t know.
  • Feeds - Other generic feeds from some forums, streams, etc.

For each of these entries I pull the following streams of information:

  • Blog - The RSS streams from company and individual blogs, providing a pretty key signal about companies, and individuals doing cool stuff with APIs.
  • Twitter - The tweets, DMs, and links produced as part of the social exhaust a company or individual generates via Twitter.
  • Github - The repositories, commits, and interactions around public repositories maintained by individuals and companies.
  • Email - I’m still working on the best way to process email communications with companies and individuals, while also considering parsing email newsletters to balance out the shift from RSS to email blasts by startups.

If a company or individual has a blog, Twitter or Github account I pull any publicly available signals. I use these signals to stay informed using the blog posts, valuable conversations and links available on Twitter, and keep up to date with the latest API tech and code being developed on Github.
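To make this a bit more concrete, here is an illustrative sketch of how one of these entries might be modeled. The field names, and the Runscope handles and URL, are hypothetical examples, not the actual schema of my system:

```python
# An illustrative model of a single monitoring-system entry.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Entry:
    name: str
    kind: str                          # "entity", "individual", or "feed"
    blog_feed: Optional[str] = None    # RSS/Atom URL for their blog
    twitter: Optional[str] = None      # Twitter handle
    github: Optional[str] = None       # Github username or organization
    other_feeds: List[str] = field(default_factory=list)

    def signals(self):
        """List the publicly available streams to pull for this entry."""
        streams = []
        if self.blog_feed:
            streams.append(("blog", self.blog_feed))
        if self.twitter:
            streams.append(("twitter", self.twitter))
        if self.github:
            streams.append(("github", self.github))
        streams.extend(("feed", url) for url in self.other_feeds)
        return streams

# A hypothetical entity entry (URL and handles for illustration only).
runscope = Entry(name="Runscope", kind="entity",
                 blog_feed="http://blog.runscope.com/rss",
                 twitter="Runscope", github="Runscope")
print(runscope.signals())
```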

When I’m introduced to a new company or individual, I make sure they are entered into my system, giving me a more complete perspective across their world via their blog, Twitter, and Github accounts. I’ve realized that this approach has evolved my information consumption from being about feeds, to more about companies, the individuals who work at them, and the technology that is being developed.

I feel like this evolution is giving me a much clearer picture of the API space, beyond what RSS alone could do.



from http://ift.tt/1uYZ3sP

Sorry Google, Your Programming Test Is Not A Valid Measurement Of My Skills

I’ve been talking with a very nice recruiter over at Google over the last couple weeks, and she has been so kind in keeping me updated about opportunities for evangelism at Google. This is the 3rd round of talks I've had with Google while being the API Evangelist, talks that historically go nowhere because of their programming test, which is a super silly aspect of their HR process.

I was straight up with the Google recruiter a couple of weeks ago when she first emailed me, and again when we talked on the phone last week—I do not take programming tests to open up doors for employment conversations, sorry. ;-( It is a waste of my time, and yours, and doesn’t measure shit. I understand that you have to qualify large numbers of folks, at your very algorithmic-centric company, but when it comes to measuring what I do, a programming test isn’t a thing.

If programming a tic-tac-toe game on a live screen share is what you need to open up a conversation with professionals around evangelizing your platform, you need to look elsewhere. Nowhere in my role as the API Evangelist do I have to code under time pressure with someone else watching, sorry. I would even say that having hacker skills trumps programming skills in a public facing evangelism role, where speed and quality of code go out the window. This is about making connections through hacker storytelling, something that doesn't always pencil out to producing the best code, and is more about helping people understand what is possible using a platform, in the most meaningful way—requiring more focus on the person and their problems, not the code or algorithm.

I’ve managed to have many very meaningful conversations with other tech giants like Intel and IBM, large institutions like UC Berkeley and BYU, and establish fruitful relationships with partners like 3Scale and API Spark, and across federal agencies like the Department of Education, Energy, NASA, and the White House around APIs--all without taking programming tests. I talk to startups, SMBs, SMEs, organizations, institutions, and government agencies all the time, and never have to code under pressure, in front of an audience.

I’m not under the illusion that I will change your hiring practices Google—you are a very smart, and successful company. All I’m saying is you are probably filtering out some pretty talented folks, who are extremely passionate and knowledgeable in what they do, and connected in their space, and when you won’t engage in meaningful conversations without a programming test, you're missing out.

I actually prefer working with organizations from the outside in, I think it better reflects the essence of API Evangelism. The companies who have trouble working with outside entities, without traditional HR processes, are probably not going to lead when it comes to developing an API driven ecosystem.

If your company doesn't have the time to research me, and understand what I bring to the API space, and what my skills are, we probably aren't a fit. Everything about me is available online at API Evangelist, Kin Lane, Twitter, and Github--you just have to look. If you are only looking at resumes, and making people take tests, you will probably get what you are looking for!



from http://ift.tt/1tqX7pb

Monday, August 18, 2014

All Governments Should Have A Social Media Directory API

I was just looking for a list of Twitter accounts for the City of Chicago, and I came across the standard social media directory page you will find at most city, county, and state government websites.

After looking at the list, I had a decision to make. I could either manually enter each of the Twitter accounts into my CRM, or I could write a script to scrape the page, harvest the content, and put it into my CRM for me--I prefer to write scripts over data entry any day.
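A minimal sketch of what that kind of scraper might look like, ending by dumping the accounts as JSON. The URL, and the assumption that accounts appear as twitter.com links on the page, are hypothetical, since every city publishes this page differently:

```python
# A hypothetical scraper for a government social media directory page.
import json
import urllib.request
from html.parser import HTMLParser

class TwitterLinkParser(HTMLParser):
    """Collect any anchor tags that point at a Twitter account."""
    def __init__(self):
        super().__init__()
        self.accounts = []

    def handle_starttag(self, tag, attrs):
        href = dict(attrs).get("href") or ""
        if tag == "a" and "twitter.com/" in href:
            handle = href.rstrip("/").rsplit("/", 1)[-1]
            self.accounts.append({"network": "twitter",
                                  "handle": handle,
                                  "url": href})

url = "https://www.example.gov/social-media"  # hypothetical directory page
with urllib.request.urlopen(url) as response:
    html = response.read().decode("utf-8", errors="replace")

parser = TwitterLinkParser()
parser.feed(html)
print(json.dumps(parser.accounts, indent=2))
```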

Here is the resulting JSON:

It got me thinking that every government entity, whether city, county, or state, should have a social media directory API like you find for the federal government. We should never have to scrape a list of our government social media accounts.

Once there is an API for each government entity, we can then drive existing websites, and other current locations with the API, as well as use it in any new web or mobile apps.



from http://ift.tt/VBg1yd

Wednesday, August 13, 2014

Never Looking Out The Window, Let Alone Trusting Anyone External To The Department of Veterans Affairs

I haven't written much about my experience last summer as a Presidential Innovation Fellow (PIF) at the Department of Veterans Affairs (VA). I have lots of thoughts about my experience at the VA, as well as participating in the PIF program, and I choose to trickle these thoughts out, as I continue to make sense of them, and bring them into alignment with my overall mission as the API Evangelist.

I was given three projects when I started work at the VA: 1) inventory data assets, 2) inventory web services, and 3) move forward D2D, a forms web service that would allow VA hospitals and Veteran Service Organizations (VSOs) to submit forms through the claims process on behalf of veterans.

The most prevalent illness I witnessed across these three efforts was an unwillingness to trust outside groups (even VSOs and hospitals), and a lack of desire to share data and resources with anyone outside of the VA (ironically, except contractors), to the point where groups seemed to take defensive positions around what they did on behalf of our veterans. This culture makes for some pretty toxic environments, which I personally feel contributes to many of the problems we’ve seen bubble up into the public media space of late.

While working at the VA you constantly hear about the VA claims backlog, and how we need to optimize, but when you bring up sharing data or resources with other federal agencies, or trusted external partners like hospitals and VSOs, you get pushback with concerns about security, personally identifiable information (PII), etc. All of these are valid concerns, but there are proven ways to mitigate these risks through Identity and Access Management (IAM), which is another whole post in itself. You start feeling crazy when you get pushback for arguing that a doctor should be able to submit disability questionnaires via an iPad application, that uses an existing VA API, in a way that securely authenticates the doctor.

As a result of other systemic and cultural issues, and mistakes made in the past, VA employees and contractors are extremely averse to opening up to the outside world, even if it can help. I kept hearing references to the 2006 data breach, where an employee brought a laptop home, affecting 26M individuals, as a reason to keep systems locked down. This horror story, plus a variety of other cultural issues, is keeping VA staff from accepting any new way of thinking, even if it could help reduce their workload, improve the claims process, and better serve veterans and their families.

This is a pretty fundamental flaw in how large government agencies operate, one that is in conflict with the solutions APIs can bring to the table. I don’t give a shit how well designed your API is, in this environment you will fail. Period. I do not think I will ever fully understand what I saw at the VA, while a PIF in Washington DC, but I feel like I’m finally reaching a point where I can at least talk about things publicly, put my thoughts out there, and begin to weave my experiences as a PIF at the VA into my overall API Evangelist message.



from http://ift.tt/VlnQIc

The Color Of Money When Deploying APIs At The Department Of Veterans Affairs

I haven’t written much about my experience last summer as a Presidential Innovation Fellow (PIF) at the Department of Veterans Affairs (VA). I have lots of thoughts about my experience at the VA, as well as participating in the PIF program, and I choose to trickle these thoughts out, as I continue to make sense of them, and bring them into alignment with my overall mission as the API Evangelist.

I just wrote a piece on replacing legacy systems at the VA using APIs, where one of the systemic constraints that restricts the modernization of VA systems using APIs is purely about money, and more specifically the color of money. I won’t bore you with the details of the topic, but in short the color of money is: money appropriated for one purpose cannot be used for a different purpose, according to the Purpose Act (31 U.S.C. § 1301).

In short, if $50M was given to sustain an existing legacy system, and that money cannot be re-appropriated and applied to a newer system, what incentive is there to ever get rid of legacy VA systems, or modernize any government system for that matter? Whether it is using APIs, or anything else. Newer approaches to using technology are difficult to accept when you are working hard to accomplish your job each day, but if you already have $50M in budget for a specific job, and that job won’t go away unless you choose to make it go away, guess what happens? Nothing changes…hmmm?

As I said before, I don’t give a shit if you deploy APIs from the ground up, or excite via presidential mandates from the top down, if you have incentives in place for employees to do the opposite, based upon how money is allocated, you won’t be changing any behavior or culture—you are wasting your energy. I don’t care how excited I get any one individual, team, or department about the potential of APIs bundled with new systems, if it means their job is going away—too bad, nobody will give a shit.

Think about this scenario, then consider that $1,810M of the $3,323M overall VA budget (54%) is sustainment. Granted, this isn't all IT system sustainment, but still over half of the budget is allocated to keep shit the same as it is. Imagine what environment this creates for the acceptance of modernization efforts at the VA.

This is a pretty fundamental flaw in how large government agencies operate, one that is in conflict with the solutions APIs can bring to the table. I don’t give a shit how well designed your API is, in this environment you will fail. Period. I do not think I will ever fully understand what I saw at the VA, while a PIF in Washington DC, but I feel like I’m finally reaching a point where I can at least talk about things publicly, put my thoughts out there, and begin to weave my experiences as a PIF at the VA into my overall API Evangelist message.



from http://ift.tt/1yy49tm

Taking Web Service Inventory At The Department of Veterans Affairs

I haven't written much about my experience last summer as a Presidential Innovation Fellow (PIF) at the Department of Veterans Affairs (VA). I have lots of thoughts about my experience at the VA, as well as participating in the PIF program, and I choose to trickle these thoughts out, as I continue to make sense of them, and bring them into alignment with my overall mission as the API Evangelist.

One of the jobs I was tasked with at the VA as a PIF was taking inventory of the web services within the agency. When asking folks where these web services were, I was directed to various IT leads in different groups, each giving me one or two more locations I could look, in Word, Excel, or PDF documents talking about web services used in projects and known systems. Most of the time these were redundant lists, pointing me to the same 5 web services, and omitting the 20-30 that were actually in use for a project.

At one point I was given the contact information for a lady who had been working for two years on a centralized web service registry project, which would be the holy grail of web service discovery at the VA. This was it! It was what I was looking for, until I sat in on the weekly call where this project got a 10 minute update, demonstrating that the registry was still about defining the how and what of the registry, and had never actually moved to cataloging actual web services in the wild at the VA. ;-(

Then one day I was introduced to a gentleman in a back office, in an unmarked cubicle, who seemed to know where most of the web services were. The one difference with this person was that he was a contractor, and not an employee. One thing you hear about, but do not experience fully until you work in government, is the line between government employee and contractor—in meetings and conversations you know who is who (it is pretty clear), but when it comes to finding APIs, I’m sorry, the contractors know where everything is at. This contractor had some pretty interesting lists of what web services were in operation, where they were, and which groups at the VA owned them, including up to date contact info. These contractors also had their finger on the pulse of any project that was potentially moving the web services conversation forward, including the global registry.

Overall I was surprised at how IT groups knew of their own web services, but couldn't care less about the web services of other groups, while contractors knew where all the web services were across the groups. I was closing in on 500 web services on my list before I left during the shutdown, and I wonder how many more I would have found if I had kept up the good fight. This mission had nothing to do with APIs, except that web services are often compared to APIs. I was purely taking inventory of what was already in place, a process that went far beyond just technical inventory, and shed light on some serious business and political flaws within operations at the VA.

This is a pretty fundamental flaw in how large government agencies operate, one that is in conflict with the solutions APIs can bring to the table. I don’t give a shit how well designed your API is, in this environment you will fail. Period. I do not think I will ever fully understand what I saw at the VA, while a PIF in Washington DC, but I feel like I’m finally reaching a point where I can at least talk about things publicly, put my thoughts out there, and begin to weave my experiences as a PIF at the VA into my overall API Evangelist message.



from http://ift.tt/1yy45tC

Replacing Legacy Systems With APIs At The Department Of Veterans Affairs

I haven't written much about my experience last summer as a Presidential Innovation Fellow (PIF) at the Department of Veterans Affairs (VA). I have lots of thoughts about my experience at the VA, as well as participating in the PIF program, and I choose to trickle these thoughts out, as I continue to make sense of them, slowly bringing them into alignment with my overall mission as the API Evangelist.

On deck are my thoughts on replacing legacy systems with APIs at the Department of Veterans Affairs. In the “real world”, one of the motivations for deploying APIs is to assist in the evolution, and replacement, of legacy systems. The theory is, you have an older system that needs to be replaced, so you wrap it in a modern web API, and slowly switch any desktop, web, mobile, or other client system over to use the new API—then you build out a newer backend system, and make the switch in the API layer from the legacy to the newer backend, leaving everything operating as expected. API magic!

I'm used to environments that are hostile to this way of thinking, but most times in the private sector there are other business objectives that can be leveraged to get legacy system owners on board with a shift towards API deployment—I saw no incentive for this practice in the VA environment, where in reality there are incentives for IT and business owners, as well as 3rd party contractors, to keep legacy systems in place, not replace them. There are a variety of motivations for existing VA workers to keep existing systems in place, ranging from not understanding how the system works, to budgetary restrictions on how money flows in support of this pro-sustainment culture.

Here is an example. There is an old database for storing a specific type of document, a database that X amount of existing desktop, server, web, or even mobile systems depend on. If I move in and create an API that allows for reading and writing of data into this database, then work with all X of the legacy systems to use the API instead of a direct database connection—in theory I can now work to dismantle the legacy database, and replace it with a newer, modern backend database. In most IT operations, this approach will then allow me to replace, modernize, and evolve upon an existing legacy system. This is a common view of technologists who are purely looking through a technical lens, and ignoring the existing business and political constraints that exist in some companies, organizations, institutions, and government agencies.
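To make the pattern concrete, here is a minimal sketch of that facade idea: clients talk to one stable API, and the backend behind it gets swapped from the legacy database to a modern one without the clients noticing. All of the names here are illustrative, not anything that existed at the VA:

```python
# An illustrative sketch of wrapping a legacy system in an API facade.
from abc import ABC, abstractmethod

class DocumentStore(ABC):
    """The contract the API exposes, whatever backend sits behind it."""
    @abstractmethod
    def read(self, doc_id): ...
    @abstractmethod
    def write(self, doc_id, contents): ...

class LegacyStore(DocumentStore):
    """Wraps the old database (imagine something COBOL-era here)."""
    def __init__(self):
        self._rows = {}
    def read(self, doc_id):
        return self._rows.get(doc_id)
    def write(self, doc_id, contents):
        self._rows[doc_id] = contents

class ModernStore(DocumentStore):
    """The replacement backend; same contract, different plumbing."""
    def __init__(self):
        self._docs = {}
    def read(self, doc_id):
        return self._docs.get(doc_id)
    def write(self, doc_id, contents):
        self._docs[doc_id] = contents

class DocumentAPI:
    """The stable facade all X client systems get migrated onto."""
    def __init__(self, backend):
        self.backend = backend
    def get(self, doc_id):
        return self.backend.read(doc_id)
    def put(self, doc_id, contents):
        self.backend.write(doc_id, contents)

api = DocumentAPI(LegacyStore())   # step 1: wrap the legacy database
api.put("claim-123", {"status": "open"})

# Step 2: swap the backend in the API layer; clients never notice.
# (A real migration would copy existing records across first.)
api.backend = ModernStore()
api.put("claim-123", {"status": "open"})
print(api.get("claim-123"))
```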

In the real world, you have staff, budgets, workflows, and decision making processes that are already in place. Let’s say this legacy database had $50M a year allocated in budget for its operation, and I replace it with a newer database, plus API, which operates for $10M a year—you’d think I would get to reallocate the staff, budget, and other resources to developing newer mobile apps, and other systems, with my newly liberated $40M. Right? Nope…that money goes away, and those people have no interest in moving from supporting a COBOL system, to supporting a new MongoDB + Node.js API that is driving a Swift iPhone app. ;-(

This is a pretty fundamental flaw in how large companies, organizations, institutions, and government agencies operate, one that is in conflict with what an API philosophy can bring to the table. I don’t give a shit how well designed your API is, in this environment you will fail. Period. I do not think I will ever fully understand what I saw at the VA, while a PIF in Washington DC, but I feel like I’m finally reaching a point where I can at least talk about things publicly, put my thoughts out there, and begin to weave my experiences as a PIF at the VA into my overall API Evangelist message.



from http://ift.tt/1vKk0c8

Friday, August 1, 2014

Please Provide An Easy To Copy Logo And Description Of What You Do

I spend a lot of time looking at the websites of companies who are doing cool things in the space. I track on about 2000 companies in the API sector, and as part of this monitoring I add the company name, logo, brief description and usually their Twitter and Github account to my systems on a regular basis.

Using this information I will publish a company as part of any research I do across multiple API business categories like API design, deployment or management. If a company is doing something interesting, I need to be able to quickly find a good quality logo, and short, concise description of what the company does—something that is easier said than done.

You'd be surprised how hard it is to grab a logo when it's in the CSS, and to find a single description of what the company does—something I usually have to go to Crunchbase or AngelList to find, and often have to write myself.

If you want to help people talk about your company and what you are doing, make it easy for them to find a logo, and description. Please don’t make us click more than once to find this information--trust me, it will go a long way in helping bloggers, and other people showcase what you are up to.



from http://ift.tt/1ncPQUX


Why I Post Stories To My Blog(s) Like I Do

I get a lot of folks who tell me how they love my storytelling across my blog(s), but sometimes they find it hard to keep up with my posting style, emphasizing that on some days I post too much, and they just can't keep up.

Since I just got home from API Craft in Detroit, with a mind full of ideas, and an Evernote full of half-baked stories, I feel a storytelling spree coming on, so I figured I'd kick it off by telling the story of why I blog the way I do.

First, I blog for me. These stories are about me better understanding the complex world of APIs, and the storytelling process forces me to distill my thoughts down into smaller, more understandable chunks.

Second, I do not feel I can move on from an idea until it has been set free—meaning it is published to the site, and tweeted out. Only then can I detach and move on to the next thing on my list. I've tried scheduling, and all of that jive, but they only conflict with my emotional attachment to my stories.

Third, there is an emotional attachment to each one of my stories. This makes my storytelling, about me, not pageviews, SEO, or any other common metric in the blogosphere—my blogging is about me learning, and sharing these ideas openly with the world, everything else is secondary.

After all of that, my blogs are about you the audience, and helping you understand the world of APIs. I’m sorry if my storytelling flow is non-existent some days / weeks, and then overwhelming other days. I leave it up to you to bookmark, and flag for consumption later.

There are some mechanisms built into my network of sites to help you with this process. The blog uses Jekyll, which has a nice next / previous feature on each blog post, so if you visit the latest post, you can just hit previous until your head explodes. (I’ve seen it, it is messy)

Also, all of my curation of stories across the API space, and my analysis, eventually trickles down to my research sites. So anything I read or write about API design will eventually be published to the API Design research site. You can just make regular rounds through my core research to catch up on what I read, think, and publish—I do this regularly myself.

This is just a little insight into my madness, and it is just that—my madness. Welcome to it, and I hope you enjoy.



from http://ift.tt/1xLVpzt