Sunday, August 16, 2015

Legacy Power and Control Contained Within The Acronym

As I wade through government, higher education, and scientific research, exposing valuable data and APIs, the single biggest source of friction I encounter is the acronym. Ironically, this friction is also reflected in the mission of API Evangelist -- helping normal people understand what the hell an Application Programming Interface is. I live in a sort of tech purgatory, and I am well aware of it.

The number one reason acronyms are used, I think, is purely because we are lazy. Secondarily, though, I think there is also a lot of legacy power and control represented in every acronym. These little abbreviated nuggets can be the difference between being in the club or not. You either understand the technology at play, or you don't. You are in the right government circles, or you are not. You are trained in a specific field, or you are not. I don't think people consider what they wield when they use acronyms; there is a lot of baked-in, subconscious behavior going on.

One of the most important aspects of the API journey, in my opinion, is that you begin to unwind a lot of the code (pun intended) that has been laid down over years of IT operations, government policy, and research cycles. When you begin to unwind this, and make resources available via intuitive URL endpoints, you increase the chances a piece of data, content, or other digital resource will get put to use--something not all parties are actually interested in. Historically, IT departments, governments, and researchers have wielded their power and control by locking up valuable resources and playing gatekeeper of who is in and who is out--APIs have the potential to unwind this legacy debt.

APIs do not decode these legacy corporate, government, and institutional pools of power and control by default. You can just as easily pay the legacy forward with an API gateway, or via an API architect who sees no value in getting to know the resources they are putting to work, let alone their consumers. However, with the right approach, APIs can provide a rich toolbox that can assist any company, institution, or government agency in decoding the legacy each has built up.

You can see this play out in the recent EPA, er I mean Environmental Protection Agency work I did. Who would ever know that the EPA CERCLIS API was actually the Comprehensive Environmental Response, Compensation, and Liability Information System API? You don't, unless you are in the club, or you do the heavy lifting (clicking) to discover the fine print. I am not saying the person who named the Resource Conservation and Recovery Act Information API the RCRAInfo service was being malicious--this type of unconscious behavior occurs all the time.

Ultimately, I do not think there is a single solution for this. Acronyms do provide us with a lot of benefit when it comes to making language and communication more efficient. However, just as we are seeing play out with algorithms, I think we need to be more mindful of the legacy we are paying forward when we use acronyms, and make sure we are as transparent as possible by providing dictionaries, glossaries, and other tooling.
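To make the glossary idea a little more concrete, here is a minimal sketch of what a machine-readable acronym glossary could look like, published alongside an API's documentation. The filename and structure are my own assumptions, using the EPA acronyms mentioned above as entries:

```python
import json

# A minimal sketch of a machine-readable acronym glossary that an API
# provider could publish alongside their documentation. The filename and
# structure here are assumptions for illustration, not any official format.
glossary = {
    "API": "Application Programming Interface",
    "CERCLIS": "Comprehensive Environmental Response, Compensation, and Liability Information System",
    "RCRAInfo": "Resource Conservation and Recovery Act Information",
}

# Publish the glossary as JSON so both humans and applications can use it.
with open("glossary.json", "w") as glossary_file:
    json.dump(glossary, glossary_file, indent=2)
```

Even something this simple gives consumers a fighting chance at decoding an endpoint name without having to be in the club.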

At the very least, before you use an acronym, make sure your audience will not have to work extra hard to get up to speed, and do the heavy lifting required to reach as wide an audience as you possibly can. It is the API way. ;-)




Saturday, August 15, 2015

Asking For Help When I Needed To Better Understand The Accounting For The US Federal Budget

As I was working my way through the data for the US federal budget, I noticed a special row between the years 1976 and 1977. It simply had the entry TQ, with no other information about what it was.

To get an answer regarding what this entry was, I went to my Twitter followers.

Then, because I have the most amazing Twitter followers ever, I got a response from Stephen H. Holden (@SteveHHolden) clearing it up: TQ refers to the transition quarter, the three-month period in 1976 when the start of the federal fiscal year was moved from July 1 to October 1.

When doing any open data work, you can't be afraid to just ask for help when you hit a wall. I've been doing data work for 25 years, and I constantly hit walls when it comes to formatting, metadata, and the data itself.

The moral of this story is to use your Twitter followers, use your Facebook and LinkedIn networks, and make sure to publish questions as Github issues--then always tell the story!




Friday, August 14, 2015

Stepping Up My Open Data Work With Adopta.Agency, Thanks To Knight Foundation, @3Scale, and @APISpark

I always have numerous side projects cooking, and occasionally I will submit these projects for potential grant funding. One of these projects, which I called Federal Agency Dataset Adoption, was awarded a prototype grant from the Knight Foundation. It was the perfect time to get funding for my open data work, because it coincided with the Summer of APIs work I'm doing with Restlet, and with work already in progress defining government open data and APIs with 3Scale.

After reviewing my Federal Agency Dataset Adoption work, I purchased a domain and quickly got to work on my first two prototype projects. I'm calling the prototype Adopta.Agency, and kicking it off with two projects that reflect my passion for the work.

US Federal Budget
This is a project to make the US federal budget more machine readable, in hopes of building more meaningful tools on top of it. You can already access the historical budget via spreadsheets, but this project works to make sure everything is available as CSV and JSON, as well as through an active API.

VA Data Portal
This project looks to move the conversation around VA data forward, making it more accessible as CSV and JSON files, and deploying simple APIs when I have the time. The VA needs help making sure all of its vital assets are machine readable by default.

The first month will be focused on defining the Adopta Blueprint, by tackling projects that my partner in crime Audrey Watters (@audreywatters) and I feel are important, and that set the right tone for the movement. Once the blueprint is stable, we will be inviting other people into the mix to tackle some new projects.

Adopta.Agency is not a new technology or a new platform; it is an open blueprint that employs existing services like Github, and tools like CSV to JSON converters, to help move government open data forward just one or two steps. The government is working hard, as we speak, to open up data, but agencies don't always have the skills and resources to make sure these valuable public assets are ready for use in websites, applications, analysis, and visualizations--this is where we come in!
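To show how light this kind of tooling can be, here is a minimal sketch of the CSV to JSON conversion step, assuming a hypothetical budget.csv file with a header row -- the filenames are just placeholders:

```python
import csv
import json

# A minimal sketch of a CSV to JSON conversion step. The filenames
# (budget.csv, budget.json) are placeholders, not actual project files.
with open("budget.csv", newline="") as csv_file:
    rows = list(csv.DictReader(csv_file))

# Each CSV row becomes a JSON object keyed by the spreadsheet's column headers.
with open("budget.json", "w") as json_file:
    json.dump(rows, json_file, indent=2)

print(f"Converted {len(rows)} rows into budget.json")
```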

With Adopta.Agency, we are looking to define a Github-enabled, open data and API-fueled, human-driven network that helps us truly realize the potential of open data and APIs in government -- please join in today.




Being The Change We Want To See In Open Government Data With Adopta.Agency

I have had a passion for open data for a number of years. Each time the federal budget has come out over the last 10 years, I have parsed the PDFs and generated XML, and more recently JSON, to help me better understand how our government works. I've worked hard to support open data and APIs in the federal government since 2012, which resulted in me heading to Washington, DC to work on open data projects at the Department of Veterans Affairs (VA) as a Presidential Innovation Fellow (PIF).

I understand how hard it is to do open data and APIs in government, and I am a big supporter of those in government who are working to open up anything. I also feel there is so much work left to be done to augment these efforts. While there are thousands of datasets now available via Data.gov, and in the handful of data.json files published by federal agencies, much of this data leaves a lot to be desired when it comes to actually putting it to use.

As people who work with data know, it takes a lot of work to clean up and normalize everything--there is just no way around this, and much of the government data that has been opened up still needs this janitorial work, as well as conversion into a common data format like JSON. When looking through government open data, you are faced with spreadsheets, text files, PDFs, and any number of other obscure formats, which may meet the minimum requirements for open data, but need a lot of work before they are truly ready for use in a website, visualization, or mobile application.
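To give a sense of what this janitorial work looks like in practice, here is a minimal sketch of the kind of cleanup pass a dataset might need before conversion -- the file name and columns are hypothetical, not from any specific agency dataset:

```python
import csv

# A minimal sketch of the janitorial work open government data often needs.
# The file name and columns are hypothetical, not from a specific dataset.
def normalize_header(name):
    """Turn a header like ' Agency Name ' into 'agency_name'."""
    return name.strip().lower().replace(" ", "_")

with open("raw_agency_data.csv", newline="") as raw_file:
    reader = csv.reader(raw_file)
    header = [normalize_header(column) for column in next(reader)]
    # Trim stray whitespace from every cell, and drop completely empty rows.
    rows = [
        dict(zip(header, (cell.strip() for cell in row)))
        for row in reader
        if any(cell.strip() for cell in row)
    ]

print(f"Cleaned {len(rows)} rows with columns: {header}")
```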

Adopta.Agency is meant to be an open blueprint to help target valuable government open datasets, clean them up, and, at a minimum, convert them to be available as JSON files. When possible, projects will also launch open APIs, but the minimum viable movement forward should be about cleaning and conversion to JSON. Each project begins with forking the Adopta Blueprint, which walks users through the targeting, cleaning, and publishing of data to make it more accessible and usable by others.
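When a project does get to the API stage, it does not have to be anything elaborate. Here is a minimal sketch of a simple, read-only API serving one of these cleaned-up JSON files using Flask -- the file name and endpoint path are assumptions for illustration, not part of the blueprint itself:

```python
import json

from flask import Flask, jsonify

app = Flask(__name__)

# Load a cleaned-up JSON file once at startup. The budget.json file and the
# /budget endpoint are hypothetical, included only to illustrate the pattern.
with open("budget.json") as data_file:
    budget_data = json.load(data_file)

@app.route("/budget")
def budget():
    """Return the full cleaned dataset as JSON."""
    return jsonify({"data": budget_data})

if __name__ == "__main__":
    app.run(port=5000)
```

A Github repository can then hold both the JSON file and a small sketch like this, so anyone forking a project gets the data and a starting point for an API in one place.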

Adopta.Agency employs Github repositories for managing the process, storage, and sharing of data files, while also acting as a gateway for accessing the APIs, and as a place to engage in conversation around how to improve the data and APIs available as part of each project (which is what APIs are all about). Adopta is not a specific technology; it is a blueprint for using commonly available tools and services to move government open data forward one or two steps.

We feel strongly that making sure government open data is available in a machine-readable format can be a catalyst for change. Ironically, even though this data and these APIs are meant for other computers and applications, we need humans to step up and be stewards of an ongoing portion of the journey. Government agencies do not have the skills, resources, and awareness to do it all, and when you actually think about the big picture, you realize it will take a team effort to make this happen.

Adopta.Agency is looking to define a Github-enabled, open data and API-fueled, but ultimately human-driven network to help everyone realize the potential of open data and APIs in government -- please join us today.




Thursday, August 13, 2015

Forget Uber, If You Build A Platform That Feeds People Like This, Then I Will Care

I was listening to the "To Cut Food Waste, Spain's Solidarity Fridge Supplies Endless Leftovers" segment on NPR today, which made me happy, but then quickly left me sad about 99% of the tech solutions I see being developed today. The tech sector loves to showcase how smart we all are, but in the grand scheme of things, we are mostly providing solutions to non-problems, while there is a world full of real problems that need solving.

I remember being at MIT for a hackathon a couple of years back, where, when we were done with the catered food for our event, it was taken down to a corner of a hallway that had a table and a webcam. Within about 20 minutes of the bagels, pizza, juice, and other items being put on the table, it was all gone--students fed, and food not wasted. #winning

The solidarity fridge idea reminded me of this experience, and it makes me sad that there is not an Uber for fucking feeding people! Why the hell isn't there a solidarity fridge and pantry on every street corner in the world? Why isn't there an Uber for this idea? Why aren't there food trucks doing this? Oh, because there is no fortune to be made in actually making sure people are fed, and Silicon Valley really doesn't give a shit about solving real problems--that is just what we tell ourselves so we can sleep at night.

If you are building a platform that helps neighborhoods manage their solidarity fridges and pantries, complete with an API, mobile and web apps, and SMS push notifications, then you will see me get real excited about what you are doing--until then...


