Wednesday, May 29, 2013

API Aggregation For Federal Government with FedAPI

I've been tracking on what I call API aggregation for some time now. I started seeing API aggregation trends in 2010, and watched the approach pick up in 2011 with the emergence of providers like Singly and Adigami.

While many new companies provide interoperability and automation between multiple APIs, I consider aggregators to be about bringing together multiple APIs into a single aggregated endpoint for developers to use, while also providing potentially new APIs that are only possible because of the aggregation.
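To make the pattern concrete, here is a minimal sketch of what an aggregator does behind that single endpoint: fan out to several upstream sources, tag each record with where it came from, and merge everything into one view. The source names and record shapes below are entirely hypothetical, stubbed as local functions for illustration.

```python
def fetch_all(sources):
    """Call each upstream fetcher and tag its records with their origin."""
    records = []
    for name, fetch in sources.items():
        for record in fetch():
            records.append(dict(record, source=name))
    return records

def aggregate(sources):
    """Present a single aggregated view, sorted by a shared 'updated' field."""
    return sorted(fetch_all(sources), key=lambda r: r["updated"], reverse=True)

# Two stand-in upstream APIs, stubbed as local functions
sources = {
    "agency_a": lambda: [{"title": "Dataset A", "updated": "2013-05-01"}],
    "agency_b": lambda: [{"title": "Dataset B", "updated": "2013-05-20"}],
}

feed = aggregate(sources)
```

The interesting part is that once everything lives behind one interface, new APIs (like the merged, date-sorted feed here) become possible that no single upstream source could offer on its own.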

Last year, after I started spending more time in Washington DC, I met a talented group of folks who were playing around with the idea of an aggregate API for the federal government. In the last couple of weeks they've shown they were serious about it, launching the debut version of Fed{API}.

Fed{API} is looking to make government data more accessible and reusable by taking data from multiple sources, some easy to pull and some much more difficult, then collecting, correlating and cataloging it into a single, aggregated API stack that developers can put to use.

I'm a big fan of API aggregation. I think this is the future of standardizing APIs and data across many sources, but as I watch providers tackle this, I realize what a massive undertaking it will be. And FedAPI is looking to do this by opening up federal government data, which by itself is a massive challenge. So FedAPI is tackling two pretty massive problems as its core business model.

Even with the massive challenge ahead, I will be supporting FedAPI 100%, and doing what I can to make sure their initiative is successful. To do so, I think they will have to get two other groups involved:

  • Other API Aggregation Providers - FedAPI needs to reach out to Singly, Temboo, Adigami and the other API aggregators to see what they can learn, and to share knowledge and wisdom. All of these providers need to work together and not re-invent the wheel when it comes to auth and API connectors
  • Public At Large - FedAPI has to introduce tools that figure everyday Joe and Jane citizen into the equation. Some users are going to be very knowledgeable and passionate about specific areas, and there have to be ways to share the load and let them help clean and verify data, and take on other meaningful tasks

I don't think that API Aggregation is the answer to all the problems that plague API integration, but I think API aggregators will play a major role in helping standardize API interfaces, data formats and provide critical cross-provider, inter-agency knowledge and wisdom--something individual API providers may never be able to deliver alone.

One thing I want to lend a hand with, when it comes to FedAPI, is helping create a blueprint of which federal agencies already have APIs and open data sets, and where these exist. I can also be the aggregator of news, analysis and best practices across the federal API landscape.

I'm on board FedAPI, let's make this happen.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/g16V3B9VfMo/

Have You Checked Out Webshell Lately?

Have you checked out what the Webshell.io team is up to lately?  I know I haven't been in there for a couple months, and it looks like they've been heads down making some interesting improvements.

New Cloud IDE

Rather than re-invent the wheel, Webshell has integrated the Cloud9 web-based IDE into its API scripting platform.


New Documentation

They overhauled their platform documentation, on a continuous mission to make it more valuable to developers.


New API Explorer

They've revamped the API explorer, making playing with APIs a better experience.

New Codecademy Lesson

They have created a "Webshell 101" course, providing an easy way to understand how Webshell works.


That is four pretty significant building blocks from the Webshell platform. If you're not familiar with some of the innovative API integration Webshell is doing, I recommend checking them out. Also, if you haven't taken a fresh look in a while, like me, I recommend taking a moment.

The Webshell team gets the future of API usage. What they are doing via their platform represents a meaningful blend of the technology, business and politics of APIs, providing a model for the programmable web.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/cMvTEDJTs1o/

Tuesday, May 28, 2013

New Features From BaaS Provider AnyPresence

I'm adding some new BaaS features I found in a recent press release from BaaS provider AnyPresence titled, "AnyPresence Launches 4th-Generation Mobile Backend-as-a-Service Platform with Unparalleled Enterprise Capabilities", to my list of BaaS features.  As I'm processing them, I notice these are some pretty significant features:

  • Application Cloning - For organizations looking to build multiple apps that have common core functionality with only minor variations, this powerful feature allows them to create a copy of an existing app along with all data source, object, and user interface definitions, saving significant development effort.
  • Automated App UI Testing - Developers who use AnyPresence to generate a starter mobile app user interface (UI), now get the added benefit of functional test scripts for native iOS, native Android, and jQuery Mobile web apps. These test scripts can be run to ensure the app is interacting with backend functionality as expected, saving time in testing and improving reliability.
  • Custom Server Extensions - While developers have always been able to add custom code to objects within AnyPresence, they can now create re-usable “Extensions” that can be shared across teams or lines of business. This also enables third parties to encapsulate their services as official AnyPresence Extensions, enabling a marketplace of add-ons that can be used across the AnyPresence customer base.
  • Enterprise App Store Integration - AnyPresence now supports the ability to deploy apps to employees directly via enterprise app stores powered by Mobile Application Management (MAM) or Mobile Device Management (MDM) vendors. Apperian EASE is the first MAM partner solution to be integrated directly with the AnyPresence Designer.
  • Cloud Infrastructure Management - For the default cloud backend server deployment to Heroku, administrators can now control the performance characteristics of their app, and choose from hundreds of Heroku add-ons, directly from the AnyPresence Designer. This seamless integration enables developers to plan for the required capacity and usage of each individual backend server instance, and manage them from one central location.

It can be difficult to interpret each BaaS provider's products and services, and establish any sort of common definition, but application cloning, automated testing, custom server extensions and enterprise app stores are pretty straightforward and powerful BaaS features.

I'm curious to see where AnyPresence goes with their custom extensions.  I think whoever can do this part of BaaS correctly will win the BaaS world championships! (That is an award, right?)

Lots of exciting stuff coming from the BaaS sector lately!  Fun space to watch.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/mVeFt84cXKk/

Signals I Use To Monitor Companies In The API Space

Over the last year I've worked hard to standardize and automate as much of my monitoring of the API space as I can. The amount of information I was monitoring daily was getting overwhelming--I needed to scale what I do, so I created what I call my API Stack Ranking.

The API Stack Ranking is not meant to be a top 25 list; it is meant to be a ranking I can sort companies by each week, giving me a meaningful stack of companies who are doing interesting things in the API space.

Each week I publish my API Stack, but you can also see this influence on any of my research areas.  If you go to API management or the Backend as a Service (BaaS) research projects, the providers are sorted by their ranking for the week.

In addition to sharing my stack each week, I'd like to also share the criteria I use to determine my stack.  Currently I bundle my signals into three areas:

  • Internal - Signals within the control of the companies I'm monitoring
    • Has a Blog
    • Number of Blog Posts
    • Has a Twitter Account
    • Number of Tweets
    • Has Github Account
    • Number of Github Repositories
    • Number of Commits Across Repositories
  • External - Signals from the public, outside the control of the companies I'm monitoring
    • Number of Twitter Followers
    • Number of @Mentions for Twitter
    • Blogosphere (Techcrunch, GigaOm, ReadWrite, etc.) Posts
    • Social Bookmark (Hacker News, Reddit, StumbleUpon) Links
    • Stack Exchange Reference
    • Number of Github Followers
    • Number of Github Stars
    • Number of Github Forks
  • Analyst - Signals that are dependent on what I personally see and feel is important
    • Curated News
    • Curated News I Write Notes About
    • Syndicate to Twitter
    • Syndicate to LinkedIn
    • Syndicate to G+
    • Publish story on API Evangelist
    • Vote Up 
    • Vote Down

My API Stack Ranking is always in flux. I'm considering other signals like job postings and events, as well as sentiment analysis, but this listing represents what I'm currently evaluating to generate my stack each week.
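The signals above feed a weekly score of some kind. Here is a minimal sketch of what that could look like; the signal names, weights and sample counts are all hypothetical, since the actual algorithm and weights aren't published here.

```python
# Hypothetical weights -- analyst signals carry the most influence
WEIGHTS = {
    "blog_posts": 2.0,      # internal signal
    "commits": 1.0,         # internal signal
    "mentions": 1.5,        # external signal
    "github_stars": 0.5,    # external signal
    "curated_news": 3.0,    # analyst signal
    "votes": 5.0,           # analyst signal
}

def rank_score(signals):
    """Weighted sum of a company's signal counts for the week."""
    return sum(WEIGHTS.get(name, 0) * count for name, count in signals.items())

# Two made-up companies with a week's worth of sample counts
companies = {
    "provider_a": {"blog_posts": 4, "commits": 20, "mentions": 10, "votes": 1},
    "provider_b": {"blog_posts": 1, "commits": 5, "mentions": 2},
}

# The "stack": companies sorted by their score for the week
stack = sorted(companies, key=lambda c: rank_score(companies[c]), reverse=True)
```

The key design point from the post survives even in this toy version: the analyst-side weights dominate, so a company that tweets a lot but never produces anything I find valuable won't climb the stack.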

While internal signals give me a good idea of what a company is up to, and the external signals give me some idea regarding what the open developer community is saying, it all really comes down to the analyst part of the algorithm.  A company may tweet a lot, but it's up to me to decide if those tweets are valuable.

The API Stack Ranking isn't perfect, but it has allowed me to quadruple the amount of information I take in, surface what I feel are the real stories, and reflect the companies that are truly impacting the API space each week.

I have also been able to expand the API Stack Ranking to any business or topical sector I wish. Much like I applied it to BaaS, I'm now evaluating federal government, healthcare and other sectors in the same way.

Feel free to contact me directly about the API Stack, and the data and content I produce each week.  I'd love to hear your thoughts.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/ZCatUQt8HRU/

API Management Using Apiphany

I'm working my way through all the API management providers, making sure I'm up to speed on what services each provider is offering these days. As part of this work I've been playing with the Apiphany platform, using a demo site the Apiphany team was kind enough to set up for me.

Using Apiphany I can manage my APIs, and launch a portal which allows me to hang all my APIs, then manage all aspects of their operations. Apiphany focuses on the three main components of API deployment & consumption:

  • APIs - API resources and their operations you've deployed and would like to hang and manage via your Apiphany portal
  • Products - Virtual products you create, which allow access to various combinations of API resources and operations
  • Apps - Applications that are registered with the platform, and consuming APIs via various product subscriptions

The way Apiphany sets up these systems is very intuitive, giving you quick, administrative control over everything you will need to execute either a very simple or a very complex API strategy.

APIs
Apiphany provides all the basics of API management, allowing you to add endpoints, manage each endpoint's title and description, and manage its operations. You can define service operations to enable service documentation, an interactive API console, per-operation limits, request/response validation, and operation-level statistics. Beyond the basics you can mask APIs, implement sophisticated caching and import overall API definitions in WADL, Google Discovery Document, Swagger or OData formats.

Products
After you have all your raw API resources defined, you can move on to defining your "products", composed from those valuable API resources. The Apiphany product manager allows you to provide names and descriptions for your products, choose who has access, whether they are published or unpublished, and the terms of use for any developer who subscribes to a product. API product management takes a lot of work, experience and creativity--the Apiphany management system makes it something anyone can do.

Policies
The policies management area of Apiphany is where it gets seriously powerful. Using the products, APIs and their operations, you can define granular policies that are essential to API operations. Policies can range from converting XML to JSON, providing JSONP, or simply setting headers for specific API operations, potentially within a specific product composition. Apiphany gives you a library of common policies used across the API industry via easy-to-apply templates, but also allows you to define custom Node.js policies. The ability to define and apply policies across products, APIs and their operations provides very modular, reusable and granular control over all aspects of your API platform.
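Apiphany expresses its policies through templates and custom Node.js code; to illustrate what one of these transformation policies actually does, here is a rough Python sketch of an XML-to-JSON response policy. This is purely illustrative and is not Apiphany's actual policy syntax.

```python
import json
import xml.etree.ElementTree as ET

def element_to_dict(el):
    """Recursively convert an XML element into plain dicts and strings."""
    children = list(el)
    if not children:
        return el.text
    return {child.tag: element_to_dict(child) for child in children}

def xml_to_json_policy(xml_body):
    """A response-transformation policy: XML body in, JSON string out."""
    root = ET.fromstring(xml_body)
    return json.dumps({root.tag: element_to_dict(root)})

# An XML API response rewritten as JSON before it reaches the consumer
response = xml_to_json_policy("<user><name>Jane</name><id>42</id></user>")
```

The point of running this at the management layer, rather than in each backend, is that the same policy can be reused across any product, API or operation that needs it.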

APIs, Products and Policies are the heart of the Apiphany API management solution. These provide the essential operational building blocks every API owner will need to execute an API platform. Apiphany also provides the necessary CMS framework to manage and support the developer portal, which wraps around your products, APIs and operations.

Portal Management System
The Apiphany solution gives you full control over a developer portal to hang products, APIs and operations within--giving you control to customize the look and navigation of the portal, with full page, content and media management. The Apiphany API Portal comes complete with essential building blocks like documentation, blog, support and other CMS essentials to support an API platform.

Analytics
Apiphany has added a pretty robust suite of API analytics that allow you to manage the most common aspects of API operations, in the following three areas:

  • Usage - Calls and bandwidth
  • Health - Status code, cache and API response time
  • Activity - By developer, product, API and operations

The Apiphany analytics package allows you to filter by products and operations, broken down by default date ranges like today, yesterday, and 7, 30 or 90 days, or custom timelines. It gives you the basics of what you will need to stay in tune with your API operations and make short- or long-term decisions for your API roadmap.

Developers
Developers are central to any API operations and Apiphany provides the tools you will need to manage developer details and their subscriptions to products, and how their applications are using API resources. Apiphany gives you the control to block, reset and define access levels and roles for each of your developers, providing the control you need to manage your API ecosystem.

This is my first deep dive into the Apiphany API management platform. I like what I see. It has all the tools you will need to define, secure, manage and evolve your APIs. Deploying APIs requires a certain level of understanding of APIs, but with Apiphany anyone could take a set of API endpoints and transform and evolve them into some meaningful API driven products.

After spending some time playing with what is possible with the Apiphany platform, I'm left with thoughts about how I could craft products around the API resources I have. I feel like I could easily define some pretty robust API products developers could subscribe to and integrate with. But I also feel like I could easily iterate and add products, policies and new APIs and operations to better meet potential consumers' needs with very little effort.

I'm going to continue setting up my API infrastructure with Apiphany, to help me understand what is possible. I'd love to have my core platform available on all self-service API platforms, helping me understand where the space is at, and what each company brings to the table.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/yn6u6sahkfE/

Monday, May 27, 2013

Github Can Be More Than Code

I have been using Github to manage my code for a couple of years now, but in the last year I've been using Github more often for a variety of projects that don't always have code involved--examples ranging from hosting slide decks for my talks to repositories for my API research projects.

In the last couple of months I've noticed I'm not the only one using Github to organize projects. Check out a few of these examples:

  • White House Open Data Project - Using Github to publish and solicit participation around the open data policy for all federal agencies
  • Hackathon Guide - A how-to-throw-a-hackathon guide, complete with an event website template that runs as a Github page
  • Innovators Patent Agreement - A new way to do patent assignment that keeps control in the hands of engineers and designers, put forth by Twitter
  • Germany for Laws - A repository containing all German federal laws and regulations in Markdown format

Github lends itself to managing markup and markdown documents in a very collaborative way, allowing for tracking issues and revisions along the way. There is something about the open source process that introduces transparency and visibility into the document creation and management process.

I will keep tracking on the innovative ways I'm seeing Github used. So far I'm pretty impressed.  If you know of any other interesting ways, please send them my way.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/nLZvrxJl5Co/

Saturday, May 25, 2013

Quick Demonstration Showing The Benefits of The White House Digital Strategy

I wanted to share a quick demonstration of the benefits of Barack Obama's presidential directive that every federal government agency should have an API, following Executive Order 13571 and part of the White House CIO's strategy entitled "Digital Government: Building a 21st Century Platform to Better Serve the American People", as well as the potential of the recent Open Data Executive Order and accompanying Open Data Policy.

Each federal agency should have three things available online at their domain:

  • agency.gov/digitalstrategy/
  • agency.gov/data
  • agency.gov/developers/

Those are three pretty important signals in the API space. They represent each agency's technology roadmap, data and developer resources, which can be some pretty interesting signals for the overall health of a federal agency.
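Because the paths are the same for every agency, checking an agency is mostly mechanical. A quick sketch of generating the three URLs for any agency domain (from there you could issue an HTTP HEAD request against each one and record which agencies respond):

```python
# The three paths the White House digital strategy asks every agency to have
PATHS = ["/digitalstrategy", "/data", "/developers"]

def strategy_urls(domain):
    """Build the three digital-strategy URLs for a given agency domain."""
    return ["http://{}{}".format(domain, path) for path in PATHS]

urls = strategy_urls("energy.gov")
```

Run this across a list of agency domains and you have the beginnings of an automated scorecard for the digital strategy rollout.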

To demonstrate, let's look at the Department of Energy.

Let's go to:  http://energy.gov/digitalstrategy

 

Next:  http://energy.gov/data/

Then:  http://energy.gov/developers

After the Department of Energy, let's take a look at another agency, like the Internal Revenue Service (IRS).

Let's go to:  http://irs.gov/digitalstrategy


Next:  http://irs.gov/data


Then:  http://irs.gov/developer


Granted, these are two very different agencies. But I can't help but think that the IRS should hand out a copy of the White House Open Data Policy internally, and figure out how they can begin to let a little light into their operations and culture.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/0CFUcUUOnH0/

IRS Needs To Use White House Open Data Policy For Guidance

I'm reading IRS: Turn Over a New Leaf, Open Up Data, from the Open Knowledge Foundation blog. I'll let you read it in its entirety, but these are the points that are sticking with me:

One of Mr. Werfel’s first actions on the job should be the immediate implementation of the groundbreaking Presidential Executive Order and Open Data policy, released last week, that requires data captured and generated by the government be made available in open, machine-readable formats. Doing so will make the IRS a beacon to other agencies in how to use open data to screen any wrongdoing and strengthen law enforcement.
By sharing readily available IRS data on tax-exempt organizations, encouraging Congress to pass a budget proposal that mandates release of all tax-exempt returns in a machine-readable format, and increasing the transparency of its own processes, the agency can begin to turn the page on this scandal and help rebuild trust and partnership between government and its citizens.

Making IRS data open won’t solve every problem; the recent scandal has proven that the IRS must be more transparent about both the information it collects, but also how it manages that information. A commitment on day one to share the data it collects in a machine readable manner would show true leadership by Mr. Werfel and help solidify the Obama administration’s legacy as an open government.

From my vantage point in the world of APIs, I can't help but feel the solution to our "big government" illnesses is a little API therapy.  The Open Data Policy has set the bar, and it's up to the individual agencies to pay attention.

APIs are not just a technical band-aid. APIs start with a technical seed, but if a business or government entity internalizes the concepts around opening up, providing access, and setting up a feedback loop with the public, partners or even other departments, they will potentially let in a little sunlight--something that can go a long way in preventing many of the illnesses we often associate with "big government".

I'm optimistic that the IRS can regain its footing. I'm seeing a lot of API and open data innovation coming out of the federal government lately. For a controversial entity like the IRS to recover, they will need to open up and communicate with the public in a way that only APIs can provide.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/hyklVUZNjwQ/

Friday, May 24, 2013

Dropbox As Your App's Default File System

Cloud storage of documents is becoming commonplace. Individuals, companies, government and non-government organizations have increasingly seen the potential of storing files in the cloud using services like Amazon S3, Dropbox and Box.

As a web or mobile application developer, it is becoming more common to provide integration, syncing or even direct usage of popular cloud storage services like Dropbox as the application's storage system.

This last week I started playing with Dave Winer's (@davewiner) simple idea outliner, notepad, todo list and project organizer--Fargo.

What caught my attention is Dave's use of Dropbox as the central storage for the app. It reminds me of another application I use called Prose.io, which uses Github as the central storage system for the app.

I really dig this approach to delivering dead-simple, meaningful apps like Fargo that don't re-invent the wheel, and instead focus on delivering value on top of the existing tools and platforms we already use.

I asked Dave what his thoughts are on this approach, to which he simply replied:

I like Dave's approach to app development, which is about delivering simple, useful apps that sensibly put APIs to use, and establishing a foundation using open formats like OPML.

As the hardware and operating systems developers depend on continue being virtualized and migrating into the cloud, the usage of open formats and common, API-driven cloud resources, like storage from Dropbox, will be common among successful app developers.

I think Dropbox as your app's default file system is here to stay.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/sznBx2L58So/

DataSift's Open Source World

I'm increasingly finding a company's approach to using Github a vital signal of the health of the company, their team, and the products and services they are delivering.

An example of this is social data platform DataSift's new open source area. DataSift has thrown up a Github page which reinforces the company's commitment to consuming and producing open source software, as well as a list of important, Github-driven signals:

  • Latest Updates - The latest repository updates across all of DataSift's repositories
  • API Client Libraries - Java, Python, Ruby, PHP and other client libraries for the DataSift API
  • Public Github Repos - Other tools and utilities that are available via DataSift's Github repositories
  • Job Vacancies - A listing of current job openings at DataSift

I think DataSift's approach is a very meaningful demonstration of the potential of being "open".  As a company, DataSift provides access to valuable social signals that any company can access via their interface and API, allowing them to develop insight and intelligence. If you look closely at their approach to deploying their Github-hosted, open source site, it provides four very critical signals: API client libraries, open source code projects, job openings and updates across the company's Github profile.

Think about it. The number of platforms supported for an API, the number of public open source code projects and the number and type of job postings for a company are pretty significant signals regarding the health of a company these days.  These can be pretty difficult areas to fake, and in many cases will quickly show the activity, growth and stability of a company.

I like DataSift's approach to their Open Source area. It doesn't just provide code and tools for developers to be successful, it provides us a deeper look into how DataSift operates.  

Of the 1,640 tech companies I monitor, 751 of them have Github accounts.  I think many of these companies can learn a lot from DataSift's approach to open source and how you can put Github to use.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/5okiCCh65PY/

Salesforce Adds Sandbox Templates

Salesforce is doing some pretty interesting stuff with the sandbox environment for DeveloperForce.

Using the DeveloperForce sandbox you can create copies of your data, allowing you to develop, test or train against, not just a sandbox, but a relevant and meaningful copy of your company's data.

The Force.com sandbox has always allowed you to create a separate copy of your data, but now they let you create sandbox templates, allowing you to create specific data sets that you can reuse in different ways.

Seeing this feature coming out of Salesforce has prompted me to take a closer look at their approach to the DeveloperForce sandbox environment. I'm sure there are other features and approaches to sandboxing we can learn from the API pioneer.

This reminds me of the story I wrote about UC Berkeley's desire for An API That Scrubs Personally Identifiable Information From Other APIs.

As the API universe expands, I'm seeing a greater need for me to study and tell stories around API testing, monitoring, sandboxing and the best practices around API development and integration.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/Oa-teMr9F50/

An Open Source Code Catalog for your API

I'm working through the wave of API innovation coming out of our federal government recently. During normal days at API Evangelist, I'm pulling private sector API usage examples and crafting them into stories to help the federal government execute on its own API strategies. Today, I'd like to showcase something the federal government is doing, in hopes that more companies in the private API sector will emulate it as part of their API strategies.

The GSA's Digital Services Innovation Center has launched the Mobile Code Catalog, an open source catalog of web and native applications that government agencies can use to jump-start their projects. Think application showcase, but all the applications are open source and allow your API developers to download, fork and re-use code.

The idea of a code catalog is pretty interesting, and I see it as an evolution of several API building blocks I talk about--code samples, SDKs, starter kits and application showcases all rolled into one. Imagine if your API consumers could come into your developer area and find not just code samples, but complete applications they can download, reverse-engineer and put to use. Talk about going from zero to API integration in as short a time as possible.

I know that many API owners compete with their ecosystems by deploying their own versions of their mobile, web or desktop apps, but for the others a code catalog is an excellent way to facilitate integrations with your platform.

Another cool thing about the Digital Services Innovation Center Mobile Code Catalog is that the catalog itself is open source and hosted on Github. This is a whole other aspect of the innovation around a code catalog: the catalog itself can be downloaded, forked and published anywhere, on a public website or a private portal.

I envision hundreds of code catalogs available in the near future, across many business, government and NGO sectors. While there is money in building proprietary apps on top of APIs, I think there is even more money to be made when you put meaningful, high quality apps in the hands of end-users.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/KBrcd5sI3hg/

Multi-Tenancy with WSO2 API Manager

I just had a demo of some of the new features in the WSO2 API Manager. Since WSO2 is one of my partners, I have a regular call with them to discuss the space and I often get demos of their new products and features.

Today's topic was multi-tenancy in their API management platform, meaning you can easily deploy multiple API portals using the platform. Not every company will need more than one API portal, but for companies that are further along, it provides a pretty sophisticated way to engage with API consumers.

When it comes to slicing and dicing your APIs, we usually segment API resources by service level, apps and users. WSO2 introduces a fourth layer--the domain. So now you can group API resources under domains, crafting different "API storefronts" for your consumers.
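Conceptually, that fourth layer is just a catalog keyed by domain. Here is a tiny sketch of the idea (this is an illustration of the concept, not WSO2's actual configuration; the domain names and API names are made up):

```python
# The same underlying API resources, grouped into separate storefronts
catalog = {
    "public.example.com":  ["search", "products"],            # public storefront
    "partner.example.com": ["search", "products", "orders"],  # partner-only storefront
}

def resolve(domain, api):
    """Only serve an API if it is grouped under the requesting domain."""
    return api in catalog.get(domain, [])
```

The "orders" API above exists once, but is only visible through the partner storefront--which is exactly the kind of private or partner-only access companies ask me about.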

I'm still thinking through all of the opportunities with API management multi-tenancy, but at first thought, it will help me counteract many of the questions I get around a company's concern that APIs have to be public. Most companies have learned about APIs from the popularity of public APIs, so I spend a lot of time explaining the opportunities for internal or partner APIs.

The WSO2 approach to API management multi-tenancy allows me to help companies understand the unlimited ways they can provide access to their valuable resources--not just in public or private ways, but to any group, user base or target audience, in a very organized and branded way.

You can hear me and Chris Haddad, VP of Technology Evangelism at WSO2, discuss multi-tenancy as part of the "Your API Branding Strategy" webinar on Thursday, June 13, 2013.

Disclosure: WSO2 is an API Evangelist partner



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/CHIWftl-JM8/