Saturday, March 30, 2013

Usually When Developers Are Mean, It Is About Power

I’ve been programming professionally since 1988, so I’ve been around quite a few developers in my career. There have been a lot of moments lately that cause me to look back at this career, reassess what I've learned, and adjust the direction I'm going with this mined wisdom. This week I’m reflecting on the recent insanity around Adria Richards and my role as the API Evangelist, and spending time thinking about the tech space's inclusiveness as I’m hanging out with my 12-year-old daughter for spring break.

Throughout my career, in just about every position I’ve taken, whether as a programmer, manager, lead, architect or VP, I’ve encountered developers who are challenged by me, my role, my skills and my approach to tech. I can remember three separate roles where one of my first assigned tasks was to fire a developer who had made everyone’s lives miserable; before I came along, the company couldn’t fire him, because they were dependent on his skills.

Quality technical talent is hard to find, and many companies can’t afford to lose the talent they already have, no matter what the personality defects or workplace issues might be. Many developers know this and work it to their advantage, wielding the power they possess for their own selfish goals.

This power struggle does not just exist between developers and non-developers. As a programmer I encounter developers who always have to one-up you, and make you feel stupid for what you don’t know. It might be a programming language, database platform, technical library or just the fact you use Windows vs. Mac. Some programmers can co-exist, but many feel the need to challenge others, while assuming a defensive position around their established fiefdom.

Outside the corporate firewall I see this behavior extended to online interactions, where programmers make non-developers feel insignificant for what they do not know, and belittle other developers for the skills they lack--something that can often extend into more harmful, trollish behavior. Developers can be quick to jump on others in an online environment, creating very uncomfortable or hostile situations.

At many places I’ve worked or contracted, this power struggle plays itself out as the classic IT bottleneck. IT bottlenecks are often portrayed as a lack of resources, but more often than not they represent power struggles around budget, technology and other lesser evils. Think about some of your IT interactions--is it really a shortage of resources, or a lack of desire to actually deliver? It can be very hard to tell sometimes.

While I feel the root of this behavior is insecurity, ultimately I think it is all about power. I also strongly believe that among the by-products of this reality are the sexism, racism and other negativity that are systemic issues in the tech space.

I’m not writing this to say that all developers are power trippers. They aren’t. But we technologists have a huge problem with wanting to be the keepers of knowledge. I’m not excluding myself from this. I fall victim to the desire for power and glory. I’m not exempt. But I work really, really, really hard at trying to transcend this past I share with the rest of the tech space.

I believe APIs are letting the IT resources out of the bag, and so is the consumerization of other areas like cloud computing, storage, email, and web and mobile development. APIs are bringing valuable IT resources closer to the people who can use them to solve problems. I saw the opportunities around APIs playing out from 2005 through 2010, and in the middle of 2010 I decided to start API Evangelist, to help spread the word to the masses about the potential of APIs--bridging a gap that has been created between developers and non-developers.

It is very important that we technologists ground our work in assisting non-technologists. We have to work very, very, very hard at being nice and making what we do inclusive. It isn’t easy. Listen to the first couple of minutes of the API podcast Traffic and Weather from the other week, and you will hear how much work I put into making sure I’m pleasant while still making an impact, when engaging developers and non-developers alike. It is critical.

If you are a developer, please join me in making sure everything we do is as inclusive as possible for other developers, for non-developers and, most importantly, for people of other sexes and races, and anyone else we encounter in what we do.



from Kin Lane http://feedproxy.google.com/~r/KinLane/~3/oF6IZF_LtAI/

Providing Plain English Version Of API Terms of Service

“I read your Terms of Service” is one of the biggest lies on the Internet. We agree to terms of service for each and every service we use online, without ever reading or understanding exactly what we are agreeing to.

This is one of the most damaging aspects of online life: through this process we are giving away our rights and the ownership of our data, and allowing our privacy to be compromised each and every day. While not all services abuse this, many online services use it to their advantage, in an effort to maximize the amount of value they extract from their platform and end users.

Service providers have to go further in educating users about the terms of service they are agreeing to. There is a great example of this in action, via an article in The Next Web called “now THIS is how to write your startup’s Terms of Service”. The post showcases how real-time sharing platform Heello has provided plain English descriptions next to each “legalese” paragraph in its terms of service.

This is nothing new--you see it with other providers like Tumblr. But I think it is a very simple enhancement to API terms of service that can have a huge effect, and begin leading us in a healthier direction when it comes to educating end users about the TOS they are bound to. While companies need the protection of a legal terms of service, there is no reason you can't provide your end users with a translated, plain English version. It doesn't take much work, and it really sets you apart from other API service providers.

Do you know of any other examples of online services going above and beyond to help educate end users about terms of service?



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/hsJR5E0qIkA/

Thursday, March 28, 2013

Migrating My Automation Services Beyond Free

I depend on If This Then That (IFTTT) to move data around the cloud. I syndicate blog posts from API Evangelist to Blogger and Tumblr. This isn't just blind syndication; it is SEO, and also a plan B scenario to make sure my content exists in multiple places.

When it comes to the IT decisions for API Evangelist, I carefully evaluate which services I use. If I begin to depend on an account, after 6 months I need to start paying a fee, to secure some sort of quality of service (QOS). This is how I roll.

Not all platforms allow for this. In my opinion, all FREE platforms should have relief valves for users like me who want to move to some sort of paid account, assuring me some sort of service level agreement (SLA)--or I'll move away.

For example: once I started using Evernote regularly, I moved to the premium level. I bought into Pinboard at an early rate, once I was hooked. Conversely, I can't pay for Twitter, Facebook and some Google services. Today it is about IFTTT.

In light of losing Google Reader, I'm taking a hard look at all of my services. IFTTT is on the chopping block.

I have about 20-30 jobs running at IFTTT, moving data around between my cloud services in ways I depend on. I need to move this into the realm of premium or paid services.
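
For a sense of what one of these jobs does, here is a minimal sketch of the kind of self-hosted fallback I would need if IFTTT went away--polling my feed and cross-posting new entries. The publish step is a placeholder, since a real job would call the Blogger or Tumblr API with stored credentials:

    import time
    import xml.etree.ElementTree as ET
    from urllib.request import urlopen

    FEED_URL = "http://feeds.feedburner.com/ApiEvangelist"
    seen = set()  # links already syndicated

    def publish(title, link):
        # Placeholder for a real call to the Blogger or Tumblr API
        print("syndicating:", title, link)

    while True:
        tree = ET.parse(urlopen(FEED_URL))
        for item in tree.iter("item"):
            title, link = item.findtext("title"), item.findtext("link")
            if link not in seen:
                publish(title, link)
                seen.add(link)
        time.sleep(15 * 60)  # poll every 15 minutes, like an IFTTT trigger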

What are your thoughts?  Sell me on your service.



from Kin Lane http://feedproxy.google.com/~r/KinLane/~3/2lF7ks8tGVg/

The Noun Project API

The Noun Project is soliciting feedback on their upcoming API. If you aren’t familiar with The Noun Project, it is an innovative visual language project that is creating a library of icons to enable people to visually communicate and interact around the globe. Direct from The Noun Project about page:

Humans have been using symbols to communicate for over 17,000 years because they are the one language everyone can understand. Symbols can transcend cultural and language barriers and deliver concise information effortlessly and instantaneously. They allow people to communicate quickly, effectively, and intuitively. And for the first time ever, this language is being combined with technology to create a social language that unites the world.

The Noun Project wants our feedback on what a Noun Project API might look like and what we would use it for. So I wanted to take a moment to share what I would use The Noun Project API for, along with some thoughts on what the endpoints might look like.

Embeddable
Normally an embeddable strategy for an API is in the background, relegated to just one of the tools in the toolbox. With the Noun Project, I think a healthy embeddable strategy would be front and center, providing a wide array of buttons, badges, widgets and other items users could syndicate across the web. The library of embeddable tools should be open for submission, much like the core of the Noun Project, and designers could generate custom button sets, badges and other embeddables for users to choose from. The Noun Project's embeddable strategy would make the Noun Project API accessible to everyone, even non-developers.

Builders and IDE
A Noun Project API would enable a next generation of mobile and web applications, as well as other visualization and big data analysis tools, to integrate consistent imagery. Makers of integrated development environments (IDEs) could provide button, icon and other UI components with a wealth of meaningful icons, which could become the new default in modern apps. Think about how Twitter Bootstrap is raising the bar for overall app user interface quality. Love it or not, Twitter Bootstrap has made things look better, and consistent use of Noun Project images could go even further.

Picture Is Worth a Thousand Words
The Noun Project API would enable a wealth of text-to-image visualization possibilities, allowing developers to accompany and represent everyday text in much more impactful ways. Imagine if, in addition to titles and summaries, you always had a Noun Project icon to reference when reading on the web and via your favorite applications. A couple of examples might be:

  • Blog & News Listings - Oftentimes our blogging platforms have a way to upload an image to represent a post or story. The Noun Project could be used to provide visual cues for stories in certain categories, or cues programmatically determined by the text content of a story, dynamically pulled from the Noun Project API
  • Link Representations - When faced with a link, all you have to reference is the title and possibly a description. The addition of a Noun Project icon, pulled from the API after indexing the content of the linked page, would help users make a determination about a site’s content before clicking
  • Messages - The Noun Project could be used to provide visual references for emails, SMS, Tweets, Facebook Wall posts and other common forms of digital communication. Icons could be suggested based upon message content.

These are just a few ways The Noun Project API could be used to dynamically offer visual representations for common textual objects we encounter in our every day lives. I’m sure a developer community would come up with many more examples, once the API is available.

Network Graph
Built into The Noun Project API would be a representation of the social and network graph of an image. You could see how popular a particular image or set of images is, apply this to the filtering and retrieval of images via the API, and have it as part of the image metadata to use in your own application logic. To achieve the level of meaning the project intends, a graph will be needed to show the use, reach and impact of each image and each set of images.

Partner Network
Also built into The Noun Project API should be a partner layer that reflects the core mission of the project. Every image returned by the API would come complete with all the metadata regarding its creator and how the image can be, and is being, used across the network. The Noun Project Partner Network would give designers more control and insight into how their images are being used, as well as commercial, volume and other monetization opportunities--a layer that could be part of the overall Noun Project revenue strategy.

Attribution
Without an API, the usage of Noun Project images is a manual process, leaving attribution up to the user. With an API and a healthy branding and attribution framework, Noun Project attribution would be built into the embeddable strategy, as well as bundled into each and every API response. This would provide The Noun Project network with easy attribution, which doesn't just credit designers--it would be the basis of The Noun Project's marketing. Imagine being able to visualize the entire syndication of a single icon! Without an API this is impossible.

Bridging the Digital to Physical
The Noun Project API would allow innovative new ways to use Noun Project icons in the physical world as well as the digital one. An API could allow for on-demand printing of icon templates in various sizes, shipped to users for creating signs and other real-world uses that could be permanent, or possibly even ephemeral with the use of temporary paints or spray-on chalks. This process wouldn't be a one-way street, either. With a Noun Project API, developers could build icon recognition applications that snap a picture of a printed icon and immediately connect users back to the virtual world around that icon, whether based upon its meaning, its designer or some new representation that has emerged as part of the Noun Project network graph.

Those are just some of my initial thoughts on how a Noun Project API could be put to use in meaningful ways. As for what The Noun Project API itself might look like, I'd like to see some of the following endpoints (a rough sketch of calling them follows the list):

  • Designers - Endpoint for icon designers, with access to their profiles, collections and icons
  • Users - Endpoint for Noun Project user accounts, with access to account and billing settings, as well as favorite, saved and purchased icons and collections
  • Icons - Endpoint for all icons, allowing you to search and filter individual icons, as well as add and update existing icons under my designer profile
  • Collections - Collection endpoints allowing me to search, filter and pull individual icon collections
  • Dictionary - Endpoint for all the words used for icons, providing a dictionary that developers can reference while developing, with a reverse search for icons
  • Graph - API access to all graph-related events, like popular icons, collections and designers, as well as trending icons and collections, and reverse searches on popular keywords or phrases
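
To make that concrete, here is a rough sketch of how a developer might call a few of these endpoints. The base URL, paths, parameters and key are all hypothetical, since the API does not exist yet:

    import json
    from urllib.request import urlopen

    # Hypothetical base URL and key for the proposed Noun Project API
    BASE = "https://api.thenounproject.com/v1"
    KEY = "my-api-key"

    def get(path, **params):
        params["api_key"] = KEY
        query = "&".join("%s=%s" % (k, v) for k, v in params.items())
        return json.load(urlopen("%s%s?%s" % (BASE, path, query)))

    # Icons endpoint: search and filter individual icons
    icons = get("/icons", query="bicycle", limit=10)

    # Dictionary endpoint: reverse search from a word to matching icons
    matches = get("/dictionary", word="transportation")

    # Graph endpoint: trending icons and collections
    trending = get("/graph/trending", period="week")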

I'm sure more endpoints could be identified in the future. I think the most important thing is to just get an API launched, and to have a healthy strategy for engaging users to find out what should be put on the roadmap. Starting simple and making sure to expose all aspects of the core Noun Project is an important beginning--the rest will come.

What a fascinating project the Noun Project is. I strongly believe that in order to fulfill its full vision, the Noun Project has to have an API. The Noun Project API will be essential to syndicating icons across both our physical and virtual worlds and providing a way for users and designers to interact around the icons, while also providing the most meaningful licensing framework possible--one that could even include much-needed revenue for both designers and the Noun Project, keeping this very important project alive.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/InvvVG_r4Ns/

Tuesday, March 26, 2013

Embeddable API Conversations with Adam of Iris Media

I spent some time last week drinking IPAs and talking education technology with Audrey (@audreywatters) and a friend of ours, Adam Wendt (@skinnyblimp), from Iris Educational Media. Iris Educational Media is a behavioral research and development company that provides educational resources for parents, parent educators and K-12 school staff, including teachers, administrators, paraprofessionals and direct support employees.

During our discussion, Adam described how they wanted to make Iris Media content available for people to use on any platform, by providing valuable embeddable tools and resources for organizations and professionals to use on their own sites and portals.

As Adam was talking, I’m thinking API. An API is not just a way to drive their own line of embeddable tools; it would also allow 3rd party developers to build specialized tools and widgets for platforms that Iris Media might not have the awareness, time or resources to focus on. Wanting to be available on numerous platforms is easy, but actually delivering on them can be costly and time consuming--APIs are a way to share this load with a 3rd party community.

An API could power Iris Media’s syndication much like YouTube’s. The YouTube API, plus a robust embeddable tool strategy, is behind the massive syndication and growth of the popular video platform. Iris Media could go further with its own API by integrating a partner layer within their API platform, providing developers, organizations and educational professionals the ability to earn revenue by syndicating, or by developing widgets and applications on top of the Iris Media API. This revenue could go a long way in ensuring some of these essential educational organizations stay in operation and continue making an impact.

Engaging with people like Adam, who are truly making a difference in priority areas like education, is why I continue doing API Evangelist. APIs can help Iris Media be more successful in reaching its audience, but they can also empower the huge number of vital individuals and organizations on the ground in communities around the globe, who deliver essential education where it is needed most.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/D6Gtfkqm8oE/

City of Philadelphia Shares Its Open Data Roadmap

A roadmap is an essential part of a healthy API ecosystem. The transparency and communication that come with providing a roadmap for your API and open data initiative will go a long way in building trust with your community.

The City of Philadelphia is sharing its roadmap of the data sets it intends to open in the coming months, by providing a release calendar using the cloud project management service Trello. In addition to seeing the datasets the city intends to release, you can also see any data sets that have already been published.

According to the open data roadmap, Philadelphia is releasing data on street closures, energy consumption, evacuation routes, campaign finance, bike racks, budgets, expenditures and city employee salaries to name just a few.

This type of transparency doesn’t just build trust with citizens and developers; it provides an incentive for the city government to deliver high quality public data in a meaningful and timely manner. There is still a lot of work to be done once this data is available, in order to develop the quality analysis, visualizations, other APIs and tools that can be used in mobile and web applications. But what Philly is doing is a great start.

The City of Philadelphia’s approach to its open data roadmap is good to see. It is something all cities should have, as should all API owners. Your roadmap will set expectations within your community, guide your own initiatives and provide a healthy blueprint for moving your efforts forward.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/OEi70j3sOZg/

Monday, March 25, 2013

NASA Challenges Devs to Create Space Object Tracking API

NASA has a pretty cool challenge going on to create an API for SkyMorph, a database of optical images and catalogs generated by the Near Earth Asteroid Tracking (NEAT) program. According to the NASA description, SkyMorph provides:

access to imagery by time and position or searching by specific asteroid or other moving object. The time dimension, unique to SkyMorph, allows users to discover changes in the intensities of stars, e.g. supernovae, or to discover moving objects, e.g. comets

NASA is challenging us to help build a RESTful interface that would enable developers and citizen scientists to access SkyMorph imagery, lowering the barrier to accessing the data and encouraging innovation around the program.
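
As a thought experiment, a RESTful interface over SkyMorph might expose searches by time, position or object as simple resources. Everything in this sketch is hypothetical--designing these endpoints is the whole point of the challenge:

    import json
    from urllib.request import urlopen

    # Hypothetical endpoint: search imagery by sky position and time window
    url = ("https://api.example.gov/skymorph/observations"
           "?ra=10.684&dec=41.269"            # position (right ascension, declination)
           "&start=1998-01-01&end=1999-01-01")
    observations = json.load(urlopen(url))

    # Hypothetical endpoint: observations of a specific moving object
    url = "https://api.example.gov/skymorph/objects/Eros/observations"
    eros = json.load(urlopen(url))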

OK, how cool is that! APIs that enable everyone to help track and understand objects in the sky, along with NASA. It is projects like the SkyMorph API challenge that keep me excited about APIs.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/_d4lK7TgDSg/

Thursday, March 21, 2013

Interactive API Documentation With Swagger

It is becoming more common for API providers to deliver documentation using what's known as interactive API documentation, instead of the usual static API documentation. Understanding how to use an API can be tough sometimes, and rather than just reading about how it works, interactive API documentation allows you to make real calls against an API while learning about the interface.

While there are several approaches to delivering interactive API documentation, my personal favorite is Swagger. Swagger comes built in with 3Scale, the API management platform I use for the API Evangelist API, but it is also available for anyone to use as an open source project.

Interactive API documentation using Swagger starts with a Swagger definition, which is a JSON representation of an API endpoint. In this case, the endpoint is for accessing my blog posts:
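
Here is a minimal sketch of what such a definition looks like, in the Swagger 1.1 style of the time--the base path, nickname and parameter names are illustrative, not my exact production definition:

    {
      "apiVersion": "1.0",
      "swaggerVersion": "1.1",
      "basePath": "http://api.apievangelist.com",
      "apis": [{
        "path": "/blog/",
        "operations": [{
          "httpMethod": "GET",
          "nickname": "getBlogPosts",
          "summary": "Search and retrieve API Evangelist blog posts",
          "parameters": [
            {"name": "query", "paramType": "query", "dataType": "string",
             "required": false, "description": "Keyword(s) to filter posts by"},
            {"name": "key", "paramType": "query", "dataType": "string",
             "required": true, "description": "Your API Evangelist API key"}
          ]
        }]
      }]
    }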

This JSON describes everything about my very basic endpoint, which allows users to query almost 3 years of API Evangelist blog posts. Using this definition, Swagger generates the following API documentation automatically:

As someone learning about the API Evangelist Analysis API, this tells me about the endpoint and which fields are required to make an API call. But it doesn’t just describe this with text; it provides an interactive interface in which I can enter my API keys, provide a query value and actually make a request against the API:

So I don’t just learn about the API endpoint; I learn which keys are needed and which fields are available for requesting different information from the endpoint. It lets me see the actual request to the API, and the resulting body of the API response.
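
The request the documentation builds is just an ordinary HTTP call, so it is easy to reproduce in code once you have seen it work. A minimal sketch, assuming the illustrative path and parameter names above:

    import json
    from urllib.request import urlopen

    # The same request the interactive docs let you try in the browser
    # (path and parameter names assumed, matching the sketch above)
    url = "http://api.apievangelist.com/blog/?query=swagger&key=MY_API_KEY"
    posts = json.load(urlopen(url))

    # Assuming the response is a list of post objects with a title field
    for post in posts:
        print(post["title"])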

This type of hands-on learning is essential to onboarding new users with your API. I can read your API documentation and not see the value, but when I am walked through how to build a request, and actually see the value returned in real time, it changes the game. As a developer, I’m more likely to understand and integrate with an API when I am given interactive API documentation over static pages.

The benefits of describing your API using Swagger don’t stop with interactive API documentation. Swagger can also help deliver code samples in multiple programming languages for developers to put to use, generate server-side code for new APIs, and provide potential benefits for API discovery.

It is well known that API documentation is the number one pain point for developers. Make sure your API documentation is hands-on and interactive, so developers will understand the value your API delivers immediately, and go from learning to integration in as short a period as possible.

Disclosure: 3Scale is an API Evangelist partner



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/8rnT-jAJ0Ng/

Continuing the Migration of Projects Over To Github

I’m continuing the migration of all my projects to run on Github. Eventually all the public areas of my site will run as static, published Github Pages with supporting back-end repositories. Last night I migrated the API aggregation, backend as a service, reciprocity and real-time providers projects, using a version of my Hacker Storytelling format.

While apievangelist.com will still remain the master doorway to all my work, each project will live under its own subdomain and Github repository. As I make this switch I’m having to adjust my Google Analytics strategy, as well as potentially my Google Feedburner strategy. In the shadow of the Google Reader deprecation, I’m reconsidering not just how I consume RSS, but my analytics as well.

I can keep using Google Analytics for pages, and Google Feedburner for RSS. But I’d also like data on how my JSON files are consumed, which neither platform provides me. Using the Github API I can track activity around a repository, like commits, follows, downloads and forks, but I don’t get actual page view activity on pages or individual files.
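
Repository-level activity, at least, is easy to pull. A minimal sketch against the Github API--the repository name here is just a placeholder:

    import json
    from urllib.request import urlopen

    # Public repository metadata from the Github v3 API
    repo = json.load(urlopen("https://api.github.com/repos/kinlane/hacker-storytelling"))
    print("watchers:", repo["watchers_count"])
    print("forks:", repo["forks_count"])

    # Recent commit activity on the same repository
    commits = json.load(urlopen("https://api.github.com/repos/kinlane/hacker-storytelling/commits"))
    print("recent commits:", len(commits))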

Really my only hope for getting the data I need is from Github. Some sort of raw web logs for my domains would be sweeeet!! Then I could see how many times a JSON file is accessed, and build custom reporting tools for Github page views, Jekyll blog views, etc., and migrate away from Google Analytics.

I've looked briefly for Github solutions, but everything is client-side tracking. Let me know if I'm missing anything.



from Kin Lane http://feedproxy.google.com/~r/KinLane/~3/BQiJr3SyPNw/

App Design With An Acquire, Process, Publish API Architecture

I'm tracking on a new wave of application frameworks and API-centric architecture patterns that are not just helping deliver the next wave of web and mobile apps, but also bridging, aggregating and providing interoperability and transformations between API platforms.

One company I've been watching closely is Seabourne. The Seabourne team has an approach to application development that follows a very interesting set of principles:

  • Information Flows Instead of Pipelines - Information operates in ‘flows’ where inputs and outputs are flexible and happen at any point. Flows are fluid and flexible, unlike structured, point-to-point pipelines
  • Data has Multiple Owners - Information flows are composed of multiple streams of data owned by different partners and vendors. Any process must accommodate multiple canonical sources for different information
  • Use APIs to Move Information - By using APIs to move information around, we decouple the data from the underlying technology and vendor, and make it possible to combine information from different technologies. APIs provide a flexible, low cost base to grow the system and meet changing needs
  • Integrate Data Across Systems - Information lives on multiple systems inside and outside the organization. There is tremendous value to be had from combining multiple data sources together into a single information stream
  • Translation Rather Than Standardization - Information is stored in multiple structures and formats. Any effort to manage information should focus on translating between structures rather than trying to develop a common schema

I think that pretty much describes the challenges we face building web and mobile apps today. Seabourne's approach isn’t just about building apps; it is about organizing resources from multiple providers to build the best app you can. To see it in action, check out the project they just implemented using this approach, called GovInfo.

Using GovInfo, anyone can sign up to receive alerts from more than 100 different federal agency websites. GovInfo gets its information from a large number of sources, ranging from RSS and APIs to content scraped from HTML. Once acquired, the framework de-dupes, cleanses and normalizes the data as needed, then makes it available for publishing or pushing out via email, SMS or other means.
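
The acquire, process, publish pattern is easy to picture in miniature. Here is a sketch of a GovInfo-style flow, with hypothetical source URLs and a stubbed-out publish step:

    import hashlib
    import json
    from urllib.request import urlopen

    # --- Acquire: pull raw items from multiple (hypothetical) agency sources ---
    sources = [
        "https://example.gov/agency-a/alerts.json",
        "https://example.gov/agency-b/alerts.json",
    ]
    raw = []
    for url in sources:
        raw.extend(json.load(urlopen(url)))

    # --- Process: de-dupe on a content fingerprint and normalize field names ---
    seen, clean = set(), []
    for item in raw:
        title = (item.get("title") or item.get("headline", "")).strip()
        fingerprint = hashlib.sha1(title.lower().encode("utf-8")).hexdigest()
        if title and fingerprint not in seen:
            seen.add(fingerprint)
            clean.append({"title": title, "source": item.get("source", "unknown")})

    # --- Publish: push out via email, SMS or other means (stubbed here) ---
    for alert in clean:
        print("notify subscribers:", alert["title"])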

The Seabourne approach doesn't just rethink extract, transform and load (ETL); it centers your app platform around acquire, process and publish, in a way that improves flexibility, centralizes processing rules for all incoming information and reduces the amount of time it takes to add new sources of data, all while providing scalability in a very granular way.

I’ve been watching Seabourne evolve and constantly refine their approach to distributed application design, ranging from their deployment of the FCC web site, to the launch of MyFCC, and now the next iteration of their platform driving GovInfo. I’ll keep following what they are doing with their AP2 approach, and watch the tools and best practices that result from their innovative approach to application development using APIs.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/1opzVK3u_l0/

Embeddable OpenSpending Visualizations

There are some really great examples of embeddable, open data goodness over at the OpenSpending project, which is operated by the Open Knowledge Foundation, a non-profit with a mission to promote open knowledge and data.

The OpenSpending platform has a wealth of data regarding spending budgets from all over the world, providing key data that allows anyone to track government and corporate financial transactions globally.

The OpenSpending platform has plenty of data visualizations available for use, but until recently these tools were fixed and only available on the OpenSpending site. Now they have begun developing and publishing a handful of cool, embeddable widgets that can be published anywhere:

All three visualizations are available as open-source code on Github. OpenSpending also provides examples you can play with using JSFiddle:

The embeddable tools provided by OpenSpending are exactly the types of tools I want to organize as part of my Hacker Storytelling work. I’m looking to build a wealth of embeddable tools that help people tell more meaningful, data driven stories.

I will be curating and tagging as many examples like this as I can, and continue to publish via Hacker Storytelling, for anyone to use.



from Kin Lane http://feedproxy.google.com/~r/KinLane/~3/bBkSKvBXxZ0/

I Support You Adria Richards

Way to go, brogrammers. You made sure a bright light in evangelism was partially dimmed today. Really? With all the shaming that goes on targeted at women, you don't have the balls to take a little shaming when you're being a sexist pig? C'mon...ummm, have some balls?

As I've said before, us guys are going to have to eat shit with a side of humble pie for many years to come to balance this whole sexism thing out. Let's own it.

All it would have taken was a humble apology from those gentlemen in response to the tweet, then some healthy discussion about why what they said was wrong, and we could all move on, kids.

I sure hope the Hacker News crowd steps up and defends every woman that is shamed by guys now, as they seem to be defenders of this now. So if you tweet out shaming of a woman in a career-changing way, know Hacker News will be all over you...right? Right?

I just want to make sure you know I support you, Adria. You're not unemployed; you've been promoted to new levels of evangelism. Let me know how I can help.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/rw9jVAoCzAU/

Wednesday, March 20, 2013

API Evangelist API

It was inevitable: API Evangelist now has its own API. I had a couple of partners ask for more sophisticated access than the RSS or JSON dumps out of my platform provided, so I launched a handful of API endpoints, allowing you to get at information from my world.

I currently have 13 endpoints providing access to the core of my platform, in 8 key areas (a rough sketch of a call follows the list):

  • APIs - Name, logo and description of the 2K APIs I'm actively monitoring
  • API Stack - Access to my weekly and monthly API stacks
  • Analysis (Blog) - Access to my entire blog catalog
  • Service Providers - Access to all the API service providers I track
  • Tools - A full catalog of all the API tools I've found
  • Curated News - Titles and links of news that I personally curate from my 2K+ feeds
  • Notes - Notes I make during my daily and weekly curation
  • Building Blocks - The common building blocks I've identified after looking at 2K top APIs
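
As a rough sketch of what a call looks like--the paths and parameter names here are illustrative, and will likely change as I play with different service compositions:

    import json
    from urllib.request import urlopen

    # Illustrative only: endpoint paths and parameters are subject to change
    BASE = "http://api.apievangelist.com"
    KEY = "YOUR_API_KEY"  # currently issued to subscribers and partners

    # Pull the catalog of API tools
    tools = json.load(urlopen("%s/tools/?key=%s" % (BASE, KEY)))

    # Pull curated news, filtered by keyword
    news = json.load(urlopen("%s/curated/?query=swagger&key=%s" % (BASE, KEY)))

    for item in news:
        print(item["title"], "-", item["link"])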

There is a lot more data behind my firewall that I want to open up. I do a lot of research on top APIs like Twitter and Google, as well as trending areas like aggregation, backend, reciprocity and real-time service providers.

I will also be opening up API endpoints into the site traffic for apievangelist.com, apivoice.com and my other properties, and I'm toying with opening up some collections of my personal data, like my calendar, projects, check-ins and other aspects of my world.

Currently I’m only issuing keys to API Evangelist subscribers and partners, but let me know if you are interested in getting at any of my data. Eventually I’ll make much of it openly available, but I’m still formulating my approach and playing with different service compositions--so things will change.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/i2nDeofADFQ/

Tuesday, March 19, 2013

Incentivize API Developers With A Mozilla Open Badge Program

Last week Mozilla launched their new Open Badges platform, an open standard to recognize and verify learning. I immediately started thinking about how it could be applied to APIs, for incentivizing developer participation and success.

I’ve come across other badge programs, but Mozilla's is the first open approach I’ve seen, providing free software and an open technical standard that any organization can use to create, issue and verify digital badges.

So I was thinking: as an API owner, you could use Open Badges to structure a series of rewards for developers who reach important milestones integrating with your API and engaging with your ecosystem.

Some example badge-worthy milestones might be:

  • Achieving X amount of API calls
  • Publishing an application to the showcase
  • Publishing an open source code library
  • Developing more than one application
  • Generating revenue from an application
  • Supporting the forum and conversations

It would take a bit of planning to develop the right approach to badging for each API, as goals will differ for each provider. But providing badges could give developers an incentive to work towards meaningful goals tied to important KPIs in your API strategy, while also providing an easy way to track your most knowledgeable, and potentially most trusted, developers.
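
A minimal sketch of what the issuing side might look like, checking one milestone type and handing out badges. The thresholds and records are hypothetical, and the final step would publish an Open Badges assertion (a hosted JSON file) per Mozilla's standard:

    # Hypothetical developer usage record, pulled from API analytics
    developer = {"id": "dev-42", "email": "dev@example.com", "api_calls": 1250000}

    # Hypothetical thresholds for call-volume badges
    MILESTONES = [
        (1000000, "one-million-calls"),
        (100000, "hundred-thousand-calls"),
        (1000, "first-thousand-calls"),
    ]

    def badges_earned(calls):
        """Return every badge whose call threshold has been reached."""
        return [name for threshold, name in MILESTONES if calls >= threshold]

    for badge in badges_earned(developer["api_calls"]):
        # A real program would publish an Open Badges assertion here and
        # push it to the developer's badge backpack.
        print("issue badge '%s' to %s" % (badge, developer["email"]))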

If API providers start issuing sensible and consistent badges for API knowledge, it would also provide additional ways to measure the size and overall health of the API developer community. I could envision badges that serve specific business sectors, or even ones built around city, state and federal government APIs and civic efforts.

I will add Mozilla Open Badges to my list of API building blocks and explore other ideas about how an API provider can establish badge programs to incentivize and reward healthy API developer activity.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/Mk_CO-3NvDQ/

Google Launches Real Time API and JavaScript Library

Google launched a Realtime API for the Google Drive platform today. It is the API version of the same functionality available in Google Drive that allows you and other collaborators to type, edit, annotate and chat with each other within a Google Doc.

To use the Google Drive Realtime API, you add the Google Realtime JavaScript library, and then you can give any local object on an HTML page realtime behavior. The JavaScript library and Realtime API handle network communication, storage, presence, conflict resolution and syncing changes, using what they call a CollaborativeString object.

The Realtime API isn’t just for documents; it can be used for productivity apps, games, entertainment and much more. The only limitation is a developer's imagination within their own applications.

Google provides a quick start, documentation and other resources to get you going, as well as a pretty cool realtime playground for building things with the API in a hands-on, interactive environment.

There are other players in the realtime space already, like Firebase and Pusher. Google’s entry into the arena is a signal that there is developer demand for realtime tools, and it provides validation for these startups. I’ll play with the Google Drive Realtime API some more, talk with the folks at Pusher and Firebase, and see what their thoughts are and how it compares.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/hemjFkwnar8/

Startup API Pricing

I wrote about the web-to-API service Import.io earlier today, and before I close the Evernote for this story, I wanted to highlight something else interesting they've done, on their pricing page.

I thought they provide a very honest peek into how they are pricing their services. First off, they admit that they are a young company, which is cool, but they also lay out some pretty interesting points around their business philosophy:

  • We are a free service. We will always provide a free-to-use tool
  • We will introduce premium features. Some of these you may have to pay for
  • We will always provide you with a tool to export your connectors
  • We will let you know if we are going to change anything that may impact your usage of import.io
  • We are currently in Developer Preview. Some things might not work as expected
  • Free accounts may be volume-limited as we move from Developer Preview to production
  • We will never spam you or share your details with any third parties

All of these points are commitments I feel ALL startups should make as part of their services to their users. One thing I'd also like to see from startups is some mention of their exit strategy. What is the end goal? I think this is a relevant question for all startups, and it is an especially good one for Import.io, after the Google acquisition and shutdown of the similar service Needlebase back in 2012.

My goal isn't to pick on Import.io; along with the other items they outline, I think all API startups should be openly discussing everything listed above.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/y_-IJNVLSRw/

I Have An Idea, Let's Launch an Analytics Platform

During my monitoring for the API stack this last week, I noticed that both Parse and New Relic had launched mobile analytics platforms. I've been tracking various API-driven analytics platforms, and after seeing both Parse and New Relic launch their offerings, I thought: who else is doing this?

Here are a few of the analytics platforms that launched in the last 30 days:

It seems that everyone is getting pretty excited about the growth of big data and mobile. Analytics seems to be the latest boom area--something I will add to my list of API-driven trends I'm monitoring.



from Kin Lane http://feedproxy.google.com/~r/KinLane/~3/8W1OzuBdj7A/

APIs as Art

Audrey (@audreywatters) forwarded a very interesting series of tweets to me yesterday from software artist, writer and educator Jer Thorp (@blprnt):

What a fascinating thought. It is so close to how I see APIs in my mind's eye. A single API design can hold so much beauty, information, expression and emotion. Imagine if you hung the API design for each iteration of the Twitter API on a wall. There is so much to interpret. It expresses the vision of Twitter's founders and employees, and it is a direct response to a million developers and the needs of 500 million users around the globe. What an impression!!!

While JSON might be the only way we have to visually describe an API, each web app, mobile app and data visualization could also be considered an artistic representation of the Twitter API. You could fill an entire gallery with JSON representations of each endpoint and each of its versions, applications and visualizations, creating an interactive art exhibition of the Twitter ecosystem.

I remember back in 8th grade (1984) I was in my social studies class with Ms. Schuler. She brought in some fractal prints that she said were created by her mathematician boyfriend. She walked us through how each visual was actually a math problem. This was the same year my math teacher, Mr. Hathaway, got around 20 Commodore VIC-20s, which he didn't know how to set up and run, so a handful of us stayed after school and helped set up the lab. At this point in my education I hated math, and without seeing the beauty of the fractal, I may never have developed a passion (obsession) for computing. My counselors told me I'd never have a career in computers, because I was so poor at math. Ha! Eat it, middle school counselors!

Jer Thorp's tweets really resonate with me. APIs are a technical, business and political representation that can be interpreted in so many different ways. When I read a Swagger definition of an API, images begin to form in my mind around the intent of the API's author (think the girl in the red dress from The Matrix).

Sometimes an API will invoke kindergarten images in my mind, telling me the designer didn't have the experience it took to represent their information and resources properly. Other times they invoke very precise images--almost too precise--alerting me to the fact that the API was designed by another application and not a human.

On a very rare occasion I will see an API design that you can tell was hand crafted, with the love and care of an individual or group of individuals who considered the needs of many end users, developers and other consumers. After several iterations, an artful API representation will also begin to bear the technical, business and political imprints of its time--much like a painting will have certain styles, approaches, shapes, sizes, colors and materials going into the ink and canvas, or maybe markings of war, its owners or poor care.

In my opinion, APIs are a very artful representation of sometimes very complex data or programmatic resources, with the possibility of evoking very emotional responses from the developers who build on them, or from the end users who are the intended audience.

Thanks for planting this seed in my head, Jer. It is a topic I will visit often, as I'm no doubt going to stumble across patterns and approaches that are more art than tech as I navigate this amazing API-driven world I'm obsessed with.

And remember, you heard it from Jer Thorp (@blprnt) first!



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/CYpGUA_LGv4/

Web Harvesting to API with Import.io

I had a demo of a new data extraction service today called Import.io. The service allows you to harvest or scrape data from websites and then output it in machine readable formats like JSON. It is very similar to Needlebase, a popular scraping tool that was acquired and then shut down by Google early in 2012, except I'd say Import.io represents a simpler, yet at the same time more sophisticated, approach to harvesting and publishing web data than Needlebase.

Extract
Using Import.io you can target the web pages where the content you wish to harvest resides, define the rows of data, label and associate them with columns in the table where the system will ultimately put your data, then extract the data, complete with the querying, filtering, pagination and other aspects of browsing the web you will need to get at all the data you desire.
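
As a rough sketch of the underlying idea (not Import.io's actual tooling), harvesting rows from a page and outputting machine readable JSON can be this simple--the target URL and markup structure here are assumed for illustration:

    import json
    import re
    from urllib.request import urlopen

    # Hypothetical target page with one <tr> per record
    html = urlopen("http://example.com/public-data").read().decode("utf-8")

    rows = []
    for match in re.finditer(r"<tr>\s*<td>(.*?)</td>\s*<td>(.*?)</td>", html, re.S):
        # Label the extracted cells as named columns
        rows.append({"name": match.group(1).strip(), "value": match.group(2).strip()})

    # Output the harvested rows as JSON--the machine readable half of "ScrAPI"
    print(json.dumps(rows, indent=2))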

Connect
After defining the data that will be extracted, and how it will be stored, you can stop and use the data as is, or you can set up a more ongoing, real-time connection with the data you are harvesting. Using Import.io connectors you can pull the data regularly, identify when it changes, merge it from multiple sources and remix data as needed.

Put The Data To Work
Using Import.io you can immediately extract the data you need and get to work, or establish an ongoing connection with your sources of data, using it via the Import.io web app, or managing and accessing it via the Import.io API--giving you full control over your web harvesting connections and the resulting data.

When getting to work with Import.io, you have the option to build your own connectors or explore a marketplace of existing data connectors, tailored to pull from common sources like the Guardian or ESPN. The Import.io marketplace of connectors is a huge opportunity for data consumers, as well as data scraping junkies (like me), to put their talents to use building unique and desirable data harvesting scripts.

I've written about database-to-API services like EmergentOne and SlashDB. I would put Import.io into the harvest-to-API, or ScrAPI, category--allowing you to deploy APIs and machine readable datasets from any publicly available data, even if you aren't a programmer.

I think ScrAPI services and tools will play an important role in the API economy. While data will almost always originate from a database, oftentimes you can't navigate existing IT bottlenecks to properly connect and deploy an API from that data source. Sometimes problem owners will have to circumvent existing IT infrastructure and harvest the data where it is published on the open web, taking it upon themselves to generate the necessary API or machine readable formats needed for the last mile of mobile and big data apps that will ultimately consume and depend on this data.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/7LAqfpMFPg4/