Saturday, December 31, 2011

New Year's Resolution: Full Disk Encryption on Every Computer You Own

The New Year is upon us, and you might be partaking in the tradition of making a resolution for the coming year. This year, why not make a resolution to protect your data privacy with one of the most powerful tools available? Commit to full disk encryption on each of your computers. Many of us now have private information on our computers: personal records, business data, e-mails, web history, or information we have about our friends, family, or colleagues.

View Full Post>>

Top 10 API Evangelist Posts for 2011

2011 was an interesting year for APIs. I wasn't as good as I wanted to be at covering all major API events, but I got some traction with some posts I didn't anticipate.

Here are my top 10 traffic posts on API Evangelist for 2011:

I expected Apple related API stories to be big, but the other topics on here like endpoint planning, transit APIs, open building blocks, RESTful business architecture and a flood of APIs were a surprise.

Overall it shows me that applying APIs and RESTful concepts to business in specific industries can really get people talking.

A few of these posts were written while I was doing deep dives into government, water and transit industries and how APIs are changing the landscape.

I hope to do many of these deep industry dives in 2012.



from API Evangelist - Blog http://feedproxy.google.com/~r/ApiEvangelist/~3/XI7F0fg9QQI/

2011 in Review: Internet Freedom in the Wake of the Arab Spring

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. For several years, discussions about global Internet freedom have focused primarily on what are widely considered the world's two most restrictive countries: China and Iran.

View Full Post>>

Government Opened Data via APIs in 2011

One of the most important fronts of API development is government. All of us API and data guys have been screaming for city, county, state and federal government to open up their data via APIs for years now.

In 2011 I would say many government officials listened, and opened up almost 100 government APIs, according to ProgrammableWeb:

In 2012 it will be critical for more government agencies to open up, but I also think it's time for us developers to step up and start making sense of this data and how our government operates.

We are getting what we wanted, now how do we deliver on the promise of open government APIs?



from API Evangelist - Blog http://feedproxy.google.com/~r/ApiEvangelist/~3/Z_tOzb-qBRw/

2011 APIs as a Tag Cloud

I pulled a list of the 2,023 APIs that were added to the ProgrammableWeb API directory in 2011. I took the description column and used Wordle to generate a tag cloud for 2011. I think tag clouds can provide a 100,000-foot view of where people are focusing their APIs.
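
For anyone who wants to reproduce this, the heavy lifting is just term-frequency counting before you ever touch Wordle. Here is a minimal sketch of that step in Python; the CSV filename and the "description" column name are my assumptions about the export, not the actual ProgrammableWeb format:

```python
# Count the most common terms across API descriptions, the raw input a
# tag cloud tool like Wordle visualizes. The CSV filename and the
# "description" column name are assumptions, not the actual export format.
import csv
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "for", "is",
             "in", "with", "that", "on", "api", "apis", "your"}

counts = Counter()
with open("programmableweb_apis_2011.csv", newline="") as f:
    for row in csv.DictReader(f):
        words = re.findall(r"[a-z]+", row["description"].lower())
        counts.update(w for w in words if len(w) > 2 and w not in STOPWORDS)

# The highest counts are the words that would render largest in the cloud.
for word, count in counts.most_common(25):
    print(count, word)
```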



from API Evangelist - Blog http://feedproxy.google.com/~r/ApiEvangelist/~3/GBC1jB11-5I/

2011 in Review: Four Cases That Promoted Transparency in 2011

2011 was an important year for court decisions interpreting the Freedom of Information Act (FOIA). The Supreme Court issued two decisions that promoted government transparency and limited the scope of FOIA exemptions, while two district courts addressed how the government administers FOIA. All of those decisions will help shape FOIA to the benefit of the public. Milner v.

View Full Post>>

2011 in Review: Patents Misused to Stifle Innovation

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. 2011 saw what many had written off as impossible: patent reform legislation became law. Despite the urgent need for reform to today's patent system, the new law – the America Invents Act – managed to do almost nothing to address many of the most pressing problems facing innovators.

View Full Post>>

Friday, December 30, 2011

2011 in Review: Hacking Law

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. EFF has long been concerned about the Computer Fraud and Abuse Act (CFAA), a federal law that allows people to be sued civilly and charged criminally with a host of anti-hacking offenses.

View Full Post>>

Thursday, December 29, 2011

Appeals Court Revives EFF's Challenge to Government's Massive Spying Program

Justices Find that Spied-On Telephone Customers Have the Right to Sue San Francisco - The 9th U.S. Circuit Court of Appeals today blocked the government's attempt to bury the Electronic Frontier Foundation's (EFF's) lawsuit against the government's illegal mass surveillance program, returning Jewel v. NSA to the District Court for the next step. The court found that Jewel had alleged sufficient specifics about the warrantless wiretapping program to proceed.

View Full Post>>

2011 in Review: Search Incident to Arrest and Your Cell Phone

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. 2011 saw 40% of all mobile phone users in the United States carrying web-enabled smartphones, creating a cycle that results in cheaper smartphones and more first-time users. People who carry smartphones are usually carrying all of their sensitive information with them too.

View Full Post>>

Time for Supreme Court to Weigh in on Forced DNA Collection

Can the government force people who are arrested – but not yet convicted of a crime – to give a DNA sample without a search warrant, or does that violate the Fourth Amendment? One arrestee is asking the U.S. Supreme Court to consider this important question, and this week EFF urged the court to take the case. A federal law mandates DNA collection for those who have been arrested for felonies. The FBI analyzes the samples and puts a profile into CODIS, a national database.

View Full Post>>

Wednesday, December 28, 2011

#MoveYourDomain to Protest the Internet Blacklist Bills

When the well-known domain name registrar Go Daddy threw its support behind the Stop Online Piracy Act, it led to a PR disaster: Internet users rebelled against the registrar and called for Go Daddy customers to transfer their domains. In response to the boycott, Go Daddy has switched its position, but some companies are deciding to take a stance against the Internet blacklist legislation.

View Full Post>>

2011 in Review: Developments in ACTA

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. While Internet blacklist bills exploded onto the domestic U.S. Congressional scene this year, foreboding international forces are also posing new threats to the Internet around the world. The most prominent of these is the Anti-Counterfeiting Trade Agreement (ACTA), signed by the U.S.

View Full Post>>

Tuesday, December 27, 2011

2011 in Review: Ever-Clearer Vulnerabilities in Certificate Authority System

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. At EFF we are big fans of HTTPS, the secure version of HTTP that allows for private conversations between websites and the users who are browsing those websites.

View Full Post>>

2011 in Review: Defending Location Privacy in Courts and Congress

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. 2010 was very eventful when it came to the issue of location privacy, with EFF bringing home some key court victories.

View Full Post>>

Monday, December 26, 2011

January 2012 Hackathons - December 26th, 2011

Here is the latest update of the hackathons I'm tracking for January. You can view these on the events page, and I have a Google Calendar you can subscribe to.

I added 13 new hackathons, doubling the number of hackathons in January to 26!

  • Designing & Developing for Mobile Workshop - 01/07/2012 - San Francisco, United States
  • NYC BigApps 3.0 Developer Day - 01/07/2012 - New York, United States
  • AT&T Mobile App Hackathon Las Vegas - 01/08/2012 - Las Vegas, United States
  • PDX Weekly Hackathon - 01/12/2012 - Portland, United States
  • Northern Kentucky Startup Weekend 01/13 - 01/13/2012 - Highland Heights, United States
  • Seattle Startup Weekend 1/12 - 01/13/2012 - Seattle, United States
  • West Michigan Startup Weekend 1/12 - 01/13/2012 - Grand Rapids, United States
  • Arduino Camp & Robot Hackathon - 01/14/2012 - New York, United States
  • Facebook Mobile Hack - New York - 01/18/2012 - New York, United States
  • CodeChix Presents: Mobile/Web Graphic Design for Engineers (Women only) - 01/18/2012 - Mountain View, United States
  • Hackathon for Social Good - 01/19/2012 - New York, United States
  • Wikipedia - San Francisco Hackathon January 2012 - 01/20/2012 - San Francisco, United States
  • Haskell Hackathon - 01/20/2012 - Cambridge, United States
  • Facebook Mobile Hack - Boston - 01/20/2012 - Boston, United States
  • CityCampHNL Hackathon - 01/20/2012 - Honolulu, United States
  • Startup Weekend Ann Arbor - 01/20/2012 - Ann Arbor, United States
  • Jacksonville Startup Weekend - 01/20/2012 - Jacksonville, United States
  • Startup Weekend SLO - 01/20/2012 - San Luis Obispo, United States
  • Cleanweb Hackathon NYC - 01/21/2012 - New York, United States
  • the Muther Michigan Hackathon and Dev Con - 01/27/2012 - Detroit, United States
  • SpinKick - Kickstarting Mobile Apps for Your Business! - 01/27/2012 - Seattle, United States
  • San Jose Startup Weekend 1/12 - 01/27/2012 - Santa Clara, United States
  • St. Louis Startup Weekend - January 27-29, 2012 - 01/27/2012 - Saint Louis, United States
  • HTML5 Mobile Apps Hackathon - 01/28/2012 - Boston, United States
  • Kinect for Developers - 01/28/2012 - Plano, United States
  • Digital meets Physical: A Hardware Hackathon - 01/28/2012 - Providence, United States

I try to record as much information about each event as I can, including Twitter handle and email when it's available, so you can contact the organizers.

If you know of any hackathons I'm not tracking, let me know. I'd like to keep the hackathon events calendar up to date with all the events going on around the globe.



from API Evangelist - Blog http://feedproxy.google.com/~r/ApiEvangelist/~3/ICUoRh5uBS0/

2011 in Review: Nymwars

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. This year, Google launched its social networking site, Google Plus.

View Full Post>>

Sunday, December 25, 2011

APIs Can Decouple Business Information and Resources

I'm reading The Information: A History, a Theory, a Flood right now. So far it's a great read. Very thought-provoking stuff.

I just read this passage on how the first dictionary transformed the way we were able to think about, understand and re-use words:

...the meanings of words come from other words. It implies that all words, taken together, form an interlocking structure: interlocking, because all the words are defined in terms of other words. This could never have been an issue in an oral culture, where language was barely visible. Only when printing---and the dictionary---put the language into separate relief, as an object to be scrutinized, could anyone develop a sense of word meaning as interdependent and even circular. Words had to be considered as words, representing other words, apart from things.

For the first time words stood on their own, apart from the stories and poems that contained them. The first dictionaries set into motion much of what we take for granted today in writing and communicating.

It made me think of what APIs are doing for business information and resources. APIs decouple them, making them independent of their originally intended form and meaning, allowing them to be accessed, distributed, re-used and transformed in ways never conceived of by their original source or creator.

The content of an existing travel book could become an interactive application, using Google Earth, Wikipedia and other APIs, allowing someone to travel the globe on their iPad without leaving home.

The CityGrid API I evangelize for offers business and places data for cities across the country, with a web and mobile advertising network. This single resource can mean many things to different people: it could be a restaurant directory in Seattle or a medical assistance guide in Philadelphia.

APIs have a lot of potential to free up the business resources, media and content we use every day and allow them to be used in new ways, creating entirely new ways of doing business with much of the same business information and resources we've been using for years.



from API Evangelist - Blog http://feedproxy.google.com/~r/ApiEvangelist/~3/cR3n3bkLPak/


2011 in Review: California Reader Privacy Upgrade

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. 2011 saw a narrow but important upgrade in privacy for Californians, both online and offline. In early October, Governor Brown signed a law that EFF sponsored along with the American Civil Liberties Union that updates reader privacy laws for the digital age, and not a moment too soon.

View Full Post>>

Saturday, December 24, 2011

2011 in Review: Fighting the Internet Blacklist Bills

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. The Stop Online Piracy Act (SOPA) and the Protect IP Act (PIPA) are the House and Senate versions of a proposed law that would allow the U.S. Attorney General to create blacklists of websites to censor, cut off from funding, or remove from search engine indexes.

View Full Post>>

Friday, December 23, 2011

API Management Service Provider Roundup for 2011

As 2011 comes to an end, I’d like to take another look at what I learned about the API management service provider space in 2011. I started the year engaging 3Scale, Apigee and Mashery, trying to find a platform to build the Mimeo Connect API platform on. I then spent the entire year getting to know each of the API management service providers and the products they offer. At the end of the year, we are discussing the relevance of API service providers, so I think a year-end API service provider post is due.

The First Dimension
To make sense of the API management space, I’m looking at it in five separate dimensions. As of December 2011, I would set the first dimension and primary API management playing field as:

These companies provide a gateway, proxy or connector for your APIs, either on-premise or in the cloud.

Then, through that gateway, proxy or connector, they offer a variety of services to secure, translate, throttle, cache, distribute and scale your APIs as you need.

Some of these companies will also provide you with a way to manage usage, billing, reporting and legal for the users, partners and developers who will be consuming your API.

These companies will also bring a whole bunch of expertise to the table that your company can tap while building your API strategy. Some will even help you market your API to developers online and at events.

One thing to know about these API management service providers is, well...they are API management service providers. To my knowledge they don’t actually deploy your API for you; they help you build a strategy and manage the API. You still need to rely on other tools and in-house resources to deliver your API.

A common Google search or in-person query I get is: Apigee vs. Mashery, 3Scale vs. Apigee, Layer7 vs. Apigee, etc.

After all my research, and talking to all the API management service providers, I don’t have a straight answer. My response is: visit each of their sites, choose the ones that speak to you, and call them. They each have a unique style of API management, and it's healthy to get acquainted with each of their approaches and find one that fits your company's objectives.

The Second Dimension
Beyond the primary players, I’d say there is a group of data-driven API service providers that bring a different set of tools and expertise:

Data API management platforms are focused on bringing data to life as APIs. These companies are either acquiring their own datasets to deliver as APIs, doing it as a service for companies, or providing you with tools to do it yourself. Whatever the scenario, it's about building access and delivery networks for data using APIs.

The Third Dimension
There is also a new breed of API management platforms, each with a slightly different vision of where the space should go:

These new players are bringing community and social to API management, along with new ways to describe, test, manage and share APIs. This new wave has a chance to change the conversation around APIs dramatically in 2012.

The Fourth Dimension
Beyond these companies directly offering API management services, there is always the bootstrap model. WordPress, GitHub, Google Groups, Twitter and some custom coding on an Amazon EC2 instance can make for a pretty fine API and supporting area.

I’ve identified a whole range of API building blocks on API Evangelist, and I’m always working to define tools that the DIY API owner can use to deploy and manage their API. This is where you’ll find the frameworks you’ll need to deploy an API, forum, blog, code samples, repositories, etc.

However, there are some key building blocks missing in this approach. There are no clear tools for API key management, rate limiting, translation, transformation, metrics, billing, reporting, and other areas that API management service providers are capitalizing on. On the road to the commoditization of certain API management services, we need some more open-source players to jump in the game. Or maybe some existing player could open-source their tools?
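
To make that gap concrete, here is a rough sketch of the kind of plumbing a DIY API owner ends up hand-rolling: API key checks and naive in-memory rate limiting in front of a single endpoint. I'm using Flask purely for illustration, and the keys, limits and endpoint are made-up placeholders; a real setup would need persistent storage and shared counters:

```python
# Minimal sketch of DIY API key management and rate limiting.
# Flask is used only for illustration; the keys, limits and endpoint
# are placeholders, not any real product's interface.
import time
from collections import defaultdict

from flask import Flask, jsonify, request

app = Flask(__name__)

API_KEYS = {"demo-key-123"}          # would live in a database
RATE_LIMIT = 100                     # requests allowed per window
WINDOW_SECONDS = 3600                # one-hour window
usage = defaultdict(list)            # api_key -> list of request timestamps


def allowed(api_key):
    """Return True if this key is valid and under its hourly limit."""
    if api_key not in API_KEYS:
        return False
    now = time.time()
    recent = [t for t in usage[api_key] if now - t < WINDOW_SECONDS]
    usage[api_key] = recent
    if len(recent) >= RATE_LIMIT:
        return False
    usage[api_key].append(now)
    return True


@app.route("/v1/widgets")
def list_widgets():
    api_key = request.args.get("api_key", "")
    if not allowed(api_key):
        return jsonify(error="invalid key or rate limit exceeded"), 403
    return jsonify(widgets=["example"])


if __name__ == "__main__":
    app.run()
```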

The Fifth Dimension
So we have the primary, data and next wave of API management service providers as well as the DIY approach to API management. I’d have to say there is a fifth dimension of this space, and that’s Google.

Much like Amazon has done with EC2, S3 and RDS, Google is in a great position to lead the industry by example, with their standardized approach to Google API delivery and management with Google Console, Google Discovery, Google Explorer and Google Analytics.

In my post, the Business of Google APIs 2011, I introduce the idea of Google opening up its API management platform to other API owners. My thought is that Google is in a good position to open up API management infrastructure via the Google Apps Platform, allowing any company to easily deploy an API and enable discovery, exploration and management via the tools Google rolled out in 2011 for its own APIs.

In my opinion, each of the five API management dimensions I’ve laid out could play a significant role in 2012:

  • First Dimension - The primary API management players will fuel growth of API adoption in 2012, from the small business to the enterprise.
  • Second Dimension - Data API management providers will ensure that the massive amounts of data we have, and are creating daily, will be accessible and have delivery networks and marketplaces built on APIs.
  • Third Dimension - A new wave of social, agile and community-driven approaches to API management will change the game and create entirely new ways to deploy, deliver, manage and consume what we know as APIs.
  • Fourth Dimension - DIY API management will fuel growth and innovation in the space, but without the entry of more open-source API building block players the whole industry will suffer.
  • Fifth Dimension - Google possesses all the tools for deploying, discovering, exploring, managing and consuming APIs; will they share them with the world, or keep them for themselves?

I may actually throw in a Sixth Dimension for the API management space in 2012: Content APIs. This is based on what I’ve seen in the space around deployment of content APIs for WordPress, Drupal and other CMS platforms, essentially turning every site into an API.

Any of these six dimensions of API management could change the API conversation in 2012. The only thing I can predict with 100% certainty is that 2012 is going to be a fun ride; based upon what I’ve seen in 2011, we are poised for some serious growth.



from API Evangelist - Blog http://feedproxy.google.com/~r/ApiEvangelist/~3/H7ktifN3iMU/

Israeli Firm Allot Communications Ltd Under Fire for Selling Spyware to Iran

"Israeli spy gear sent to Iran via Denmark," reads the headline from Israeli paper YNet News.   Today, yet another breaking story of a high-tech company selling spyware to an authoritarian regime emerged.   As a  detailed report by Bloomberg News' Ben Elgin--who has made a name for himself this year reporting on the surveillance industrial complex--explains, Israeli company Allot Communications Ltd. clandestinely sold its product NetEnforcer to Iran by way of Denmark.

View Full Post>>

2011 in Review: Watershed Moments in the Fight for Free Speech, Privacy, and Fair Use

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. From WikiLeaks to the Arab Spring, from fighting the Internet blacklist legislation to exciting wins for reader privacy, 2011 has been a watershed year for digital rights.

View Full Post>>

2011 in Review: The Year Secrecy Jumped the Shark

As the year draws to a close, EFF is looking back at the major trends influencing digital rights in 2011 and discussing where we are in the fight for free expression, innovation, fair use, and privacy. The government has been using its secrecy system in absurd ways for decades, but 2011 was particularly egregious. Here are a few examples: a government report concludes the government classified 77 million documents in 2010, a 40% increase over the year before.

View Full Post>>

Facebook Mobile Hack Coming To Boston and New York

Facebook just announced that they are hosting Mobile Hacks in Boston and New York:

The events are centered around both native apps and the HTML5 mobile web.

It's an all-day event, with each project presenting to a panel of experts, and prizes awarded to the best apps.



from API Evangelist - Blog http://feedproxy.google.com/~r/ApiEvangelist/~3/FwSYkfShXR0/

Thursday, December 22, 2011

Quick Walk Through the World of Location & Places APIs


I took a walk through what I am calling the locations and places API landscape today. Most of these APIs I’m familiar with, but as the CityGrid API Evangelist, I’m getting an opportunity to immerse myself in this new local, social, mobile world.

As I immerse myself in this semi-new world, I want to share my findings with everyone else. If you have any suggestions, make sure to let me know in the comments below.

First I started with CityGrid APIs, which provide several key location and places APIs:

  • The Places API - Provides functionality for information on local businesses, including search, detail, user content submission, and predictive text
  • The Offers API - Provides coupons and special offers from businesses based on geography and category
  • The Reviews API - Provides access to customer reviews for businesses selected by id or by geography or category

Then I wanted to see what Google was doing, and of course started with the Google Maps APIs:

  • Maps JavaScript API - The Google Maps Javascript API lets you embed Google Maps in your own web pages
  • Maps Image API - The Google Maps Image APIs make it easy to embed a static Google Maps image or Street View panorama into your web page, with no need for JavaScript

Along with Google Maps they offer a set of Geo Web Services that contain several location and places based APIs:

  • Directions API - The Google Directions API is a service that calculates directions between locations
  • Distance Matrix API - The Google Distance Matrix API is a service that provides travel distance and time for a matrix of origins and destinations.
  • Elevation API - The Google Elevation API provides you an interface to query locations on the earth for elevation data.
  • Geocoding API - Geocoding is the process of converting addresses into geographic coordinates
  • Places API - The Google Places API is a service that returns information about places, defined as establishments, geographic locations, or prominent points of interest
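
To show how simple these web services are to call, here is a sketch of a request against the Geocoding API listed above. It assumes the JSON endpoint at maps.googleapis.com/maps/api/geocode/json as I understand it from the docs; an API key may be required depending on your usage, so treat it as the shape of the request rather than gospel:

```python
# Sketch of a Google Geocoding API call: address in, coordinates out.
# The endpoint and parameters reflect the docs as I understand them; an
# API key may be required, so treat this as the shape of the request only.
import json
import urllib.parse
import urllib.request

def geocode(address):
    params = urllib.parse.urlencode({"address": address, "sensor": "false"})
    url = "https://maps.googleapis.com/maps/api/geocode/json?" + params
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # Each result carries a formatted_address and a geometry.location pair.
    location = data["results"][0]["geometry"]["location"]
    return location["lat"], location["lng"]

print(geocode("1600 Pennsylvania Ave NW, Washington, DC"))
```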

Already with CityGrid and Google, I’m seeing that the types of location and places services really start to get complicated and diverse. With Google Latitude I start separating the location from the place, with two location-specific APIs:

  • Current location - Represents the user's most recent known location
  • Location history - Represents the list of all recorded user locations

After Google I have to look at another big player, Yahoo. Yahoo has several location based services:

  • Fire Eagle - Fire Eagle is a service designed to build and use location-aware applications and services
  • GeoPlanet - Yahoo! GeoPlanet is a resource for managing all geo-permanent named places on Earth
  • Local API - Provides a database of information including business address and phone, category, rating, distance, URL, and traffic alerts
  • Maps - Provides interactive maps with driving directions and traffic information
  • PlaceFinder - Converts street addresses or place names into geographic coordinates (and vice versa)
  • Placemaker - Identifies places mentioned in text, disambiguating them and returning unique identifiers

Naturally after taking a look at Yahoo I have to go see what Microsoft is up to in the space:

  • Bing Maps API - The APIs that power Bing Maps, an online mapping service that enables users to search, discover, explore, plan, and share information about specific locations
  • Bing Maps Locations API - Use the Locations API to get location information (I love this description!)

After looking at the local and mobile offerings of the big players Google, Yahoo and Microsoft, I moved from search- and mapping-based services to more carrier-based location and place services. I started with Verizon, which has a single location API:

  • LBS Network API - The Verizon LBS API allows you to use the user's location to deliver specific services

Sprint brings three location APIs to the table:

  • Geofence - Provides virtual perimeter services
  • Location - Determines the location of a Sprint CDMA Device
  • Presence - Determines if a device is present on the Sprint CDMA network

AT&T has a LBS API:

  • Terminal Location - Set of Location-based Services (LBS)

Deutsche Telekom has one location API:

  • IP Location API - Locate Internet users with their IP addresses

Ericsson Labs provides a developer community around a full suite of APIs:

  • 3D Landscape API - 3D Landscape API for integration of realistic 3D maps
  • Mobile Location API - Allows the use of a mobile phone user's current cell ID to obtain their geographical location
  • Network Probe API - Provides services that measure certain characteristics of network IP connectivity, firewalls and Network Address Translators
  • Web Location API - Provides location data from a mobile phone using the positioning systems of mobile operators
  • Web Maps API - Provides dynamic maps for application integration

France Telecom also has a location API:

  • Location API - Allows applications to get geographic coordinates of a given Orange France mobile phone or a fleet

It makes sense for every carrier to also provide developers with a set of location services, as they don’t want to just be dumb pipes. They want to be an integrated player in their own customers' handset usage.

Next I start looking to put the social in local, mobile, social. Where else do you start but Facebook, which has two location-based objects as part of the Graph API:

  • Checkin - A checkin represents a single visit by a user to a location
  • Places - A search option before initiating a checkin, returning name and location information from Graph API

I thought I'd consider Twitter next. They have Places and Geo methods, but it really doesn't seem like it's going anywhere, and a really small portion of tweets have geo info recorded. I will reconsider it in the future if I see action around it.

In the category of location-based social networks I was investigating Foursquare and Gowalla, but with the recent Facebook acquisition of Gowalla, I think I will only look at Foursquare. Foursquare offers access to four different APIs:

  • Core API - Users, Venues, Venue Groups, Checkins, Tips, Lists, Photos, Specials, Campaigns, Events
  • Real-time API - Notifies venue managers when users check in to their venues, and our user push API notifies developers when their users check in anywhere
  • Merchant Platform - The Merchant Platform allows developers to write applications that help registered venue owners manage their foursquare presence and specials
  • Venues Platform - The Venues Platform allows developers to search for places and access a wealth of information about them, including addresses, popularity, tips, and photos
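
To give a feel for the Venues Platform, here is a request sketch against what I understand to be the v2 venues/search endpoint; the client credentials are placeholders and the parameters reflect my reading of the docs, so double-check them against Foursquare's reference before relying on this:

```python
# Sketch of a Foursquare v2 venues/search call near a lat/long point.
# CLIENT_ID/CLIENT_SECRET are placeholders; parameters follow my reading
# of the v2 docs and should be checked against the current reference.
import json
import urllib.parse
import urllib.request

CLIENT_ID = "YOUR_CLIENT_ID"
CLIENT_SECRET = "YOUR_CLIENT_SECRET"

params = urllib.parse.urlencode({
    "ll": "40.7484,-73.9857",     # near the Empire State Building
    "query": "coffee",
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    "v": "20111220",              # API version date
})
url = "https://api.foursquare.com/v2/venues/search?" + params
with urllib.request.urlopen(url) as resp:
    venues = json.load(resp)["response"]["venues"]

# Print a few venue names with street addresses where available.
for venue in venues[:5]:
    print(venue["name"], venue["location"].get("address", ""))
```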

After Foursquare you leave social and get into the places data world, with popular player SimpleGeo. Similar to Gowalla, I was going to overlook SimpleGeo given their recent acquisition by Urban Airship, but I think SimpleGeo is still an important enough player that we should consider them in the game. SimpleGeo has four distinct web services for location and places:

  • SimpleGeo Storage - Storage of data in SimpleGeo system
  • SimpleGeo Features - Features in SimpleGeo represent real-world places such as businesses, regions, or US states
  • SimpleGeo Context - Provides relevant contextual information such as weather, demographics, or neighborhood data for a specific location
  • SimpleGeo Places - Businesses and points of interest

In the pure places data game I’d put Factual in the same category as SimpleGeo. Factual has seven location and places APIs:

  • Places Category API - Taxonomy to classify entities in the various Factual point-of-interest (POI) datasets
  • Places Crossref API - URLs for pages that mention a specific business or point of interest or vice versa
  • Places Crosswalk API - Maps third-party (Yelp, Foursquare, etc.) identifiers for businesses or points of interest to each other where each ID represents the same place
  • Places Global Database API - 55 million entities in 47 countries
  • Places Global Place Attributes API - The latest schema for the global places dataset
  • Places Resolve API - Makes partial records complete, matches one entity against another, and assists in de-duping and normalizing datasets
  • Places Restaurants API - Core places attributes in addition to 43 extended attributes on 800,000+ restaurants, bars, and casual eateries including datatypes such as cuisine, ratings, hours of operations, and price

Tied with SimpleGeo and Factual is InfoChimps. InfoChimps is a data marketplace player with some very strong location and places services:

  • Wikipedia Articles - Correlate Wikipedia articles with geographic locations
  • Business Places by Locationary - The Business Places by Locationary API delivers quality business information based on your geographically defined query
  • Foursquare Places - The Foursquare Places API delivers uniquely rich information about venues, worldwide.
  • Geonames Places - The Geonames Places API locates all places within a specified area. Places are any geographic points that can be named
  • NCDC Weather - The NCDC Weather API provides detailed weather data based on your geographically defined query. Weather data points for your query may include dew point, precipitation, snow depth, temperature, visibility, and wind speed details
  • American Community Survey (Topline) - The 2009 American Community Survey (ACS) Topline API provides basic demographic data based on your geographically defined query
  • American Community Survey (Drilldown) - The 2009 American Community Survey (ACS) Drilldown API provides detailed demographic data based on your geographically defined query
  • Core Geographic Regions - The Core Geographic Regions API delivers detailed geodata for any geographically defined query, worldwide
  • Zillow Neighborhoods - Zillow Neighborhoods retrieves geo data pertaining to neighborhoods within defined geometric parameters
  • Digital Element IP Intelligence Demographics - A geolocation API for all your demographics needs. Search by IP address to return data about a geographical area, including number of households, gender, age groups and language
  • Digital Element IP Intelligence Domains - A reverse IP lookup API with 5 fields of search results, all customized to your IP query. Search by IP address to return data about the domain, company, ISP, NAICS industry code and proxy type for an IP
  • Digital Element IP Intelligence Geolocation - A geolocation API with 20 fields of search results, all customized to your IP query. Search by IP address to return data about a geographical area, including country, region, city, internet connection speed
  • Geocoding API - The Geocoding API is a powerful and useful tool that provides location information for any given address in the United States. Geocoding is a process that assigns geographic data (ie, latitude and longitude) to an address
  • Latitude Longitude and Zip Code Conversions - This API returns approximated latitude/longitude centroids for a given zip code, along with the relative city, state, and county

Moving out of the pure data players, Yelp has always been centered around reviews and, more recently, with version 2.0 of their API, moved to be centered around businesses. Yelp has two places APIs:

  • Search API - Searches for Businesses
  • Business API - Returns full details of businesses

Another player in the space is Fwix. Fwix has a different approach to places, trying to geotag the web. Fwix offers six places and location APIs:

  • Geotagger API - Returns places geotagged to a given web page
  • Content API - Returns geotagged content in or near a location
  • Categories API - Returns the list of canonical place categories
  • Location API - Returns geographic data for a latitude/longitude point
  • Places API - Return, Submit and Delete a list of businesses for a given location

After Fwix I found a couple of other mapping, location and places data services:

  • PushPin - The Pushpin Identify Service is a REST service that takes geographic coordinates (latitude and longitude) and resolves them to named locations on the earth
  • 43 Places - Allows users to build 43 Places by adding places, asking questions, giving travel advice, uploading pictures of their favorite places and writing stories about the places they've been and want to go.
  • MaxMind GeoIP® City Database - Determine country, state/region, city, US postal code, US area code, metro code, latitude, and longitude information for IP addresses worldwide.
  • Compass - Allows access to a database of 16 million business establishments in the USA.

These providers either didn’t have clear market share or started deviating into other parallel universes of content and services beyond location and places, so I'm going to stop here.

These 17 places and location API providers are a lot to process. I want to spend some time getting a handle on the types of services they offer before I dive into the peripheral services, as well as the players that have less market share. But in my style, I'll keep posting my findings as I pull them together.



from API Evangelist - Blog http://feedproxy.google.com/~r/ApiEvangelist/~3/mQ1h27RL-Dk/

Wednesday, December 21, 2011

Business of Google APIs 2011

ProgrammableWeb says Google has 94 APIs. I roughly count about 75 going through Google Code. I’m more concerned with public web APIs, and Google has Android, Chrome and other non-web APIs, so it's hard to tell.

In any case, I would consider Google to be the largest public web API owner around. I don’t think any other single provider owns the number, or size, of public APIs that Google does. As with any leading API provider, I think there is a lot to learn from studying their approach to API deployment and management.

With this in mind I wanted to take a look at the Business of Google APIs in 2011 as one of my year-end API reflection posts. I think there are some important lessons to be learned from the work Google did over 2011 to get their API business in order.

Google was already setting the theme for 2011 with the launch of Google Console in November 2010. The Google API Console helps developers manage their Google API usage across all of their sites and apps. It was clear Google was not just looking for a way to get a handle on how they deploy and manage large numbers of APIs; they were acknowledging that developers needed a way as well.

The Google API Console centralized how developers manage the Google APIs they use and the traffic generated via these APIs, introduced billing management for some APIs, and provided developers with project and team-building tools. Google now supports 30 APIs inside the API Console.

In 2011 Google also worked to make their APIs more discoverable for developers with the launch of the Google API Discovery Service. The Google API Discovery Service provides a set of web APIs for discovering metadata across Google APIs, delivering a JSON-based API that provides a directory of supported Google APIs and a machine-readable discovery document for each API.
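
The Discovery Service is itself just another JSON API. Here is a minimal sketch of pulling the directory listing, assuming the public discovery/v1/apis endpoint and the directory fields as I understand them:

```python
# Sketch: list Google APIs via the API Discovery Service directory feed.
# Assumes the public discovery/v1/apis endpoint; the fields printed are
# the directory item fields as I understand them (name, version,
# discoveryRestUrl), so verify against the current documentation.
import json
import urllib.request

url = "https://www.googleapis.com/discovery/v1/apis"
with urllib.request.urlopen(url) as resp:
    directory = json.load(resp)

for item in directory.get("items", [])[:10]:
    # Each entry points at a machine-readable discovery document.
    print(item["name"], item["version"], item["discoveryRestUrl"])
```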

Now developers can integrate Google API discovery into client libraries, IDE plugins and other tools, making it easier to discover the API they need. After providing an API discovery service, Google followed another 2011 trend by deploying the Google API Explorer.

Like other API explorers, Google API Explorer allows users to make calls and explore REST APIs using a web interface, allowing anyone to start using an API without writing any code, even when authentication using Basic Auth or OAuth is required.

API explorers have done a lot to reduce the time it takes for developers to get up and running with an API, but nothing beats good-quality code samples, and Google put some serious effort into standardized code samples that can be used across Google APIs, in multiple programming languages:

Beyond making it easier to discover, explore and manage APIs with Google Discovery, Google Explorer and Google Console in 2011, Google also spent a lot of time addressing API security.

The first step to improving the security of Google APIs was supporting SSL across all Google APIs. Next, Google went all in: not just working to support OAuth 2.0 across Google APIs, but also helping developers understand OAuth 2.0, making it easier to secure applications with the standard. To help facilitate this understanding, Google opened up the OAuth 2.0 Playground, which is meant to let developers experiment with the OAuth 2.0 protocol and the APIs that use it.
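
For anyone new to the standard, here is a generic sketch of the authorization-code token exchange that the Playground walks you through interactively. The token endpoint, client credentials and redirect URI are placeholders, not Google-specific values:

```python
# Generic sketch of the OAuth 2.0 authorization-code token exchange,
# the step the OAuth 2.0 Playground demonstrates interactively.
# The token endpoint, client credentials and redirect URI are
# placeholders, not any provider's actual values.
import json
import urllib.parse
import urllib.request

TOKEN_ENDPOINT = "https://accounts.example.com/oauth2/token"  # placeholder


def exchange_code_for_token(auth_code):
    """Trade an authorization code for an access token."""
    body = urllib.parse.urlencode({
        "grant_type": "authorization_code",
        "code": auth_code,
        "client_id": "YOUR_CLIENT_ID",
        "client_secret": "YOUR_CLIENT_SECRET",
        "redirect_uri": "https://yourapp.example.com/callback",
    }).encode()
    req = urllib.request.Request(TOKEN_ENDPOINT, data=body)
    with urllib.request.urlopen(req) as resp:
        token = json.load(resp)
    # A successful response carries access_token, token_type, expires_in,
    # and optionally a refresh_token.
    return token["access_token"]
```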

With these moves by Google in 2011, I think we can say that SSL support and OAuth 2.0 are two API security essentials that are here to stay. After working on security, Google moved into the legal department with the introduction of a single Google APIs Terms of Service.

Google has rewritten their terms from the ground up with the goal of making them easier for application developers to understand, and is redirecting each API, one by one, to use the centralized, easier-to-understand terms of service. At the moment it seems as though most of the APIs that use the central terms of service are content- and data-related APIs, like Google Moderator and Blogger, while more complex APIs like YouTube and Google AdWords still use their own terms of service.

Overall, Google made some pretty significant improvements to get their API house in order. Of course, in order to do this they also had to make some hard decisions, like deciding to shut down 18 Google APIs in May, including the Google Translate API. That is a decision they reversed two months later, when they decided it was better to offer Google Translate as a billable API under Google Console.

As the API Evangelist I don’t really invent any of the API approaches I write about; I try to shed light on what others are doing. That's what this post is all about: shedding light on how Google is conducting the business of their APIs, so we can learn from them--the good and the bad.

I think it's important to remember that we are all making this shit up as we go along. Of course it should be based on some experience, but ultimately we are in some seriously new territory, and even some of the biggest players in this space fumble the ball. This fact became painfully clear in an accidental post by Googler Steve Yegge, shedding light on the API strategy of not just Google, but also Amazon.

So what do I take from Google’s approach to APIs in 2011?

  • API discovery is important
  • API exploration is important
  • Centralized billing and reporting are essentials
  • Good quality code samples are essential
  • Security with SSL and OAuth 2.0 for APIs is standard
  • The legal around APIs needs to be easier and standardized
  • Sometimes, APIs go away
  • We are all making this shit up as we go along

A lot of this is what we are already seeing from API service providers in the space like 3Scale, Apigee, Atmosphere and Mashery. But what I don’t see is anyone addressing discoverability, easy legal, and centralized billing, management and reporting from a developer's perspective.

Google is addressing all of this because it's in their best interest for ALL developers to be successful, whereas API service providers tend to focus on the success of the developers who use their clients' APIs, not ALL developers.

Well, maybe in 2012 a service provider can step up with a solution that helps developers discover and manage their business and legal across “ANY” API, or maybe Google can open the doors for any API provider to use the Google API platform as part of any Google Apps account?



from API Evangelist - Blog http://feedproxy.google.com/~r/ApiEvangelist/~3/_45GQARGHxI/

Tuesday, December 20, 2011

The FCC Launches an API Curation Platform Called MyFCC

The FCC just launched MyFCC, a new platform that allows anyone to create, save and manage a customized page built from widgets that pull content from FCC APIs.

At first glance MyFCC might look like just another dashboard or start page, but it goes much further, as an API curation platform.

To understand, you have to go back to the beginning and see the scope of the problem being solved. To deliver on its promise of a more open government, the FCC didn't just want a new website; it needed a platform. To do this the FCC embraced an API-driven methodology while building its web site, meant to standardize the way it delivers information between internal groups, other government agencies and the public.

At the time, FCC managing director Steve Van Roekel partnered with Seabourne to architect a platform that could deliver on this vision.

The new FCC web site would start with Drupal, providing a content management system (CMS) that would allow the FCC to focus on getting a handle on its content: centralizing it and preparing it to be shared with other departments, agencies and the public. The federal government is drowning in content, so the entry, storage and management of this content was the first problem to solve, and an open-source platform like Drupal was an ideal solution.

The new Drupal web site provided public access to the content, but the FCC also needed to share content with other departments and agencies, and to let the public get at the raw data for re-use.

"Everyone is talking about opening up data with APIs, to me it makes more sense to focus on opening up content using APIs.", says Mike Seabourne, owner of Seabourne.

So Seabourne developed the Content API module for Drupal, providing a plug-and-play API not just for the FCC website, but for any Drupal-driven website.
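
I haven't dug into the module's exact routes yet, so here is a purely hypothetical sketch of what consuming a content API like this might look like; the /api/content path and the response fields are invented placeholders, not the module's actual interface:

```python
# Hypothetical sketch of pulling content from a Drupal site running a
# content API module. The /api/content path, query string, and the
# "title"/"url" fields are placeholders, not the module's real schema.
import json
import urllib.request

SITE = "https://example-drupal-site.gov"   # placeholder site

with urllib.request.urlopen(SITE + "/api/content?limit=5") as resp:
    items = json.load(resp)

# Print each content item's title and link, assuming those fields exist.
for item in items:
    print(item.get("title"), item.get("url"))
```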

Then in April, the FCC launched their API-driven content platform, stating that "everything should be an API." In doing so the FCC did two very significant things:

  • Got control over their content using an open-source content management tool
  • Made their content accessible across departments and agencies, and to the public, using APIs

Now with MyFCC, the FCC is providing a dashboard that allows anyone to curate the content that is made available via the FCC’s API. Centralized content, opened up with APIs and bundled with curation tools, opens up a lot of potential for data journalists, analysts, scientists and other everyday users to derive deeper meaning from FCC operations.

All of this potential is made possible through a healthy dose of the open-source Drupal CMS, open-source Drupal modules, open data, and open APIs. That is a lot of open. Mike Seabourne, the mastermind of the FCC architecture, feels the MyFCC content curation platform should be open too.

I sat down with Mike while in DC in October, when he first introduced me to Cumula, the framework that drives MyFCC. During their work with the FCC, Seabourne has been quietly committing the Cumula framework to GitHub, planting the seeds for a much larger vision of what the framework could do.

Cumula has the potential to act as a curation platform for any single API or group of APIs. The MyFCC implementation is very tailored to serving up FCC content, but the platform can act as a visual builder and dashboard for any topical area. For example, the platform could be set up with widgets that pull medical content from various health, science and medical APIs, allowing users to create custom dashboards around the medical information most important and relevant to them.

Cumula is to API curation as the Apigee Console is to API exploration, the Google API Discovery Service is to API discovery, and Swagger is to interactive API documentation. It has the potential to be the platform the world uses to discover, pull, curate and make sense of the flood of content and data we are only just beginning to drown in, in this new API-driven economy.



from API Evangelist - Blog http://feedproxy.google.com/~r/ApiEvangelist/~3/pNfk1vNHebw/