Saturday, March 31, 2012

Should You Consider API Starter Kits for 3rd Party Platforms?

While assembling the building blocks for a successful API ecosystem, one of the requests I get from developers is for tools to help them build successful integrations with existing platforms like WordPress, Drupal, Facebook and Salesforce.

Some of the requests I get are from non-developers who just want a plug-and-play tool they can deploy without needing to write code--but many of the requests are for white label starter kits that developers can use to quickly deploy API-driven applications on popular platforms like:

I’m not an expert in any of these platforms, but I find the more I dive into developing on these platforms, the more I learn and the more I discover other experts who really understand the ins and outs of building plugins and add-ons for 3rd party systems.  

In addition to building code samples, code libraries and generic starter kits for your API, you may want to take a look at building white label, 3rd party platform plugins and add-ons your developers can use to quickly integrate your API into existing networks.

At the very least, you will learn a lot.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/RxfbWmM7CUo/

Friday, March 30, 2012

Why Tech Bloggers Suck and Not APIs

I read a post over at Cloud Ave today by Martijn Linssen, Why API’s suck, and what they lack. I'll start by saying that everything he says about APIs can be true for some APIs, but the post has more to do with the state of tech blogging than with APIs.

To Linssen's points:

  • First of all, API’s aren’t open - True, many APIs use this term without it being accurate. But those of us in the industry who get it call these "public" APIs and have stopped using the term "open". "Open" has been abused by a few bad marketing seeds and perpetuated by the blogosphere.
  • Second of all, API’s are for free, mostly, and no uptime is guaranteed - What? Really? Maybe the 10 you've tried don't, but many of the APIs I depend on do. Just to name a few: Google Maps, WebServius, Datasift, Zencoder, Amazon EC2, and Amazon S3.
  • Third, API’s are badly documented, if any - Sure, this can be a problem, and it is something everyone is working on. This doesn't point to APIs sucking, more to the documentation. Can you point me to documentation you’ve maintained? It’s not as easy as it looks. And there are plenty of shining examples of how to do it right.

Ok, those are your points? And you went to ProgrammableWeb and looked at 10 APIs? And you came to the conclusion that all APIs suck? I've spent hundreds of hours looking through ALL of the APIs in the ProgrammableWeb directory, and yes, there is a lot of shit in there. But there are some really amazing examples of the API vision that you seem to be bothered by.

Have you hacked on Twilio? Twilio rocks! Have you used Stripe? Seen the forward-thinking e-commerce APIs ElasticPath is working on? Have you hacked on any of the 97+ Google APIs lately? They have come a really long way in pulling together their interfaces, standardizing documentation, making their legal terms easier to understand, and standardizing what is free and what is billed.

Linssen seems caught up on the idea that all APIs are some sort of social bullshit. I counter with the fact that the last 5 years of Internet growth has been built on the back of Amazon Web Services, via APIs. When I started using both EC2 and S3, it was all APIs, there were no interfaces. I’ve personally deployed, scaled and supported global infrastructure with millions of dollars running through those APIs.

My recommendation to Linssen is that he spend more time hacking on APIs, and survey the playing field a little more, before professionally blogging on the subject. But that brings me back to my title, and the fact that his blog post is about page views, not about APIs--which represents the state of the tech blogosphere, not APIs.

I just went over to Techmeme and looked at the top 10 stories, and I decided that all tech bloggers suck and I'll stop reading tech blogs altogether. Actually no..I'll keep doing what I do with all tech blogs: evaluate them one by one and unsubscribe when they stop offering value. Much like you should do with APIs.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/z39FbyVfv88/

Wednesday, March 28, 2012

Keep Your API Area Active So Developers Feel Like Someone Is Home

The largest portion of my time as API Evangelist is spent keeping the area around an API active. Your developer’s first impression when they enter your API area is critical, and if they see signs that your API is inactive, they might start looking elsewhere.

The most common ways I keep an API active are:

  • Blogging
  • Tweeting w/ Twitter Feature in API Area
  • Forum Posts
  • How-Tos
  • Starter Projects
  • Developer Showcase

By actively posting content in these areas w/ timestamps showing when they were posted, I keep the API area looking like someone is home.

This active content doesn’t just help developers visiting the site feel like the API is active, it also helps your SEO. Search engines and social networks will regularly index your content, providing fresh traffic, and potentially new developers to your API.

Just a couple hours a day generating fresh content and engaging with your developers goes a long way in attracting new developers and making them feel confident that your API is a priority and that they should integrate it into their applications.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/TunHIa813EM/

Monday, March 26, 2012

Some Positive News After Last Week's Silicon Valley Sexism

With all the attention sexism in Silicon Valley has gotten, I wanted to make sure and showcase some positive news on the hackathon front this week.

There was a great article in the Oregonian out of Portland called, "Intel 'Code for Good' hackathon helps nonprofits solve technology challenges".

The theme for the hackathon: Girls in education

Code for Good co-organizer Josh Bancroft worked with co-organizer Renee Kuriyan and employees in Intel's corporate affairs group to help four nonprofit organizations: World Pulse, a Portland-based nonprofit; Room to Read; 10X10: Educate Girls Change the World; and Global Campaign for Education.

"I felt like we could amplify the impact" by software and corporate working together, Bancroft said. The partnerships were already established by corporate affairs, and the software group simply brought the technical expertise.

Their task was to create software solutions to real-world problems facing girls in education, and for each team to develop an app by the end of the day that the nonprofits could then put to use. The hackathon theme is part of Intel's "She Will" campaign and with the company's overarching mission to "create and extend computing technology to connect and enrich the lives of every person on Earth."

You can read more about the great hackathon over at Oregon Live. I just wanted to point out that not all hackathons are male dominated, and the model can go much further than some of what we see coming out of Silicon Valley.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/8-XTrAwfAF0/

Friday, March 23, 2012

Reach Corporate API Devs On LinkedIn, Independent API Devs On Twitter

Every Monday I generate a list of all the new API developers who registered for the CityGrid API the week before. This list has the name and email address of each developer, and I manually go down the list and email each one individually, letting them know I’m here to help.

As I email each new API developer I use Rapportive to identify more information about them, such as their website URL, Twitter and LinkedIn profile. If a developer has a Twitter or LinkedIn profile, I engage them on those networks.

After 3 months of doing this, I’ve noticed that developers who work for larger companies have LinkedIn profiles, while developers who are freelance, or own or work at a smaller development shop, tend to have Twitter.

It makes sense. Twitter is a more public forum, and smaller companies tend to need to market their skills and services and actively engage the world around them. While established companies tend to keep their developers in a more closed environment, suitable for the business social network LinkedIn.

This type of information helps me segment my API developer audience into various groups allowing me to craft blog posts, announcements and share other information with them on appropriate channels. I’m not a big fan of emailing, so being able to push out information regularly on LinkedIn and Twitter, and engage with users that are active on these networks, works very well for supporting my approach to API developer support.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/5flWgXs_X7I/

Wednesday, March 21, 2012

Using Github For Your Hackathon

I’ve spent a lot of time trying to track what gets created at hackathons. Some innovative programming occurs at these events, and I’d say 98% of it is forgotten by Monday morning.

I think a lot of attention is given to the myth that hackathons are about building startups, when in reality, how many startups actually come out of hackathons? In my opinion the top three things that come out of a hackathon are:

  1. Marketing - For the event organizer, sponsors and for participants.
  2. Talent Acquisition - There are some seriously talented folks attending hackathons.
  3. Networking - Hackathons are a great place to meet people, network, make friends and sure, maybe even find a co-founder.

Beyond that, I think there is a lot of code that gets generated at hackathons that really never goes anywhere. Much of this code is not immediately VC-fundable, but some of it is still really good. And I see a lot of kids spending whole weekends writing the same or similar code as I saw the weekend before.

To help remedy this, I suggest that event organizers use Github. Just like every hackathon organizer should identify a Twitter #hashtag for the event, you should set up a Github organization and add your hackers as users. Encourage them to commit their code throughout the hackathon, and wrap up the weekend with a nice README file for their project.

It seems like a great way to showcase what was built over the weekend, make the intellectual property open and accessible for other people to use, and make it easier for future hackathon participants to build off what has already been created.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/OpV-TFIMYdc/

Tuesday, March 20, 2012

Is The Blog for Your API Up to Date?

As a building block, a blog is a very valuable tool for building awareness of your API, keeping your developers informed and giving a personality to the team behind the API.

My recommendation to API owners is to always have a blog; however, one of the most damaging things you can do is stop posting to it. A blog is a key variable in my API Stack algorithm for deciding whether or not an API is worth integrating with, and if you haven’t updated your blog in the last year, I immediately step away.

If you want people to find your API, have a blog. If you want your existing developers to be educated about what can be done with your API, have a blog. If you want developers who find your API to feel confident enough to integrate it into their app, keep your blog updated!



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/tEz5Jjb6KD8/

Learning from Sqoot: Making Hackathons Accessible to EVERYONE!

Boston API Jam organizer Sqoot just got themselves into a lot of hot water with some very sexist comments on their hackathon home page (now removed):

From what I can tell, Sqoot was called out by @BoazSender:

Then the pressure was quickly put on in the Twittersphere:

Resulting in Apigee pulling their sponsorship:

And Heroku pulling theirs as well:

Sqoot quickly released an "apology" which included:

"While we thought this was a fun, harmless comment poking fun at the fact that hack-a-thons are typically male-dominated, others were offended."

Their "apology" definitely summarizes the misogynistic tone of the spreading Silicon Valley startup and hackathon culture.

This is a major problem. One of my biggest complaints about many hackathons is that they are male dominated, and this is nothing to joke about, make fun of or take lightly. You don’t realize it when all you do is attend the all boy hackathons and do your bro-gramming.

I’ve been to hackathons where the sexes are equally represented, and they are awesome--way more fun, and they truly work to solve real-life problems, not perceived problems.

I hope other blogs will publicize this and make other hackathon organizers aware, so this kind of shit not only goes away, but we start working in the other direction, making sure EVERYONE feels welcome at hackathons.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/WC6jBgk3Ops/

Thursday, March 15, 2012

Qualifying for the API Stack

I’m going through hundreds of APIs and curating a list of those that qualify for what I’m calling the API Stack. The API Stack consists of APIs that provide clear value for developers, have demonstrated real investment in their API, and are in it for the long haul.

There are quite a few things I consider when looking at an API, here are a few:

  • Value - The API offers clear value to developers, without needing an explanation.
  • Web API - They don’t have to be 100% REST, but web APIs make developers’ lives easier.
  • Active Blog w/ RSS - A blog is a quick way to see what is behind the curtain of an API. If they invest enough to have a blog, communicate with developers, and keep it active, it demonstrates they care. With an RSS feed I can programmatically determine whether they are active or not.
  • Active Forum w/ RSS - A forum gives developers a voice and shows an API owner cares what developers think and is willing to support a large community. An active forum is a positive sign for an API, and the RSS feed again gives me a programmatic way to assess activity.
  • Twitter - An active Twitter presence is a pretty key metric in understanding the value of an API. If the company has the resources to actively engage developers via Twitter, they care. And again, I can programmatically monitor and see if someone exists behind the API.
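The RSS checks in the last two bullets are easy to automate. Here is a minimal sketch in Python, assuming a plain RSS 2.0 feed with RFC 822 pubDate values; the function names and the 90-day threshold are my own choices, not part of any formal algorithm:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime
from xml.etree import ElementTree


def days_since_last_post(rss_xml, now=None):
    """Return days elapsed since the newest <pubDate> in an RSS 2.0 feed."""
    now = now or datetime.now(timezone.utc)
    root = ElementTree.fromstring(rss_xml)
    dates = [parsedate_to_datetime(el.text)
             for el in root.iter("pubDate") if el.text]
    if not dates:
        return None  # no dated items -- treat the feed as inactive
    return (now - max(dates)).days


def looks_active(rss_xml, threshold_days=90):
    """An API area 'looks like someone is home' if it posted recently."""
    age = days_since_last_post(rss_xml)
    return age is not None and age <= threshold_days
```

Point this at the feed fetched from a blog or forum and you get a quick, programmatic "is anyone home?" signal for each API you are assessing.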

There are other metrics I use, but these are proving to be the most positive characteristics of an API that developers can depend on. Many APIs I come across don’t even meet these basic requirements. What triggered this post was that I had just looked at about 10 APIs that did not meet them, and then I came across one that clearly met all of these, and more--GeoIQ.

GeoIQ has a mapping API that provides clear value to developers, and it’s obvious they are investing the necessary resources to make the API successful. So I’m adding GeoIQ to my API Stack under the category of “local”.

If you feel your API is a good fit for the API stack, make sure and ping me.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/A80CBXtZvL8/

Wednesday, March 14, 2012

Knodes Announces $250K Fund to Invest In Their API Developers

Social data analyzing API provider Knodes has just announced a $250K startup fund that will invest in the best products, websites and applications built on top of the Knodes API platform.

Knodes Social Data Analyzing API provides developers with social data-based insights about their users across social networks like Facebook, Twitter, Foursquare and LinkedIn.

With the launch of the fund, the Knodes team is hoping to bring awareness to their API and jumpstart a new ecosystem of developers, businesses and campaigns whose next generation social applications will be powered by it.

The Knodes Fund, which is in partnership with Quotidian Ventures, will take the best websites, web and mobile applications and give them the resources they need to be successful, including up to $25,000 in equity funding, mentorship, and access to new features and social intelligence available via the Social Data Analyzing API.

The Knodes Fund is open to all individuals and companies building an application, website or program using the Knodes API. Applications can be submitted at Knod.es beginning May 1, 2012, and the deadline to apply for the Knodes Fund is June 1, 2012.

The Knodes Fund represents a growing convergence of the innovation occurring within API ecosystems and a growing number of startup incubators. APIs provide a fertile environment to entice talented developers with fresh ideas to build new applications without commitment, while also making it possible to invest in successful ideas that prove themselves--bringing to life the original business development 2.0 promise of the API ecosystem.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/_NOpX1yDqvo/

Andreas Krohn - API Evangelist

Andreas Krohn (@andreaskrohn) first saw the potential of APIs when he started working with enterprise web scraping solutions at Kapow Technologies (now Kapow Software) in 2005. Conceptualizing the web as one big database and creating APIs to get to that data is not that different from the REST APIs used today. When he left Kapow to start his own company, it was a given that the focus would be on APIs and open data.

Andreas’s API work is focused on Sweden, where he regularly speaks at conferences. Mostly he focuses on the non-technical side of APIs, since APIs are doomed to fail without a good business strategy and long-term marketing plan. The ones who really need to understand the potential of APIs are the business people, so focusing on the money instead of the technology is the way forward. He also provides consulting services to customers regarding API strategies and technologies; one such project he has been active in is Trafiklab, a portal for Swedish public transport APIs. The goal of the project is to collect Swedish travel-related APIs in one place, with consistent documentation, API-key handling and support, to make them as attractive to developers as possible.

He writes about APIs on Sweden’s leading API blog, Mashup.se, and has done so for several years. The focus of Mashup.se is international API news as well as highlighting local Swedish APIs. To increase the use of Swedish APIs he recently launched the Swedish API directory, listing over 200 Swedish APIs. Not much compared to the 5,000+ APIs listed on ProgrammableWeb, but a lot for a small country where the wave of APIs is just getting started. The development of APIs in Sweden is currently led by government agencies due to European Union directives, but Andreas hopes that showcasing the available private APIs will lead to wider use of APIs and more private companies developing them.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/cu0pCrno0jM/

Thursday, March 8, 2012

Automated Documentation for REST APIs

This post comes from the SDK Bridge newsletter. I find so much value in what Peter and Jonathan do over at SDK Bridge that I always have to post their newsletter here and share it with all of you.

People are constantly trying to come up with tools to make API documentation an easier task. If you are documenting an SDK built for C++, C#, or Java, there are tools such as Doxygen, Sandcastle, and JavaDocs to take comments from the code and automatically generate documentation from them. Why aren't there tools like this for REST APIs?

The beauty of Web APIs is that they can be written in whatever language you like and in whatever manner you like. As long as when an HTTP request comes in, the proper HTTP response goes out, it doesn't matter how it actually happens on the server. But this very flexibility makes automated documentation nearly impossible, since there's no standard mapping between what an API request is and what the code is that generates its response.

Nonetheless, there are some solutions out there to this problem. I need to start by saying that there are in fact two approaches to automation that are used to document REST APIs. One is similar to the tools I mentioned above, where comments are taken from code to generate the documentation. The other involves having the documentation separate from the code, but in a data format (such as JSON) that can be parsed and used to generate the documentation.

I also should mention that documentation automation does not guarantee good documentation. Before choosing to incorporate automation into your process, I recommend reading an excellent article by Dana Fujikawa: What to Consider Before Considering Auto-Generated Documentation.

Automated Documentation from Code

There's no off-the-shelf tool that pulls documentation comments out of code that's going to work for all REST APIs. But there are two possible solutions:

  1. Use a framework that generates both the APIs and the documentation.
  2. Create methods with a one-to-one mapping to API requests.

Framework. A good example of a REST API framework is Enunciate. Enunciate is an open-source Java-based Web API framework. It creates full HTML documentation of the services it generates, where the documentation is assembled from JavaDoc comments.

Mapping. Mapping requires some disciplined practices, but has the advantage that it can be used with any technology. In this case, you need to create public methods that map directly to API requests. So, for example, you might have an API request to get a brief user profile for a user with an ID of 23423 with a call like this:

GET http://api.example.com/users/23423/profile?type=brief

When this request comes in, you need to structure your code so that it calls a method by the name of something like:

public get_users__id__profile(int id, string profile_type)

Note that the id is surrounded by double underscores, indicating that it is not literally the text "id".

This method would then have comments that could be picked up by an automated tool such as JavaDoc, RDoc, or Sandcastle, and HTML documentation would be generated. Then you would need to run the HTML documentation through an automated process that removes unnecessary information (such as class names) and converts the method names, replacing single underscores with slashes and double underscores with slashes and brackets, so that

public get_users__id__profile

would become

GET /users/{id}/profile

The parameters table would also need some modification so that it's clear which parameters are part of the URL and which are query parameters.

It's not a simple process, but I have seen it done successfully using Ruby code and RDoc.
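To sketch just the name-to-path conversion step (in Python rather than the Ruby mentioned above, and with a function name that is my own invention), the renaming rules might look like:

```python
def method_name_to_request(name):
    """Turn a method name like 'get_users__id__profile' into its REST
    form, 'GET /users/{id}/profile'.

    The leading token is the HTTP verb, single underscores become
    slashes, and a name wrapped in double underscores becomes a
    {placeholder} path parameter.
    """
    verb, _, rest = name.partition("_")
    parts = rest.split("__")
    for i, part in enumerate(parts):
        if i % 2:
            # odd-indexed pieces sat between double underscores
            parts[i] = "{%s}" % part
        else:
            parts[i] = part.strip("_").replace("_", "/")
    return "%s /%s" % (verb.upper(), "/".join(p for p in parts if p))
```

Running it on the example above yields `GET /users/{id}/profile`; the parameters table cleanup the article describes would still be a separate pass.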

Automated Documentation from Structured Data

The advantage of taking comments from code is that if there are changes in the code, the comments are more likely to be updated. However, a simpler and very flexible solution is to have the documentation in structured data (JSON or XML), and then have an automated process create the actual HTML documentation from it. There are several tools that will do this, merging documentation with the ability to try out the REST calls, which is extremely handy. Here are some examples.

Swagger. Swagger is a tool created by Wordnik that creates very nice looking API documentation with the ability to easily try any API request. You specify a resource discovery URL which returns JSON with information about the various REST resources; then for each resource, you specify the type of operation, the path, the potential errors, and the response. Although you are limited in how long your descriptions can be, it creates a very nice documentation system for straightforward APIs.

I/O Docs. I/O Docs is a tool created by Mashery that is very similar to Swagger. The big difference is that it is open source. Written in JavaScript, the source is available on GitHub, which means that you can tailor it to your own needs, as well as its look and feel.

Create your own. If neither of these tools is flexible enough for your API, you can create your own. A beautiful implementation that I had the privilege to work on was created by Tendril. Take a look at an example API request at Cost and Consumption for a Single Device. You can see how you can try it out on the first tab, while the other tabs list parameters, the response, and notes. By creating their own system, they were able to document a fairly complex API call which would not have worked with an off-the-shelf system.
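To make the structured-data approach concrete, here is a minimal sketch of generating an HTML fragment from a JSON description. The schema (title, endpoints, parameters) is hypothetical and for illustration only -- it is not Swagger's or I/O Docs' actual format:

```python
import json
from html import escape


def render_docs(spec_json):
    """Render a hypothetical JSON API description as an HTML fragment."""
    spec = json.loads(spec_json)
    out = ["<h1>%s</h1>" % escape(spec["title"])]
    for ep in spec["endpoints"]:
        out.append("<h2>%s %s</h2>" % (escape(ep["method"]),
                                       escape(ep["path"])))
        out.append("<p>%s</p>" % escape(ep["description"]))
        if ep.get("parameters"):
            # one row per parameter: name and description
            rows = "".join(
                "<tr><td>%s</td><td>%s</td></tr>"
                % (escape(p["name"]), escape(p["description"]))
                for p in ep["parameters"])
            out.append("<table>%s</table>" % rows)
    return "\n".join(out)
```

Because the description lives in data rather than comments, the same file can also drive an interactive "try it" console, which is exactly what the tools above do.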

Conclusion

Automated REST API documentation can be used to:

  • Keep the documentation near the code so that it's easier to update.
  • Allow developers to try out the API requests as part of the documentation.

Although it is impossible to have a tool that automatically generates REST API documentation from any code, there are a number of approaches that will let you autogenerate the documentation, including:

  • Using a framework that generates both the API code and its documentation.
  • Creating a mapping between methods and API requests and using standard documentation tools.
  • Writing documentation as structured data and generating HTML from it.


from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/JXa0K1ILT2E/

Monday, March 5, 2012

Turning API Forum Posts into Blog Stories

I’m always looking for new, relevant ideas to write blog posts on for the CityGrid Developer blog. I have several topics I write about regularly including new projects I’m working on, new releases around the API, and what I find during my local, mobile and social landscape analysis.

However, it can be hard to find topics to write about that are relevant to CityGrid developers, or publishers as we call them. To help write blog posts that are useful to my API community, I started harvesting ideas and topics from actual forum posts from developers.

Earlier today a publisher submitted a forum post stating their concerns about the age of some of the reviews they get along with businesses when making requests against the CityGrid Places API. It was an easy question to answer, since CityGrid also provides a business reviews API that gives you more control over how you pull reviews for a zip code, neighborhood or specific business.

Using the topic that was posted on the forum I was easily able to create a quick blog post framing the question and providing a detailed answer explaining how the CityGrid Reviews API provides a solution.

Now CityGrid publishers can potentially be exposed to this solution via the blog and RSS feed, and since I actively syndicate my blog posts, they will see it if they follow CityGrid on Twitter, LinkedIn or Facebook, or find it using a search engine.

When evangelizing for an API, I feel it is important to provide content that is as helpful as possible via an active blog presence, while also reverberating it across the multitude of channels your developers might be listening on--turning relevant forum posts into stories is a great way to do this.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/lzNNaJcc-L4/

CityGrid Local, Mobile, Social Stack: Verizon Mapkit API

I’m spending more time building what I’ve dubbed the CityGrid Local, Mobile, Social Stack, a list of APIs, platforms and tools that you can use in your local-mobile applications. With the latest move by Foursquare to join the OpenStreetMap movement, I’m focused on finding the best mapping tools for the CityGrid Local, Mobile, Social Stack.

First on my list of alternative mapping solutions is the Verizon Mapkit API, which provides location-based services that include maps, search, traffic and static directions. The Verizon Mapkit API is centered on a map object that provides a tile-based solution supporting multiple layers such as road map/satellite/hybrid, traffic, and routes--with built-in controls for standard map operations such as panning and zooming using the host device’s native gestures.

The MapKit also provides a full set of search APIs delivering geocoding, geolocation and access to local content:

  • Address Search (Geocoding)
  • Reverse Geocoding
  • Local Search
  • Fuel Price Search
  • Movie Theaters
  • Movie Show-Times
  • Events
  • Event Venues
  • Traffic Incidents
  • Static Directions

The Verizon Mapkit API does not currently have a web API, but it does support native application development on the Android, iOS, BlackBerry and Brew MP platforms. At first glance you may think the mapping solution is just for Verizon devices, but it can be used across multiple platforms.

The Verizon Mapkit API is a perfect addition to the CityGrid Local, Mobile, Social Stack. They offer a robust mapping platform that native mobile apps can take advantage of. The Mapkit API is part of the Verizon NavBuilder Inside LBS SDK, which comes with other APIs and tools I will be adding to the CityGrid Local, Mobile, Social Stack in future blog posts.




from Kin Lane http://feedproxy.google.com/~r/KinLane/~3/0rCZcHCVRFQ/

Sunday, March 4, 2012

Mobile Advertising Platform Round-Up

I’m doing more research for my Local, Mobile, Social Stack, and spending some time understanding the mobile advertising space.

So far I’ve found 24 mobile advertising platforms:

This is just a round-up; next I will profile what each of the mobile advertising providers offers and see how the CityGrid advertising network compares.

Many of these networks offer mobile advertising for the entire world, beyond the local advertising space, but I want to see as many of the providers as I can.

Stay tuned for more detail on mobile advertising as part of the CityGrid Local, Mobile, Social Stack.




from Kin Lane http://feedproxy.google.com/~r/KinLane/~3/QnNfPuMre3Y/

Overview of 11 Places Data APIs

Since starting as API Evangelist here at CityGrid, I have been asked a couple of times how we stack up against other places APIs. So I went through the other places APIs, gathering info in an attempt to see what each offers.


CityGrid Places API

Search Overview - Providing a places search that can be searched by longitude/latitude, or by "where" using cities, neighborhoods, zip codes, metro areas, addresses and intersections. Details for each place are also available.

  • Database Size - 18 Million US Places
  • Store / Cache Data -  No Storage.  Cache up to 15 minutes.
  • Attribution - Include logo and phrase “powered by CityGrid; data from Infogroup ©[YEAR]”
  • Multi-Provider IDs - Yes
  • Meta Data - No
  • Rating / Review - Yes
  • Deals / Offers - Yes
  • Revenue Share - Yes
  • Call Limits - 10M / Month
  • Response Format - XML / JSON
  • Authentication - Key
  • Pricing - Free
  • Check-In - No
  • Write - No
  • Delete - No

Other

  • Places that Pays - A program that monetizes the display of, and interaction with, CityGrid places is called Places that Pay.

URL - http://docs.citygridmedia.com/
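As an aside, the key-based authentication noted above is simple to work with: the publisher key just rides along in the query string. The sketch below only assembles a request URL (no request is sent), and the endpoint and parameter names are hypothetical stand-ins, so consult docs.citygridmedia.com for the real interface:

```python
from urllib.parse import urlencode

# Hypothetical endpoint -- shown only to illustrate the
# key-in-the-query-string pattern, not CityGrid's actual URL.
BASE = "https://api.example.com/places/v2/search/where"


def build_search_url(what, where, publisher_key, fmt="json"):
    """Assemble a key-authenticated places search URL."""
    query = urlencode({
        "what": what,              # e.g. "sushi"
        "where": where,            # city, neighborhood, or zip code
        "publisher": publisher_key,
        "format": fmt,             # responses come back as XML or JSON
    })
    return "%s?%s" % (BASE, query)
```

The same shape applies to most key-authenticated APIs in this round-up: one GET request, a key parameter, and an XML or JSON response.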


Facebook Graph API

Search Overview - Providing the ability to search Facebook Graph objects with a “type” of place, by longitude/latitude, keyword search and area, to find places listed as objects within Facebook.

  • Database Size - Not Found
  • Store / Cache Data - No
  • Attribution - Yes
  • Multi-Provider IDs - No
  • Meta Data - No
  • Rating / Review - Yes
  • Deals / Offers - Yes
  • Revenue Share - No
  • Call Limits - One call per second
  • Response Format - XML / JSON
  • Authentication - OAuth
  • Pricing - Free
  • Check-In - Yes
  • Write - Yes
  • Delete - No

URL - http://developers.facebook.com/docs/reference/api/


Factual

Search Overview - Providing a places search that can be searched by latitude/longitude, and “where” using full text search query string.

  • Database Size - 55 million entities in 47 countries
  • Store / Cache Data - Yes
  • Attribution - Yes
  • Multi-Provider IDs - Yes
  • Meta Data - Yes
  • Rating and Review - No
  • Deals - No
  • Revenue Share - No
  • Call Limits - cross ref = 10,000 per day / crosswalk = 500 per day / read = 10,000 per day  / resolve = 100 per day
  • Response Format - JSON
  • Authentication - Unsigned and signed requests w/ 2-legged OAuth.
  • Pricing - Free
  • Write - Yes
  • Delete - No

Other:

  • Select - What fields to include in the query.
  • Places API - Resolve - Resolve is an entity resolution API that makes partial records complete, matches one entity against another, and assists in de-duping and normalizing datasets.
  • Places API - Crossref - The Crossref API enables you to find the URLs for pages that mention a specific business or point of interest or vice versa.
  • Places API - Restaurants - The U.S. Restaurant table contains Factual's core places attributes in addition to 43 extended attributes on 800,000+ restaurants, bars, and casual eateries, including datatypes such as cuisine, ratings, hours of operation, and price.

URL - http://developer.factual.com/


Foursquare Venue API

Search Overview - Providing a places search that can be queried by a hierarchical list of categories, longitude/latitude, "where" using a search term, venues managed by the requesting user, a time range, trending and exploration.

  • Database Size - Could Not Find
  • Store / Cache Data - Okay to keep caches of Foursquare data as long as they are refreshed at least every 30 days.
  • Attribution - Yes
  • Multi-Provider - Yes
  • Meta Data - Yes
  • Rating and Review - Yes
  • Deals - No
  • Revenue Share - No
  • Call Limits - 5,000 requests per hour
  • Response Format - JSON
  • Authentication - OAuth 2.0; to make a userless request, specify your consumer key's Client ID and Secret instead of an auth token in the request URL.
  • Pricing - Free
  • Write - Yes
  • Delete - No

Other:

  • Actions - You can edit, flag, mark to-do, or propose edits for venues in the Foursquare database.

URL - https://developer.foursquare.com/overview/venues
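The userless authentication noted above can be sketched as a request URL that carries the Client ID and Secret in place of an auth token. The credentials and coordinates here are placeholders, and the `v` version-date parameter is the one the v2 API requires:

```python
from urllib.parse import urlencode

# Placeholder credentials -- use your own Client ID and Secret for userless calls.
params = {
    "ll": "40.7,-74",                    # latitude,longitude
    "query": "coffee",                   # free-text "where" search term
    "client_id": "YOUR_CLIENT_ID",
    "client_secret": "YOUR_CLIENT_SECRET",
    "v": "20120330",                     # required API version date (YYYYMMDD)
}

url = "https://api.foursquare.com/v2/venues/search?" + urlencode(params)
print(url)
```

Authenticated (non-userless) calls swap the two client parameters for a single `oauth_token` obtained through the regular OAuth 2.0 flow.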


Fwix

Search Overview - Providing a places search that can be queried by latitude/longitude, and text search based upon categories, address, city, province, postal code, country, neighborhood and text keyword.

  • Database Size - 23M in US.
  • Store / Cache Data - No storage. Yes to cache.
  • Attribution - Yes
  • Multi-Provider IDs - No
  • Meta Data - Yes
  • Rating / Review - No
  • Deals / Offers - No
  • Revenue Share - Yes
  • Call Limits - 5,000 calls per unique user per day
  • Response Format - XML / JSON
  • Authentication - Key
  • Pricing - Free
  • Write - Yes
  • Delete - Yes

Other:

  • Geotagger Methods - Returns places geotagged to a given web page.
  • Content Methods - Returns geotagged content in or near a location.

URL - http://fwix.com/developer_tools/api


Google Places API

Search Overview - Providing a places search that can be queried by latitude/longitude, keyword matched against all fields, name of place, or type of place restricted by radius, as well as pulling details for each place.

  • Database Size - Could not find
  • Store / Cache Data -
  • Attribution - "Powered by Google" logo is displayed above or below the data
  • Multi-Provider IDs - No
  • Meta Data - No
  • Rating / Review - Yes
  • Deals / Offers - No
  • Revenue Share - No
  • Call Limits - 1,000 requests per 24 hour period
  • Response Format - XML / JSON
  • Authentication - Key
  • Pricing - Free
  • Check-In - Yes
  • Write - Yes
  • Delete - Yes

URL - http://code.google.com/apis/maps/documentation/places/
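A Place Search request combining the parameters described above can be sketched like this. The key and coordinates are placeholders; the `sensor` parameter, which the API required at the time, declares whether the location came from a device sensor:

```python
from urllib.parse import urlencode

# Placeholder key and coordinates.
params = {
    "location": "51.5,-0.12",  # latitude,longitude
    "radius": 500,             # search radius in meters
    "keyword": "pizza",        # matched against all content fields
    "sensor": "false",         # required: is the location from a GPS sensor?
    "key": "YOUR_API_KEY",
}

url = ("https://maps.googleapis.com/maps/api/place/search/json?"
       + urlencode(params))
print(url)
```

Swapping `/json` for `/xml` in the path selects the XML response format instead.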


InfoChimps

Search Overview - Providing a places search that can be searched by longitude/latitude with radius, address, bounding box or IP address.

  • Database Size - Could not find
  • Store / Cache Data - Yes
  • Attribution - Yes
  • Multi-Provider IDs - No
  • Meta Data - No
  • Rating and Review - No
  • Deals / Offers - No
  • Revenue Share - No
  • Call Limits - Couldn’t Find
  • Response Format - XML / JSON
  • Authentication - Key
  • Pricing - Free w/ Premium Pricing
  • Check-In - No
  • Write - Yes
  • Delete - Yes

URL - http://www.infochimps.com/datasets/business-places-by-locationary


Nokia

Search Overview - Providing a JavaScript places search that can be searched by search term, with a detail search for display by JS widget.

  • Database Size - Not Found
  • Store / Cache Data - No
  • Attribution - Yes
  • Multi-Provider IDs - No
  • Meta Data - No
  • Rating / Review - No
  • Deals / Offers - No
  • Revenue Share - No
  • Call Limits - Not Found
  • Response Format - JSON
  • Authentication - None
  • Pricing - Free
  • Check-In - No
  • Write - No
  • Delete - No

URL - http://api.maps.nokia.com/places/index.html


Yahoo GeoPlanet

Search Overview - Providing a places search that can be queried by type, county, state, country, oceans, seas, continents, hierarchy and full text search. Also returns place details by ID.

  • Database Size - Could not find
  • Store / Cache Data - No
  • Attribution - Must contain the copyright notice "Copyright © Yahoo! Inc. 2008, All Rights Reserved"
  • Multi-Provider IDs - No
  • Meta Data - Yes
  • Rating / Review - No
  • Deals / Offers - No
  • Revenue Share - No
  • Call Limits - “Reasonable Request Volume”
  • Response Format - JSON / XML
  • Authentication - Key
  • Pricing - Free
  • Write - Yes
  • Delete - Yes

URL - http://developer.yahoo.com/geo/geoplanet/


Yelp API

Search Overview - You can search locations using a geo bounding box, longitude and latitude, neighborhood, address or city, and filter listings by "where" using a list of supported categories, as well as pulling details for each place.

  • Database Size - Could not find
  • Store / Cache Data - No
  • Attribution - Yes with logo
  • Multi-Provider IDs - No
  • Meta Data - Yes
  • Rating / Review - Overall count with 3 review excerpts
  • Deals / Offers - Yes
  • Revenue Share - Yes with Commission Junction
  • Call Limits - 10,000 calls/day
  • Response Format - JSON
  • Authentication - OAuth
  • Pricing - Free
  • Check-In - No
  • Write - No
  • Delete - No

URL - http://www.yelp.com/developers/documentation/v2/overview
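Yelp's v2 OAuth requirement means every request must carry an OAuth 1.0a HMAC-SHA1 signature. This is a minimal sketch of the signing step using only the standard library; the credentials are placeholders, the nonce and timestamp are fixed here only so the output is reproducible, and the endpoint is the v2 search URL as documented at the time:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote, urlencode

# Placeholder credentials from the Yelp developer dashboard.
CONSUMER_KEY, CONSUMER_SECRET = "ckey", "csecret"
TOKEN, TOKEN_SECRET = "token", "tsecret"

def sign_request(url, params):
    """Produce an OAuth 1.0a HMAC-SHA1 signature for a GET request."""
    # Parameters are sorted and percent-encoded into the signature base string.
    normalized = "&".join(
        f"{quote(k, safe='')}={quote(str(v), safe='')}"
        for k, v in sorted(params.items())
    )
    base = "&".join(["GET", quote(url, safe=""), quote(normalized, safe="")])
    key = f"{CONSUMER_SECRET}&{TOKEN_SECRET}".encode()
    digest = hmac.new(key, base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

params = {
    "term": "coffee",
    "ll": "37.78,-122.40",
    "oauth_consumer_key": CONSUMER_KEY,
    "oauth_token": TOKEN,
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_timestamp": "1333200000",  # fixed for reproducibility; use time.time()
    "oauth_nonce": "abc123",          # fixed for reproducibility; use a random value
    "oauth_version": "1.0",
}
params["oauth_signature"] = sign_request("http://api.yelp.com/v2/search", params)
print(urlencode(params))
```

In practice an OAuth library handles the nonce, timestamp and signing for you; the sketch just shows what travels in the query string.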


YP

Search Overview - Providing a places search that can be queried by keyword and longitude/latitude, street address, city, postal code, neighborhood, state, points of interest or phone number, with a radius. Place details are also provided.

  • Database Size - Could not find
  • Store / Cache Data - No
  • Attribution - Yes
  • Multi-Provider IDs - No
  • Meta Data - No
  • Rating / Review - Yes
  • Deals / Offers - Yes
  • Revenue Share - Yes
  • Call Limits - 50,000 requests per day.
  • Response Format - XML / JSON
  • Authentication - Key
  • Pricing - Free
  • Check-In - No
  • Write - No
  • Delete - No

URL - http://developer.yp.com/

If you see anything missing or incorrect, let me know at @citygridapiteam.



from API Evangelist http://feedproxy.google.com/~r/ApiEvangelist/~3/HMhpkqc48b0/