Friday, September 19, 2014

Forget Being Neutral And Objective Anymore With APIs

I commonly make the statement that I don't work for any company, and that I strive to be a neutral voice in the industry. After four years, I'm ditching that rhetoric, so if you hear me say it, call me out and say bullshit!! First of all, I don't believe the concept anymore, and I feel like it was bullshit to begin with; second, as the API space picks up momentum, I can't help but push some technologies, products, and services over others.

I'm writing this post to seal the deal in my own memory, which is the primary reason I blog. First of all, when it comes to the companies that sponsor and support me, I can't help but endorse what they offer, not just because they give me money, but because they have a decent product; otherwise I wouldn't be associated with them. There are also a number of API-related tools I use myself, and I can't help but talk about them over some of their competitors in the space.

If you see me endorsing a certain technology, it is often because it just works for me, and if you ask me why, I'll give you an honest answer. The main thing that spurred this post is that I was writing a story about Swagger 2.0, and I always feel bad not writing as many posts about API Blueprint as I do about Swagger, but Swagger is in my consciousness because I use it, and depend on it, every day. I believe in what Jakub and the gang are up to at Apiary, highly support you using their tools and services, and will always work to showcase what they are up to.

Hopefully this line in the sand will help me feel a little less guilty in this area, because I'm reducing the amount of bullshit claims in my writing and stories. (minimum viable bullshit) The Swagger and API Blueprint example is just one of many conflicts I struggle with each day, so I figured being as honest as I can is the only way I'll stay sane in all of this.

Photo Credit: Ricardo Moreira



from http://ift.tt/1wxqY0Z

Sunday, September 14, 2014

Building The Type Of Audience I Really Want

I used to work hard to write blog posts on API Evangelist that would have broad appeal with the Hacker News community, and at first I didn't have any luck; then, after trying to engage with readers on posts, I found myself blacklisted, where nothing I submitted showed up. I lived in some kind of parallel universe, all because I argued with a couple of influential HN users who didn't like what I had to offer.

I started a new account for API Evangelist, and began playing the game with a little different approach. I didn't engage with users, wrote posts with titles that would bring in readers, and kept things short and superficial. I had some front-page exposure, which would result in thousands of pageviews, and then it would dissipate. This type of storytelling never really turned into meaningful traffic, an engaged audience, or conversations, or brought any value to my overall mission.

Early last year I stopped posting to HN. I only occasionally submitted a story, and never really used it in the same way I had before. I was worried my traffic would be hurt, but I went back to work, trying to write meaningful posts that brought value to my target audience and supported my mission. In the last year and a half, I've seen my sustained monthly traffic go from 500 page views a day to almost 2000 a day, all by just sticking to my mission.

The audience that does visit my blog, for the most part, engages with me, shares posts on Twitter, Facebook, LinkedIn and other channels, and I see much of what I write echoed on other blogs, in conversations on social media, and in person at events. This approach to building my audience, and ultimately traffic to my site, has been very healthy in helping me reach my objectives. My goal is not pageviews for advertising's sake, it is to build awareness in the average business person, and the everyday individual, about the importance of APIs.

I'm writing this post because I see echoes of HN in another blog, DZone, which I don't actually syndicate to, but because of the history behind my kinlane.com blog, some of my posts still get hand selected by the DZone staff and syndicated to their blog. Many of these blog posts get some pretty good comment activity, and the other week a blog post I wrote on Google's hiring process got quite a few comments, including some pretty trollish ones.

I made the mistake of feeding the trolls, something I don't usually do, but I can't help myself occasionally, and it made me think of this same illness that many advertising-driven technology sites possess. I'm so happy to have my blog, something I know reaches a wide range of people, not just because of the page views, but also because of the online and in-person feedback I get. The alpha geek crowd is not my audience, and I don't care what they think about what I'm saying, but the trollish comments still get to me sometimes. I quickly shrug them off, but writing about these emotions is one way I do that—resulting in this post.

I'm stoked to have a mission-driven blog that goes beyond monetization through advertising. My audience may be small compared to DZone or Hacker News, and what I do may not matter to a lot of their users, but at least I have a purpose that involves helping educate people, and I don't feel the need to tell people how stupid they are, or that what they do is worthless. I can't imagine being so lost that doing that makes you feel better. It makes me sad, but then I get back to work, and move on, keeping on with my mission.

Photo Credit: Jonathan C. Dietrich



from http://ift.tt/X29sVG

Saturday, September 13, 2014

Moving Beyond Server Side Code And Hosting

I remember all the pain and suffering I used to go through in the old days looking for a place to park my server side code, leading me to eventually invest in my own servers, and a rack at a datacenter. All of this changed with Amazon S3 & EC2. After 2008 I migrated everything to the cloud, sold all my servers, and never looked back.

I'm completing two stories, one on an innovative API deployment platform called Blockspring, and the other on automation platform Temboo's new JavaScript SDK for the 100+ API platforms Temboo connects with. Both of these evolutions in technology, and in cloud computing, reflect one possible future where the concept of "server side code" will fade away, bringing us closer to a more programmable web.

While there will always be robust server side frameworks, I'm seeing a shift where the need to have server side coding skills to deploy websites, mobile apps, and single page apps will go away. I envision containerized solutions like WordPress that allow us to deploy anything we need to support app deployment, and allow us to configure, tweak, and evolve without being aware of what the backend is up to.

I know these thoughts will drive many developers crazy, thinking that I will open up the gates to a much shittier web, but I think if developers build high quality, very configurable, and modular APIs, we can help mitigate the trash that gets deployed. Whether you like it or not, WordPress runs over 65M websites, and I think we can do even better with the next generation of online apps, and the APIs that drive them.

I'd love to see the concept of hosting evolve into allowing anyone to park the apps they need anywhere they want, even on their Facebook account, or via their Dropbox. I think APIs and containers will go a long way toward moving us to this future. It is something that won't happen overnight, but eventually hosting, and much of the server side code wrangling we've done in the past, will go away.



from http://ift.tt/1AL91fV

Monday, August 25, 2014

Check Out This Lineup Of Speakers At Defrag, Including Myself

Check out this lineup of the first round of speakers at Defrag this year! I recently got an email from the event queen Kimberly Norlin, and I'm pretty stoked about some of the people I will be sharing the stage with in Colorado this year.

The Norlins rock when it comes to bringing some of the smartest folks in the space to Broomfield, CO each year.

My presentation will be about containers and APIs, borrowing from John Sheehan, when he said "containers will do for APIs what APIs do for companies". Which is a pretty prescient comment, but not surprising from someone like John, who is on the leading edge with his API integration startup Runscope.

Make sure and get registered for Defrag. If you aren't familiar with what Defrag is all about, read one of my earlier posts about the Defrag experience--in short, Defrag is where all the tech leaders come together in Colorado once a year, to discuss what is going on in the space, and drink a lot of beer!



from http://ift.tt/1vJ4o53

6,482 Datasets Available Across 22 Federal Agencies In Data.json Files

It has been a few months since I ran any of my federal government data.json harvesting, so I picked my work back up, and will be doing more around the datasets that federal agencies have been making available, and telling the stories across my network.

I'm still surprised at how many people are unaware that 22 of the top federal agencies have data inventories of their public data assets, available in the root of their domain as a data.json file. This means you can go to the root of many agency domains (http://ift.tt/1zweO8m) and find a machine readable list of that agency's current inventory of public datasets.
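
To give a sense of how simple these inventories are to work with, here is a minimal harvesting sketch; the domain is a placeholder for any of the agency domains listed below, and depending on the schema version the root of the file may be a plain list or an object with a "dataset" key:

```python
# Minimal data.json harvest sketch. The domain below is a placeholder;
# swap in any of the agency domains from the list that follows.
import json
import urllib.request

url = "https://www.example.gov/data.json"  # hypothetical agency domain

with urllib.request.urlopen(url) as response:
    inventory = json.load(response)

# Depending on the schema version, the root is either a plain list of
# datasets or an object with a "dataset" key.
datasets = inventory["dataset"] if isinstance(inventory, dict) else inventory

print(len(datasets), "datasets found")
for dataset in datasets[:10]:
    print("-", dataset.get("title"))
```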

I currently know of 22 federal agencies who have published data.json files:

Consumer Financial Protection Bureau
Department of Agriculture (USDA)
Department of Defense (DOD)
Department of Energy (DOE)
Department of Justice (DOJ)
Department of State
Department of the Treasury
Department of Transportation (DOT)
Department of Veterans Affairs (VA)
Environmental Protection Agency (EPA)
General Services Administration (GSA)
Institute of Museum and Library Services (IMLS)
Millennium Challenge Corporation (MCC)
National Aeronautics and Space Administration (NASA)
National Archives and Records Administration (NARA)
National Institute of Standards and Technology (NIST)
National Science Foundation (NSF)
National Transportation Safety Board (NTSB)
Nuclear Regulatory Commission (NRC)
Office of Personnel Management (OPM)
Social Security Administration (SSA)
United States Agency for International Development (USAID)

You can click on the logo or name, and view the full data.json files. You can also visit my Federal Agency Dataset Adoption work to see all of the datasets listed for each agency. There is still one bug I've noticed in the adoption process, so don't adopt anything quite yet.

The goal of this is just to highlight, again, that there is a wealth of open data resources just waiting for all of us open gov hackers to take advantage of, and make sense of. Federal agencies need our help, so get involved, there is a lot of work to be done.



from http://ift.tt/1lt8jma

Wednesday, August 20, 2014

My Information Consumption Now Driven By Companies And People, Not Just Feeds

While I still worry about the health of RSS, I feel like my overall information consumption has significantly evolved in some very meaningful ways since Google decided to shut down Google Reader. A year before the shutdown of Google Reader, I had developed my own curation and monitoring system, which included the ability to pull RSS feeds, so I was already decommissioning Google Reader before it was officially abandoned.

In my monitoring system I can add three types of entries:

  • Entities - Businesses, organizations, government agencies, or any other non-human entity.
  • Individuals - Basically my individual CRM system for everyone I know, and don’t know.
  • Feeds - Other generic feeds from some forums, streams, etc.

For each of these entries I pull the following streams of information:

  • Blog - The RSS streams from company and individual blogs, providing a pretty key signal about companies and individuals doing cool stuff with APIs.
  • Twitter - The tweets, DMs, and links produced as part of a company's or individual's social exhaust on Twitter.
  • Github - The repositories, commits, and interactions around public repositories maintained by individuals and companies.
  • Email - I'm still working on the best way to process email communications with companies and individuals, while also considering parsing email newsletters to balance out the shift from RSS to email blasts by startups.

If a company or individual has a blog, Twitter or Github account I pull any publicly available signals. I use these signals to stay informed using the blog posts, valuable conversations and links available on Twitter, and keep up to date with the latest API tech and code being developed on Github.

When I’m introduced to a new company or individual, I make sure they are entered into my system, giving me a more complete perspective across their world via their blog, Twitter, and Github accounts. I’ve realized that this approach has evolved my information consumption from being about feeds, to more about companies, the individuals who work at them, and the technology that is being developed.
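
To make this concrete, here is a minimal sketch of what one of these entries could look like in code, with its blog signal being pulled; the field names and the example entry are hypothetical, not lifted from my actual system:

```python
# Hypothetical entry record for an entity-centric monitoring system
# like the one described above; all names and values are illustrative.
from dataclasses import dataclass, field

import feedparser  # pip install feedparser


@dataclass
class Entry:
    name: str
    kind: str                  # "entity", "individual", or "feed"
    blog_feed: str = ""
    twitter: str = ""
    github: str = ""
    signals: list = field(default_factory=list)

    def pull_blog(self):
        """Pull the latest blog posts as signals for this entry."""
        if not self.blog_feed:
            return
        for item in feedparser.parse(self.blog_feed).entries:
            self.signals.append(("blog", item.title, item.link))


# Example: a company entity with its blog, Twitter, and Github on file.
entry = Entry(name="Example API Co", kind="entity",
              blog_feed="http://example.com/feed.xml",
              twitter="exampleapico", github="exampleapico")
entry.pull_blog()
print(len(entry.signals), "blog signals collected")
```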

I feel like this evolution is giving me a much clearer picture of the API space, beyond what RSS alone could do.



from http://ift.tt/1uYZ3sP

Sorry Google, Your Programming Test Is Not A Valid Measurement Of My Skills

I've been talking with a very nice recruiter at Google over the last couple of weeks, and she has been so kind in keeping me updated about opportunities for evangelism at Google. This is the 3rd round of talks I've had with Google while being the API Evangelist, talks that historically go nowhere because of their programming test, which is a super silly aspect of their HR process.

I was straight up with the Google recruiter a couple of weeks ago when she first emailed me, and again when we talked on the phone last week—I do not take programming tests to open up doors for employment conversations, sorry. ;-( It is a waste of my time, and yours, and doesn't measure shit. I understand that you have to qualify a large number of folks at your very algorithmic-centric company, but when it comes to measuring what I do, a programming test isn't a thing.

If programming a tic-tac-toe game on a live screen share is what you need to open up a conversation with professionals around evangelizing your platform, you need to look elsewhere. Nowhere in my role as the API Evangelist do I have to code under time pressure with someone else watching, sorry. I would even say that having hacker skills trumps programming skills in a public-facing evangelism role, where speed and quality of code go out the window. This is about making connections through hacker storytelling, something that doesn't always pencil out to producing the best code, and is more about helping people understand what is possible using a platform, in the most meaningful way—requiring more focus on the person and their problems, not the code or algorithm.

I've managed to have many very meaningful conversations with other tech giants like Intel and IBM, and large institutions like UC Berkeley and BYU, establish fruitful relationships with partners like 3Scale and API Spark, and work across federal agencies like the Departments of Education and Energy, NASA, and the White House around APIs--all without taking programming tests. I talk to startups, SMBs, SMEs, organizations, institutions, and government agencies all the time, and never have to code under pressure, in front of an audience.

I'm not under the illusion that I will change your hiring practices, Google—you are a very smart, and successful, company. All I'm saying is you are probably filtering out some pretty talented folks, who are extremely passionate and knowledgeable in what they do, and connected in their space, and when you won't engage in meaningful conversations without a programming test, you're missing out.

I actually prefer working with organizations from the outside in, and I think it better reflects the essence of API Evangelism. The companies who have trouble working with outside entities, without traditional HR processes, are probably not going to lead when it comes to developing an API driven ecosystem.

If your company doesn't have the time to research me, and understand what I bring to the API space, and what my skills are, we probably aren't a fit. Everything about me is available online at API Evangelist, Kin Lane, Twitter, and Github--you just have to look. If you are only looking at resumes, and making people take tests, you will probably get what you are looking for!



from http://ift.tt/1tqX7pb

Monday, August 18, 2014

All Government Should Have A Social Media Directory API

I was just looking for a list of Twitter accounts for the City of Chicago, and I came across the standard social media directory page you will find at most city, county, and state government websites.

After looking at the list, I had a decision to make. I could either manually enter each of the Twitter accounts into my CRM, or I could write a script to scrape the page, harvest the content, and put it into my CRM for me--I prefer to write scripts over data entry any day.

The resulting JSON was a simple list of account objects, one for each social media account listed on the page.
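
For anyone who wants to do the same, a rough sketch of this kind of scrape might look like the following; the URL and the CSS selector are placeholders, since every government directory page is structured differently:

```python
# Hypothetical scrape of a government social media directory page. The
# URL and selector are placeholders; adjust both for the actual page.
import json
import urllib.request

from bs4 import BeautifulSoup  # pip install beautifulsoup4

url = "https://www.example.gov/social-media-directory"
html = urllib.request.urlopen(url).read()
soup = BeautifulSoup(html, "html.parser")

# Grab every link on the page that points at a Twitter account.
accounts = []
for link in soup.select("a[href*='twitter.com/']"):
    accounts.append({
        "name": link.get_text(strip=True),
        "platform": "twitter",
        "url": link["href"],
    })

print(json.dumps(accounts, indent=2))
```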

It got me thinking that every government entity, whether city, county, or state, should have a social media directory API like you find for the federal government. We should never have to scrape a list of our government social media accounts.

Once there is an API for each government entity, we can then drive existing websites, and other current locations, with the API, as well as use it in any new web or mobile apps.



from http://ift.tt/VBg1yd

Wednesday, August 13, 2014

Never Looking Out The Window, Let Alone Trusting Anyone External To The Department of Veterans Affairs

I haven't written much about my experience last summer as a Presidential Innovation Fellow (PIF) at the Department of Veterans Affairs (VA). I have lots of thoughts about my experience at the VA, as well as participating in the PIF program, and I choose to trickle these thoughts out as I continue to make sense of them, and bring them into alignment with my overall mission as the API Evangelist.

I was given three projects when I started work at the VA: 1) inventory data assets, 2) inventory web services, and 3) move forward D2D, a forms web service that would allow VA hospitals and Veteran Service Organizations (VSOs) to submit forms through the claims process on behalf of veterans.

The most prevalent illness I witnessed across these three efforts was an unwillingness to trust outside groups (even VSOs and hospitals), and a lack of desire to share data and resources with anyone outside of the VA (ironically, except contractors), to the point where groups seemed to take defensive positions around what they did on behalf of our veterans. This culture makes for some pretty toxic environments, which I personally feel contributed to many of the problems we've seen bubble up into the public media space of late.

While working at the VA you constantly hear about the VA claims backlog, and how we need to optimize, but when you bring up sharing data or resources with other federal agencies, or trusted external partners like hospitals and VSOs, you get pushback with concerns about security, personally identifiable information (PII), etc. These are all valid concerns, but there are proven ways to mitigate these risks through Identity and Access Management (IAM), which is another whole post in itself. You start feeling crazy when you get pushback for arguing that a doctor should be able to submit disability questionnaires via an iPad application that uses an existing VA API, in a way that securely authenticates the doctor.

As a result of other systemic and cultural issues, and mistakes made in the past, VA employees and contractors are extremely averse to opening up to the outside world, even if it can help. I kept hearing references to the 2006 data breach, where an employee brought a laptop home, affecting 26M individuals, as a reason to keep systems locked down. This horror story, plus a variety of other cultural issues, is keeping VA staff from accepting any new way of thinking, even if it could help reduce their workload, improve the claims process, and better serve veterans and their families.

This is a pretty fundamental flaw in how large government agencies operate, one that is in conflict with the solutions APIs can bring to the table. I don't give a shit how well designed your API is, in this environment you will fail. Period. I do not think I will ever fully understand what I saw at the VA, while a PIF in Washington DC, but I feel like I'm finally reaching a point where I can at least talk about things publicly, put my thoughts out there, and begin to weave my experiences as a PIF at the VA into my overall API Evangelist message.



from http://ift.tt/VlnQIc

The Color Of Money When Deploying APIs At The Department Of Veterans Affairs

I haven't written much about my experience last summer as a Presidential Innovation Fellow (PIF) at the Department of Veterans Affairs (VA). I have lots of thoughts about my experience at the VA, as well as participating in the PIF program, and I choose to trickle these thoughts out as I continue to make sense of them, and bring them into alignment with my overall mission as the API Evangelist.

I just wrote a piece on replacing legacy systems at the VA using APIs, where one of the systemic constraints in place that restricts the modernization of VA systems using APIs is purely about money, and more specifically the color of money. I won't bore you with the details of the topic, but in short the color of money is this: money appropriated for one purpose cannot be used for a different purpose, according to the Purpose Act (31 U.S.C. § 1301).

In short, if $50M was given to sustain an existing legacy system, and that money cannot be re-appropriated and applied to a newer system, what incentive is there to ever get rid of legacy VA systems, or modernize any government system for that matter? Whether it is using APIs, or anything else. Newer approaches to using technology are difficult to accept when you are working hard to accomplish your job each day, but if you already have $50M in budget for a specific job, and that job won't go away unless you choose to make it go away, guess what happens? Nothing changes…hmmm?

As I said before, I don't give a shit if you deploy APIs from the ground up, or excite via presidential mandates from the top down; if you have incentives in place for employees to do the opposite, based upon how money is allocated, you won't be changing any behavior or culture—you are wasting your energy. I don't care how excited I get any one individual, team, or department about the potential of APIs bundled with new systems, if it means their job is going away—too bad, nobody will give a shit.

Think about this scenario, then consider that $1,810M of the $3,323M overall VA budget (54%) is sustainment. Granted this isn't all IT system sustainment, but still over half of the budget is allocated to keep shit the same as it is. Imagine what environment this creates for the acceptance of modernization efforts at the VA.

This is a pretty fundamental flaw in how large government agencies operate, one that is in conflict with the solutions APIs can bring to the table. I don't give a shit how well designed your API is, in this environment you will fail. Period. I do not think I will ever fully understand what I saw at the VA, while a PIF in Washington DC, but I feel like I'm finally reaching a point where I can at least talk about things publicly, put my thoughts out there, and begin to weave my experiences as a PIF at the VA into my overall API Evangelist message.



from http://ift.tt/1yy49tm

Taking Web Service Inventory At The Department of Veterans Affairs

I haven't written much about my experience last summer as a Presidential Innovation Fellow (PIF) at the Department of Veterans Affairs (VA). I have lots of thoughts about my experience at the VA, as well as participating in the PIF program, and I choose to trickle these thoughts out as I continue to make sense of them, and bring them into alignment with my overall mission as the API Evangelist.

One of the jobs I was tasked with at the VA as a PIF was taking inventory of the web services within the agency. When asking folks where these web services were, I was directed to various IT leads in different groups, each giving me one or two more locations where I could look for Word, Excel, or PDF documents talking about web services used in projects and known systems. Most of the time these were redundant lists, pointing me to the same 5 web services, and omitting the 20-30 that were actually in use for a project.

At one point I was given the contact information for a lady who had been working for two years on a centralized web service registry project that would be the holy grail of web service discovery at the VA. This was it! It was what I was looking for, until I sat in on the weekly call where this project got a 10 minute update, demonstrating that the registry was still stuck defining the how and what of the registry, and had never actually moved on to cataloging actual web services in the wild at the VA. ;-(

Then one day I was introduced to a gentleman in a back office, in an unmarked cubicle, who seemed to know where most of the web services were. The one difference with this person was that he was a contractor, not an employee. One thing you hear about, but do not experience fully until you work in government, is the line between government employee and contractor—in meetings and conversations you know who is who (it is pretty clear), but when it comes to finding APIs, I'm sorry, the contractors know where everything is at. This contractor had some pretty interesting lists of what web services were in operation, where they were, and which groups at the VA owned them, including up to date contact info. These contractors also had their finger on the pulse of any project that was potentially moving the web services conversation forward, including the global registry.

Overall I was surprised at how IT groups knew of their own web services and couldn't care less about the web services of other groups, while the contractors knew where all the web services were across the groups. I was closing in on 500 web services on my list before I left during the shutdown, and I wonder how many more I would have found if I had kept up the good fight. This mission had nothing to do with APIs, except that web services are often compared to APIs; I was purely taking inventory of what was already in place, a process that went far beyond just technical inventory, and shed light on some serious business and political flaws within operations at the VA.

This is a pretty fundamental flaw in how large government agencies operate, one that is in conflict with the solutions APIs can bring to the table. I don't give a shit how well designed your API is, in this environment you will fail. Period. I do not think I will ever fully understand what I saw at the VA, while a PIF in Washington DC, but I feel like I'm finally reaching a point where I can at least talk about things publicly, put my thoughts out there, and begin to weave my experiences as a PIF at the VA into my overall API Evangelist message.



from http://ift.tt/1yy45tC

Replacing Legacy Systems With APIs At The Department Of Veterans Affairs

I haven't written much about my experience last summer as a Presidential Innovation Fellow (PIF) at the Department of Veterans Affairs (VA). I have lots of thoughts about my experience at the VA, as well as participating in the PIF program, and I choose to trickle these thoughts out as I continue to make sense of them, slowly bringing them into alignment with my overall mission as the API Evangelist.

On deck are my thoughts on replacing legacy systems with APIs at the Department of Veterans Affairs. In the "real world", one of the motivations for deploying APIs is to assist in the evolution, and replacement, of legacy systems. The theory is, you have an older system that needs to be replaced, so you wrap it in a modern web API, and slowly switch any desktop, web, mobile, or other client system over to the new API—then you build out the newer backend system, and make the switch in the API layer from the legacy to the newer backend, leaving everything operating as expected. API magic!

I'm used to environments that are hostile to this way of thinking, but most times in the private sector there are other business objectives that can be leveraged to get legacy system owners on board with a shift towards API deployment—I saw no incentive for this practice in the VA environment, where in reality there are incentives for IT and business owners, as well as 3rd party contractors, to keep legacy systems in place, not replace them. There are a variety of motivations for existing VA workers to keep existing systems in place, ranging from not understanding how the system works, to budgetary restrictions on how money flows in support of this pro-sustainment culture.

Here is an example. There is an old database for storing a specific type of document, a database that X number of existing desktop, server, web, or even mobile systems depend on. If I move in and create an API that allows for reading and writing of data into this database, then work with all X of the legacy systems to use the API instead of a direct database connection—in theory I can now work to dismantle the legacy database, and replace it with a newer, modern backend database. In most IT operations, this approach will then allow me to replace, modernize, and evolve upon an existing legacy system. This is a common view of technologists who are purely looking through a technical lens, and ignoring the existing business and political constraints that exist in some companies, organizations, institutions, and government agencies.
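
To make the pattern concrete, here is a minimal sketch of that kind of API facade, with in-memory dictionaries standing in for the legacy and modern databases; the route, the flag, and all of the names are illustrative, not from any actual VA system:

```python
# Minimal sketch of the facade approach described above. Clients talk
# only to the API, and a single flag decides whether reads hit the
# legacy store or its modern replacement. All names are illustrative.
from flask import Flask, jsonify, request  # pip install flask

app = Flask(__name__)

# Stand-ins for the legacy database and the newer backend.
legacy_store = {"1": {"id": "1", "title": "Example disability form"}}
modern_store = {}

USE_MODERN_BACKEND = False  # flip once the new backend is verified


def backend():
    return modern_store if USE_MODERN_BACKEND else legacy_store


@app.route("/documents/<doc_id>", methods=["GET"])
def read_document(doc_id):
    doc = backend().get(doc_id)
    if doc is None:
        return jsonify(error="not found"), 404
    return jsonify(doc)


@app.route("/documents/<doc_id>", methods=["PUT"])
def write_document(doc_id):
    # Dual-write during the migration so both stores stay in sync.
    doc = request.get_json()
    legacy_store[doc_id] = doc
    modern_store[doc_id] = doc
    return jsonify(doc)


if __name__ == "__main__":
    app.run()
```

The point is that clients only ever see the API; once every client is on it and the flag is flipped, the legacy database can be dismantled without any of them noticing.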

In the real world, you have staff, budgets, workflows, and decision making processes that are already in place. Let's say this legacy database had $50M a year allocated in budget for its operation, and I replace it with a newer database, plus API, which operates for $10M a year—you'd think I'd get to reallocate the staff, budget, and other resources to developing newer mobile apps, and other systems, with my newly liberated $40M. Right? Nope…that money goes away, and those people have no interest in moving from supporting a COBOL system, to supporting a new MongoDB + Node.js API that is driving a Swift iPhone app. ;-(

This is a pretty fundamental flaw in how large companies, organizations, institutions, and government agencies operate, one that is in conflict with what an API philosophy can bring to the table. I don't give a shit how well designed your API is, in this environment you will fail. Period. I do not think I will ever fully understand what I saw at the VA, while a PIF in Washington DC, but I feel like I'm finally reaching a point where I can at least talk about things publicly, put my thoughts out there, and begin to weave my experiences as a PIF at the VA into my overall API Evangelist message.



from http://ift.tt/1vKk0c8

Friday, August 1, 2014

Please Provide An Easy To Copy Logo And Description Of What You Do

I spend a lot of time looking at the websites of companies who are doing cool things in the space. I track on about 2000 companies in the API sector, and as part of this monitoring I add the company name, logo, brief description, and usually their Twitter and Github accounts to my systems on a regular basis.

Using this information I will publish a company as part of any research I do across multiple API business categories like API design, deployment, or management. If a company is doing something interesting, I need to be able to quickly find a good quality logo, and a short, concise description of what the company does—something that is easier said than done.

You'd be surprised how hard it is to grab a logo when it's in the CSS, or to find a single description of what the company does, something I usually have to go to Crunchbase or AngelList to find, and often have to write myself.

If you want to help people talk about your company and what you are doing, make it easy for them to find a logo and description. Please don't make us click more than once to find this information--trust me, it will go a long way in helping bloggers, and other people, showcase what you are up to.



from http://ift.tt/1ncPQUX
