Monday, February 23, 2015

Making Sense At The 100K Level: Twitter, Github, And Google Groups

I try to make sense of which companies are doing interesting things in the API space, along with the interesting technologies these companies produce, which sometimes take on a life of their own. The thing I constantly wrestle with is: how do you actually do this? The best tools in my toolbox currently are Twitter and Github. These two platforms provide me with a wealth of information about what is going on within a company or specific project, the surrounding community, and the relationships they have developed (or not) along the way.

Recently I’ve been spending time diving deeper into the Swagger community, and two key sources of information are the @swaggerapi Twitter account, and the Swagger Github account, with its 25+ repositories. Using each of these platform APIs, I can pull followers, favorites, and general activity for the Swagger community. Then I come up against the SwaggerSocket Google Group. While there is a rich amount of information and activity in the forum, with no RSS feed or API I can’t make sense of the conversation at a macro level, alongside the other signals I’m tracking on—grrrrrr.
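As an aside, pulling these Github signals only takes a few lines against the public API—a minimal sketch, assuming the community lives at the swagger-api Github org (GET /orgs/{org}/repos is a real, unauthenticated endpoint, but the particular signals rolled up here are my choice):

```python
import json
from urllib.request import urlopen

def fetch_org_repos(org):
    # Public, unauthenticated endpoint: GET /orgs/{org}/repos
    url = f"https://api.github.com/orgs/{org}/repos?per_page=100"
    with urlopen(url) as resp:
        return json.load(resp)

def summarize(repos):
    # Roll the per-repo signals up into one community-level snapshot.
    return {
        "repos": len(repos),
        "stars": sum(r.get("stargazers_count", 0) for r in repos),
        "forks": sum(r.get("forks_count", 0) for r in repos),
        "open_issues": sum(r.get("open_issues_count", 0) for r in repos),
    }

# summarize(fetch_org_repos("swagger-api"))
```

From there it is easy to capture the snapshot on a schedule, and line it up next to the Twitter numbers.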

At any time I can tune into the activity on Twitter and Github for the Swagger community, but the Google Group takes much more work—I have to go to the website to view it, and manually engage. Ideally I could see Twitter, Github, and Google Group activity side by side, and make sense of the bigger picture. I can get email updates from the forum, but that only applies going forward, and gives me no historical context for the conversation within the group—without visiting the website.

Just a side rant from the day. This is not a critique of the Swagger community, just an outside view on the usage of Google Groups as an API community management tool. I use the platform for APIs.json and API Commons, but I think I might work on a better way to manage the community, one that allows outside entities to better track on the conversation. 



from http://ift.tt/1Eqi2jU

Sunday, February 8, 2015

Emails From People Saying Nice Things And Not Wanting Anything From Me

I process my thoughts through stories on my blogs, and oftentimes you'll find me bitching about people and companies here on kinlane.com. Other times you'll find me waxing poetic about how nice people can be—welcome to my bipolar blogging world.

In this post, I want to say how much I like finding nice emails from people in my inbox, especially when they don’t want anything from me. Getting these nice notes from people, about specific stories, topics, or just generally thanking me for what I do, makes it all worth it.

Ok, I'll stop gushing, but I just wanted to say thank you—you know who you are.



from http://ift.tt/1A91Cee

Friday, February 6, 2015

An Archive.org For Email Newsletters Using Context.io

I’m not going to beat around the bush on this idea; it just needs to get done, and I just don’t have the time. We need an archive.org for email newsletters, and other POP-related elements of the digital world we have created for ourselves. Whether we love or hate the inbox layer of our lives, it plays a significant role in crafting our daily reality. Bottom line: we don’t always keep the history that happens, and we should be recording it all, so that we can pause and re-evaluate at any point in the future.

I cannot keep up with the amount of newsletters flowing into my inbox, but I do need to be able to access this layer as I have the bandwidth to process it. Using Context.io, I need you to create an archive of the popular email newsletters that are emerging. I feel like we are seeing a renaissance in email, in the form of the business newsletter—something I don't always have the time to participate in.

During the course of my daily monitoring, I received an email from Congress.gov about a new legislative email newsletter—something I’d be interested in, but immediately I’m questioning my ability to process the new information:

  • A specific bill in the current Congress - Receive an email when there are updates to a specific bill (new cosponsors, committee action, vote taken, etc.); emails are sent once a day if there has been a change in a particular bill’s status since the previous day.
  • A specific member’s legislative activity - Receive an email when a specific member introduces or cosponsors a bill; emails are sent once a day if a member has introduced or cosponsored a bill since the previous day.
  • Congressional Record - Receive an email as soon as a new issue of the Congressional Record is available on Congress.gov.

This is all information I’m willing to digest, but ultimately I have to weigh it alongside the rest of my information diet—a process that isn’t always equitable. If I could acknowledge an email newsletter as something that I’m interested in, but only when I had time, I would be open to the adoption of a new service.

We need to record this layer of our history, something that our inboxes just aren’t doing well enough. I think we need a steward to step up and be the curator of this important content that is being sent to our inboxes, and doesn’t always exist on the open Internet. Technically I do not think it would be too difficult to do using Context.io, I just think someone needs to spend a lot of time signing up for newsletters, and being creative in crafting the interface and index for people to engage with in meaningful ways—something people will actually find useful and pay for.



from http://ift.tt/1KoYwWh

Tuesday, February 3, 2015

A Machine Readable Version of The President's Fiscal Year 2016 Budget On Github

The release of the president's fiscal year 2016 budget in a machine readable format on Github was one of the most important things to come out of Washington D.C. in a while when it comes to open data and APIs. I was optimistic when the president mandated that all federal agencies go machine readable by default, but the release of the annual budget in this way is an important sign that the White House is following its own open data rhetoric, and something every agency should emulate.

There is still a lot of work to be done to make sense of the federal budget, but having it published in a machine readable format on Github saves a lot of time and energy in this process. As soon as I landed on the Github repository, clicked into the data folder, and saw the three CSV files, I got to work converting them to JSON format. Having the budget available in CSV is a huge step beyond the historic PDFs we’ve had to process in the past to get at the budget numbers, but having it in JSON by default would be even better.
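For what it is worth, the conversion itself is only a few lines—a minimal sketch, where the commented-out file names mirror the three CSVs in the repository's data folder, but the exact paths are an assumption:

```python
import csv
import json

def csv_to_json(csv_path, json_path):
    # Read the budget CSV (first row is the header) and write it back
    # out as a list of JSON objects, one object per budget line item.
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    with open(json_path, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)
    return len(rows)

# for name in ("budauth", "outlays", "receipts"):
#     csv_to_json(f"data/{name}.csv", f"data/{name}.json")
```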

What now? Well, I would like to make more sense of the budget, and to be able to slice and dice it in different ways, I’m going to need an API. Using a Swagger definition, I generated a simple server framework using Slim & PHP, with an endpoint for each file: budauth, outlays, and receipts. Now I just need to add some searching, filtering, paging, and other essential functionality, and it will be ready for public consumption—then I can get to work slicing and dicing the budget, and previous years' budgets, in different ways.

I already have my eye on a couple D3.js visualizations to help me make sense of the budget. First I want to be able to show the scope of budget for different areas of government, to help make the argument against bloat in areas like military. Second, I want to provide some sort of interactive tool that will help me express what my priorities are when it comes to the federal budget--something I've done in the past.

It makes me very happy to see the federal government budget expressed in a machine readable way on Github. Every city, county, state, and federal government agency should be publishing their budgets in this way. PDF is no longer acceptable—in 2015, the minimum bar for a government budget is a CSV on Github. Let’s all get to work!



from http://ift.tt/1DCnz46

Saturday, January 31, 2015

My Smart Little (Big) Brother And Programmatically Making Sense Of PDFs

I was in Portland, Oregon a couple of weeks ago, and one of the things I do when I visit PDX is drink beer with my little (big) brother Michael (@m_thelander). He is a programmer in Portland, working diligently away at Rentrak. Unlike myself, Michael is a classically trained programmer, and someone you want as your employee. ;-) He’s a rock solid guy.

Anyhoo. Michael and I were drinking beer in downtown Portland, and talking about a project he had worked on during an internal hackathon at Rentrak. I won’t give away the details, as I didn’t ask him if I could write this. :-) The project involved the programmatic analysis of thousands of PDFs, so I asked him which tools he was using to work with them.

He said they were stumbling over the differences in formatting between each PDF, and couldn’t get consistent results, so they decided to just save each page as an image, and used the tesseract open source OCR engine to read each one. Doing this essentially flattened the differences between PDF types, while giving him the additional details tesseract provides.
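A rough sketch of the approach as I understand it—the page images would first be exported from the PDF (with something like pdftoppm), then each one fed through the tesseract CLI; the page-*.png naming convention here is my assumption, not his:

```python
import subprocess
from pathlib import Path

def ocr_command(image_path):
    # Build the tesseract CLI call: OCR one page image, text to stdout.
    return ["tesseract", str(image_path), "stdout"]

def ocr_pages(image_dir):
    # OCR every exported page image in order, flattening the formatting
    # differences between PDFs into plain text, one string per page.
    pages = sorted(Path(image_dir).glob("page-*.png"))
    return [subprocess.run(ocr_command(p), capture_output=True,
                           text=True).stdout for p in pages]
```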

It may not seem like much, but ultimately it is a very interesting approach, and as I continue doing big data projects around things like patents, I’m always faced with the question—what do I do with a PDF? I will have to steal (borrow) from my smart little brother's work and build a tesseract API prototype.



from http://ift.tt/1yYL8Es

Wednesday, January 28, 2015

Why Are You So Hard To Get A Hold Of?

This is another post in my ongoing series of regular responses I give to people. Meaning, when I get asked something a lot, I craft a blog post that lives on kinlane.com, and reply to emails, tweets, etc. with a quick link to my standardized response.

One I get less frequently, but still enough to warrant a response, is “why are you so hard to get a hold of?”

To which the answer is, “I’m not”. I have a phone number that is very public, I have 3 emails all going into the same inbox, and a highly active Twitter, LinkedIn, Facebook, and Github presence. If you are having trouble getting a hold of me, it is because you are not using the right channels, or potentially the right frequency.

First, I don’t talk on the phone. I schedule meetings, increasingly only on Thursdays (regularly for partners, etc.), where I talk on Skype, Google Hangouts, and occasionally the phone. When I talk on these channels, I can do nothing else. I can’t multi-task. I am present. If I did this all the time, I wouldn’t be the API Evangelist—I’d be that phone talker guy.

Second, I respond well to quick, concise emails, tweets, wall posts, and Github issues. The shorter and more concise, the better. This is what I mean by frequency: if you send me a long-winded email, there is a good chance it will be weeks before I respond, or that I never will. Sorry, I just don’t have the bandwidth for that frequency—I use short, precise signals.

I do not have a problem with someone being a “phone person”, but I’m not, sorry. In my experience, people who require lots of phone calls also require lots of meetings, and often shift in their needs, because nothing is anchored to any specific outline, document, or project requirements. Personally I try to avoid these types of personalities, because they have proven to be some of the least efficient, and most demanding, relationships in my professional life.

Please don't take this message the wrong way, I'm trying to help you be as successful as you can in making the right connection.



from http://ift.tt/1tvcNME

There Is A Good Chance That I Will Be Remembered For What You Did, Because I Told The Story

My friend Matthew Reinbold (@libel_vox) wrote a great piece on his blog titled Storytelling and The Developer’s Need To Communicate, reflecting on an un-conference session I did last July at API-Craft in Detroit. Thanks for the great thoughts on storytelling Matt—something that is super infectious, and has reminded me of a related story, which I hope continues to emphasize the importance of storytelling in the API space.

Another one of my friends I thoroughly enjoy swapping stories with at API conferences, and in the dark corners of bars around the world, is Mike Amundsen (@mamund). Now I may have the name wrong, but one time Mike told me a story about how John von Neumann (correct me if I’m wrong, Mike) is known for a lot of ideas that he didn’t necessarily come up with on his own. He was just such a prolific thinker and storyteller that he could process other people’s ideas, then publish a paper on the subject before anyone else could. Some people would see this as stealing ideas, but one can also argue that he was just better at storytelling.

While I have developed many of my own ideas over the years, much of what I write about is extracted from what others are up to across the API space. I have made an entire career out of paying attention to what technologists are doing, and telling a (hopefully) compelling story about what I see happening, and how it fits into the bigger API picture. As a result, people often associate certain stories, topics, or concepts to me, when in reality I am just the messenger—something that will also play out in the larger history, told in coming years.

I’m not that old, but I’m old enough to understand how the layers of history lay down, and I have spent a lot of time considering how to craft stories that don’t just get read, but get retold, and have a way better chance of being included in the larger history. As Matthew Reinbold points out, all developers should consider the importance of storytelling in what they do. You don’t have to be a master storyteller, or super successful blogger, but your ideas will be much better formed if storytelling is part of your regular routine, and the chances you will be remembered for what you did increase with each story that you tell.



from http://ift.tt/1CzDYaY

Tuesday, January 27, 2015

Cybersecurity, Bad Behavior, and The US Leading By Example

As I listened to the State of the Union speech the other day, and stewed on the topic for a few days, I can’t help but see the future of our nation's cybersecurity policy through the same lens as I view our historic foreign policy. In my opinion, we’ve spent many years behaving very badly around the world, resulting in very many people who do not like us.

Through our CIA, military, and general foreign policy, we’ve generated much of the hatred towards the west that has resulted in terrorism even being a thing. Sure, it would still exist even if we hadn’t, but we’ve definitely fanned the flames until it became the full-fledged, never-ending, profitable war it is today. This same narrative will play out in the cybersecurity story.

For the foreseeable future, we will be inundated with stories of how badly behaved Russia, China, and other world actors are on the Internet, but it will be through our own bad behavior that we fan the flames of cyberwarfare around the world. Ultimately I will be reading every story of cybersecurity in the future while also looking in the collective US mirror.



from http://ift.tt/1uZSQyK

Thursday, January 22, 2015

I Judge API Providers On How Much Value They Give Back vs. What They Extract

There are a number of data points I evaluate people and companies on while monitoring the API space, but if I had to distill my evaluation of companies down to one thing, it would be based upon how much value they give back to the community vs. how much they extract.

You see, some companies are really good about providing value to the community beyond just their products and services. This is done in many ways, including the open sourcing of tools, the creation of valuable resources like white papers and videos, or just being active in sharing the story behind what they do.

Then there are companies who seem to be masters at extracting value from developers, and the wider API community, without ever really giving back. These companies tend to focus specifically on their products and services, and rarely share their code, knowledge, or other resources with the wider API space.

I’m not going to name specific examples of this in action, but after four years of operating in the space it is becoming easier to spot which camp a company exists in--you know who you are. I understand companies have to make money, but I’m totally judging companies across the API space based upon how much value they give the community vs. how much they extract during the course of their operations.



from http://ift.tt/1yLGdWc

Wednesday, January 21, 2015

When Will My Router Have Docker Containers By Default?

This is something I’m working on building manually, but when will the wireless router for my home or business have Docker container support by default? I want to be able to deploy applications and APIs, either publicly or privately, right on my own doorway to the Internet.

This would take more work than just adding storage, compute, and Docker support at the router level. To enable this there would have to be changes at the network level, something I’m not sure telco and cable providers are willing to support. I’ll be researching this as a concept over the next couple months, so if you know of any ready-to-go solutions, let me know.

It seems like enabling a local layer for Docker deployment would make sense, and help move us towards a more programmable web, where notifications, messaging, storage, and other common elements of our digital lives can live locally. It seems like it would be a valuable aggregation point as the Internet of Things heats up.

I could envision webhooks management, and other Internet of Things household automation, living in this local, containerized layer of our online worlds. This is just a thought. I’ve done no research to flesh this idea out, which is why it's here on kinlane.com. If you know of anything, feel free to send information my way.



from http://ift.tt/1E4bY16

Machine Readable Format For Migrating Concepts From Dreams Into The Real World

Obviously I’m working with APIs.json and Swagger a little too much, because it has really started to affect my dreams. Last night I had a dream where I was working with a university research team to define a machine readable format for migrating concepts from the dream world into the physical world.

I’m not sure I want this machine readable, but regardless it was a fun dream, and I wasn’t worried about it in the dream, so I guess it is ok. In the dream I was able to go to sleep and dream about a concept, then wake up and apply the same concept during my regular day via my iPhone. It allowed me to pick and choose from a notebook of things I had experienced in my dreams, and then apply them in my daily life as I chose.

This post lives in the grey space between my fictional storytelling, and my API Evangelist storytelling, so I’ll leave it here on Kin Lane. If you are planning a startup in this area, let me know. ;-)



from http://ift.tt/15az6vw

Thursday, January 8, 2015

Internet Of Things Security And Privacy Will Always Begin With Asking If We Should Do This At All

As I read and listen to all of the Internet of Things stories coming out of CES, I’m happy to hear discussions around privacy and security come out of the event. I feel better about IoT security and privacy when I hear things like this, but ultimately I am left with overwhelming concern about the quantity of IoT devices.

There are many layers to securing IoT devices, and protecting the privacy of IoT users, but I can't help but think that Internet of Things security and privacy will always begin by asking ourselves if we should be doing this at all. Do we need this object connected to the Internet? Are we truly benefiting from having this item enabled with cloud connectivity?

I'm going to try and keep up with tracking on the API layer being rolled out in support of IoT devices, but I'm not sure I will be able to keep up with the number of devices, and the massive amount of hype around products and services. At some point I may have to tap out, and focus on specific aspects of IoT connectivity, around what I consider the politics of APIs.



from http://ift.tt/1Dr71iv

Wednesday, January 7, 2015

Information Sharing And Collaboration In Government With The 18F Jekyll Hub Template

I’m a big fan of Jekyll based sites. All of the API Evangelist network runs as 100+ little Jekyll sites, within Github repositories, via Github Pages. This is more than just website technology for me—this is my workspace. When you come across a half finished listing of contacts, or building blocks for a particular industry, or possibly a story that isn't fully edited—this is all because you are wandering through my API industry workshop. (pardon the dust)

Over the holidays, my girlfriend Audrey Watters (@audreywatters) completed her migration of Hack Education and her personal blog Audrey Watters to a Jekyll based mode of operation. Read her own thoughts about the newfound freedom Jekyll is giving her over her content, data, workflow, and the publishing of her projects—she is pretty excited.

Like APIs, a Jekyll approach to projects is way more than the technology. It is hard to articulate to folks the freedom, collaboration, flexibility, and transparency it has the potential to introduce. It is something you have to experience, and see in action, before you can fully understand, but I also have to acknowledge that the transparency introduced by this way of working will not be for everyone.

I originally learned what I know about Jekyll from watching leaders in the federal government space, most specifically Development Seed, and round one Presidential Innovation Fellow, and now full-time Githubber, Ben Balter (@BenBalter). Continuing this trend, it makes me happy to see 18F, out of the GSA, providing the 18F Hub, “a Jekyll-based documentation platform that aims to help development teams organize and easily share their information, and to enable easy exploration of the connections between team members, projects, and skill sets.” The 18F Hub is similar to the Developer Hub templates that 18F published, but I think it holds a lot of potential in helping on-board a non-developer audience to the concepts of Jekyll and Github—hopefully making the social coding platform a little less intimidating.

I do not think Jekyll and Github are for everyone. I’m not in the business of evangelizing one platform to rule them all, but I do think Jekyll itself, whether you run it on Github, Amazon S3, Dropbox, or your own hosting or internal network environment, is a powerful tool for any project. I’m eager to keep an eye on which agencies put the 18F Jekyll templates to use, because it will signal for me that there are other healthy things going on at those agencies.



from http://ift.tt/14vZpNx

Tuesday, January 6, 2015

Playing Around With Jekyll Job APIs To Manage My Github Pages

I’m playing around with a concept right now that I’m calling "Jekyll jobs". As you may know, all of my websites use Jekyll, and run on Github Pages. Currently I have over 100 separate repos, and managing the content, and data across these repos can get complex.

I use a standardized approach I call “hacker storytelling” for publishing each of my projects, so I have a handful of things I need to update, ranging from the look of a site, to changes across all Jekyll posts or pages. To help me keep things orderly and efficient, I’m considering a lightweight, API driven jobs framework to help me manage.

I am looking to make many of these “jobs” available to my girlfriend as well, allowing her to target specific resources with specific jobs. Some of the jobs I’ve outlined are:

  • Link Management - Help me catalog, and manage the health of, the links used across all blog posts. A lot of links change, go bad, or suffer any of the numerous other illnesses that occur.
  • Image Management - Help me catalog, optimize, and manage images that are used in my blog posts. I’m pretty good about manually doing a lot of this, but I sure could use help.
  • HTML Management - Sometimes HTML code gets ugly, either because I wrote it and didn’t give it the attention it needed, or possibly because it was generated out of another system, either way there is cleanup and maintenance from time to time.
  • Content Management - After I write a post I like to constantly re-evaluate tagging, indexing, and providing additional links to related content.
  • Content Indexing - My search across all of my Jekyll-driven sites is not the best, and I’d like a way to index all, or specific collections, and serve up a simple search API, maybe using ElasticSearch or something.

As more of my world runs as small, modular Jekyll projects, I’m needing a way to run jobs against them, and designing APIs that do what I need, using the Github API to work with my Jekyll site content, makes sense. I’m thinking I will just pass a Github user, and repo name, as parameters to each Jekyll job API, and have it run a specific task against the _posts folder in the Jekyll install.
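As a sketch of what I mean, each job could build its requests against the Github contents API (GET /repos/{owner}/{repo}/contents/{path} is the real endpoint), with the user and repo passed in as parameters—the function names here are just placeholders:

```python
import json
from urllib.request import urlopen

API = "https://api.github.com"

def posts_url(user, repo, path="_posts"):
    # Point a job at a repo's _posts folder via the Github contents API.
    return f"{API}/repos/{user}/{repo}/contents/{path}"

def list_posts(user, repo):
    # Fetch the folder listing and return just the post file names.
    with urlopen(posts_url(user, repo)) as resp:
        return [item["name"] for item in json.load(resp)]
```

Each job—link checking, image optimization, HTML cleanup—would then iterate over that listing and operate on one post file at a time.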

Since I’m designing these Jekyll jobs as APIs, I can run each one as an API request, and keep the job logic separate from each project. I’ll get a couple of these setup, then blog more about the pros and cons of this approach—who knows, it may not be what I need to get it done.



from http://ift.tt/1FkY5x0

The Rebooting Of WordPress With Just Page, Blog, Image, Link, and Comment APIs

I’m in the process of moving from a custom version of my website and blog manager to a newer version. Back in 2011 I wrote my own custom CMS, as I migrated Audrey and I off WordPress, to deliver more security (obscurity) into our world. As I look to continue the evolution of my CMS, I’m leaving everything behind, just launching APIs, and working from there to build exactly the user interface I will need to manage my world.

Even though I had moved my blog(s) from WordPress three years ago, there was still some serious WordPress residue on everything. Many earlier blog posts have very WordPress-esque HTML, and the graphical template I used was originally a WordPress theme, so there was HTML, comments, and many other fingerprints of the early WP template in there.

As I work through this process, I think of WordPress, and how they were considering putting a full API in the version 4.1 release. I don’t see any evidence of it in there, so I can only assume they pushed back its release. I don’t blame them; after talking with them about the challenges they face, I can imagine it is taking more work than you can imagine.

I can’t help but think about a WordPress reboot. In my world, I hate legacy code and technical debt. I am very willing to just throw everything away and start over—except there is one small difference: I’m not a platform with 65 million users.

However, let’s imagine you could! Just reboot WordPress by launching five simple APIs:

  • Pages
  • Posts
  • Links
  • Images
  • Comments

Then let the ecosystem build everything else. Create the first public and admin UIs, then go from there. Use the brand, the momentum, and the community to reboot, and redefine, the popular CMS platform. I think in just a couple of years, you’d see WordPress looking something like SalesForce or Heroku.
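To make that concrete, here is one way the five-resource surface could be laid out—just a sketch of a plausible route table, not anything WordPress has actually proposed:

```python
# The five resources, each exposed with the standard CRUD operations
# that a public UI, an admin UI, or any ecosystem app could build on.
RESOURCES = ("pages", "posts", "links", "images", "comments")

def routes():
    table = []
    for r in RESOURCES:
        table += [
            ("GET",    f"/{r}"),        # list
            ("POST",   f"/{r}"),        # create
            ("GET",    f"/{r}/{{id}}"), # retrieve one
            ("PUT",    f"/{r}/{{id}}"), # update
            ("DELETE", f"/{r}/{{id}}"), # delete
        ]
    return table
```

Twenty-five routes covers the whole surface—everything else, from themes to plugins, becomes an app built against them.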

For me personally, I like the freedom that comes with using APIs. It makes it easy to decouple legacy code, and evolve small, or even large, parts of what I do. It is another way in which I am very fortunate to do what I do for a living. I think back over my career, and all the legacy code bases I’ve had to herd around like cattle, and I am much happier in my current world of APIs.



from http://ift.tt/1DgwafN

Friday, January 2, 2015

My Unique Visitors and Page Views For API Evangelist Between Google And CloudFlare

I’ve been running all of my websites using CloudFlare since this last Thanksgiving weekend. I pointed all of the name-servers for my primary domains, like apievangelist.com and kinlane.com, to CloudFlare, and I use them to manage my DNS, and other related operations of my primary websites.

I’m intrigued by the reporting at the DNS level provided by CloudFlare, compared to the reporting at the page level provided by Google Analytics. I’ve had Google Analytics installed on all of my websites since I first launched, and use it to establish estimates for the daily and monthly visitors to my websites—beyond that I really don’t care much about these analytics.

Regardless, I think it is interesting to look at the CloudFlare numbers for the last 30 days:

  • Regular Traffic: 112,241
  • Crawlers/Bots: 55,540
  • Threats: 1,697
  • Unique Visitors: 34,501
  • Page Views: 169,478

Then look at the Google Analytics numbers for the last 30 days:

  • Sessions: 22,569
  • Users: 17,880
  • Page Views: 38,949

Ultimately, you can only compare the CloudFlare unique visitors and the Google Analytics users—in my opinion, these are the only two numbers that are comparable. I don’t think CloudFlare removes crawlers/bots from page views, something I’m assuming Google does by default—rendering page views a very different beast for each service.

I take away two things from this: 1) How meaningless data points are, unless you believe in them. 2) How data points can differ from provider to provider, and at different levels of your architecture. If you ask me what my page views are for API Evangelist, what do I say? You didn’t ask whether it was my CloudFlare or my Google Analytics page views!

When I think about the millions of hours spent in Google Analytics dashboards across numerous industries, and the companies I’ve seen spending millions on Adwords for their advertising, all based upon numbers derived from this constructed reality that we’ve collectively bought into—I am blown away.



from http://ift.tt/1xh3tft

Tuesday, December 30, 2014

This Reflects What It Felt Like For Me To Work In Washington D.C. Each Day

I was looking through President Obama's 2014: A Year in Photos today, and while many of the pictures evoke emotion for me, this one in particular really summed up, for me, the very short time I spent in Washington D.C. as a Presidential Innovation Fellow.

The number one lesson I walked away with from my time in Washington D.C. was a respect for the scope that exists in DC. Everything is big. Everything is under scrutiny. Everything is operating at a scale I had never experienced before. If you think you get it, and have never worked there--you know nothing.

I respect anyone who can actually get ANYTHING done in this environment--knowing this, I understand that my role is purely from the outside-in. I'm not saying everyone there has the best possible motives, but you have to respect anyone getting ANYTHING done in an environment where everything you do is so heavily scrutinized.



from http://feedproxy.google.com/~r/KinLane/~3/AtZCmnVRZDI/this-reflects-what-it-felt-like-for-me-to-work-in-washington-dc-each-day

Please Provide Me With More Information Before We Speak On The Phone

As an independent operator, I have to be very thoughtful about how I spend my time. With this in mind, it is helpful for me to have a standard response that I can give to people who make requests for phone conversations.

If we do not have a prior relationship, or a referral from someone I know well, the chances I’ll just jump on a call are slim. Please provide me with as much information on what you are up to, in as short and concise a way as you possibly can.

I’m just looking for a title, executive summary and some supporting links to prime the pump. I’m happy to make time, but I need some sort of way to make sure what you need is a fit for you, and for me. I get a lot of folks who don’t quite understand what I do, and if I responded to every request, I'd be on the phone all day--thus I wouldn't be the API Evangelist anymore. ;-(

I appreciate your understanding. Additionally, I find this request helps people articulate their ideas and needs better, making the time we do spend on the phone much more productive for both of us. I look forward to hearing more about your idea!



from http://feedproxy.google.com/~r/KinLane/~3/4-zRMQhiCws/please-provide-me-with-more-information-before-we-speak-on-the-phone


Tuesday, December 23, 2014

When A Developer Does Not Understand What API Evangelist Is

I do not expect everyone to immediately know who I am, or fully understand my mission behind API Evangelist. However, I do find it interesting when people have entirely skewed views of who I am and what I do, and then after they meet me, make a 180-degree shift in their perception of API Evangelist.

This post is about one recent encounter I had with a developer at an event. This very nice developer has worked in the API sector for a while, is very knowledgeable about APIs, and is very aware of who I am and my presence as the API Evangelist, either from co-workers or the larger web. When I first talked to them in a larger group, someone asked, "Do you know Kin?" They replied that they were aware of who I was, but had never met me, and they didn't seem very interested in a deeper introduction or conversation. They had clearly made up their mind about who I was and what it is that I do, and gave off the sense that I was not much more than a blogger. This doesn't happen all the time, but regularly enough that I feel compelled to write about it.

Spanning a couple of days, this developer was in various group conversations I participated in, and at no point did they seem interested in engaging me, acting very disinterested, and walking away several times. Now I really have no way of knowing how they felt, or if there was something else at play, but I've experienced enough to know these developers are really smart, and often feel what I do isn't technical enough to rise to the occasion—I know this because many developers have told me this flat out, but in this particular case that hadn't happened.

What did happen is that after about 7 of these engagements, this developer heard me talking about my vision around the Oracle vs. Google case, and my larger vision for API discovery across the Internet, and at some point during this conversation their energy toward me shifted entirely, becoming much friendlier and more engaging. After this conversation, they sought me out for further conversations, followed me on Twitter, and worked really hard to initiate discussion with me in several other areas.

The message here is that you really shouldn't make assumptions about people in the space until you've done your homework, or quite possibly met them in person. This is something that I think developers are very poor at. I experience this online regularly, and offline less frequently. Someone lands on my site, reads one post, maybe two, and makes some pretty radical assumptions about who I am and what I do based upon this limited understanding. I can see how my title of “API Evangelist” might seem superficial to the untrained eye, but once you get to know me you will understand how passionate I am about APIs, something I hope will be contagious, or at least help you understand more about me and my cause.



from http://ift.tt/1AEFzvf

Thursday, December 18, 2014

When Apps You Love Lose Their Utility

With the latest version of Evernote, I’m beginning to look for the next tool for managing my notes. I live and breathe in my Evernote. I am writing this post in there. I depend on the easy note taking via my laptop, mobile phone and tablet. Evernote is the heartbeat of my writing, and I write everything from email, to blog posts, and white papers in my Evernote, then publish to the appropriate channels once ready.

The last version changed the layout, and added chat and recommendations from outside related sources, to name a few of the most prominent feature changes I'm stumbling over. Some repetitive tasks that were one click before now take me two or three clicks, making the organization of my writing much more difficult. The introduction of chat is not only useless to me, it actually invades my private writing sanctuary, and just bothers me every time I see the button at the top.

As I evaluate what it will take to migrate from the platform, I’m unable to get an API key, it just throws an error every time I ask for one. I submitted a ticket, and will publish a video of the problems I was facing at some point. I exported a notebook as HTML, just to see what was possible for migration from the interface, and the amount of garbage in the HTML is going to create a lot of extra work for me when converting.
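To give a sense of the kind of cleanup I'm talking about, here is a rough sketch in Python--not my actual migration script--that assumes the "garbage" in the export is mostly inline style attributes and presentational span wrappers, which is what the exported HTML appeared to be full of:

```python
from html.parser import HTMLParser


class ExportCleaner(HTMLParser):
    """Strip inline style attributes and drop <span> wrappers,
    keeping the rest of the markup and all text content intact."""

    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag == "span":
            return  # drop presentational wrappers entirely
        kept = [(k, v) for k, v in attrs if k != "style"]
        attr_str = "".join(f' {k}="{v}"' for k, v in kept)
        self.out.append(f"<{tag}{attr_str}>")

    def handle_endtag(self, tag):
        if tag == "span":
            return
        self.out.append(f"</{tag}>")

    def handle_startendtag(self, tag, attrs):
        # self-closing tags like <br/> pass through without styles
        if tag != "span":
            self.out.append(f"<{tag}/>")

    def handle_data(self, data):
        self.out.append(data)


def clean(html: str) -> str:
    parser = ExportCleaner()
    parser.feed(html)
    return "".join(parser.out)
```

Something this simple won't handle every case in a real export, but it shows how little of the markup you actually want to keep when moving notes to another platform.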

We all seem to be infatuated with the constant march forward of technology, and it is something I am able to find harmony with--making sure my content, data, and other assets are well defined and portable is an important part of this. I know Evernote has a wider audience to please than just me, but I’ve been a paying customer since 2011.

It makes me sad, but moving on has become easier than ever for me, and I don't have time to dwell on break-ups--I just move on.



from http://ift.tt/1wpgZx8

The Future Internet Will Consist Of Many Affinity Networks

As much as I wish for the Internet to remain the open, accessible, neutral, distributed platform it has been since its birth, I’m often faced with the reality that net neutrality will lose, in the grip of the powers that be. You see, AT&T, Verizon, Comcast, and the other powerful corporate actors of the world do not want it to be open; they want to be able to meter anything people want on the Internet, maximize revenue, and mimic the existing power flows that exist in the regular world.

I feel like the Internet as it occurred was an accident, something these corporate giants didn't notice until the Internet was going full tilt, and had become part of the mainstream consciousness. Now that they have their sights set squarely on generating revenue from the Internet, things will change. Some of these evolutionary events will be high-profile shifts, while most will happen silently, put into legislation without anyone noticing, or occurring behind boardroom doors the public doesn't have access to.

We have the technology to work around this, we just need the will, and ultimately I believe in humans, and the power they wield in working around the roadblocks and challenges put in front of us. The AT&Ts, Verizons, and Comcasts of the world will be busy building their fast lanes, charging for access on both ends, ensuring their partners' content and data are visible, and making sure every possible dime is squeezed out of customers. As technologists, we need to continue building out our version of the Internet, using mesh networks and other emerging alternative network technologies.

While the motivation for large corporations will be money, and they will build networks to meet this vision, our motivation will be based upon the affinity we have with our family, friends, and professional networks. We will need to build out nodes to support our agricultural networks, music communities, and the numerous other levels in which we share affinity. We need to encourage our networks to become network nodes, and ensure our packets, bits and bytes traverse only these networks, unencumbered by the corporate traffic cops that will be spread around the globe in coming years.

Just as Tim Berners-Lee, and the other grandfathers and grandmothers of the Internet did, we will have to innovate, and work hard to develop the next generation Internet. One that evades the gaze of our corporate Sauron, and stays one or two steps ahead of the corporate interests that may think they are good, but do not have the collective Internet, or world, in mind.



from http://ift.tt/1wpgSBP

Wednesday, December 3, 2014

Sorry I Do Not Want To Take Your Survey

I get a number of emails, instant messages, and other channel requests to take surveys, all of which I say no thank you to. I don’t take surveys. I don’t care what you offer me, I hate the format, and refuse to take part. Surveys remind me of taking tests in school or in the corporate work environment, where people were trying to measure my knowledge, performance, or otherwise.

I’m sure that surveys work in many environments, where you can incentivize folks to fill out your form. I’m sure some of the answers are even what you are looking for, but I’m sure there are many other folks like me who do not give a shit about jumping through hoops to answer questions, even if it is for an important cause.

In 2014 I can't help but think there are better ways of doing surveys, and wish someone would come along and do something about it. I don't mind being asked questions, but in my busy day I do not have time, or interest in filling out a long questionnaire. Maybe you could do your survey over the course of a couple days or weeks, via Twitter or other social media channels.

Seems to me that you could easily target a group of individuals via social media, populate a list of questions you are looking to have answered, then time the questions so they get asked in a simple, conversational manner that is not intrusive, and doesn't disrupt my world. I'd be happy to answer your questions (well, not always), but for many companies, brands, and interesting topics, I'd be happy to help in a way that fits better with my flow.



from http://ift.tt/1rXWsu0

Tuesday, December 2, 2014

No Mojo For Writing On The Road

I’m sure some of you are happy when I go on the road, because the number of blog posts I publish goes down significantly. I get a lot of folks who jokingly complain about the volume I write, and state it is difficult for them to keep up sometimes. Not sure how I can help you with this one. Maybe better read-it-later tools, or asking your boss to carve out reading time at work on a regular basis. ;-)

When I am on the road I find it very difficult to find the "mojo" to write. Every time I come home from a trip I will have numerous Evernote entries derived from random ideas I've had while traveling. There is no shortage of ideas while I roam the world, but the ability to actually think an idea through, flesh it out, and tell the full story is really, really difficult for me when I'm in motion. Many managers I've had over the years considered writing to be an “easy” task—"just write those 500 words about that topic, it is just words in a certain order, right?" #nottrue

There are some posts I write that don’t take much energy, but many posts I need to write from a place I just can't always access when traveling, when I'm worried about making my flight, finding a taxi, networking over dinner, or just plain too tired to even have a deep thought. This is why, when I come home, you will find a serious uptick in the number of blog posts I publish, because I'm finally able to settle into my "happy place", where there are endless amounts of “mojo” to tap when it comes to telling API stories.

This makes me happy to once again be home, and is why I will continue to reduce the amount I travel in 2015, because I feel like my writing is much more important to my own well-being, something I hope contributes to the overall health of the API space.



from http://ift.tt/1yLDRqP

Monday, December 1, 2014

Flipping Through The Online Channels Each Day

I was given my own color TV when I got a Commodore 64 for Christmas in 1980-something. Ever since then, I've had a noisy backchannel in my life. I don’t just like a secondary channel, I often need it to be able to focus on the primary channel--it is just how I operate.

As I reach 20 years on the Internet, I can't help but compare my current reality with my previous lives engaging with technology. I remember first getting satellite access in, I think, 1982 or 1983. It was a big-ass-dish-out-in-the-yard setup, not the little-ass-dish-on-the-roof you have now.

Anyways, I've spent the last 30 years flipping through the channels of my television, ham radio, dial-up Internet, DSL, broadband, and cellular connections. On which channel do I find what I’m looking for? Which frequency actually reaches me? Which channel is the most relevant for the time period? There are a lot of questions, and only static answers—I demand dynamic. No, real-time. No, predictive. D) All The Above.

I monitor thousands of frequencies across about 10 channels each day. I pay attention to RSS, Twitter, Github, email, and other less-traveled channels, looking for the signals that matter. The more I experience, the stronger the patterns I look for come into focus. Some things catch my eye, other things do not.

I’m difficult to reach, unless you know the channels. My daughter knows how to ping me on multiple channels, and various frequencies, and my mother does not. I try to manage the balance across the channels, and frequencies, but if I can't find the right modulation—it ends up being random.

How will we ever find harmony in the channels through which we receive our information? How will we ever know the proper channels to reach those who matter? Our reality is physical, but our minds are moving digital, and we can't keep up. How do we identify the right algorithm for flipping through the channels each day, and maintain the sanity needed to keep a forward motion?



from http://ift.tt/1ya6Lko