Thursday, April 17, 2014

Join Me For The Kin Lane Show At API Days in Berlin May 5th And 6th

I attended a conference in Vancouver, British Columbia last week. Well, I didn't actually attend or speak at the conference; I was just there to support my partner in crime Audrey Watters (@audreywatters). If you have seen me at events, you know Audrey and I usually travel together and support each other at events.

While she attended the conference, I just worked from the room and the lobby, and I took notice of her approach to delivering her unique keynote. She was the final keynote of the two-day event, and her talk was crafted throughout the two days of watching other speakers and interacting with attendees. On the final day she provided a summary of what she had heard and seen, and how it aligned or misaligned with the overall ed-tech space, the world she covers.

After watching this process, I thought I might give the approach a try. While I'm too busy organizing and MC'ing at API Strategy & Practice, this is something I could do at API Days. I remembered that I actually hadn't provided a talk title or abstract to API Days in Berlin. They know I'll deliver something, right? So I visited the website to see where they had me in the schedule.

I visited the page, scrolled down, down, and what do ya know? I'm at the end of the first day. It is the perfect opportunity to provide a recap of day one, where I can speak to many of the topics as part of the big API picture, but also craft a talk on the fly, hacked together from what I'm hearing from other speakers and attendees, and call it the Kin Lane Show.

Great name, API Days, I couldn't have come up with a better title myself. I look forward to seeing y'all in Berlin in a couple of weeks.



from http://ift.tt/1i0J5mi

I Will Be Talking Business of Internal APIs in Vegas at IBM Impact

The biggest impact APIs will have at your company will be the internal, cultural change regarding how you do business. We are in the middle of an explosion of APIs, and while there are many new public APIs emerging, the majority of growth is coming from the deployment of internal APIs.

There is a lot that companies can learn from the open API movement over the last fourteen years, with many building blocks and healthy practices that can be applied when deploying APIs internally.

This is a topic I will be exploring on API Evangelist, and at the IBM Impact conference at the Venetian in Las Vegas, NV, April 27th through May 1st. The abstract for my talk, titled "Business of Internal APIs", is:

The largest area of growth in APIs over the last couple of years has been internally, within the enterprise. There are numerous lessons that can be extracted from the world of public APIs and applied internally within any company, helping increase agility and efficiency, while also helping develop new business lines. When it comes to APIs, the secret to success isn't all about technology, and this session will introduce you to the business of internal APIs, extracted from the approach of leaders in the space.

In my experience there are fundamental elements that make APIs successful, where more classic approaches like service-oriented architecture (SOA) fall short. When companies are deploying APIs for internal consumption, they have to consider elements like self-service access, up-to-date API docs, code samples, and the human elements that have made public API deployments successful.

Even though internal API consumers will have some unique needs that your partner or public API consumers won't, there will also be many similar patterns across your internal, partner and public API customers that you will need to leverage. If you can't be in Vegas at the end of the month to hear me talk, just tune into API Evangelist and I'll make sure I publish everything online for consumption by all my readers.



from http://ift.tt/1qRYsoi

Monday, April 14, 2014

Your Password Management Process For Your App Sucks!

I just went through 52 services that I depend on, and experienced 52 different ways to manage passwords, most of which sucked! It seems that each service has its own way of allowing you to change your password.

It would be nice if there were UX blueprints that companies could follow when offering standard features like password management. At the very least, it would be nice if developers went and experienced 10-20 password flows within existing leading apps before designing their own.

In 2014 there are some pretty strong patterns that have emerged for application user experience. I think we just need more places where we can browse and experience the best patterns out there, so we can emulate them when developing our own apps.



from http://ift.tt/1qC5TQl

Saturday, April 12, 2014

Reclaiming My Domain

After the recent Heartbleed security fiasco, I'm spending my weekend going through my list of online services that I depend on, changing my passwords, and along the way I'm going to reclaim as much of my domain as I can.

I will be asking some questions of each online service, questions like: Why do I use this service? Does this service have an API? Questions that will help me establish a profile of that service, to better understand how I use it, and whether there is any valuable content or information I should be organizing in a better way.

I produce a lot of content each day, and on the surface it seems like I maintain control over most of this, but in reality my content resides in online services like Twitter, LinkedIn, and other places that I frequent daily.

I do a pretty good job of centrally managing the blog content I generate each day. I have a central CMS where I create my blog entries, and then syndicate them to my network of 60+ Github repositories that use Github Pages + Jekyll.
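
To give a sense of what that syndication step looks like, here is a minimal sketch in Python that writes a post from a central CMS into the _posts folder of a local clone of one of those GitHub Pages + Jekyll repositories. The repository name and post fields are hypothetical stand-ins, not my actual setup.

  import datetime
  import pathlib

  # Hypothetical post record coming out of the central CMS.
  post = {"title": "Reclaiming My Domain", "body": "Post content goes here.", "tags": ["apis"]}

  # Jekyll expects _posts/YYYY-MM-DD-slug.md with YAML front matter at the top.
  slug = post["title"].lower().replace(" ", "-")
  filename = "{}-{}.md".format(datetime.date.today().isoformat(), slug)
  front_matter = "---\nlayout: post\ntitle: \"{}\"\ntags: {}\n---\n\n".format(post["title"], post["tags"])

  # Write the post into a local clone of each GitHub Pages repository, then commit and push.
  for repo in ["example.github.io"]:  # stand-in for the 60+ research site repositories
      target = pathlib.Path(repo) / "_posts" / filename
      target.parent.mkdir(parents=True, exist_ok=True)
      target.write_text(front_matter + post["body"])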

When it comes to my social media streams, I do a very poor job of maintaining control over my network. I do not store my Twitter, Facebook, LinkedIn or other social streams anywhere. Until recently, I did not centrally sync my network of friends and business contacts. I now have a central CRM alongside my blog CMS, but I'm just getting started there.

Overall, there is just a lot of content I generate on the Internet, and in addition to understanding my online footprint, to make sure it is secure, I want to make sure I maintain as much control as I can over the intellectual exhaust I generate each day. I want all my messages, contacts, streams and media to have the widest possible reach, across the latest online services, but I also want to make sure they are organized and stored in a central place that I control.

You will hear me say the phrase “Reclaim Your Domain” a lot in the future. I think this is the evolution of API Evangelist, beyond the business of APIs, and more into the politics of APIs, and helping people understand why APIs are important for orchestrating your online world, keeping yourself safe and secure, but also establishing the greatest control over the value you generate online each day.

Photo Credit - Wikimedia Commons



from http://ift.tt/1hJSQVL

Friday, April 4, 2014

So Much Can Be Lost Through Automation

As I go through the management of my 60+ research sites that make up my API Evangelist network, I can't help but think about how I can automate specific aspects of the process. This is the way my custom-built platform works: I have workflows and tasks I accomplish each day, and as I have time, I will write scripts and automate where I can.

In an ordinary week I will process hundreds of blog feeds, tweets and new APIs, and sometimes I'm tempted to automate my curation, sorting, tagging and other aspects of what I do, but then I find valuable nuggets on company sites, blogs and other places. Valuable insight that my algorithms wouldn't necessarily find, things that I can use in stories across the API Evangelist network, as well as to expand my own knowledge.

Each day I work to strike this balance between manually monitoring the API space and automating what I do. I am constantly re-evaluating whether something is better automated, and at what point something needs my critical eye.

As I write this post, I can't help but think about all the talk of automation via the web, mobile apps and the Internet of Things (IoT) each day, and hope that we all can find restraint, and apply critical thinking around how we best use Internet-enabled technology in our lives, minimizing what we lose by automating our worlds.



from http://ift.tt/1hJCosS

Sunday, March 2, 2014

EU Companies Wanna Be Here, US Companies Wanna Be In Ireland

I'm spending a lot of time looking at the API landscape in Europe, trying to understand the companies behind some of the valuable API resources emerging from across the pond. It can be tough to determine the country of origin for many tech companies, for two reasons:

  1. Many European companies want to look like they have a presence in San Francisco, to get the funding they need to take it to the next level
  2. US companies are establishing a European presence, specifically in Ireland to avoid some of the tax burden they experience in the US

I wonder what all this financial positioning will do to the global economy? Many companies proudly wave the flag of their country of origin, but others do not. US corporations definitely do not have any allegiance to their country of origin.

It will be interesting to watch the Internet change the world, and how the money flows, especially when viewing it through the lens of the API economy.



from http://ift.tt/1fBsgNy

Saturday, March 1, 2014

Being Social Can Be About Code, Not Just Twitter or Facebook

I was looking through VersionEye today, a company that provides a notification system for software libraries. After I was done browsing the site I wanted to sign up and test drive, and when I visited their login page, it spoke my language.

In addition to the traditional signup form, VersionEye allows me to sign up with the "social media account" of my choice, giving me the option of OAuth via Github or Bitbucket. I'm more Github than Bitbucket, but I like that VersionEye allows me to use either, and they acknowledge that both are "social".
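
For anyone curious what that login option involves under the hood, here is a minimal sketch of GitHub's OAuth web flow using Flask and requests. The client ID and secret are placeholders, and this is only a rough illustration, not VersionEye's actual implementation.

  from flask import Flask, redirect, request
  import requests

  app = Flask(__name__)
  CLIENT_ID = "your-client-id"          # placeholder registered with GitHub
  CLIENT_SECRET = "your-client-secret"  # placeholder

  @app.route("/login")
  def login():
      # Send the user to GitHub to authorize the application.
      return redirect("https://github.com/login/oauth/authorize?client_id=" + CLIENT_ID)

  @app.route("/callback")
  def callback():
      # Exchange the temporary code GitHub sends back for an access token.
      token = requests.post(
          "https://github.com/login/oauth/access_token",
          data={"client_id": CLIENT_ID, "client_secret": CLIENT_SECRET, "code": request.args["code"]},
          headers={"Accept": "application/json"},
      ).json()["access_token"]

      # Look up the authenticated GitHub user to create or sign in their account.
      user = requests.get("https://api.github.com/user",
                          headers={"Authorization": "token " + token}).json()
      return "Signed in as " + user["login"]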

I use Github for just as many interactions as I do Twitter and Facebook. Being social in 2014 is increasingly done via Github and Bitbucket, and oftentimes doesn't directly involve the writing or reading of code, but sits one degree of separation away, where we all build the code that drives the world wide web in a very social way.

Also, a shameless plug: You can easily create this login flow using OAuth.io!



from http://ift.tt/1km9tO4

Friday, February 28, 2014

Secure Communication Channels Over Any Network With Telehash

Much like after 9/11, things will never be the same on the Internet after the Snowden NSA revelations. Our delusional belief that the Internet is by default "open" has been crushed, and corporate and government surveillance is now an expected part of daily life; there is no going back! To help me move forward, I'm exploring possibilities for what is next, which has led me back to a communication protocol called Telehash.

I first learned about Telehash several years ago, as I was doing the same research on the future of APIs, and in 2014 I'm picking up where I left off with my education. My friend Jeremie is working hard on pushing the communication protocol forward, and he needs help funding his work, as well as attracting active technical contributors. I am doing what I do best: telling stories around what is possible with Telehash.

There are several key characteristics of Telehash that stand out for me, and make it something I can't get out of my head:

  • Communication channels are encrypted all the time - there is no unencrypted mode
  • Each application instance or device generates its own public/private keypair; they cannot be impersonated, and security is not dependent on trust in certificate authorities
  • Network addresses are generated from public key fingerprints, not centrally managed as with IP addresses (see the sketch after this list)
  • Routing is based on a globally distributed hash table (DHT), not a central authority or managed hierarchy
  • Uses a dual JSON/binary packet format
  • Bindings to Bluetooth, IEEE 802.15.4, and other low-layer transports are also on the way
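
To make the address idea concrete, here is a rough, hypothetical sketch of deriving a network address from a public key fingerprint, using Python's cryptography library. This is only an illustration of the concept; the actual Telehash hashname derivation differs in its details.

  import hashlib
  from cryptography.hazmat.primitives import serialization
  from cryptography.hazmat.primitives.asymmetric import ec

  # Each application instance or device generates its own keypair, with no certificate authority involved.
  private_key = ec.generate_private_key(ec.SECP256R1())

  # Serialize the public key and fingerprint it with SHA-256.
  public_bytes = private_key.public_key().public_bytes(
      serialization.Encoding.DER,
      serialization.PublicFormat.SubjectPublicKeyInfo,
  )

  # The fingerprint becomes the node's address on the distributed hash table; nothing central assigns it.
  address = hashlib.sha256(public_bytes).hexdigest()
  print(address)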

You see all of the elements of what is needed for a next-generation communication protocol in there. This isn't just about privacy and security, it is about us defining our own networks, whether that is on existing Internet infrastructure, an ad hoc device network or an Internet of Things (IoT) defined network.

Telehash is not just some random side project; it is the complete focus of Jeremie (@jeremie), who also helped found Jabber/XMPP, which is the backbone of common messaging apps today, including the now infamous WhatsApp that Facebook just acquired for $19B. Telehash is building on what we already know, but introducing the key ingredients we WILL depend on for messaging in the future.

As I did with Telehash Node Runners in Egypt, I'm going to continue to craft stories that help us understand how Telehash can be applied, which is already pushing my understanding of what a network can and should be, and how we humans, and our devices, can safely communicate on the open Internet, as well as on our own privately defined, trusted networks.



from http://ift.tt/1khyf1T

Monday, February 24, 2014

Why Are We Not Innovating On The Big Problems?

As I scan my news feeds I constantly see talk of all the innovation going on—everyone is doing it! In the same feeds I see lots of really big problems like people out of work, mega-drought in California, heating fuel shortages in rural areas, power grid issues, and the list goes on and on.

One side of my news feed showcases how we are innovating with technology in so many new ways, and the other side just tells me how screwed we are. Which is it? Are we innovating or are we drowning in big problems?

I'm not saying that there aren't interesting innovations coming out of the tech sector. I'm saying that I think much of what we claim to be innovation is wasted on non-problems that are potentially profitable, not on the real problems we actually face as a society.

How do we incentivize entrepreneurs and venture capitalists to turn their attention to the biggest problems like water, energy, the environment and other critical areas of our world? If we can do this, then I think we can proudly showcase our work as innovation.



from http://ift.tt/1ervuD5

Monday, February 17, 2014

Thoughtful Use of JavaScript When Designing Embeddable Tools

One of the security blogs I follow is Schneier on Security from Bruce Schneier. If you want to understand what is going on around the NSA and security, Bruce is the guy. I was tweeting out a story from his blog today and noticed his share buttons.

You have to actually enable the buttons before you can click, protecting you from the common monitoring we face every day through JavaScript. It is a simple, but very powerful concept when thinking about how we use JavaScript.

This approach represents a very thoughtful use of JavaScript, something I would like to do more of, and is something we should all be doing as we are building embeddable JavaScript tools.

Thanks Bruce!



from http://ift.tt/1bZJCDt

Wednesday, February 5, 2014

Essential Variable in Big Data Algorithm: Transparency

It is easy to get excited about the potential around "big data". Many individuals and companies feel this latest trend is all about offering up big data solutions with business models that are built around algorithms that founders consider their "secret sauce".

I don't have a problem with this, more power to you. However, I personally feel big data solutions, especially those within government, should be more transparent than many of the secret sauce, big data approaches we've seen to date.

Alex Howard (@digiphile) has a great post at TechRepublic, called "Data-driven policy and commerce requires algorithmic transparency", which outlines this very well. Alex uses the phrase "algorithmic accountability", which I think sums all of this up very nicely.

When it comes to big data solutions, especially in the public sector, it is fine to collect large amounts of data and offer up analytics, visualizations and other big data tools, but algorithmic accountability is something that will be essential to moving forward and building trust across all industries when it comes to big data.



from http://ift.tt/LzA0YW

Friday, January 31, 2014

Why Guest Posting Has Gotten A Bad Rap

As a proprietor of a small, successful niche blog, I can easily share some insight into why Google recently started punishing blogs that have guest posts.

At API Evangelist I get about two offers a week from random companies and individuals asking to guest post on my blog. These people cite several reasons for wanting to do it, ranging from me helping them as an aspiring blogger, to them helping me with more content and traffic. If you know me, you know I don't have a problem producing content, and I do not blog because I give a shit about pageviews.

In addition to these smaller, more frequent requests for guest posting, I also get the occasional bigger company looking to "partner" with me, when in reality they have no desire to partner and generate value for my readers, or move my research forward. These conversations start out entertaining my perspective of partnering and bringing me value, but once I choose to dance with these partners, they almost always start getting heavy-handed about me publishing what they want, and providing links to their own sites and content.

My friend Mike Schinkel has a great post on this very topic, echoing much of what I'm saying. Mike is like me; he blogs for himself, not to generate pageviews. I started API Evangelist to help me better understand the API space, and while I do have a mission of also educating the masses about APIs, the primary directive is still about educating myself; without this none of it matters.

The reason Google has begun cracking down is because it is in their best interest to ensure blogs are of the highest quality, with the most unique content possible. What these "guest post farmers", and the enterprise companies that employ this same practice, don't realize is that they aren't generating any value. They are extracting and diminishing the value of these blogs, and this is what has caught Google's attention.

On second thought, maybe these companies realize what they are doing. They are just leeches, looking to extract value for their purpose, at all costs. They don't care about your blog, content or career. They want to suck every last page view from your blog's soul, transferring any value you may have had to their operations.



from http://ift.tt/1nxJxPk

Thursday, January 23, 2014

It Is A Start: IRS Enables Americans To Download Tax Transcripts

I recently talked about how an IRS API would be the holy grail of APIs, and after reading "IRS enables Americans to download their tax transcripts over the Internet" by Alex Howard (@digiphile), I'm getting excited that we might be getting closer.

As Alex reports, at a White House Datapalooza last week, the IRS announced the new access:

“I am very excited to announce that the IRS has just launched, this week, a transcript application which will give taxpayers the ability to view, print, and download tax transcripts,” said Katherine Sydor, a policy advisor in the Office of Consumer Policy of the Treasury, “making it easier for student borrowers to access tax records he or she might need to submit loan applications or grant applications.”

The topic came front and center for me as I was working on the FAFSA API, and I realized how critical parents' tax transcripts are to the student aid process. I started considering how important taxes are in not just getting student loans and grants, but home mortgages, insurance and pretty much every other aspect of life.

I'm not going to hold my breath for an IRS API in 2014, but this latest offering shows they are on track to making our tax information available to us over the Internet. The IRS should be able to achieve a modern API ecosystem, as they already have a working model around the IRS e-file platform, but having gained a better understanding of how government works this year, I know it won't be as easy as it may seem.



from http://ift.tt/1hp8NTG

Monday, January 20, 2014

With Open Data There Is No Way To Submit Back

I have been doing some work on a project to develop an API for the Free Application for Federal Student Aid (FAFSA) form. After getting the core API designed and published, I wanted to provide several supporting APIs that would help developers be more successful when building their apps on the API.

One example is a list of colleges and universities in the United States. This dataset is available at Data.gov, under the education section. Except it is just a tab-separated text file, which is machine readable, but a little more work for me than if it were available in JSON.

The txt file of all schools was 25.7 MB and contained all elementary as well as secondary schools. For this project I'm just interested in secondary schools, but I needed to process the whole file to get at what I needed.

I imported the file into MySQL. Next I was able to filter by type of school and get the resulting data set I was looking for, with a couple of hours of work.
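
The filtering and export step is simple enough to sketch out in a few lines of Python; the file name and the school-type column used here are hypothetical stand-ins, since the actual Data.gov column headers differ.

  import csv
  import json

  elementary, secondary = [], []

  # Walk the tab-separated dump once, splitting rows on a hypothetical TYPE column.
  with open("schools.txt", newline="", encoding="latin-1") as handle:
      for row in csv.DictReader(handle, delimiter="\t"):
          if row.get("TYPE") == "SECONDARY":
              secondary.append(row)
          else:
              elementary.append(row)

  # Write each subset out as JSON for the supporting APIs.
  with open("secondary-schools.json", "w") as out:
      json.dump(secondary, out, indent=2)
  with open("elementary-schools.json", "w") as out:
      json.dump(elementary, out, indent=2)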

Now I have two JSON files, one for elementary and one for secondary schools. The whole FAFSA project is a working example of what can be done with government data, outside of the government, but I wanted to highlight the number of hours put into pulling and cleaning up the school data. The federal government most likely does not have the capacity to accept this work back from me, forcing it to remain external.

I would love to see a way to link up the original list of public elementary and secondary schools with this JSON data set I've produced, so that we can take advantage of the cycles I've spent evolving this data. I'm sure there are other individuals and companies like me who have cleaned up data, and would be happy to submit it back; there is just no way to do it.

This is why there has to be a sort of DMZ for the public and private sector to interact, allowing the federal government to take advantage of work being done by the numerous folks like me who are working to improve government and build applications using government generated open data.



from http://ift.tt/1bJFsTq

Sunday, January 19, 2014

Why You Are Missing All The Signals

I get a lot of requests from individuals and companies to partner with them to work on projects. Partnering can range from advising a startup, working on in-kind or paid projects, or just having a conversation around a specific topic. Each one of these engagements, I guess, could be considered a sort of interview, for lack of a better description.

Many of these requests never get past email or a phone call, but some move forward quickly. I recently had a company who was looking to partner with me on research in a specific area. I got on the phone with the company after a brief email exchange, and the conversation started with someone from this company saying, "I haven't really looked up much about you, but wanted to talk and learn more." Immediately the conversation took an interview tone, went on for about 15 minutes and ended. Shortly afterwards I got an email requesting two references the company could use.

Now, I don't have a problem with interviews or providing work references; I'm capable of delivering on both. What I have a problem with is not conducting your due diligence before getting on a call, and using the legacy interview process as a crutch for your lack of desire to understand who someone is. If you are going to partner with someone, get to know them. Period.

If you don’t Google my name, you are missing out. I’m pretty accessible. You can look at my Blogs, Twitter and Github, and get a pretty good understanding of who I am. I’m working on over 50+ projects, engaging in active conversations on Twitter, and actively pushing stories and code to my blogs, using Github. I understand that not everyone is like this in their personal and professional lives, but you should be aware enough to look, and be willing to do 15 minutes of Googling—the minimum viable due diligence these days.

In short, if you are missing all the signals I'm putting out daily, we probably aren't a good partnership. I'll decline your request politely, and move on. There is nothing wrong with that; it happens in life all the time. The whole thing acts as a great filtration process for me. I just wanted to share with you, so that you can understand. It's not me, it's you, or maybe it's both of us.



from http://ift.tt/1eLtDKR

Saturday, January 18, 2014

Adopt A Federal Government Dataset

When I pulled the over 5,000 datasets from 22 federal agencies after the implementation of OMB Memorandum M-13-13 Open Data Policy-Managing Information as an Asset, I was reminded how much work there is still to do around opening up government data. Overall I gave the efforts a C grade, because it seemed like agencies just rounded up a bunch of data lying around and published it to meet the deadline, without much regard for the actual use or consumption of the data.

Even with these thoughts, I know how hard it was to round up these 5,000 datasets in government, and because of that I can't get these datasets out of my mind. I want to go through all of them, clean them up and share them back with each agency. Obviously I can't do that, but what is the next best thing? As I was walking the other day, I thought it would be good to have a sort of adoption tool, so that anyone can step up and adopt a federal agency's dataset and improve it.

As with most of my projects, it is hard to get them out of my head until I get at least a proof of concept up and running. So I developed Federal Agency Dataset Adoption, and published it to Github. I published a JSON listing of the 22 federal agencies, and the data.json file that each agency published. Next I created a simple UI to browse the agencies, datasets, view details and distributions, with the ability to "adopt" a dataset.

When you choose to adopt a federal agency dataset, the site authenticates with your Github account using OAuth, then creates a repository to house any work that will occur. Each dataset you adopt gets its own branch within the repository, a README file, and a copy of the dataset's entry from its agency's data.json file.
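
A rough sketch of what that repository-creation step could look like against the GitHub REST API, using the requests library; the token, repository name and dataset entry below are hypothetical stand-ins, and the branch-per-dataset step is omitted to keep it short.

  import base64
  import json
  import requests

  TOKEN = "oauth-token-from-adoption-flow"  # hypothetical OAuth token granted by the adopter
  HEADERS = {"Authorization": "token " + TOKEN}
  API = "https://api.github.com"

  # Create a repository to act as the adopter's workspace.
  repo = requests.post(API + "/user/repos", headers=HEADERS,
                       json={"name": "adopted-datasets"}).json()

  # Commit a README and a copy of the dataset's data.json entry into the new repository.
  entry = {"title": "Example Agency Dataset", "distribution": []}  # stand-in data.json entry
  files = {"README.md": "# Adopted dataset workspace\n",
           "data.json": json.dumps(entry, indent=2)}
  for path, content in files.items():
      requests.put("{}/repos/{}/{}/contents/{}".format(API, repo["owner"]["login"], repo["name"], path),
                   headers=HEADERS,
                   json={"message": "Adopt dataset",
                         "content": base64.b64encode(content.encode()).decode()})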

I would copy actual datasets to the repo, but many of the references are just HTML or ASP pages, and you have to manually look up the data. Each repo is meant to be a workspace, and users who adopt datasets can just update the data.json to point to any new data distributions that are generated. I will programmatically pull these updates and register with the master site on a regular basis.

The system is meant to help me track which datasets I'm working on, and if other people want to get involved, awesome! I envision it acting as a sort of distributed directory, in which agencies, and consumers of agency data, can find alternate versions of federal government data. Additionally, data-obsessed folks like me can clean up data, and contribute back to the lifecycle in a federated way, using our own Github accounts combined with a centralized registry.

As with my other side projects, who knows where this will go. I'd like to get some other folks involved, and maybe get the attention of agencies, then drum up some funding so I can put some more cycles into it. If you are interested, you can get involved via the Federal Government Dataset Adoption Github repository.



from http://ift.tt/Kj53YR

Thursday, December 26, 2013

Hacker Storytelling - Ed-Tech Funding

I just finished the basic setup for a project that @audreywatters and I have been working on together. A while back Audrey said she wanted to better understand the world of investment behind the ed-tech space. I saw it as a perfect opportunity for collaboration between Audrey's world and mine, so we set up The Ed-Tech (Industry) Matrix.

As I do with all my projects, I set up a Github repository as a container for all the research. I added a base Jekyll template, allowing us to manage all the pages for the research project easily, and we can also have a project blog which we will use to showcase project updates, leaving the Ed-Tech Funding research with three elements:

  • Overview - Home page of the project, explaining the research and providing a single landing page to get at all aspects of the work.
  • Updates - A chronological Jekyll blog which we are using to capture stories of the work we do in real-time, providing an update for each step.
  • Roadmap - A list of work we intend to do on the project, driven from the underlying Github issues. This allows either of us, or any other Github user, to add issues to steer where we are going with the work (see the sketch after this list).
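
As a rough illustration of how the roadmap can be driven from Github issues, here is a minimal sketch that pulls the open issues for a repository and renders them as a simple roadmap listing; the repository name is a hypothetical stand-in.

  import requests

  # Hypothetical repository housing the Ed-Tech Funding research project.
  REPO = "example-user/ed-tech-funding"

  # Pull the open issues that drive the roadmap page.
  issues = requests.get("https://api.github.com/repos/{}/issues".format(REPO),
                        params={"state": "open"}).json()

  # Render a simple roadmap listing from the issue titles.
  for issue in issues:
      print("- {} (#{})".format(issue["title"], issue["number"]))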

That is it for now. I have a lot of work to do in pulling corporate data on all of these ed-tech companies Audrey has targeted through her research. I have already pulled details from Crunchbase, but will be pulling what I can from Angel List, Open Corporates and the SEC.

The project is a container for us to use when we have time to dedicate to it. All our work is available in a collaborative way via the Github repository, and we use Jekyll + Github to handle all of the project tasking, roadmap and storytelling around any work that occurs. I'm sure as we make progress both Audrey and I will also be providing deeper analysis via our own blogs.

If you'd like to know more about the project, feel free to ping @audreywatters or @kinlane, or submit an issue for the project. If you'd like to know more about this project format, which I call Hacker Storytelling, I'm happy to share my thoughts.



from http://feedproxy.google.com/~r/KinLane/~3/itrfVMIQ0pQ/hacker-storytelling--edtech-funding

Sunday, November 24, 2013

Salesforce Hackathon: Y U No Understand, Bigger != Better

I'm reading through some of the news about the Salesforce Hackathon, and while I'm disappointed in the outcome, with a bounty that big I'm not surprised. The event organizers are focused on the one thing in a hackathon you can scale, which will not actually scale any value of the hackathon.

I've attended, sponsored, mentored and thrown many hackathons, and anyone who is a lover of the hackathon knows that the value of these events is never the resulting prize.

Like so much else in this world (i.e. startups, college education), when you focus on just the end goal, and scaling, you will lose so much value and meaning along the way. The increasingly big bounty in hackathons has occurred right alongside the demise of this very valuable event format.

The best hackathons I've been to were small in attendance and in the size of the prize. Teams formed organically around a topic or cause, and people shared ideas, skills, knowledge and genuinely got to know one another in an intimate environment over good food and drink. They were never about the finished project or prize; these are all things you cannot scale. Sorry. :-(

Some of the worst hackathons I've been to were large in the number of people and the size of the prize. Nobody got to know each other, teams came pre-formed, and things were so competitive that even a 6'3" white male veteran programmer like myself felt intimidated by the competition, and the aggression.

I don't think these big event holders know the damage they are doing to their own events, let alone to the entire hackathon space. They are taught bigger is better, when in reality they are turning off newcomers to the space, and turning away people like myself who thoroughly enjoy hackathons but really do not enjoy spending their weekends battling with even a handful of cocky young dudes, let alone several hundred.

If you are committed to focusing on the end goal of your college education (the degree), startup (the exit) or a hackathon (prize), you are missing out on so much good stuff in the middle in the form of relationships, experience, knowledge, skills and so much more. If everything is about scale to you, you probably will be focusing on some pretty empty aspects of this world, because the most important things in life do not scale.



from http://feedproxy.google.com/~r/KinLane/~3/3Iy_Sv55NPs/salesforce-hackathon-y-u-no-understand-bigger--better

Crowd-Sourced, Real-Time City Bus Location Network

We have anywhere from 1 to 25 people on a city bus at any time. Every one of these folks has a cell phone in their pocket. I think we can assume at least a handful of them possess smartphones.

With this technology, why don't we know where each bus is in real-time? We know that each bus has a tracking device on it, so knowing the location isn't the problem; it is getting the data. Even with the technology, getting municipalities to open up and share the data is proving to be a challenge.

Why can't we create a crowd-sourced, incentive based network of bus riders who are open to having their position tracked while on the city bus? Of course we could compensate them for this data, and not just exploit their involvement.

Having city bus riders voluntarily sharing their data, establishing trusted relationships and profiles, and cross referencing across multiple users would provide a real-time base of data we could use to identify where any bus is at any time--without complex technology or systems.

In some cities this isn't a problem that needs solving. In Los Angeles, the bus is NEVER on time and never predictable. There is no way of knowing what time you should walk up to the bus stop. There should be push notifications that let me know the bus is at a specific stop that is nearby, and that I should consider heading to my own bus stop.



from http://feedproxy.google.com/~r/KinLane/~3/u8DwpK84y5s/crowdsourced-realtime-city-bus-location-network

Walmart: The Market Will Work Itself Out

When I read stories like Walmart Holding Canned Food Drive For Its Own Underpaid Employees, I can't help but think about the statement I've heard from numerous conservative friends, that "the market will work itself out". That somehow markets are this magical force that always will find balance, and work out for everyone.

I think Walmart represents the truth of this statement. The market will work itself out for the merchant class; the rest of us will have to really take care of each other, because markets are about business owners, shareholders and profits.

Unless we begin seeing the light, I think the future will look like Walmart. There will be lots of places to buy the cheap crap we think we need, we won't have healthcare or a living wage, and the environment will be trashed.

Don't worry though! The market will work itself out!



from http://feedproxy.google.com/~r/KinLane/~3/VpmIOGWFHpQ/walmart_the_market_will_work_itself_out

On Losing My Storytelling Voice


I'm totally thankful for the experiences I've had over the last 90 days in Washington D.C. as a Presidential Innovation Fellow, and even more thankful I'm able to keep doing much of the work I was doing during my fellowship. In reality, I'm actually doing more work now than I was in DC.

While there were several challenges during my time as a PIF, the one that I regret the most, and is taking the longest to recover from, is losing my storytelling voice. This is my ability to capture everyday thoughts in real-time via my Evernote, sit down and form these thoughts into stories, and then share these stories publicly as the API Evangelist.

During my time in DC, I was steadily losing my voice. It wasn't some sort of government conspiracy. It is something that seems to happen to me in many institutional or corporate settings, amidst the busy schedule, back-to-back meetings and a more hectic project schedule; eventually my voice begins to fade.

In July I wrote 61 blog posts, in August 41, and in September 21. A very scary trend for me. My blog is more than just stories for my audience and page views generated. My blog(s) are about me working through ideas and preparing them for public consumption.

Without storytelling via my blog(s) I don't fully process ideas, think them through, flesh them out and think about the API space with a critical eye. Without this lifecycle I don't evolve in my career, or maintain my perspective on the space.

In October I've written 28 posts and so far in November I've already written 27 posts, so I'm on the mend. In the future, I'm using my voice as a canary in the coal mine. If a project I'm working on is beginning to diminish my voice, I need to stop and take a look at things, and make sure I'm not heading in a negative direction.



from http://feedproxy.google.com/~r/KinLane/~3/WUIFnE-4YUU/on_losing_my_storytelling_voice

Why I Exited My Presidential Innovation Fellowship

Since this blog acts as kind of a journal for my world, I figured I should make sure and add an entry regarding my exit from my Presidential Innovation Fellowship, affectionately called the PIF program.

In June I was selected as a Presidential Innovation Fellow, where I went out to Washington D.C. and accepted a position at the Department of Veterans Affairs. I didn't actually start work until August 11th, but I accepted the role along with the other 42 PIFs earlier that summer.

After 60 days, I decided to leave the program. The main reason is that Audrey and I couldn't make ends meet in DC on what they paid, and after spending our savings to get out there, with no credit cards to operate on, experiencing the shutdown, and facing another shutdown this winter, it just wasn't working for us.

The benefits gained by the title and the GS-14 employment position just didn't balance out the negatives. In the end I'm thankful for the opportunity, but I couldn't ask Audrey or myself to make the trade-off. I knew things would be hard, but facing sleeping on friends' couches and not being able to pay our AWS bills was not in the cards.

As is my style, I've spent zero time dwelling on my exit. I am determined to pick up all my projects, and continue moving them forward. In short, I will still be doing all the work, just leaving behind the title and official PIF status. I strongly believe that the best way to apply my skills is from the outside-in, and my exit will allow me to make a larger impact across government in the long run.

I hope everyone who I worked with at the VA, GSA, OSTP and beyond understands why I left by now, and knows I'm here to continue my support. I think the PIF program has a lot to offer future rounds, and I will continue to play an active role in the program and helping change how government operates using open data and APIs.

Thanks everyone!



from http://feedproxy.google.com/~r/KinLane/~3/LIgK3h-6cns/why_i_exited_my_presidential_innovation_fellowship

Being The Change I Want To See In The Presidential Innovation Fellowship (PIF) Program

I just wrote a post on why I left my Presidential Innovation Fellowship (PIF). Overall I think the PIF program is a pretty amazing vehicle for bringing smart folks from the private sector and putting them to work changing how government operates. However, now that I've exited I wanted to share two thoughts on how the program could be more effective.

I think the responsibility of making the PIF program better lies in the hands of each round of PIFs, which is essentially what I'm doing with my exit from the program. There are two main areas where I would adjust the program:

  • Dedicated Roles Across Agencies - I was placed at the Department of Veterans Affairs, but because of my unique focus on APIs I found myself working across multiple agencies. For some of the PIFs I think dedicated roles could be filled, including, but not limited to, API, UI/UX, programming, event organizing, etc. Some individuals will be better suited to this type of specialization, and better applied across agencies; this will also significantly benefit other agency-focused PIFs.
  • Internal and External Fellows - In my case, being a government employee was not beneficial. I don't aspire to establish a career in government, as I hope will be the case with some future PIFs, and the role didn't really open up enough access to make it worth my while. The PIF program should have two distinct tracks that individuals can choose from, either tackling their fellowship from the inside-out or from the outside-in, without the shackles of being a government employee.

These are my two changes to the program that I feel strongly about. I know there are other areas that former and current PIFs would like to see changed, but these are the two where I'm willing to "be the change I want to see in the program". With this in mind, I'm willing to exit the program, make the change, and evolve the program into what I think it should be.

From the outside I will be able to apply my API skills across multiple agencies, and I will be able to bring external resources that my fellow PIFs can put to use. Coupled with the efforts of other internal PIFs and government employees, I feel I can maximize my impact on how government operates in the coming years.



from http://feedproxy.google.com/~r/KinLane/~3/fdJyd-GrU20/being_the_change_i_want_to_see_in_the_presidential_innovation_fellowship_pif_program

What If All Gov Programs Like Healthcare.gov Had A Private Sector Monitoring Group?

The Healthcare.gov launch has been a disaster. I cannot turn on CNN or NPR during the day without hearing a story about what a failure the technology and implementation has been for the Affordable Care Act (ACA).

I have written and talked about how transparency was the biggest problem for the Healthcare.gov rollout. Sure, there were numerous illnesses from procurement to politics, but ultimately, if there had been more transparency from start to finish, things could have been different.

Throughout this debacle I have been making my exit from federal government back to the private sector, and I can't help but think how things could have been different with Healthcare.gov if there had been some sort of external watchdog group tracking the process from start to finish. I mean, c'mon, this project is way too big and way too important to just leave to government and its contractors.

What if there had been a group of people assigned to the project at its inception? External, non-partisan, independent individuals who had the skills, and who tracked the procurement process, design, development and launch of Healthcare.gov? What if any federal, state or city government project had the potential to have a knowledgeable group of outside individuals tracking projects and making recommendations in real time on how to improve the process? Things could be different.

Of course there are lots of questions to ask: How do we fund this? Who watches the watchers? On and on. Even with all the questions, we should be looking for new and innovative ways to bring the public and private sector together to solve the biggest problems.

It is just a thought. As I exit the Presidential Innovation Fellowship (PIF) program, and head back to the private sector, I can't help but think about ways that we can improve the oversight and involvement of the private sector in how government operates.



from http://feedproxy.google.com/~r/KinLane/~3/fakGRodn8qs/what_if_all_gov_programs_like_healthcaregov_had_a_private_sector_monitoring_group