Thursday, December 26, 2013

Hacker Storytelling - Ed-Tech Funding

I just finished the basic setup for a project that @audreywatters and I have been working on together. A while back Audrey said she wanted to better understand the world of investment behind the ed-tech space. I saw it as a perfect opportunity for collaboration between Audrey's world and mine, so we set up The Ed-Tech (Industry) Matrix.

As I do with all my projects, I set up a Github repository as a container for all the research. I added a base Jekyll template, allowing us to manage all the pages for the research project easily, and we can also have a project blog which we will use to showcase project updates--leaving the Ed-Tech Funding research with three elements:

  • Overview - Home page of the project, explaining the research and providing a single landing page to get at all aspects of the work.
  • Updates - A chronological Jekyll blog which we are using to capture stories of the work we do in real-time, providing an update for each step.
  • Roadmap - A list of work we intend to do on the project, driven from the underlying Github issues. This allows either of us, or any other Github user to add issues to steer where we are going with the work.
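The Roadmap element can literally be a thin layer over the Github issues API. A minimal sketch in JavaScript (the repository name is hypothetical, and this assumes a fetch-capable environment like Node 18+ or a browser):

```javascript
// Convert a Github issue into a roadmap entry (the entry shape is our own choice)
function issueToRoadmapItem(issue) {
  return { title: issue.title, url: issue.html_url };
}

// Pull open issues from the (hypothetical) project repository to drive the roadmap
async function loadRoadmap() {
  const resp = await fetch(
    "https://api.github.com/repos/kinlane/edtech-funding/issues?state=open"
  );
  if (!resp.ok) {
    throw new Error(`Github API returned ${resp.status}`);
  }
  const issues = await resp.json();
  return issues.map(issueToRoadmapItem);
}
```

Anyone with a Github account can open an issue, and the roadmap page picks it up on the next load--no separate project management tool required.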

That is it for now. I have a lot of work to do in pulling corporate data on all of the ed-tech companies Audrey has targeted through her research. I have already pulled details from Crunchbase, but will be pulling what I can from Angel List, Open Corporates, and the SEC.

The project is a container for us to use whenever we have time to dedicate to it. All our work is available in a collaborative way via the Github repository, and we use Jekyll + Github to handle all of the project tasking, roadmap and storytelling around any work that occurs. I'm sure as we make progress both Audrey and I will also be providing deeper analysis via our own blogs.

If you'd like to know more about the project, feel free to ping @audreywatters or @kinlane, or submit an issue for the project. If you'd like to know more about this project format, which I call Hacker Storytelling, I'm happy to share my thoughts.



from http://feedproxy.google.com/~r/KinLane/~3/itrfVMIQ0pQ/hacker-storytelling--edtech-funding

Sunday, November 24, 2013

Salesforce Hackathon: Y U No Understand, Bigger != Better

I'm reading through some of the news about the Salesforce Hackathon, and while I'm disappointed in the outcome, with a bounty that big I'm not surprised. The event organizers are focused on the one thing in a hackathon you can scale, which does nothing to scale the actual value of the hackathon.

I've attended, sponsored, mentored at and thrown many hackathons, and anyone who is a lover of the hackathon knows the value of these events is never the resulting prize.

Like so much else in this world (e.g. startups, college education), when you focus on just the end goal, and scaling, you lose so much value and meaning along the way. The increasingly large bounties in hackathons have grown right alongside the demise of this very valuable event format.

The best hackathons I've been to were small, in both attendance and the size of the prize. Teams formed organically around a topic or cause, and people shared ideas, skills and knowledge, and genuinely got to know one another in an intimate environment over good food and drink. They were never about the finished project or prize--these are all things you cannot scale. Sorry. :-(

Some of the worst hackathons I've been to were large, in both the number of people and the size of the prize. Nobody got to know each other, teams came pre-formed, and things were so competitive that even a 6'3" white male veteran programmer like myself felt intimidated by the competition and the aggression.

I don't think these big event organizers know the damage they are doing to their own events, let alone to the entire hackathon space. They are taught bigger is better, when in reality they are turning off newcomers to the space, and turning away people like myself who thoroughly enjoy hackathons but really do not enjoy spending their weekends battling with even a handful of cocky young dudes, let alone several hundred.

If you are committed to focusing only on the end goal of your college education (the degree), startup (the exit) or hackathon (the prize), you are missing out on so much good stuff in the middle, in the form of relationships, experience, knowledge, skills and so much more. If everything is about scale to you, you will probably end up focusing on some pretty empty aspects of this world, because the most important things in life do not scale.



from http://feedproxy.google.com/~r/KinLane/~3/3Iy_Sv55NPs/salesforce-hackathon-y-u-no-understand-bigger--better

Crowd-Sourced, Real-Time City Bus Location Network

We have anywhere from 1 to 25 people on a city bus at any time. Every one of these folks has a cell phone in their pocket, and I think we can assume at least a handful of them possess smartphones.

With this technology, why don't we know where each bus is in real-time? We know that each bus has a tracking device on it, so knowing the location isn't the problem--getting the data is. Even with the technology in place, getting municipalities to open up and share the data is proving to be a challenge.

Why can't we create a crowd-sourced, incentive-based network of bus riders who are open to having their position tracked while on the city bus? Of course we would compensate them for this data, and not just exploit their involvement.

City bus riders voluntarily sharing their data, combined with trusted relationships and profiles, and cross-referencing across multiple users, would provide a real-time base of data we could use to identify where any bus is at any time--without complex technology or systems.
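The cross-referencing could start as something very simple: average the positions of everyone currently reporting from the same bus. A rough sketch in JavaScript, where the report shape ({busId, lat, lon, timestamp}) is entirely hypothetical:

```javascript
// Estimate a bus's position by averaging the latitude/longitude of all riders
// who have reported from that bus within the last minute. Returns null if
// nobody on that bus is currently reporting.
function estimateBusPosition(reports, busId, nowMs, maxAgeMs = 60000) {
  const recent = reports.filter(
    (r) => r.busId === busId && nowMs - r.timestamp <= maxAgeMs
  );
  if (recent.length === 0) return null;
  const sum = recent.reduce(
    (acc, r) => ({ lat: acc.lat + r.lat, lon: acc.lon + r.lon }),
    { lat: 0, lon: 0 }
  );
  return {
    lat: sum.lat / recent.length,
    lon: sum.lon / recent.length,
    riders: recent.length, // more riders reporting = more trustworthy estimate
  };
}
```

The rider count doubles as a crude confidence score: a position backed by five phones is far more trustworthy than one backed by a single rider.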

In some cities this isn't a problem that needs solving. In Los Angeles, though, the bus is NEVER on time and never predictable. There is no way of knowing what time you should walk up to the bus stop. There should be push notifications letting me know when the bus reaches a specific nearby stop, so I know to consider heading to my own.



from http://feedproxy.google.com/~r/KinLane/~3/u8DwpK84y5s/crowdsourced-realtime-city-bus-location-network


Monday, November 18, 2013

Walmart: The Market Will Work Itself Out

When I read stories like Walmart Holding Canned Food Drive For Its Own Underpaid Employees, I can't help but think about the statement I've heard from numerous conservative friends, that "the market will work itself out"--that somehow markets are this magical force that will always find balance, and work out for everyone.

I think Walmart represents the truth of this statement. The market will work itself out for the merchant class; the rest of us will have to really take care of each other, because markets are about business owners, shareholders and profits.

Unless we begin seeing the light, I think the future will look like Walmart. There will be lots of places to buy the cheap crap we think we need, but we won't have healthcare or a living wage, and the environment will be trashed.

Don't worry though! The market will work itself out!



from http://feedproxy.google.com/~r/KinLane/~3/3Qrufco-eQg/walmart-the-market-will-work-itself-out

Sunday, November 17, 2013

On Losing My Storytelling Voice


I'm totally thankful for the experiences I've had over the last 90 days in Washington D.C. as a Presidential Innovation Fellow, and even more thankful I'm able to keep doing much of the work I was doing during my fellowship. In reality, I'm actually doing more work now than I was in DC.

While there were several challenges during my time as a PIF, the one that I regret the most, and that is taking the longest to recover from, is losing my storytelling voice. This is my ability to capture everyday thoughts in real-time via Evernote, sit down and form these thoughts into stories, and then share those stories publicly as the API Evangelist.

During my time in DC, I was steadily losing my voice. It wasn't some sort of government conspiracy. It is something that seems to happen to me in many institutional or corporate settings: amidst the busy schedule, back-to-back meetings and a more hectic project schedule, eventually my voice begins to fade.

In July I wrote 61 blog posts, in August 41, and in September 21. A very scary trend for me. My blog is more than just stories for my audience and page views generated. My blog(s) are about me working through ideas and preparing them for public consumption.

Without storytelling via my blog(s) I don't fully process ideas, think them through, flesh them out and look at the API space with a critical eye. Without this lifecycle I don't evolve in my career or maintain my perspective on the space.

In October I wrote 28 posts, and so far in November I've already written 27, so I'm on the mend. In the future, I'm using my voice as a canary in the coal mine: if a project I'm working on is beginning to diminish my voice, I need to stop, take a look at things, and make sure I'm not heading in a negative direction.



from http://feedproxy.google.com/~r/KinLane/~3/8Ua7fJUFfqI/on-losing-my-storytelling-voice

Friday, November 15, 2013

Being The Change I Want To See In The Presidential Innovation Fellowship (PIF) Program

I just wrote a post on why I left my Presidential Innovation Fellowship (PIF). Overall I think the PIF program is a pretty amazing vehicle for bringing smart folks in from the private sector and putting them to work changing how government operates. However, now that I've exited I wanted to share two thoughts on how the program could be more effective.

I think the responsibility of making the PIF program better lies in the hands of each round of PIFs, which is essentially what I'm doing with my exit from the program. There are two main areas where I would adjust the program:

  • Dedicated Roles Across Agencies - I was placed at the Department of Veterans Affairs, but because of my unique focus on APIs I found myself working across multiple agencies. For some of the PIFs I think dedicated roles could be filled, including, but not limited to, API, UI/UX, programming, and event organizing. Some individuals will be better suited to this type of specialization, and better applied across agencies--this will also significantly benefit other agency-focused PIFs.
  • Internal and External Fellows - In my case, being a government employee was not beneficial. I don't aspire to establish a career in government, as I hope will be the case with some future PIFs, and the role didn't really open up enough access to make it worth my while. The PIF program should have two distinct tracks that individuals can choose from, tackling their fellowship either from the inside-out or from the outside-in, without the shackles of being a government employee.

These are the two changes to the program that I feel strongly about. I know there are other areas that former and current PIFs would like to see changed, but these are the two where I'm willing to "be the change I want to see in the program". With this in mind, I'm willing to exit the program, make the change, and evolve the program into what I think it should be.

From the outside I will be able to apply my API skills across multiple agencies, and I will be able to bring external resources that my fellow PIFs can put to use. Coupled with the efforts of other internal PIFs and government employees, I feel I can maximize my impact on how government operates in the coming years.



from http://feedproxy.google.com/~r/KinLane/~3/3pRjrwM21EU/being-the-change-i-want-to-see-in-the-presidential-innovation-fellowship-pif-program

Why I Exited My Presidential Innovation Fellowship

Since this blog acts as kind of a journal for my world, I figured I should make sure and add an entry regarding my exit from my Presidential Innovation Fellowship, affectionately called the PIF program.

In June I was selected as a Presidential Innovation Fellow, and I went out to Washington D.C. and accepted a position at the Department of Veterans Affairs. I didn't actually start work until August 11th, but I accepted the role along with the other 42 PIFs earlier that summer.

After 60 days, I decided to leave the program. The main reason is that Audrey and I couldn't make ends meet in DC on what they paid, and after spending our savings to get out there, with no credit cards to operate on, experiencing the shutdown, and facing another shutdown this winter--it just wasn't working for us.

The benefits gained from the title and the GS-14 position just didn't balance out the negatives. In the end I'm thankful for the opportunity, but I couldn't ask Audrey or myself to make the trade-off. I knew things would be hard, but facing sleeping on friends' couches and not being able to pay our AWS bills was not in the cards.

As is my style, I've spent zero time dwelling on my exit. I am determined to pick up all my projects and continue moving them forward. In short, I will still be doing all the work, just leaving behind the title and official PIF status. I strongly believe that the best way to apply my skills is from the outside-in, and my exit will allow me to make a larger impact across government in the long run.

I hope everyone I worked with at the VA, GSA, OSTP and beyond understands by now why I left, and knows I'm here to continue my support. I think the PIF program has a lot to offer future rounds, and I will continue to play an active role in the program, helping change how government operates using open data and APIs.

Thanks everyone!



from http://feedproxy.google.com/~r/KinLane/~3/XHMHs5OO2Yo/why-i-exited-my-presidential-innovation-fellowship

Thursday, November 14, 2013

What If All Gov Programs Like Healthcare.gov Had A Private Sector Monitoring Group?

The Healthcare.gov launch has been a disaster. I cannot turn on CNN or NPR during the day without hearing a story about what a failure the technology and implementation have been for the Affordable Care Act (ACA).

I have written and talked about how transparency was the biggest problem with the Healthcare.gov rollout. Sure, there were numerous illnesses, from procurement to politics, but ultimately if there had been more transparency, from start to finish, things could have been different.

Throughout this debacle I have been making my exit from the federal government back to the private sector, and I can't help but think how things could have been different with Healthcare.gov if there had been some sort of external watchdog group tracking the process from start to finish. I mean, c'mon, this project is way too big and way too important to just leave to government and its contractors.

What if there had been a group of people assigned to the project at its inception? External, non-partisan, independent individuals who had the skills, and tracked the procurement process, design, development and launch of Healthcare.gov? What if any federal, state or city government project could have a knowledgeable group of outside individuals tracking the project and making recommendations in real-time on how to improve the process? Things could be different.

Of course there are lots of questions to ask: How do you fund this? Who watches the watchers? On and on. Even with all the questions, we should be looking for new and innovative ways to bring the public and private sector together to solve the biggest problems.

It is just a thought. As I exit the Presidential Innovation Fellowship (PIF) program, and head back to the private sector, I can't help but think about ways that we can improve the oversight and involvement of the private sector in how government operates.



from http://feedproxy.google.com/~r/KinLane/~3/WNQQwcC-Lvk/what-if-all-gov-programs-like-healthcaregov-had-a-private-sector-monitoring-group

Wednesday, November 6, 2013

Knowing Your HTTP Status Codes In Federal Government

Photo Credit: Runscope

I've been working on open data and APIs pretty heavily since Barack Obama directed all federal agencies to go machine readable by default, back in May of 2012. The White House directive told agencies to publish a copy of their digital strategy on their website at the following location: [agency].gov/digitalstrategy. There were also supposed to be HTML, XML and JSON representations of the digital strategy.

Pretty cool request. I immediately got busy writing a script that I could run each night to let me know which federal agencies had published their digital strategy. To be fair, the White House mandate was for top level agencies, not all 246 that I showcase. However, I think it is a process that all agencies can learn from, so I leave it up.

Back to pulling the digital strategy for each agency. First I needed all the agencies' website addresses, which I pulled from the federal agency directory API. I then appended /digitalstrategy.html, /digitalstrategy.xml and /digitalstrategy.json to each agency URL. Now remember, I am a script or piece of code trying to determine whether one of 246 x 3 pages exists. I'm not a human watching each page load with my eyeballs. The only thing I have to tell me what is happening is the HTTP status code(s):

  • 200 - OK, the request has succeeded.
  • 301 - Moved Permanently, the resource has a new location.
  • 404 - Not Found, the resource does not exist.

What the government agency sends to me as a status code triggers one of three responses in my code:

  • 200 - They have published their digital strategy 
  • 301 - They have published their strategy but it is located somewhere else 
  • 404 - They have NOT published their digital strategy
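The mapping above is easy to sketch in code. This is not my original script, just a rough JavaScript reconstruction of the idea (agency URLs omitted; redirect: "manual" assumes a Node 18+ style fetch, so the 301 is surfaced instead of being silently followed):

```javascript
// Map an HTTP status code to what it tells the script about an agency
function classify(statusCode, location) {
  if (statusCode === 200) return "published";
  if (statusCode === 301) {
    // A 301 is only useful if the Location header says where the file went
    return location ? `moved to ${location}` : "moved, but no Location header";
  }
  return "not published";
}

const SUFFIXES = ["/digitalstrategy.html", "/digitalstrategy.xml", "/digitalstrategy.json"];

// Check one agency's website for all three representations
async function checkAgency(baseUrl) {
  const results = {};
  for (const suffix of SUFFIXES) {
    try {
      const resp = await fetch(baseUrl + suffix, { method: "HEAD", redirect: "manual" });
      results[suffix] = classify(resp.status, resp.headers.get("location"));
    } catch (err) {
      results[suffix] = "unreachable";
    }
  }
  return results;
}
```

The script has no eyeballs: the status code and the Location header are the only signals it gets, which is exactly why agencies returning the wrong ones break everything downstream.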

After you run the script you see that most of the agencies return a 404--not published. Ok, but then I started seeing 301s without an actual URL redirecting me to the existing location. I saw published digital strategies return 404 and unpublished strategies return 200. While most agencies adhered to basic HTTP principles, some I just had to hard-code, manually adding a section saying IF agency = XX then assume this response code. This is a pretty basic problem, something you won't see unless you actually write some code against the situation (which I assume agencies aren't doing).

Fast-forward two years and you have the Office of Management and Budget (OMB) directing government agencies to post /data.json files in a similar way to the earlier digital strategy. I hope someday they will also require /api.json files, /roadmap.json and other machine readable goodies, but that is another story. This story is about proper HTTP status codes.

Each government agency should be publishing their /digitalstrategy and /data.json files on their website, and they should properly return a 200 OK, or a 301 with the proper URI of where you put your digitalstrategy.json or data.json (or other resource). It is acceptable to have these files in an alternate location, but you must provide a complete 301 redirect so that my code or script can properly make a decision and locate your digitalstrategy.json or data.json files.

I thought I wrote this story last year, but apparently the story in my head didn't match what I actually published to my blog. So I wanted to make sure there was a fresh copy to help government agencies understand this simple, but very important aspect of their digital strategy.



from http://feedproxy.google.com/~r/KinLane/~3/PA9b_3sXBWA/knowing-your-http-status-codes-in-federal-government

Private Web Application Running on Github

I wanted to launch a small web application in stealth mode. I also wanted it to run completely on Github, using Github Pages. I was able to set up a private Github repository, then launch a Github Page for the same repository and make that branch public.

This approach gives me two branches to work with: the master, which is private, and the gh-pages, which is public. Using Jekyll I was able to quickly set up a basic website in the gh-pages branch of my repository. Once I had the basic site outline and index page set up, I needed a way to make content show on each page, but only for people who had access.

To secure my website I did two things. First I set up a JSON file that was stored in the private master branch of my site repository. Then I set up my site to pull its navigation and the content for each page from this JSON file. My site would load the home page, about page and other pages from this private JSON file.

Next I needed to provide a key to my private website, one that would allow the public gh-pages branch to pull the JSON file from the private master branch. I opted to use oAuth as the authentication layer, going to my Github settings and generating an oAuth key for a specific application I set up for the project. Using this key I can control who has access to my site.

When you visit the website URL and append the oAuth token as a variable, all of the content shows on the website. Using Github.js I pass the oAuth token, authenticate with the master repository, pull the contents of the JSON file and publish it to the page.
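The token handling can be as simple as reading a query string parameter. A minimal sketch, where the parameter name `token` is my own illustration; in the browser the href would come from window.location:

```javascript
// Pull the oAuth token from the page URL, e.g. https://site/?token=abc123.
// Returns null when no token is present, so the page can fall back to a
// splash screen instead of calling the Github API.
function tokenFromHref(href) {
  return new URL(href).searchParams.get('token');
}
```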

My JSON file located in the private master repository acts as a sort of backend database to my public gh-pages repository which is actually viewed publicly at the website URL. Using JavaScript, the Github API and oAuth for security I'm easily able to control access to the web application.

A more advanced approach would be to require oAuth authentication using Github, allowing each user to be managed through Github team or collaborator settings. Then I could control exactly which users have access. Either way, the same approach applies, and it is a pretty quick and dirty way to launch a private website or application that runs completely on Github.



from http://feedproxy.google.com/~r/KinLane/~3/oLl0KDi_bZ8/private-web-application-running-on-github

Saturday, October 19, 2013

Transparency Is Not Just About Github, Crowdsourcing, Open Source And Open APIs

I wrote a piece on the rollout of Healthcare.gov, and while there are numerous illnesses in the government that contributed to the launch being such a failure, my analysis took it up to the highest level possible, where the biggest problem can be attributed to a lack of transparency.

The post got a lot of comments via Twitter, LinkedIn, Facebook and other conversation threads I participated in, from people who disagreed with me and kept interpreting my use of transparency as referring to using Github, crowdsourcing, open source software or APIs--stating that these elements would not have saved the project, and that we just needed to fix government contracting and get the right people on the job.

These responses fascinate me and reflect what I see from technologists across the space. Developers often bring their luggage with them and don't engage with people or read articles entirely; they bring their understanding of a certain word, attach to it and plow forward without critical analysis or deeper background research. I'm not exempt from this--I work hard to reverse this characteristic in my own personality.

What I mean by transparency is letting the sunlight into your overall operations, by default. In the case of Healthcare.gov, one of the numerous contractors applied this to front-end development, but the rest of the supply chain did not. The front-end group used Github, open source software and APIs, and did crowdsource their work at several critical points of the development cycle. However, even these represent just the visible building blocks, not the resulting effects of "being transparent".

First and foremost, this approach to projects makes you, the developer, project or product manager, think differently about how you structure things. You know that your work will see the light of day and potentially be scrutinized by others, and that immediately changes how you work. There is no hiding in the shadows, where mistakes, cut corners and shortcomings can be kept from the public.

Even if you don't use Github, never listen to comments or issues raised by the public, keep all software proprietary and talk directly to your code libraries and database, simply showcasing the project work out in the open will let you see the benefits of transparency. It just so happens that Github, established feedback loops, open source software and APIs help amplify transparency, and let in the healing benefits of sunlight.

There are numerous reasons I hear for NOT doing this. They are usually framed as the additional resources required or a lack of expertise with open source projects, but in reality these excuses tend to mask incompetence, insecurity, corruption, or a deep-rooted belief that protecting your intellectual property will result in more money to be made.

Transparency isn't about a specific tool, platform or process. It is about opening up, letting other people in, and possibly being almost entirely public in everything you do. I agree that not everyone is ready for this approach, and it may not be suited to every business sector, but I think you'd be surprised how easy it actually is, and how much it can help you learn, grow and reduce the spread of illnesses within your project life cycle that might eventually cause you to fail.



from http://feedproxy.google.com/~r/KinLane/~3/mIJ3PBPLkDQ/transparency-is-not-just-about-github-crowdsourcing-open-source-and-open-apis

Added Video of My API Talk at #OpenVA at University of Mary Washington

I gave a talk as part of the Mind the Future discussion at the #OpenVA gathering on the University of Mary Washington campus last Monday.

My talk was focused on helping educational institutions understand the role APIs will play in the future of education, and in helping ensure web literacy across our society.

You can find the slides from my talk, along with my other talks, on Github. I've added the talk to the video section of my site, but you can also view it below.



from http://feedproxy.google.com/~r/KinLane/~3/6DSys8zcz14/added-video-of-my-api-talk-at-openva-at-university-of-mary-washington

Tuesday, October 15, 2013

Securing Site That Runs on Github Pages With JSON Backend In Private Repository

I have been deploying websites that run 100% on Github, using Github Pages and Jekyll for a while now. I'm pushing forward with different approaches to deploying sites and applications using this model, and my recent evolution is securing a website, only allowing specific people to access and interact with the site or application.

In this case, I have a web application that I am developing, and will run on Github, but I'm not ready for it to be public. So I created a private repository, then using the Automatic Page Generator under Github settings, I created a public site for the repository using Github Pages.

Next I created a JSON file that contained the navigation for the site, and each page and its content:
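The original file wasn't included with the post, but a JSON file along these lines would do the job (the field names are my guess, not the actual file):

```javascript
// nav.json: one list for the site navigation, one object of page content,
// keyed so a page's HTML can be looked up from a nav entry.
const site = {
  "navigation": [
    { "title": "Home",  "page": "index" },
    { "title": "About", "page": "about" }
  ],
  "pages": {
    "index": { "title": "Home",  "content": "<p>Welcome.</p>" },
    "about": { "title": "About", "content": "<p>About this project.</p>" }
  }
};
```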

I put this JSON file in the master branch of the repository, which is private. Then, using Github.js, I wrote a little bit of JavaScript with jQuery that pulls the JSON from the master branch and builds the navigation and page content on the page:
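The rendering half of that JavaScript might look something like this sketch. The real version would first pull the JSON file with Github.js, authenticated with the oAuth token; here the parsed JSON is passed in directly so the logic stands on its own, and all names are illustrative:

```javascript
// Build the <li> markup for the site navigation from the parsed JSON file.
function buildNav(site) {
  return site.navigation
    .map(function (item) {
      return '<li><a href="#' + item.page + '">' + item.title + '</a></li>';
    })
    .join('');
}

// With jQuery, the result would be dropped into the page with something
// like $('#nav').html(navHtml).
var navHtml = buildNav({
  navigation: [
    { title: 'Home', page: 'index' },
    { title: 'About', page: 'about' }
  ]
});
```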

Before the page will build, you have to have a valid oAuth token for the repository. In this particular scenario I am just passing the oAuth token through the URL as a parameter, and if the variable isn't present or is invalid, the request for the JSON file just returns a 404 and none of the navigation or site content is updated. For another version I will use oAuth.io to secure the application, and just add people as Github team members if I want them to have access.

Once I'm done with this particular application and am ready to make it public, I will just make the Github repository public and replace the pulling of the master JSON file with a regular jQuery getJSON call, using the JSON to build the site just like I do now.

This approach is definitely not for all applications, but it easily allows me to run applications on Github while maintaining a private, secure backend. I just use Github oAuth security to access any files I want to keep private, and make public only what I need to. In this case, unless you have access, you just see a splash page.



from http://feedproxy.google.com/~r/KinLane/~3/jbihd2BdiPM/securing-site-that-runs-on-github-pages-with-json-backend-in-private-repository

Tuesday, October 8, 2013

Thoughts On Being An Employee

I am entering my first day as a furloughed government worker. I've been suiting up and going to work each day for almost two months. I spend each day going from meeting to meeting, working to carve out 15 minutes here and 15 minutes there to get actual work done.

Today is the first day I didn't suit up and go anywhere. I rolled out of bed, made coffee and got to work reading my feeds, sorting through emails and working through my Evernote notes and tasks. Then I got to work tackling some of the low-hanging fruit on my to-do list.

While it will probably take me a few days to get back into my old rhythm of productivity, I'm already finding some mojo to get things done. I'm struggling to shed the employee framework that I've been subjected to, even for just two short months. I can see how people have difficulty going from having a job to being freelance. Luckily I have the skills, discipline and mindset to pull from, so it shouldn't take me long to get back to normal.

This small glimpse gives me some insight into the damage our current employee framework does to people's creativity and productivity. The rituals of the commute, lunch breaks, meetings, coffee from Starbucks and other items not only take up our days, they drain our energy, leaving us much more exhausted each evening.

I don't think freelance and/or working from home is for everyone. The employee role is not going anywhere, but I really think that, as businesses, we have to consider how we structure "work" for our workers, and as individuals we have to really consider how we find balance, happiness and productivity in our careers.

Each day I spend back in the world of "open work", the chances of me going back to being an employee get slimmer and slimmer.



from http://feedproxy.google.com/~r/KinLane/~3/8VIsxSWCQaw/thoughts-on-being-an-employee

Sunday, October 6, 2013

Lack of Transparency Is Healthcare.gov's Biggest Bottleneck

If you pay attention to the news, you have probably heard about the technical trouble with the launch of the Affordable Care Act, 50 state marketplaces and the central Healthcare.gov site.

People across the country are encountering show-stopping bugs in the sign-up process, and if you go to the healthcare.gov site currently, you get a splash page that states, "We have a lot of visitors on the site right now." If you stay on the page it will refresh every few seconds until, eventually, you might get a working registration form.

I worked at it for hours last night and was finally able to get into the registration process, only to hit errors several steps in, but I eventually got through the flow and successfully registered for an account, scrutinizing the code and network activity behind the scenes as I went along.

There are numerous blog posts trying to break down what is going wrong with the Healthcare.gov registration process, but ultimately many of them are very superficial, making vague accusations about the vendors involved and the perceived technology at play. I think one of the better ones was A Programmer's Perspective On Healthcare.gov And ACA Marketplaces, by Paul Smith.

Late last night, the Presidential Innovation Fellows (PIF), led by round one PIF Phillip Ashlock (@philipashlock), set out to develop our own opinion about what is happening behind the scenes, working our way through the registration process and trying to identify potential bottlenecks.

When you look at the flow of calls behind each registration page you see a myriad of calls to JavaScript libraries, and to internal and external services that support the flow. There definitely could have been more thought put into preparing this architecture for scaling, but a couple of calls really stand out:

https://www.healthcare.gov/marketplace/global/en_US/registration.js
https://www.healthcare.gov/ee-rest/ffe/en_US/MyAccountEIDMUnsecuredIntegration/createLiteEIDMAccount

The second URL pretty clearly refers to the Center for Medicare and Medicaid Services (CMS) Enterprise Identity Management (EIDM) platform, which provides new user registration, access management and identity lifecycle management, letting users of the Healthcare Exchange Plan Management register and get CMS credentials. The registration.js file appears to handle much of the registration process.

Philip identified the createLiteEIDMAccount call as the most telling part of the payload and response, and most likely the least resilient portion of the architecture, standing out as a potentially severe bottleneck. The CMS EIDM platform is just one potential choke point, and it isn't a bleeding edge solution--it is pretty straightforward enterprise architecture that may not have had adequate resources allocated to handle the load. I'm guessing under-allocated server and application resources are playing a rampant role across Healthcare.gov operations.

Many of the articles I've read over the last couple of days make reference to the front-end of Healthcare.gov using Jekyll and APIs, and refer to the dangers of open-washing and technological solutionism, when the problem is most likely an under-allocated, classic enterprise piece of the puzzle that can't keep up. I do agree with portions of the open-washing arguments, specifically around showcasing the project as "open" when in reality the front-end is the only open piece, with the backend being a classic, closed architecture and process.

Without transparency into the entire stack of Healthcare.gov and the marketplace rollouts, it is not an open project, no matter whether any single part of it is--making the "open" label open-washing. The teams in charge of the front-end were very transparent, getting feedback on the front-end implementation and publishing the code to Github for review. It isn't guaranteed, but if the entire backend stack had followed the same approach, publishing its technology, architectural approaches and load testing numbers throughout a BETA cycle for the project, things might have been different on launch day.

Transparency goes a long way toward improving not just the technology and architecture; it can also shed light on illnesses in the procurement, contracting and other business and political aspects of projects. Many technologists will default to thinking I'm talking about open source, open tools or open APIs, but in reality I'm talking about an open process.

In the end, this story is just opinion and speculation. Without any transparency into exactly what the backend architecture of Healthcare.gov and the marketplaces is, we have no idea what the problem actually is. I'm just soapboxing my opinion like the authors of every other story published about this problem over the last couple of days, making them no more factual than some of my other fictional pieces about this being an inside job or a cleverly disguised denial of service attack!



from http://feedproxy.google.com/~r/KinLane/~3/36RZBfaJxqc/lack-of-transparency-is-healthcaregov-biggest-bottleneck

Saturday, October 5, 2013

End To A Very Tough Week in Washington DC

It was a really tough week in Washington DC. We came into the office Monday morning to learn that, in addition to facing a possible government shutdown, about 30% of the workforce in our department at the VA was moving on due to a change in contract. While these folks had an idea the contract was being renegotiated, they only learned they would be leaving that morning.

These folks had been here two years and held quite a bit of knowledge, so their exit represented a serious knowledge drain for the organization. Sure, we will get new bodies, but until they get up to speed that's all they'll be--warm bodies. This type of contracting has to play a significant role in keeping government from being as efficient as it could be.

Then on Tuesday the news came of the government shutdown. While our part of the VA was declared to have funding through Friday, one by one other groups within the VA, and other agencies across government, went silent, with furloughed workers heading home, turning off the lights and agency servers as they left.

As I tried to keep working I was faced with numerous challenges: people I needed to talk to were gone, and websites I depended on were reduced to splash pages, including the frequently used data.gov. I depended on this site for data sets in my daily work, but more importantly for an upcoming hackathon for veterans in NYC.

With this fresh in my mind, I set out downloading and scraping data from existing VA sites, in hopes of preparing and publishing it via Github, so that hackers would have some resources to build web and mobile applications for veterans at The Feast Hackathon in NYC. Not only will the hackathon have limited access to data, the VA leadership that was planning on going won't be able to attend, and any press support the VA was going to provide won't be going out. WTF!

Supposedly Monday is our last day; then we face furlough along with hundreds of thousands of other government workers. For right now I will just drink a beer and think deeply about why I'm in Washington DC. More to come...



from http://feedproxy.google.com/~r/KinLane/~3/vxnE_1RSNms/end-to-a-very-tough-week-in-washington-dc

Thursday, October 3, 2013

I Am Lucky I Am Furloughed

I'm lucky. I'm furloughed, but my kid's education doesn't depend on it. I don't live month to month. I'm not at the beginning of my career, and I'm not a single parent. I took a cut in salary and left my family to go to Washington to answer the President's call to try to make government "cool" and efficient, and I work 70 hours a week to do it. I was handed a plum of a project which possibly will save the government a billion dollars in 2014, but the government just cost us all that and more.

We are now like a runner forced to stop mid-stride for one minute and then allowed to run for one minute--we expend more effort and cover less distance than running at a smooth pace. The cost of the shutdown, even if it is reversed tonight, has already been at least a billion, by my own humble reckoning. That assumes, of course, that you believe we have become a great nation because of good government, not in spite of bad government.

As duty demands, I'm going into work now to shut down the servers that host the Beta site for my project. It is the first day I haven't worn a tie in 4 months. One must always look on the bright side.

After I shut down my servers, I have to leave and am forbidden to do any work for the government as long as the shutdown lasts.



from http://feedproxy.google.com/~r/KinLane/~3/C5wkdyKELcE/i-am-lucky-i-am-furloughed

Monday, September 23, 2013

Content and Data Management Via Github Pages

I'm pushing forward how I'm using Github to build prototypes and even full blown applications. I've been publishing all of my public sites using Github + Github Pages for most of the year, but this last week I've taken my usage of Github to new levels.

Using two important JavaScript libraries I'm able to use Github as an application platform in new and exciting ways:

  • oAuth.io - Simple API integration for Github and other providers
  • Github.js - A JavaScript wrapper for the Github API

Using oAuth.io I'm able to provide a logon via Github that is entirely client-side on Github Pages. There is no server-side handling of the oAuth flow (by me), allowing me to authenticate users via Github Pages, secure content behind it in a private repo, or allow for editing and saving of content and data live on Github once a user is authenticated.

Using this approach, I'm allowing users to visit an application I have built in HTML, CSS, JavaScript and JSON that is running on Github Pages. You can choose to authenticate using Github oAuth, and if I have added you to the list of members of the Github organization the repository exists in, you can edit content and data directly from the web page / app.
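The access check itself can be tiny. A sketch, assuming the app keeps (or fetches via the Github API) a list of allowed logins--the names here are illustrative:

```javascript
// Only logins in the members list get the edit interface; everyone else
// sees the read-only version of the site.
var members = ['kinlane', 'audreywatters'];

function canEdit(login) {
  return members.indexOf(login) !== -1;
}
```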

This is a quick and dirty way to allow a user to manage content via pages and data via JSON, using a custom interface I've built running on Github Pages, with no server-side technology needed--Github does all of the heavy lifting.

I'm playing with different approaches to giving users control over content and data, allowing them to edit pages and blog posts via apps like prose.io, and to edit JSON via JavaScript table editors or tools like Online JSON Editor.

I'm in the early days of building applications via Github in this way, so I'm sure I will get more sophisticated in my approach in the coming months. oAuth.io and Github.js have allowed me to move into this new realm and overcome some pretty significant hurdles in how I secure my applications and provide a backend for them that runs exclusively on Github.



from http://feedproxy.google.com/~r/KinLane/~3/V-Om-Dk44NE/content-and-data-management-via-github-pages