{"API Security"}

API Rate Limiting At The DNS Layer

I just got an email from my DNS provider CloudFlare about rate limiting and protecting my APIs. I am a big fan of CloudFlare, partly because I am a customer and use them to manage my own infrastructure, but also partly due to the way they understand APIs, and actively use them as part of their business, products, and services.

Their email spans a couple of areas of my research that I find interesting, and extremely relevant: 1) DNS, 2) Security, 3) Management. They are offering me something that is traditionally done at the API management layer (rate limiting), but doing it for me at the DNS layer, expanding the value of API rate limiting into the realm of security, and specifically into defense against DDoS attacks--a serious concern.

Talk about an easy way to add value to my world as an API provider. One that is frictionless, because I'm already depending on them for the DNS layer of my web and API operations. All I have to do is sign up for the new service, and begin dialing it in for all of my APIs, which span multiple domains--all conveniently managed using CloudFlare.

Another valuable thing CloudFlare's approach does, in my opinion, is reintroduce the concept of rate limiting to the world of websites. This helps my argument that companies, organizations, institutions, and government agencies should consider having APIs to alleviate website scraping. Using CloudFlare they can now rate limit the website while pointing legitimate use cases to the API, where their access can be measured, metered, and even monetized when it makes sense.

I'm hoping that CloudFlare will be exposing all of these services via their API, so that I can automate the configuration of rate limiting for my APIs at the DNS level using APIs. As I design and deploy new API endpoints I want them automatically protected at the DNS layer using CloudFlare. I don't want to have to do extra work when it comes to securing and managing web or API access. I just want a baseline for all of my operations, and when I need to, I can customize per specific domain, or down to the individual API path level--the rest is automated as part of my continuous integration workflows.
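
To make this concrete, here is a minimal sketch of what that automation could look like, assuming Cloudflare's v4 API and a rate limiting endpoint under each zone--the endpoint path, payload shape, and credentials below are placeholders, so verify everything against their official documentation:

```python
import requests

API_BASE = "https://api.cloudflare.com/client/v4"
# Placeholder credentials -- Cloudflare's v4 API authenticates with headers
HEADERS = {
    "X-Auth-Email": "you@example.com",
    "X-Auth-Key": "your-api-key",
    "Content-Type": "application/json",
}

def protect_endpoint(zone_id, url_pattern, threshold=1000, period=60):
    """Create a rate limit rule for a single API path in a zone."""
    rule = {
        "match": {"request": {"url": url_pattern}},
        "threshold": threshold,  # requests allowed per period
        "period": period,        # sampling window in seconds
        "action": {"mode": "ban", "timeout": 600},
    }
    resp = requests.post(f"{API_BASE}/zones/{zone_id}/rate_limits",
                         headers=HEADERS, json=rule)
    resp.raise_for_status()
    return resp.json()

# Called from a CI workflow, every new endpoint gets a baseline rule
protect_endpoint("YOUR_ZONE_ID", "*api.example.com/v1/*")
```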

See The Full Blog Post


With Each API We Increase The Attack Surface Area

It is easy for me to get excited about a new API. I'm an engineer. I'm a dude. I am the API Evangelist. It is easy to think about the potential for good when it comes to APIs. It is much harder to suspend the logical side of my brain and think about the ways in which APIs can be used in negative ways. As a technologist it is natural for me to focus in on the technology, and tune out the rest of the world--it is what we do. It takes a significant amount of extra effort to stop, suspend the portion of your brain that technology whispers to, and think about the unintended consequences, and the pros and cons of why we are doing APIs.

Technologists aren't very good at slowing down and thinking about the pros/cons of connecting something to the Internet, let alone whether or not an API should even exist in the first place (it has to exist!). As I read a story about the increases in DDoS attacks on the network layer of our online world, I can't help but think that with each new API we deploy, we are significantly increasing the attack surface area for our businesses, organizations, institutions, and government agencies. It feels like we are good at thinking about the amazing API potential, but we really suck at seeing what a target we are putting on our back when we do APIs.

We seem to be marching forward, drunk on the potential of APIs and Internet-connected everything. We aren't properly securing the technology we have, something we can see playing out with each wave of vulnerabilities, breaches, and leaks. We are blindly pushing forward with new API implementations, using the same tactics we use for our web and mobile technology, something we are seeing play out with the Internet of Things, and the vulnerable cameras, printers, and other objects we are connecting to the Internet using APIs.

With each API we add, we are increasing the attack surface area for our systems and devices. APIs can be secured, but from what I'm seeing we aren't investing in security with our existing APIs, something that is being replicated with each wave of deployments. We need to get better at thinking about the negative consequences of doing APIs. We need to stop making ourselves targets. We need to get better at thinking about whether or not an API should exist at all. We need a way to better visualize the target surface area we've crafted for ourselves using APIs, and be a little more honest with ourselves about why we are doing this.

See The Full Blog Post


Discovering New APIs Through Security Alerts

I tune into a number of different channels looking for signs of individuals, companies, organizations, institutions, and government agencies doing APIs. I find APIs using Google Alerts, by monitoring Twitter and GitHub, through press releases, and via patent filings. Another way I am learning to discover APIs is via alerts and notifications about security events.

An example of this can be found via the Industrial Control Systems Cyber Emergency Response Team out of the U.S. Department of Homeland Security (@icscert), which recently issued advisory ICSA-16-287-01, covering a service account permissions vulnerability in OSIsoft PI Web API 2015 R2, on the ICS-CERT website--leading me to the OSIsoft website. They aren't very forthcoming with their API operations, but this is something I am used to, and in my experience, companies who aren't very public with their operations tend to also cultivate an environment where security issues go unnoticed.

I am looking to aggregate API related security events and vulnerabilities like the feed coming out of Homeland Security. This information needs to be shared more often, opening up further discussion around API security issues, and even possibly providing an API for sharing real-time updates and news. I wish more companies, organizations, institutions, and government agencies would be more public with their API operations, and more honest about the dangers of providing access to data, content, and algorithms via HTTP, but until this is the norm, I'll continue using API related security alerts and notifications to find new APIs operating online.
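
This kind of aggregation doesn't take much code. Here is a rough sketch of the approach, pulling an advisory feed and keeping anything that mentions APIs--the feed URL and keyword list are placeholders I would expect to tune:

```python
import feedparser  # pip install feedparser

# Placeholder URL -- point this at the ICS-CERT advisory feed
FEED_URL = "https://ics-cert.us-cert.gov/advisories/advisories.xml"
KEYWORDS = ("api", "web service", "endpoint")

def find_api_advisories(url=FEED_URL):
    """Pull a security advisory feed, keeping entries that mention APIs."""
    feed = feedparser.parse(url)
    hits = []
    for entry in feed.entries:
        text = (entry.title + " " + entry.get("summary", "")).lower()
        if any(keyword in text for keyword in KEYWORDS):
            hits.append({"title": entry.title, "link": entry.link})
    return hits

for advisory in find_api_advisories():
    print(advisory["title"], "->", advisory["link"])
```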

See The Full Blog Post


An Auditing API For Checking In On API Client Activity

Google just released a mobile audit solution for their Google Apps Unlimited users looking to monitor activity across iOS and Android devices. At first look, the concept didn't strike me as anything I should write about, but once I got to thinking about how the concept applies beyond mobile to IoT, and the potential for external 3rd party auditing of API and endpoint consumption--it stood out as a pattern I'd like to have in the filing cabinet for future reference.

Using the Google Admin SDK Reports API you can access mobile audit information by user, device, or auditing event. API responses include details about the device, including model, serial numbers, user emails, and any other element that is included as part of the device inventory. This model seems like it could easily be adapted to IoT devices, bots, and voice clients.
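
For reference, here is roughly what pulling those mobile audit events looks like. I am working from a quick read of Google's documentation, so treat the endpoint and response fields as assumptions to verify:

```python
import requests

# OAuth 2.0 access token with the Admin SDK reports scope (placeholder)
TOKEN = "ya29.your-access-token"

def mobile_audit_events(user_key="all"):
    """List mobile audit activity events for a domain's users."""
    url = (f"https://www.googleapis.com/admin/reports/v1/"
           f"activity/users/{user_key}/applications/mobile")
    resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
    resp.raise_for_status()
    for item in resp.json().get("items", []):
        # Each activity record carries the actor and one or more events
        print(item["actor"]["email"],
              [event.get("name") for event in item.get("events", [])])

mobile_audit_events()
```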

One aspect that stood out for me as a pattern I'd like to see emulated elsewhere is the ability to verify that all of your deployed devices are running the latest security updates. After the recent IoT-launched DDoS attack on Krebs on Security, I would suggest that the security camera industry needs to consider implementing an audit API, with the ability to check for camera device security updates.

Another area that caught my attention was their mention of a feature mobile administrators have been asking for: "a way to take proactive actions on devices without requiring manual intervention." Meaning you could automate certain events, turning off or limiting access to specific API resources. When you open this up to IoT devices, I can envision many benefits depending on the type of device in play.

There are two dimensions to this story for me: that you can have these audit events apply to potentially any client that is consuming API resources, and that you can access this data in real time, or on a scheduled basis, via an API. With a little webhook action involved, I could really envision some interesting auditing scenarios that are internally executed, as well as an increasing number of them being executed by external 3rd party auditors, making sure mobile, device, and other API-driven clients are operating as intended.

See The Full Blog Post


A Dedicated Security Page For Your API Portal

One area I am keeping an eye on while profiling APIs, and API service providers, is any security-related practice that I can add to my research. While looking through DataDog I came across their pretty thorough security page, providing some interesting building blocks that I will add to my API security research. This is all I do as the API Evangelist--aggregate the best practices of existing providers, and shine a light on what they are up to.

On their security page, DataDog provides details on physical and corporate security, information about data in transit, at rest, as well as retention, including personally identifiable information (PII), and details surrounding customer data access. They also provide details of their monitoring agent and how it operates, as well as how they patch, employ SSO, and require their staff to undergo security awareness training. The important part is that they encourage you to disclose any security issues you find--something that is critical for providers to encourage.

Transparency when it comes to security practices is an important tool in our API security toolbox. It is important that API providers share their security practices like DataDog does, helping build trust, and demonstrating competency when it comes to operations. I'm working on an API security page template for my default API portal, and DataDog's approach provides me with some good elements I can add to my template.

See The Full Blog Post


Helping You Address The Security Gap In Your API Infrastructure With Sapience

Welcome to our latest APIWare project--Sapience! It is our team's response to the need for a more API-focused security scanning solution. At the end of 2015, the team was looking for our next project, and they asked me for my thoughts on what the biggest need was in the API sector, based upon my monitoring as the API Evangelist--I quickly responded with security.

Sapience is currently in beta, but I wanted to take a moment to share some of the thinking that has gone into Sapience, and the current state of security when it comes to APIs. We feel pretty strongly that, like security in general, API security is a very large and daunting challenge, and we need to work hard to peel back the layers a bit, and get to work better securing the digital infrastructure that is increasingly being made available via web APIs.

Being API-First Helps With Security
The first stop when it comes to API security is just doing APIs, and making them a priority across all website, mobile, and device-based development, as well as system-to-system integration. Using a consistent interface to access all of your digital assets helps ensure consistency, allowing for potentially more accountability as part of overall security efforts. API-first is the first step of any successful API security strategy.

SSL All The APIs By Default 
Encryption is one of the most important tools in our security toolbox, but unfortunately it is still not the default mode for API providers. Whether it is the costs associated with certificates and implementation, or legacy beliefs about the performance tax encryption can bring, APIs are not always SSL by default. SSL by default is the second step of any successful API security strategy.

API Management Provides Authentication
As I studied the API security landscape, the leading API management providers often dominated the conversation, with their ability to secure APIs using keys, OAuth, and other increasingly common solutions. API management is definitely a significant portion of the frontline when it comes to API security--the problem is when the conversation stops here, and API providers are not actively testing and pushing on their infrastructure at this front line.

Securing The Known Universe With API Definitions And Discovery
Another layer of API security discussion that emerged as I studied the landscape was the important role API definitions are playing when it comes to securing API infrastructure. In short, you can't secure what you don't know about, and having your API-first infrastructure well defined using common API definition formats is significantly helping API providers get their security house in order.

Automated Scanning For Most Common API Vulnerabilities
After API-first practices, SSL by default, modern API management solutions, and robust API definition and discovery work, we get to where Sapience excels--scanning this API infrastructure for common vulnerabilities. I know, many of you will want a magic pill that addresses all of your security needs, but in our rush to deploy APIs for the rapidly expanding mobile landscape, many companies are not actively securing existing infrastructure against the most common threats.
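
To give a feel for the category (this is a toy sketch, not the Sapience engine), definition-driven scanning can start as simply as walking a Swagger/OpenAPI 2.0 file and flagging the most basic gaps, like a missing HTTPS scheme, or operations with no security requirement:

```python
import json

def scan_definition(path):
    """Flag basic gaps in a Swagger/OpenAPI 2.0 definition: no HTTPS
    scheme, and operations that declare no security requirement."""
    with open(path) as f:
        spec = json.load(f)

    findings = []
    if "https" not in spec.get("schemes", []):
        findings.append("definition does not list https as a scheme")

    global_security = spec.get("security")
    for route, methods in spec.get("paths", {}).items():
        for verb, operation in methods.items():
            if verb.lower() not in ("get", "post", "put", "patch", "delete"):
                continue  # skip path-level parameters and extensions
            if not operation.get("security", global_security):
                findings.append(f"{verb.upper()} {route} has no security requirement")
    return findings

for finding in scan_definition("api-definition.json"):
    print("WARNING:", finding)
```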

In my monitoring of the space I regularly come across technology solutions that provide comprehensive online security, and even more agencies who will help you secure your company's online presence, but as of January 2016 there were no API-specific SaaS solutions that helped address even the simplest vulnerabilities when it came to security. This is why I identified security as the number one problem out there, and why the APIWare team jumped at the opportunity to develop an API-specific solution.

APIs have provided a much healthier approach to defining the digital infrastructure of companies, organizations, institutions, and government agencies for the last 10 years. The next stage of this evolution is continuing to bring security out of the IT shadows, and acknowledging that much of this infrastructure is running on the open Internet, even if it is hidden behind web, mobile, or Internet of Things applications. At APIWare, we want to help lead this conversation, and this is why we started Sapience.

Contact us to get started scanning your critical API infrastructure today.

See The Full Blog Post


We Need to Change the Psychology of Security

http://motherboard.vice.com/read/we-need-to-change-the-psychology-of-security

See The Full Blog Post


A Healthy Stance On Privacy And Security When It Comes To Healthcare APIs

I am reading through the API task force recommendations out of the Office of the National Coordinator for Health Information Technology (ONC), which help address privacy and security concerns around mandated API usage as part of the Common Clinical Data Set, Medicare, and Medicaid Electronic Health Records. The recommendations contain a wealth of valuable insights around healthcare APIs, but are also full of patterns that we should be applying across other sectors of our society where APIs are making an impact. To help me work through the task force's recommendations, I will be blogging through many of the different concepts at play.

Beyond the usage of "patient-directed APIs" that I wrote about earlier, I thought the pragmatic view on API privacy and security was worth noting. When it comes to making data, content, and other digital resources available online, I hear the full spectrum of concerns, and it leaves me optimistic to hear government agencies speak about security and privacy in such a balanced way.

Here is a section from the API task force recommendations:

Like any technology, APIs allow new capabilities and opportunities and, like any other technology, these opportunities come with some risks. There are fears that APIs may open new security vulnerabilities, with apps accessing patient records "for evil", and without receiving proper patient authorization. There are also fears that APIs could provide a possible "fire hose" of data as opposed to the "one sip at a time" access that a web site or email interface may provide.

In testimony, we heard almost universally that, when APIs are appropriately managed, the opportunities outweigh the risks. We heard from companies currently offering APIs that properly managed APIs provide better security properties than ad-hoc interfaces or proprietary integration technology.

While access to health data via APIs does require additional considerations and regulatory compliance needs, we believe existing standards, infrastructure, and identity proofing processes are adequate to support patient directed access via APIs today.

The document is full of recommendations on how to strike this balance. It is refreshing to hear such a transparent vision of what APIs can be. They weigh the risks alongside the benefits that APIs bring to the table, while also being fully aware that a "properly managed API" provides its own security. Another significant aspect of these recommendations for me is that they also touch on the role that APIs will play in the regulatory and compliance process.

I have to admit, healthcare APIs aren't one of the most exciting stacks in the over 50 areas I track across the API space, but I'm fully engaged with this because of the potential for a blueprint for privacy and security that can be applied to other types of APIs. When it comes to social, location, and other data, the bar has been set pretty low for privacy and security, but health care data is different. People tend to be more concerned with access, security, privacy, and all the things we should already be discussing when it comes to the rest of our digital existence--opening the door for some valuable digital literacy discussions.

Hopefully, I don't run you off with all my healthcare API stories, and you can find some gems in the healthcare API task force's recommendations, like I am. 

See The Full Blog Post


Competing Views Around The Value And Ownership Of Digital Resources Impacting API Security

I was reading a post about how having an unclear sense of ownership hurts API security, which showcases the different views on who owns security when it comes to exposing corporate digital assets via APIs. When I read the title, I anticipated the story being about a difference in how ownership of the asset(s) itself is viewed, but it ended up focusing on the ownership of security itself, not ownership of the assets being exposed--the angle I think gets closer to the root of the problem than who "owns security".

In short, the IT staff and developers who are often charged with exposing corporate assets via APIs will view those digital resources in very different ways than other people at the company. These groups will focus on exposing digital resources in a very technical sense, making them available so that others can integrate them into their apps and systems--it isn't always in their nature to secure things sensibly. Their focus is to open up access, something the article touches on, but I think it goes deeper than just API security. Developers and IT are rarely ever going to see the digital resource in the same way that business stakeholders will, let alone security focused players (hopefully your IT and dev teams have specialists influencing things).

APIs have made a name for themselves because a handful of companies successfully exposed their digital resources in this new way, allowing external perspectives on those digital resources to enter the conversation, which allowed for innovation to occur. In this handful of API origin stories we like to tell, owners of the digital resources at play were open to outside views of what their digital resource was, and how that resource could be put to use. These leading companies were open to an alternative view of the ownership and access of these digital resources, something that allowed API platforms to flourish--but this is not something that will happen in all situations.

APIs really begin to go wrong when the sense of ownership around a digital resource is already unhealthy, resulting in what my friend Ed Anuff speaks of, with everyone doing the API economy wrong. Without proper buy-in, developers and IT will often overlook security around the resources being exposed--they just don't understand the importance of the resource in the same way. Coming from the opposite direction, business users will often come in and apply their "wet blanket" sense of ownership to the platform--resulting in heavy-handed registration and approval flows, sales cycle(s), pricing, rate limits, and other common things that slow API adoption.

APIs should be about exposing our digital resources using now-ubiquitous Internet technology, in a way that opens up our resources, and the culture of our businesses, organizations, institutions, and government agencies, to outside views about what our resources are, and how they can be put to use. This is something that, when done in the right environment, can reap some serious benefits for everyone involved, but when done in a culture where there is already an imbalance around what digital resource ownership is, shit can really go wrong--with security being just one stage where it plays out. In the end, APIs are not for everyone. Some people just have too strict a view around the value of their digital resources, and of the ownership of those resources, for the API thing to ever actually work--with company IT and developer security practices being just a symptom of a much larger illness.

See The Full Blog Post


There Are Two Types of Online Security Discussions Going On Currently, One Is Real, And The Other Is Theater

I added "security" as an area of my monitoring of the API space a couple years back, where I curate news stories, white papers, and other resources on the topic of online security. I have carved off most of the API related aspects of this into my security research project, but I have also noticed another schism emerging in much of the information I'm gathering during the course my work.

There are two distinct types of conversations going on. The first is about security, encryption, and many of the technical aspects of how we protect ourselves, our businesses, and our organizations online. The second focuses on the theater around all of this, using many of the same words, but is far less technical, much more emotional, and wraps itself in special words like "cybersecurity".

This security theater is used by the NSA, FBI, police, and other government organizations, but it isn't exclusive to these groups. Tech giants like Apple, Google, Facebook, and Twitter also play the security theater card when it benefits them--kind of like a soccer player overplaying a foul to get the attention of the referee. It can be difficult to tell when these groups are truly discussing security, and when they are putting on a play or skit on the online security stage--for the rest of our benefit.

Security theater, aka "cybersecurity", plays just as important a role in security as the technical nuts and bolts. If you get citizens acting on their emotions and fears, it is much easier to shift the conversation in your favor, without ever having to do anything truly technical. Cybersecurity is not that new--it is just an evolution of previous security fear tactics used by the military industrial complex, and government leaders, to control the population, now being adapted to the new digital landscape we find ourselves living in.

Security theater will continue to be a box office hit in coming years!

http://security.apievangelist.com/

See The Full Blog Post


Be More Transparent With Your Security Using APIs

See The Full Blog Post


Security Will Increasingly Be Used As A Component Of Tiered API Planning

As I look through the business models of the leading API providers I am profiling, I'm increasingly seeing security as a selling point. When API providers break down their pricing into tiers, they are usually very good at breaking down the elements of what goes into each plan--this is what I have been studying for the last couple of weeks.

When I come across security leveraged as part of API plans, it is rarely a part of the free or entry levels, and is something you usually see in the higher level paid plans, and enterprise tiers. Here is an example of this, in a screen shot from the Box pricing page.

These plans are part of the SaaS side of the Box operations, but Box is pretty unique in that they have separate pricing tiers for the SaaS side of their operations, and a related, but additional, set of pricing for the API side of their platform. However, Box's approach provides the best example of this in action, with add-ons beyond the plan-based pricing that are also very security focused.

Box is a document platform that services the enterprise, which includes numerous, very heavily regulated industries. It makes sense that they are emphasizing security. My focus is that security is leveraged as a specific feature of individual plans, with an emphasis on it being present in the upper tiers, and with add-ons.

This really isn't news. It makes sense that an emotionally charged element like security is used as a component of planning and pricing. I'm just looking to document security as a component of API planning for my research, and educate other platforms about the potential for this type of use, but I am also looking to better understand how different companies, in different industries, are wielding it (or not).

I predict that security will increasingly be used as a component like this in SaaS and API planning. In the current, very insecure online environment we are all living in, individuals and companies will pay a premium for real or perceived security. My goal is to better discover when security is wielded like this, and try to better understand when it is a real component of API planning versus when it's just used as an emotional way to convince people to upgrade to higher level plans.

See The Full Blog Post


You Have Had Three Security Strikes, Hand Over Your Data Storage Operations For 12 Months Plus Probation #APIDesignFiction

Your company has had its third strike. Your first security breach was in January of 2018, with the second only six months later in June, and the last came in February of 2019. Your company has shown that it just doesn't have what it takes to secure your users' data, and we are going to need you to hand over the keys to your data storage for a 12 month period.

During this time, you will access your company data via APIs that are monitored by a professional data management organization, as well as by state and federal auditors, to make sure all required security and privacy procedures are followed. All server, database, and storage operating procedures will be documented, and shared with your organization when the 12 month period is complete.

We have assessed that 60% of your infrastructure uses APIs, so the switch-over to the new infrastructure will not be that difficult. Your company will have 45 days to accomplish the other 40%, refactoring all your software to use APIs. Part of the illness within your organization was that this 40% of your operations was technical debt that your organization refused to bring up to speed--resulting in several large breaches.

When we hand your data storage infrastructure back to you, we will audit for another 18 month period to ensure you are practicing a 100% API strategy, as well as end-to-end encryption, making sure it is applied for all servers, storage, and in transit using SSL. During this 18 month period our auditors will assess whether you have the resources to bring your operations up to an acceptable level. If you do not meet requirements, the period can be extended, or your infrastructure can be ordered back into a forced-management situation again.

If you have any questions, please contact your case manager, and your IT operations manager will be in touch shortly with more details on the coming transition period.

See The Full Blog Post


To Incentivize API Performance, Load, And Security Testing, Providers Should Reduce The Bandwidth And Compute Costs Associated

I love that AWS is baking monitoring and testing in by default with the new Amazon API Gateway. I am also seeing new services from AWS and Google providing security and testing services for your APIs, and other infrastructure. It just makes sense for cloud platforms to incentivize the security of their platforms, but also to ensure wider success through the performance and load testing of APIs as well.

As I'm reading through recent releases and posts, I'm thinking about the growth in monitoring, testing, and performance services targeting APIs, and the convergence with a growth in the number of approaches to API virtualization, and what containers are doing to the API space. I feel like Amazon is baking monitoring and testing into API deployment and management because it is in their best interest, but it is also an area where I think providers could go even further when it comes to investment.

What if you could establish a stage of your operations, such as QA, or maybe production testing, and the compute and bandwidth costs associated with operations in these stages were significantly discounted? Kind of like the difference in storage levels between Amazon S3 and Glacier, but designed specifically to encourage monitoring, testing, and performance on API deployments.

Maybe AWS is already doing this and I've missed it. Regardless it seems like an interesting way that any API service provider could encourage customers to deliver better quality APIs, as well as help give a boost to the overall API testing, monitoring, and performance layer of the sector. #JustAThought

See The Full Blog Post


Reality Show Surveillance Package: Why Pay For Security, When We Can Pay You? #DesignFiction

In 2020, why would you go with any of the mainstream home security providers? The RealityCom Reality Show Surveillance Package is tailored perfectly for the modern family. We do not just keep your family safe 24/7, we also help amplify the most important aspects of your life, and share them with family and friends, and even the public.

Our critics call us a "modern surveillance apparatus", but these people live in the past. Privacy is a concept of the last century, and the modern family has embraced being not just a consuming family, but one that contributes to, and participates in, the best reality programming out there. The most efficient, and cost effective, way to keep your family safe and sound is through what we call "transparent living", where your home, possessions, and loved ones are all monitored, and plugged into the RealityCom Security Network.

When our security production staff finds an interesting scenario, our programming staff is notified, and we take the family moment and stream it in near real-time to the audience of your choosing. Using the transparent living technology platform, you get to share with family and friends, and when we identify a quality moment of programming, we will pay you for the media and content--if your family becomes a viral success, we pay you exponentially more, depending on the attention your family commands.

Not every family becomes a paid RealityCom Family, but you will not know if you have what it takes unless you get started with your Reality Show Surveillance Package. The best part is that it is all free. We come out and install all the equipment, maintain all the equipment, and store all the video, audio, and other content at no cost to you. You get modern home, auto, and work security for you and your family, for FREE! It doesn't get any better than this--sign up today, so we can get started with your installation.

See The Full Blog Post


The Functional Purpose Of Cybersecurity Working Groups

See The Full Blog Post


Breaking Down The Layers of API Security And Considering Link Integrity

One of the reasons I set up individual research projects is to provide me with a structure for better defining each aspect of the API world, something I am working hard to jump-start within my API security research. You will notice the project does not have any building blocks defined, which, when you compare with one of my oldest research areas, you start to see what I mean.

The blog posts, and other links I curate as part of my API security research, will help me find companies and tools that are providing value to the space. As I break down each company, and what they offer, I often have to read between the lines, trying to understand how an API, service, or tool can be used by API providers, as well as potentially API consumers. I am looking for APIs that offer security, but also APIs that offer security to APIs--make sense?

As part of this research, I am playing with Metacert, which bills itself as a security API for mobile application developers, helping them block malicious ads, phishing links & unwanted pornography inside apps, but I think it is so much more. I could see Metacert being pretty valuable to API providers, as well as API consumers building web and mobile apps. Security isn't always about brute force attacks--a threat could easily be a simple link, submitted along with some content, via your API.

I am adding Metacert to my API security research, with a focus on its potential for API providers. I could see API providers seamlessly integrating the Metacert API into their own stack, processing all links that are submitted through regular operations. I will also be adding link screening like this as a building block in my API security research.
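
As a sketch of what that integration might look like, here is the pattern of screening every submitted link before it enters a platform. The Metacert endpoint, parameters, and response fields below are hypothetical stand-ins, so consult their actual documentation:

```python
import requests

# Hypothetical endpoint and auth header -- check Metacert's real docs
METACERT_URL = "https://api.metacert.com/check"
METACERT_KEY = "your-api-key"

def link_is_safe(url):
    """Screen a user-submitted link through a link-screening API."""
    resp = requests.get(METACERT_URL,
                        params={"url": url},
                        headers={"apikey": METACERT_KEY})
    resp.raise_for_status()
    # Assumes the response labels the link with a category we can act on
    return resp.json().get("category") not in ("malware", "phishing", "pornography")

def accept_submission(payload):
    """Reject any submission carrying a link that fails screening."""
    for url in payload.get("links", []):
        if not link_is_safe(url):
            raise ValueError(f"rejected unsafe link: {url}")
    # ...store the submission as usual...
```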

If you are looking for a wise investment in the API security space, you should be talking with Metacert. APIs like Metacert provide us with a model for thinking about how we deliver API driven security services for web, mobile, and IoT applications, but it also provides a potential wholesale API layer that other APIs can use to better secure their own APIs. I consider it a strong blueprint, because it is API driven, they have all the essential building blocks, including a monetization strategy, and they do one thing, and they do it well.

See The Full Blog Post


Why Would You Want To List All Of Your University's Web Services (APIs) Out In The Open, Via A Central Portal? What A Security Risk!

I went up to California State University Channel Islands the other day to talk APIs with their tech team, and I was happy to find at least one strong API skeptic on the team. API skeptics give me material for stories, so I thoroughly enjoy coming across them, and telling these stories is how I keep polishing my argument for the next API skeptic I encounter in campus IT, at the higher educational institutions that I visit.

During the discussion I was asked several interesting questions, the first one being: why would you want to list all of your web services (APIs) for your public university, out in the open, via a central portal?? What a security risk!!

Sorry, But Security Through Obscurity Is Not A Strategy
I'm sorry, but hiding things, and hoping nobody finds them, is not a valid IT strategy. You are a public university, and you should have sophisticated identity and access management, as well as a tight approach to security. If you can't properly secure an API resource available at a public URL, you shouldn't be running campus IT, sorry. You should be able to provide a public description of what resources are available, without giving away the farm.

The Feeling You Have —It Comes From Legacy Power And Control
As an IT professional, if you don't acknowledge that power and control exist within classic IT processes, you are in denial. I've taken over numerous company IT operations, and there are ALWAYS political struggles between IT and the rest of operations. If you list all digital resources in a single portal, and make them available in a self-service way, it disrupts existing power and control structures. When you make things hard to find, you play a central role in people finding these resources, and you are in power--adding to decades of legacy stories about IT being a negative force in operations. Self-service access to campus resources is the future, no matter how much you resist.

Centralized Location To Aggregate All Digital Resources
A centralized API portal provides a single place for anyone to discover or share their digital resources. Like the main website, administrators, faculty, students, parents, and the public will know there is a single place to find machine readable versions of campus resources. An API portal is more than just a listing of APIs; it includes documentation, code samples, widgets, buttons, spreadsheet connectors, visualizations, blogs, Twitter accounts, and other resources that can turn a simple portal into an active ecosystem where everyone collaborates around campus resources.

Self-Service Access For 3rd Party Vendors Delivering Vital Campus Services
It takes resources to engage with 3rd party vendors on campus, and a single, self-service portal provides a standardized way for vendors to access the institutional resources they need to deliver the services they are bringing to campus operations. There is no reason that IT or departmental contacts should ever be bottlenecks in delivering the information vendors will need. API portals should provide them with the access they need, along with proper identity and access management, and monitoring of exactly how vendors are accessing, and putting campus resources to use (or not).

Interoperability Between Higher Educational Institutions
Most higher educational institutions are part of a larger network of institutions, and at the very least have relationships with other schools in which information and resources are shared. As with vendors, much of this can be provided via a self-service API portal, where institutions can find the data, content, and other resources they need, able to access only the resources they are supposed to during the fulfillment of established relationships.

Smoother Interactions With Local, State, And Federal Government
Higher educational institutions have regular integration points with local, state, and federal government agencies. Data is reported on a regular basis, back and forth, and a central API portal is excellent for aggregating the common resources a government agency will be asking for. API portals are not just for student hackers, or 3rd party developers--they are increasingly how government agencies are sharing, and getting what they need to regulate, monitor industry, and govern more effectively.

It Is About People Getting Access To What They Need
Central API portals are about making sure people get access to what they need, and at a public university, this should be priority number one. You have data, content, and other digital resources that students, faculty, administrators, vendors, government, parents, and alumni need, and because you aren't confident in securing them from a handful of hackers, you are going to hide all of this away, and forgo all the benefits? I'm sorry, the perceived negatives just don't outweigh the positives. There are proven ways to secure APIs, and for much of university operations, it won't hurt to provide a title, description, and location of a resource, and let identity and access management handle who should be able to get at resources--or not.

Bringing IT Out of Shadows Across Campus Operations
Security is a concern, and should be front-and-center in all conversations, but it is too often used to hide away insecurities, incompetency, and shortcomings, rather than actually address the root of security concerns. If a resource is accessible via Internet protocols, over campus networks, it should be mapped out, with the definition made publicly available in a sensible way. This conversation should involve careful consideration of what resources should be public, and what should remain private, with a heavy emphasis on transparency at a public university.

Between 2000 and 2007, we started seeing a shift of IT services into the cloud, and much of this happened as unsanctioned shadow IT, by administrators, faculty, and students looking to get their work done, because IT hasn't been able to keep up. The cloud evolution of IT has been entirely API driven, and making IT resources available to the public at Amazon sounded insane at first, but it has changed not just how Amazon operates--it has evolved how the entire world deploys software architecture, using APIs.

There are many great examples of leading companies making personally identifiable information, banking, healthcare, and other vital data and content available on the open Internet, in a secure way. It is not a security risk to share information, and provide access to your valuable resources, in a central, publicly available portal--if you do it right. Who knows, if you do open up, there may be some unintended consequences, leading to IT being seen as a positive influence on the education process, as opposed to the roadblock image it often has on campuses across the country.

See The Full Blog Post


The Politics, Marketing, And Fear of API Security

I cringe, when I think about the number of mobile applications out there, that people depend on in their personal and professional lives, that are using insecure APIs, allowing personally identifiable information (PII) to flow across the open Internet. I’m a big advocate for helping mobile developers understand the important role that a public API can play in this situation, but another side of the discussions that also scares me is the fear, uncertainty, and doubt (FUD) that also emerges as part of this conversation.

I've covered recent security and privacy incidents involving mobile usage of APIs at Snapchat and Moonpig, and was processing the API news this morning for inclusion on API.Report when I came across this press release: Wandera Finds Official NFL App to be Leaking Users' Personal Data Just Days Ahead of The Big Game. I can get behind the NFL securing their users' personal data, but have a hard time jumping on the bandwagon when this is used as PR for a mobile security service, in the lead up to the Superbowl.

Understanding how everyday mobile apps are using APIs to communicate in the cloud is important. Sharing stories about how to map out this very public surface area, and secure it properly, while giving end-users more awareness and control over their PII, is critical. Doing this in the name of marketing, PR, or to ride a fear hype wave is not ok. Yet, I fear this will become commonplace in coming months and years, as security breaches, cybersecurity, and privacy are front and center in the media.

See The Full Blog Post


Cybersecurity, Bad Behavior, and The US Leading By Example

As I listened to the State of the Union speech the other day, and stewed on the topic for a few days, I couldn't help but see the future of our nation's cybersecurity policy through the same lens as I view our historic foreign policy. In my opinion, we've spent many years behaving very badly around the world, resulting in very many people who do not like us.

Through our CIA, military, and general foreign policy we've generated much of the hatred towards the west that has resulted in terrorism even being a thing. Sure, it would still exist even if we hadn't, but we've definitely fanned the flames until it has become the full-fledged, never-ending, profitable war it is today. This same narrative will play out in the cybersecurity story.

For the foreseeable future, we will be inundated with stories of how badly behaved Russia, China, and other world actors are on the Internet, but it will be through our own bad behavior that we fan the flames of cyberwarfare around the world. Ultimately I will be reading every story of cybersecurity in the future while also looking in the collective US mirror.

See The Full Blog Post


APIs Role In Data Security And Privacy

As we get close to wrapping up the first month of 2015, it is clear that Internet security and privacy will continue to be front and center this year. As technology continues to play a central role in our personal and business lives, security, transparency, and respect for privacy are only growing more critical.

I know I'm biased in thinking that APIs will continue to take a central role in this conversation, but I feel it is true. Many of the existing conversations around security at platforms like Snapchat and MoonPig are directly related to APIs, while the security exposure at companies like Sony, JP Morgan Chase, and beyond could easily be reduced with a sensible API strategy.

Companies are increasingly operating online, but do not act like any of their information lives in an online environment. Adopting an API approach to defining company resources helps map out this surface area, acknowledges it is available over the Internet, and works to define, secure, and monitor this surface in a healthier way.

Mobile users need access to their data, and by applying an API centric approach--providing account management, data portability, and access and identity controls using oAuth--you can increase transparency, while also strengthening overall security. If your company's operations are centered around customer and end-user data transactions, you should be making all data points available via an API, accompanied by a well-oiled oAuth layer to help end-users manage their resources, playing a significant role in their own privacy and security.
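
The pattern is simple enough to sketch. Here is a bare-bones example of an API resource that only responds to an end-user-authorized token, using Flask, with a placeholder token check standing in for a full oAuth layer:

```python
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

def user_for_token(token):
    """Placeholder: a real oAuth layer validates the access token and
    returns the end-user who granted it, along with their scopes."""
    return {"id": 42, "scopes": ["account:read"]} if token else None

@app.route("/account")
def account():
    auth = request.headers.get("Authorization", "")
    user = user_for_token(auth.replace("Bearer ", "", 1))
    if not user or "account:read" not in user["scopes"]:
        abort(401)
    # Every data point the platform holds is available here, with access
    # that can be logged, metered, and revoked by the end-user
    return jsonify({"user_id": user["id"], "email": "user@example.com"})
```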

I'm not delusional in thinking that APIs provide a perfect solution for all of our security and privacy woes--they don't--but they do set the tone for a healthier conversation about how companies are doing business on the open Internet, and how we can better secure the online web, mobile, and device-based applications we are increasingly depending on in this new world we have created.

See The Full Blog Post


Providing An oAuth Signature Generator Inline In Documentation

I talked about Twitter's inclusion of rate limits inline with documentation the other day, which is something I added as a new building block that API providers can consider when crafting their own strategy. Another building block I found while spending time in the Twitter ecosystem was an oAuth signature generator inline within the documentation.

While browsing the Twitter documentation, right before you get to the example request, you get a little dropdown that lets you select from one of your own applications, and generate an oAuth signature without leaving the page.

I am seeing oAuth signature generators emerge on a number of API platforms, but this is the first inline version I'm seeing. I've added it to my tentative list of oAuth and security building blocks I recommend, but will give it some time before I formally add it. I like to see more than one provider do something before I put it in there, but sometimes, when it is Twitter, that can be enough.
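
For anyone wondering what these generators actually produce, here is the gist of an OAuth 1.0a HMAC-SHA1 signature, per the spec Twitter follows. The keys and tokens below are placeholders, and any query or body parameters on the request would also need to be folded into the base string:

```python
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote

def oauth1_signature(method, url, params, consumer_secret, token_secret=""):
    """Build an OAuth 1.0a HMAC-SHA1 signature over the request."""
    # Percent-encode and sort all parameters into the base string
    encoded = sorted((quote(k, safe=""), quote(str(v), safe=""))
                     for k, v in params.items())
    param_string = "&".join(f"{k}={v}" for k, v in encoded)
    base_string = "&".join([method.upper(), quote(url, safe=""),
                            quote(param_string, safe="")])
    signing_key = f"{quote(consumer_secret, safe='')}&{quote(token_secret, safe='')}"
    digest = hmac.new(signing_key.encode(), base_string.encode(), hashlib.sha1)
    return base64.b64encode(digest.digest()).decode()

params = {
    "oauth_consumer_key": "your-consumer-key",
    "oauth_token": "your-access-token",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_timestamp": str(int(time.time())),
    "oauth_nonce": uuid.uuid4().hex,
    "oauth_version": "1.0",
}
params["oauth_signature"] = oauth1_signature(
    "GET", "https://api.twitter.com/1.1/statuses/user_timeline.json",
    params, "consumer-secret", "token-secret")
```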

See The Full Blog Post


Internet Of Things Security And Privacy Will Always Begin With Asking If We Should Do This At All

As I read and listen to all of the Internet of Things stories coming out of CES, I'm happy to hear discussions around privacy and security coming out of the event. I feel better about IoT security and privacy when I hear things like this, but ultimately I am left with overwhelming concern about the quantity of IoT devices.

There are many layers to securing IoT devices, and protecting the privacy of IoT users, but I can't help but think that Internet of Things security and privacy will always begin by asking ourselves if we should be doing this at all. Do we need this object connected to the Internet? Are we truly benefiting from having this item enabled with cloud connectivity?

I'm going to try and keep up with tracking the API layer being rolled out in support of IoT devices, but I am not sure I will be able to keep up with the number of devices, and the massive amount of hype around products and services. At some point I may have to tap out, and focus on specific aspects of IoT connectivity, around what I consider the politics of APIs.

See The Full Blog Post


Treating All Mobile Application API Usage Like It Is External

I have read several stories about security breaches in the past couple days, ranging from exploitation of APIs across the distributed systems we are increasingly depending on, to no security at MoonPig for their mobile app, and the reverse engineering of a popular mobile app—this time it is the Kayak mobile app.

Maybe I am biased, but I can't help but think that, in a world increasingly driven by mobile devices, we need to treat all applications like they are external, get to work setting up a proper API program, and secure all apps the same way, no matter whether they are used for internal, partner, or public usage. Treating internal apps differently opens things up for a whole world of hurt when internal systems are compromised, and ignoring how public facing your API is, when embedded in a popular mobile app, falls into the same category.

If you have a single API stack that ALL of your system integrations, web, mobile, and device-based apps use, you have established a single surface area you need to secure. Then, if you are using modern API infrastructure--a surface area that you can monitor, with relevant service composition established, tailored for internal, partner, and public access of resources--you will be able to stay in tune with your API usage, and identify negative situations much faster, across all devices.

If you are operating your business on the Internet today (I hope you are), consider treating all your apps as external, no matter where they reside, or who uses them--your overall architecture will be much healthier, and more resilient, because of it.

See The Full Blog Post


Another High Profile Mobile To API Security Breach, This One At MoonPig Greeting Cards

I saw a story of yet another security breach related to how mobile phones are using APIs today. This one is from Paul Price, on his blog ifc0nfig.com, about the greeting card site MoonPig.

Paul highlights not just the lack of, but the actual absence of, security when making API calls to MoonPig, allowing you to impersonate any user, place orders, add/retrieve card information, and use any other API driven feature of the mobile application.

MoonPig is looking into it, of course.

When those of us in the API industry talk about public APIs, many folks think we are talking about APIs like Twitter that have publicly available information, or public APIs that allow for anyone to sign up for the service, when in reality, most of the time we are talking about APIs that use the public Internet for transport.

MoonPig's lack of awareness regarding their API surface area, and the complete lack of security, is something we are going to see play out in numerous very public, high profile security breaches in the coming years. There are hundreds, if not thousands, of companies out there right now who have the same setup, and rely on security through obscurity (nobody has discovered it yet).

I personally recommend treating all APIs used in mobile devices like they are public APIs, with a well defined, very public API definition, bundled with a very public security layer, like oAuth. You can still control who has access to sign up and use the API, but one beneficial side effect is you can also treat all internal apps like they are just another user on the API.

This is the benefit of a public API. Not the myths you often hear about becoming the next Twilio, but the reality that being public will force you to have a much more well designed, and secure, API, one that you acknowledge operates on the open Internet. With this mindset you aren't operating in the shadows--you will begin to treat your API traffic like it is available on the open Internet, which, well, IT IS!!!

Back to the MoonPig story--look at the timeline of disclosure at the bottom of Paul's post. He has given them 17 months since he first reported the issue, and they still haven't fixed it. OUCH!!

See The Full Blog Post


Moving Elasticsearch Into API Management With New API Security And Access Features

Elasticsearch, the open source, distributed, real-time search and analytics engine, just announced that it is introducing a security layer on top of their API driven search platform. Historically you had to secure any APIs exposed via Elasticsearch through your own proxy or firewall solution; now, with "Shield", you can natively manage access to your APIs directly in Elasticsearch.

Shield, in the same spirit as Marvel, is built on top of Elasticsearch's public extension points, and is easily installed as a plugin to add security features to any existing Elasticsearch installation. It does not require a different distribution of Elasticsearch, and relies heavily on the open public APIs Elasticsearch already exposes.

The security Elasticsearch is bringing to the table reflects the core features you see in the API space from API infrastructure providers like 3Scale--providing the basics of what you need to secure access to API endpoints:

  • Role-based Access Control - Set granular cluster, index, and alias-level permissions for each user of your Elasticsearch cluster. For example, allow the marketing department to freely search and analyze social media data with read-only permissions, while preventing access to sensitive financial data.
  • Authentication System Support - Shield integrates with LDAP-based authentication systems as well as Active Directory, so your users don’t need to remember yet another password. We also provide a native authentication system, for those who want to manage all access within Elasticsearch.
  • Encrypted Communications - Node-to-node encryption protects your data from intruders. With certificate-based SSL/TLS encryption and secure client communications with HTTPS, Shield keeps data traveling over the wire protected.
  • Audit Logging - Ensure compliance and keep a pulse on security-related activity happening in your Elasticsearch deployment; record login failures and attempts to access unauthorized information.

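Once Shield is in place, hitting a secured Elasticsearch endpoint looks like consuming any other managed API. A quick sketch, assuming a user and role have already been created on the cluster and HTTPS is enabled--the index, credentials, and certificate path are placeholders:

```python
import requests

ES_URL = "https://localhost:9200"
AUTH = ("marketing_user", "password")  # role limited to read-only indices

# An unauthenticated request to the same endpoint is now rejected
resp = requests.get(f"{ES_URL}/social-media/_search",
                    params={"q": "topic:apis"},
                    auth=AUTH, verify="ca.pem")
resp.raise_for_status()
print(resp.json()["hits"]["total"])
```
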
I've had Elasticsearch in my API deployment research for some time now, and I will add it to my API management research as well. If you can manage API access and user roles, and generate log files for analytics, from Elasticsearch API endpoints, the tool is moving squarely into the API management category.

It makes me happy to see open source tools like Elasticsearch improving their security features. Elasticsearch is something I recommend government agencies use when looking to open up access to document stores using APIs. I would like to see more of the API management players working together to allow for interoperability between management platforms, but I'm guessing this is a wish I won't get anytime soon.

Disclosure: 3Scale is an API Evangelist partner.

See The Full Blog Post


ETrade Launches Stock Trading API


ETrade recently entered the API scene with a new financial API. This positions ETrade to catch up with the base of custom applications connected to TD Ameritrade's API, with many compelling features provided at first launch. The ETrade API has plenty of room for expansion, but it is a solid base and a good step in the right direction. Coming with standard methods for an ETrade customer to review their own account, or to make updates to it (basic trading operations are enabled), the basic needs are provided in this API.

Security matters are also well considered, with the API supporting OAuth and SSL. More interestingly, ETrade launched with push notifications, opening up opportunities to offer more up-to-date information to its users. For the time being, ETrade's push notifications only provide information regarding recent transactions, such as when a buy request was successfully completed. I am curious to see where else this could take ETrade integrations. I'm sure the more hardcore investment crowd would love to obtain a variety of custom, up-to-the-minute notifications for their applications, from price changes in one's portfolio to factors surrounding a potential investment.

The ETrade API is available both for customers to create their own applications and for developers to build distributable integrations, as stated on its developer site. This API requires developers to have ETrade accounts, of course, but unlike TD Ameritrade's high account minimum, no other access restrictions apply.

URL: http://feedproxy.google.com/~r/ProgrammableWeb/~3/z4llvPiz_RM/09

See The Full Blog Post


API Evangelist Thoughts On The Right To An API Key And Algorithmic Organizing

There was a very interesting piece from venture capitalist Albert Wenger (@albertwenger) of Union Square Ventures over the Labor Day weekend, called Labor Day: Right to an API Key (Algorithmic Organizing), that I’ve had open ever since, and wanted to take a moment to add my thoughts to. First let me say, I agree 100% with Albert’s post, but I felt the piece left out some very critical elements--I suspect simply because he was trying to get a short thought published over a holiday weekend--and I feel pretty strongly these points are critical to his argument, and should be put out there.

You can read the full post over on Albert's blog, but I think this statement sums it up nicely:

There is a simple and universal regulatory change that would dramatically shift the bargaining power: an individual right to an API Key. By this I mean a key that would give an enduser *full* read/write access to the system including every action or screen the enduser can take or see on the web site or application. Alternatively one could think of this as an individual right to be represented by an algorithm.

Shortly after Albert published his post, his partner Fred Wilson (@fredwilson) chimed in with his own post, called Algorithmic Organizing, and again I will take just a single paragraph that I think sums it up:

I believe that in the long run these platforms may/will be replaced by blockchain based networks of labor where there is no platform middleman and there would be no need for a legal right to an API because all the data would be public by default.

I agree with what both Albert and Fred are saying, and it makes me happy to see such a prominent VC firm seeing the future in this way. What I wish to add to this conversation is some critical thought around the business and political building blocks that often don't get discussed in these conversations, but are actually the real reason this vision can work--and at the same time, the reason this vision could also become more dangerous than the one we currently have.

The dangerous part I refer to is the blind faith that the algorithm can represent us, providing some neutral, pure extension of ourselves, free of the power structures that perpetuate much of the divide between the haves and have nots that we see in the physical world. I believe that APIs have all the potential to deliver a better future, but the API, and the keys, are just two variables in the algorithm Albert and Fred speak of. In reality there are numerous other variables at play that need to be discussed as well, or we will find ourselves in the same situations we are currently in, but in a potentially more dangerous world where the algorithm obfuscates any exploitation that is happening, and protects the perpetrator.

Passive Aggressive API Rate Limiting
The easiest way to limit the power an API gives any developer or end-user is rate limiting. Rate limits can be established to restrict what you can access by the second, minute, hour, day, or any other configuration, making it pretty near impossible to realize any power or control you may think you have been given by a service. Most API providers are transparent with their rate limiting, and publish simple explanations of what they limit, and why, acknowledging the real world need to reduce overhead in API operations, and provide a certain quality of service for everyone. The problem with rate limiting comes in when platforms are not clear about their motivations for rate limiting, what the limitations are, and when the actual rate limits do not reflect what is published publicly.
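
For reference on how simple the mechanics are, here is a minimal sketch of a token bucket rate limiter in Python--the kind of logic that sits behind most API management layers (the rate and capacity values are hypothetical):

```python
import time

class TokenBucket:
    """Allow `rate` requests per second, with bursts up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate = rate          # tokens added per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens for the time elapsed, capped at capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # the caller would return a 429 Too Many Requests

bucket = TokenBucket(rate=5, capacity=10)  # hypothetical: 5 req/sec, bursts of 10
```

The passive aggressive version is simply never publishing, or quietly changing, those two numbers.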

Hiding Motives Behind API Error Handling
Error handling on API requests is a great way to limit actual access to any API resource, allowing the API provider to hide behind errors in the system, when in reality they are purposefully working to limit what you can actually do with the API. Errors can be generated per user, per application, or for entire groups, making it almost impossible to hold platforms accountable when it comes to making resources truly accessible. I’m not saying this is a common practice for API providers, I’m just saying that it is technically feasible, and I am pretty damn sure I’ve seen it in the wild, making it a real concern for me. Even if it occurs due to incompetence, it still prevents you from truly accessing any resources, and from having the algorithm truly represent you.

Service Stability Can Bring Down Any Vision
Building on the API error handling described above, overall service stability is a common way that API driven resources can also be rendered useless. Sure, a company has an API, and you have an API key to access resources made available via that API, but if you can’t actually connect to the API in a reliable way, what does it matter? There are a number of reasons a service can be unstable: a lack of resources, incompetence, or intent--as a recovering IT director, I can guarantee I’ve seen all of these in action. I don’t care how open your API is, if it isn't reliable it will not provide any value to anyone.

Security Needs To Be Priority
An open, publicly available API might be great for gaining access to your online resources, and the assets of any company, organization, or government agency, but the same access can instantly be used against you if security is not given top priority. Security is not a reason to avoid APIs--just like websites and other systems, APIs can be secured--but if it is not done right, and systems are not properly monitored, any API can be breached, taking APIs from being a good thing to being the worst thing on earth, demonstrating pretty clearly that this is about more than just the technology of APIs.

Terms of Service Rule Everything
The Terms of Service (TOS) provide a legal framework for developers and end-users to operate within, set forth by a platform. TOS should protect the API owner's company, assets, and brand, but should also provide assurances for developers who are building businesses on top of an API, and ultimately for how end-users can make a system work for them. If the TOS are out of balance, no amount of API access will matter if you are legally compromised in what you can actually do with resources.

Transparency in Partner Tiers Of Access
One of the benefits of modern approaches to APIs is the ability to compose different sets of services, and multiple access levels for different partners. Partner access is a great way to incentivize development on top of any platform, and give higher levels of access to those who contribute to the value of a platform. Where this model begins to break down is when there is a lack of transparency, and platforms do not share information about what levels of access actually exist, which basically brings us back to the good ol' boy networks we currently see across virtually every industry.

Paying Attention To Privacy
Privacy policies also protect the interests of partners, developers and platform users, while also protecting the API owner from damaging activity on their platform. Like an API's terms of service, privacy policies need to strike a balance that protects everyone involved, while also allowing for innovation and commercial activity. An API can be very technically sound, but if the privacy of end-users is not respected, it can become a liability for everyone involved. Privacy is another area that will increasingly become a problem in the future, preventing many from seeing APIs as a good thing, but if done right we can make sure APIs reflect the vision Albert and Fred speak of.

Who Owns The Information I Pull Via An API
Content and data ownership is an extremely contentious topic right now, with platforms claiming they own content that is generated using their services, and end-users rightfully feeling they have a stake in ownership of the content, data, media, and other information they generate on the platform. What good is API access if I have no ownership of the content I generate via the platform? Sure, I can access valuable resources, but if I can’t legally do anything with them, what value is created by the API and its underlying algorithm?

The Deprecation Of Any API Led Vision
An API deprecation policy sets expectations with API consumers about when and how API resources will be shut down. These policies help build trust with API developers and end-users, giving them an idea of how long they can depend on an API resource, and what they can expect when an API reaches the end of its life. You may have access to data and content that a platform contains, complete with API access, but if that service can go away at any point after an acquisition, or due to a lack of resources or leadership, and you do not get any sort of heads up, an API is immediately rendered meaningless in this wider discussion.

The Power of Industry Influence
As I watch the enterprise and government take notice of the API space, I’m seeing some pretty clear examples of industry influence over the value brought to the table by APIs. I do not care how open, transparent, and technically sound your API is--you can follow every bit of my advice, but if the 1,000 lb gorilla in your industry is not happy with a service, an API will be no defense against acquisition, shutdown, or legal attack. If a large corporate or government entity doesn't like your platform, they can shut you down, even if it is just through a sustained legal attack. One of my biggest concerns about VC investment in seemingly open and altruistic API-driven platforms is that your investors always provide a doorway for industry influence to change the course of your API, no matter what you might believe as the platform owner.

The Smoke And Mirrors Of "Open"
One of the most used terms in the world of APIs is "open", and at the same time it is also one of the most abused terms I know of. All of the reasons listed above can affect how truly open any API, platform, and company is, preventing Albert and Fred’s vision from ever becoming a reality. I could list another 50 ways that API providers prevent access to content and resources via APIs, ranging from lack of communication to complex pricing. My goal with this post is to show that there are some very real ways in which APIs can be used against the average worker and citizen, and that APIs are not good, bad, nor neutral--just like algorithms. They just reflect the intentions of their creators, and while I think there are many opportunities to make sure they reflect us, the end-user, citizen and worker, I think more often than not, they reflect the desires of their owners.

Twitter As The Perpetual Poster Child
I really love Twitter, and I think the platform is amazing. The company has made some very serious effort to improve problems on their API platform, but they are still the best example I can use to demonstrate how every one of the variables listed above can be used against the access and freedom an API could provide, without you even knowing it. When it comes to demonstrating the power that APIs bring to the table, and the democratization of our digital resources, Twitter is a shining example. When it comes to demonstrating how an API can be leveraged against its users, and how algorithmic organizing can just as easily be used against us, Twitter is a shining example. Search across the API Evangelist network and you will see I’m both a lover and critic of Twitter, and I do not believe Dick Costolo is Mr. Smithers plotting and scheming how Twitter can screw us all over, but I do think at scale, after you’ve taken a certain amount of funding, and you become interesting to the powers that be--things change, and an API, and the algorithms behind it, can be used in very harmful ways.

Algorithmic Transparency
It is easy to get excited about the potential around APIs and algorithms. I don't have a problem with this, more power to you (pun intended), however I personally feel algorithmic solutionism, and API solutionism for that matter, is not healthy. The way forward involves transparency, and communication around some of the very difficult areas I’ve listed. Alex Howard (@digiphile) wrote a great post at TechRepublic earlier this year, called Data-Driven Policy and Commerce Requires Algorithmic Transparency, which I think sums up this concept very nicely. For all of this to work, there has to be enough sunlight to keep things disinfected, or we are going to see many of the same problems we see in the physical world, as well as a whole bunch of new problems we’ve never anticipated.

Algorithmic Accountability
Additionally, I don’t feel that even algorithmic transparency will be enough; we have to make sure companies, organizations, institutions, government agencies, and individuals are held accountable for their actions. If there is no way to hold companies accountable for their violations in the ways discussed above, what does an open, transparent API or algorithm even matter? It doesn’t mean anything, and it doesn’t give us any balance--it just transfers the same power structures we’ve known in the physical world to the virtual world, where it is even more difficult to understand what is happening, or hold any individual or company accountable for malicious activity.

As I said earlier, I totally agree with Albert Wenger and Fred Wilson in their Labor Day API thoughts. I feel that every company, organization, institution, government agency, and even many individuals should have APIs, and that they are the key to a better future. However I think many of these entities will also omit very important details from API discussions, similar to Albert and Fred’s posts. I do not believe they did it intentionally, or have any ill intent, but I think in our obsession with technology, algorithms, and APIs, we can miss a lot, and this is what many companies will count on, and use to their advantage.

If we do not pay attention to how the technological, business, and political building blocks are being used for exploitation and manipulation, we are creating an even more dangerous divide between the haves and have nots, while opening up serious opportunities for abuse by the ruling merchant class, and further eroding the rights we enjoy as workers, rather than moving them in a more positive direction. The real scary part for me is that you probably won’t ever even notice that any of this exploitation is occurring, and it will be very difficult to hold anyone accountable, all because of the magic of the algorithm and APIs.

See The Full Blog Post


6,482 Datasets Available Across 22 Federal Agencies In Data.json Files

It has been a few months since I ran any of my federal government data.json harvesting, so I picked my work back up, and will be doing more around the datasets that federal agencies have been making available, and telling the stories across my network.

I'm still surprised at how many people are unaware that 22 of the top federal agencies have data inventories of their public data assets, available in the root of their domain as a data.json file. This means for many agencies you can go to example.gov/data.json, and there is a machine readable list of that agency's current inventory of public datasets.
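
If you want to see for yourself, here is a quick sketch of pulling an agency's inventory with Python--NASA is one of the agencies listed below, and any of the others should work the same way (some publish the inventory as a raw array, others wrap it in a catalog object):

```python
import requests

# Any participating agency domain from the list below works here
resp = requests.get("https://www.nasa.gov/data.json", timeout=30)
resp.raise_for_status()
inventory = resp.json()

# Handle both the raw-array and catalog-object flavors of data.json
datasets = inventory if isinstance(inventory, list) else inventory.get("dataset", [])
for dataset in datasets:
    print(dataset.get("title"))
```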

I currently know of 22 federal agencies who have published data.json files:

Consumer Financial Protection Bureau
Department of Agriculture (USDA)
Department of Defense (DOD)
Department of Energy (DOE)
Department of Justice (DOJ)
Department of State
Department of the Treasury
Department of Transportation (DOT)
Department of Veterans Affairs (VA)
Environmental Protection Agency (EPA)
General Services Administration (GSA)
Institute of Museum and Library Services (IMLS)
Millennium Challenge Corporation (MCC)
National Aeronautics and Space Administration (NASA)
National Archives and Records Administration (NARA)
National Institute of Standards and Technology (NIST)
National Science Foundation (NSF)
National Transportation Safety Board (NTSB)
Nuclear Regulatory Commission (NRC)
Office of Personnel Management (OPM)
Social Security Administration (SSA)
United States Agency for International Development (USAID)

You can visit each agency's domain and view the full data.json files. You can also visit my Federal Agency Dataset Adoption work to see all of the datasets listed for each agency. There is still one bug I notice in the adoption process, so don't adopt anything quite yet.

The goal of this is just to highlight, again, that there is a wealth of open data resources just waiting for all of us open gov hackers to take advantage of, and make sense of. Federal agencies need our help, so get involved--there is a lot of work to be done.

See The Full Blog Post


Low Hanging Fruit For API Discovery In The Federal Government

I looked through 77 of the developer areas for federal agencies, resulting in reviewing approximately 190 APIs. While the presentation of 95% of the federal government developer portals is crap, it makes me happy that about 120 of the 190 APIs (over 60%) are actually consumable web APIs that didn't make me hold my nose and run out of the API area.

Of the 190, only 13 actually made me happy for one reason or another:

Don't get me wrong, there are other nice implementations in there. I like the simplicity and consistency in the APIs coming out of GSA and SBA, but overall federal APIs reflect what I see a lot in the private sector: some developer making a decent API, but the follow-through and launch severely lacking what it takes to make the API successful. People wonder why nobody uses their APIs? hmmmmm....

A little minimalist simplicity in a developer portal, a simple explanation of what an API does, interactive documentation w/ Swagger, code libraries, and terms of service (TOS), would go a looooooooooooong way in making sure these government resources were found, and put to use.

Ok, so where the hell do I start? Let's look through these 123 APIs and see where the real low hanging fruit is for demonstrating the potential of APIs.json when it comes to API discovery in the federal government.

Let's start again with the White House (http://www.whitehouse.gov/developers):

Only one API made it out of the USDA:

Department of Commerce (http://www.commerce.gov/developer):

  • Census Bureau API - http://www.census.gov/developers/ - Yes, a real developer area with supporting building blocks (updates, news, app gallery, forum, mailing list). Really could use interactive documentation though. There are URLs, but no active calls. Would be way easier if you could play with the data before committing. (B)
  • Severe Weather Data Inventory - http://www.ncdc.noaa.gov/swdiws/ - Fairly basic interface, wouldn’t take much to turn into a modern web API. Right now it's just a text file, with spec-style documentation explaining what to do. Looks high value. (B)
  • National Climatic Data Center Climate Data Online Web Services - http://www.ncdc.noaa.gov/cdo-web/webservices/v2 - Oh yeah, now we are talking. That is an API. No interactive docs, but nice clean ones--it would be some work, but could be done. (A)
  • Environmental Research Division's Data Access Program - http://coastwatch.pfeg.noaa.gov/erddap/rest.html - Looks like a decent web API. Wouldn’t be too much to generate a machine readable definition and make it into a better API area. (B)
  • Space Physics Interactive Data Resource Web Services - http://spidr.ngdc.noaa.gov/spidr/docs/SPIDR.REST.WSGuide.en.pdf - Well, it's a PDF, but looks like a decent web API. It would be some work, but could turn into a decent API with Swagger specs. (B)
  • Center for Operational Oceanographic Products and Services - http://tidesandcurrents.noaa.gov/api/ - Fairly straightforward API, simple. Wouldn’t be hard to generate interactive docs for it. Spec needed. (B)

Arlington Cemetery:

Department of Education:

  • Department of Education - http://www.ed.gov/developers - Lots of high value datasets. Says API, but is JSON file. Wouldn’t be hard to generate APIs for it all and make machine readable definitions. (B)

Energy:

  • Energy Information Administration - http://www.eia.gov/developer/ - Nice web API, simple clean presentation. Needs interactive docs. (B)
  • National Renewable Energy Laboratory - http://developer.nrel.gov/ - Close to a modern Developer area with web APIs. Uses standardized access (umbrella). Some of them have Swagger specs, the rest would be easy to create. (A)
  • Office of Scientific and Technical Information - http://www.osti.gov/XMLServices - Interfaces are pretty well designed, and Swagger specs would be straightforward. But docs are all PDF currently. (B)

Department of Health and Human Services (http://www.hhs.gov/developer):

Food and Drug Administration (http://open.fda.gov):

Department of Homeland Security (http://www.dhs.gov/developer):

Two loose cannons:

 Department of Interior (http://www.doi.gov/developer):

Department of Justice (http://www.justice.gov/developer):

Labor:

  • Department of Labor - http://developer.dol.gov/ - I love their developer area. They have a great API, easy to generate API definitions. (A)
  • Bureau of Labor Statistics - http://www.bls.gov/developers/ - Web APIs in there. Complex, and lots of work, but can be done. API Definitions Needed. (B)

Department of State (http://www.state.gov/developer):

Department of Transportation (http://www.dot.gov/developer):

Department of the Treasury (http://www.treasury.gov/developer):

Veterans Affairs (http://www.va.gov/developer):

Consumer Financial Protection Bureau:

Federal Communications Commission (http://www.fcc.gov/developers):

Lone bank:

  • Federal Reserve Bank of St. Louis - http://api.stlouisfed.org/ - Good API and area, would be easy to generate API definitions. (B)

General Services Administration (http://www.gsa.gov/developers/):

National Aeronautics and Space Administration http://open.nasa.gov/developer:

Couple more loose cannons:

Recovery Accountability and Transparency Board (http://www.recovery.gov/arra/FAQ/Developer/Pages/default.aspx):

Small Business Administration (http://www.sba.gov/about-sba/sba_performance/sba_data_store/web_service_api):

Last but not least.

That is a lot of potentially valuable API resources to consume. From my perspective, what has come out of GSA, SBA, and the White House Petition API represents probably the simplest, most consistent, and highest value targets for me. Next, maybe the wealth of APIs out of Interior and FDA. After that I'll cherry pick from the list, and see which are easiest.

I'm looking to create a Swagger definition for each of these APIs, and publish them as a GitHub repository, allowing people to play with the APIs. If I have to, I'll create a proxy for each one, because CORS is not common across the federal government--see the sketch below. I'm hoping not to spend too much time on proxies, because once I get in there I always want to improve the interface, and evolve a facade for each API, and I don't have that much time on my hands.
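
For the proxy piece, this is roughly what I have in mind--a minimal pass-through sketch in Python (Flask) that just adds the missing CORS header, with a hypothetical upstream federal API:

```python
import requests
from flask import Flask, Response, request

app = Flask(__name__)

# Hypothetical upstream federal API that does not send CORS headers
UPSTREAM = "http://api.example.gov"

@app.route("/<path:path>")
def proxy(path):
    # Pass the request through untouched, query string and all
    upstream = requests.get(UPSTREAM + "/" + path,
                            params=request.args, timeout=30)
    resp = Response(upstream.content, status=upstream.status_code,
                    content_type=upstream.headers.get("Content-Type"))
    # The one header the federal APIs are missing
    resp.headers["Access-Control-Allow-Origin"] = "*"
    return resp

if __name__ == "__main__":
    app.run(port=8080)
```

Point the interactive docs at the proxy instead of the API, and the browser stops complaining.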

See The Full Blog Post


Looking At 77 Federal Government API Developer Portals And 190 APIs

I spent most of the day yesterday, looking through 77 of the developer portals listed on the 18F Github portal. While I wanted to evaluate the quality and approach of each of the agencies, my goal for this review cycle was to look for any APIs that already had machine readable API definitions, or would be low hanging fruit for the creation of Swagger definitions, as part of my wider API discovery work.

I had just finished updating all my API Evangelist Network APIs to use version 0.14 of APIs.json, and while I wait for the API search engine APIs.io to update to support the new version, I wanted to see if I could start the hard work of applying API discovery to federal government APIs.

Ideally all federal agencies would publish APIs.json on their own, placing it within the root of their domain, like they do with data.json, and provide an index of all of their APIs. Being all too familiar with how this stuff works, I know that if I want this to happen, I will have to generate APIs.json for many federal agencies first. However, for the APIs.json files to have their intended impact, I need many of the APIs to have machine readable API definitions that I can point to--which equals more work for me! yay? ;-(

My thinking is that I will look through all of the 77 developer areas, and the resulting APIs, looking for the low hanging fruit. Basically I would grade each API on its viability to be included in my federal government API discovery work. I spent a minimal amount of time looking at each API, and in some cases looking for the API, before giving up. I would inspect the supporting developer area, and the actual interface, for complexity, helping me understand how hard it would be to hand craft a Swagger spec and APIs.json for each agency and their APIs.

(warning contains my raw un-edited notes from doing this research, not suitable for children)

As I went through, I wrote a couple of notes:

  • National Climate Data Center is nice looking, and is a high profile--they should have a kick ass API!
  • Inversely, NOAA Climate Data Online is very simple, clean and well done.
  • Department of Education is acceptable as dev area, only because of Socrata
  • National Renewable Energy Laboratory just gets it. They just get it.
  • Some of these are billed as developer areas, but really just a single API. It is a start I guess. 
  • I love me some Department of Labor. Their developer area and API is freak'n cool!
  • MyUSA citizen API has oAuth!!! WTF. How did I not notice this before? Another story, and work here.
  • MyUSA has really good, simple, high value, API resources. 
  • NASA ExoAPI not just API cool, but space cool!!
  • FOIA needs a fucking API. No excuses. FOIA needs an API!
  • Some APIs it might be better to go back to data source and recreate API from scratch, they are so bad.

I wanted to share some of my notes before the long list of developer areas and their APIs. There are some specific notes for each API, but much of it is very general, helping grade each API, so I can go back through the list of B grade or higher APIs, and figure out which are candidates for me to create a Swagger API definition and APIs.json for, and ultimately add to APIs.io.

For this work I went down the 77 federal agency links, which were billed as developer areas, though many were single APIs. So when a developer area resulted in multiple APIs, I grouped them together, and many of the agencies who have a single API I will group together as well, and include my commentary as necessary. I'm leaving the URLs visible to help as a reference, to show the craziness of some of them, and because it would have been sooooo much work to link all of them.

Let's start with the White House (http://www.whitehouse.gov/developers):

Next up is the USDA (http://www.usda.gov/wps/portal/usda/usdahome?navid=USDA_DEVELOPER), which is a hodgepodge of services, with no consistency whatsoever between them in interface, supporting content, or anything else.

Overall I give the USDA a D on all their APIs. A couple might be high value sources worth going after, but definitely not low hanging fruit for me. It would be easier to tackle them as an independent project for generating brand new APIs.

Next up is the Department of Commerce (http://www.commerce.gov/developer), who definitely has some higher value resources, as well as some healthy API initiatives.

Next is the Department of Defense (http://www.defense.gov/developer/). There are 8 things billed as APIs, with a variety of datasets, API-like things, and web services available. Not really sure what's up. (D)

We have one by itself here:

Then the Department of Education, who is just riding their data.gov work as an API area:

  • Department of Education - http://www.ed.gov/developers - Lots of high value datasets. Says API, but is JSON file. Wouldn’t be hard to generate APIs for it all and make machine readable definitions. (B)

Next some energy related efforts:

  • Department of Energy - http://www.energy.gov/developers - Lots of datasets. Pretty presentation. Could use some simple APIs. Wouldn’t be much work to pick the high value datasets and create a cache of them. (C)
  • Energy Information Administration - http://www.eia.gov/developer/ - Nice web API, simple clean presentation. Needs interactive docs. (B)
  • National Renewable Energy Laboratory - http://developer.nrel.gov/ - Close to a modern Developer area with web APIs. Uses standardized access (umbrella). Some of them have Swagger specs, the rest would be easy to create. (A)
  • Office of Scientific and Technical Information - http://www.osti.gov/XMLServices - Interfaces are pretty well designed, and Swagger specs would be straightforward. But docs are all PDF currently. (B)

Moving on to the Department of Health and Human Services (http://www.hhs.gov/developer), where all of their APIs are somewhat consistent, and provide simple resources:

The Food and Drug Administration (http://open.fda.gov) is one of the agencies that is definitely getting on board with APIs. They have some pretty nice implementations, but there are some not so nice ones that need a lot of work:

Next up, the Department of Homeland Security (http://www.dhs.gov/developer), where they have three APIs (it's a start):

Then we have two agencies that have pretty simple API operations, so I'll group together:

Then we have several API developer efforts under the Department of Interior (http://www.doi.gov/developer):

Now we have some APIs coming out of the law enforcement side of government, starting with the Department of Justice (http://www.justice.gov/developer):

Now we get to one of my favorite efforts in the federal government:

  • Department of Labor - http://developer.dol.gov/ - I love their developer area. They have a great API, easy to generate API definitions. (A)
  • Bureau of Labor Statistics - http://www.bls.gov/developers/ - Web APIs in there. Complex, and lots of work, but can be done. API Definitions Needed. (B)

Next we have the API efforts from Department of State (http://www.state.gov/developer):

Moving on to the Department of Transportation (http://www.dot.gov/developer):

Now let's head over to the Department of the Treasury (http://www.treasury.gov/developer):

The Department of Veterans Affairs (http://www.va.gov/developer) has some hope, because of the work I did in the fall.

Moving on to another one of my favorite agencies, well, quasi-gov agencies:

One agency that appears to be on the radar, but I really can't tell what is going on API-wise:

  • Environmental Protection Agency - http://www.epa.gov/developer/ - There is a really nice layout to the area, with seemingly a lot of APIs, and they look like web APIs, but it looks like one API being represented as a bunch of methods? Would be too much work, and it is still hard to figure out WTF. (C)

Then the Federal Communications Commission (http://www.fcc.gov/developers), which has a lot of APIs going on, in various states of operation:

All by itself on the list, we have one lonely bank:

  • Federal Reserve Bank of St. Louis - http://api.stlouisfed.org/ - Good API and area, would be easy to generate API definitions. (B)

The General Services Administration (http://www.gsa.gov/developers/) definitely is ahead of the game when it comes to API design and deployment:

Once I reached the National Aeronautics and Space Administration (http://open.nasa.gov/developer), I found some really, really cool APIs:

Grouping a couple loose agencies together:

The Recovery Accountability and Transparency Board (http://www.recovery.gov/arra/FAQ/Developer/Pages/default.aspx) has some APIs to look at: 

The Small Business Administration (http://www.sba.gov/about-sba/sba_performance/sba_data_store/web_service_api) has some nice APIs that are consistent and well presented:

Lastly, we have a couple of loose agencies to look at (or not):

Ok, that was it. I know there are more APIs to look at, but this is derived from the master list of federal government developer portals. This gives me what I need. Although I'm a little disappointed I have fewer than 5 Swagger definitions, after looking at about 190 APIs.

After looking at all of this, I'm overwhelmed. How do you solve API discovery for a mess like this? Holy shit!

I really need my fucking head examined for taking this shit on. I really am my own worst enemy, but I love it. I am obsessed with contributing to a solution for API discovery that will work in the private sector, as well as a public sector mess like this. So where the hell do I start? More to come on that soon...

See The Full Blog Post


Google Accounts As Blueprint For All Software as a Service Applications

There are many things I don’t agree with Google about, but they are pioneers on the Internet, and in some cases have the experience to lead in some very important ways. In this scenario I’m thinking about Google Account management, and how it can be used as a blueprint for all other Software as a Service (SaaS) applications.

During a recent visit to my Google account management area, I was struck by the importance of all the tools that were made available to me.

Account Settings
Google gives you a basic level of control to edit your profile, adding and updating the information you feel is relevant.

Security
Google gives you password level control, but then steps up security with 2-step verification, and application-specific passwords.

Manage Apps
Google provides a clean application manager, allowing you to control who has access to your account via the API. You can revoke any app, as well as see how they are accessing your data--taking advantage of OAuth 2.0, which is a standard across all Google systems.
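
Under the hood, revoking an application's access is just one more API call--a sketch against Google's OAuth 2.0 token revocation endpoint as documented at the time of writing, with a hypothetical token value:

```python
import requests

# An access or refresh token previously granted to a 3rd party app
# (hypothetical value)
token = "ya29.EXAMPLE_TOKEN"

# Google's documented OAuth 2.0 token revocation endpoint
resp = requests.get(
    "https://accounts.google.com/o/oauth2/revoke",
    params={"token": token},
)
print(resp.status_code)  # 200 means the grant was revoked
```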

Platform Apps
The management of applications is not exclusive to 3rd party applications. Google gives you insight into how they are accessing your account as well. This view of the platform is critical to providing a comprehensive lens into how your data is used, and to establishing trust.

Data Tools
Google rocks it when it comes to data portability, with their data dashboard that allows you to view your data, as well as the option to download your data at any point, via the Google Takeout system, which gives you direct access to all of your data across Google systems.

API Access
Google has long been a leader in the API space, providing over 100 (at last count) APIs. Most any application you use on the Google platform will have an API to allow for deeper integration into other applications and platforms.

Logging It All
A complete activity log is provided, in addition to being able to see how specific applications are accessing data. This easy access to logs is essential for users to understand how their data is being accessed and put to use.

There are other goodies in the Google Account management area, but these seven areas provide a blueprint that I think ANY software as a service provider should offer as the default for ALL users. I’m not even kidding. This should be the way ALL online services work, and users should be educated about why this is important.

I’m going to continue to work on this blueprint, as a side project, and start harassing service providers. ;-)

See The Full Blog Post


Retrieve My Data Like Retrieving Video Surveillance Photos From CCTV

I’m an advisor to the camera API platform, Evercam. I don’t advise the startup because I’m super excited about the opportunities for APIs for security cameras. I'm involved because I believe in the Evercam team, and I want to be aware of this fast growing aspect of the Internet of Things and the API economy. Security cameras are not going away, and I want to help lend some critical thought to how we use security cameras, and apply APIs to help introduce transparency and accountability into this easily abused layer of our society.

One of the things I learned from Evercam is that in the UK you can request any photos of you taken on the vast closed circuit television (CCTV) network that is ubiquitous across the UK landscape. You can submit a request for a time, day, and location, and request any photo or video footage taken of you. It's kind of like a visual FOIA request for the surveillance layer of our society. This concept intrigued me, and I wanted to explore it in relationship to other layers of convergence between the API economy and our increasingly digital society.

Imagine if there was a FOIA process for data. I could submit a request to a single organization that would then make requests to leading technology and big data companies, asking them for a copy of all data they possess about me, and to disclose any partners that they have shared this data with. I know portions of this exist from companies like Acxiom, but I would like to see a more coherent, inter-company solution that could better serve individuals who wish to understand how companies are using their data.

A concept like FOIA for data across any company will not please corporate America, especially in a landscape where exploitation of users' data is the predominant business model. However we are in the early years of the Internet, and things are very much the wild wild west, and it is only a matter of time before government regulations are needed to ensure the privacy of all citizens, and reduce exploitation and abuse by the bad apples.

This concept isn't far fetched. With modern, API driven systems, it is easy to track all of a user's data, and where and how it is used across a company’s network. If all data access is required to occur via APIs, it is easy to pull a history of whose data was accessed, and by which internal or external consumers. Each company could be required to have an API allowing a 3rd party auditor to pull data on behalf of users, allowing independent organizations to make FOIA style data requests across multiple companies on behalf of users.
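
To show how technically trivial this is, here is a minimal sketch of the kind of audit trail an API-driven system gives you for free--every access to a user's data is logged with who, what, and when, and a FOIA-style request becomes a simple filter over that log (all names here are hypothetical):

```python
import json
import time

AUDIT_LOG = "access_audit.jsonl"  # hypothetical append-only audit log

def log_access(user_id, consumer, resource):
    """Record which consumer touched which user's data, and when."""
    entry = {"user": user_id, "consumer": consumer,
             "resource": resource, "timestamp": time.time()}
    with open(AUDIT_LOG, "a") as f:
        f.write(json.dumps(entry) + "\n")

def get_user_profile(user_id, consumer):
    # Every API access is audited before the data is served
    log_access(user_id, consumer, "profile")
    return {"user": user_id}  # actual data lookup omitted

def access_history(user_id):
    """The FOIA-style request: everything that touched this user's data."""
    with open(AUDIT_LOG) as f:
        entries = [json.loads(line) for line in f]
    return [e for e in entries if e["user"] == user_id]
```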

I know that business owners will cry foul at such an idea, claiming it is just more unnecessary regulation that they will have to deal with, but we need a way of making all this more accountable. The API driven systems that would make this possible would also give companies all the other benefits APIs afford, in making company assets more accessible. APIs would allow companies to rapidly deploy web and mobile applications, while also providing assurances to every citizen that their privacy was being respected, and all of our vital personal information was not being exploited.

See The Full Blog Post


My Response To How Can the Department of Education Increase Innovation, Transparency and Access to Data?

I spent considerable time going through the Department of Education RFI, answering each question in as much detail as I possibly could. You can find my full response below. In the end I felt I could provide more value by summarizing my response, eliminating much of the redundancy across different sections of the RFI, and just cutting through the bureaucracy as I (and APIs) prefer to do.

Open Data By Default
All publicly available data at the Department of Education needs to be open by default. This is not just a mandate, this is a way of life. There is no data that is available on any Department of Education websites that should not be available for data download. Open data downloads are not separate from existing website efforts at the Department of Education; they are the other side of the coin, making the same content and data available in machine readable formats rather than via HTML--allowing valuable resources to be used in systems and applications outside of the department’s control.

Open API When There Are Resources
The answer to whether or not the Department of Education should provide APIs is the same as whether or not the agency should deploy websites--YES! Not all individuals and companies will have the resources to download, process, and put downloadable resources to use. In these situations APIs can provide much easier access to open data resources, and when open data resources are exposed as APIs, it opens up access to a much wider audience, even non-developers. Lightweight, simple API access to the open data inventory should be the default, along with data downloads, when resources are available. This approach of APIs by default will act as the training ground for not just 3rd party developers, but also internally, allowing Department of Education staff to learn how to manage APIs in a safe, read-only environment.

Using A Modern API Design, Deployment, and Management Approach
As usage of the Internet matured in 2000, many leading technology providers like Salesforce and Amazon began using web APIs to make digital assets available to 3rd party partners, and 14 years later there are some very proven approaches to designing, deploying, and managing APIs. API management is not a new and bleeding edge approach to making assets available in the private sector; there are numerous API tools and services available, and this has begun to extend to the government sector with tools like API Umbrella from NREL, being employed by api.data.gov and other agencies, as well as other tools and services being delivered by 18F at GSA. There are many proven blueprints for the Department of Education to follow when embarking on a complete API strategy across the agency, allowing innovation to occur around specific open data, and other program initiatives, in a safe, proven way.
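
To illustrate how low the barrier is for consumers under this model, here is a sketch of calling an API Umbrella-fronted endpoint the way api.data.gov consumers do--a single api_key parameter is the entire management layer from the consumer's point of view (NREL's rate-limited DEMO_KEY is shown, assuming it is still supported):

```python
import requests

# NREL runs the open source API Umbrella mentioned above; DEMO_KEY is
# their shared, heavily rate-limited demonstration key--any signup key
# from api.data.gov works the same way
resp = requests.get(
    "https://developer.nrel.gov/api/alt-fuel-stations/v1.json",
    params={"api_key": "DEMO_KEY", "limit": 1},
    timeout=30,
)
resp.raise_for_status()
print(resp.status_code)  # 200, with rate limit details in the response headers
```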

Use API Service Composition For Maximum Access & Control
One benefit of 14 years of evolution around API design, deployment, and management is the establishment of sophisticated service composition of API resources. Service composition refers to the granular, modular design and deployment of APIs, while being able to manage who has access to these resources. Modern API access is not just direct, public access to a database. API service composition allows for designing exactly the access to resources that is necessary, in alignment with business objectives, while protecting the privacy and security of everyone involved. Additionally, service composition allows for real-time awareness of how all data, content, and other resources at the Department of Education are accessed and put to use, allowing new APIs to be designed to support specific needs, and existing APIs to evolve based upon actual demand, not just speculation.

Deeper Understanding Of How Resources Are Used
A modern API service composition layer opens up the possibility for a new analytics layer that is not just about measuring and reporting access to APIs; it is about understanding precisely how resources are accessed in real-time, allowing API design, deployment, and management processes to be adjusted in a more rapid and iterative way that contributes to the roadmap, while providing the maximum enforcement of security and privacy for everyone involved. When the Department of Education internalizes a healthy, agency-wide API approach, a new real-time understanding will replace this very RFI-centered process that we are participating in, allowing for a new agility, with more control and flexibility than current approaches. An RFI cycle takes months, and will contain a great deal of speculation about what would be, where API access, coupled with healthy analytics and feedback loops, answers all the questions being addressed in this RFI in real-time, reducing resource costs and wasted cycles.

APIs Open Up Synchronous and Asynchronous Communication Channels
Open data downloads represent a broadcast approach to making Department of Education content, data, and other resources available--a one way street. APIs provide two-way communication, bringing external partners and vendors closer to the Department of Education, while opening up feedback loops with the Department of Education, reducing the distance between the agency and its private sector partners--potentially bringing valuable services closer to students, parents, and the companies or institutions that serve them. Feedback loops at the Department of Education currently occur annually or monthly, at the speed of email or phone calls, with the closest being in person at events, something that can be a very expensive endeavor. Web APIs provide a real-time, synchronous and asynchronous communication layer that will improve the quality of service between the Department of Education and the public, for a much lower cost than traditional approaches.

Building External Ecosystem of Partners
With the availability of high value API resources, coupled with a modern approach to API design, deployment, and management, an ecosystem of trusted partners can be established, allowing the Department of Education to share the workload with an external partner ecosystem. API service composition allows the agency to open up access to resources to only the partners who have proven they will respect the privacy and security of resources, and be dedicated to augmenting and helping extend the mission of the Department of Education. As referenced in the RFI, think about the ecosystem established by the IRS modernized e-file system, and how the H&R Blocks and Jackson Hewitts of the world help the IRS share the burden of the country's tax system. Where is the trusted ecosystem for the Department of Education? The IRS ecosystem has been in development for over 25 years--the Department of Education has to get to work on theirs now.

Security Fits In With Existing Website Security Practices
One of the greatest benefits of web APIs is that they utilize the existing web technologies that are employed to deploy and manage websites. You don’t need additional security approaches to manage APIs beyond those for existing websites. Modern web APIs are built on HTTP, just like websites, and security can be addressed right alongside current website security practices--instead of delivering HTML, APIs are delivering JSON and XML. APIs go even further: by using modern API service composition practices, the Department of Education gains an added layer of security and control, which introduces granular levels of access to all resources, something that does not exist for websites. With a sensible analytics layer, API security isn’t just about locking down, it is about understanding who is accessing resources and how they are using them, striking a balance between the security and access of resources, which is the hallmark of APIs.

OAuth Gives Identity and Access Control To The Student
Beyond basic web security, and the heightened level of control modern API management delivers, there is a 3rd layer to API security and privacy that does not exist anywhere else--OAuth. OAuth, or Open Authorization, provides an identity and access layer on top of APIs that gives end-users, the owners of personal data, control over who accesses their data. Technology leaders in the private sector are all using OAuth to give platform users control over how their data is used in applications and systems. OAuth is the heartbeat of API security, giving API platforms a way to manage security, and how 3rd party developers access and put resources to use, in a way that gives control to end users. In the case of the Department of Education APIs, this means putting the parent and student at the center of who accesses and uses their personal data, something that is essential to the future of the Department of Education.
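
For those unfamiliar with the mechanics, here is a minimal sketch of the first leg of that flow, with a hypothetical Department of Education authorization server--the student approves access in the browser, and the application never sees their password:

```python
import urllib.parse

# Everything here is hypothetical--illustrating the 3-legged flow only
AUTHORIZE_URL = "https://auth.ed.gov/oauth/authorize"

params = {
    "response_type": "code",              # authorization code flow
    "client_id": "aid-helper-app",        # the 3rd party application
    "redirect_uri": "https://aidhelper.example.com/callback",
    "scope": "loans.read",                # granular, read-only access
}

# Step 1: the student is sent here to approve (or deny) access
print(AUTHORIZE_URL + "?" + urllib.parse.urlencode(params))

# Step 2 (not shown): after approval, the app exchanges the returned
# code for an access token tied to this student, which the student
# can inspect and revoke at any time.
```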

How Will Policy Be Changed?
I'm not a policy wonk, nor will I ever be one. One thing I do know is you will never understand the policy implications in one RFI, nor will you change policy to allow for API innovation in one broad stroke--you will fail. Policy will have to be changed incrementally, a process that fits nicely with the iterative, evolutionary life cycle of API management. The cultural change at the Department of Education, as well as evolutionary policy change at the federal level, will be the biggest benefits of APIs at the Department of Education.

An Active API Platform At Department of Education Would Deliver What This RFI Is Looking For
I know it is hard for the Department of Education to see APIs as something more than a technical implementation, and you want to know, understand, and plan everything ahead of time--this is baked into the risk-averse DNA of government. Even with this understanding, as I go through the RFI, I can’t help but be frustrated by the redundancy, bureaucracy, over planning, and waste that is present in this process. An active API platform would answer every one of the questions you pose, with much more precision than any RFI can ever deliver.

If the Department of Education had already begun evolving an API platform for all open data sets currently available on data.gov, the agency would have the experience in API design, deployment, and management to address 60% of the concerns posed by this RFI. Additionally, the agency would be receiving feedback from existing integrators about what they need, who they are, and what they are building to better serve students and institutions. Because this does not exist, there will be much speculation about who will use Department of Education APIs, and how they will use them to better serve students. While much of this feedback will be well meaning, it will not be rooted in actual use cases, applications, and existing implementations. An active API ecosystem answers these questions while keeping answers rooted in actual integrations, centered around specific resources, and actual next steps for real world applications.

The learning that occurs from managing read-only API access to low-level data, content, and resources would provide the education and iteration necessary for the key staff at the Department of Education to reach the next level, which would be read / write APIs, complete with OAuth-level security--the holy grail in serving students and achieving the mission of the Department of Education. I know I’m biased, because of my focus on APIs, but read / write access to all Department of Education resources over the web and via mobile devices, that gives full control to students, is the future of the agency. There is no "should we do APIs", there is only the how, and I’m afraid we are wasting time--we need to just do it, and learn to ask these questions along the way.

There is proven technology and processes available to make all Department of Education data, content and resources available, allowing both read and write access in a secure way, that is centered around the student. The private sector is 14 years ahead of the government in delivering private sector resources in this way, and other government agencies are ahead of the Department of Education in doing this as well, but there is an opportunity for the agency to still lead and take action, by committing the resources necessary to not just deploy a single API, but internalize APIs in a way that will change the way learning occurs in the coming decades across all US institutions.


A. Information Gaps and Needs in Accessing Current Data and Aid Programs

1. How could data sets that are already publicly available be made more accessible using APIs? Are there specific data sets that are already available that would be most likely to inform consumer choice about college affordability and performance?

Not everyone has the resources to download, process, and put open datasets to use. APIs can make all of the publicly available datasets more accessible to the public, allowing for easy URL access, deployment of widgets and visualizations, as well as integration with existing tools like Microsoft Excel. All datasets should have the option of being published in this way, but ultimately the Dept. of Ed API ecosystem should speak to which datasets would be most high value, and warrant API access.

2. How could APIs help people with successfully and accurately completing forms associated with any of the following processes: FAFSA; Master Promissory Note; Loan Consolidation; entrance and exit counseling; Income-Driven Repayment (IDR) programs, such as Pay As You Earn; and the Public Student Loan Forgiveness program?

APIs will help decouple each data point on a form. Introductory information, each question, and other supporting resources can be broken up and delivered via any website or mobile application, evolving a linear, 2-dimensional form into an interactive application that people can engage with, providing the assistance needed to properly achieve the goals surrounding the form.

Each form initiative will have its own needs, and a consistent API platform and strategy from the Department of Education will help identify each form's unique requirements, and the custom delivery of just the resources that are needed for a form's target audience.

3. What gaps are there with loan counseling and financial literacy and awareness that could be addressed through the use of APIs to provide access to government resources and content?

First, APIs can provide access to the content that educates students about the path they are about to embark on, before they do, via the web and mobile apps they already frequent, without requiring them to visit the source site to learn. Putting the information students need into their hands, via their mobile devices, will increase the reach of content and increase the chances that students will consume it.

Second, APIs plus OAuth will give students control over their own educational finances, forcing them to better consider how they will manage all the relationships they enter into--the details of loans and grants, and the schools they attend. With more control over data and content will come a forced responsibility in understanding and managing their finances.

Third, this process will open up students' eyes to the wider world of online data and information, and show that APIs are driving all aspects of their financial life, from their banking and credit cards to managing their online credit score.

APIs are at the heart of the API driven digital economy. The gift of API literacy, given to students when they first leave home, would carry with them throughout their lives, allowing them to better manage all aspects of their online and financial lives--and the Department of Education gave them that start.

4. What services that are currently provided by title IV student loan servicers could be enhanced through APIs (e.g., deferment, forbearance, forgiveness, cancellation, discharge, payments)?

A consistent API platform and strategy from the Department of Education would allow for the evolution of a suite of verified partners, such as title IV student loan servicers. A well planned partner layer within an ecosystem would allow student loan servicers to access data from students in real-time, with students having a say in who has access to the data, and how. These dynamics, introduced by and unique to API platforms that employ OAuth, provide new opportunities for partnerships to be established, evolve, and even be terminated when things are not going well.

API platforms using OAuth provide a unique 3-legged relationship between the data platform, 3rd party service providers, and students (users), that can be adopted to bring in existing industry partners, but more importantly provide a rich environment for new types of partners to evolve, that can improve the overall process and workflow a student experiences.

5. What current forms or programs that already reach prospective students or borrowers in distress could be expanded to include broader affordability or financial literacy information?

All government forms and programs should be evaluated for the pros / cons of an API program. My argument within this RFI response is focused on a consistent API platform and strategy from the Department of Education. APIs should be part of every existing program change, and of new initiatives in the future.

B. Potential Needs to be Filled by APIs

1. If APIs were available, what types of individuals, organizations, and companies would build tools to help increase access to programs to make college more affordable?

A consistent API platform and strategy from the Department of Education will have two essential components: a partner framework, and service composition. A partner framework defines which external, 3rd party groups can work with Department of Education API resources. The service composition defines how these 3rd party groups can access and ultimately use Department of Education API resources.

All existing groups that the Department of Education currently interacts with should be evaluated for where they exist in the API partner framework, defining levels of access from the general public and students, up to certified and trusted developer and business partnerships.

The partner framework and service composition for the Department of Education API platform should be applied to all existing individuals, organizations, and companies, while also allowing new actors to enter the game, potentially redefining the partner framework and adding new formulas for API service composition, opening up the possibilities for innovation around Department of Education API resources.

2. What applications and features might developers, schools, organizations, and companies take interest in building using APIs in higher education data and services?

As with the questions of which Department of Education forms and programs APIs might apply to, and which individuals, organizations, and companies will use them, what applications developers, schools, organizations, and companies might build cannot be known until a platform is in place. These are the questions an API-centric company or institution asks of its API platform in real-time. You can't define who will use an API and how they will use it; it takes iteration and exploration before successful applications emerge.

3. What specific ways could APIs be used in financial aid processes (e.g., translation of financial aid forms into other languages, integration of data collection into school or State forms)?

When a resource is made available via an API, it is broken down into the smallest possible parts and pieces, allowing them to be re-used and re-purposed into every possible configuration. When you make form questions independently available via an API, you can reorder them, translate them, and ask them in new ways.

This approach works well with forms, allowing each entry of a form to be accessible and transferable, with the proper permissions and access levels controlled by the person who owns the form data. This opens up not just the financial aid process, but all form processes, to interoperate with other systems, forms, agencies, and companies.

With the newfound modularity and interoperability introduced by APIs, the financial aid process could be broken down, allowing parents to play their role, schools theirs, and multiple agencies such as the IRS or the Department of Veterans Affairs (VA) to be engaged. All of this allows any involved entity or system to do its part in the financial aid process, minimizing friction throughout the entire form process, even year over year.

4. How can third-party organizations use APIs to better target services and information to low-income students, first-generation students, non-English speakers, and students with disabilities?

Again, this is a question that should be asked of a Department of Education platform in real-time. Discovering how 3rd party organizations can better target services and information to students is the very reason for an API platform. There is no way to know this ahead of time; I will leave it to domain experts to attempt an answer.

5. Would APIs for higher education data, processes, programs or services be useful in enhancing wraparound support service models? What other types of services could be integrated with higher education APIs?

A sensibly designed, deployed, managed, and evangelized API platform would establish a rich environment for existing educational services to be augmented, but would also allow for entirely new types of services to be defined. Again, I will leave it to domain experts to speak to specific service implementations based upon their goals and understanding of the space.

C. Existing Federal and Non-Federal Tools Utilizing APIs

1. What private-sector or non-Federal entities currently offer assistance with higher education data and student aid programs and processes by using APIs? How could these be enhanced by the Department’s enabling of additional APIs?

There are almost 10K public APIs available in the private sector. This should be viewed as a palette for developers, something they paint with as they develop their apps. It is difficult for developers to know what they will be painting with without knowing what resources are available. The open API innovation process is rarely able to articulate what is needed and then request those resources; API innovation occurs when valuable, granular resources are available from multiple sources, and developers assemble them and innovate in new ways.

2. What private-sector or non-Federal entities currently work with government programs and services to help people fill out government forms? Has that outreach served the public and advanced public interests?

This is another question that should be answered by the Department of Education itself. How would you know this without a properly defined partner framework? Stand up an API platform, and you will have the answer.

3. What instances or examples are there of companies charging fees to assist consumers in completing otherwise freely available government forms from other agencies? What are the advantages and risks to consider when deciding to allow third parties to charge fees to provide assistance with otherwise freely available forms and processes? How can any risks be mitigated?

I can't speak to what is already going on in the space regarding companies charging fees to consumers; I am not an expert on the education space at this level. This is such a new paradigm made possible via APIs and open data that there just aren't many examples built around open government data yet.

First, the partner tiers of API platforms help verify and validate the individuals and organizations who are building applications and charging for services in the space. A properly designed, managed, and policed partner tier can assist in mitigating risk in the evolution of such business ecosystems.

Second, API driven security layers using oAuth give control to end-users, allowing students to decide which applications, and ultimately which service providers, have access to their data, revoking that access when services are done or a provider proves undesirable. With proper reporting and rating systems, policing of the API platform can be done within the community, with the last mile of policing handled by the Department of Education.

Proper API management practices provide the identity, access, and control layers necessary to keep resources and end-users safe. Ultimately, who has access to data, who can charge fees, and who plays a role in the ecosystem is up to the Department of Education and end-users when applications are built on top of APIs.

4. Beyond the IRS e-filing example, what other similar examples exist where Federal, State, or local government entities have used APIs to share government data or facilitate participation in government services or processes - particularly at a scale as large as that of the Federal Student Aid programs?

This is a new, fast growing sector, and there are not a lot of existing examples, but there are a few:

Open311
An API driven system that allows citizens to report and interact with municipalities around issues within communities. While Open311 is deployed in specific cities such as Chicago and Baltimore, it is an open source platform and API that can be deployed to serve any size market.

Census Bureau
The US Census provides open data and APIs, allowing for innovation around government census survey data, used across the private sector in journalism, healthcare, and many other ways. The availability of government census data is continually spawning new applications, visualizations, and other expressions that wouldn't be realized or known if the platform wasn't available.

We The People
The We The People API allows for 3rd-party integration with the White House petition process. It currently allows only read-only access to the information and the petition process, but is possibly one way that write APIs will emerge in the federal government.

There are numerous examples of open APIs and data being deployed in government, even from the Department of Education. All of them are works in progress, and will realize their full potential over time, maturation and much iteration and engagement with the public.

D. Technical Specifications

1. What elements would a read-write API need to include for successful use at the Department?

There are numerous building blocks that can be employed in managing read-write APIs, but there are a few that will be essential to successful read-write APIs in government:

Partner Framework
Defined access tiers for consumers of API data, with appropriate public, partner and private (internal) levels of access. All write methods are only accessible by partner and internal levels of access, requiring verification and certification of companies and individuals who will be building on top of API resources.

Service Management
The ability to compose many different types of API resource access, and to create service bundles that are made accessible to different levels of partners. Service management allows for identity and access management, but also billing, reporting, and other granular control over how services are composed, accessed, and managed.

Open Authentication (oAuth 2.0)
All data made available via Department of Education API platforms that involves personally identifiable information will require the implementation of an open authentication (oAuth) security layer. oAuth 2.0 provides an identity layer for the platform, requiring developers to use tokens that throttle application access to resources, a process that is initiated, managed, and revoked by end-users, giving the people whose personal data is involved the highest level of control over who has access to their data and what they can do with it.

Federated API Deployments
Not all APIs should be deployed and managed within the Department of Education firewall. API platforms can be made open source so that 3rd party partners can deploy them within their own environments. Then, via a sensible partner framework, the Department of Education can decide not just which partners are allowed to write to APIs, but also which trusted systems and open API deployments it will pull data from.

A sensible partner framework and service management layer, in conjunction with oAuth, will provide the necessary controls for read / write APIs in government. If agencies are looking to push risk further outside the firewall, federated API deployments with trusted partners can be employed.
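
As a rough illustration of how a partner framework might gate write access, here is a minimal TypeScript sketch. The tier names and verification rules are my own assumptions, not a prescribed government service composition:

```typescript
// A minimal sketch of a partner framework gating write methods.
type Tier = "public" | "partner" | "internal";

interface ApiKeyRecord {
  tier: Tier;
  verified: boolean; // set after certification of the company or individual
}

const WRITE_METHODS = new Set(["POST", "PUT", "PATCH", "DELETE"]);

function isAllowed(key: ApiKeyRecord, method: string): boolean {
  if (!WRITE_METHODS.has(method)) {
    return true; // reads are open to every tier in this sketch
  }
  // Writes require a verified partner or internal consumer.
  return key.verified && (key.tier === "partner" || key.tier === "internal");
}

// Example: a public key can GET, but its POST is rejected.
const publicKey: ApiKeyRecord = { tier: "public", verified: false };
console.log(isAllowed(publicKey, "GET"));  // true
console.log(isAllowed(publicKey, "POST")); // false
```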

2. What data, methods, and other features must an API contain in order to develop apps accessing Department data or enhancing Department processes, programs, or services?

There are about 75 common building blocks for API deployments (http://management.apievangelist.com/building-blocks.html), aggregated after looking at almost 10K public API deployments. Each government API will have different needs when it comes to other supporting building blocks.

3. How would read-only and/or read-write APIs interact with or modify the performance of the Department’s existing systems (e.g., FAFSA on the Web)? Could these APIs negatively or positively affect the current operating capability of such systems? Would these APIs allow for the flexibility to evolve seamlessly with the Department’s technological developments?

There are always risks with API access to resources, but with a partner framework, service management, oAuth, and other common web security practices, these risks can be drastically reduced and mitigated in real-time.

Isolated API Deployments
New APIs should rarely be deployed with direct connections to existing systems. APIs can be deployed as an isolated interface, with an isolated data store. Existing systems can use the same API interface to read / write data and keep in sync with internal systems. API developers never touch existing systems and data stores, just isolated, defined API interfaces as part of a secure partner tier, accessing only the services they have permission to use and the end-user data that end-users themselves have granted access to.

Federated Deployments
As described above, if government agencies are looking to further reduce risk, API deployments can be designed and deployed as open source software, allowing partners within the ecosystem to download and deploy them. A platform partner framework can provide a verification and certification process for federated API deployments, allowing the Department of Education to decide who they will pull data from, reducing the risk to internal systems and providing a layer of trust for integration.

Beyond these approaches to deploying APIs, one of the biggest benefits of web API deployments is that they use the same security as other government websites, with just an additional layer of security determining who has access, and to what.

It should be a rare instance when an existing system has an API deployed with direct integration. API automation provides the ability to sync API deployments with existing systems and data stores.
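
Here is a small TypeScript sketch of the isolated-deployment pattern described above, where the API only ever touches its own store, and a separate, agency-controlled sync job decides what flows into the system of record. All names are hypothetical:

```typescript
// The API reads and writes only the isolated store; a sync job on the
// agency's side of the firewall moves approved records inward.
interface FormRecord {
  id: string;
  payload: string;
  syncedAt?: Date;
}

const isolatedStore: Map<string, FormRecord> = new Map(); // what the API touches
const internalStore: Map<string, FormRecord> = new Map(); // never exposed via the API

// The public / partner API only ever writes to the isolated store.
function apiWrite(record: FormRecord): void {
  isolatedStore.set(record.id, record);
}

// An internal job decides what gets pulled into existing systems and when.
function syncJob(approve: (r: FormRecord) => boolean): void {
  for (const record of isolatedStore.values()) {
    if (!record.syncedAt && approve(record)) {
      internalStore.set(record.id, { ...record, syncedAt: new Date() });
    }
  }
}
```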

4. What vulnerabilities might read-write APIs introduce for the security of the underlying databases the Department currently uses?

As stated above, there should be no compromise in how data is imported into existing databases at the Department of Education. It is up to the agency to decide which APIs they pull data from, and how it is updated as part of existing systems.

5. What are the potential adverse effects on successful operation of the Department’s underlying databases that read-write APIs might cause? How could APIs be developed to avoid these adverse effects?

As stated above, isolated and external, federated API deployments decouple the risk from existing systems. This is the benefit of APIs: they can be deployed as isolated resources, and then integration and interoperability, internally and externally, is up to the consumer, who decides what is imported and what isn't.

6. How should APIs address application-to-API security?

Modern API partner frameworks, service management, and oAuth provide the necessary layers to identify who has access and what resources can be used, not just by a company or user, but by each application they have developed.

Routing all API access through the partner framework, plus its associated service levels, will secure application access to Department of Education resources, with user and app level logging of what was accessed and used within each application.

oAuth provides a balance to this application-to-API security layer, allowing the Department of Education to manage the security of API access and developers to request access for their applications, while ultimately leaving control in the hands of end-users to define which applications have access to their data.

7. How should the APIs address API-to-backend security issues? Examples include but are not limited to authentication, authorization, policy enforcement, traffic management, logging and auditing, TLS (Transport Layer Security), DDoS (distributed denial-of-service) prevention, rate limiting, quotas, payload protection, Virtual Private Networks, firewalls, and analytics.

Web APIs use the exact same infrastructure as websites, allowing for the re-use of existing security practices employed for websites. APIs then add layers of security, logging, auditing, and analytics through the lens of the partner framework and service composition, limited only by the service management tooling available.

8. How do private or non-governmental organizations optimize the presentation layer for completion and accuracy of forms?

Business rules. As demonstrated as part of a FAFSA API prototype, business rules for each form field, along with rejection codes, can also be made available as API resources, allowing developers to build a form validation layer into all digital forms.

After submission, beyond the first line of defense provided by API developers building next generation forms, platform providers can provide further validation, review, and ultimately a status workflow that allows forms to be rejected or accepted based upon business logic.
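
To illustrate, here is a minimal TypeScript sketch of field-level business rules served as data, loosely inspired by the FAFSA API prototype mentioned above. The rule structure, field names, and rejection codes are all made up for the example:

```typescript
// A rule for a single form field, as a developer might fetch it from
// a hypothetical business-rules API.
interface FieldRule {
  field: string;
  required: boolean;
  pattern?: string;      // regex the value must match
  rejectionCode: string; // code returned when validation fails
}

function validateForm(
  rules: FieldRule[],
  form: Record<string, string>
): string[] {
  const rejections: string[] = [];
  for (const rule of rules) {
    const value = form[rule.field] ?? "";
    if (rule.required && value.trim() === "") {
      rejections.push(rule.rejectionCode);
    } else if (rule.pattern && !new RegExp(rule.pattern).test(value)) {
      rejections.push(rule.rejectionCode);
    }
  }
  return rejections; // an empty array means the form passes this layer
}

// Hypothetical SSN rule and a passing submission:
const ssnRule: FieldRule = {
  field: "ssn",
  required: true,
  pattern: "^\\d{3}-\\d{2}-\\d{4}$",
  rejectionCode: "R-101",
};
console.log(validateForm([ssnRule], { ssn: "123-45-6789" })); // []
```

Because the rules live behind the API rather than inside any one app, every next generation form, in any language or channel, validates against the same logic.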

9. What security parameters are essential in ensuring there is no misuse, data mining, fraud, or misrepresentation propagated through use of read- only or read-write APIs?

A modern API service management layer allows the platform provider to see all API resources that are being accessed, and by whom, and to easily establish patterns of healthy usage, as well as patterns of misuse. When misuse is identified, service management allows providers to revoke access and take action against companies and individuals.

Beyond the platform provider, APIs allow for management by end-users through common oAuth flows and management tools. Sometimes end-users can identify that an app is misusing their data even before the platform provider does. oAuth gives them the control to revoke access to their data via the API platform.

oAuth, combined with API service management tooling, allows for a unique security environment in which the platform can easily keep operations healthy, while end-users and developers help police the ecosystem as well. If platform providers give users the proper rating and reporting tools, they can help keep API and data consumers in check.

10. With advantages already built into the Department’s own products and services (e.g., IRS data retrieval using FAFSA on the Web), how would new, third-party API-driven products present advantages over existing Department resources?

While existing products and services developed within the department do provide great value, the Department of Education cannot do everything on its own. Because of the access the Department has, some features will be better by default, but this won't be the case in all situations.

The Department of Education and our government do not have unlimited resources. With access to ALL resources available via the department, the private sector can innovate, helping share the load of delivering vital services. It's not about whether public sector products and services are better than private sector ones or vice versa; it is about the public sector and private sector partnering wherever and whenever it makes sense.

11. What would an app, service or tool built with read-write API access to student aid forms look like?

Applications will look like TurboTax and TaxACT, developed within the IRS e-file ecosystem, and like the tools developed by the Sunlight Foundation on top of government open data and APIs.

We will never understand what applications are possible until the necessary government resources are available. All digital assets should be open by default, as part of a consistent API platform and strategy from the Department of Education, and the platform itself will answer this question.

E. Privacy Issues

1. How could the Department use APIs that involve the use of student records while ensuring compliance with potentially applicable statutory and regulatory requirements, such as the Family Educational Rights and Privacy Act (20 U.S.C. § 1232g; 34 CFR Part 99) and the Privacy Act (5 U.S.C. § 552a and 34 CFR Part 5b)?

As described above, the partner framework, service management, and oAuth layers provide the control and logging necessary to execute and audit compliance with any applicable statutory and regulatory requirement.

I can't articulate enough how much control this layer provides over how these resources are accessed, giving control to the involved parties who matter most: end-users. All API traffic is throttled, measured, and reviewed as part of service management, enforcing privacy through a partnership between the Department of Education, API consumers, and end-users.

2. How could APIs ensure that the appropriate individual has provided proper consent to permit the release of privacy-protected data to a third party? How can student data be properly safeguarded to prevent its release and use by third parties without the written consent often required?

As articulated above, the partner framework, service management, and oAuth address this. This is a benefit of API deployment: breaking down existing digital access into granular units of control, combined with oAuth and the logging of all access, takes control to a new level.

oAuth has come to represent this new balance in the security and control of digital resources, allowing the platform, developers, and end-users each to execute their defined role on the platform. This balance, introduced by APIs and oAuth, allows data to be safeguarded while also opening it up to the widest possible use in the next generation of applications and other implementations.

3. How might read-only or read-write APIs collect, document, and track individuals’ consent to have their information shared with specific third parties?

oAuth. Period.
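
To expand on that one-word answer just a little, here is a minimal TypeScript sketch of how an oAuth grant, logged at the moment of approval, doubles as the documented record of consent. The structure and field names are illustrative assumptions:

```typescript
// An oAuth grant captured as a durable consent record.
interface ConsentGrant {
  studentId: string;
  appId: string;
  scopes: string[];   // exactly what the student agreed to share
  grantedAt: Date;
  revokedAt?: Date;   // set when the student withdraws consent
}

const grants: ConsentGrant[] = [];

function recordConsent(studentId: string, appId: string, scopes: string[]): void {
  grants.push({ studentId, appId, scopes, grantedAt: new Date() });
}

function revokeConsent(studentId: string, appId: string): void {
  for (const g of grants) {
    if (g.studentId === studentId && g.appId === appId && !g.revokedAt) {
      g.revokedAt = new Date();
    }
  }
}
```

Auditors can answer "who consented to what, and when" directly from the grant log, with revocations preserved rather than deleted.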

4. How can personally identifiable information (PII) and other financial information (of students and parents) be safeguarded through the use of APIs?

Access to personally identifiable information (PII) via Department of Education APIs will be controlled by students and their parents. The most important thing you can do to protect PII is to educate the owners of that data about how to allow developer access to it in responsible ways that will benefit them.

APIs open up access, while oAuth gives students and parents the control they need to integrate with apps and existing systems to achieve their goals, while retaining the greatest amount of control over safeguarding their own data.

5. What specific terms of service should be enabled using API keys, which would limit use of APIs to approved users, to ensure that information is not transmitted to or accessed by unauthorized parties?

A well designed partner layer defining multiple levels of access, combined with sensible service packages, will establish the terms of service levels that are bundled with API keys and oAuth-level identity and access to personally identifiable information.

Common approaches to deploying partner layers with appropriate service tiers, using oAuth, have been well established over the last 10 years in the private sector. Controlling access to API resources at a granular level, providing the greatest amount of access that makes sense while knowing who is accessing data and how they are using it, is what APIs are designed for.

6. What are the relative privacy-related advantages and disadvantages of using read-only versus read-write APIs for student aid data?

You will face many of the same privacy concerns whether an API is read or write. If it involves personally identifiable information, read or write access by the wrong parties violates a student's privacy. Ensuring that data is updated only via trusted application providers is essential.

A properly defined partner layer will separate who has read access from who has write access. Proper logging and versioning of data is essential to ensure data integrity, allowing end-users to manage their data via an application or system with confidence.

F. Compliance Issues

1. What are the relative compliance-related advantages and disadvantages of using read-only versus read-write APIs for student aid data?

APIs provide a single point of access to student aid data. With the implementation of a proper partner framework, service management, and oAuth, every single action through this doorway is controlled and logged. When it comes to auditing ALL operations, whether from the public, partners, or internal consumers, APIs excel at satisfying compliance concerns.

2. How can the Department prevent unauthorized use and the development of unauthorized products from occurring through the potential development of APIs? How might the Department enforce terms of service for API key holders, and prevent abuse and fraud by non-API key holders, if APIs were to be developed and made available?

As described above, the partner framework, service management, and oAuth will provide the security layer needed to manage 99% of potential abuse, but overall enforcement via the API platform is a partnership between the Department of Education, API consumers, and end-users. The last mile of enforcement will be executed by the Department of Education, but it will be up to the entire ecosystem and platform to police and enforce in real-time.

3. What kind of burden on the Department is associated with enforcing terms and conditions related to APIs?

The Department of Education will handle the first line of defense, defining the partner tiers and service composition that wrap all access to APIs. The Department will also be the last mile of decision making and enforcement when violations occur. The platform should provide the data the department needs to make decisions, as well as the enforcement mechanisms, in the form of API key and access revocation and the banning of apps, individuals, and businesses from the ecosystem.

4. How can the Department best ensure that API key holders follow all statutory and regulatory provisions of accessing federal student aid funds and data through use of third-party products?

The first line of defense in ensuring that API key holders follow all statutory and regulatory provisions will be the verification and validation of partners upon registration, when applications go into production, and before availability in the application galleries and other directories in which students discover apps.

The second line of defense will be reporting requirements and the usage patterns of API consumers and their apps. If applications regularly meet self-reporting requirements, and their real-time usage patterns establish healthy behavior, they retain their certification. If partners fail to comply, they will be restricted from the API ecosystem.

The last line of defense is the end-users, the students and parents. All end-users need to be educated regarding the control they have, and given reporting and ranking tools that allow them to file complaints and rank the applications that provide quality services.

As stated several times, enforcement will be a community effort, something the Department of Education has ultimate control over, but it requires giving the community agency as well.

5. How could prior consent from the student whom the data is about be provided for release of privacy- protected data to third party entities?

An API with an oAuth layer is this vehicle, providing the access, logging all transactions, and holding all partners to a quality of service. All the mechanisms are there in a modern API implementation; the access just needs to be defined.

6. How should a legal relationship between the Department and an API developer or any other interested party be structured?

I’m not a lawyer. I’m not a policy person. Just can’t contribute to this one.

7. How would a legal relationship between the Department and an API developer or any other interested party affect the Department’s current agreements with third-party vendors that operate and maintain the Department’s existing systems?

All of this will be defined in each partner tier, combined with appropriate service levels. With isolated API deployments, this should not affect current implementations.

However, a benefit of a consistent API strategy is that existing vendors can access resources via APIs, increasing the agility and flexibility of existing contracts. APIs are a single point of access, not just for the public, but for 3rd party partners and internal consumers as well. Everyone involved can participate and receive the benefits of API consumption.

8. What disclosures should be made available to students about what services are freely available in government domains versus those that could be offered at a cost by a third party?

A partner tier for the API platform will define the different levels of partners. Trusted, verified, and certified partners will get different recommendation levels and access than lesser known services and applications from 3rd parties at less trusted levels of access.

9. If the Department were to use a third-party application to engage with the public on its behalf, how could the Department ensure that the Department follows the protocols of OMB Memorandum 10-23?

Again, the partner tier determines each partner's level of access, and the protocols of any OMB memorandum can be built in, requiring that all data, APIs, and code be open sourced, and that appropriate API access tiers show how data and resources are accessed and put to use.

API service management provides the reporting necessary to support government audits and regulations. Without this level of control on top of an API, such accountability just isn't possible in a scalable way, which is what APIs plus web and mobile applications offer.

G. Policy Issues

1. What benefits to consumers or the Department would be realized by opening what is currently a free and single-point service (e.g., the FAFSA) to other entities, including those who may charge fees for freely-available services and processes? What are the potential unintended consequences?

Providing API access to government resources is an efficient and sensible use of taxpayer money, and reflects the mission of all agencies, not just the Department of Education. APIs introduce the agility and flexibility needed to deliver the next generation of government applications and services.

The economy in a digital age will require a real-time partnership between the public sector and the private sector, and APIs are the vehicle for this. Much like they have done for private sector companies like Amazon and Google, APIs will allow the government to create new services and products that serve constituents with the help of the private sector, while also stimulating job growth and other aspects of the economy.

APIs will not be all upside; each program and initiative will have its own policy problems and unintended consequences. One problem that plagues API initiatives is a lack of resources, in the form of money and skilled workers, to make sure efforts are successful. Without proper management, poorly executed APIs can open up huge security holes and introduce privacy concerns at a scale never imagined.

APIs need to be managed properly, with sensible real-time controls for keeping operations in check.

2. How could the Department ensure that access to title IV, HEA student aid programs truly remains free, even amidst the potential development of third-party apps that may charge a fee for assistance in participating in free government programs, products, and services with or without providing legitimate value-added services?

Partner Framework + Service Management = Quality of Service Across Platform

3. What other policy concerns should the Department consider with regard to the potential development of APIs for higher education data and student aid processes at the Department?

Not being a policy or education expert, I will leave this to others to determine. It is also something that should be built into API operations and discovered on a program by program basis.

4. How would APIs best interact with other systems already in use in student aid processes (e.g., within States)?

The only way you will know is if you do it. The IRS e-file system offers some hints at how this could work, but it isn't even a perfect model to follow. We will never know the potential here until a platform is stood up and resources are made available. All signs point to APIs opening up a huge amount of interoperability, not just between states and the federal government, but also with cities and counties.

5. How would Department APIs benefit or burden institutions participating in title IV, HEA programs?

If APIs aren't given the proper resources to operate, they can introduce security, privacy, and support concerns that would not have been there before. A properly run API initiative will provide support, while an underfunded, undermanned initiative will just further burden institutions.

6. While the Department continues to enhance and refine its own processes and products (e.g., through improvements to FAFSA or the IDR application process), how would third-party efforts using APIs complement or present challenges to these processes?

These two things should not be separate. The internal efforts should be seen as just another partner layer within the API ecosystem. All future services and products developed internally within the Department of Education should use the same API infrastructure developed for partners and the public.

If APIs are not used internally, API efforts will always fail. APIs are not just about providing access to external resources; they are about opening up the Department to think about its resources in an external way that benefits the public and partners, as well as those within the government.

See The Full Blog Post


The Levers, Dials, And Switches For Your Participation In The API Economy

I am playing with different ways of explaining the 100K foot view of how companies, and ultimately governments, will participate in the API economy. As with APIs themselves, visualizing something like how an API platform can position itself in the emerging API economy is very difficult to do. It is something that takes refinement, something I do by blogging, so here we go...

Much like developers look into the details of each API call, using API integration tools like RunScope and APITools to better understand how their applications are consuming APIs, API providers need to understand all the moving parts involved with successfully executing an API strategy. To help me articulate this to API providers, I try to break things down into small, digestible modules that demonstrate how they all work in concert to orchestrate the delivery of digital resources inside and outside the firewall.

On-Boarding - How you get consumers up and running with an API

  • Discovery - How are APIs found by potential consumers
  • Public / Private - Are resources publicly or privately available
  • Self-Service - How much can developers do on their own when on boarding
  • Approval - Do API consumers require approval, verification or certification before on boarding
  • Best Practices - In plain English, what is expected of consumers when they use an API

Service Composition - How are API resources organized and made accessible to all groups of consumers

  • Access Tiers - What types of access tiers are available, e.g., public, partner, internal, paid
  • Read / Write - Separation of read and write access to API resources, defined by service composition
  • Rate Limits - How much of an API resource can be consumed by apps, and how much is free or paid (see the rate limiting sketch after this list)
  • Pricing - How much do API resources cost, and are they all uniformly priced or are there variances
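
To make the rate limiting dial concrete, here is a minimal token bucket sketch in TypeScript; the per-tier numbers are made up for illustration:

```typescript
// A classic token bucket: tokens refill continuously, and each request
// spends one. An empty bucket means the consumer is over their limit.
class TokenBucket {
  private tokens: number;
  private lastRefill = Date.now();

  constructor(private capacity: number, private refillPerSecond: number) {
    this.tokens = capacity;
  }

  allow(): boolean {
    const now = Date.now();
    const elapsed = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsed * this.refillPerSecond);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true; // request proceeds
    }
    return false; // over the limit (typically an HTTP 429 response)
  }
}

// Different buckets for different tiers of the service composition:
const limits = {
  public: new TokenBucket(60, 1),    // 60 request burst, 1 request/second sustained
  partner: new TokenBucket(600, 10), // partners get 10x the throughput
};
```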

Partner - Verified and certified levels of relationships with API consumers

  • Access - Levels of access to API resources as defined by partnership
  • Distribution - Access to new distribution channels defined by partnerships
  • App - Certification of applications built on top of API resources
  • Consumers - Certification of individual API consumers
  • Businesses - Verification of business behind API consumers

Legal - The legal definitions and protections for API consumers and providers

  • TOS - Terms of service for consumers of an API resource
  • SLA - Service level agreement defining expected level of operations by an API
  • Privacy - Legal protection for the privacy of API consumers and end users
  • Licensing - Licensing of all code, data, and API interfaces
  • Branding - Definitions and guidelines for corporate and product branding

Communication - External and internal communication with API consumers and stakeholders

  • Blog - A weblog of API operations, providing transparency within the API community
  • Social - An active social presence on public or corporate social networks about API operations

Support - What type of, and how much support is offered to API consumers

  • Direct - What types of direct support options are available like in-person, chat, phone, email, etc.
  • Indirect - What type of indirect support options are available, like forums, FAQs, a knowledge base, etc.

Resources - Code and content resources that help API consumers put resources to work

  • Docs - Simple, complete, and up to date documentation for all API resources
  • Code - Code samples, libraries and complete SDKs in a variety of languages

Updates - Regular updates around API operations

  • Change-Log - Historical record of all changes made to an API and supporting resources
  • Roadmap - A look at what is planned in the short term or long term future of an API

Stability - What is the overall stability of an API platform

  • Security - How secure are APIs and supporting systems, including integrated apps
  • Reliability - How reliable is an API; is it highly available, with few breaking changes

All of these modules represent the levers, dials, and switches you will use to define your participation in the API economy. How you position yourself in all of these areas will define how your API ultimately operates. API consumers will never use an API if they can't on-board efficiently, the price is too high, or the licensing is too strict, while they will flourish in a more balanced, carefully planned configuration.

APIs aren't just about making data, content and digital resources available publicly. APIs are about optimizing access and consumption of these resources, encouraging access in ways that benefit owners and stakeholders. While there is no perfect configuration that will ensure success in all business sectors, there are some blueprints we can follow that set the right tone, and the levers, dials and switches listed above can be adjusted and tuned in real-time for optimal performance.

See The Full Blog Post


Thoughtful Use of JavaScript When Designing Embeddable Tools

One of the security blogs I follow is Schneier on Security from Bruce Schneier. If you want to understand what is going on around the NSA and security, Bruce is the guy. I was tweeting out a story from his blog today and noticed his share buttons:

You have to actually enable the buttons before you can click them, protecting you from the common monitoring we face every day through JavaScript. It is a simple, but very powerful concept when thinking about how we use JavaScript.
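
To show what this two-click pattern looks like in practice, here is a rough TypeScript sketch; the element id and widget URL are placeholders, and this is not Bruce's actual code:

```typescript
// Render an inert placeholder, and only inject the third-party widget
// script after the reader explicitly opts in.
function armShareButton(buttonId: string, widgetScriptUrl: string): void {
  const button = document.getElementById(buttonId);
  if (!button) return;

  button.addEventListener("click", () => {
    // Nothing third-party loads, and no request leaves the page,
    // until this handler runs.
    const script = document.createElement("script");
    script.src = widgetScriptUrl;
    script.async = true;
    document.body.appendChild(script);
    button.textContent = "Share enabled";
  }, { once: true });
}

armShareButton("enable-share", "https://example-social.test/widget.js");
```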

This approach represents a very thoughtful use of JavaScript, something I would like to do more of, and is something we should all be doing as we are building embeddable JavaScript tools.

Thanks Bruce!

See The Full Blog Post


Do US Government Web APIs Require System Interconnectivity Agreements?

I've been so busy with work lately that I haven't been able to maintain my usual rhythm of blog posts on API Evangelist. The good news is I'm doing some interesting work that I'm able to pull stories from. This post is from a forum post I made on the US Government API forum I frequent, which has some very interesting conversations about APIs in the federal government.

In a recent post from Brian over at DC3 (Defense Cyber Crime Center), an interesting question was asked: Do US Government Web APIs Require System Interconnectivity Agreements? I will let you visit the conversation and see more detail around his question, as well as some of the other responses, but here were some of my thoughts:

Web / http APIs do not fit earlier definitions of "system interconnectivity agreements", which reflects the technical and fundamental shift from network connections and SOAP APIs to this new world of web APIs.
 
Web APIs were successful in part because of the loosely coupled nature of HTTP, leveraging a client / server, request / response model. There is interconnectivity, but not the tight coupling and governance of previous network protocols and APIs.

 APIs are the contract! Each API endpoint provides access to a resource, then with accompanying management building blocks, you dial in that contract.
 
At the API provider level, you can enforce / encourage interoperability using common web api approaches:

  • API Definitions - Machine readable API definitions like Swagger, API Blueprint, and RAML can provide templates that enforce / encourage common blueprints that providers can follow when publishing APIs to centralized or decentralized API management platforms, establishing base contracts for interoperability, as well as underlying data models.
  • Terms of Use - Common terms of use can be established for management, allowing providers to craft their own TOS that is tailored for specific resources, but derived from common patterns. You see this emerging in the private sector with the Swedish API License.
  • Licensing - Much like TOS, common licensing blueprints can be provided, allowing data and resource licensing to be tailored for resources, but pulled from existing licensing pools. Many API providers offer a base stack of licensing arrangements that meets their needs as well as those of consumers.
  • Service Level Agreements (SLA) - Same as TOS, service level agreements can be forged allowing API providers to meet expectations of consumers. Some SLAs are loose and some are tight, depending on goals.
  • Security - Common tooling and practices around the security of APIs need to be established, using common approaches like API keys, oAuth, SSO, and other standards already in use. As with API interface patterns, data models, etc., no API provider should be allowed to bring home-brewed security to the table.
  • Service Composition - A practice referred to as "service composition" is a common part of all API management platforms currently available. Service composition allows API providers to build service tiers and compose products from API resources, establishing different levels of access and usage that can be designed for internal, partner, or public access to resources. For example, service tier A allows read-only access to APIs with specific rate limits, while service tier B allows read / write access with no limits (see the sketch below this list). Service composition reflects and extends the contract that an API represents.
  • Analytics - Real-time metrics give API providers living views into API registration and usage, allowing an organic view into how resources are truly being used, with real-time adjustments, enforcement, and service composition changes made as needed. This allows providers to respond to adverse conditions by tightening control, and to stimulate innovation by loosening control.

All of these provider level building blocks work in concert to standardize the design, deployment, and management of common API resources. They provide a common backbone that makes APIs a living contract, flexible enough to work with many different resources and agencies without being too rigid, providing the ability to innovate while still establishing desired levels of governance, which will vary from agency to agency and resource to resource.
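
To make service composition tangible, here is a small TypeScript sketch of tiers expressed as a machine-readable contract, echoing the tier A / tier B example in the list above; the values are illustrative, not a standard:

```typescript
// Each tier bundles access, limits, and pricing into one contract.
interface ServiceTier {
  name: string;
  readAccess: boolean;
  writeAccess: boolean;
  rateLimitPerHour: number | null; // null means unlimited
  pricePerCallUsd: number;         // 0 for free tiers
}

const composition: ServiceTier[] = [
  { name: "A", readAccess: true, writeAccess: false, rateLimitPerHour: 1000, pricePerCallUsd: 0 },
  { name: "B", readAccess: true, writeAccess: true, rateLimitPerHour: null, pricePerCallUsd: 0.001 },
];

// Because the contract is data, a provider can tighten or loosen a
// tier in real-time without redeploying the API itself.
```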

At the API consumer level, you can enforce / encourage interoperability using common web api approaches:

  • Portal - Each API will have one or many portals where you access the APIs and supporting documentation. The availability of a portal, whether it's public or private, can reflect the access and interoperability goals of API providers. Portal availability is the gateway to interoperability enforcement / encouragement.
  • Registration - Even if you have access to an API resource, and it is publicly available, most APIs require registration to obtain required application keys or obtain necessary oAuth credentials. Registration can be self-service, invite only or require approval, providing all levels of enforcement / encouragement of agreements.
  • Security - Each API provider has designated appropriate API security levels derived from a common pool of tools and standards. Consumers are guided by API provider security standards, operating within API key restrictions, oAuth identity and access restrictions, and designated service composition frameworks that are linked to security access levels. Security is not just technology; it transcends the business and politics of API interoperability.
  • Terms of Service - Every API consumer is bound by the API TOS at the point of registration, and will be legally required to agree / adhere to future changes. 
  • Best Practices - Best practices provide a plain English explanation of the legal TOS for all API consumers, helping ensure that consumers truly execute on the TOS that are set forth, because, you know, nobody reads the TOS.
  • Service Level Agreements (SLA) - Service level agreements can be extended to all API consumers as part of their service level composition, registration and TOS. SLAs provide necessary real-time expectations of system performance.
  • Analytics - Real-time metrics are provided to API consumers, letting them know the reality of their API consumption, and where they stand within service level agreements, service composition, and other aspects of API interoperability and agreements.

All the building blocks listed above provide the two-sides of the web API coin. Modern API initiatives from Amazon, Google, Twitter and thousands of other companies are proving that loosely coupled, modular approaches to API design, deployment and management provide a flexible, agile approach to interoperability that isn't as rigid as classic API approaches or networking protocols.

All of these building blocks work in concert to orchestrate interoperability that protects the interests of API providers, consumers, and even 3rd party intermediaries. You can even see this approach to interoperability move from the technical to meaningful reciprocity across providers, as with the newer generation of automation providers like If This Then That and Zapier, which build on legacy ETL concepts but bring them into a new global, Internet era.

We have to establish case studies that will shift decision makers away from more rigid approaches. Without them we won't be able to achieve the flexibility that web APIs bring, and we will be left with heavy handed Tech + Legal governance.

So to directly answer your questions:

1) Do US Gov Web APIs *require* system interconnectivity agreements?  Which policies and conditions? 

NO!

2) Can system interconnectivity agreements be mitigated to a common agreement instead of agency-specific? (i.e.  single information sharing agreement, and/or Acceptable Use Policies, common Terms of Service per API provider )

YES!

3) Which policies/guidance can be leveraged to prevent the specific use of "system interconnectivity agreements" in the rest of the US Gov Web API space?

The patterns exist in the private sector and are slowly emerging in government. We just need to work to identify, standardize, and establish the case studies we need to steer away from the specific use of "system interconnectivity agreements".

See The Full Blog Post


10 API Commandments for Providers

I was having one of my regular Google Hangouts with the OG API Evangelist John Musser (@johnmusser) the other day as I was flying back from the east coast, and he made a comment about my classic API Evangelist drawing, saying I should write the ten commandments of APIs.

At first I thought, not my style, but on second thought, what the hell. Here they are:

  1. Deliver Value - Generate value for consumers, without value APIs mean nothing.
  2. Provide Documentation - Provide clean, simple, up to date API documentation for users.
  3. Code Libraries - Deliver code samples, libraries, SDKs, or starter projects in a variety of languages.
  4. Provide Support - If you don’t support your APIs, your efforts won’t be sustainable.
  5. Minimize Breaking Shit - You will never establish trust with consumers if you break your shit all the time.
  6. Business Model - Without a business model for your API, they will not be sustainable.
  7. Terms of Service - Establish TOS that protect your company's interests, but allow consumers to prosper and grow.
  8. Security - Establish solid security practices that keep your resources secure, while also protecting all users involved.
  9. Privacy - Passionately protect the privacy of your API consumers and end-users.
  10. Tell Stories - Craft and tell meaningful stories about the value your APIs deliver, or nobody will know or care about them.

These are my 10 commandments for API providers, and represent what I think are the 10 most important elements of delivering APIs. If you follow these 10 commandments, in your API design, development, deployment, and management you will find happiness. ;-)

These commandments won't guarantee your success in the market, but they will guarantee that you learn and grow in your API initiative, whether it's internal or external, which in my opinion is the most important aspect of adopting an API way of life.

Update: Adjusting "don't break shit," as John cites below. Breakage is inevitable, so minimizing it is my advice.

See The Full Blog Post


A World Where Every Camera Is Connected To The Internet Via APIs

I look at a lot of APIs--some are crap, some make sense, a few are interesting, and every once in a great while you see an API that you know will be one of the next big API platforms. I'm reviewing one such API: Evercam.io.

I know that Evercam.io will be big, because it bridges an increasingly ubiquitous technology: the camera. Whether for home or commercial usage, Internet connected cameras represent low hanging fruit for applying proven API techniques.

Evercam.io was born out of experience working at a cloud CCTV company, where the Evercam team realized the opportunity was about becoming a developer platform that enabled any developer to interact with potentially hundreds of types of cameras, while also applying modern API techniques to the world of security and webcams.

Cameras + oAuth 2.0 + API + App Store = Evercam.io

Storage
When it comes to video, the obvious API usage involves storage of the video, still photos, audio, and logs generated by a camera. The cloud opens up the ability to scale storage to match whatever needs the camera owner may have. Video can potentially be disk heavy, something that is perfectly suited for the cloud, as long as someone is willing to pay for the storage.

Connectivity
Cameras are just one node in the fast growing world of Internet connected devices that you may find at home or in a business. The ability to sync cameras and still photos with point of sale (POS) activity, sensors, or even other cloud applications is significant. Zapier and IFTTT-like actions, or seamless reporting and analytics across systems, will open up with API deployment.

Events
Evercam.io introduces events as a layer between the camera and users, allowing developers to define custom or scheduled events based upon what happens in a video. Examples include sending an MMS when a camera receives an SMS from its owner, or taking a picture of a storefront or dining area each hour or during peak hours. Video is all about a sequence of events, and adding an API layer allows for an unlimited amount of slicing, dicing, and custom definition of events that are meaningful to camera owners.
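
As a purely illustrative sketch, and not Evercam.io's actual interface, here is how such an event rule might look in TypeScript:

```typescript
// A rule binds a trigger (a schedule or an inbound SMS) to an action
// the platform performs against the camera. All names are hypothetical.
interface EventRule {
  cameraId: string;
  trigger:
    | { type: "schedule"; hourOfDay: number }
    | { type: "sms"; fromOwner: boolean };
  action: "snapshot" | "send_mms";
}

// "Take a picture of the storefront at noon each day."
const noonSnapshot: EventRule = {
  cameraId: "storefront-01",
  trigger: { type: "schedule", hourOfDay: 12 },
  action: "snapshot",
};

function shouldFire(rule: EventRule, hourNow: number): boolean {
  return rule.trigger.type === "schedule" && rule.trigger.hourOfDay === hourNow;
}

console.log(shouldFire(noonSnapshot, 12)); // true
```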

Logging
One of the significant benefits of an API layer between cameras and those who access them is the ability to log operations. Camera availability, access, events, actions, and basically any interaction with a camera can potentially be logged. When used for security, this API layer expands the existing definition of what a security camera is used for.

Marketplace
Any successful platform needs a marketplace, where developers can showcase the applications they've engineered on top of the platform, and even generate revenue through application sales and add-ons. Evercam.io has a marketplace out of the gate, modeled after the Chrome Web Store and Force.com, which charges developers 30% of the revenue generated via the apps they publish to the marketplace.

Security
Deployment of an API for one or many cameras provides a single point of security for all devices. This is only as good as Evercam.io's platform security, and only time will tell how solid that is. This one element will make or break platforms like Evercam.io, and is the area that will keep me up at night when thinking about API driven cameras.

Ok, now I've laid out some of what I feel are the key things that make Evercam.io significant. I think Evercam.io will be big because it represents a fairly obvious target for applying APIs, and the deployment of cameras, as well as the use of video, is only going to grow, exponentially. I don't just think that Evercam.io will be big because of the technological and business opportunities; I think it will be huge because of the political implications.

Our homes and automobiles will be increasingly wired with cameras, businesses will be operated, managed and secured through cameras, and governments will increasingly monitor citizens via Internet connected video devices. Camera APIs will be one of the most influential, yet silent players in our personal and professional lives, from here on forward—there is no escaping it.

Honestly all of this has me worried, but I’ve done my due diligence on the Evercam.io team, and feel like there isn’t a better crew I’d like to see help lead in this potentially frightening new world of Internet connected cameras. I try to help lead the API space by shining the light on some of the positive in the space, while calling out potentially negative practices. I think the Evercam.io team reflects these values, and I’m interested in seeing more of how they handle it in this potentially volatile aspect of our online world.

It can be easy to freak out over some of the potential for exploitation via a platform like Evercam.io, but after reading their business plan and looking through their site, you see examples like the agricultural scale that uses a camera and the Evercam.io API to take pictures of grain deliveries, providing the pictures as part of the grain sale process. These are everyday uses that will have significant implications for the economy and become a regular part of business operations.

After spending some time researching Evercam.io, and thinking about the world of Internet connected cameras, I’m intrigued. I’m interested in seeing what solutions get developed on the platform, how security is handled, and what issues arise in different scenarios when you connect cameras to the Internet, apply APIs and start building applications and logging tools that operate in this new API defined space.

I predict the Evercam.io ecosystem will grow rapidly, and be an interesting, and potentially scary, place to watch our world be connected to the Internet via billions of tiny cameras that track every aspect of our personal, business, and increasingly very public lives.

See The Full Blog Post


Benefits Of Treating Your Private API Like a Public One

I've stopped counting the number of successful applications that have had their private APIs reverse engineered by savvy users. Most recently it was Snapchat, the original rogue Instagram API was famously developed this way, and even automaker Tesla had their API reverse engineered.

This is fresh on my mind because of the Snapchat security breach, in which a savvy tech user notified Snapchat back in August. The company instituted rate limiting, but still left the API exposed to further attack, which the user took full advantage of in December, scraping all users after documenting and exploiting the private API.

If you are developing a mobile application, you are building an API. Even if you never intend on making this API public, there are huge benefits of going through the motions of pretending like it is—you will learn a lot! Every application API I’ve seen reverse engineered was clearly developed in a rush, thinking that security through obscurity was a solid production strategy.

When you are planning your app's API, stop! Consider what you would need to make it publicly available. The exercise is healthy. Allow for registration of developers and apps, even if it will just be your own team signing up. Put in place rate limits, analytics, etc. There are a ton of features that have evolved in the public API space that are transferable to private APIs.

API infrastructure providers like 3Scale have been working at this for years. You can sign up for a free account and get started without any extra work from your dev team, saving your resources for monitoring your API stats on a daily basis, responding to security threats, and fine tuning how your apps are using your own API.

Don’t make the mistake that the Snapchat, Instagram and Teslas of the world have made. Treat your API like it is public, even if you never intend to open it up publicly. You will think differently and be ahead of the curve when it comes to monitoring and security.

Disclosure: 3Scale is an API Evangelist partner

See The Full Blog Post


Are Your API Security Practices In Better Shape Than The Snapchat API?

If you weren't following the news over the holidays, a rogue group released SnapchatDB, containing 4.6 million Snapchat user profiles, after exploiting the poorly secured mobile application API.

There are opposing views of what happened, but apparently the group contacted Snapchat in August 2013, letting them know of a potential vulnerability in their API, to which Snapchat claims they responded by instituting rate limiting to address the problem.

Apparently the actual vulnerability wasn't addressed, and in December the group mapped the private API the company uses for their mobile app. They don't officially have an API, but like most mobile applications, it is right beneath the surface.

After mapping the interface, the group proceeded to pull all of the data, organizing and publishing it as SnapchatDB, in an effort to raise awareness of the issue and point out that Snapchat was too slow in responding to the exploit.

Regardless of the exact facts, it is clear that Snapchat was lax on security. API rate limiting and other common security measures are standard practice these days. API providers like 3Scale have been around for years, delivering plug and play infrastructure to help you deal with this. There is no reason to be caught with your pants down.

It doesn't matter whether your API is public, private or just for partners, you need to have your security practices tight. You owe it to your users and developers.

Disclosure: 3Scale is an API Evangelist partner.

See The Full Blog Post


Access, Interoperability, Privacy and Security Of Technology Will Set The Stage For The Future of Education

In 2010 when I started API Evangelist I saw the technological potential of APIs, but while the rest of the online space was focused on what APIs could do for developers, I was focused on what APIs could do for the average person. APIs don't just open up access for developers, they open up access for end-users, introducing interoperability, data portability, and ultimately tools that give them control over their own data, content, and other valuable resources.

This realization has been central to my mission at API Evangelist, which is about educating the masses about APIs. What is an API? Why are APIs important? I strongly feel that APIs empower end-users to make better decisions about which platforms they use, which applications they adopt, and gives them more ownership, control and agency in their own worlds. When you help an individual understand they can host their own Wordpress blog and migrate from the cloud hosted version of Wordpress, or migrate their blog from Blogger to Wordpress via APIs, you are giving the gift of web literacy.

Leading technology platforms like Amazon, Google, eBay and Flickr have long realized the potential of opening up APIs and empowering end-users. Since then, thousands of platform providers have also realized that opening up APIs enables developers and end-users to innovate around their platform and services, and that there is much more opportunity for growth, expansion and revenue when end-users are API literate. Users are much more likely to adopt a platform and deeply integrate it into their personal or business lives, if they are able to connect it with their other cloud services, taking control and optimizing their information and work flow.

Helping business owners, developers, and end-users understand the potential that APIs introduce is essential to the future of education, and will be at the heart of a healthy and thriving economy. There is a key piece of technology that reflects this new paradigm and is currently operating and thriving across the web, called oAuth. This open authorization (oAuth) standard provides the ability for platforms to open up access to content and data, enabling developers to build web and mobile applications, but in a way that gives control to end-users, who are ultimately the owners of a platform's content and data, and are the target of the applications that developers are building.

oAuth has introduced a new online dance, widely known as three-legged authentication, that is being used across common platforms from Google to Facebook, allowing end-users, developers, and platforms to interact in a way that makes the Internet go round. If any of these three legs is out of balance and security or privacy is compromised, or one of the players is not educated and exploitation occurs, the cycle quickly breaks down. This delicate balance encourages all three legs to be educated, empowered, and in control over their role in this critical supply chain of the Internet.
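
For those who want to see the dance itself, here is a minimal sketch of the flow in Python--the provider endpoints, client credentials, and scope are hypothetical placeholders, but the three legs of user consent, token exchange, and delegated access are the same on nearly every platform.

    # A minimal sketch of the three-legged oAuth 2.0 dance in Python.
    # All endpoints and credentials are hypothetical placeholders.
    import requests

    AUTHORIZE_URL = "https://provider.example.com/oauth/authorize"
    TOKEN_URL = "https://provider.example.com/oauth/token"
    API_URL = "https://provider.example.com/api/me"
    CLIENT_ID = "my-app-id"
    CLIENT_SECRET = "my-app-secret"
    REDIRECT_URI = "https://my-app.example.com/callback"

    # Leg 1: the app sends the end-user to the platform to grant access.
    consent_url = (
        f"{AUTHORIZE_URL}?response_type=code&client_id={CLIENT_ID}"
        f"&redirect_uri={REDIRECT_URI}&scope=read"
    )
    print("Send the end-user to:", consent_url)

    # Leg 2: the platform redirects back with a short-lived code, which
    # the app exchanges for an access token scoped by the user's consent.
    def exchange_code(code: str) -> str:
        resp = requests.post(TOKEN_URL, data={
            "grant_type": "authorization_code",
            "code": code,
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "redirect_uri": REDIRECT_URI,
        })
        resp.raise_for_status()
        return resp.json()["access_token"]

    # Leg 3: the app acts on the user's behalf--never with the user's
    # password, and only within the scopes the user actually granted.
    def fetch_user_data(token: str) -> dict:
        resp = requests.get(API_URL, headers={"Authorization": f"Bearer {token}"})
        resp.raise_for_status()
        return resp.json()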

Online platforms, and the web and mobile applications that are built on them, are playing an ever increasing role in every aspect of our personal, professional and public lives, from turning in class assignments in high school to paying our taxes as adults. APIs and oAuth are being used as the pipes and gatekeepers for everything from photos and location data to our vital healthcare records. These online platforms will play a central role in our education from infancy to retirement, and being educated, aware and literate in how these platforms operate is essential to it all working--for everyone involved.

The future of education depends on all online platforms providing access, interoperability and data portability, while also fully respecting end-users privacy and security and investing in their education about these features and the opportunities they open up. Education will continue to exist within traditional institutions, but will persist throughout our lives in this new online environment. It is imperative that every citizen possesses a certain level of web literacy to be able to learn, grow and evolve as a human being in this increasingly digital society.

I will be speaking at OpenVA, Virginia's First Annual Open and Digital Learning Resources Conference, on this topic, and continue to work this message into my overall API Evangelist message. The link between APIs, the access they provide, and education is critical. It is something that I feel provides just as many opportunities for exploitation as it does benefits for end-users, developers, and platforms--requiring a great deal of transparency and scrutiny.

Lots to think about, and discuss. I look forward to seeing you at the University of Mary Washington for OpenVA.

See The Full Blog Post


Providing Access To Services That Help Americans With Their Food Security Using APIs

I had the pleasure of connecting with the talented Code for America fellow Moncef Belyamani (@monfresh) this week, and talking about a very meaningful API project called the Ohana API.

"The Ohana API is a project from the San Mateo County team of Code for America fellows that is aiming to create open access to community social services, with an initial emphasis on food security."

I couldn't think of a more important use of APIs than making sure people can find the social services they need--especially services that ensure people are fed.

I'm also impressed with the approach of Code for America in giving Github a central role in the project. The API project, the API wrapper in Ruby, and a cool API to PDF generator are all available on Github.

The Ohana API is only in alpha, and they are looking for people to help the Code for America team take it to the next level, with approaches to keeping the data current and developing an SMS interface.

The Ohana API project itself, and the model used by Code for America, provides an important blueprint for how technology can be applied, and make a difference in our daily lives in a scalable way.

This type of work keeps me coming back and working on API Evangelist, even after three years of covering a space that often leaves me pretty discouraged.

See The Full Blog Post


IRS Modernized e-File (MeF): A Blueprint For Public & Private Sector Partnerships In A 21st Century Digital Economy (DRAFT)

Download as PDF

The Internal Revenue Service is the revenue arm of the United States federal government, responsible for collecting taxes, and for the interpretation and enforcement of the Internal Revenue Code.

The first income tax was assessed in 1862 to raise funds for the American Civil War, and over the years the agency has grown and evolved into a massive federal entity that collects over $2.4 trillion each year from approximately 234 million tax returns.

While the IRS has faced many challenges in its 150 years of operations, the last 40 years have demanded some of the agency's biggest transformations at the hands of technology--more than at any other time since its creation.

In the 1970s, the IRS began wrestling with the challenge of modernizing itself using the latest computer technology. This eventually led to a pilot program in 1986 of a new Electronic Filing System (EFS), which aimed in part to gauge the acceptance of such a concept by tax preparers and taxpayers.

By the 1980s, tax collection had become very complex, time-consuming, costly, and riddled with errors, due to what had become a dual process of managing paper forms while also converting them into a digital form so that they could be processed by machines. The IRS desperately needed to establish a solid approach that would enable the electronic submission of tax forms.

It was a rocky start for the EFS, and Eileen McCrady, systems development branch and later marketing branch chief, remembers, “Tax preparers were not buying any of it--most people figured it was a plot to capture additional information for audits." But by 1990, IRS e-file operated nationwide, and 4.2 million returns were filed electronically. This proved that EFS offered a legitimate approach to evolving beyond a tax collection process dominated by paper forms and manual filings.

Even Federal Agencies Can't Do It Alone

Even with the success of early e-file technology, the program did not get the momentum it needed until it had the support of two major tax preparation partners--H&R Block and Jackson-Hewitt. These partnerships helped change the tone of EFS efforts, making e-File more acceptable and appealing to tax professionals. It was clear that e-File needed to focus on empowering a trusted network of partners to submit tax forms electronically, sharing the load of tax preparation and filing with 3rd party providers. And this included not just the filing technology, but a network of evangelists spreading the word that e-File was a trustworthy and viable way to work with the IRS.

Bringing e-File Into The Internet Age

By 2000, Congress had passed IRS RRA 98, which contained a provision setting a goal of an 80% e-file rate for all federal tax and information returns. This, in effect, forced the IRS to upgrade the e-File system for the Internet age, otherwise they would not be able to meet this mandate. A working group was formed, comprised of tax professionals and software vendors, that would work with the IRS to design, develop, and implement the Modernized e-File (MeF) system, which employed the latest Internet technologies, including a new approach to web services that used XML, allowing 3rd party providers to submit tax forms in a real-time, transactional approach (this differed from the batch submissions required in a previous version of the EFS).

Moving Beyond Paper One Form At A Time

Evolving beyond 100 years of paper processes doesn't happen overnight. Even with the deployment of the latest Internet technologies, you have to incrementally bridge the legacy paper processes to a new online, digital world. After the deployment of MeF, the IRS worked year by year to add the myriad of IRS forms to the e-File web service, allowing software companies, tax preparers, and corporations to digitally submit forms into IRS systems over the Internet. Form by form, the IRS was transformed from a physical document organization into a distributed network of partners that could submit digital forms through a secure, online web service.

Technological Building Blocks

The IRS MeF solution represents a new approach to using modern technology by the federal government in the 21st century Internet age. In the last 15 years, a new breed of Internet enabled software standards has emerged, enabling the government to partner with the private sector, as well as other government agencies, in ways that were unimaginable just a decade ago.

Web Services

Websites and applications are meant for humans. Web services, also known as APIs, are meant for other computers and applications. Web services have allowed the IRS to open up the submission of forms and data into central IRS systems, while also transmitting data back to trusted partners regarding errors and the status of form submissions. Web services allow the IRS to stick with what it does best--receiving, filing, and auditing tax filings--while trusted partners can use web services to deliver e-Filing services to customers via custom developed software applications.

Web services are designed to utilize existing Internet infrastructure used for everyday web operations as a channel for delivering trusted services to consumers around the country, via the web.
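
As a purely hypothetical illustration of this submit-and-acknowledge pattern, here is a short Python sketch--the endpoints and field names are placeholders rather than the actual MeF interfaces, but they show how a trusted partner's software transmits a form and then receives status and error information back over the same channel.

    # A hypothetical sketch of the submit-and-acknowledge pattern described
    # above. The URLs and field names are illustrative placeholders, not
    # the actual MeF interfaces.
    import time

    import requests

    BASE = "https://efile.example.gov"

    def submit_return(xml_payload: bytes) -> str:
        """Transmit a digitally prepared form; receive a submission id."""
        resp = requests.post(
            f"{BASE}/submissions",
            data=xml_payload,
            headers={"Content-Type": "application/xml"},
        )
        resp.raise_for_status()
        return resp.json()["submission_id"]

    def wait_for_acknowledgement(submission_id: str) -> dict:
        """Poll until the service accepts or rejects the submission."""
        while True:
            status = requests.get(f"{BASE}/submissions/{submission_id}").json()
            if status["state"] in ("accepted", "rejected"):
                return status  # includes validation errors on rejection
            time.sleep(5)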

An XML Driven Communication Flow

XML is a way to describe each element of an IRS form and its supporting data. XML makes paper forms machine readable, so that the IRS and 3rd party systems can communicate using a common language. It allows the IRS to share a common set of logic around each form, then use what are known as schemas to validate the XML submitted by trusted partners against a set of established business rules that enforce the IRS code. XML gives the IRS the ability to communicate with 3rd party systems using digital forms, applying business rules to reject or accept the submitted forms, which can then be stored in an official IRS repository in a way that can be viewed and audited by IRS employees (using stylesheets which make the XML easily readable by humans).
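
Here is a minimal sketch of that validation step, using the lxml library in Python--the schema and the submitted form are drastically simplified stand-ins for real IRS forms, but they show how a schema acts as an enforceable business rule, with each submission either accepted or rejected with a list of errors.

    # A minimal sketch of schema validation with Python's lxml library.
    # The schema and form below are simplified stand-ins for real IRS forms.
    from lxml import etree

    FORM_SCHEMA = etree.XMLSchema(etree.XML(b"""
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="Return">
        <xs:complexType>
          <xs:sequence>
            <xs:element name="TaxpayerName" type="xs:string"/>
            <xs:element name="TaxYear" type="xs:integer"/>
            <xs:element name="AmountOwed" type="xs:decimal"/>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:schema>
    """))

    submitted = etree.XML(b"""
    <Return>
      <TaxpayerName>Jane Doe</TaxpayerName>
      <TaxYear>2013</TaxYear>
      <AmountOwed>1250.00</AmountOwed>
    </Return>
    """)

    # Accept or reject the submission against the established business rules.
    if FORM_SCHEMA.validate(submitted):
        print("accepted: form can be stored in the official repository")
    else:
        print("rejected:", FORM_SCHEMA.error_log)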

Identity and Access Management (IAM)

When you expose web services publicly over the Internet, secure authentication is essential. The IRS MeF system is a model for securing the electronic transmission of data between the government and 3rd party systems. The IRS has employed the Internet Filing Application (IFA) and Application to Application (A2A) designs, which follow the Web Services-Interoperability (WS-I) security standards. Security of the MeF system is overseen by the IRS MITS Cyber Security organization, which ensures all IRS systems receive, process, and store tax return data in a secure manner. MeF security involves an OMB mandated Certification and Accreditation (C&A) Process, requiring a formal review and testing of security safeguards to determine whether the system is adequately secured.

Business Building Blocks

To properly extend e-File web services to partners isn't just a matter of technology. There are numerous building blocks required that are more business than technical, ensuring a healthy ecosystem of web service partners. With a sensible strategy, web services can be translated from tech to business, allowing partners to properly translate IRS MeF into e-filing products that deliver the required services to consumers.

Four Separate e-Filing Options

MeF provided the IRS with a way to share the burden of filing taxes with a wide variety of trusted partners, software developers, and corporations who have their own software systems. However, MeF is just one tool in a suite of e-File tools. These include Free File software that any individual can use to submit their own taxes, as well as free fillable digital forms that individuals can use if they do not wish to employ a software solution.

Even with these simple options, the greatest opportunity for individuals and companies is to use commercial tax software that walks one through what can be a complex process, or to depend on a paid tax preparer who employs their own commercial version of tax software. The programmatic web service version of e-File is just one option, but it is the heart of an entire toolkit of software that anyone can put to use.

Delivering Beyond Technology

The latest evolution of the e-file platform has technology at its heart, but it delivers much more than just the transmission of digital forms from 3rd party providers, in ways that also make good business sense:

  • Faster Filing Acknowledgements - Transmissions are processed upon receipt and acknowledgements are returned in near real-time, unlike the once or twice daily system processing cycles in earlier versions
  • Integrated Payment Option - Taxpayers can e-file a balance due return and, at the same time, authorize an electronic funds withdrawal from their bank accounts, with payments being subject to the limitations of the Federal Tax Deposit rules
  • Brand Trust - Allowing MeF to evolve beyond just the IRS brand, letting new trusted commercial brands, like TurboTax and TaxAct, step up and deliver value to consumers

Without improved filing results for providers and customers, easier payment options and an overall set of expectations and trust, MeF would not reach the levels of e-Filing rates mandated by Congress. Technology might be the underpinning of e-File, but improved service delivery is the thing that will seal the deal with both providers and consumers.

Multiple Options for Provider Involvement

Much like the multiple options available for tax filers, the IRS has established tiers of involvement for partners in the e-File ecosystem. Depending on their model and capabilities, e-File providers can step up and participate in multiple ways:

  • Electronic Return Originators (EROs) - EROs prepare returns for clients, or collect returns from taxpayers who have prepared their own, then begin the electronic transmission of returns to the IRS
  • Intermediate Service Providers - These providers process tax return data that originates from an ERO or an individual taxpayer, and forward it to a transmitter
  • Transmitters - Transmitters are authorized to send tax return data directly to the IRS, from custom software that connects directly with IRS computers
  • Online Providers - Online providers are a type of transmitter that sends returns filed from home by taxpayers using tax preparation software to file common forms
  • Software Developers - Software developers write the e-file programs that follow IRS specifications for e-file
  • Reporting Agents - An accounting service, franchiser, bank, or other person that is authorized to e-file Form 940/941 for a taxpayer

The IRS has identified the multiple ways it needs help from an existing, evolving base of companies and organizations, and has designed its partner framework to best serve its mission, while delivering the best value to consumers, in a way that recognizes the incentives needed to solicit participation from the private sector and ensure efforts are commercially viable.

Software Approval Process

The IRS requires all tax preparation software used for preparing electronic returns to pass the requirements for Modernized e-File Assurance Testing (ATS). As part of the process, software vendors notify the IRS via the e-help Desk that they plan to commence testing, then provide a list of all forms that they plan to include in their tax preparation software--the IRS does not require that vendors support all forms. MeF integrators are allowed to develop their tax preparation software based on the needs of their clients, while using pre-defined test scenarios to create test returns that are formatted in the specified XML format. Software integrators then transmit the XML formatted test returns to the IRS, where an e-help Desk assister checks data entry fields on the submitted return. When the IRS determines the software correctly performs all required functions, the software is approved for electronic filing. Only then are software vendors allowed to publicly market their tax preparation software as approved for electronic filing--whether for use by corporations, tax professionals, or individual users.

State Participation

Another significant part of the MeF partnership equation is providing seamless interaction with the electronic filing of both federal and state income tax returns at the same time. MeF provides the ability for partners to submit both federal and state tax returns in the same "taxpayer envelope", allowing the IRS to function as an "electronic post office" for participating state revenue services -- certainly better meeting the demands of the taxpaying citizen. The IRS model provides an important aspect of a public / private sector partnership with the inclusion of state participation. Without state level participation, any federal platform will be limited in adoption and severely fragmented in integration.

Resources

To nurture an ecosystem of partners takes a wealth of resources. Providing technical how-tos, guides, templates, and other resources for MeF providers is essential to the success of the platform. Without proper support, MeF developers and companies are unable to keep up with the complexities and changes of the system. The IRS has provided the resources needed for each step of the e-Filing process, from on-boarding, to understanding the addition of the latest forms, to changes in the tax code.

Market Research Data

Transparency of the MeF platform goes beyond individual platform operations, and the IRS acknowledges this important aspect of building an ecosystem of web service partners. The IRS provides valuable e-File market research to partners by making e-file demographic data and related research and surveys available. This data provides valuable insight for MeF partners to use in their own decision making processes, and also provides the information partners need to educate their own consumers, as well as the general public, about the value the e-File process delivers. Market research is not just something the IRS needs for its own purposes; this research needs to be disseminated and shared downstream, providing the right amount of transparency to ensure healthy ecosystem operations.

Political Building Blocks

Beyond the technology and business of the MeF web services platform, there are plenty of political activities that make sure everything operates as intended. The politics of web service operations can be as simple as communicating properly with partners and providing transparency, all the way up to security, proper governance of web services, and enforcement of federal laws.

Status

The submission of over 230 million tax filings annually requires a significant amount of architecture and connectivity. The IRS provides real-time status of the MeF platform for the public and for partners as they work to support their own clients. Real-time updates keep partners and providers in tune with the availability of the overall system, allowing them to adjust their own operations to the reality of supporting such a large system. Availability status is an essential aspect of MeF operations and overall partner ecosystem harmony.

Updates

An extension of MeF platform status is the ability to keep MeF integrators up-to-date on everything to do with ongoing operations. This includes providing alerts when the platform needs to tune partners in to specific changes in tax law, resource additions, or other relevant operational news. The IRS also provides updates via an e-newsletter, a more asynchronous way for the IRS MeF platform to keep partners informed about ongoing operations.

Updates over the optimal partner channels are an essential addition to real-time status and other resources that are available to platform partners.

Roadmap

In addition to resources, status, and regular updates, the IRS provides insight into where the overall MeF platform is going next, keeping providers apprised of what is coming for the e-File program. Establishing and maintaining the trust of MeF partners in the private sector is constant work, and requires a certain amount of transparency--allowing partners to anticipate what is next and make adjustments on their end of operations. Without insight into what is happening in the near and long term future, trust with partners will erode, and overall belief in the MeF system will be disrupted, unraveling over 30 years of hard work.

Governance

The Modernized e-File (MeF) programs go through several stages of review and testing before they are used to process live returns. When new requirements and functionality are added to the system, testing is performed by IRS's software developers and by IRS's independent testing organization. These important activities ensure that the electronic return data can be received and accurately processed by MeF systems. Every time an IRS tax form is changed and affects the XML schema, the entire development and testing processes are repeated to ensure quality and proper governance.

Security

Secure transmissions by 3rd parties with the MeF platform are handled by the Internet Filing Application (IFA) and Application to Application (A2A), which are part of the IRS Modernized System Infrastructure, providing access to trusted partners through the Registered User Portal (RUP). Transmitters using IFA are required to use their designated e-Services user name and password in order to log into the RUP. Each transmitter also establishes an Electronic Transmitter Identification Number (ETIN) prior to transmitting returns. Once the transmitter successfully logs into the RUP, a Secure Socket Layer (SSL) Handshake Protocol allows the RUP and transmitter to authenticate each other, and to negotiate an encryption algorithm, including cryptographic keys, before any return data is transmitted. The transmitter and the RUP negotiate a secret encryption key for encrypted communication between the transmitter and the MeF system. As part of this exchange, MeF will only accommodate one type of user credential for authentication and validation of A2A transmitters: username and X.509 digital security certificate. Users must have a valid X.509 digital security certificate obtained from an IRS authorized Certificate Authority (CA), such as VeriSign or IdenTrust, and have their certificates stored in the IRS directory using an Automated Enrollment process.
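
For readers unfamiliar with what this mutual authentication looks like from the transmitter's side, here is a hedged Python sketch--the URL, file names, and payload are placeholders, since actual MeF enrollment, credentials, and endpoints are defined by the IRS.

    # A hedged sketch of a mutually authenticated HTTPS session in Python.
    # File names, URL, and payload are placeholders--real MeF enrollment,
    # credentials, and endpoints are defined by the IRS, not shown here.
    import requests

    session = requests.Session()

    # The transmitter presents its X.509 certificate during the handshake...
    session.cert = ("transmitter_cert.pem", "transmitter_key.pem")

    # ...and verifies the server against a trusted CA bundle, so both
    # sides authenticate each other before any return data is transmitted.
    session.verify = "trusted_ca_bundle.pem"

    with open("return_envelope.xml", "rb") as payload:
        response = session.post(
            "https://efile.example.gov/a2a/transmit",
            data=payload,
            headers={"Content-Type": "application/xml"},
        )
    response.raise_for_status()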

The entire platform is accredited by the Executive Level Business Owner, who is responsible for the operation of the MeF system, with guidance provided by the National Institute of Standards and Technology (NIST). The IRS MITS Cyber Security organization and the business system owner are jointly responsible for, and actively involved in, completing the IRS C&A Process for MeF, ensuring the security of all transmissions with MeF over the public Internet.

A Blueprint For Public & Private Sector Partnerships In A 21st Century Digital Economy

The IRS MeF platform provides a technological blueprint that other federal agencies can look to when exposing valuable data and resources to other agencies as well as the private sector. Web services, XML, and proper authentication can open up access and interactions between trusted partners and the public in ways that were never possible prior to the Internet age.

While this web services approach is unique within the federal government, it is a common way to conduct business operations in the private sector--something widely known as Service Oriented Architecture (SOA), an approach that is central to a healthy enterprise architecture. A service oriented approach allows organizations to decouple resources and data, and open up very wide or very granular levels of access to trusted partners. The SOA approach makes it possible to submit forms, data, and other digital assets to government, using XML as a way to communicate and validate information in a way that supports proper business rules, wider governance, and federal law.

SOA provides three essential ingredients for public and private sector partnership:

  • Technology - Secure use of modern compute, storage, and Internet networking technology in a distributed manner
  • Business - Adherence to government lines of business, while also acknowledging the business needs and interests of 3rd party private sector partners
  • Politics - A flexible understanding and execution of the activities involved in establishing a distributed ecosystem of partners, and maintaining an overall healthy balance of operations

The IRS MeF platform employs this balance at a scale that is currently unmatched in the federal government. MeF provides a working blueprint that can be applied across the federal government, in areas ranging from the veterans claims process to the financial regulatory process.

The United States federal government faces numerous budgetary challenges, and must find new ways to share the load with other federal and state agencies, as well as the private sector. An SOA approach like MeF allows the federal government to better interact with existing and future contractors, in a way that provides better governance, while also allowing for partnership with the private sector in ways that go beyond simple contracting. The IRS MeF platform encourages federal investment in self-service platforms that enable trusted and proven private sector partners to access IRS resources in predefined ways--all of which support the IRS mission, while providing enough incentive that 3rd party companies will invest their own money and time into building software solutions that can be fairly sold to US citizens.

When an agency builds an SOA platform, it is planting the seeds for a new type of public / private partnership whereby government and companies can work together to deliver software solutions that meet a federal agency's mission and the market needs of companies. This also delivers value and critical services to US citizens, all the while reducing the size of government operations, increasing efficiencies, and saving the government and taxpayers money.

The IRS MeF platform represents 27 years of laying a digital foundation, building the trust of companies and individual citizens, and properly adjusting the agency's strategy to work with private sector partners, employing best of breed enterprise practices from the private sector. MeF is a blueprint that cannot be ignored, and deserves more study, modeling, and evangelism across the federal government--helping other agencies understand how they too can employ an SOA strategy that will better serve their constituents.

You Can View, Edit, Contribute Feedback To This Research On Github

See The Full Blog Post


API Monetization In The Internet of Things @ Nordic APIs

I have a panel this week at Nordic APIs called Business Models in an Internet of Things, with Ellen Sundh (@ellensundh) of Coda Collective, David Henricson Briggs of Playback Energy, Bradford Stephens of Ping Identity, and Ronnie Mitra (@mitraman) of Layer 7 Technologies. My current abstract for the panel is:

We are just beginning to get a hold on monetization strategies and business models for APIs delivering data and resources for mobile development. How will we apply what we have learned to the Internet of Things, across our homes, vehicles, sensors, and other Internet enabled objects being integrated into our lives?

In preparation for the event I am working through my thoughts around potential monetization strategies and business models that will emerge in this fascinating and scary new world where everything can be connected to the Internet--creating an Internet of Things (IoT).

Where Is The Value In The IoT?
When it comes to monetizing APIs of any type, there first has to be value. When it comes to IoT, where is the value for end-users? Is it the devices themselves, is it the ecosystem of applications built around a device, or will it be the insight derived from the data exhaust generated by these Internet connected devices?

Evolving From What We Know
After almost 10 years of operating web APIs, we are getting a handle on some of the best approaches to monetization and building business models in this new API economy. How much of this existing knowledge will transfer directly to the IoT? Freemium, tiered plans, paid API access, advertising--which of these existing models will work, and which won't?

Another existing model to borrow from when it comes to IoT is the telco space. The world of cell phones and smartphones is the seed of IoT, and one of the biggest drivers of the API economy. How will existing telco business models be applied to the world of IoT? Device subsidies, contracts, data plans, and message volumes are all possible things that could be borrowed from the existing telco world, but we have to ask ourselves: what will work and what won't?

Will Developers Carry the Burden?
When it comes to API access, developers often pay for access and the privilege of building applications on top of API driven platforms. Will this be the case in the IoT? Will the monetization of IoT platforms involve charging developers for API usage, number of users and features? Is this a primary channel for IoT device makers to make money off their products? In the beginning this may not be the case, with providers needing to incentivize developers to build apps and crunch data, but it is likely that eventually developers will have to carry at least some of the burden.

Micro-Payment Opportunities
The payment industry is booming in the API economy, but micro-payments are still getting their footing, doing better in some areas than others. Certain areas of IoT may lend themselves to micro-payment approaches to monetization. When you pass through a toll booth or pay for parking, there are clear opportunities for micro-payments to engage with Internet connected automobiles. Beyond the obvious, think of the opportunities for traffic prioritization--do you want intelligence on where you should drive to avoid traffic, or possibly to pay per mile to be in a preferred lane? Another area is entertainment: generating revenue from delivering music, audiobooks, and other entertainment to drivers and passengers in IoT vehicles and public transportation.

Will IoT Be All About The Data?
As we sit at the beginning of the era of big data, driven by mobile, social, and the cloud, what will big data look like in the IoT era? Will the money be all about the data exhaust that comes from a world of Internet connected devices--not just at the individual device level, with insight delivered to users, but at the aggregate level, understanding parking patterns for entire cities or the electricity consumption of a region?

Security Will Be Of High Value In IoT
We are already beginning to see the importance of security in the IoT world, with missteps by Tesla and camera maker TRENDnet. Will security around IoT be a monetization opportunity in itself? Device manufacturers will be focused on doing what they do best, and will often overlook security, leaving open huge opportunities for companies to step up and deliver B2B and B2C security options and layers for IoT. How much will we value security? Will we pay extra to ensure the devices in our lives are truly secure?

I Will Pay For My Privacy In An IoT World
When all the devices in my life are connected to the Internet, and the world around me is filled with cameras, sensors, and tracking mechanisms, how will privacy change? Will we have the opportunity to buy privacy in an IoT world? Will the wealthy be able to pay for the privilege of being lost in a sea of devices, not showing up on cameras, and being passed by when sensors are logging data? Privacy may not be a right in an IoT world; it may be purely something you get if you can afford it. Will companies establish IoT business models and drive monetization through privacy layers and opportunities?

An IoT Las Vegas for Venture Capital
With IoT centered around costly physical devices, and potentially large platforms and networks, will anything in the IoT space be able to be bootstrapped like the web 2.0 and mobile spaces were? Or will all IoT companies require venture capital? At first glance IoT looks like a huge opportunity for VC firms, allowing them to specialize for the win, or to gamble on the space like they would in Las Vegas.

Will We Plan For Monetization Early On In IoT?
When it comes to IoT, it is easy to focus on monetizing the physical device, either leaving money on the table by missing new and innovative ways of generating revenue, or deploying monetization strategies that operate behind the scenes and are not obvious to users--something that could be damaging to security, privacy, and overall trust in the IoT space.

We learned a lot from the mistakes made in early social, cloud, and mobile API monetization. We need to make sure we have open conversations around healthy IoT business models and monetization strategies. Generating revenue from IoT needs to be a 3-legged endeavor that includes not just IoT platform providers, but sensibly includes ecosystem developers as well as end-users.

The world of IoT is just getting going, but it is picking up momentum very quickly. We are seeing IoT devices enter our homes, cars, clothing, and bodies, and they will become ubiquitous in the world around us, embedded in signs, doorways, roadways, and products, in rural and metropolitan areas alike. It is clear there are huge opportunities to make money in this new Internet connected world, but let's make sure we have open conversations about how this can be done in sensible ways, so the IoT space grows in a healthy and vibrant way.

See The Full Blog Post