Thinking Outside the Portal

Plug and play

Photo courtesy of Flickr user Jonathan Khoo.

The centerpiece of any government open data effort is usually a data portal.

Data portals host open data or provide listings for datasets, and typically include things like license information, data schemas, developer documentation and a host of other details aimed at making it easier for end consumers to find and use data. There is a great deal of variation in the degree to which government data portals succeed in making it easier for people to find and use data – some do it very well, others do not.

In some ways, an open data portal can simultaneously represent all that is right with a government open data program – all the potential it has for changing the way government works and how people interact with their government – as well as all of its limitations.

I am not unbiased in this opinion.

Read More

Civic Onboarding

Photo courtesy of Flickr user Chuck Simmins

Tomorrow, the President will speak at SXSW and issue a call to action for people inside and outside government to collaborate and solve the hard problems facing our country. This is a call to action that governors and mayors should echo – our communities are filled with people who want to help.

Reading about the Community Diaper Project and other efforts to get affordable disposable diapers to low-income families got me so excited about recruiting those from outside government to work on hard problems that I forgot I had written about this before. I missed a lot in my first post.

Read More

Participation and the Cult of Catalogs

“Anonymous access to the data must be allowed for public data, including access through anonymous proxies. Data should not be hidden behind ‘walled gardens.’”
8 Principles of Open Government Data

In the world of open data, there are few things that carry more weight than the original 8 principles of open data.

Drafted by a group of influential leaders on open data who came together in Sebastopol, CA in 2007, this set of guidelines is the de facto standard for evaluating the quality of data released by governments, and is used regularly by activists to prod public organizations to become more open.

With this in mind, it was intriguing to hear a well-known champion of open data at the Sunlight Foundation’s recent Transparency Camp in Washington DC raise some interesting questions about one of these principles, typically considered sacrosanct in the open data community.

Read More

Why 18F’s New Approach to Procurement Reform Matters

Fast Track

Image courtesy of Flickr user lchunt.

In another recent post, I talked about how public sector technology procurement was not well suited for the digital age.

But there are some efforts underway that seek to identify new methods of procuring technology solutions for government. As these ideas start to take hold, there is hope that those in the govtech community will create a set of strategies for more successfully implementing public sector technology solutions.

Built to fail?

Procurement policies used by our federal, state, and local governments are designed to encourage broad participation by requiring things like public announcements of solicitations, open vendor meetings, and fixed deadlines for submitting responses. The hope is to make the process as predictable and transparent as possible, and to level the playing field so that any qualified vendor can participate.

Procurement policies are also designed to mitigate risk and to ensure that selected vendors are competent and capable of undertaking the work required in government IT projects. It is important that these policies be consistent with the government’s duty to be responsible stewards of public resources.

In light of this, there is a great irony in the outcomes that these policies often seem to produce.

Read More

GovTech is Not Broken

When we talk about the challenges that face governments in acquiring and implementing new technology, the conversation eventually winds around to the procurement process.

That’s when things usually get ugly. “It’s broken,” they say. “It just doesn’t work.”

What most people who care about this issue fail to recognize, however, is that while the procurement process for technology may not work well for governments or prospective vendors (particularly smaller, younger companies), it is not broken.

It works exactly as it was designed to work.

Read More

Open Data Beyond the Big City

This is an expanded version of a talk I gave last week at the Code for America Summit.

An uneven future

“The future is already here – it’s just not evenly distributed.”
William Gibson. The Economist, December 4, 2003

The last time I heard Tim O’Reilly speak was at the Accela Engage conference in San Diego earlier this year. In his remarks, Tim used the above quote from William Gibson – it struck me as a pretty accurate way to describe the current state of open data in this country.

Open data is the future – of how we govern, of how public services are delivered, of how governments engage with those that they serve. And right now, it is unevenly distributed. I think there is a strong argument to be made that data standards can provide a number of benefits to small and mid-sized municipal governments and could provide a powerful incentive for these governments to adopt open data.

One way we can use standards to drive the adoption of open data is to partner with companies like Yelp, Zillow, Google and others that can use open data to enhance their services. But how do we get companies with tens and hundreds of millions of users to take an interest in data from smaller municipal governments?

In a word – standards.
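To make the idea concrete, consider a real standard like GTFS (the General Transit Feed Specification, which Google Maps consumes). Because the standard fixes the file names and column headers, a consumer written once works against any city’s feed – large or small. A minimal sketch in Python (the feed excerpt below is invented for illustration):

```python
import csv
import io

# A tiny excerpt in the GTFS stops.txt format. The column names come from
# the GTFS standard; the stop records themselves are invented.
stops_txt = """stop_id,stop_name,stop_lat,stop_lon
1001,Main St & 1st Ave,39.9526,-75.1652
1002,Main St & 2nd Ave,39.9531,-75.1660
"""

# Because GTFS fixes the schema, this same parser works for any transit
# agency's feed, whether it comes from a big city or a small one.
stops = list(csv.DictReader(io.StringIO(stops_txt)))
for s in stops:
    print(s["stop_id"], s["stop_name"])
```

This is the leverage a standard provides: a company (or a civic hacker) writes the consuming code once, and every additional city that publishes to the standard gets picked up for free.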

Why do we care about cities?

When we talk about open data, it’s important to keep in mind that there is a lot of good work happening at the federal, state and local levels all over the country. Plenty of states and even counties are doing good things on the open data front, but for me it’s important to evaluate where we are on open data with respect to cities.

States typically occupy a different space in the service delivery ecosystem than cities, and the kinds of data that they typically make available can be vastly different from city data. State capitols are often far removed from our daily lives and we may hear about them only when a budget is adopted or when the state legislature takes up a controversial issue.

In cities, the people who represent and serve us can be our neighbors – the guy behind you at the car wash, or the woman whose child is in your son’s preschool class. Cities matter.

As cities go, we need to consider carefully the importance of smaller cities – there are a lot more of them than large cities and a non-trivial number of people live in them.

If we think about small to mid-sized cities, these governments are central to providing a core set of services that we all rely on. They run police forces and fire services. They collect our garbage. They’re intimately involved in how our children are educated. Some of them operate transit systems and airports. Small cities matter too.

Big cities vs. small cities on open data

So if cities are important – big and small – how are they doing on open data? It turns out that big cities have adopted open data with much more regularity than smaller cities.

If we look at data from the Census Bureau on incorporated places in the U.S. and information from a variety of sources on governments that have adopted open data policies and make open data available on a public website, we see the following:

Big Cities:

  • 9 of the 10 largest US cities have adopted open data.
  • 19 of the top 25 most populous cities have adopted open data.
  • Of cities with populations > 500k, 71% have adopted open data.

Small Cities:

  • There are 256 incorporated places in the U.S. with populations between 100k and 500k.
  • Only 39 have an open data policy or make open data available.
  • A mere 15% of smaller cities have adopted open data.

The data behind this analysis is here. As we can see, it shows a markedly different adoption rate for open data between large cities (those with populations of 500,000 or more) and smaller cities (those with populations between 100,000 and 500,000).
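The small-city adoption rate quoted above is simple arithmetic:

```python
# Adoption counts quoted above for U.S. incorporated places
# with populations between 100,000 and 500,000.
adopted, total = 39, 256

rate = adopted / total
print(f"{rate:.0%}")  # → 15%
```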

Why is this important?

We could chalk up this difference to the fact that big cities simply have more data. They may have more people asking for information, which can drive the release of open data. They have larger pools of technologists, startups and civic hackers to use the data. They may have more resources to publish open data, and to manage communities of users around that data.

I don’t know that there is one definitive answer here – there’s ample room for discussion on this point.

We should care about this because – quite simply – a lot of people call smaller cities home. If we add up the populations of the 256 places noted above with populations between 100,000 and 500,000, it actually exceeds the combined population of the 34 largest cities (with populations of 500,000 or more) – 46,640,592 and 41,155,553 respectively. Right now these people are potentially missing out on the many benefits of open data.

But more than simple math: if one of the virtues of our approach to democracy in this country is that we have lots of governments below the federal level to act as “laboratories of democracy,” then we’re missing an opportunity here. If we can get more small cities to embrace open data, we can encourage more experimentation, and we can evaluate the kinds of data that these cities release and what people do with it. We can learn more about what works – and what doesn’t.

In addition, we now know that open data is one tool that can be used to help address historically low trust in government institutions. It’s not hard to find smaller governments in this country that could use all the help they can get in repairing relations with those they serve.

How do we fix this?

There are at least a few things we can do to address this problem.

First, we need more options for smaller governments to release open data. We’re not going to make progress in getting smaller governments to adopt open data if the cost of standing up a data portal has the same budget impact as the salary for a teacher, or a cop, or a firefighter, or a building inspector – I just don’t think that’s sustainable.

Equally important, we need to work on developing useful new data standards. This won’t always be easy, but it’s important work and we need to do it.


For smaller cities without the deep technology, journalism and research communities that can help drive open data adoption, data standards are a way to export civic technology needs to larger cities. I believe they are critical to driving adoption of open data in the many small and midsized cities in this country.

We’ve already seen what open data looks like in big cities, and they are already moving to take the next steps in the evolution of their open data programs – but smaller cities risk getting left behind.

The next frontier in open data is in small and mid-sized cities.

The Philadelphia Experiment

Three years ago next month, the City of Philadelphia’s open data portal was launched by local technology firm Azavea as part of the inaugural Philly Tech Week. Two years ago next month, Philadelphia joined the small (but growing) fraternity of cities to adopt a formal open data policy – a milestone that stands as one of the first in something I think of as an “experiment” in municipal transparency.

Since this experiment first began, our open data efforts have come a very long way, and Philadelphia is now looked at as a national leader on open data and civic technology. In many ways it is now time for a new chapter in the Philadelphia experiment – our city is ready to take open data to the next level.

For me, though, it is time to step back and take a different role in the Philadelphia technology community and the broader open data effort – both here and in other places.

How Far We’ve Come

In the last several years, Philadelphia has grown into a national leader on open data and civic technology. We enthusiastically share our experience and our ideas with other municipalities that want to kick off their own open data experiment.

Our open data portal contains dozens of new data sets added since the adoption of our open data policy, some released for the very first time by city government. The OpenDataPhilly platform has inspired efforts in other cities that look to us as a model for how to implement their open data efforts.

We have embraced not only open data, but also open source software and we are actively sharing both our code and our data on platforms like GitHub as a way to engage with the broader data community and tap into communities of innovators. The City of Philadelphia now has more public GitHub repos than any other city in the country, and more are on the way.

We’re leading the discussion on the development of new municipal data standards that will help speed the adoption of other open data releases, and enhance the value of civic technology.

We’re finding innovative new ways to engage with local technology vendors by making opportunities for smaller technology projects more visible. We’re using GitHub and other platforms to make responding to technology procurement opportunities easier, more engaging and more valuable.

We have a growing suite of developer-ready APIs that can be used to quickly and easily create powerful new civic apps. And, most importantly, we’re collaborating with one of the most active civic hacking communities in the country and helping to enable the building of all sorts of new awesome apps.

The Road Ahead

Philadelphia is ready for the next chapter in its open data experiment.

The milestones that lay ahead will see the release of important new data sets that will enable more meaningful and more sustainable civic engagement in our city. These new data releases and APIs will also help create the foundation for unprecedented collaboration between city departments and other governments. No force has more potential to enable significant improvements in government efficiency and operations than open data.

I’ll be watching all of this unfold from a new vantage point – as a member (once again) of the Philly technology and civic hacking communities. In early April, I’ll step down from my post as Chief Data Officer and return to being a professional technologist.

I’ll be talking more in the weeks ahead about exactly what I’ll be doing, but I’ll still be working with governments and open data. I’ll also still be in Philly – my wife and I will continue to live and work here, and our kids will still go to school here.

It has truly been an honor and a privilege to serve this great city, and I owe a debt of gratitude to Managing Director Rich Negrin and CIO Adel Ebeid for the opportunity. I look forward to continuing to serve, but from outside city government.

Philly’s open data effort has always been bigger than any one person – it’s always been about the community. It’s now time for me to rejoin that community.

See you at the next hackathon.

On Sustainable Civic Technology

Sustaining civic technology will mean that both government’s IT infrastructure and the civic technology sector that builds on it will need to change.

A pair of recent blog posts caught my eye and highlighted this theme in my head, and motivated me to capture a few thoughts on this topic.

The first post was by Dan O’Neil, Executive Director of the Smart Chicago Collaborative. Dan’s post on the things that need to happen to drive the “…maturation of the civic innovation sector of the technology industry” is worth the read. It’s a great post that helps highlight the connection between legacy IT systems in government and the open data that is used to build civic technology.

“[without] existing legacy gov IT systems, there would be no civic tech. The data we use for our civic tech projects doesn’t get collected, managed, and exported by itself.”

The second post is from the Bay Area design firm Stamen, and highlights the need for data visualizations (and other bits of civic technology that rely on open data) to be maintained over time. The post uses the analogy of gardening to underscore the need to constantly update and maintain the things we build with data.

“If you plant a flower in a garden and then never give it water or light, it will in fact die. Unless, of course, it happens to be placed in just the perfect spot, in which case it will need to be pruned. Either way, some kind of tending is always required.

We don’t always think of digital works in the same way, perhaps because their metaphor of creation more closely resembles that of a built object, like a bookcase or building. But even buildings need maintenance, and after so many years, the shelves on the bookcase may falter and need new ones. We need to be more conscious about this aspect of dynamic data visualization, at the outset.”

These two posts call out the need for both the government technology systems that lie at the foundation of open data and the civic technology that uses this data to get better. To mature. To advance to the next level.

For me, the takeaways from these two posts are pretty clear.

Technology and civic apps are like gardening – except when they’re not.

I like the analogy of gardening applied to investment in technology solutions. It underscores the need to continually invest in the maintenance and upkeep of the solutions that are used to consume government data. However, I think there is some danger in using this analogy, in that it obscures the fundamental nature of change in the technology world.

I can tend my garden today in pretty much the same way as someone did 100 years ago – soil, water and sunshine. Boom. However, when building a technology solution (particularly a web-based solution), it’s not feasible to use the same approach or components that might have been acceptable as recently as 10 years ago. Budgeting for maintenance and enhancements for technology projects isn’t a nice-to-have – it’s absolutely essential to their success. The pace of change in the world of technology is just too fast for anything less.

We have to fix the budget process as it relates to investment in technology.

I’ve harped on this before, but we most definitely need to get better at helping non-technology people in government (particularly budget officials) understand the need for continuous support and maintenance of IT projects. Correction – we need to help them understand technology better in general. Not since the first season of The Bachelor has there been a worse match than the current public budgeting process and the pace of change in the world of technology.

To understand how acute this problem is, ask someone you know who works in state or local government whether they (or someone they work with) interact as part of their job with a technology system that is 10 years old or older. I think the responses would be surprising to many people, and rather alarming to most professional technologists.

Government needs to embrace its role as a data steward

Governments need to understand their role in the civic technology production chain. The current public budgeting and procurement processes make us lousy at investing in the kinds of technology that change rapidly – like those that power web-based solutions.

When I read the post by Stamen about their Crimespotting project with the City of Oakland, I see a city that hasn’t accepted its role as a mature data steward:

“…over the years, the project started wavering. Oakland’s API has sputtered to the point of being nonfunctional, rendering Oakland Crimespotting totally spotless.”

If governments are serious about open data, they need to invest in systems that will make their data easy to consume and readily available – without this, the civic technology sector won’t go far.

I think part of this is accepting the role of data steward, and engineering our IT infrastructure accordingly. I don’t know if Stamen’s was the only project consuming the Oakland crime data API, but building a robust community of users around open data can help public officials see the importance of investing to keep these systems stable and available.

Don’t underestimate the power of open source

When it comes to helping ensure that civic technology solutions continue to thrive after launch, there is almost nothing better than leveraging the power of open source. There’s nothing wrong with closed source solutions built on top of open data, but if governments are paying vendors for custom solutions to display open data, we should insist that the underlying code be open sourced. This can go a long way toward helping ensure that it continues to evolve over time, as civic technologists contribute fixes and enhancements and (potentially) as other governments fork these solutions for their own use.

It’s really cool to see the conversation around civic technology and open data focused on how we can take the great work that has been done to the next level. I think it means we’re ready for the next steps.

Onboarding Civic Hackers

Earlier this week, I had the pleasure of attending a civic hacking event jointly organized by Code for Philly and Girl Develop It Philly. The event had a tremendously good turnout – over 50 people by my count – making it one of the larger events Code for Philly has organized in recent months.


The mission of Girl Develop It is to empower women to learn software development, and as a result there were a good number of people at the event being introduced to civic hacking for the first time. This got me thinking about ways to onboard people new to civic hacking (and people new to coding) into civic technology projects.

None of these is new, but here are five ideas I came up with after the event:

Data Liberation – the foundation of any civic hacking project is open data, and far too much of the data civic hackers need is locked up in broken websites and unusable formats. Helping to break some of this data free can be a tremendous benefit to open data users and civic hacking projects.

Documentation – far too many open source and civic hacking projects go without proper documentation to help other developers contribute and to support end users. Helping to create or expand documentation for a project can be critical to helping it succeed.

User Testing – Organizing and conducting end user testing for civic technology projects is sadly rare. There are some efforts underway to change this but in order for civic hacking projects to improve and succeed we need real feedback from mainstream users.

Outreach – One legitimate criticism of civic apps is that too few people know they exist. There are efforts working to change this, like Apps for Philly (still in its infancy) – a site that lists a host of different civic technology apps that are available for users. Adding new projects to this listing (and others like it) will both help these projects succeed and give the person doing it a much clearer sense of the civic technology landscape.

Helper Libraries – a great way to get comfortable writing code and to help out a civic technology project is to write helper libraries for projects with APIs. At the Apps for Philly Transit Hackathon, one project utilized recently released data from the City of Philadelphia on bicycle thefts. The lead developer created a new API for this data to enable other projects to use it. Building new client libraries in a range of different languages would be a great way to support other developers that want to incorporate bike theft data into their projects, and to get some hands on experience writing code.
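As a sketch of what a minimal helper library can look like, here is a tiny Python client for a hypothetical civic data API – the endpoint and parameters are invented for illustration; a real library would target an actual API’s documented routes:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

class BikeTheftClient:
    """Minimal helper-library sketch for a hypothetical bike-theft data API."""

    # Hypothetical endpoint -- not a real City of Philadelphia URL.
    BASE_URL = "https://api.example.org/bike-thefts"

    def build_url(self, **filters):
        # Turn keyword filters (e.g. year=2013) into a stable query string.
        return f"{self.BASE_URL}?{urlencode(sorted(filters.items()))}"

    def fetch(self, **filters):
        # Perform the request and decode the JSON payload.
        with urlopen(self.build_url(**filters)) as resp:
            return json.load(resp)

client = BikeTheftClient()
print(client.build_url(year=2013, limit=10))
# → https://api.example.org/bike-thefts?limit=10&year=2013
```

Even a wrapper this small spares other developers from hand-building query strings and parsing responses, which is exactly the kind of contribution a newcomer can make without touching the API itself.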

There are so many ways to contribute to open source projects and to help support civic hacking efforts – these are just a few.
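The “Data Liberation” idea above can start very small – for example, scraping a table out of a government web page and re-emitting it as CSV. A sketch using only the Python standard library (the HTML snippet is invented):

```python
from html.parser import HTMLParser

class TableLiberator(HTMLParser):
    """Collect the rows of HTML <table> markup into plain Python lists."""

    def __init__(self):
        super().__init__()
        self.rows, self.row, self.in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self.row = []            # start a fresh row
        elif tag in ("td", "th"):
            self.in_cell = True      # capture text until the cell closes

    def handle_endtag(self, tag):
        if tag == "tr" and self.row:
            self.rows.append(self.row)
        elif tag in ("td", "th"):
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell:
            self.row.append(data.strip())

# Invented example page -- imagine a permits listing trapped in HTML.
page = ("<table><tr><th>permit</th><th>status</th></tr>"
        "<tr><td>1234</td><td>open</td></tr></table>")

parser = TableLiberator()
parser.feed(page)
for row in parser.rows:
    print(",".join(row))   # emits CSV: "permit,status" then "1234,open"
```

Real pages are messier than this, but the shape of the work is the same: find where the data lives, pull it out, and publish it in a format other projects can actually use.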

We need more great events like the one organized by Code for Philly and Girl Develop It Philly to bring together all of the talented people we have in our city to work on these important projects.

Five Ways to Make Government Procurement Better

Nothing in recent memory has focused attention on the need for wholesale reform of the government IT procurement system more than the troubled launch of HealthCare.gov.


There have been a myriad of blog posts, stories and articles written in the last few weeks detailing all of the problems that led to the ignominious launch of the website meant to allow people to sign up for health care coverage.

Though the details of this high profile flop are in the latest headlines, the underlying cause has been talked about many times before – the process by which governments contract with outside parties to obtain IT services is broken.

The process clearly does not work well for governments – and many would argue that it does not work well for vendors either. Typical government procurement rules shut out many small firms that may be well positioned to execute on a government IT project but lack the organizational endurance to make it through the selection process.

What are the problems with the government procurement process and how can we fix them? How can we foster more participation in the procurement process by more firms that are qualified to do work? How can we lower the prices that governments pay for IT products and services and help ensure better quality?

Here’s my take on the major problems with the current system, and five recommendations for how we can start to fix them.

The procurement process is too costly and complex

The most under-appreciated characteristic of the government procurement process as it exists today is that its current design is largely intentional. Much like the federal and state income tax systems, we imbue a number of values deemed important into our procurement processes in the hopes of fostering favorable outcomes.

Requirements for women- and minority-owned business participation; requirements that governments utilize local vendors; requirements that governments favor vendors with any number of different traits or qualities – no one would dispute that these requirements are aimed at producing positive outcomes. But the price of using the procurement system as the vehicle for achieving these ends is a process that is more complex for all participants. This is true even if the positive outcomes desired are not realized, or are realized to a lesser extent than hoped for.

The added complexity borne by all participants doesn’t guarantee the intended results.

Probably the most overarching value we imbue in the procurement system is risk aversion – in fact, much of the complexity and cost of the current system (for both governments and vendors) can be attributed to the desire to reduce the risk assumed by governments when partnering with outside firms.

Requirements like proposal and performance bonds, professional liability insurance coverage and the submission of audited financial statements (sometimes several years worth) – to name just a few – are all meant to offset the risk governments assume when they outsource work to a vendor. In some respects, these requirements can be deemed beneficial to governments and indicate prudent stewardship of public resources.

However, the widespread use of these and similar requirements by governments at all levels doesn’t seem to have lessened the incidence of IT project failures, and they can drive up the cost of IT projects by pricing smaller firms out of the bidding process. For example, many small firms argue that the bond industry doesn’t have a good understanding of how to underwrite IT projects, so it can be difficult to get a performance bond as part of a bid response. The result – many smaller firms that may be well qualified to do the work simply walk away from the process. This is true even if these provisions don’t lower the actual risk assumed by governments as part of large IT projects.

Again, the added complexity and cost borne by all participants doesn’t guarantee the intended results.

The use of these risk mitigation provisions at all levels of government helps highlight a more fundamental problem that is at the heart of the broken procurement process in government – a shocking lack of IT knowledge within government.

A much better way for governments to hedge against the risks inherent in any large project – particularly IT projects – is to develop the internal capacity to manage and implement them successfully.

The procurement process moves too slowly

This is actually a fair criticism of government in general, but it is particularly acute during the procurement process. To illustrate the magnitude of the issue, consider that in the City of Philadelphia the period between a contract award (a vendor gets selected to work on a project) and the final execution of that contract (the beginning of actual work) can average four months for some projects.

We can make the same observation about the pace at which governments operate in general that we made about the procurement process – much of this is by design. But long procurement and budget cycles can drive away smaller firms that cannot afford to invest significant resources in projects that don’t provide a return for months on end.

There are usually requirements for government RFPs to be posted publicly and widely advertised, and that all interested parties have an equal opportunity to participate through public bid notices and open vendor meetings. All of these requirements are meant to enhance the transparency of the bidding process but they can also add to the time it takes to select a vendor for a project, and for the vendor to begin work.

An emphasis on transparency in the procurement process is undoubtedly a good thing. But there are steps governments might take to dramatically enhance the transparency of the public procurement process without adding steps that slow the process down and discourage many prospective vendors from participating.

Some things we can do to make procurement better

With all of this in mind, here are – in no particular order – five suggested changes that can be adopted to improve the government procurement process.

Raise the threshold on simplified / streamlined procurement

Many governments use a separate, more streamlined process for smaller projects that do not require a full RFP (in the City of Philadelphia, professional services projects that do not exceed $32,000 annually go through this more streamlined bidding process). In Philadelphia, we’ve had great success in using these smaller projects to test new ideas and strategies for partnering with IT vendors. There is much we can learn from these experiments, and a modest increase to enable more experimentation would allow governments to gain valuable new insights.

Narrowing the focus of any enhanced thresholds for streamlined bidding to web-based projects would help mitigate risk and foster a quicker process for testing new ideas.

Identify clear standards for projects

Having a clear set of vendor-agnostic IT standards to use when developing RFPs and in performing work can make a huge difference in how a project turns out. Clearly articulating standards for:

  • The various components that a system will use.
  • The environment in which it will be housed.
  • The testing it must undergo prior to final acceptance.

…can go a long way toward reducing the risk and uncertainty inherent in IT projects.

It’s worth noting that most governments probably already have a set of IT standards that are usually made part of any IT solicitation. But these standards documents can quickly become out of date – they must undergo constant review and refinement. In addition, many of the people writing these standards may confuse a specific vendor product or platform with a true standard.

Require open source

Requiring that IT projects be open source during development or after completion can be an effective way to reduce risk on an IT project and enhance transparency. This is particularly true of web-based projects.

In addition, government RFPs should encourage the use of existing open source tools – leveraging existing software components that are in use in similar projects and maintained by an active community – to foster external participation by vendors and volunteers alike. When governments make the code behind their project open source, they enable anyone that understands software development to help make them better.

Develop a more robust internal capacity for IT project management and implementation

Governments must find ways to develop the internal capacity for developing, implementing and managing technology projects.

Part of the reason that governments make use of a variety of different risk mitigation provisions in public bidding is that there is a lack of people in government with hands on experience building or maintaining technology. There is a dearth of makers in government, and there is a direct relationship between the perceived risk that governments take on with new technology projects and the lack of experienced technologists working in government.

Governments need to find ways to develop a maker culture within their workforces and should prioritize recruitment from the local technology and civic hacking communities.

Make contracting, lobbying and campaign contribution data public as open data

One of the more disheartening revelations to come out of the analysis of the HealthCare.gov implementation is that some of the firms that were awarded work as part of the project also spent non-trivial amounts of money on lobbying. It’s a good bet that this kind of thing happens at the state and local level as well.

This can seriously undermine confidence in the bidding process, and may cause many smaller firms – who lack funds or interest in lobbying elected officials – to simply throw up their hands and walk away.

In the absence of statutory or regulatory changes to prevent this from happening, governments can enhance the transparency around the bidding process by working to ensure that all contracting data as well as data listing publicly registered lobbyists and contributions to political campaigns is open.
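Once those datasets are published, even a very simple cross-check becomes possible. A sketch with invented records showing the kind of question open contracting and lobbying data lets anyone ask:

```python
# All records below are invented for illustration -- in practice these would
# come from a government's published contracting and lobbyist-registration data.
contracts = [
    {"vendor": "Acme IT", "award": 1_200_000},
    {"vendor": "Widget Co", "award": 300_000},
]
lobbyist_registrations = [
    {"client": "Acme IT", "spent": 50_000},
]

# Which contract winners also paid registered lobbyists?
lobbying_clients = {r["client"] for r in lobbyist_registrations}
flagged = [c["vendor"] for c in contracts if c["vendor"] in lobbying_clients]
print(flagged)  # → ['Acme IT']
```

The point is not that such overlap proves wrongdoing – it is that when the data is open, journalists, watchdogs and competing vendors can run this check themselves instead of taking the fairness of the process on faith.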

Ensuring that all prospective participants in the public bidding process have confidence that the process will be fair and transparent is essential to getting as many firms to participate as possible – including small firms more adept at agile software development methodologies. More bids typically equates to higher quality proposals and lower prices.

None of the changes listed above will be easy, and governments are positioned differently in how well they may achieve any one of them. Nor do they represent the entire universe of things we can do to improve the system in the near term – these are items that I personally think are important and very achievable.

One thing that could help speed the adoption of these and other changes is the development of a robust communication framework between government contracting and IT professionals in different cities and states. I think a “Municipal Procurement Academy” could go a long way toward achieving this.

Look for more details on a Municipal Procurement Academy – to train state and local officials on best practices for IT procurement – in a future post.

[Note – picture courtesy of Flickr user Curtis Perry]