Amplifying Administrative Burden

How Poor Technology Choices Can Magnify the Challenges Faced by Those Seeking Government Services

For every 10 people who said they successfully filed for unemployment benefits during the previous four weeks, three to four additional people tried to apply but could not get through the system to make a claim. Two additional people did not try to apply because it was too difficult to do so. When we extrapolate our survey findings to the full five weeks of UI claims since March 15, we estimate that an additional 8.9–13.9 million people could have filed for benefits had the process been easier. [Emphasis added]

Unemployment filing failures: New survey confirms that millions of jobless were unable to file an unemployment insurance claim. Economic Policy Institute

The impact on jobs and our economy from the ongoing COVID-19 pandemic, and the attempts by our government to provide relief for those impacted through the CARES Act, have brought the issue of administrative burden into sharp focus. Administrative burden can be succinctly defined as “an individual’s experience of a policy’s implementation as onerous.”

In their book Administrative Burden: Policymaking by Other Means, Pamela Herd and Donald Moynihan define administrative burden as the aggregate learning cost, compliance cost, and psychological cost of meeting the requirements to access a government service or benefit. As we watch, almost in real time, state benefit systems fail again and again under the increased demand for benefits spurred by the crisis, we see that the cost of this burden is steep indeed.

Turning Burdens up to 11

A key takeaway for me from the writing of Herd and Moynihan is that efforts to reduce administrative burden can have a reinforcing effect. Efforts running in parallel to ease burdens can sometimes amplify each other:

…program simplification around the brand of BadgerCare [the name for the State of Wisconsin’s Medicaid program] lessened burdens not only by reducing the number of forms an applicant had to complete, but also by facilitating marketing and outreach efforts around a single easily identifiable non-stigmatizing program. Any approach to reducing burdens should therefore consider how different techniques can work together. [Emphasis added]

It seems obvious now that this amplifying effect can work the other way as well, and that many of the issues we are seeing with state benefit systems are a result of this form of “burden reinforcement.”

Poor technology choices — many of them easily avoidable and relatively easy to correct — are reinforcing the existing administrative burden faced by applicants. The impact of these technology choices is often inadvertent, and distinct from technology implementations that may be made specifically to enhance burdens, and reduce access to programs or benefits.

What are some examples? This is far from an exhaustive list, but some that easily come to mind are:

  • Dynamically generating web pages with information about the ongoing crisis, rather than serving static pages or using optimized caching.
  • Relying on browser-specific functionality, or failing to optimize for mobile or low-bandwidth situations.
  • Sites not optimized for disability access.
  • Sites not designed to scale out to meet demand in times of heavy use.
  • Poorly designed or confusing IVR trees.
  • Poorly advertised contact, office location, or mailing address information.
  • Poorly designed, confusing, or opaque content on program details or requirements.
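To make the first item on this list concrete, here is a minimal, hypothetical sketch (the function and values are illustrative, not drawn from any actual state system) of the caching fix: a pre-rendered informational page served with cache headers that let browsers and shared caches absorb repeat traffic during a demand spike, rather than regenerating the page for every request.

```python
# Hypothetical sketch: build response headers for a pre-rendered,
# static crisis-information page. With "public, max-age" set, shared
# caches (CDNs) and browsers can answer repeat requests themselves,
# so a spike in demand never reaches the page-generation code.

def cache_headers(max_age_seconds: int = 300) -> dict:
    """Headers for a static informational page; max_age_seconds
    trades cache efficiency against freshness of the guidance."""
    return {
        "Content-Type": "text/html; charset=utf-8",
        # Shared caches may store the page for max_age_seconds.
        "Cache-Control": f"public, max-age={max_age_seconds}",
    }

print(cache_headers()["Cache-Control"])  # → public, max-age=300
```

Even a short max-age of a few minutes can collapse thousands of simultaneous requests for the same status page into a handful of hits on the origin server.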

None of these observations are new or original, and there are some amazing resources available that provide insight into the technology choices supporting public benefit systems across all fifty states; they are worth exploring in depth.

What the literature on administrative burden seems to overlook, and what the current crisis underscores, is that these poor technology choices can be “situationally burdensome,” amplifying burden more acutely when circumstances change. The environment in which burdens play out, and in which they affect those who apply for services, is dynamic. As circumstances change, design choices that may not have presented a significant burden to users in the past can present new and unforeseen challenges.

For example, state unemployment benefits might be conditioned on an applicant certifying during the application process that they would return to work if offered a job. In most circumstances, including such a requirement in the application process might seem like a modest burden to impose (and one easily rationalized by policy makers). However, during a global pandemic, where returning to a shared workplace might present real health risks, it creates a dilemma for applicants. This policy choice carries a situational burden and dramatically impacts the learning and compliance costs of applying for benefits during a health crisis.

Similarly, a state that neglects to design its benefit system to scale adequately to meet demand risks imposing a situational burden. During times when demand for services is steady or relatively modest, these choices may not impose an undue burden on applicants. But when demand for a service spikes, applicants are often confronted with significant burdens in accessing services.

De-amplifying Burdens

But why should we care so much about these kinds of technology choices, and the reinforcing impact they can have on the burden eligible recipients face in accessing services?

These choices serve to amplify the existing burden associated with accessing information or services from government, specifically learning and compliance burdens, by making a service more difficult to apply for using a technology device (a web browser or telephone). This amplified burden falls disproportionately on people with restricted access to technology, or who use less modern technology. So, correcting these issues will disproportionately benefit these populations as well.

These fixes should be non-controversial, and in some cases relatively easy. Correcting these mistakes won’t impact eligibility for a service, or requirements for receiving benefits — just the level of effort required to find out about a service, and comply with requirements.

Correcting these issues can also be a way to help underscore the importance of sound technology choices as a way to buffer against administrative burden. Helping policy makers and others in government understand why these choices amplify burden can act as a pathway to understanding the connection between properly implemented technology and sound policy execution.

One final reason for helping governments fix these common mistakes, one that is often less talked about, is accountability. In her book Automating Inequality, Virginia Eubanks provides a vivid and detailed description of the design of public benefit systems specifically engineered to reduce take-up by eligible recipients. She uses the example of the State of Indiana’s welfare automation effort to illustrate:

The Indiana automated eligibility system enhanced the state’s already well-developed diversion apparatus…[b]y narrowing the gate for public benefits and raising the penalties for noncompliance, it achieved stunning welfare roll reductions.

Poor technology choices that reinforce burdens and make it more difficult for applicants to receive benefits can create confusion about why people don’t apply for or receive benefits. Is it because systems are designed to be punitive, and are engineered to remove eligible people from the benefit rolls more quickly? Or is it because the burden they present (or are perceived to present) to eligible applicants, through long wait times and confusing requirements, seems insurmountable and not worth the effort?

Our ability to hold our officials accountable for the policy choices they make is obscured when they can blame a lack of take-up for benefits on a website issue or technical glitch. Helping to ensure that sound technology choices are made is the best way to ensure that those eligible for public services receive them as quickly and as easily as possible.

It’s also the best way to ensure that our leaders are held accountable for the policy choices they make.

Process Eats Culture for Breakfast


Famed management consultant Peter Drucker is often credited with the phrase “culture eats process (or strategy) for breakfast.”

You can’t change organizations by implementing new processes alone, so the thinking goes; you have to foster a new culture in order to drive real change. To understand the degree to which this idea is accepted as management philosophy gospel, we have but to count the number of times it is repeated at conferences, in meetings, or on social media by various thought leaders.

But when we think about changing the way public sector organizations work, particularly in how they acquire and manage new technology, this idea gets flipped. In the world of government technology, process eats culture for breakfast.

Read More

Towards Ethical Algorithms

Old tools & new challenges for governments

There is a common misconception that data-driven decision making and the use of complex algorithms are a relatively recent phenomenon in the public sector. In fact, making use of (relatively) large data sets and complex algorithms has been fairly common in government for at least the past few decades.

As we begin constructing ethical frameworks for how data and algorithms are used, it is important that we understand how governments have traditionally employed these tools. By doing so, we can more fully understand the challenges governments face when using larger data sets and more sophisticated algorithms and design ethical and governance frameworks accordingly.

Read More

Driving Innovation Beyond The Big City

In late 2014, I had a chance to present on the main stage at the annual Code for America Summit in San Francisco. To the surprise of very few people, I was there to talk about cities and data.

Earlier that year, I had finished up my term as the first Chief Data Officer for the City of Philadelphia, one of the largest cities in the country. But my focus that day was not on big cities like Philadelphia, but rather on smaller cities that had not yet started down the road of leveraging data to spur innovation and inform better policy decisions.

In 2014, the delta between what large cities were doing with data and what small and mid-sized cities were doing was pretty stark.

Read More

Operation Data Liberation

Image courtesy of Flickr user antonymayfield. View license here.

I’ve had the opportunity recently to talk to people in several different city governments that are facing a common challenge — how to liberate operational data from a legacy system.

This is a challenge that lots of city governments face, and it strikes me that there are some common lessons that can be derived from cities that have gone down this road already for those that are still trying to figure out the right approach.

The following suggestions are crafted from my own experience as a municipal government official charged with making data more widely available, and those of people in similar positions that I’ve had a chance to speak with.

Read More

Read My Book

Late last year, I wrote a book devoted to civic hacking based on my experience working in state and local government, and inside civic tech communities.

It’s a book meant for public servants and people working inside government who want to connect with innovators and technologists outside of the bureaucracy. The premise is simple – governments need to find more effective ways of collaborating with members of their local civic tech communities:

Governments must develop strategies for engagement that can help direct the efforts of outside technology experts to issues or challenges that will have the broadest impact and the largest potential payoff. They will need to learn how to rally people with special talents to a particular cause or challenge, and then to turn those outside efforts into tangible outcomes for government agencies. They must learn to view the technology community as a potential talent pool from which they can draw a new generation of public servants who possess unique expertise in digital service creation.

It’s open source and available on Github. If you have a suggestion for how I can make it better, send a pull request or open a new issue.

I hope you enjoy it.

Building the Government Data Toolkit

Image courtesy of Flickr user bitterbuick

We live in a time when people outside of government have better tools to build things with and extract insights from government data than governments themselves.

These tools are more plentiful, more powerful, more flexible, and less expensive than pretty much everything government employees currently have at their disposal. Governments may have existing relationships with huge tech companies like Microsoft, IBM, Esri, and others that offer an array of different data tools; it doesn’t really matter.

In the race for better data tools, the general public isn’t just beating out the public sector; it’s already won the race and is taking a Jenner-esque victory lap.

This isn’t a new trend.

Read More

The Changing Role of the Government CDO

Photo courtesy of Flickr user Richard Cahan

The title of “Chief Data Officer” – once an uncommon one in state and municipal governments – is becoming less uncommon. And that’s a very good thing for public sector innovation.

As recently as a few years ago, Chief Data Officers were found almost exclusively in big city governments like Chicago, New York and Philadelphia. Municipal governments provide services that touch citizens’ lives in more intimate ways than states or the federal government, and big cities have a critical mass of data that is attractive to the growing community of users with powerful tools for mapping and analyzing data. So it’s no surprise that cities have led the way in creating new, data-focused positions like CDOs, and in releasing open data to the public.

But increasingly, state governments and small to midsized cities are appointing Chief Data Officers and creating new positions that focus almost exclusively on data. For example, earlier this year the City of Syracuse (a city of approximately 145,000 in Central New York) appointed its very first Chief Data Officer. It’s worth noting that this is not a standalone position as in some other cities. The CDO position in Syracuse was deliberately made part of the city’s internal innovation team (which is funded through the Bloomberg Philanthropies iTeam program) and plays an integral part in the city’s efforts to use data internally to provide services more efficiently.

Read More

Coming Back from the Brink

Photo courtesy of Flickr user Oliver Hine.

Last August, a study from the Century Foundation identified cities in Upstate New York as places with some of the highest concentrations of poverty for African American and Hispanic populations anywhere in the nation. The problem is particularly acute in the City of Syracuse, which holds the distinction of having the highest level of poverty concentration among African American and Hispanic populations of the one hundred largest metropolitan areas in the U.S.

This problem isn’t Syracuse’s alone – the study shows that Rochester and Buffalo also have serious problems with concentrated poverty. But the Salt City is an unfortunate standout in this report. In addition to having the highest concentrations of poverty among African Americans and Hispanics, when looking at concentrated poverty among non-Hispanic whites, “…Detroit, Fresno, and Syracuse are the only metropolitan areas on all three lists.”

The Century Foundation’s findings echo those of an earlier study with a similar scope conducted by CNY Fair Housing, Inc., which found that the Syracuse area is “one of the worst scoring cities in the country when looking at equality of opportunity based on race and ethnicity.” Given what we know about how concentrated poverty affects the life outcomes of people who live in it, it’s hard to imagine a more serious drag on the growth and well-being of our region than deliberately forcing people to live in places where they are surrounded by poverty and giving them few options for getting out.

But that’s exactly what we do.

Read More

Building the Engine of Change

Photo courtesy of Flickr user Bart Heird.

The term “civic tech” gets used a lot, and it often means different things to different people. To me, this has always meant that the work being done in this area is dynamic, growing, and evolving rapidly – all good things that suggest the impact of civic technology will ultimately be broad and durable. I’ve never been prone to excessive handwringing.

I believe very firmly that the most important thing about civic technology has nothing to do with technology at all. Real people – with empathy and a desire to make their community better – are the most important kind of civic technology. A recently released report on civic technology by Omidyar Network entitled Engines of Change underscores this idea, and helps emphasize that the connections between people – both inside the civic tech community and outside it – are what’s most important to its future growth.

Read More