Whiteness is not an abstraction; its claim to dominance is fortified through daily acts which may not seem racist at all precisely because they are considered “normal.” But just as certain kinds of violence and inequality get established as “normal” through the proceedings that exonerate police of the lethal use of force against unarmed black people, so whiteness, or rather its claim to privilege, can be disestablished over time. This is why there must be a collective reflection on, and opposition to, the way whiteness takes hold of our ideas about whose lives matter. The norm of whiteness that supports both violence and inequality insinuates itself into the normal and the obvious. Understood as the sometimes tacit and sometimes explicit power to define the boundaries of kinship, community and nation, whiteness inflects all those frameworks within which certain lives are made to matter less than others.


Data rich, analysis poor

Oakland has a new School Superintendent, and I like him, partly because of a statement he dropped at a recent meeting of the Youth Ventures Joint Powers Authority. All the city and county heavy-hitters were there, discussing the possibility of hiring an out-of-state firm to do a data report on Oakland. There was much debate about the need to do this, the need for non-local data folks, and the quality of local data, but Anton Wilson wonderfully cut through it: “Since I’ve come to Oakland I’ve seen a huge stream of data come across my desk. We don’t need more data, we’re not data poor- we’re analysis poor.”

I could have high-fived him for that statement. But that would have been awkward from across the room.

His point is one that I’ve harped on for some time in Oaktown. We have troves of data, but barely anyone doing thoughtful analysis of it to inform decision making, policy, or evaluation (with the exception of some bigger programs that do get evaluated heavily). A similar incident highlights this even more starkly. In front of the County Board of Supervisors, a department chief was utterly stumped when asked for a seemingly simple, core metric about their department, after an injection of $75M over the past couple of years for a new program. I know that agency has tons of data, but I’m also aware of failed efforts to replace huge parts of their technology base and of a stagnating effort to build a data team. So while I share some of the pain, ultimately it’s up to senior leaders to take this seriously and invest in the people and systems that make modern government agencies data smart, if not fully data driven.

Part of our current problem is that understanding of technology and data is very poor at the executive level, and this often results in technology and data folks being mashed together with little thought to whether they are the right people with the right skills to actually understand your operations. I’ve talked often of the need to integrate data analysts and researchers into regular agency strategy and planning so they can respond as needs arise, but this is also a higher-level problem, one that starts when those responsible for departments do not themselves have enough data savvy or technology awareness to make good initial decisions.

If you’re one of the data geeks or tech folks in government, a good way to both increase your value and help grow your organization is to add a layer of analysis or context when asked for simple data products. Instead of just giving the numbers you’re asked for, give some context on how they have changed, ways that measuring that thing has changed, and gaps in your data that make the numbers fuzzy. Or, even better, ask those annoying questions like “What is this being used for? What decisions are you trying to make? Can I help you digest this information at a planning meeting?” You’ll be stunned at the number of executive-level meetings where people say “I don’t really know what these data mean” or “I wish we knew some context around these data,” but never bother to pass those issues down to you. Suggest that you can both produce better products and help with analysis if you are part of the process.
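To make that concrete, here’s a minimal sketch of the difference between handing over a bare number and handing over a number with context: the latest value, how it changed from the prior year, and how much of the data is missing. The file and column names are hypothetical, not from any real agency system.

```python
# A sketch of returning a metric with context instead of a bare figure.
# (Hypothetical file and columns: year, value.)
import csv
from statistics import mean

def metric_with_context(path, year_field="year", value_field="value"):
    rows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            raw = row.get(value_field, "").strip()
            rows.append((int(row[year_field]), float(raw) if raw else None))
    rows.sort(key=lambda r: r[0])
    known = [(y, v) for y, v in rows if v is not None]
    latest_year, latest = known[-1]
    prior = dict(known).get(latest_year - 1)
    change = (latest - prior) / prior * 100 if prior else None
    return {
        "latest_year": latest_year,
        "latest_value": latest,
        "pct_change_from_prior_year": round(change, 1) if change is not None else "n/a",
        "five_year_average": round(mean(v for _, v in known[-5:]), 1),
        "years_with_missing_data": sum(1 for _, v in rows if v is None),  # flag the gaps
    }

print(metric_with_context("department_metric.csv"))
```

Even a small wrapper like this turns “here’s the number you asked for” into a conversation about what the number actually means.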

For leaders, humility and awareness of how much data and tech really drive the world are a powerful starting point. Look at what other progressive agencies are doing with performance management, accountability and data-driven initiatives. Copy them. And perhaps most important, find a local ally who knows data-driven strategies and technology management in their sleep, and have them help you make better decisions. One last clue: buying business analytics software won’t help you; training your staff properly and building your capacity by hiring data- and tech-savvy staff will!

Striving for better: Diversity in Civic Tech

There are many parts of my life where I’m really comfortable. I love talking about justice and social struggles, about race, the reality of inequality and what it does to our society, and about human trafficking and sex slavery and the pushback I get from pro-sex-work advocates that this even matters. I’m also comfortable talking about diversity, the lack of it, and how the tech sector and others need to ditch the status quo and its unjust implications. What I haven’t loved, haven’t been comfortable with, is people being critical of and even attacking an organization I’ve led and helped build over the past two years. I’m uncomfortable because, despite some unloving offenses, those complaining have been largely right.

Most civic hack nights in Oakland’s City Hall see a wonderful balance of men and women all working on tech, engagement and design challenges to make our city a better place. Some weeks the balance shifts to more men, other weeks it’s female dominated. And I feel like this is something worth celebrating, being glad about. We’ve made real efforts to make sure men and women are included, encouraged to lead projects (not just do design, an early trend we identified and tackled) and to be part of our formative leadership team in strong numbers. But despite this one good thing, this rare gender balance in a tech sector full of macho bullshit, we’re still not doing enough. We’re about to change that.

We’re way too white.

I’d love to deny it, but it’s real. Despite our co-founders being white and Latino, and guys, our leadership team and our general membership are very much mismatched with the demographics of the city we serve. We’ve spent much of 2014 talking, listening, growing and building as an organization, and despite the intentions, despite the genuine desire for a fully inclusive organization, it hasn’t just happened. So we’re stepping up in this area. We say we’re lean and adaptive; well, that has to apply to all facets of our organization.

We declare a value of building with, not for (the people we seek to serve), and to us that also means that “us” must be all of us, not just those who’ve chosen to walk through the doors and get involved. So what are we doing? For starters, we’re making an intentional push for diversity in our leadership recruitment (about to launch). And we’re putting our money where our mouth is. We don’t have much funding yet, but with our first serious investment from Code for America, our main expense is a fantastic consulting firm we’ve hired to help us develop strategies to ensure that our leadership, our advisory board and our membership become as diverse as our city.

We’ve asked our new partner to take on a layer of screening that will result in a more diverse candidate pool for us to pick from, and to work with us on targeted outreach to local leaders who could play a role in our organization, people from a broader pool than our current reach generates. We see this as necessary: if the same group of people keep asking their friends to participate, we don’t stand a good chance of succeeding, of building a diverse leadership team. If our foundation isn’t solid, it won’t matter how good our apps are; we’ll never be “of the people, for the people,” to get all patriotic and shit like that. While this partnership is our first big step, it won’t be our last; we know there’s a lot more hard work to do on this front.

As we roll into this brave new world of awkward moments and honest conversations about how we will get to who we want to be, I’m very proud of our current team and their efforts to move in this direction, to accept that we’re not as diverse as we want to be nor as proactive as we need to be. But we’re all prepared to do this, to learn, to be humbled and to grow, with the added strength, insights and trust that a really Oaklandish team will give us.

My invitation to others is twofold: join us, especially if you want to be part of something great, and encourage us, give us constructive criticism along the way, but also forgive us if we’re not perfect, if we make mistakes. We give a shit. We are not cool with the status quo. We need you to help make this better.

Data: It’s all about people, not the data

I’m a data geek. I’ll own that. I love what data can do, what it can inform, what it can tell me.  I constantly find myself mentally connecting conversations I’m in and meetings I’m part of to the data that could best inform the discussion or the decisions. It’s a bit of a problem.

As our society and our government become slowly absorbed by the data deluge we’re now enabling, there is a righteous backlash from many that data isn’t what it’s all about, that data are not more important than, say, people. And this is a fair suggestion. Sometimes it is a valid and constructive statement: the point of analysis is not the data, the results or the visualization of those results; it’s what those data can do to inform decisions that will have a human impact that matters. Where I get frustrated is when people push back on the idea of using data proactively by arguing that “this problem isn’t about data, it’s not something we need data for, we already know what’s happening.” I hate those statements. They convey a level of arrogance that is not intentional but is real. If someone already fully knows the nuance and scale of a problem, they had better also have insights into the solutions; otherwise, what good has their knowledge and insight been to the people they care about helping?

This is another case of two sides acting as if only one side is important, and that is not productive or effective for most social issues. It’s next to impossible to get executive buy-in to change something with just experience and intuition; we don’t often see policy or investment decisions based on insight alone. Likewise, we should never be making serious decisions or assumptions based on data alone. That leads to decisions lacking critical context and nuance, and to simplistic technocratic solutions. Better to pair the data with the insights and experience of those living out those data.

Just as policies are often more successful when developed with the decision makers and implementers involved, so too should data-driven decisions be constructed. A great local example of this in action appeared in the release of our latest report on attendance problems in Oakland Unified schools. Despite serious problems of chronic absenteeism across the district, Garfield Elementary is one of six schools in Oakland that have cut chronic absences by half or more. The principal, Nima Tahai, said: “First, it’s data driven. You have to have the numbers in front of you, student names and down to the reasons for each absence… Then, school staff must engage in one-on-one work with families, reaching out to them to find out what is going on and talking to them about the importance of getting their kids to school.” He went on to say that Garfield administrators even pick up kids and drive them to school if a family is stuck without transportation or a parent is ill.

This problem would never have been raised to the community’s attention without thoughtful analysis of very detailed data on every student in the district. Data revealed the scale of the problem, and then, in the hands of a skilled administrator, were used to identify individual points of influence or action: each student in need of help. The data alone mean just a nice report or a compliance document. When delivered in a form that can support action, these data become powerful elements of change. Data, people, action. That’s how government should be driving change: data driven, not data obsessed.
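For readers who want a sense of what “a form that can support action” looks like in practice, here is a rough sketch that turns per-student attendance records into a prioritized outreach list. The file and column names are hypothetical, and chronic absence is defined here as missing 10% or more of enrolled days, a commonly used threshold.

```python
# A sketch: turn raw attendance records into a list school staff can act on.
# (Hypothetical columns: student_id, days_enrolled, days_absent, last_absence_reason.)
import csv

CHRONIC_THRESHOLD = 0.10  # missing 10%+ of enrolled days

def chronically_absent(path):
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            enrolled = int(row["days_enrolled"])
            absent = int(row["days_absent"])
            if enrolled and absent / enrolled >= CHRONIC_THRESHOLD:
                flagged.append({
                    "student_id": row["student_id"],
                    "absence_rate": round(absent / enrolled, 2),
                    "last_reason": row.get("last_absence_reason", "unknown"),
                })
    # Worst cases first, so outreach staff know where to start.
    return sorted(flagged, key=lambda s: s["absence_rate"], reverse=True)

for student in chronically_absent("attendance.csv"):
    print(student)
```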

*First posted on Govloop.com

Calling all Innovators in Oakland (Government)

Looking for innovators and change-makers in Oakland’s City Hall

At OpenOakland we get to work with people inside City Hall and across this great city. We see hard work, creativity and good things happening all over; what we don’t see is that creativity and innovation being recognized publicly, so we’re out to recognize good government. We’re announcing the first-ever Oakland Civic Innovators Awards!

We’re looking for nominations from anyone who lives, works, plays or worships in Oakland. Tell us about the great things you’ve seen in City government this year: have you worked with an awesomely innovative staff member, experienced thoughtful, genuine engagement on behalf of a local agency, or seen game-changing creativity from people you used to just think of as bureaucrats? If so, tell us about it and we’ll be glad to recognize the good efforts that reflect the heart of public service and the untold stories of good government in Oakland.

We’ll be announcing the awardees at the 2015 CityCamp Oakland on January 10th, which means this is also an announcement that CityCampOak is back!

Are your data too slow?

Not everything can be Big Data. Not everything should be, either. But some data do need a kick in the pants, so to speak. Are the data you produce or use real time, coming down the pipe as a feed every day, or are you stuck with years-old data for your planning and analysis purposes? If you are in the latter camp, don’t feel bad: you’re not alone.

For those tracking Ebola outbreaks in West Africa, the stream of data is steady but not real time, yet decisions that impact people’s lives are being made every day about resourcing and responding to this crisis. In the USA there are similarly important data needs: many infections and diseases are notifiable, requiring direct notification of the Centers for Disease Control and Prevention. However, regular hospital visits, treatments and surgeries go through a very big, very slow pipeline from local clinics and hospitals up to the state agency level, and after processing, refining and some magical treatment, these data flow back to local public health and research agencies some years later. Traditionally this timeline was “all we could do” because of technology limitations and other reasons, but as we rely more and more on access to near real-time data for so many decisions, health data often stand out as a slouch in the race for data-driven decisions.

In a different vein, campaign finance data for political donations are sometimes surprisingly fast. In California, all donations to campaigns require the filing of a Form 460 declaring who gave the funds, their employer and their zipcode. Campaigns are supposed to file these promptly, but this does not always happen until certain filing deadlines. Nevertheless, these data contain valuable insights for voters and campaigns alike. They get submitted as a flow, but they end up in a complex format not accessible to average people, until someone changes that. A volunteer team at OpenOakland created a very powerful automation process that takes these data and reformats them in a way that makes them accessible and understandable to everyone at http://opendisclosure.io. Yet even this system of automated data processing and visualization suffers from a lack of perfectly updated data on a daily basis: the numbers shown each day only reflect the data filed to date, so big donations or changes in patterns do not show up until they are filed, often at a somewhat arbitrary deadline.
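The actual pipeline behind OpenDisclosure is more involved than this, but the core reformatting idea can be sketched simply: roll the raw contribution filings up into per-committee totals and publish them in a format a website can display. The input file and column names below are hypothetical, not the real Form 460 schema.

```python
# A sketch of the reformatting idea: aggregate raw contribution filings into
# per-committee totals and write them out as JSON for a public-facing site.
# (Hypothetical columns: committee, contributor, employer, zip, amount, date_filed.)
import csv
import json
from collections import defaultdict

def summarize_contributions(path):
    totals = defaultdict(lambda: {"total": 0.0, "contributions": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            committee = row["committee"].strip()
            totals[committee]["total"] += float(row["amount"])
            totals[committee]["contributions"] += 1
    return totals

if __name__ == "__main__":
    summary = summarize_contributions("form460_contributions.csv")
    with open("contribution_totals.json", "w") as out:
        json.dump(summary, out, indent=2)
    # The totals only reflect what has been filed so far; late filings will not
    # show up until they land, which is exactly the lag described above.
```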

Unfortunately, not all data are filed frequently, and many do not come with an easy-to-use API connection to allow developers and researchers to connect to them directly. Take crime data: very important information, in high demand for all sorts of decisions at the local level. Your police force may publish good crime data each day, or maybe just each month, which is useful for real estate people and maybe good for analysts and crime investigations, but how do we know if our local efforts have successfully impacted crime? We go to national data. The Federal Bureau of Investigation (FBI) collects data from most law enforcement agencies in the country and publishes it as the Uniform Crime Reports (UCR). Unfortunately, these data are published years after the fact. There is a convoluted process for local agencies to format and filter their reports, and then these data take years to get published.

We recently created a violent crime fact sheet using the latest available (and recently published) UCR data, which are for 2012. This lag means that county supervisors and other officials trying to evaluate the impact of crime prevention efforts can’t even compare their outcomes with other cities; we have to wait two more years to see if these indicators changed in other comparable cities, or if our interventions did have a measurable impact. This sort of time lag means that no local officials have good comparable data in a reasonable time frame, a poor system for modern policy makers to rely on. The FBI is slowly working to implement a newer system, but it is not clear that the lag will improve.

Every agency responsible for collecting data for operational purposes MUST start thinking about how it can make these data safely available to decision makers and to the public on an expedited timeline. The technology to support this is now very accessible, and if necessary we should be considering bifurcated approaches: the old, slow feed to state and federal agencies and a new, agile feed for local use. Privacy standards and quality are simply things that guide how we can do this; they are not actual barriers unless we choose to let them be.
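As one small illustration that privacy can be a design constraint rather than a barrier, an “agile local feed” could start as nothing more than the operational extract with sensitive columns stripped or coarsened before publication. The sketch below assumes hypothetical file and column names.

```python
# A sketch of a fast, local-use feed: drop or coarsen sensitive fields from the
# operational extract before publishing, while the slow state/federal pipeline
# carries on untouched. Column names here are hypothetical.
import csv

SENSITIVE = {"patient_name", "date_of_birth", "street_address"}

def publish_safe_feed(src, dst):
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        keep = [c for c in reader.fieldnames if c not in SENSITIVE]
        writer = csv.DictWriter(fout, fieldnames=keep)
        writer.writeheader()
        for row in reader:
            safe = {c: row[c] for c in keep}
            # Coarsen geography to the five-digit ZIP instead of dropping it entirely.
            if "zip" in safe:
                safe["zip"] = safe["zip"][:5]
            writer.writerow(safe)

publish_safe_feed("hospital_visits_raw.csv", "hospital_visits_public.csv")
```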

Government is a business, albeit one with a monopoly on the services it provides, and it’s not cool for government to be making decisions using years-old data when the private sector is increasingly data driven and real time. We can do this!

* First published over at Govloop

Beyond compliance, beyond reports: Data for Action

First posted here.

A week ago the famous Napa region was shaken by a magnitude 6.0 earthquake, resulting in serious damage to buildings, injuries and disruptions in services across a large area. This is something residents of the Bay Area have come to expect, and we are all waiting for the next “big one,” overdue in most experts’ opinion.

The same week, our team launched a new app in response to the disaster.

Oakland is a city with a severe housing shortage, building anger towards gentrification, and the unmeasured but very real displacement of low-income residents who have called this city home for decades. It is also home to 1,378 large apartment buildings that are at varying risk of collapse in a quake centered closer to Oakland. The City of Oakland and the Association of Bay Area Governments (ABAG) have studied this issue, and over half these buildings have been screened, but over 550 remain to be screened for risk. Many homes have been found to be safe, while 609 buildings (home to thousands of residents) have been found to be at serious risk. Called potential soft-story buildings, they have a large open ground level, such as storage or garages, that could collapse in a quake, rendering those homes uninhabitable. That would be an instant loss of thousands of affordable housing units protected under rent control, and any housing units built to replace them will surely not be affordable, resulting in a very rapid push-out of poorer residents.

So why do we civic hackers care about this? It’s a matter of equity, and a matter of many residents lacking good access to information relevant to their living situation; without information, no one can act. Unfortunately, the common practice in government is to collect information and store it neatly in a report that floats around city hall as a PDF. The data live on a server somewhere, called on only when needed. We greatly respect the proactive work the City and ABAG have done in the screening efforts; however, a large number of homes remain unscreened and there are still thousands of renters with no idea of their risk, either through damage and injury, or through displacement after the quake as a result of rent increases applied by landlords passing on retrofitting costs. Oakland’s rent control policy sadly does not clarify whether seismic retrofitting costs are borne by the landlord, the tenant or both.

Some months ago we convinced ABAG and the City of Oakland to publish the data from these surveys, a complicated inventory because of the changing status of buildings as they are screened and retrofitted. We had been planning to build a new app that would raise awareness of this issue to spur action, both for tenant rights groups and for the city to determine a policy for handling these costs and for ensuring homes in Oakland are safe for residents. After the quake we realized it was an important moment to raise this issue, so we sprinted to release a new app that helps renters and homeowners see the status of their building: http://softstory.openoakland.org.

Our approach is to build tools that put information in the hands of the public in a way they can act on. In this case, the formal report is a good document, but it serves policy makers only; it does not inform or empower those living in these homes. This simple app lets any resident see how their building is rated: as exempt and not a soft-story building, as retrofitted and safe, or as potentially soft-story and at risk in a big quake.

We’ve advocated for open data with local governments for this very reason (and others): data can be used to fill up reports with snippets and summaries that help decision makers, but there should be a default to open for all data that have no legal reason to be protected. This information, in the hands of those actually affected by it, can do radically more than if it were still sitting on a government hard drive somewhere in city hall!

Getting Quake Safe in Oakland

Our new app helps Oaklanders get Earthquake Safe!

Oakland has 609 multi-unit buildings at risk of collapse or serious damage in the next earthquake we experience! These “soft-story” buildings are multi-unit, wood-frame, residential buildings with a first story that lacks adequate strength or stiffness to prevent leaning or collapse in an earthquake. They pose a safety risk to tenants and occupants, a financial risk to owners, and a risk to the recovery of the City and region.

Today we are launching a new OpenOakland web and mobile app that will help inform and prepare Oakland renters and homeowners living in buildings at risk of earthquake damage. The new app, SoftStory.OpenOakland.org, provides Oaklanders with a fast, easy way to see if their home has been evaluated as a potential soft-story building and is at increased risk in an earthquake.

The stats:

609: multi-unit buildings assessed to be at risk in Oakland

238: buildings assessed and found to be retrofitted or exempt

531: soft-story-type buildings that have not yet had complete screenings

1,378: total potential soft-story buildings in Oakland

This new app relies on data from a screening of potential soft-story buildings undertaken by the City of Oakland and analyzed by the Association of Bay Area Governments (ABAG). Owners and renters can see if their home is considered at risk or has already been retrofitted, and learn about the risks to soft-story buildings in a serious earthquake, an event that is once again on people’s minds after last week’s magnitude 6.0 earthquake in Napa.
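Conceptually, the lookup the app performs is simple. Here is a rough sketch of the idea, with a hypothetical data file and status labels rather than the production code:

```python
# A sketch of the core lookup: match a typed address against the screened
# building inventory and report its status. File and columns are hypothetical.
import csv

def building_status(address, inventory="soft_story_inventory.csv"):
    query = address.strip().lower()
    with open(inventory, newline="") as f:
        for row in csv.DictReader(f):
            if row["address"].strip().lower() == query:
                # e.g. "exempt", "retrofitted", or "potential soft story"
                return row["screening_status"]
    return "not in the screened inventory"

print(building_status("1234 Example St"))
```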

We’re launching this app as a prototype on short notice because we believe this information is critical for Oaklanders at this time. The app was built with the support of ABAG and the City of Oakland, with technical support provided by Code for America. Once again, where local government is increasingly transparent, where data are open and in open formats, our community can build new tools to help inform and empower residents.

To see if your building is at risk, visit: SoftStory.OpenOakland.org