Calling all Innovators in Oakland (Government)

Looking for innovators and change-makers in Oakland’s City Hall

At OpenOakland we get to work with people inside City Hall and across this great city. We see hard work, creativity and good things happening all over, but we don't see that creativity and innovation being recognized publicly, so we're out to recognize good government. We're announcing the first-ever Oakland Civic Innovators Awards!

We're looking for nominations from anyone who lives, works, plays or worships in Oakland. Tell us about the great things you've seen in City government this year. Have you worked with an awesomely innovative staff member, experienced thoughtful, genuine engagement on behalf of a local agency, or seen game-changing creativity from people you used to just think of as bureaucrats? If so, tell us about it and we'll be glad to recognize the efforts that reflect the heart of public service and the untold stories of good government in Oakland.

We’ll be announcing the awardees at the 2015 CityCamp Oakland on January 10th, which means this is also an announcement that CityCampOak is back!

Are your data too slow?

Not everything can be Big Data, and not everything should be. But some data do need a kick in the pants, so to speak. Are the data you produce or use real time, coming down the pipe as a feed every day, or are you stuck with years-old data for your planning and analysis? If you're in the latter camp, don't feel bad: you're not alone.

For those tracking Ebola outbreaks in West Africa, the stream of data is steady but not real time, yet decisions that impact people's lives are being made every day about resourcing and responding to this crisis. In the USA there are similarly important data needs: many infections and diseases are notifiable, requiring direct notification of the Centers for Disease Control and Prevention. But records of regular hospital visits, treatments and surgeries go through a very big, very slow pipeline from local clinics and hospitals up to the state agency level, and after processing, refining and some magical treatment, these data flow back to local public health and research agencies some years later. Traditionally this timeline was "all we could do" because of technology limitations and other reasons, but as we rely more and more on access to near-real-time data for so many decisions, health data often stands out as a slouch in the race for data-driven decisions.

In a different vein, campaign finance data for political donations is sometimes surprisingly fast. In California, all donations to campaigns require the filing of a Form 460 declaring who gave the funds, along with their employer and ZIP code. Campaigns are supposed to file these promptly, but often don't until certain filing deadlines. Nevertheless, these data contain valuable insights for voters and campaigns alike. They get submitted as a flow, but they end up in a complex format not accessible to average people, until someone changes that. A volunteer team at OpenOakland created a powerful automated process that takes these data and reformats them in a way that makes them accessible and understandable to everyone at http://opendisclosure.io. Yet even this system of automated processing and visualization suffers from a lack of perfectly up-to-date data: the numbers shown each day reflect only the data filed to date, so big donations or shifts in patterns don't show up until they are filed, often at a somewhat arbitrary deadline.
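
The kind of transformation involved can be sketched roughly like this; note that the file layout and column names below are hypothetical placeholders for illustration, not the actual Form 460 / CAL-ACCESS schema or OpenDisclosure's real pipeline:

```python
import csv
from collections import defaultdict

def summarize_contributions(path):
    """Aggregate raw contribution filings into per-committee totals.

    Assumes a CSV with (hypothetical) "committee" and "amount" columns;
    real Form 460 exports are far messier and need cleaning first.
    """
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["committee"]] += float(row["amount"])
    return dict(totals)
```

A summary like this is the kind of simple, readable output that the complex raw filings don't give the average voter.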

Unfortunately, not all data are filed frequently, and most do not come with an easy-to-use API to let developers and researchers connect to them directly. Take crime data: very important information, in high demand for all sorts of decisions at the local level. Your police force may publish good crime data each day, or maybe just each month, which is useful for real estate people and perhaps for analysts and crime investigations. But how do we know if our local efforts have successfully impacted crime? We go to national data. The Federal Bureau of Investigation (FBI) collects data from most law enforcement agencies in the country and publishes it as the Uniform Crime Reports (UCR). Unfortunately, these data are published years after the fact: there is a convoluted process for local agencies to format and filter their reports, and then these data take years to get published.

We recently created a violent crime fact sheet using the latest available UCR data, which were only recently published and cover 2012. This lag means that county supervisors and other officials trying to evaluate the impact of crime prevention efforts can't even compare their outcomes with other cities: we have to wait two more years to see whether these indicators changed in comparable cities, or whether our interventions had a measurable impact. A time lag like this means no local officials have good comparable data in a reasonable time frame, a poor system for modern policy makers to rely on. The FBI is slowly implementing a newer system, but it is not clear that the lag will improve.

Every agency responsible for collecting data for operational purposes MUST start thinking about how it can make these data safely available to decision makers and the public on an expedited timeline. The technology to support this is now very accessible, and if necessary we should consider bifurcated approaches: the old, slow feed to state and federal agencies and a new, agile feed for local use. Privacy standards and quality concerns simply guide how we do this; they are not actual barriers unless we choose to let them be.

Government is a business, albeit one with a monopoly on the services it provides, and it's not cool for government to be making decisions using years-old data when the private sector is increasingly data driven and real time. We can do this!

* First published over at Govloop

Beyond compliance, beyond reports: Data for Action

First posted here.

A week ago the famous Napa region was shaken by a magnitude 6.0 earthquake, resulting in serious damage to buildings, injuries and disruptions in services across a large area. This is something residents in the Bay Area have come to expect, and we are all waiting for the next "big one", overdue in most experts' opinions.

The same week, our team launched a new app in response to the disaster.

Oakland is a city with a severe housing shortage, growing anger over gentrification, and the unmeasured but very real displacement of low-income residents who have called this city home for decades. It is also home to 1,378 large apartment buildings at varying risk of collapse in a quake centered closer to Oakland. The City of Oakland and the Association of Bay Area Governments (ABAG) have studied this issue; over half of these buildings have been screened, but over 550 remain to be screened for risk. Many homes have been found to be safe, while 609 buildings, home to thousands of residents, have been found to be at serious risk. These are called potential soft-story buildings: they have a large open ground level, such as storage or garages, that could collapse in a quake, rendering those homes uninhabitable. That would be an instant loss of thousands of affordable housing units protected under rent control, and any units built to replace them will surely not be affordable, resulting in a very rapid push-out of poorer residents.

So why do we civic hackers care about this? It's a matter of equity, and a matter of many residents lacking good access to information relevant to their living situation: without information, no one can act. Unfortunately, the common practice in government is to collect information and store it neatly in a report that floats around City Hall as a PDF. The data live on a server somewhere, called on only when needed. We greatly respect the proactive work the City and ABAG have done in the screening efforts. However, a large number of homes remain unscreened, and there are still thousands of renters with no idea of their risk, whether through damage and injury or through displacement after the quake as a result of rent increases applied by landlords passing on retrofitting costs. Oakland's rent control policy sadly does not clarify whether seismic retrofitting costs are borne by the landlord, the tenant or both.

Some months ago we convinced ABAG and the City of Oakland to publish the data from these surveys, a complicated inventory because the status of buildings changes as they are screened and retrofitted. We had been planning to build a new app that would raise awareness of this issue to spur action, both for tenant rights groups and for the City, to determine a policy for handling these costs and to ensure homes in Oakland are safe for residents. After the quake we realized it was an important moment to raise this issue, so we sprinted to release a new app that helps renters and homeowners see the status of their building: http://softstory.openoakland.org.

Our approach is to build tools that put information in the hands of the public in a way they can act on. In this case, the formal report is a good document, but it serves policy makers only; it neither informs nor empowers those living in these homes. This simple app lets any resident see how their building is rated: as exempt and not a soft-story building, as retrofitted and safe, or as potentially soft-story and at risk in a big quake.
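
The core of a lookup like this is only a few lines. Here's a minimal sketch; the file layout, column names and status values below are assumptions for illustration, not the app's actual data schema:

```python
import csv

def building_status(path, address):
    """Return the screening status for a street address, or None if absent.

    Assumes a CSV with (hypothetical) "address" and "status" columns,
    where status is one of: exempt, retrofitted, potentially-soft-story.
    """
    needle = address.strip().lower()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["address"].strip().lower() == needle:
                return row["status"]
    return None
```

The real app wraps this sort of lookup in a map interface, but the principle is the same: one address in, one understandable answer out.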

We've advocated for open data with local governments for this very reason (among others). Data can be used to fill reports with snippets and summaries that help decision makers, but there should be a default to open for all data that has no legal reason to be protected. This information, in the hands of those actually affected by it, can do radically more than if it were still sitting on a government hard drive somewhere in City Hall!

Getting Quake Safe in Oakland

Our new app helps Oaklanders get Earthquake Safe!

Oakland has 609 homes at risk of collapse or serious damage in the next earthquake we experience! These homes, called "soft-story" buildings, are multi-unit, wood-frame residential buildings with a first story that lacks adequate strength or stiffness to prevent leaning or collapse in an earthquake. These buildings pose a safety risk to tenants and occupants, a financial risk to owners, and a risk to the recovery of the City and region.

Today we are launching a new OpenOakland web and mobile app that will help inform and prepare Oakland renters and homeowners living in these buildings at risk of earthquake damage. The new app: SoftStory.OpenOakland.org provides Oaklanders with a fast, easy way to see if their home has been evaluated as a potential Soft Story building and is at increased risk in an earthquake.

The stats:

  • 609: multi-unit buildings assessed to be at risk in Oakland
  • 238: buildings assessed and found to be retrofitted or exempt
  • 531: buildings that are soft-story types but have not yet had complete screenings
  • 1,378: total soft-story buildings in Oakland

This new app relies on data from a screening of potential soft-story buildings undertaken by the City of Oakland, with data analyzed by the Association of Bay Area Governments (ABAG). Owners and renters can see if their home is considered at risk or has already been retrofitted, and learn about the risks to soft-story buildings in a serious earthquake, an event that is once again on people's minds after last week's magnitude 6.0 earthquake in Napa.

We're launching this app as a prototype on short notice because we believe this information is critical for Oaklanders right now. The app was built with the support of ABAG and the City of Oakland, with technical support provided by Code for America. Once again: where local government is increasingly transparent, and where data is open and in open formats, our community can build new tools to help inform and empower residents.

To see if your building is at risk, visit: SoftStory.OpenOakland.org

Concerned Internet Citizens of California

Net Neutrality is a big deal. That's my opinion and, as of today, the opinion of our President. The FCC is considering a rule to allow internet providers to charge premium rates to big companies for better speed in delivering their content to you, the consumer. Sounds like a reasonable idea at first? The problem is that the internet was created as an open, even, fair system, engineered to always allow fair treatment of anyone's content. When Verizon, Comcast and the like charge Netflix big dollars for faster pipes, they can also refuse to do so and favor their own network content instead: no longer a level playing field. For small businesses this means they will no longer be able to compete in the same way. Startups like, say, Facebook several years back could not afford to pay for this premium delivery, so they get wiped out. Bad for innovation, bad for consumers.

That's a short and rough summary. Anyway, there's a great engagement and democracy side to this: the FCC opened up for comments and, in new open government fashion, then published all of the more than one million comments as raw open data for free download. Yay! The nice folks at Smart Chicago beat me to processing the data, so you don't have to mess with that; just grab the data in a nice, easy format. I wanted to see how active and how vocal different communities in California were about this issue. Were big cities the source of the complaints? Were small, isolated towns aware of this issue and vocal?

https://open-oakland.cartodb.com/viz/73a0b130-1dac-11e4-b79c-0edbca4b5057/embed_map?title=true&description=true&search=true&shareable=true&cartodb_logo=true&layer_selector=false&legends=true&scrollwheel=true&fullscreen=true&sublayer_options=1&sql=&sw_lat=37.21064411993447&sw_lon=-125.30731201171875&ne_lat=40.641051496100395&ne_lon=-115.90301513671875

I grabbed the processed data, aggregated it by city name, cleared out some junk data, combined it with Census populations and locations (I had forgotten how painful it is to get basic Census data these days) and calculated a simple rate: for every 100,000 people in a city or town, how many comments were submitted? This measures neither support nor opposition, just how active and engaged people in California are. A bunch of small towns are left off because their rates are not reliable. Take a look at your region: are you surprised how high or low your rate is?
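
The rate calculation itself is simple. Here's a minimal sketch; the 10,000-person cutoff below is an arbitrary example of dropping small towns, not the threshold actually used:

```python
def comment_rates(comment_counts, populations, min_pop=10_000):
    """Comments per 100,000 residents for each city.

    comment_counts: {city: number of FCC comments}
    populations:    {city: Census population}
    Cities below min_pop are dropped, since tiny denominators
    make the rate unreliable.
    """
    rates = {}
    for city, pop in populations.items():
        if pop < min_pop:
            continue
        rates[city] = comment_counts.get(city, 0) / pop * 100_000
    return rates
```

The joining and cleaning steps before this point were, as usual, most of the work.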

I was somewhat surprised to see a few rural towns topping the commenter lists. Nevada City (oops, maybe this is my family complaining?) has the highest rate, followed by Sebastopol. NorCal represent… San Francisco, the tech darling, is down at 44 with almost 7,000 comments but a rate of only 782. Oaktown is less activist-filled than normal at #80 with a rate of 536, and dearest Silicon Valley/Palo Alto is a shameful 63rd at a rate of 665. Tech city needs some more concerned residents?

The data with city by city stats are below.

Do Oakland’s Civic Apps work for Oaklanders?

If that’s the type of question that gets you thinking, we want you!

OpenOakland creates digital tools to increase access to public information, to help Oaklanders engage more effectively with local government and with each other. We rely on the contributions and insights of Oaklanders with a variety of skills and occupations, not just tech geeks.  One of our values is to design and build with rather than for people: collaborating with the communities we aim to serve.

We are forming a Civic User Testing program to build better tools with the feedback, perspectives and new ideas of their intended users. If you have a desire to build new tools with Oaklanders, if you're a UX professional, or if you want to help build the first user testing program in Oakland's civic space, we invite you to join us on Tuesday, July 15th as we plan out this new project.

We'll be using the work of Smart Chicago as a template to hack for use in our own city. The goals for tomorrow night will be to:

  • Develop a plan of action;
  • Select projects for user testing;
  • Create a framework of priorities, criteria for participants.

So please join us in City Hall at 6:30pm on the 15th (yes tomorrow). Pizza and hacker fun guaranteed. Please RSVP for catering purposes.

Oakland’s City Council Tech to upgrade from 1997 software

To get an idea of how badly Oakland needs to upgrade its digital infrastructure, read this one line from today's staff report:

“Legistar 4.8 has not been upgraded since purchase in 1997 & has reached the limits”

"Limits" in this case being the massive limitations of the current technology: it cannot support better civic engagement and discussion, and it gives our community no ability to access the critical data held in Oakland's legislative system.
There are many big changes desperately needed in our city's tech stack, and this one is long overdue. Our ancient legislation software was the reason Miguel and his crew struggled so hard to complete the build-out of our Councilmatic system. With this big upgrade we'll be using a system similar to other major cities', which means both improved user-facing functionality and a much easier deployment of a more robust Councilmatic system that folks in Philly and Chicago have tailored for this version.

This upgrade hit the city Finance Committee today; we've been waiting for over two years, so it's exciting that this is finally getting approved. While the software upgrade itself is an important step for our city, more important was witnessing the ways our staff and elected officials have adapted their thinking about technology, data, code and procurement. Two years ago there was nothing to brag about, not much to be proud of in our city's use of technology in our law making. Today saw what I think was a pivotal moment for our city. Curious? This gets geeky fast, sorry…

It turns out that there is something the vendor, Granicus, can offer in addition to the basic software: an API. If you're not a tech geek, this essentially means a robot (code, not a real one) that takes in requests from various people, programs and companies and dishes out the requested information in digital form. In this case, the API is something Granicus has built but has not made available to cities that haven't required access to it; almost no one has to date (NYC is just now struggling to get this sorted out and seems to be on the right track). Councilmember Schaaf halted before approving the purchase and asked the committee to require that Granicus provide us with this API as part of the contract requirements. No one in Oakland has ever unbundled the contracted software from the data before (aside from the unintentional case of SeeClickFix, which came with an API we didn't need to request).
This means that we get a new legislative publishing and video streaming system, but we also get direct access to all the data in this system: machine-readable data that allows local hackers and engineers to build alert systems on specific issues and neighborhoods, custom tools to help people stay informed about what our government is doing and, well, anything you may want to do with full access to the data about our decision making and public meeting track records (voting decisions, law sponsoring and more). Stuff civic geeks dream of.
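
For the curious, pulling data out of an API like this takes only a few lines of code. The sketch below follows the shape of Granicus's public Legistar Web API as deployed in other cities; the "oakland" client name is an assumption until our instance actually goes live:

```python
import json
from urllib.request import urlopen

BASE = "https://webapi.legistar.com/v1"

def matters_url(client, top=5):
    """Build a Legistar Web API query for recent legislative matters.

    The endpoint shape mirrors Granicus's public Legistar Web API;
    the client slug passed in (e.g. "oakland") is an assumption here.
    """
    return f"{BASE}/{client}/Matters?$top={top}"

def fetch_matters(client, top=5):
    """Fetch recent matters as parsed JSON (requires network access)."""
    with urlopen(matters_url(client, top)) as resp:
        return json.load(resp)
```

An alert system for a specific neighborhood or issue is then just a filter over these results plus an email or SMS hook.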
After the meeting I emailed LaTonda Simmons, our City Clerk and the manager of this whole system, to thank her for moving this along and making it possible to unlock this data. I was concerned the lack of specificity about the API being public would somehow bite us in the ass; I was wrong. Her response was encouraging: folks in City Hall are listening, and it turns out that geeks can make a difference.

Hi Spike – I spoke to Granicus immediately after to Finance.  They reconfirmed they will turn on API.   And yes, your feedback and that of many others will be important in this process.  More to come and thank you for your support also.  I must add that this wouldn’t have been possible without Bryan making it move.  Looking forward to the next CityCamp event.  Chat soon.

-= LaTonda S.

People in the city are really starting to get this stuff and it's gonna be awesome as it becomes the norm: less bundling of contracted software with the data and so on. And thanks to our new CIO Bryan Sastokas for starting to make things happen!
###
Info on the staff report here.
Oakland’s current system for council info is here.
Side note:
Also on this committee's agenda was an awesome proposal, put forward by the Budget Advisory Committee, to increase and make permanent a number of deeper engagement efforts around the city budget.

Cops Beat Their Wives & Girlfriends At Nearly Double The National Rate

speakingtreasonfluently:

I suppose there are two possible explanations for cops having such higher rates of violence against their partners, relative to the general population.

First, it could be that people with a tendency to violent and abusive behavior disproportionately gravitate towards law enforcement work, because of something specific to their personality types.

Secondly, it could be that the conditions of the job put cops on a violent hair-trigger, which they can’t easily turn off after going home. 

But either way, the results are the same, and the results are the problem. Something has to be done, beginning with figuring out ways to break down the bullshit blue brotherhood (and it is mostly, still, a “brotherhood”) that protects abusive officers, whether their abusiveness transpires at home or on the job and in the streets.


Broadband Access in Alameda County

The digital divide is a very real and very persistent reality in communities like Oakland, California. Knowing which neighborhoods have solid access to high-speed internet is a critical aspect of planning for government- and nonprofit-provided online services: if we want low-income folks from Oakland's flatlands to use a new digital application, we'd damn sure better know how many households in the target areas likely have decent-speed internet hookups at home! Luckily for us, the FCC collects reliable data on this and publishes it freely at a local level.

http://open-oakland.cartodb.com/viz/1d8f9768-d7b9-11e3-b620-0e73339ffa50/embed_map?title=true&description=true&search=true&shareable=true&cartodb_logo=true&layer_selector=false&legends=true&scrollwheel=true&fullscreen=true&sublayer_options=1&sql=&sw_lat=37.655014078010716&sw_lon=-122.55455017089844&ne_lat=37.91224232115994&ne_lon=-121.67564392089844

Do yourself a favor and view the fullscreen version: http://cdb.io/1jq8wDq

I took the raw tract-level data and joined it to census tracts in QGIS, created a new string field called "res_fhsc_per_1000hhs", and calculated the real rate values to display in the map legend and popup. The raw data contain coded values that correspond to real numbers: a 5, for example, means a rate of 800-1,000 connections per 1,000 households. The GeoJSON file was then loaded into a CartoDB mapping system.
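
The decoding step looks roughly like this. The 800-1,000 bin for code 5 comes straight from the data described above; the other bins follow the same 200-wide pattern and should be checked against the FCC's data dictionary before reuse:

```python
# Decode the FCC's coded residential-connection rates into the readable
# labels shown in the map legend and popup. Only code 5's meaning is
# confirmed by the source data; the rest are assumed from the pattern.
FCC_CODE_LABELS = {
    0: "0 per 1,000 households",
    1: "1-200 per 1,000 households",
    2: "200-400 per 1,000 households",
    3: "400-600 per 1,000 households",
    4: "600-800 per 1,000 households",
    5: "800-1,000 per 1,000 households",
}

def decode_rate(code):
    """Translate a coded value into a human-readable rate label."""
    return FCC_CODE_LABELS.get(code, "unknown")
```

Storing the decoded label in its own string field is what lets the map legend show real numbers instead of opaque codes.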

As with most social phenomena, Oakland's east and west flatlands stand out as the parts of the county with quite low home broadband. Those communities may have internet via very slow services that many modern web sites won't run well over (these data include all services providing over 200kbps: try using the web on a 256k plan in 2014!). The data are for 2010 Census tracts and were last collected and published for December 2012. Many households will have improved access since then, and we also know from Pew research that many minority communities use mobile devices as a primary means of internet access.