Oakland’s City Council Tech to upgrade from 1997 software

To get an idea of how badly Oakland needs to upgrade its digital infrastructure, read this one line from today's staff report:

“Legistar 4.8 has not been upgraded since purchase in 1997 & has reached the limits”

The limits in this case are the massive limitations of the current technology to support better civic engagement and discussion, and the complete inability of our community to access the critical data held in Oakland's legislative system.
There are many big changes desperately needed in our city's tech stack, and this one is long overdue. Our ancient legislation software was the reason Miguel and his crew struggled so hard to complete the build-out of our Councilmatic system. With this big upgrade, though, we'll be using a system similar to other major cities', which means both improved user-facing functionality and a much easier deployment of a more robust Councilmatic system that folks in Philly & Chicago have tailored for this version.

This upgrade hit the city Finance Committee today. We've been waiting for over two years, so it's exciting that this is finally getting approved. While the software upgrade itself is an important step for our city, more important was witnessing the ways our staff and elected officials have adapted their thinking about technology, data, code and procurement.  Two years ago there was nothing to brag about, not much to be proud of in our city's use of technology and our lawmaking. Today saw what I think was a pivotal moment for our city. Curious? This gets geeky fast, sorry…

It turns out that there is something the vendor, Granicus, can offer in addition to the basic software: an API. If you're not a tech geek, this essentially means a robot (code, not a real one) that takes in requests from various people, programs and companies and dishes out the information requested in digital form.  In this case, the API is something Granicus has built but has not made available to cities that have not required access to it, which is almost no one to date (NYC is just now struggling to get this sorted out and seems to be on the right track).  Councilmember Schaaf paused before approving the purchase and asked the committee to require that Granicus provide us with this API as part of the contract requirements. No one in Oakland has ever unbundled the contracted software from the data before (aside from the unintentional effort with SeeClickFix, which came with an API we didn't need to request).
This means that we get a new legislative publishing and video streaming system, but we also get direct access to all the data in this system: machine-readable data that allows local hackers and engineers to build alert systems for specific issues and neighborhoods, custom tools to help people stay informed about what our government is doing and, well, anything you may want to do with full access to the data about our decision making and public meeting track records: voting decisions, law sponsoring and more. Stuff civic geeks dream of.
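If you're curious what that looks like in practice, here's a minimal sketch of pulling legislation out of such an API. To be clear: the endpoint, paths and field names below are all hypothetical placeholders, since what Granicus actually turns on for Oakland may look quite different.

```python
import requests

# Hypothetical base URL and paths - the real ones depend entirely on
# what Granicus exposes for Oakland once the API is switched on.
BASE_URL = "https://webapi.example.com/v1/oakland"

def fetch_matters(limit=10):
    """Fetch recent legislative matters (resolutions, ordinances, etc.)."""
    resp = requests.get(f"{BASE_URL}/matters", params={"limit": limit})
    resp.raise_for_status()
    return resp.json()

for matter in fetch_matters():
    # Field names are guesses too - inspect the real payload first.
    print(matter.get("file_number"), "-", matter.get("title"))
```

That's the whole point of machine-readable access: a few lines of code instead of scraping meeting PDFs.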
After the meeting I emailed LaTonda Simmons, our City Clerk and the manager of this whole system, to thank her for moving this forward and making it possible to unlock this data.  I was concerned the lack of specificity about the API being public would somehow bite us in the ass; I was wrong.  Her response was encouraging: folks in city hall are listening, and it turns out that geeks can make a difference.

Hi Spike – I spoke to Granicus immediately after to Finance.  They reconfirmed they will turn on API.   And yes, your feedback and that of many others will be important in this process.  More to come and thank you for your support also.  I must add that this wouldn’t have been possible without Bryan making it move.  Looking forward to the next CityCamp event.  Chat soon.

-= LaTonda S.

People in the city are really starting to get this stuff, and it's gonna be awesome as it becomes the norm: less bundling of contracted software with the data and so on. And thanks to our new CIO Bryan Sastokas for starting to make things happen!
###
Info on the staff report here.
Oakland’s current system for council info is here.
Side note:
Also on this committee’s agenda was an awesome proposal to increase and make permanent a number of deeper engagement efforts around the city budget that the Budget Advisory Committee proposed.

Cops Beat Their Wives & Girlfriends At Nearly Double The National Rate

speakingtreasonfluently:

I suppose there are two possible explanations for cops having such higher rates of violence against their partners, relative to the general population.

First, it could be that people with a tendency to violent and abusive behavior disproportionately gravitate towards law enforcement work, because of something specific to their personality types.

Secondly, it could be that the conditions of the job put cops on a violent hair-trigger, which they can’t easily turn off after going home. 

But either way, the results are the same, and the results are the problem. Something has to be done, beginning with figuring out ways to break down the bullshit blue brotherhood (and it is mostly, still, a “brotherhood”) that protects abusive officers, whether their abusiveness transpires at home or on the job and in the streets.


Broadband Access in Alameda County

The digital divide is a very real and very persistent reality in communities like Oakland, California.  Knowing which neighborhoods have solid access to high-speed internet is a critical aspect of planning for government- and nonprofit-provided online services: if we want low-income folks from Oakland's flatlands to use a new digital application, we'd damn sure better know how many households in the target areas likely have decent-speed internet hookups at home!  Luckily for us, the FCC collects reliable data on this and publishes it freely at a local level.


Do yourself a favor and view the fullscreen version: http://cdb.io/1jq8wDq

I took the raw tract-level data and joined it to census tracts in QGIS, then calculated a new string field called “res_fhsc_per_1000hhs” holding the real rate values to display in the map legend and popup. The raw data contains coded values that correspond to real numbers, so a 5 means a rate of 800-1,000 per 1,000 households. The GeoJSON file was then loaded into a CartoDB mapping system.
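For the curious, the decode step looks roughly like this. Only the code 5 band is stated above; the other bands follow the same 200-unit pattern and are my assumption, so check the FCC's code sheet before trusting them.

```python
# Minimal sketch of the decode step that built the string field above.
# Only code 5 is confirmed in this post; the rest are assumed bands.
CODE_TO_RATE = {
    0: "0 per 1,000 households",
    1: "1-200 per 1,000 households",      # assumed band
    2: "200-400 per 1,000 households",    # assumed band
    3: "400-600 per 1,000 households",    # assumed band
    4: "600-800 per 1,000 households",    # assumed band
    5: "800-1,000 per 1,000 households",  # stated in the post
}

def decode_rate(code):
    """Translate an FCC coded value into a human-readable rate band."""
    return CODE_TO_RATE.get(int(code), "unknown code")

print(decode_rate(5))  # -> "800-1,000 per 1,000 households"
```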

As with most social phenomena, Oakland's east and west flatlands stand out as parts of the county with quite low home broadband. Those communities may have internet via very slow services that many modern web sites won't run well over (these data include all services providing over 200 kbps; try using the web on a 256k plan in 2014!).  The data are for 2010 Census tracts and were last collected and published for December 2012. Many households will have improved access since then, and we also know from Pew research that many minority communities use mobile devices as a primary means of internet access.

Power struggles

If you’d prefer to never see this kind of mess happening Oakland (thanks Berkeley for the great non-example), you should join Oakland Votes and many residents of the city to work on the creation of an independent Redistricting Commission for our city! Details on the flyer below- this will be a ballot measure this November assuming that city council passes it.  The meeting is to get community input into the model Oakland chooses to adopt- both California and Austin have done this and we can learn form their efforts! We’ll have good food and you’ll get to play a valuable part in shaping the future of our city, and the future shape of our council maps too!

props to @mollyampersand for the original, distant design elements.

Oakland gets its crime data feed on

You may not have heard about this yet, which is a shame.  It's a shame because it's a rare good thing in local government tech, because it's a serious milestone for our city hall, and because our local government isn't yet good at telling our community about the awesome things that do happen in Oakland's city hall. But I'm excited, and I'm impressed that we've gotten here: Oakland's crime report data is now being published daily, automatically, to the web, freely available for all.

This is quite a leap forward from where we were several years ago, and in fact just a year ago to be honest: spreadsheet hell. Often photocopied spreadsheet hell. Things happen slowly, but some things we've pushed for because we recognize their potential to change the game forever. First we pushed for open data as a policy in the city, and we got there quick enough, but we're now waiting in expectation for our new CIO Bryan Sastokas to publish the city's very crucial open data implementation plan.  Then we started pushing public records into the open with the excellent RecordTrac app, which makes all public information requests public unless they relate to sensitive crime info. And now, with local developers soaking up the public data available, we have the first-ever Oakland crime feed and an API to boot!

The API isn’t actually something the city built, it’s a significant side benefit of their Socrata data platform- just publish data in close to real time and their system will give you a tidy API to make it discoverable and connectable.You can access their API here:

http://data.oaklandnet.com/resource/ym6k-rx7a.json
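If you want to poke at it from code, here's a minimal sketch. The endpoint is the one above, and $limit is a standard Socrata query parameter; the fields inside each record will depend on the dataset's actual schema.

```python
import requests

# The city's Socrata endpoint for daily crime reports (from above).
URL = "http://data.oaklandnet.com/resource/ym6k-rx7a.json"

# $limit is a standard Socrata (SoQL) parameter; start small.
resp = requests.get(URL, params={"$limit": 20})
resp.raise_for_status()

for report in resp.json():
    print(report)  # each report comes back as a plain JSON object
```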

If you’re more of an analyst or a journo or a citizen scientist you may want the data in bulk, which you can grab here.  That link will get you to a file that is updated on a daily basis- pretty rad huh. Given how crime reports tend to trickle in- some get reported days after, some months, some get adjusted, the data will change over time- the only way to build a complete, perfect dataset is to constantly review for changes, update, replace etc- a very complicated task.  If you want a bulk chunk of data covering multiple years, with many richer fields and much higher quality geocoding you can grab what we’ve published at http://newdata.openoakland.org/dataset/crime-reports (covers 2003 to 2013) and for the more recent historic version you can use this file: http://newdata.openoakland.org/dataset/oakland-crime-data-2007-2014

Now that we’ve figured out how to pump crime report data out of the city firewall, we can get to work connecting it to oakland.crimespotting.org and building dashboards to support community groups, city staff and more!

So thank you to the city staff who worked to get this done- now let’s get hacking!

Sidebar: Oakland has finally gotten hold of its new Twitter handle: @Oakland is now online! More progress…

Numbers and nonsense in Oakland’s Search for Public Safety

Oakland is once again talking about data and facts concerning crime, causes and policing practices. Except we're not, really. We're talking about an incredibly thin slice of a big reality, a thin slice that's not particularly helpful, revealing or empowering. And this is how we always do it.

Chip Johnson is raising the flag on our lack of a broad discussion about the complexity of policing practices and the involvement of African-Americans in the majority of serious crimes in our city, and on that I say he's dead right: these are hard conversations and we've not really had them openly. The problem is, the data we're given as the public (and our decision makers have about the same) are not sufficient to plan with, make decisions from or understand much at all.  Once again we're given a limited set of summary tables that present just tiny nuances of reality and that do not allow for any actual analyses by the public or by policy makers. And if you believe that internal staff get richer analysis and research to work with, you're largely wrong.

When we assume that a few tables of selectively chosen metrics suffice as public information and justification for decisions or statements, we're all getting ripped off.  And the truth is our city departments (OPD especially) do not have the capacity for thoughtful analytics and research into complex data problems like these.  And this is a real problem.  Our city desperately needs applied data capacity, not from outside firms on consultancy (disclosure: my current role does this sometimes for the city) but as built-up internal capacity.  There is a strong argument for external, independent access to data for reliable analysis in many cases, but our city spends hundreds of millions per year and we don't have a data SWAT team to work on these issues for internal planning.  Take a look at what New York City does with simple yet powerful data analytics that saves lives, saves money and makes the city safer.  This is what smart businesses do to drive better decision making.

Data, in context, with local knowledge and experience, evidence-based practices (those showing success elsewhere) and a good process will yield smarter decisions for our city.

Data tables do not tell us about any nuances in police stops; we don't know how these data vary across different neighborhoods, nor anything about the actual situations around each stop. The lack of real data showing incident-level activity makes any real understanding impossible.

For example, the data report shows that stops of White people yield a slightly higher proportion of seizures/recoveries, so logic says: why doesn't OPD pull over more White folks, if those stops lead to solid hits at least as often?
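That's a comparison you could run in a few lines if we had the underlying counts. A toy sketch with made-up numbers, since the report's summary tables are all we actually get:

```python
# Toy numbers, purely illustrative - the real report only publishes
# summary tables, which is exactly the problem described above.
stops = {
    "Group A": {"stops": 5000, "recoveries": 300},
    "Group B": {"stops": 1200, "recoveries": 80},
}

for group, s in stops.items():
    hit_rate = s["recoveries"] / s["stops"]
    print(f"{group}: {hit_rate:.1%} of stops yielded a seizure/recovery")
```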

Back in 2012 the OPD gave Urban Strategies Council all their stop report data to analyze, but there was no context nor any clear path of analysis suggested, making it near impossible to produce thoughtful results; nor was it part of our actual contract.  But the data exist and should be used by the city to really understand how our police operate, the context of their work and the patterns that lead to meaningful impacts, rather than habits that are never reflected upon, questioned or changed.

It is not our city's job to just do the work, process the paperwork and never objectively review meta-level issues.  According to our Mayor, “Moving forward, police will be issuing similar reports twice a year.” We need data geeks in city hall to support our police and all departments, and in 2014 we need to be better than data reports consisting of a set of summary tables alone.  Pivot tables are not enough for public policy.

If you’re still reading- the same problem arises with relation to the Shot Spotter situation- the Chief doesn’t think it’s worth the money, but our Mayor and CMs want to keep it- we now have the data available for the public but we’ve not really had any objective evaluation of the systems utility for OPD use- and we’ve certainly not had a conversation in public about the potential benefits of public access to this data in more like real time! Just looking at the horrendous reality of shootings in East Oakland over the past five years makes one pause very somberly when considering how much the OPD must deal with and how much they need more analytical guidance to do their jobs better and more efficiently.


For a crazy look at shootings by month over these five years, take a look at this animation, with the caveat that not all of the city had sensors installed the whole time, and that on holidays a lot of incidents in the data are likely fireworks!  It makes me want to know why there is a small two-block section of East Oakland with no gunshots in five years. The data have been fuzzed to be accurate to no more than 100 feet, but this still looks like an oasis. Who knows why?
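For anyone wondering what “fuzzed” means here: one common approach is to nudge each point by a random offset within a fixed radius. I don't know exactly which method was used on this data, so treat this sketch as illustrative only.

```python
import math
import random

FEET_PER_DEGREE_LAT = 364000  # rough conversion; fine for fuzzing

def fuzz(lat, lon, max_feet=100):
    """Offset a point by a random distance of up to max_feet.

    One common way to fuzz point data for privacy; illustrative only.
    """
    r = max_feet * math.sqrt(random.random())  # uniform over the disc
    theta = random.uniform(0, 2 * math.pi)
    dlat = (r * math.cos(theta)) / FEET_PER_DEGREE_LAT
    dlon = (r * math.sin(theta)) / (
        FEET_PER_DEGREE_LAT * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

print(fuzz(37.75, -122.20))  # a jittered East Oakland point
```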

Oakland’s Shot Spotter action in 2013

Given OPD’s recent suggestion that they want to ditch the Shot Spotter system and given the data are available, it seems worthwhile to start digging into the data to see what use they may have, starting with public benefits.  This map is a really simple visualization of the shots from January  to October of 2013.  At city level it becomes a mess, but at neighborhood level it is far more revealing.  Data in web friendly formats are available here also.

You can view it fullscreen here.

http://spjika.cartodb.com/viz/bd291444-b43b-11e3-83bf-0e10bcd91c2b/embed_map?title=true&description=true&search=false&shareable=true&cartodb_logo=true&layer_selector=false&legends=false&scrollwheel=true&fullscreen=true&sublayer_options=1&sql=SELECT%20*%20FROM%20oakshots%20where%20date_tim%20between%20
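The embed above is driven by a plain SQL filter on an oakshots table's date_tim column, so you can pull the same slice yourself. A minimal sketch; the SQL API endpoint format and the date format are my assumptions, so check CartoDB's docs before relying on it.

```python
import requests

# Table (oakshots) and column (date_tim) come from the embed URL above;
# the /api/v2/sql endpoint format and date format are assumptions.
SQL_URL = "http://spjika.cartodb.com/api/v2/sql"
query = ("SELECT * FROM oakshots "
         "WHERE date_tim BETWEEN '2013-01-01' AND '2013-10-31'")

resp = requests.get(SQL_URL, params={"q": query})
resp.raise_for_status()
rows = resp.json().get("rows", [])
print(len(rows), "shot reports in the Jan-Oct 2013 window")
```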

To see the areas of the city formally covered by this system use these [ugly] maps.

Truancy vs Absence

Sometimes good public policy just comes down to the money.

The Attorney General, Kamala Harris, has published a nicely designed report to raise the profile of chronic absenteeism (kids missing more than 10% of school per year) and truancy (being late; hmm, not much of a comparison).  It does a good job of laying out the complex realities that are allowing this problem to continue and to grow in some parts of the state and in many communities.  The underlying root causes are complex, but the system components are as simple (sorta) as poor data management systems, lack of awareness and lack of effective interventions.  While many people won't empathize with the long-term impact students really do suffer from chronic absence, or with the fact that it predicts graduation incredibly well as early as 3rd grade, others just need to see the dollar figure.  The AG's estimates put the cost to our state as a result of dropouts at $46 billion a year. Given the well-established links between chronic absence and graduation (I helped build that case with some work in 2007 with Hedy Chang), we can't pretend that this is something not worth investing in; the cost benefits are enormous.

One of the more incredible findings in the AG’s work was this:

Student record systems need repair and upgrade to accurately measure, monitor and respond to truancy.

California is one of only four states in the country that does not collect individualized student attendance records at the state level. Even at the local level, only half of the school districts that responded to our California School District Leadership Survey (“District Leadership Survey”) were able to confirm that they track student absence records longitudinally—that is, they track individual students’ attendance year after year. The failure to collect, report and monitor real-time information about student attendance renders our most at-risk children – including English learners, foster children and low-income free- and reduced-price lunch students – invisible.

There are few people who would expect that our modern society is incapable of tracking student attendance in a meaningful way, yet many of California's school districts are struggling to do just this.  And the lowdown is that a lack of data means a lack of scalable understanding: a single teacher may be aware of a student's absenteeism, but when a struggling kid moves up a grade, the knowledge of this problem is lost, and the administrators sure aren't aware that the same child is on the same bad trajectory year after year.  We often see fancy, video-screen-laden operations centers on TV, but the reality described in this report is that many districts aren't doing basic analysis of chronic absenteeism, and as such are surely not aware of or working on successful interventions to assist these students.
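The longitudinal check the report describes isn't exotic analytics; it's a few lines once the records exist in one place. A minimal sketch with a made-up schema, just to show the shape of it.

```python
import pandas as pd

# Hypothetical schema: one row per student per school year.
records = pd.DataFrame({
    "student_id":    [1, 1, 2, 2],
    "year":          [2012, 2013, 2012, 2013],
    "days_enrolled": [180, 180, 180, 180],
    "days_absent":   [22, 25, 5, 8],
})

# Chronic absence = missing more than 10% of the school year.
records["chronic"] = records["days_absent"] > 0.10 * records["days_enrolled"]

# The longitudinal view districts say they can't produce: the same
# student flagged as chronically absent year after year.
repeat = records.groupby("student_id")["chronic"].sum() >= 2
print(repeat[repeat].index.tolist())  # students chronic in 2+ years
```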

Give it a read; it's well written and illustrates a real crisis in our cities. My 2c is that we should largely ignore the truancy data presented; it's a red herring. Follow the money and you'll see the outcomes of chronic absenteeism as a huge money pit, unless we really address this across our state equitably.

http://oag.ca.gov/truancy