Over the past few weeks, the future of the open internet has come into sharp focus, as the FCC’s 2010 open internet rules were struck down in court, and then plans for new rules from the FCC came into public view. Amidst fears that the internet is f**ked, debate has raged about what…
If you’d prefer never to see this kind of mess happening in Oakland (thanks, Berkeley, for the great non-example), you should join Oakland Votes and many residents of the city to work on the creation of an independent Redistricting Commission for our city! Details are on the flyer below- this will be a ballot measure this November, assuming that city council passes it. The meeting is to get community input into the model Oakland chooses to adopt- both California and Austin have done this and we can learn from their efforts! We’ll have good food and you’ll get to play a valuable part in shaping the future of our city, and the future shape of our council maps too!
props to @mollyampersand for the original, distant design elements.
You may not have heard about this yet, which is a shame. It’s a shame because it’s a rare good thing in local government tech, because it’s a serious milestone for our city hall, and because our local government isn’t yet practiced at telling our community about the awesome things that do happen in Oakland’s city hall. But I’m excited, and I’m impressed that we’ve gotten here: Oakland’s crime report data is now being published daily, automatically, to the web, freely available for all.
This is quite a leap forward from where we were several years ago, and in fact just a year ago to be honest: spreadsheet hell. Often photocopied spreadsheet hell. Things happen slowly, but some things we’ve pushed for because we recognize their potential to change the game forever. First we pushed for open data as a policy in the city, and we got there quickly enough, but we’re now waiting in expectation for our new CIO Bryan Sastokas to publish the city’s very crucial open data implementation plan. Then we started pushing public records into the open with the excellent RecordTrac app, which makes all public information requests public unless they relate to sensitive crime information. And now, with local developers soaking up the public data available, we have the first-ever Oakland crime feed, and an API to boot!
The API isn’t actually something the city built; it’s a significant side benefit of their Socrata data platform: just publish data in close to real time and their system will give you a tidy API that makes it discoverable and connectable. You can access their API here:
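A minimal sketch of what querying a Socrata SODA endpoint looks like. Note that the dataset ID (`XXXX-XXXX`) and the field names (`crimetype`, `casenumber`, `datetime`) below are placeholders I’ve made up for illustration, not the real Oakland schema:

```python
import json
from urllib.parse import urlencode

# Placeholder endpoint -- substitute the real Oakland dataset ID.
BASE = "https://data.oaklandnet.com/resource/XXXX-XXXX.json"

def build_query(limit=100, where=None):
    """Build a SODA API query URL with optional server-side filtering."""
    params = {"$limit": limit, "$order": "datetime DESC"}
    if where:
        params["$where"] = where
    return BASE + "?" + urlencode(params)

# e.g. the 50 most recent burglary reports:
url = build_query(limit=50, where="crimetype = 'BURGLARY'")
print(url)

# The API responds with JSON: a list of objects, one per report.
sample_response = '[{"casenumber": "14-001234", "crimetype": "BURGLARY"}]'
reports = json.loads(sample_response)
print(reports[0]["casenumber"])  # 14-001234
```

The `$limit`, `$where`, and `$order` parameters are standard Socrata SoQL; everything dataset-specific above is an assumption.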
If you’re more of an analyst, a journo, or a citizen scientist, you may want the data in bulk, which you can grab here. That link will get you a file that is updated on a daily basis- pretty rad, huh? Given how crime reports tend to trickle in (some get reported days later, some months later, and some get adjusted), the data will change over time- the only way to build a complete, accurate dataset is to constantly review for changes, update, and replace records, which is a complicated task. If you want a bulk chunk of data covering multiple years, with many richer fields and much higher quality geocoding, you can grab what we’ve published at http://newdata.openoakland.org/dataset/crime-reports (covers 2003 to 2013), and for the more recent historic version you can use this file: http://newdata.openoakland.org/dataset/oakland-crime-data-2007-2014
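Because the feed changes retroactively, the only safe pattern for building a complete local copy is an upsert: key each report by its case number and let newer snapshots overwrite older records. A minimal sketch (the field names here are my assumptions, not the feed’s documented schema):

```python
def upsert_snapshot(master, daily_reports):
    """Merge a daily snapshot into the master dataset.

    Reports are keyed by case number; a re-appearing case number
    replaces the old record (late filings and corrections both
    arrive this way), so the master copy converges over time.
    """
    for report in daily_reports:
        master[report["casenumber"]] = report
    return master

master = {}
day1 = [{"casenumber": "14-001", "crimetype": "ROBBERY"}]
day2 = [
    {"casenumber": "14-001", "crimetype": "BURGLARY"},  # corrected report
    {"casenumber": "14-002", "crimetype": "THEFT"},     # late-arriving report
]
upsert_snapshot(master, day1)
upsert_snapshot(master, day2)
print(len(master))                    # 2 distinct cases
print(master["14-001"]["crimetype"])  # BURGLARY (the corrected value)
```

This is why a simple "append each daily file" approach quietly accumulates duplicates and stale records.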
Now that we’ve figured out how to pump crime report data out of the city firewall, we can get to work connecting it to oakland.crimespotting.org and building dashboards to support community groups, city staff and more!
So thank you to the city staff who worked to get this done- now let’s get hacking!
Side bar- Oakland has finally gotten hold of its new Twitter handle: @Oakland is now online! More progress…
Oakland is once again talking about data and facts concerning crime, causes, and policing practices- except we’re not really. We’re talking about an incredibly thin slice of a big reality, a thin slice that’s not particularly helpful, revealing, or empowering. And this is how we always do it.
Chip Johnson is raising the flag on our lack of a broad discussion about the complexity of policing practices and the involvement of African-Americans in the majority of serious crimes in our city, and on that I say he’s dead right: these are hard conversations and we’ve not really had them openly. The problem is, the data we’re given as the public (and our decision makers have about the same) are not sufficient to plan with, to make decisions from, or to understand much at all. Once again we’re given a limited set of summary tables that present just tiny nuances of reality and do not allow for any actual analysis by the public or by policy makers. And if you believe that internal staff get richer analysis and research to work with, you’re largely wrong.
When we assume that a few tables of selectively chosen metrics suffice as public information and as justification for decisions or statements, we’re all getting ripped off. And the truth is our city departments (OPD especially) do not have the capacity for thoughtful analytics and research into complex data problems like these. This is a real problem. Our city desperately needs applied data capacity- not from outside firms on a consultancy basis (disclosure: my current role does this sometimes for the city) but as built-up internal capacity. There is a strong argument for external, independent access to data for reliable analysis in many cases, but our city spends hundreds of millions per year and we don’t have a data SWAT team to work on these issues for internal planning. Take a look at what New York City does with simple yet powerful data analytics that save lives, save money, and make the city safer. This is what smart businesses do to drive better decision making.
Data, in context, with local knowledge and experience, evidence based practices (those showing success elsewhere) and a good process will yield smarter decisions for our city.
Data tables tell us nothing about the nuances of police stops: we don’t know how these data vary across different neighborhoods, nor anything about the actual circumstances of each stop. The lack of real, incident-level data makes any genuine understanding impossible.
For example, the data report shows that stops of White residents yield a slightly higher proportion of seizures/recoveries- so logic asks, why doesn’t the OPD pull over more White folks, if those stops lead to solid hits at least as often?
Back in 2012 the OPD gave Urban Strategies Council all their stop report data to analyze, but there was no context nor any clear path of analysis suggested, making it nearly impossible to produce thoughtful results- nor was it part of our actual contract. But the data exist and should be used by the city to really understand how our police operate, the context of their work, and the patterns that lead to meaningful impacts, rather than habits that are never reflected upon, questioned, or changed.
It is not our city’s job to just do the work, process the paperwork, and never objectively review meta-level issues. According to our Mayor, “Moving forward, police will be issuing similar reports twice a year”. We need data geeks in city hall to support our police and all departments, and in 2014 we need to do better than data reports that consist of a set of summary tables alone. Pivot tables are not enough for public policy.
If you’re still reading: the same problem arises with the Shot Spotter situation. The Chief doesn’t think it’s worth the money, but our Mayor and council members want to keep it. We now have the data available to the public, but we’ve not really had any objective evaluation of the system’s utility for OPD use- and we’ve certainly not had a public conversation about the potential benefits of public access to this data in closer to real time! Just looking at the horrendous reality of shootings in East Oakland over the past five years makes one pause very somberly when considering how much the OPD must deal with, and how much they need more analytical guidance to do their jobs better and more efficiently.
For a crazy look at shootings by month over these five years, take a look at this animation- with the caveat that not all of the city had sensors installed the whole time, and that on holidays a lot of incidents in the data are likely fireworks! It makes me want to know why there is a small two-block section of East Oakland with no gunshots in five years- the data have been fuzzed to be accurate to no more than 100 feet, but this still looks like an oasis- who knows why?
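For readers wondering what “fuzzed to 100 feet” can mean in practice, one common approach is snapping coordinates to a coarse grid. The actual method used for this dataset isn’t documented, so treat this as an illustrative sketch only:

```python
def snap_to_grid(lat, lon, grid_feet=100):
    """Snap a coordinate to a grid of roughly grid_feet spacing.

    One degree of latitude is about 364,000 feet; for simplicity this
    sketch scales longitude the same way, which is close enough at
    Oakland's latitude for illustration purposes.
    """
    step = grid_feet / 364000.0  # degrees per grid cell
    snap = lambda v: round(v / step) * step
    return snap(lat), snap(lon)

# A point near downtown Oakland, moved to the nearest grid intersection:
lat, lon = snap_to_grid(37.804363, -122.271111)
print(lat, lon)
```

After snapping, every incident within the same ~100-foot cell reports the identical location, which is why fuzzed points cluster and why a genuinely empty cell stands out on a map.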
Given OPD’s recent suggestion that they want to ditch the Shot Spotter system, and given that the data are available, it seems worthwhile to start digging into the data to see what use they may have, starting with public benefits. This map is a really simple visualization of the shots from January to October of 2013. At city level it becomes a mess, but at neighborhood level it is far more revealing. Data in web-friendly formats are available here also.
You can view it fullscreen here.
To see the areas of the city formally covered by this system use these [ugly] maps.
Sometimes good public policy just comes down to the money.
Attorney General Kamala Harris has published a nicely designed report to raise the profile of chronic absenteeism (kids missing more than 10% of school per year) and truancy (being late- hmm, not much of a comparison). It does a good job of laying out the complex realities that are allowing this problem to continue, and to grow in some parts of the state and in many communities. The underlying root causes are complex, but the system components are as simple (sorta) as poor data management systems, lack of awareness, and lack of effective interventions. While many people will not empathize with the long-term impact that students really do suffer from chronic absence, or with the fact that it predicts graduation incredibly well as early as 3rd grade, others just need to see the dollar figure. The AG’s estimates put the cost to our state as a result of dropouts at $46 billion a year. Given the well-established links between chronic absence and graduation (I helped build that case with some work in 2007 with Hedy Chang), we can’t pretend that this is something not worth investing in- the cost benefits are enormous.
One of the more incredible findings in the AG’s work was this:
Student record systems need repair and upgrade to accurately measure, monitor and respond to truancy.
California is one of only four states in the country that does not collect individualized student attendance records at the state level. Even at the local level, only half of the school districts that responded to our California School District Leadership Survey (“District Leadership Survey”) were able to confirm that they track student absence records longitudinally—that is, they track individual students’ attendance year after year. The failure to collect, report and monitor real-time information about student attendance renders our most at-risk children – including English learners, foster children and low-income free- and reduced-price lunch students – invisible.
There are few people who would expect that our modern society is incapable of tracking student attendance in a meaningful way, yet many of California’s school districts are struggling to do just that. And the lowdown is that lack of data means lack of scalable understanding- a single teacher may be aware of a student’s absenteeism, but when a struggling kid moves up a grade, that knowledge is lost, and the administrators sure aren’t aware that the same child is on the same bad trajectory year after year. We often see fancy, video-screen-laden operations centers on TV, but the reality described in this report is that many districts aren’t doing basic analysis of chronic absenteeism, and as such are surely not aware of, or working on, successful interventions to assist these students.
Give it a read- it’s well written and illustrates a real crisis in our cities. My 2c is that we should largely ignore the truancy data presented- it’s a red herring. Follow the money and you’ll see the outcomes of chronic absenteeism as a huge money pit, unless we really address this across our state equitably.
I was inspired by a recent piece by the wonderful @jedsundwall on his gov3.0 blog about the need to go beyond data portals (much like a recent book I contributed to, Beyond Transparency- shameless plug, yes).
Jed totally hits it with this assessment of a growing attitude in local government towards just getting the data out:
It’s time to acknowledge that data is not made useful simply by making it available online. As we work to make data open and available, we also need to train people who can help make it accessible and useful.
In cities locally and globally, the concept of open data is being pitched by vendors as a simple, turnkey thing that cities purchase and check off their list of good-government tasks. Not enough cities have realized that this huge data resource is an amazingly underutilized and under-leveraged asset. In Oakland, so much of the data being published leaves much to be desired and leads to dozens of new questions about the source, quality, meaning, and completeness of these data, but the city isn’t really embracing this as a way to engage the community and to see these data reach more of their potential.
Jed goes on to suggest an alternative reality where data support exists side by side with the data portals:
You’re doing your research, but you’ve heard of the San Diego Regional Data Library. You go to its website and see that you can email, call, or chat online with a data librarian who can help you find the information you need. You call the library and speak with a librarian who tells you that the data you need is provided by the county rather than the city. You also learn about datasets available from California’s Department of Transportation, a non-profit called BikeSD, Data.gov and some other data from the city that hasn’t been opened up yet.
This is where my two worlds collide. The #opendata & #opengov world is leading and pushing from a certain position, mostly not connected to the existing community research, indicator, and data world, while the community indicators world has been slow to embrace this brave new world of easy access to data. We need to get along and to understand each other’s positions and intentions- then we can really make this #datadriven world matter for our communities.
The concept of a data library is very similar to what groups like Urban Strategies Council have been doing for 15 years with our InfoAlamedaCounty.org project. For a long time we’ve seen the need to provide communities with reliable research and data to drive action and we’ve struggled to get access to data for this entire time.
We formed the National Neighborhood Indicators Partnership in 1992 with the Urban Institute to help build the field nationally and empower more communities in just this way- we have a mandate to publish data and make usable, actionable information available to communities equitably. Our partner organizations in 37 cities have local and regional knowledge, plus expertise in community development, reentry, public safety, economic development, education, and health, so we’re able not just to provide raw and improved data- we’re able to be an active, responsive voice in our communities that makes more data actionable.
Many NNIP partners are starting to embrace the open data world and this is a powerful recipe for a data driven future that is focused on equity in our cities- most NNIP partners have a social mission as opposed to just doing data in a cold, calculated way. But the unfortunate truth is that as our cities are becoming more data rich, many NNIP partners are facing declining funding to help support community uses of data. It would be a mistake for funders to largely lose interest in community data intermediaries (not a sexy concept) in the excitement over open data, because none of these data become actionable and meaningful without serious support, engagement and use.
The data library is a great concept, and our experience in Oakland and many other cities says there’s huge need and value for such an entity. Our cities can themselves play some part by being more engaged through their open data resources, but that’s never going to be enough- even though Chicago has fantastic staff who engage, there’s still a role for the Smart Chicago Collaborative’s effort to bring that power out to communities across the city.
More data, more engagement, more power to the people?
I’m a data portal skeptic. I have been for years, but I’ve gotten tripped up when trying to explain why. I’m certainly not anti open data. I’m not even anti data portal. But I worry that organizations think that setting up an open data portal is a way to make data useful, when it’s really just a…
Traditional public information seeks to meet minimal legal requirements, maintain power inside institutions, discourage dissent, and deliver a pre-determined result. Community engagement seeks a higher reward by respecting the power, wisdom and experience of residents, and engaging them in decision-making – the consent of the governed.
A sneak preview of my favorite line from our forthcoming report on the efforts of the Oakland Votes Coalition in 2013! Prose from Sharon Cornu.