The California Report Card Version 1.0
The CITRIS Connected Communities Initiative at UC Berkeley
Mobile technology offers new opportunities for the public to express views and insights, consider the views of others, and directly engage with political leaders (e.g., Hemphill & Roback 2014, Himmelroos & Christensen 2014, Graham et al. 2013, Landemore 2013).
However, the volume and diversity of ideas can be difficult to manage and may not yield actionable suggestions. A specific and timely response from government leaders is important to close the loop and sustain engagement (NDI 2014; Newsom 2013).
The California Report Card (CRC) v1.0 is an experimental platform that streamlines public input by openly encouraging suggestions from a broad range of participants and by combining peer-to-peer review with statistical models to identify and highlight the most insightful ideas.
The CITRIS Data and Democracy Initiative developed the CRC in collaboration with the Office of California Lt. Governor Gavin Newsom to explore how technology can improve communication between voters and public officials. The CRC aims to increase public engagement with government and to tap California’s collective intelligence. In CRC v1.0, participants graded the State of California on six timely issues and suggested topics that deserve increased priority at the state level.
Between January and June 2014, more than 9,000 California residents from all 58 counties assigned over 23,000 grades to the state and suggested over 300 issues for increased priority. Among other topics, the CRC revealed strong interest in statewide disaster preparedness, prompting a specific response from Lt. Governor Newsom on 20 March 2014, when he announced that this issue would become a top priority for him and his staff. The CRC v1.0 and associated data can be accessed here.
The California Report Card v1.0
The CRC integrates elements of two projects: the Opinion Space project developed at UC Berkeley from 2009 to 2013 (Faridani et al. 2010), and the Citizen Report Card concept developed by the World Bank to assess government performance in developing economies (World Bank 2014). To increase accessibility, the CRC was built using HTML5 to enable access across mobile and desktop devices.
The CRC v1.0 is available to anyone online. It asked participants to grade the state on six timely issues:
- Implementation of the Affordable Care Act (“Obamacare”)
- Quality of K-12 public education
- Affordability of state colleges and universities
- Access to state services for undocumented immigrants
- Laws and regulations regarding recreational marijuana
- Marriage rights for same-sex partners
In contrast to traditional polls and surveys, the CRC provides instant feedback by revealing the median grade from all participants each time a grade is entered (see Fig. 2a). The system measures changes in grades to evaluate the effect of social influence bias (Krishnan et al. 2014).
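The instant-feedback step described above can be sketched as follows. This is an illustrative sketch only: the 0–100 grade encoding, the `submit_grade` function, and the in-memory data layout are assumptions for exposition, not the CRC's actual implementation.

```python
from statistics import median

# Illustrative sketch of the CRC's instant-feedback step. Grades are
# assumed to be encoded on a 0-100 scale; the median of all grades for
# an issue is revealed to a participant immediately after they submit
# their own grade. Data here is synthetic.
grades = {"Implementation of the Affordable Care Act": [85, 70, 90, 65, 75]}

def submit_grade(issue: str, grade: int) -> float:
    """Record a participant's grade and return the current median."""
    grades.setdefault(issue, []).append(grade)
    return median(grades[issue])

# A new participant assigns a grade and instantly sees the median:
print(submit_grade("Implementation of the Affordable Care Act", 80))  # → 77.5
```

Because each submission updates the revealed median, comparing grades entered before and after seeing the median is what allows the system to measure social influence bias.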
Participants are then invited to evaluate and suggest issues to be considered on the next CRC (Fig. 2b-c). In the “CAFÉ” ideation space, participants can suggest an issue for increased priority at the state level. This phase is illustrated as a café table where participants can discuss issues over a cup of coffee. Suggestions are displayed as mugs (see Fig. 2b) and are positioned using Principal Component Analysis on the grades assigned to the initial six issues: mugs in closer proximity represent participants who graded the six issues similarly. Participants evaluate others’ suggestions by assigning grades on two axes: “How important is this issue for the next report card?” and “How would you grade the State of California on this issue today?” (Fig. 2c). An earlier study showed that participants explore more divergent ideas with such a graphical display than with list-based approaches (Faridani et al. 2010).
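The PCA layout step can be sketched as below. The grade encoding and data are synthetic assumptions; only the projection technique mirrors the CAFÉ layout described above.

```python
import numpy as np

# Sketch of the CAFE layout step: each participant is a 6-vector of
# grades (assumed 0-100 scale) on the six report-card issues. PCA
# projects these vectors onto the two directions of greatest variance,
# so mugs that land near each other belong to participants who graded
# the six issues similarly. Data here is synthetic.
rng = np.random.default_rng(0)
grade_matrix = rng.integers(0, 101, size=(9, 6)).astype(float)

def pca_positions(X: np.ndarray) -> np.ndarray:
    """Project rows of X onto their first two principal components."""
    centered = X - X.mean(axis=0)
    # SVD of the centered data yields the principal axes as rows of Vt.
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:2].T  # (n_participants, 2) mug coordinates

positions = pca_positions(grade_matrix)  # one (x, y) per suggestion mug
```

Projecting onto the top two components preserves as much of the variance in the six-dimensional grade vectors as any two-dimensional linear layout can, which is what makes proximity on the café table meaningful.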
Responses from the 9,000 participants confirm approval of California’s rollout of Obamacare, but convey mixed sentiment on California’s education system. To evaluate the representativeness of these results, a randomized reference survey was conducted with 611 participants. A comparison of the mean grades on the CRC and the reference survey showed remarkable similarity: over the six issues, the average difference between the mean CRC grades and the mean reference survey grades was 3%. In the ideation space, suggestions from a broad cross section of Californians revealed new insights on statewide concerns, including widespread interest in greater statewide disaster preparedness. All research was conducted with approval from the Institutional Review Board at UC Berkeley.
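The representativeness check amounts to a simple issue-by-issue comparison of means. The sketch below uses invented numbers on an assumed 100-point scale; only the method of comparison mirrors the check described above.

```python
import numpy as np

# Hypothetical illustration of the representativeness check: compare
# mean grades from the open CRC platform against a randomized
# reference survey, issue by issue, on an assumed 100-point scale.
# These six numbers per source are made up for illustration.
crc_means = np.array([82.0, 64.0, 58.0, 70.0, 61.0, 88.0])
survey_means = np.array([80.0, 67.0, 55.0, 72.0, 63.0, 86.0])

# Average absolute difference, in points on the 100-point scale,
# i.e. the percentage-point gap between the two sources:
avg_diff = float(np.mean(np.abs(crc_means - survey_means)))
```

A small average gap of this kind is what supports treating grades from the self-selected CRC population as broadly consistent with the randomized sample.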
Two participant comments:
“This is the first system that lets us directly express our feelings to government leaders. I also really enjoy reading and grading the suggestions from other participants.”
“This platform allows us to have our voices heard. The ability to review and grade what others suggest is important. It enables us and elected officials to hear directly how Californians feel.”
The CRC v1.0 demonstrates how mobile technology can improve public communication and bring government closer to the people it represents. It helped identify statewide concern about disaster preparedness and prompted a direct response from the Office of Lt. Governor Gavin Newsom, which committed to identifying methods to improve statewide disaster preparedness.
While elections, opinion polls, and surveys produce valuable information, they tend to be infrequent and costly, and are often conducted at the convenience of government or special interests. This essay argues that new technology has the potential to increase public engagement by tapping the collective intelligence of California residents every day, not just on Election Day.
Data is available here.
Faridani, Siamak, Ephrat Bitton, Kimiko Ryokai, and Ken Goldberg. 2010. “Opinion Space: A Scalable Tool for Browsing Online Comments.” In Proceedings of the ACM International Conference on Computer Human Interaction (CHI). New York, New York, USA: ACM Press.
Graham, Todd, Marcel Broersma, Karin Hazelhoff, and Guido van ’t Haar. 2013. “Between Broadcasting Political Messages and Interacting With Voters. The Use of Twitter during the 2010 UK General Election Campaign.” Information, Communication & Society 16 (5) (June): 692–716.
Hemphill, Libby, and Andrew J. Roback. 2014. “Tweet Acts: How Constituents Lobby Congress via Twitter.” In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing — CSCW ’14, 1200–1210. New York, New York, USA: ACM Press.
Himmelroos, Staffan, and Henrik Serup Christensen. 2014. “Deliberation and Opinion Change: Evidence from a Deliberative Mini-Public in Finland.” Scandinavian Political Studies 37 (1) (March 11): 41–60.
Krishnan, S., Jay Patel, Michael J. Franklin, and Ken Goldberg. 2014. “Social Influence Bias in Recommender Systems: A Methodology for Learning, Analyzing, and Mitigating Bias in Ratings” (working paper). Retrieved June 27, 2014, from http://goldberg.berkeley.edu/pubs/sanjay-recsys-v10.pdf.
Landemore, Helene. 2013. Democratic Reason: Politics, Collective Intelligence, and the Rule of the Many. Princeton, NJ: Princeton University Press.
National Democratic Institute [NDI]. 2014. Citizen Participation and Technology: An NDI Study. Retrieved from http://bit.ly/1hePZuM.
Newsom, Gavin, and Ken Goldberg. 2014. “Let’s Amplify California’s Collective Intelligence.” San Francisco Chronicle (San Francisco, CA), June 10.
Newsom, Gavin, with Lisa Dickey. 2013. Citizenville: How to Take the Town Square Digital and Reinvent Government. New York, NY: The Penguin Press.
World Bank. 2014. Citizen Report Card and Community Score Card. Washington, DC: World Bank. Retrieved from http://bit.ly/1hwn0kt.
More case studies and calls for submissions are on the Civic Media Project. To learn more about civic media, check out the book Civic Media: Technology, Design, Practice.