Nuclear Crisis As The Catalyst For Innovation

Open and User Innovation In The Wake Of The Fukushima Nuclear Crisis

The world has seen many natural disasters and human-generated crises: the 1986 Chernobyl disaster, the 2001 attacks on the Twin Towers, Cyclone Nargis in 2008, and the 2015 Nepal earthquake, to name just a few. There are typically robust systems in place to help governments and official bodies react appropriately to disaster situations. Unfortunately, these systems are not foolproof. They can fail. Information is not always readily available to people when and where they need it. In some cases, the information coming from official sources can be inaccurate, or significant facts may even be intentionally withheld. Sometimes official bodies simply cannot cope with the situation by themselves.

I lived in Japan during the earthquake, tsunami, and subsequent nuclear disaster; the first and only time this dangerous combination of destructive events had ever occurred. I witnessed and was part of the confusion that the people of Japan experienced, and that horrible feeling of helplessness. Luckily, the nuclear disaster was contained, and the plant remains stable while the long and arduous cleanup operation is carried out. This story focuses on the open and user innovations that occurred in the wake of the 2011 Fukushima nuclear crisis, and how we might facilitate innovation better in future times of crisis.

The Event

At 14:46 JST on 11 March 2011, a magnitude 9.0 (Mw) earthquake occurred off the coast of the Tōhoku region of Japan, triggering a tsunami that caused mass destruction along the coastline. This resulted in the release of radioactive isotopes from the Fukushima Daiichi nuclear power plant (福島第一原子力発電所) [1]. Pumps that circulated coolant through the nuclear reactors failed, and as a consequence, the reactors began to overheat. Efforts to counter the overheating were too little, too late, and some reactors went through a full meltdown. Several hydrogen explosions occurred within the structures housing the reactors, releasing radioactive fallout containing caesium-134, caesium-137, and iodine-131 [2]. Soon after the disaster, trace amounts of these materials could be detected around the globe.

Government Reaction

The government had numerous crisis response systems in place in anticipation of such a disaster. Some, such as the Earthquake Early Warning (EEW) system [4], performed as intended, and the proper warnings were raised. However, the radioactivity reporting systems functioned less effectively than they should have. The System for Prediction of Environmental Emergency Dose Information (SPEEDI) [5] performed badly: it failed to predict the diffusion of radioactivity due to power outages at sensor stations and other factors.

The government's reaction to the disaster came under scrutiny because it neglected to inform the public about the severity of the accident [6] and because vital information was not readily available. Official radiation data was available from MEXT, the Ministry of Education, Culture, Sports, Science & Technology in Japan (文部科学省) [7], through its website. The data came from sensor stations around nuclear facilities dotted across Japan. However, historical values were not available to the public from official sources at the time of the disaster. This lack of information, combined with distrust of information from the government, led to open and user innovations from both independent groups and individuals.

The Community Reaction

If we look at disasters throughout history, we can see examples of how people are determined to assist others, give support, and attempt to better the situation in extraordinary and inventive ways [9]. We can see benevolent acts during all stages of disaster recovery within local communities [10], and the events that unfolded after the Tōhoku earthquake were no different in this respect. The very same kinds of altruistic acts can be found within online communities, too [11]. Information technology and the internet afford new modes of communication and collaboration during crises; unfortunately, the efficacy of these responses is still not fully realized. New tools are allowing the public not only to consume, but also to produce and share their innovations, putting our cognitive surplus to better use [12]. The online community was shocked by the events of the tsunami and the nuclear disaster that followed.

The public wanted a better idea of what the radiation escaping from the Fukushima nuclear power plant meant for them. In crisis situations, getting the right information is vital. Unfortunately, the information coming from official sources tended to be hard to understand or hard to reach. The public came up with innovations during the Fukushima nuclear crisis, both within Japan and in the international community. These open and user innovations occurred on different timescales: some happened mere days after the disaster, while others only took off many months after the earthquake. Independent individuals and groups had a powerful effect with the counter-disaster systems they developed. Looking into how and why each and every system came into existence would be a colossal task, but by looking at the types of responses that occurred, we can try to understand what happened. The following sections highlight the actions taken by the public in an attempt to satisfy their needs.

Social Media

Members of the public tend to circulate official responses among themselves through peer communication. They can also feed information directly from affected areas. This was done primarily through social media. According to some reports, people were heavily engaged in their use of social media after the earthquake [13, 14]. Telephone networks were disrupted and buckled under excess traffic, so people turned to social media instead.

People tended to use the social media services they were already accustomed to. Facebook [15], Twitter [16], and Mixi [17] were all popular platforms during the crisis. In fact, the number of re-tweets shot up to twenty times the average level directly after the earthquake [18]. Twitter is popular as an ad-hoc crisis communications platform because it delivers information quickly to a selectively transparent user base [19]. By using hashtags, users and developers could communicate effectively. The result was a platform that allowed developers to form active projects and let users know about them. Communication is essential in times of crisis, and general social media served that purpose better than any other mode of communication.

Making Sense of Official Radiation Data

Official sources had made data available to the public. However, there were some issues with getting the data in a usable form. The following case study highlights one situation where an individual took matters into his own hands and made usable data available to all.

CASE STUDY A: Japan Radiation Open Data
Marian Steinbach, a User Experience Designer and Information Visualization enthusiast from Germany, was shocked by the tsunami and nuclear catastrophe. He wanted to know what the radiation readings coming out of Fukushima meant. He had the initiative to take action almost as soon as the accident occurred. When looking for data sources, he came across the website for the SPEEDI sensor network [5]. Data was available to the public, but there were two main problems. The site was attracting massive traffic, and the heavy load stopped the page from loading. Also, no record of historical values was available, so nobody could compare values over time. Marian reached out to other developers to discuss ways they could let people know whether they were safe or not.
In response, Marian set out to develop a method to get machine-readable data on the incident. He created a Google Docs spreadsheet [20] and manually updated the radiation values every 20 minutes. This was, of course, quite cumbersome, so he asked people around the globe to share the workload. People were willing to help, and the radiation values were updated around the clock. There were a few problems with malicious users who would add false results and destroy data, so some data was lost. A decent version control system would have been needed to manage edits properly.
In the meantime, Marian wrote a web scraper (a web data extraction tool) to automate the process of copying the values from the official SPEEDI network. He fed the values into a database, which he then published as an open data download. The database can still be found at his website [21], which is still updated with the most recent values as they are released. What started out as a crowdsourcing exercise evolved into a data extraction and storage one. Even after this shift, people still tried editing the Google spreadsheet long after the data was being recorded automatically, a result of poor communication channels.
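For readers unfamiliar with how such a tool works, the sketch below shows the general shape of a scraper of this kind: fetch the page, pull the values out of an HTML table, and append them to a timestamped log so historical values are preserved. The URL, page markup, and units are hypothetical placeholders; Marian's actual code is not reproduced here.

```python
# A minimal, illustrative scraper in the spirit of Marian's tool.
# The URL and HTML structure below are hypothetical placeholders;
# the real SPEEDI pages used different markup and units.
import csv
import datetime
import requests
from bs4 import BeautifulSoup

SOURCE_URL = "http://example.org/speedi/readings.html"  # placeholder, not the real endpoint

def scrape_readings(url=SOURCE_URL):
    """Fetch the page and return (station, dose_rate) tuples."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    readings = []
    for row in soup.select("table tr")[1:]:            # skip the header row
        cells = [c.get_text(strip=True) for c in row.find_all("td")]
        if len(cells) >= 2:
            station, value = cells[0], float(cells[1])  # value assumed to be a dose rate
            readings.append((station, value))
    return readings

def append_to_csv(readings, path="radiation_log.csv"):
    """Append a timestamped snapshot so historical values are preserved."""
    now = datetime.datetime.utcnow().isoformat()
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for station, value in readings:
            writer.writerow([now, station, value])

if __name__ == "__main__":
    append_to_csv(scrape_readings())
```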
Even though Marian released all available historical values to the public, many sensor stations (specifically those closest to the Fukushima Daiichi nuclear power plant) did not report any values due to technical issues. They did not start reporting values again until around Q1 2012. More recently, the SPEEDI network website has added a ‘monitoring data download’ link to allow the public to access historical values. However, the site only allows people to download statistics from one sensor at a time, with a maximum date range of six months.
Many derivations were born from ‘Japan Radiation Open Data’. Several people were able to make visualizations, and some were even able to confirm that the progression of radiation levels was consistent with the half-lives of the radioactive isotopes believed to have been released. Similar web scraping efforts against official sources were also made by the ad-hoc group ‘radmonitor311’ [22], who deserve a notable mention.

Data Visualization

Representing raw data in a meaningful way is essential; otherwise, we cannot make any sense of it. This is where data visualization plays a vital part. Media reports of radiation readings were notoriously difficult for a layperson to understand, and there was an enormous amount of confusion in the reporting of radiation levels. Some media sources reported radiation in millisieverts (1 mSv = 0.001 sievert), others in microsieverts (1 μSv = 0.000001 sievert). Some reported per-hour figures while others reported per-year figures. It is important to normalize comparisons so they are all on the same scale. The media did a poor job of this, and much confusion ensued.
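As a concrete illustration of the normalization problem, the short sketch below converts readings reported in different units to a single scale (microsieverts per hour). The station names and figures are invented for the example.

```python
# Illustrative normalization of mixed media reports to a single unit (µSv/h).
# The sample figures are invented for demonstration, not actual measurements.
HOURS_PER_YEAR = 24 * 365

def to_microsieverts_per_hour(value, unit):
    """Convert a reported dose rate to µSv/h."""
    conversions = {
        "uSv/h": lambda v: v,
        "mSv/h": lambda v: v * 1000.0,                 # 1 mSv = 1000 µSv
        "mSv/y": lambda v: v * 1000.0 / HOURS_PER_YEAR,
        "uSv/y": lambda v: v / HOURS_PER_YEAR,
    }
    return conversions[unit](value)

reports = [("Station A", 0.12, "uSv/h"),
           ("Station B", 2.4, "mSv/y"),
           ("Station C", 0.0005, "mSv/h")]

for name, value, unit in reports:
    print(f"{name}: {to_microsieverts_per_hour(value, unit):.3f} µSv/h")
```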

The inadequate mass media news broadcasts drove individuals to create their own content to help people understand the situation. The primary objective of many creators of data visualizations during the nuclear crisis was proper public education. Novel data visualizations were developed to represent radiation information, such as ‘micro sievert’, a simple visualization of environmental radiation levels in the Kanto area [23], and ‘Global Pulse’, visualizations by Miguel Rios of the global flow of tweets in the wake of the disaster [24].

It should be noted that crisis visualization has a significant impact on societal reactions; it therefore comes with an enormous amount of responsibility to make sure data is interpreted correctly. There is a risk that the social structure of a country could change in response to the information people have access to, whether that information is credible or not.

CASE STUDY B: Rama C. Hoetzlein
Rama C. Hoetzlein is a computer scientist and knowledge engineer working in the areas of artificial intelligence and graphics. Troubled by the immense casualties of the tsunami, Rama was unable to focus on his work and wanted to see what he could do to help. His inspiration came from initial rough sketches of radiation levels over time created by other user innovators, though these lacked proper radiation unit information. This, coupled with the data available from Marian Steinbach's ‘Japan Radiation Open Data’, led him to create content for the public.
For the first three days after the disaster, Rama took data from the Tokyo Electric Power Company (TEPCO) website [25] and translated it manually. He then came across Marian's work and switched to the more user-friendly data source. He hoped to offer a visual way to show the risks related to radiation dosage by correlating the events that unfolded with actual radiation levels. A contaminant map was posted to Wikipedia on March 17th, and an updated map was created on March 30th when further data had been generated. Both maps and further insight into their creation can be found at Rama's website [26]. He also created an animated information graphic of the regional effects of Fukushima radiation for the dates March 8th to March 31st [27].
By representing the data in an understandable format, it was discovered that Tokyo, the capital of Japan, which lies around 200 km from the power plant, was not getting significantly more radiation than any other big city around the world. However, he was also able to show that millions of people within a 20 km to 100 km radius of the plant (the initial evacuation zone recommended by the government was 20 km) were receiving levels of radiation deemed unsafe for nuclear workers by international standards.
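To give a rough sense of the arithmetic behind such a comparison, the snippet below projects a constant hourly dose rate to an annual dose and sets it against the widely cited occupational limit of 20 millisieverts per year. The example reading is invented; this is a back-of-the-envelope illustration, not Rama's actual calculation.

```python
# Rough back-of-the-envelope check of how an hourly dose rate compares with an
# occupational limit. 20 mSv/year is the commonly cited ICRP recommendation for
# nuclear workers (averaged over five years); the example dose rate is invented.
OCCUPATIONAL_LIMIT_MSV_PER_YEAR = 20.0
HOURS_PER_YEAR = 24 * 365

def annual_dose_msv(micro_sieverts_per_hour):
    """Project a constant hourly dose rate to an annual dose in mSv."""
    return micro_sieverts_per_hour * HOURS_PER_YEAR / 1000.0

ambient = 3.0  # µSv/h, hypothetical reading outside the evacuation zone
print(f"{ambient} µSv/h sustained for a year ≈ {annual_dose_msv(ambient):.1f} mSv, "
      f"vs. a {OCCUPATIONAL_LIMIT_MSV_PER_YEAR} mSv/year occupational limit")
```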
Rama feels that his visualizations have had more of an effect in countries outside Japan. By giving real data to the public, he helped quell a lot of the scaremongering from sensationalist sources. He was contacted by many individuals, a department of health, and even a worker on a submarine. The most rewarding feedback came from people in Japan who thanked Rama for helping them, their families, and their friends.
The biggest challenge for future disaster situations is getting the science correct. Typically, open and user innovations are made by engineers who come up with interesting ideas and want to try them out. Unless professionals working within the field are innovating first-hand, many mistakes can easily be made.

Crisis Maps

A crisis map is an open and intuitive way of letting people know what is going on during a time of crisis. The age of the wireless internet has allowed this phenomenon to take off over the last five years. During the Fukushima nuclear crisis, these maps were used to give readings of radiation levels around Japan. Many crisis mapping efforts rely on crowdsourced information. However, at the beginning of the disaster, many of the crisis maps developed by users were in fact aggregated from government sources and international organizations, because the public was not armed with the correct equipment to take readings. It took some time after the earthquake before crowdsourced readings made a meaningful impact on the radiation crisis mapping effort.

Xively (previously named Pachube and then Cosm) supported hundreds of radiation-related feeds that helped to monitor conditions in real time [28]. This enabled crisis mappers to access data for their maps [29, 30]. Later in the disaster, many people joined the effort to create more data points with their own Geiger counters. One issue with crowdsourced data is that it relies entirely on the honor system: people are expected to supply reliable and valid results. This was not always the case; sometimes malicious results were submitted to disrupt the system, and at other times people accidentally sent false values because they did not know how to operate their equipment properly. Nevertheless, false reports are usually easily filtered out because decent results vastly outnumber fake ones. This is the beauty of the crowd.
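One simple way to exploit that volume is to discard readings that sit far from what nearby contributors report. The sketch below uses a median-based outlier test purely for illustration; the validation actually used by crisis mapping projects was more involved.

```python
# Simple illustration of crowd-sourced data cleaning: reject readings that sit
# far from the local median. Real projects used richer validation, but the
# principle is the same: many honest readings outweigh a few bad ones.
import statistics

def filter_outliers(readings, tolerance=3.0):
    """Keep readings within `tolerance` median-absolute-deviations of the median."""
    median = statistics.median(readings)
    mad = statistics.median(abs(r - median) for r in readings) or 1e-9
    return [r for r in readings if abs(r - median) / mad <= tolerance]

# µSv/h readings from nearby contributors; 999.0 is a malicious or mistaken entry
nearby = [0.11, 0.13, 0.12, 0.10, 999.0, 0.14, 0.12]
print(filter_outliers(nearby))   # -> [0.11, 0.13, 0.12, 0.10, 0.14, 0.12]
```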

Open Source Hardware

There were numerous open source hardware developments after the earthquake; since it is a somewhat novel model, the following describes the phenomenon before going on to its applications in Japan. Open source hardware is a relatively new concept that is still in its infancy compared to open source software. The open source hardware community is around seven years old; it is spearheaded by the Open Source Hardware Association (OSHWA) [31], the first organization created to defend open source hardware and promote best practices. The definition of open source hardware is itself still a work in progress; defining what the term means is vital, since licenses can have differing levels of openness. As with open source software, it may take some time and test cases for legal clarity to materialize in open source hardware [32].

Open source hardware is hardware whose design is made publicly available so that anyone can study, modify, distribute, make, and sell the design or hardware based on that design. The hardware’s source, the design from which it is made, is available in the preferred format for making modifications to it. Ideally, open source hardware uses readily-available components and materials, standard processes, open infrastructure, open content, and open-source design tools to maximize the ability of individuals to make and use hardware. Open source hardware gives people the freedom to control their technology while sharing knowledge and encouraging commerce through the open exchange of designs. — Open Source Hardware (OSHW) Statement of Principles 1.0

There are a few factors that have led to the boom in open source hardware. Firstly, Moore's law [33] has delivered enough technological progress to support the kinds of systems people need to design and develop products by themselves. Secondly, tools such as 3D printers, laser cutters, CNC mills, and soldering irons have become more affordable; being able to access plans without the tools to materialize them would mean that innovators could not make anything. Thirdly, the internet has opened up a whole world of collaborative practice; given the opportunity, people will create and share. Lastly, collaborative locations such as hackerspaces, tech shops (small factories that anyone can join and use), and fabrication laboratories (fab labs) have created places with shared space and tools. These places also add a knowledge layer, where people can come together and teach each other about different specializations. There were once only a few dozen hackerspaces, but after the boom in 2009 the total has grown to over seven hundred worldwide. Public factories such as Shapeways [34] and Ponoko [35] have also opened up the opportunity to make custom products by uploading designs and having them manufactured and mailed to the designer.

Open source hardware is very attractive to user innovators because there is no need to start from scratch. This applies not only to the plans for hardware but also to open source hardware tools. Arduino [36], the microcontroller platform that powers many DIY hardware projects, is a prime example of an open source tool that lowers the barrier to entry and cuts out much of the time users would otherwise spend building a similar device from scratch.

In essence, making plans available to the public creates a series of self-sustaining opportunities for innovation. If the hardware derived from the original open hardware is kept open, there is immense potential for further improvement through distributed innovation. Allowing users to adapt devices opens a well of untapped potential in citizen research and development, which can save vast amounts of money compared to centralized research. By opening their products, companies invite a considerable amount of public feedback, allowing for improvements in future versions of their hardware. Conversely, when design and construction are separated in this manner, it is not always obvious in the case of an issue whether the fault lies in the design or the construction [37].

CASE STUDY C: Safecast
After the beginning of the nuclear crisis, there was a shortage of Geiger counters in Japan. Demand within Japan was incredibly high, as were the prices for these detectors. Many people wanted to take readings to check whether the levels of radiation in their area were within safe limits. In response, many open source radiation monitor designs were released to the public [38, 39, 40, 41]. Even a large open source hardware company based in China, Seeed Studio [42], launched a collaborative effort to design an open source Geiger counter. However, one group called Safecast [43] (previously known as RDTN.org) stood out among the rest.
The story started when ‘Akiba’ Chris of the Tokyo hackerspace was able to acquire two radiation monitors through the hackerspace network. He then hacked them and connected them to the open source Arduino platform [44], from which the data was broadcast for everyone to access. The Tokyo hackerspace continued to collaborate closely with Safecast throughout their start-up phase.
Meanwhile, Safecast worked on two projects in parallel. Firstly, Safecast (at that time RDTN.org) used the crowdfunding site Kickstarter to gather 606 backers, raising a total of $36,900 to purchase their first batch of Geiger counters. These were used with Xively (previously Pachube and Cosm) to open source the data produced. Secondly, they partnered with Bunnie Huang, who began designing a cheap radiation monitor suitable for civilian use. The resulting design was a functional open source prototype [45] that can be programmed from a laptop over USB. The methods developed by Safecast have been used to collect over three and a half million open data points since launch. It should be noted that the crowdsourced data sets were not meant to replace official data, but rather to provide additional context for the public.
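To illustrate the general data path from a device like this to an open feed, the sketch below reads counts per minute from a serial-connected Geiger board, converts them to an approximate dose rate, and posts the result to a web endpoint. The serial port, conversion factor, and endpoint URL are hypothetical assumptions; the real Safecast firmware and the Xively API worked differently.

```python
# Illustrative pipeline for an Arduino-based Geiger counter: read counts per
# minute over serial, convert to an approximate dose rate, and post it to a
# data-sharing service. The port, conversion factor, and endpoint are all
# hypothetical; real Safecast hardware and the Xively API differed.
import json
import serial      # pyserial
import requests

PORT = "/dev/ttyUSB0"                               # placeholder serial port
CPM_PER_USV_H = 175.0                               # illustrative; depends on the specific tube
FEED_URL = "https://example.org/feeds/radiation"    # placeholder endpoint

def read_cpm(port=PORT):
    """Read one counts-per-minute value printed by the counter firmware."""
    with serial.Serial(port, 9600, timeout=70) as conn:   # wait up to ~70 s for a line
        return float(conn.readline().decode().strip())

def publish(cpm):
    """Convert CPM to a rough µSv/h estimate and post it as JSON."""
    dose_rate = cpm / CPM_PER_USV_H
    payload = {"cpm": cpm, "usv_per_hour": round(dose_rate, 3)}
    requests.post(FEED_URL, data=json.dumps(payload), timeout=10)

if __name__ == "__main__":
    publish(read_cpm())
```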
A second round of Kickstarter funding raised $104,268 for a limited edition of the Safecast Geiger counter. They have also released mobile applications to visualize collected data on crisis maps. The group now produces real-time data for various applications, including air pollution monitoring [46].

Conclusions

The examples above are by no means a comprehensive list of innovations. This report only highlighted a small segment of all the user innovation that occurred in response to the nuclear disaster at Fukushima-Daiichi nuclear power plant. It seems that nearly all open and user innovations that arose from the crisis were concerned with creating social value through educating people within and outside of Japan, empowering others to help, and ensuring the safety of those near the power plant. Many solutions that people came up with were effective and efficient, but there are obviously many obstacles that need to be overcome if we want to enable people to respond in innovative ways to future disaster and crisis situations.

Crisis innovations happen on a different timescale from what we usually see in regular open and user innovation. Therefore, it is sometimes difficult to categorize this type of innovation within frameworks like the phases of consumer innovation described by Hippel et al. [47]. While the variety of challenges we face increases due to the changing landscape of our society (natural disasters, terrorism, and manmade accidents), so do the chances to join forces with others through novel collaboration and communication technologies. We must find new ways of conceptualizing and evaluating potential uses for these technologies in crisis management and response [48].

By being more open with data, governments would be able to harness the power of the crowd to alleviate some of the pressure of doing everything centrally. If official bodies can recognize citizens as an influential, self-organizing, and intelligent force, technology and innovation can play a transformational role in crises [49]. Many people working on data from Japan were manually extracting figures from government PDF reports and websites; this is not efficient, especially when response time is of the essence. A better format for the release of information is needed for decent technological innovation to take place.

Twitter became a crisis platform by accident, and the powers that be have had a hard time working out how to make use of, or control, this fact [50]. Crisis media is an untapped fountain of information for both users and governments, and a possible target for innovation in future crisis situations. As mentioned before, educating the public with proper facts is paramount in disaster and crisis situations, but it is also a big challenge. Media outlets used high budgets to produce incredible visualizations of things like reactor cores, but they failed to portray any substantive data through them. This encouraged people to innovate with the data that was openly available. One of Rama Hoetzlein's main complaints was that media sources did not use any of the high-quality informative visualizations produced by data visualization enthusiasts.

Independent sources are important when representing data from crisis situations. By decentralizing the flow of information, a broader picture of the situation can be painted. This is why crisis maps can be so important. Platforms like Ushahidi [51], initially created to map reports of violence in Kenya, seem like a conceivable type of solution for the future of crisis mapping. Since 2008, it has evolved into a place where anyone can crowdsource information, specifically in areas where information is difficult to obtain. Information from SMS, email, Twitter, and other web sources can be used to gather data. A service like this, with proper integration across all popular social networks, is needed. It should be noted that the mode of reporting should be tailored to cultural trends and the technology available in each area; for example, SMS would most likely be the reporting method of choice in Nigeria, whereas people in North America would probably turn to Twitter. However, there is a need for design and social mechanisms to inspect the legitimacy of data sourced from the crowd [52]. Some form of automated mediation or double validation of crowdsourced results, as sketched below, could be a possible solution to this issue.
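A minimal sketch of such double validation might look like the following: a report is only marked as corroborated once an independent report from a different source arrives close to it in both space and time. The thresholds and the distance approximation are illustrative assumptions, not Ushahidi's actual verification logic.

```python
# Sketch of "double validation" for crowd-sourced crisis reports: a report is
# only marked verified once an independent second report arrives close to it
# in space and time. Thresholds and the distance formula are simplified.
import math
from dataclasses import dataclass

@dataclass
class Report:
    lat: float
    lon: float
    timestamp: float   # seconds since epoch
    source: str        # e.g. "sms", "twitter", "email"

def km_between(a, b):
    """Approximate distance via an equirectangular projection (fine for short ranges)."""
    dx = math.radians(b.lon - a.lon) * math.cos(math.radians((a.lat + b.lat) / 2))
    dy = math.radians(b.lat - a.lat)
    return 6371.0 * math.hypot(dx, dy)

def is_corroborated(report, others, max_km=2.0, max_seconds=3600, min_sources=2):
    """Verified once reports from at least `min_sources` distinct sources agree."""
    sources = {report.source}
    for other in others:
        if (km_between(report, other) <= max_km
                and abs(report.timestamp - other.timestamp) <= max_seconds):
            sources.add(other.source)
    return len(sources) >= min_sources

incoming = Report(35.68, 139.69, 1_300_000_000, "sms")
existing = [Report(35.69, 139.70, 1_300_000_500, "twitter")]
print(is_corroborated(incoming, existing))   # True: two sources within 2 km and 1 hour
```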

The crisis in Japan sparked much research into radiation detection devices. An off-the-shelf HD webcam that was transformed into a Geiger counter, with possible applications in consumer hardware as an open source modification kit [53], is a prime example of the impressive innovations to emerge from this disaster. We are indeed stepping into the era of low-cost, connected devices.

Current open hardware efforts are very dispersed; we need to create a better structure for a well-organized response to emergencies. Regarding open source hardware applications for future crisis situations, we can identify factors that need to be in place for successful innovation. The types of devices for each type of possible crisis need to be defined. Plans and schematics for emergency devices must be available and accessible through understandable and usable sources. The toolkits for each geographical area should be identified based on the resources available in that particular area. Hackerspaces should be used as hubs for innovation during crises; they have played a vital role in disaster relief over the last few years thanks to the huge array of skills and contacts within the hackerspace network. Since designs for new products are usually encoded in computer-aided design (CAD) files [54], people from all around the world can contribute to customizing a design to fit the needs of a particular crisis. Lastly, distribution channels for getting these devices to areas in need must be accessible.

It is important to recognize that all enabling factors need to be in place for open and user innovation to be useful in crises. Data collection needs to be scientifically accurate, and users must know how to use the hardware correctly. The devices need to be accessible and functioning. Crowdsourced data should be centralized, and efforts should be made to ensure that people can reach the data. The results need to be displayed in ways that can be easily understood. If any of these components are missing, we end up with a bottleneck in the problem we are trying to solve. Getting this perfect mix of factors to fall into place in a crisis, a situation with so many uncontrollable variables, is an enormous task. Far more research is required to learn how to harness the power of the crowd and citizen innovators. By opening up, governments and citizens can complement each other's efforts for a more timely response to disasters. By putting the correct tools and knowledge into the hands of the general population, we can encourage a self-sustaining propensity for innovation in times of crisis.

About the Author

Nicholas Tenhue is currently UX and Product Strategy lead at Genospace, an alumnus of Microsoft Ventures, Founder and former President of EIT Digital Alumni, and holds a dual MS in ICT Innovation from UCL and KTH.

References

[1] Tokyo Electric Power Company. Current situation of Fukushima Daiichi and Daini nuclear power station [Online]. Available: http://www.tepco.co.jp/en/nu/fukushima-np/indexe.html

[2] Debora MacKenzie. 2011. Fukushima radioactive fallout nears Chernobyl levels [Online]. New Scientist. Available: http://www.newscientist.com/article/dn20285-fukushimaradioactive-fallout-nears-chernobyl-levels.html

[3] Hiroko Tabuchi. 2012. Fish Off Japan’s Coast Said to Contain Elevated Levels of Cesium [Online]. The New York Times. Available: http://www.nytimes.com/2012/10/26/world/asia/fish-off-fukushima-japan-show-elevatedlevels-of-cesium.html?_r=0

[4] Japan Meteorological Agency. Earthquake Early Warning. Available: http://www.seisvol.kishou.go.jp/eq/EEW/kaisetsu/.

[5] Nuclear Safety Technology Center. The System for Prediction of Environmental Emergency Dose Information (SPEEDI). Available: http://www.bousai.ne.jp/eng/.

[6] Martin Fackler. 2012. Japan Weighed Evacuating Tokyo in Nuclear Crisis [Online]. The New York Times. Available: http://www.nytimes.com/2012/02/28/world/asia/japanconsidered-tokyo-evacuation-during-the-nuclear-crisis-report-says.html

[7] Ministry of Education, Culture, Sports, Science & Technology in Japan. [Online]. Available: http://www.mext.go.jp/english/

[8] BBC News Asia. 2012. Japan did not keep records of nuclear disaster meetings [Online]. Available: http://www.bbc.co.uk/news/world-asia-16754891

[9] Leysia Palen and Sarah Vieweg. 2008. The emergence of online widescale interaction in unexpected events: assistance, alliance & retreat. CSCW ’08. ACM, New York, NY, USA, 117–126.

[10] R. Dynes. 1970. Organized Behavior in Disaster. Heath Lexington, Lexington, MA.

[11] Leysia Palen and Sophia B. Liu. 2007. Citizen communications in crisis: anticipating a future of ICT-supported public participation. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ‘07). ACM, New York, NY, USA, 727–736.

[12] Clay Shirky. 2011. Cognitive Surplus: Creativity and Generosity in a Connected Age. Penguin. ISBN: 0141041609.

[13] F. N. Shigyo. 2011. The Great East Japan Earthquake: How Net Users Utilized Social Media? The NHK Monthly Report on Broadcast Research, 61(8):2–13.

[14] Y. N. Yoshitsugu. 2011. Roles of social media at the time of major disasters observed in The Great East Japan Earthquake: twitter as an example. The NHK Monthly Report on Broadcast Research, 61(7):16–23.

[15] Facebook. Available: http://www.facebook.com/.

[16] Twitter. Available: https://twitter.com/.

[17] Mixi. Available: http://mixi.jp/.

[18] Mai Miyabe, Asako Miura, and Eiji Aramaki. 2012. Use trend analysis of twitter after the great east japan earthquake. CSCW ’12. ACM, New York, NY, USA, 175–178.

[19] E. Qualman. 2009. Socialnomics: How Social Media Transforms the Way We Live and Do Business. Wiley.

[20] Marian Steinbach. 2011. A Crowdsourced Japan Radiation Spreadsheet [Online]. Available: http://www.sendung.de/2011-03-15/a-crowdsourced-japan-radiation-spreadsheet/.

[21] Marian Steinbach. 2011. Japan Radiation Open Data, Measurement Data [Online]. Available: http://www.sendung.de/japan-radiation-open-data/.

[22] radmonitor311. Available: https://sites.google.com/site/radmonitor311/top_english.

[23] micro Sievert. Available: http://microsievert.net/.

[24] Miguel Rios. Global Pulse [Online]. http://blog.twitter.com/2011/06/global-pulse.html.

[25] Tokyo Electric Power Company. Available: http://www.tepco.co.jp/index-j.html.

[26] Rama Hoetzlein. 2011. Fukushima Nuclear Accident- Radiation Comparison Map [Online]. Available: http://www.rchoetzlein.com/theory/2011/fukushima-radiationcomparison-map/.

[27] Rama Hoetzlein. 2011. Fukushima Radiation — Regional Effects Animation [Online]. Available: http://www.rchoetzlein.com/theory/2011/fukushima-radiation-regional-effectsanimation/.

[28] COSM. Geiger [Online]. Available: http://community.cosm.com/taxonomy/term/221.

[29] Japan Radiation Map. Available: /.

[30] Haiyan Zhang. 2011. Japan Geigermap [Online]. Available: http://japan.failedrobot.com/.

[31] Open Source Hardware Association. Available: http://www.oshwa.org/.

[32] Aaron Weiss. 2008. Open source hardware: freedom you can hold?. netWorker 12, 3 (September 2008), 26–33.

[33] Gordon E. Moore. 1965. Cramming more components onto integrated circuits. Electronics, Volume 38, Number 8, April 19, 1965.

[34] Shapeways. Available: http://www.shapeways.com/.

[35] Ponoko. Available: http://www.ponoko.com/.

[36] Arduino. Available: http://www.arduino.cc/.

[37] David A. Mellis and Leah Buechley. 2011. Scaffolding creativity with open-source hardware. In Proceedings of the 8th ACM conference on Creativity and cognition (C&C ‘11). ACM, New York, NY, USA, 373–374.

[38] DIYGeigerCounter. Available: https://sites.google.com/site/diygeigercounter/home.

[39] Yapan.org. 2011. レシピ 39:自分の生活環境の放射線量を計測したい [Online blog]. Available: http://www.yapan.org/main/2011/03/measure_radiation_dose.html.

[40] Open Geiger Project. Available: http://opengeiger.com/.

[41] The Libelium Team. 2011. Geiger Counter — Radiation Sensor Board for Arduino. Available: http://www.cooking-hacks.com/index.php/documentation/tutorials/geiger-counterarduino-radiation-sensor-board.

[42] Seeed Studio Depot. Seeed Open Hardware Facilitator. Available: http://www.seeedstudio.com/depot/.

[43] Safecast. Available: http://blog.safecast.org/.

[44] ‘Akiba’ Chris. 2011. Tokyo Hackerspace/RDTN Geiger Shield — Dev History [Online]. Available: http://www.tokyohackerspace.org/ja/blog/tokyo-hackerspacerdtn-geiger-shielddev-history.

[45] Bunnie Huang. 2011. Safecast Geiger Counter Reference Design [Online]. Bunnie: studios. Available: http://www.bunniestudios.com/blog/?p=2218.

[46] Amar Toor. 2011. Safecast to create real-time maps of air quality in Los Angeles [Online]. Available: http://www.theverge.com/2012/9/21/3367078/safecast-to-create-realtime-maps-of-air-quality-in-los-angeles.

[47] Hippel et al. 2011. The Age of the Consumer-Innovator [Online]. MIT Sloan Management Review, September 21, 2011.

[48] Volkmar Pipek, Leysia Palen, and Jonas Landgren. 2012. Workshop summary: collaboration & crisis informatics (CCI’2012). CSCW ’12. ACM, New York, NY, USA, 13–14.

[49] Leysia Palen, Kenneth M. Anderson, Gloria Mark, James Martin, Douglas Sicker, Martha Palmer, and Dirk Grunwald. 2010. A vision for technology-mediated support for public participation & assistance in mass emergencies & disasters. British Computer Society, Swinton, UK.

[50] Rebecca Goolsby. 2010. Social media as crisis platform: The future of community maps/crisis maps. ACM Trans. Intell. Syst. Technol. 1, 1, Article 7.

[51] Ushahidi. Available: http://ushahidi.com/.

[52] Leysia Palen, Starr Roxanne Hiltz, and Sophia B. Liu. 2007. Online forums supporting grassroots participation in emergency preparedness and response. Commun. ACM 50, 3 (March 2007), 54–58.

[53] Thomas Auzinger, Ralf Habel, Andreas Musilek, Dieter Hainz, and Michael Wimmer. 2012. GeigerCam: measuring radioactivity with webcams. SIGGRAPH ’12. New York, NY, USA, Article 40.

[54] Von Hippel, E. 2009. Democratizing innovation: the evolving phenomenon of user innovation. International Journal of Innovation Science, 1(1), 29–40.