White Paper Series

Since 2017 the CODATA Task Group on FAIR Data for Disaster Risk Research has produced a series of White Papers addressing key issues relating to data in disaster risk reduction research.

White Paper 1: Gap analysis on open data interconnectivity for global disaster risk research, 2017

Disasters are sudden, calamitous events that bring great damage, loss or destruction to large populations and regions. Natural hazards pose a great threat and challenge to all human societies. The diversity of hazards requires a multi-disciplinary approach that integrates scientific research and the application of its findings with pre-disaster prediction, decision making and assessment based on complete, scientific and reliable data. This white paper aims to identify the gaps in technology and relevant policies that prevent effective interconnection of disaster-related data and information for use in research, education and public engagement. A set of recommendations is provided to support a better understanding of disasters and more effective ways of mitigating and reducing their impact on lives and property.

White Paper 2: Next-generation disaster data infrastructure, 2019

This white paper discusses next-generation disaster data infrastructure from four aspects: disaster data collection, disaster data processing, disaster data quality control and disaster data visualization. In disaster data collection, sensor data and crowd-sourced data, as well as results of near-real-time loss simulation, should be considered together to deliver a comprehensive view of threatened areas. Fundamental requirements of disaster data infrastructure include: effective multi-source big disaster data collection; efficient big disaster data fusion, exchange and query; strict big disaster data quality control and standards development; real-time big data analysis and decision making; and user-friendly big data visualization. A set of policy recommendations is provided at the end of the paper.
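
To make the multi-source collection and fusion requirements concrete, here is a minimal Python sketch of how heterogeneous inputs might be mapped to a common observation record and combined with a simple confidence weighting. All type names, fields and weights are illustrative assumptions, not taken from the white paper.

```python
# A minimal fusion sketch: hypothetical record types and weights,
# illustrative only and not taken from the white paper.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Observation:
    """Common record that both sensor and crowd-sourced inputs map onto."""
    source: str          # e.g. "seismometer" or "crowd_report"
    lat: float
    lon: float
    timestamp: datetime
    severity: float      # normalized to 0..1 so sources are comparable
    confidence: float    # crowd reports carry lower confidence here

def from_sensor(reading: dict) -> Observation:
    # Sensor readings are treated as high-confidence inputs.
    return Observation(
        source=reading["sensor_type"],
        lat=reading["lat"], lon=reading["lon"],
        timestamp=datetime.fromtimestamp(reading["unix_time"], tz=timezone.utc),
        severity=min(reading["value"] / reading["scale_max"], 1.0),
        confidence=0.9,
    )

def from_crowd_report(report: dict) -> Observation:
    # Crowd-sourced reports are down-weighted until corroborated.
    return Observation(
        source="crowd_report",
        lat=report["lat"], lon=report["lon"],
        timestamp=datetime.fromtimestamp(report["unix_time"], tz=timezone.utc),
        severity=report["perceived_severity"] / 10.0,  # 0-10 user scale
        confidence=0.4,
    )

def fused_severity(observations: list[Observation]) -> float:
    """Confidence-weighted mean severity for one area: a toy stand-in
    for the fusion and near-real-time analysis the paper calls for."""
    total = sum(o.confidence for o in observations)
    if total == 0:
        return 0.0
    return sum(o.severity * o.confidence for o in observations) / total

obs = [
    from_sensor({"sensor_type": "seismometer", "lat": 35.0, "lon": 139.0,
                 "unix_time": 1700000000, "value": 6.1, "scale_max": 9.0}),
    from_crowd_report({"lat": 35.01, "lon": 139.02,
                       "unix_time": 1700000120, "perceived_severity": 7}),
]
print(f"fused severity: {fused_severity(obs):.2f}")  # -> 0.69
```

A real infrastructure would use far richer fusion (corroboration across reports, sensor calibration, spatial clustering); the sketch only shows why mapping diverse sources onto a shared record makes fusion, exchange and near-real-time analysis tractable.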

White Paper 3: Disaster loss data: raising the standard, 2017

Disaster archives and loss data collection are fundamental to a comprehensive assessment of socially, temporally and spatially disaggregated impact data. With standardized loss data, risk interpretation can support loss forecasting and historical loss modelling, providing valuable opportunities to acquire better information about the economic, ecological and social costs of disasters, and to collect data more rigorously so that it can inform future policy, practice and investment. This requires a multi-agency, multi-sectoral approach in order to capture prior experience and the full range of relevant data. The paper aims to develop a process towards a standardised framework, with protocols for loss data collection systems that support enhanced, accurate risk assessments. Its key objectives are to identify indicators for disaster loss estimation, outline standards, and provide an initial framework for the design of data collection and assessment procedures. Even though establishing a comprehensive national standardized loss data collection and management system is complex, given its multi-sectoral, multi-layered requirements across the public and private sectors, it can help improve disaster resilience.
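
As a purely illustrative aside, the kind of standardized loss record such a framework might define could look like the following Python sketch; the indicator fields, codes and units are hypothetical and not drawn from the paper.

```python
# A hypothetical standardized loss record; indicator names, codes and
# units are assumptions for illustration, not the paper's framework.
from dataclasses import dataclass, asdict
from datetime import date
import json

@dataclass
class LossRecord:
    event_id: str              # links the record to a catalogued hazard event
    hazard_type: str           # e.g. "flood", "earthquake"
    admin_area: str            # spatial disaggregation (e.g. an admin code)
    date_reported: date        # temporal disaggregation
    reporting_agency: str      # supports the multi-agency requirement
    fatalities: int = 0
    people_affected: int = 0
    economic_loss_usd: float = 0.0   # direct economic loss, nominal USD

    def to_json(self) -> str:
        # A shared serialization keeps records exchangeable across agencies.
        d = asdict(self)
        d["date_reported"] = self.date_reported.isoformat()
        return json.dumps(d)

# Placeholder values only; no real event is described here.
record = LossRecord(
    event_id="EV-0001", hazard_type="flood", admin_area="XX-01",
    date_reported=date(2020, 1, 1), reporting_agency="example_agency",
    people_affected=100, economic_loss_usd=250000.0,
)
print(record.to_json())
```

A shared, serializable schema of this kind is what would allow loss data gathered by different agencies and sectors to be aggregated into the socially, temporally and spatially disaggregated assessments the paper calls for.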