Geoffrey Boulton writes the first in a series of articles from Jisc on research in the age of open science
Information and knowledge have always been essential drivers of social progress, and the technologies through which knowledge is acquired, stored and communicated have determined the nature and scale of their impact.
A technological milestone was passed at the turn of the millennium, when the global volume of data and information stored digitally overtook that stored in analogue systems on paper, tape and disk. A digital explosion ensued that has vastly increased the annual rate of data acquisition and storage, now 40 times greater than a decade ago, and dramatically reduced its cost.
In 2003, the human genome was sequenced for the first time. It had taken 10 years and cost $4 billion. It now takes three days and costs $1,000 (£770).
As with all revolutions that have not yet run their course, it is often difficult to distinguish reality and potential from hype. So what lies behind the "big data" phrase that has become the rallying cry of this revolution, and with which all levels of business and government, and increasingly universities and researchers, are struggling to come to terms?