Friday, December 23, 2011

The Future of Science

The author argues that a new paradigm for science is emerging in the early 21st century, driven by the larger process of Meta-Evolution and triggered by a number of global social developments including Citizen Science, Open Science, Networked Science, AI-based Science and Computational Science.

The Knowledge Discovery process is exploding and, combined with new levels of science communication and education, is bringing the potential social ramifications and benefits into sharper focus for the general population.
There appears to be no end in sight to the process of knowledge discovery, with each new insight interweaving with and catalysing others.

The Evolution of Science
‘Science’ is not an object able to be easily defined, but a multilayered process and framework encompassing the causal understanding and quantification of relationships that have enabled life and human civilisation to develop.
But always underpinning it is a fiercely guarded reputation for veracity and rigor. The methodology that has built this reputation over the last five centuries is ‘the scientific method’.
The scientific method provides a rigorous and logical means of establishing causal relationships between phenomena, based on inductive as well as deductive logic and experimental evidence.
Today we literally live or die by its outcomes, whether in the shape of an aircraft wing that enables us to travel safely through the air, the reliability of the electric motor controlling a skyscraper lift, the sterilisation process used to keep food fresh, or the vaccines that have saved millions of lives.
This explosion of knowledge over the past millennia has created a rich melting pot from which society extracts a continuous stream of invention and benefits related to all aspects of technology - transport, energy, communications, agriculture, medicine etc. Each strand is ultimately co-dependent on all others and in fact it can be argued that the evolution of civilisation is contingent on the accretion and blending of all strands of prior knowledge.

Science Mark 1.0 began a long time ago with the basic adaptive behavioural reward system of all life, eventually evolving in hominids into a more formalised trial and error process underpinning attempts to discover survival strategies. Examples include testing different plants to discover remedies for illnesses or processes for improving edibility; testing forms of social interaction to achieve more effective decision making, maximise group rewards and mediate resource sharing; and refining the tools and techniques needed for hunting, food preparation and the construction of dwellings.

A greater understanding of the cause and effect of natural phenomena based on this substrate of trial and error emerged from sifting data for useful patterns and inferring relationships through the use of classification and prediction techniques, eventually evolving into today’s sophisticated scientific tools.

Finally, with the major scientific breakthroughs of the 16th and 17th centuries, including the formulation of the laws of mechanics linking motion, mass, force and acceleration, together with the sophisticated analytic machinery of calculus, insight was gained into the abstract underlying mechanism - the Scientific Method. This was enunciated, defined and applied by Bacon, Descartes, Newton and others.

The scientific method is therefore a framework or process for extracting patterns and relationships about the universe in which we exist. It is also an open system in itself, capable of adopting new modalities when required and upgrading its structure to achieve greater compatibility within its own environment- the world of knowledge. Science and technology are ultimately the outcomes of the scientific method and all branches are closely intertwined.

With the new power of the scientific method emerged the idea of Universal laws, including the realisation that this involved the formulation of quantitative as opposed to qualitative relationships. This basic quantitative process for testing and refining knowledge has remained virtually the same ever since. The changes that have occurred relate primarily to new developments in the tools for data gathering, analysis, measurement, the application of logic and the testing of hypotheses.

But the scientific method requires first and foremost that a theory be predictive; that it must mirror reality to the point where real phenomena can be calculated with reasonable accuracy, such as the motions of the planets and stars. This predictive principle was tested by experimental means, primarily by early navigators and astronomers.

In 1609 Galileo made a telescope through which he observed the four largest moons of Jupiter, providing powerful support for the Copernican Heliocentric theory. The nature and cause of these motions remained a mystery until Isaac Newton, building on this knowledge, formulated the law of universal gravitation, published in his Principia of 1687, demonstrating that the equations encapsulating the mechanics of the sky were the same as the laws of mechanics on earth.

There was also a new perception of the larger relationship of the earth to the Cosmos. Ptolemy's geocentric model of the heavens, derived from Greek cosmogonies many centuries earlier, in which the stars revolved in a fixed sphere around the earth, had been displaced by the Copernican heliocentric model. The earth was no longer at the centre of the universe, but revolved around the sun.

This was a unification of the underlying principles governing motion in the universe, demonstrating the capacity of the scientific method of knowledge discovery to deliver results of great truth and beauty.
Science Mark 2.0 emerged during the 19th and early 20th centuries, with the formulation of the great unifying laws of Electromagnetism, Thermodynamics, Relativity, Quantum Mechanics and Darwinian Evolution. By providing a new understanding of the molecular, atomic and sub-atomic view of forces and matter, Science 2.0 opened a cornucopia of modern advances in all branches of science resulting in an explosion of technological advances including- information technology, electronics, energy, genetics, medicine, economics, engineering, agriculture and construction.

Unifying frameworks such as the Standard Model, String Theory and Loop Quantum Gravity followed in quick succession in the latter part of the 20th century, paving the way for Science 3.0.
Now Science Mark 3.0 is emerging in a way that will dwarf what came before. This new scientific paradigm combines a deeper understanding of the sciences at the informational and sub-atomic levels with a reintegration of the physical and social disciplines.
Major EU ICT driven flagship projects such as FuturICT and Paradiso are also predicated on this ‘interweaving’ principle, cross correlating multidisciplinary projects and combining information from both the social and physical sciences.

But in addition to this powerful methodology, humans have gained an ally in the scientific endeavour: the enormous computational and social power of the Web. Combined with the mind power of its soon-to-be 3 billion human acolytes, it has the potential to offer far more than a passive repository of human knowledge. It is also capable of discovering patterns in vast quantities of data, developing new theorems and algorithms and even unleashing the power of human creative thought, and is likely to emerge as a senior science partner with humans as early as 2020.

As the author has noted in previous blogs, this has the potential to create a super-organism with unparalleled problem-solving capacity; a global mind more powerful than any previous group of scientists.
And just in time, as global warming and extreme weather events, combined with the spread of new diseases and critical shortages of food and fresh water, threaten the very fabric of society and the survival of our planet.

The major social drivers powering the birth of Science 3.0 include Citizen, Open, Networked, AI-based and Computational Science.

Citizen Science-
The emergence of the Citizen Scientist is just beginning, and it might be useful to start by distinguishing between those involved as science insiders and outsiders, or specialists and non-specialists, rather than professionals and amateurs. The line between professional and amateur science will become increasingly blurred, likely disappearing within the next twenty years, as the value an individual contributes to knowledge discovery, rather than a one-size-fits-all qualification, becomes the key criterion for recognition. With the heavy computational lifting largely done automatically by the Web, science will take on a more egalitarian flavour.

The contribution of the latter-day non-specialist to science is well documented, including a lead role in the following disciplines - Paleontology- contributions to fossil discovery; Astronomy- discovery of new supernovas, comets, meteors and even planets; Mathematics- such as the immensely important statistical Bayes Theorem; Biology- the discovery of new species of both plants and animals; Archaeology- locating new sites with the help of Google Maps and the mapping and classification of rock art motifs; Ecology- Indigenous populations contributing to the understanding of the deep ecology of their environment.

Now this symbiosis has been taken to a new level through Gaming, Crowdsourcing and Volunteering, expanding the support and creativity of relative outsiders in a variety of disciplines. This involves enlisting both professionals and non-professionals to help solve complex scientific problems, not just by donating spare computing power, as in the SETI@home project, but through the power of many minds working in tandem with computers and the Web.

The application of games to problem-solving is a technique in which many non-professionals excel and harnessing this mind power is a relatively new variation of citizen science. For example-
Phylo is a game that allows users to contribute to the science of genetics by aligning sequences of DNA, RNA and proteins to find functional similarities and to learn how they have evolved over time. Humans are better at solving such visual puzzles than computers, and Phylo represents such molecular groups as coloured pieces to be aligned on the screen. There are currently 16,000 registered users working to solve these puzzles, as well as a Facebook group for suggesting Phylo improvements.
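To make the idea of an alignment concrete, the toy Python sketch below scores a candidate alignment of two short DNA sequences, rewarding matching columns and penalising mismatches and gaps. The sequences and scoring weights are invented for illustration and are not Phylo’s actual scoring scheme; the point is simply that the player’s task amounts to arranging gaps so that a score of this kind is maximised.

```python
# Toy illustration only: scores a gapped alignment of two DNA sequences.
# The weights (match, mismatch, gap) are invented, not Phylo's real values.

def alignment_score(seq_a, seq_b, match=2, mismatch=-1, gap=-2):
    """Score two equal-length, gapped sequences ('-' marks a gap)."""
    assert len(seq_a) == len(seq_b), "aligned sequences must be the same length"
    score = 0
    for a, b in zip(seq_a, seq_b):
        if a == "-" or b == "-":
            score += gap        # a gap was inserted in one sequence
        elif a == b:
            score += match      # the nucleotides agree in this column
        else:
            score += mismatch   # the nucleotides disagree
    return score

# Two candidate alignments of the same pair of sequences; the player's job
# is to find the arrangement of gaps that maximises the score.
print(alignment_score("ACGT-A", "AC-TTA"))   # scores 4
print(alignment_score("ACGTA-", "A-CTTA"))   # scores -2
```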

Foldit is a protein folding game capable of solving puzzles that have challenged professional scientists for years, such as finding the optimum folding patterns of the chains of amino acids that make up enzymes and proteins. In one landmark case, gamers cracked the structure of an enzyme from an AIDS-like virus, taking only three weeks to create an accurate model of the solution.

Also, 40,000 registered users of the game Planet Hunters have identified 69 potential new planets in data retrieved from NASA’s Kepler Space Telescope, which searches for habitable planets outside earth’s solar system.
A mix of professional and crowd-sourced volunteer astronomers also helped with survey observations, such as those coordinated by ESA’s space hazards team, and recently found an asteroid that could pose an impact threat to earth - 2011 SF108. The project requires visual image evaluation by humans beyond automated computer analysis. A survey of 300,000 spiral galaxies is also currently underway, using superior human image discrimination to determine which of them have bars.

This cornucopia of knowledge provided by science outsiders is in most cases no less rigorous or ‘scientific’ than that ‘discovered’ by insiders or professionals in a multimillion dollar laboratory or observatory. Today the technological gap between citizen and traditional science is also closing in several major ways, particularly in relation to the instrumentation and techniques now available to the outsider. These include- powerful computers, publicly available data sets and algorithms, Google Maps, telemetry of all types, virtual reality simulations, virtual telescopes and observatories and inexpensive desktop laboratories- all harnessing the power of the Web.

The analogy is the ease with which computing amateurs can now use professional tools and templates to translate creative ideas into professional quality end-user applications.

Networked Science-
Using specialist online tools, massive research cooperation through the power of the Web is becoming a reality. Network science projects can amplify collective intelligence, linking relevant disciplines and skills to best solve problems across institutions and nations; activating latent expertise and dramatically speeding up the rate of discovery across all sciences.

Remote interdisciplinary collaborations, or virtual research teams, are also proving invaluable in the knowledge generation process. STEM - Science, Technology, Engineering and Maths - is one such integrated field. Specialists meet at a number of university centres to tackle problems in areas of national interest such as sustainability, neuroscience and systems biology, piecing together a narrative between disconnected information databases.

In ecology, the Long Term Ecological Research network, composed of 26 research sites, has been operating for 30 years. Now a group of PhD students has set up a network of small-scale experiments - the Nutrient Network - to understand influences on the structure of grasslands, with scientists volunteering their services at 68 sites in 12 countries, without the need for major grants. This is an example of big science being carried out on a shoestring budget through global networks of volunteer scientists.
As we as a species begin to take on the characteristics of a super-organism through integration with the Web, the level of such networked problem-solving power will increase exponentially. As underlying laws and patterns are discovered, the knowledge is increasingly converted into algorithms that can automatically fly a jet aircraft, analyse genome sequences in minutes, diagnose illnesses or run a chemical factory.

Open Science-
Science is increasingly seen as a public enterprise, not a separate world mediated by remote experts. Institutions such as the British Royal Society are playing a leading role in publicly advocating the disclosure and sharing of scientific information.
In addition science communication and education is booming, with knowledge readily available not only through a host of popular science publications such as New Scientist and Scientific American, but through Google and thousands of free sites such as Wikipedia and ScienceDaily.
Publication of professional science research is also opening up, allowing more direct access channels to the public outside the peer-reviewed journals, most notably arXiv. But the top-down attitude of many scientists still prevails - that the scientific process must only involve professional scientists and that societal implications should be communicated to the public and policy makers as foregone expert conclusions open to minimal public debate. But the critical scientific decisions now faced by humans are also part of the fabric of a democratic society and therefore must be based on the free and open flow of information.

Open Science also manifests in more ethical science. There has been enormous pressure on researchers to produce positive results, and a steady decline in published studies whose findings contradict current scientific hypotheses. Negative findings attract fewer readers, so scientific journals tend to reject them more often.

Government grant agencies such as the NSF and NIH also need to become more flexible and work with scientists to develop more open ways of sharing knowledge discovered with public support, by encouraging scientists to submit findings in alternative forms beyond published papers, including computational visualisations and popular science books, and to participate actively with the public in solving endemic social problems such as climate change, conflict and food capacity.

Artificial Intelligence-based Science-
There are many AI techniques in use today, including fuzzy logic, neural networks, decision networks and swarm intelligence.
But the most powerful of all is based on a generic version of the evolutionary process itself - the Evolutionary or Genetic Algorithm (EA).

Although this technique has been widely used for over ten years to optimise and discover design solutions, it has recently reached a new level in the form of easy-to-use software tools such as Eureqa.
Eureqa is being applied to search not just for new patterns, but for new laws of science. This is achieved by repeatedly combining and testing simple mathematical expressions to create candidate equations, selecting those that best reflect reality. But to achieve law discovery it must also exploit the invariance of the laws of nature, searching for quantities that remain constant as the system evolves over time.
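As an illustration of this combine-and-test loop, the sketch below runs a toy symbolic regression in Python: random candidate formulas built from x, small constants, addition and multiplication are scored against data generated from a hidden law (here y = 3x^2 + 2, chosen purely for illustration), and the best candidates are repeatedly varied to form the next generation. This is a minimal sketch of the general evolutionary-algorithm idea, not Eureqa’s actual algorithm or code.

```python
import random

# Observed data sampled from the "unknown" law y = 3*x**2 + 2, which the
# evolutionary search must rediscover from the data alone.
DATA = [(x, 3 * x**2 + 2) for x in [-3, -2, -1, 0, 1, 2, 3]]

def random_expr(depth=0):
    """Build a random expression tree from x, small constants, + and *."""
    if depth > 2 or random.random() < 0.3:
        return random.choice(["x", str(random.randint(0, 5))])
    return (random.choice(["+", "*"]), random_expr(depth + 1), random_expr(depth + 1))

def evaluate(expr, x):
    """Evaluate an expression tree at a given value of x."""
    if expr == "x":
        return x
    if isinstance(expr, str):
        return float(expr)
    op, a, b = expr
    va, vb = evaluate(a, x), evaluate(b, x)
    return va + vb if op == "+" else va * vb

def fitness(expr):
    """Mean squared error of the candidate formula against the observed data."""
    return sum((evaluate(expr, x) - y) ** 2 for x, y in DATA) / len(DATA)

def mutate(expr):
    """Vary a candidate by replacing it, or one of its branches, with a new subtree."""
    if isinstance(expr, str) or random.random() < 0.3:
        return random_expr()
    op, a, b = expr
    return (op, mutate(a), b) if random.random() < 0.5 else (op, a, mutate(b))

population = [random_expr() for _ in range(200)]
for generation in range(60):
    population.sort(key=fitness)        # test every candidate formula against the data
    survivors = population[:40]         # keep the formulas that best reflect reality
    population = survivors + [mutate(random.choice(survivors)) for _ in range(160)]

best = min(population, key=fitness)
print("best formula:", best, "error:", fitness(best))
```

Eureqa applies the same principle on a far larger scale, with richer expression grammars, crossover between candidates and, crucially, searches for invariant quantities rather than simple curve fits.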

Eureqa is an ideal tool for researching systems that are too complicated to follow set rules, and has already been applied to timetabling, aircraft wing design, network topology, financial forecasting and particle physics simulations.
But a recent landmark example of law discovery was the rediscovery, from scratch, of the Law of Conservation of Energy - a result that originally consumed the intellectual energy of many scientists over hundreds of years. Using data from a double pendulum, Eureqa took only one day.

But its significance goes well beyond this. It is being applied to discover new theorems increasingly beyond the cognitive capability of its human counterparts - beyond the limits of human knowledge. It also has the potential to alter the balance between insiders and outsiders, because it empowers both groups. Eureqa can discover new patterns and relationships that professionals can then build on to prove new theories. Citizen scientists can do the same thing without needing to work within a traditional science framework, moving from analogical conception to solution, just like the gamers of the Foldit project.
It was recently applied to determine the existence of a deep law that controls cell differentiation, producing a biological law of invariance equivalent to the conservation laws of physics. It is also being applied to understand the decision network of a bacterium as it changes into a spore, determining which factors switch on genes and which genes control others - a huge network of probabilistic interactions between biomolecules.

Equally, Eureqa could and will be applied in the future to all complex scientific disciplines - economics, biology, the social sciences and climate science - and perhaps even to solving the universal Theory of Everything. The combination of the descendants of the Web and Eureqa could achieve this within the next decade.

AI-based science is an extension of the scientific method, not something essentially different. The output of an EA-generated hypothesis is tested against a reality-based goal to check how accurately it reflects that reality, and each new generation is a variation of the hypotheses that best survived the test.

Computational Science

The process of scientific research is expected to change more fundamentally over the next three years than over the previous three hundred. The epoch during which individual humans are able to conceptualise or understand increasingly complex phenomena could be coming to an end, for example as applied to the human genome, even though the power of computational visualisation is extending this capability. There are simply too many interacting variables for scientists to gain an intuitive understanding of the potential interactions of cells with millions of chemical outcomes.

With Big Data now a fact of life in all disciplines, combined with a discovery program such as Eureqa, 90% of the traditional work can now be done by the Web.

Mathematics - the science of patterns in number systems, abstract shapes, and transformations between mathematical objects - requires additional capabilities beyond the human brain to complement human intuition, for example to allow exploration of the properties of trillions of prime numbers.
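As a small-scale illustration of this kind of computational exploration, the Python sketch below sieves the primes below an arbitrarily chosen bound and tabulates the gaps between consecutive primes; scaled up by many orders of magnitude, this is the sort of brute-force pattern hunting that outruns unaided human intuition.

```python
# Illustrative only: enumerate primes below a small bound and count how often
# each gap between consecutive primes occurs. Real explorations run to bounds
# many orders of magnitude larger.

def primes_below(n):
    """Sieve of Eratosthenes: return all primes less than n."""
    sieve = [True] * n
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(sieve[p * p::p])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

primes = primes_below(100_000)
gap_counts = {}
for a, b in zip(primes, primes[1:]):
    gap_counts[b - a] = gap_counts.get(b - a, 0) + 1

for gap, count in sorted(gap_counts.items()):
    print(f"gap {gap:3d}: {count} occurrences")
```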

Meta-Evolution and the Rise of Science 3.0-

Great scientists throughout history have nurtured the ‘creative gene’ which allowed them to soar to new heights and great new conceptual horizons, which they then backed with rigorous mathematics and experimentation. Emulating the ‘creative gene’ of humans could be the next big step for the Web. But regardless, the discovery process is due to accelerate, as the global mind expands and each new insight catalyses countless others.

As previously defined, the scientific method is recognised as the primary knowledge generator of modern civilisation. Through the rigorous process of verification, applying both inductive and deductive logic, causal relationships governing physical phenomena are defined and tested in the form of hypotheses and models.

In fact it bears a striking similarity to the larger process of evolution itself and, the author maintains, is a subset of that process. The difference is one of complexity and scope. Evolution is a universal generic process of adaptation and optimisation, which selects and amplifies the most appropriate response of a system to its environment. The scientific method is a meta-method or methodology aimed at selecting the most appropriate theory or causal model within an experimental environment.

The key to better understanding the scientific method, is that it is not a single process but a number of sub-processes, which are continually evolving. These processes include- initial conceptual insights, general hypothesis formulation, experimental framework development, data access and capture, pattern analysis of information and evidence extraction, hypothesis modelling based on inference logic, algorithmic formulation, experimentation, testing and validation of procedures, hypothesis optimisation and refinement.

Each of these sub-processes within the overall framework of the method is therefore evolving as scientific advances continue. The scientific method was not invented or discovered in its present form, but evolved and continues to evolve, using processes common to all adaptive learning. It is therefore logical that a deeper understanding of evolution will also provide a deeper understanding of the future of science.
The author believes the two are intimately connected. A greater understanding of the evolutionary nature of the scientific method will lead to an acceleration of the process of new theory discovery, which in turn will accelerate the broader process of evolutionary knowledge discovery.

There is no end to this process. All current major theories, including Relativity and the Standard Model of Quantum Physics, are constantly being optimised, refined and sometimes radically reinterpreted.
In his book, The Future of Life: A Unified Theory of Evolution, the author has predicted the emergence of an imminent phase change in the evolutionary process - Meta-evolution, or Evolution Mark 2.0.

As human society begins to achieve a deeper and more intimate understanding of the primary evolutionary imperative shaping its destiny, he predicts this new form of evolution will emerge, accelerating the already exponential pace of change to an extreme level.
The process will manifest in response to a deeper understanding by society of the implications of the evolutionary process itself. The resulting amplifying feedback will generate a glimpse of life’s true potential, accelerating the already massive momentum of evolution and its scientific counterpart, resulting in a rate of change that is forecast by a number of scientists and forecasters to reach hyper-exponential levels in the near future.

In addition, Meta-Evolution will be reflected in the evolutionary future of Science itself. Science is already on the path to escaping its current institutional shackles through the revolution in Citizen, Open, AI-based and Networked Science; harnessing the creative realignment of the physical and social sciences and the future Web 4.0’s massive computational intelligence. Combined with Meta-evolution, this phase will attain astounding outcomes and benefits for society by mid-century and provide possible redemption for the gross exploitation of our planet. The evidence is already visible in the genesis of a starburst of major projects such as FuturICT and Paradiso ICT.

This will be none too soon, as mentioned, with the problems of global warming, ecological disasters, sustainability, economic collapse and endemic conflict ravaging the planet. Meeting these challenges will involve active adaptation through problem solving - the heart of the evolutionary process.

Already an awareness of the pervasive and rapidly accelerating power of evolution is beginning to be felt through the enormous scientific, technological and social advances in our civilisation. This insight creates an evolutionary feedback loop - actively engaging evolution in helping meet today’s complex survival challenges. This further accelerates the knowledge discovery process, which in turn generates further evolutionary insight and application.

A significant additional impetus would therefore be gained from a deeper understanding of the driving role of the evolutionary paradigm - a global awareness of the engine underlying life's progress. This would eventually create an explosive realisation of life’s future potential.
As this rate of knowledge acquisition increases to the point of incompatibility with the human capacity to absorb it, new social structures and modes of cognitive processing based on artificial intelligence techniques will emerge to help humans cope.

According to the author, this is already occurring. Even as the amount of information expands beyond human horizons, we are developing techniques to bring it under control. Like a fractal image, cybernetic life forms and intelligent machines are evolving in the same way as biological life- mutating to become increasingly intelligent. These act as proxies for humans, managing complex processes and roaming cyber-space- searching, filtering and processing data from an already overwhelming pool.

Hyper/Meta-evolution can be expected to become part of a new global paradigm within the next twenty years - by around 2030 - based on current rates of knowledge growth and coinciding with the evolution of the super-intelligent Web 4.0. This will rapidly transform all aspects of our culture and civilisation, including accelerating the rise of Science 3.0.

Saturday, December 17, 2011

The Future of Civilisation 3.0

The author argues that the combination of the recent triple disaster in Japan, the GFC and looming world recession, together with the Arab Spring and the global Occupy movement, has provided the final triggers for the rapid evolution of Civilization 3.0 in the fight for human survival.

These world-shaking events send a timely message to the rest of the world that the form of civilisation, and the social norms we have become accustomed to and lived by over the last few centuries, is over.

Civilisation 1.0 began over 15,000 years ago with the founding of the earliest settlements and villages around the world, as hunter gatherers settled down to take advantage of the rich sources of edible grasses and natural foods growing mainly around the fertile delta areas of the great rivers and coastal areas of the world. These early habitats evolved into the first towns, cities and eventually nation states. At the same time, the first writing and number systems evolved to keep track of the products and services that developed and were traded by modern humans. Also with the growth of towns and trade across the world and the use of wood for building and smelting, the clearing of the forests across Europe began.

Civilisation 2.0 then emerged, with more sophisticated means of production using wind and water, accelerating rapidly following the industrial revolution in the 18th century with the harnessing of steam and, later, electricity and combustion engine power. These innovations were dependent on the burning of fossil fuels - coal and oil - on a massive scale, allowing the West to steal a march on the rest of the world, colonising its populations and exploiting its wealth.
The manufacture of goods and services then increased on a massive scale; everything from food, textiles, furniture, automobiles, skyscrapers, guns and railway tracks- anything requiring the use of steel, cement or timber for its production.

The major cities expanded on the same original settlement sites as Civilization 1.0 - coastal ports and fertile river delta flood plains - regardless of the risk from subsidence and earthquakes.
The energy revolution was rapidly followed by the communications and information revolution- grid power, telephone, wireless, radio, television and eventually computers. Later, nuclear power was added to the mix.

Then, in the second half of the 20th century, the realisation finally dawned that the planet’s resources really were finite, following decades of dire predictions.
Now, at the current rate of consumption by a global population of 7 billion, projected to grow to 9 billion by mid-century, combined with the Armageddon of global warming, the planet is rapidly running out of fresh water, food and oil. At the same time, the grim effects of escalating levels of carbon in the oceans and atmosphere have triggered more frequent and severe weather events - major droughts, floods and storms - adding to the impact of earthquakes in highly populated urban areas.

These problems are now bigger than any one nation can handle and can only be effectively addressed on a global basis. This will inevitably need to be coupled with a higher level of social awareness and democratic governance, in which everyone, not just politicians, is involved in the key decision processes affecting the planet.

Welcome to Civilisation 3.0.

Civilization 3.0 is just beginning, but is already being tested. From now through the rest of this century comes the hard part. Tinkering around the edges won’t do it for the planet and its life, including humans, any longer.
The planet’s climate is already in the throes of runaway warming, regardless of what forces caused it, because of built-in feedback processes, from the melting of the ice sheets in Greenland and Antarctica to the release of huge methane reserves in the northern tundra and on the ocean floor.

But this is just the start of our problems.
Some areas may get a short-term reprieve with local cooling, but overall the heating process appears to be unstoppable. The Faustian bargain that humans struck to establish Civilisations 1.0 and 2.0, when the planet was teeming with natural resources, is about to fall due. Humans are being called to account.

By 2020 the cost of solar, wind and biofuels is likely to reach a baseline level comparable to that of fossil fuels, due to major technological advances currently underway, such as artificial photosynthesis. But because of the flawed democratic process, major businesses and corrupt governments can still undermine the critical mindset needed for radical change, with calls for short-term profits drowning out the desperate call by future generations for long-term survival. It is therefore highly likely that we will still be emitting copious amounts of carbon by 2020 and starting to exceed the safe limits of temperature rise.
In addition, supplies of fresh food and water, particularly in developing countries are already dwindling, with the potential to create further malnutrition and conflict.

So Civilisation 3.0 has to get serious.

One of the major recent initiatives at the heart of the fight-back revolution is the concept of a smarter planet. The Japanese experience has now reinforced that concept. Every built object and operational process will eventually need to be embedded with sensors and its performance and integrity continuously monitored and assessed in relation to natural disasters and sustainability.

Everything from roads, transport, bridges, railways, buildings, dams, power plants, grids and information systems, as well as human knowledge and skill capacities, will need to be urgently upgraded. Even towns and cities will have to be redesigned to avoid future worst case natural and manmade disasters and provide a more sustainable living space for future generations.
In addition, the loss of critical ecosystems and species will compound the infrastructure problems of the planet, requiring re-prioritization of the value of the natural environment and fair re-allocation of its resources on a global scale.

The current level of risk and waste in the built environment is now seen as both unacceptable and avoidable. By applying new technologies already available such as smarter materials, safer engineering methods, improved communications and sophisticated computer modeling, risk can be dramatically reduced.

The new sustainability standards will need to be set much higher; at a much smarter level than previously accepted, in order to reduce carbon emissions, optimize performance and enable more responsive adaptation within a fast deteriorating physical and social environment. This will be mandatory as the escalating scale of the risk becomes apparent.

At the heart of this revolution will be powerful mathematical algorithms and intelligence capable of making optimum decisions at far greater speed and with less human intervention. In turn this will require instant access to the Intelligent Web’s global resources of specialized knowledge, artificial intelligence and massive grid computing power.

By 2030, however, panic will be building across the globe. The safe level of temperature rise of 2°C, expected to hold until the end of the century, will likely be breached, and physical and social problems will escalate.

Any realistic solution for human survival will require living and working together cooperatively and peacefully as one species on one planet, finally eliminating the enormous destruction and loss of life that wars and conflict inevitably bring.
Although cooperation on a global scale will be vital, individual nations will be tempted to free ride, as populations react with violence and anarchy to shortages of basic necessities through rising prices and inadequate infrastructure, particularly in hard-hit developing economies.

A massive mind shift will be required across the planet to achieve this level of cooperation; a more collaborative and creative process will need to evolve, and quickly, harnessing all human knowledge and technological resources. To achieve this level of cooperation, non-democratic states will need to democratize or be excluded from the resulting benefits, and the old forms of democracy will have to be upgraded to a more inclusive and participatory level if human civilization is to avoid slow annihilation.
The stress of the human fight for survival will also present myriad ripple-on challenges relating to maintaining a cohesive social fabric. Democracy and justice are basic requirements, but providing adequate levels of health, work and education will also get a lot harder. This will require adaptation on a vast scale.

By 2040 the trendlines will be set, and through social media the risks will need to be openly and clearly relayed to all populations. This will be similar to the collective discipline and mindset required many times in the past by nations threatened by the fear of war and decimation.
It will now need to be replicated on a global scale.

Beyond increasing renewable energy and reducing waste, the fight for survival will require the implementation of other more radical innovations, including the eventual geo-engineering of the weather and climate. The science and technology needed to achieve such a complex outcome is unlikely to be achievable before 2050 and in the meantime our civilization may be in free fall. However it will probably be the only solution capable of reversing rather than just slowing the headlong rush to chaos.

Other radical solutions will involve the need to accelerate our level of knowledge generation. This is already taking place through advanced methods of automatic pattern analysis and algorithm discovery, applying artificial intelligence methods and the immense computational intelligence of the Web.

It will be a bootstrapping process. The faster the increase in knowledge acquisition, the more powerful the potential intelligence of the Web will become, which will then further accelerate the increase in life-saving expertise. This exponential process may be further accelerated by promoting higher levels of networked ‘swarm’ behavior, combining human intelligence on a grand scale across the planet. The benefits of collective intelligence acting like an advanced insect hive are already being realized, with research teams combining in larger and larger groups to solve more and more difficult problems. It has been demonstrated that an increase in synergy resulting from collective intelligence in complex self-organising systems allows ‘smarter’ problem solving as well as greater decision agility.
For example, 50 European universities have recently combined in the FuturICT project, a billion-dollar EU flagship project to model, predict and solve future planetary and social problems. And this is only one collective project out of thousands, with increasing collaboration between US, European and Asian science and technology groups.

With all these initiatives, will Civilization 3.0 survive?

It will likely be a very close call, dependent largely on whether our increase in beneficial knowledge can outstrip the planet’s rapid descent into environmental and social oblivion- a potential runaway pre-Venusian scenario with no end in sight.

It is similar to the Red Queen scenario in Lewis Carroll’s Through the Looking-Glass, in which the Red Queen has to run faster and faster just to maintain her position. Humans will also have to become smarter and smarter just to stay ahead of the approaching Armageddon.

The odds in fact will be very similar to those of the climate bottleneck that almost eliminated our early Homo sapiens ancestors 20,000 years ago as they struggled to survive the last ice age. Only a small band of perhaps several hundred survived thousands of years of frozen hardship, finally regrouping and reaping the rewards that followed the great melt.

Modern humans can also reap a future cornucopia if they have the courage and skill to survive the looming crisis in our evolution.

Many other civilisations across our universe may well have faced a similar bottleneck. Those that survived will have gone on to reap the untold riches of Civilisation 4.0 with its mastery over the physical laws governing our world and galaxy. Along the way Civilisation 5.0 will emerge, possessing not only the immense scientific capability needed to solve any physical problem, but enough wisdom to avoid future social catastrophes.

The stakes couldn’t be higher. The Japanese catastrophe and many others, including the Indian Ocean earthquake and tsunami of 2004 that left more than 200,000 dead, should have given us all a clarion call.
This is not a bad dream, from which we’ll all awake tomorrow with business as usual. The future of Civilisation 3.0 and our unique intelligent life-form really is in the balance. Let us hope ours will be one of the few or perhaps the only advanced civilisation to have survived such a test, so that our children and our children’s children can live to experience the untold wonders of our planet and universe.

But the Red Queen will have to run very fast indeed.