Friday, December 23, 2011

The Future of Science

The author argues that a new paradigm for science is emerging in the early 21st century, driven by the larger process of Meta-Evolution and triggered by a number of global social developments, including Citizen Science, Open Science, Networked Science, and AI-based and Computational Science.

The knowledge discovery process is exploding and, combined with new levels of science communication and education, is bringing the potential social ramifications and benefits into sharper focus for the general population.
There appears to be no end in sight to the process of knowledge discovery, with each new insight interweaving with and catalysing others.

The Evolution of Science
‘Science’ is not an object able to be easily defined, but a multilayered process and framework encompassing the causal understanding and quantification of relationships that have enabled life and human civilisation to develop.
But always underpinning it is a fiercely guarded reputation for veracity and rigour. The methodology that has built this reputation over the last five centuries is ‘the scientific method’.
The scientific method provides a rigorous and logical means of establishing causal relationships between phenomena, based on inductive as well as deductive logic and experimental evidence.
Today we literally live or die by its outcomes; whether in the shape of an aircraft wing that enables us to travel safely through the air, the reliability of the electric motor controlling a skyscraper lift, the sterilisation processes that keep food fresh, or the vaccines that have saved millions of lives.
This explosion of knowledge over the past millennia has created a rich melting pot from which society extracts a continuous stream of invention and benefits related to all aspects of technology - transport, energy, communications, agriculture, medicine etc. Each strand is ultimately co-dependent on all others and in fact it can be argued that the evolution of civilisation is contingent on the accretion and blending of all strands of prior knowledge.

Science Mark 1.0 began a long time ago with the basic adaptive behavioural reward system of all life, eventually evolving in hominids into a more formalised trial and error process underpinning attempts to discover survival strategies. Examples include testing different plants to discover remedies for illness or processes for improving edibility; testing forms of social interaction to achieve more effective decision making, maximise group rewards and mediate resource sharing; and refining the tools and techniques needed for hunting, food preparation and the construction of dwellings.

A greater understanding of the cause and effect of natural phenomena based on this substrate of trial and error emerged from sifting data for useful patterns and inferring relationships through the use of classification and prediction techniques, eventually evolving into today’s sophisticated scientific tools.

Finally, with the major scientific breakthroughs of the 16th and 17th centuries, including the formulation of the laws of mechanics linking motion, mass, force and acceleration, together with the sophisticated analytic machinery of calculus, insight was at last gained into the abstract underlying mechanism- the Scientific Method- as enunciated, defined and applied by Bacon, Descartes, Newton and others.

The scientific method is therefore a framework or process for extracting patterns and relationships about the universe in which we exist. It is also an open system in itself, capable of adopting new modalities when required and upgrading its structure to achieve greater compatibility within its own environment- the world of knowledge. Science and technology are ultimately the outcomes of the scientific method and all branches are closely intertwined.

With the new power of the scientific method emerged the idea of Universal laws, including the realisation that this involved the formulation of quantitative as opposed to qualitative relationships. This basic quantitative process for testing and refining knowledge has remained virtually the same ever since. The changes that have occurred, relate primarily to new developments in the tools for data gathering, analysis, measurement, applying logic and the testing of hypotheses.

But the scientific method requires first and foremost that a theory be predictive; that it must mirror reality to the point where real phenomena can be calculated with reasonable accuracy, such as the motions of the planets and stars. This predictive principle was tested by experimental means, primarily by early navigators and astronomers.

In 1609 Galileo built a telescope through which he observed the four largest moons of Jupiter, lending powerful support to the Copernican heliocentric theory. The nature and cause of these motions remained a mystery, however, until Isaac Newton, in his Principia of 1687, used this knowledge in constructing the law of universal gravitation, demonstrating that the equations encapsulating the mechanics of the sky were the same as the laws of mechanics on earth.

There was also a new perception of the larger relationship of the earth to the Cosmos. Ptolemy's geocentric model of the heavens, derived from Greek cosmogonies some fourteen centuries earlier, of the stars revolving in a fixed sphere around the earth, had been displaced by the Copernican heliocentric model. The earth was now not at the centre of the universe, but revolved around the sun.

This was a unification of the underlying principles governing motion in the universe, demonstrating the capacity of the scientific method of knowledge discovery to deliver results of great truth and beauty.
Science Mark 2.0 emerged during the 19th and early 20th centuries, with the formulation of the great unifying laws of Electromagnetism, Thermodynamics, Relativity, Quantum Mechanics and Darwinian Evolution. By providing a new understanding of forces and matter at the molecular, atomic and sub-atomic levels, Science 2.0 opened a cornucopia of advances across all branches of science and technology, including information technology, electronics, energy, genetics, medicine, economics, engineering, agriculture and construction.

Unifying frameworks such as the Standard Model, String Theory and Loop Quantum Gravity followed in quick succession in the latter part of the 20th century, paving the way for Science 3.0.
Now Science Mark 3.0 is emerging in a way which will dwarf what came before. This new scientific paradigm comes with a greater understanding of the sciences at the information and sub-atomic level with the additional understanding of the reintegration of the physical and social disciplines.
Major EU ICT driven flagship projects such as FuturICT and Paradiso are also predicated on this ‘interweaving’ principle, cross correlating multidisciplinary projects and combining information from both the social and physical sciences.

But in addition to this powerful methodology, humans have gained an ally in the scientific endeavour: the enormous computational and social power of the Web. Combined with the mind power of its soon-to-be 3 billion human acolytes, it has the potential to offer far more than a passive repository of human knowledge. It is also capable of discovering patterns in vast quantities of data, developing new theorems and algorithms and even unleashing the power of human creative thought, and is likely to emerge as a senior science partner with humans as early as 2020.

As the author has noted in previous blogs, this has the potential to create a super-organism with unparalleled problem-solving capacity; a global mind more powerful than any previous group of scientists.
And just in time, as global warming and extreme weather events, combined with the spread of new diseases and critical shortages of food and fresh water, threaten the very fabric of society and the survival of our planet.

The major social drivers powering the birth of Science 3.0 include Citizen, Open, Networked, AI-based and Computational Science.

Citizen Science-
The emergence of the Citizen Scientist is just beginning, and it may be useful to start by distinguishing between those involved as science insiders and outsiders, or specialists and non-specialists, rather than professionals and amateurs. The line between professional and amateur science will become increasingly blurred, likely disappearing within the next twenty years, as the value an individual contributes to knowledge discovery, rather than a one-size-fits-all qualification, becomes the key criterion for recognition. With the heavy computational lifting largely done automatically by the Web, science will take on a more egalitarian flavour.

The contribution of the latter-day non-specialist to science is well documented, including a lead role in the following disciplines- Paleontology: contributions to fossil discovery; Astronomy: discovery of new supernovas, comets, meteors and even planets; Mathematics: such as the immensely important statistical Bayes' theorem; Biology: the discovery of new species of both plants and animals; Archaeology: locating new sites with the help of Google Maps and the mapping and classification of rock art motifs; Ecology: indigenous populations contributing to the understanding of the deep ecology of their environment.

Now this symbiosis has been taken to a new level through Gaming, Crowdsourcing and Volunteering, expanding the support and creativity of relative outsiders in a variety of disciplines. This involves enlisting both professionals and non-professionals to help solve complex scientific problems, not just through donated computing power, as in the SETI@home project, but through the power of many minds working in tandem with computers and the Web.

The application of games to problem-solving is a technique in which many non-professionals excel and harnessing this mind power is a relatively new variation of citizen science. For example-
Phylo is a game that allows users to contribute to the science of genetics by aligning sequences of DNA, RNA and proteins to find functional similarities and to learn how they have evolved over time. Humans are better at solving such visual puzzles than computers, and Phylo represents the molecular groups as alignments of vertical coloured pieces on a screen. There are currently 16,000 registered users working to solve these puzzles, as well as a Facebook group to suggest Phylo improvements.
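As a concrete illustration of the kind of puzzle behind such alignment games, here is a minimal sketch of scoring a global alignment of two short DNA sequences with the classic Needleman-Wunsch dynamic programme. The scoring values are illustrative assumptions only- Phylo's actual scoring model is more elaborate.

```python
# Toy global-alignment scoring (Needleman-Wunsch).
# Scoring scheme below is an assumption for illustration.
MATCH, MISMATCH, GAP = 1, -1, -2

def align_score(a: str, b: str) -> int:
    """Best global alignment score of sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    # Aligning against an empty sequence costs one gap per character.
    for i in range(1, rows):
        dp[i][0] = dp[i - 1][0] + GAP
    for j in range(1, cols):
        dp[0][j] = dp[0][j - 1] + GAP
    # Each cell: best of match/mismatch on the diagonal, or a gap.
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i - 1][j - 1] + (MATCH if a[i - 1] == b[j - 1] else MISMATCH)
            dp[i][j] = max(diag, dp[i - 1][j] + GAP, dp[i][j - 1] + GAP)
    return dp[-1][-1]

print(align_score("GATTACA", "GATTTACA"))
```

Players of a game like Phylo are, in effect, searching this same space of alignments visually, often finding high-scoring arrangements that heuristic solvers miss.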

Foldit is a protein-folding game capable of solving puzzles that have challenged professional scientists for years, such as the optimum folding patterns of the chains of amino acids that make up enzymes and proteins. In one celebrated case, gamers deciphered how an enzyme of an AIDS-like virus is constructed, taking only three weeks to create an accurate model of the solution.

The 40,000 registered users of the game Planet Hunters have identified 69 potential new planets in data from NASA’s Kepler Space Telescope, part of the search for habitable planets beyond earth’s solar system.
A mix of professional and crowdsourced volunteer astronomers has also helped with survey observations, such as those coordinated by ESA’s space hazards team, which recently found an asteroid that could pose an impact threat to earth- 2011 SF108. The project requires visual image evaluation by humans beyond automated computer analysis. A survey of 300,000 spiral galaxies is also underway, using superior human image discrimination to identify those with central bars.

This cornucopia of knowledge provided by science outsiders is in most cases no less rigorous or ‘scientific’ than that ‘discovered’ by insiders or professionals in a multimillion dollar laboratory or observatory. Today the technological gap between citizen and traditional science is also closing in several major ways, particularly in relation to the instrumentation and techniques now available to the outsider. These include- powerful computers, publicly available data sets and algorithms, Google maps, telemetry of all types, virtual reality simulations, virtual telescopes and observatories and inexpensive desktop laboratories- all harnessing the power of the Web.

An analogy is the ease with which computing amateurs can now use professional tools and templates to translate creative ideas into professional-quality end-user applications.

Networked Science-
Using specialist online tools, massive research cooperation through the power of the Web is becoming a reality. Network science projects can amplify collective intelligence, linking relevant disciplines and skills to best solve problems across institutions and nations; activating latent expertise and dramatically speeding up the rate of discovery across all sciences.

Remote interdisciplinary collaborations, or virtual research teams, are also proving invaluable in the knowledge generation process. STEM- Science, Technology, Engineering and Maths- is one such integrated field. Specialists meet at a number of university centres to tackle problems in areas of national interest such as sustainability, neuroscience and systems biology, piecing together a narrative between disconnected information databases.

In ecology, the Long Term Ecological Research network, composed of 26 research sites, has been operating for 30 years. Now a group of PhD students has set up a network of small-scale experiments- the Nutrient Network- to understand influences on the structure of grasslands, with scientists volunteering their services at 68 sites in 12 countries, without the need for major grants. This is an example of big science being carried out on a shoestring budget through global networks of volunteer scientists.
As we as a species begin to take on the characteristics of a super-organism through integration with the Web, the level of such networked problem-solving power will increase exponentially. As underlying laws and patterns are discovered, the knowledge is also increasingly converted into algorithms that can automatically fly a jet aircraft, analyse genome sequences in minutes, diagnose illnesses or run a chemical factory.

Open Science-
Science is increasingly seen as a public enterprise, not a separate world mediated by remote experts. Institutions such as the British Royal Society are playing a leading role in publicly advocating the disclosure and sharing of scientific information.
In addition science communication and education is booming, with knowledge readily available not only through a host of popular science publications such as New Scientist and Scientific American, but through Google and thousands of free sites such as Wikipedia and ScienceDaily.
Publication of professional science research is also becoming more flexible, allowing more direct access channels to the public outside the peer-reviewed journals, most notably arXiv. But the top-down attitude of many scientists still prevails- that the scientific process must only involve professional scientists, and that societal implications should be communicated to the public and policy makers as foregone expert conclusions open to minimal public debate. But the critical scientific decisions now faced by humans are also part of the fabric of a democratic society and therefore must be based on the free and open flow of information.

Open Science also manifests in more ethical science. There has been enormous pressure on researchers for positive results and a steady decline in published studies where findings have contradicted current scientific hypotheses. Negative findings attract fewer readers so scientific journals tend to reject these more often.

Government grant agencies such as the NSF and NIH also need to become more flexible, working with scientists to develop more open ways of sharing knowledge discovered with public support- for example by encouraging scientists to submit findings in forms beyond the published paper, such as computational visualisations and popular science books. Agencies should also participate actively with the public in solving endemic social problems such as climate change, conflict and food capacity.

Artificial Intelligence-based Science-
There are many AI techniques in use today including- fuzzy logic, neural networks, decision networks and swarm intelligence.
But the most powerful of all is based on the generic version of the evolutionary process itself- the Evolutionary or Genetic Algorithm- EA.

Although this technique has been widely used for over ten years to optimise and discover design solutions, it has reached a new level recently in the form of easy to use software tools such as Eureqa.
Eureqa is being applied to search not just for new patterns, but for new laws of science. This is achieved by repeatedly combining and testing simple mathematical expressions to create equations, selecting those that best reflect reality. But to achieve law discovery it must also find the invariants of a system- the quantities that remain constant as the system evolves over time.

Eureqa is an ideal tool for researching systems that are too complicated to follow set rules, and has already been applied to timetabling, aircraft wing design, network topology, financial forecasting and particle physics simulations.
But a recent landmark example was the rediscovery of the Law of Conservation of Energy from scratch- a discovery that originally consumed the intellectual energy of many scientists over hundreds of years. Using data from a double pendulum, Eureqa took only one day.

But its significance goes well beyond this. It is being applied to discover new theorems increasingly beyond the cognitive capability of its human counterparts- beyond the limits of human knowledge. It also has the potential to alter the balance between insiders and outsiders, because it empowers both groups. Eureqa can discover new patterns and relationships that the professional can then build on to prove new theories. The citizen scientist can do the same without needing to work within a traditional science framework- moving from analogical conception to solution, just like the gamers of the Foldit project.
It was recently applied to determine the existence of a deep law that controls cell differentiation. It produced a biological law of invariance equivalent to the conservation law in physics. It is also being applied to understand the decision network of a bacterium as it changes into a spore to determine which factors switch on genes and which genes control others, involving a huge network of probabilistic interactions between biomolecules.

Equally, Eureqa could and will be applied in the future to all complex scientific disciplines- economics, biology, social sciences and climate science and even perhaps to solving the universal Theory of Everything. The combination of descendants of the Web and Eureqa could achieve this within the next decade.

AI-based science is an extension of the scientific method, but not essentially different from it. The output of an EA-generated hypothesis is tested against a reality-based goal to check how accurately it reflects that reality, and each new generation is a refined variation of the original hypothesis.
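The evolutionary loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration- candidate hypotheses (here just the two coefficients of a straight line) are scored against observed data, the fittest survive, and the next generation is a mutated variation of the survivors. Eureqa's real search over symbolic expressions is far richer, but the loop structure is the same.

```python
# Minimal evolutionary-algorithm sketch: evolve y = a*x + b to fit data.
import random

random.seed(42)

# "Reality": observations generated by a hidden law, y = 3x + 1.
data = [(x, 3 * x + 1) for x in range(-5, 6)]

def fitness(hyp):
    """Negative squared error against the observations; higher is better."""
    a, b = hyp
    return -sum((a * x + b - y) ** 2 for x, y in data)

def mutate(hyp):
    """Next-generation variation: small Gaussian perturbation."""
    return tuple(c + random.gauss(0, 0.3) for c in hyp)

# Initial random population of hypotheses.
pop = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(30)]

for generation in range(200):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:10]                                   # selection
    pop = survivors + [mutate(random.choice(survivors))    # variation
                       for _ in range(20)]

best = max(pop, key=fitness)
print(f"best hypothesis: y = {best[0]:.2f}*x + {best[1]:.2f}")
```

Because the best hypothesis always survives into the next generation, fitness never regresses, and after a couple of hundred generations the population converges on coefficients close to the hidden law.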

Computational Science

The process of scientific research is expected to change more fundamentally over the next three years than in the previous three hundred. The epoch during which individual humans are able to conceptualise or understand increasingly complex phenomena- the human genome, for example- could be coming to an end, even though the power of computational visualisation is extending this capability. There are simply too many interacting variables for scientists to gain an intuitive understanding of the potential interactions of cells with millions of chemical outcomes.

With Big Data now a fact of life in all disciplines, combined with a discovery program such as Eureqa, 90% of the traditional work can now be done by the Web.

Mathematics- the science of patterns in number systems, abstract shapes and transformations between mathematical objects- requires additional capabilities beyond the human brain to complement our intuition, for example to allow exploration of the properties of trillions of prime numbers.
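A small sketch of what such machine-assisted exploration looks like in practice- sieving the primes below a bound and tabulating the frequencies of the gaps between consecutive primes, the kind of bulk pattern-hunting that quickly outruns unaided intuition:

```python
# Sieve of Eratosthenes plus a tally of prime gaps.
from collections import Counter

def primes_below(n: int) -> list[int]:
    """Return all primes < n using the Sieve of Eratosthenes."""
    sieve = bytearray([1]) * n
    sieve[0:2] = b"\x00\x00"          # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # Cross off every multiple of p starting at p*p.
            sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
    return [i for i, flag in enumerate(sieve) if flag]

primes = primes_below(100_000)
gaps = Counter(b - a for a, b in zip(primes, primes[1:]))

print(f"{len(primes)} primes below 100,000")
print("most common gaps:", gaps.most_common(3))
```

Scaling the same tabulation to trillions of primes is exactly the regime where the computation, not the concept, becomes the limiting factor for the human explorer.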

Meta-Evolution and the Rise of Science 3.0-

Great scientists throughout history have nurtured the ‘creative gene’ which allowed them to soar to new heights and great new conceptual horizons, which they then backed with rigorous mathematics and experimentation. Emulating the ‘creative gene’ of humans could be the next big step for the Web. But regardless, the discovery process is due to accelerate, as the global mind expands and each new insight catalyses countless others.

As previously defined, the scientific method is recognised as the primary knowledge generator of modern civilisation. Through the rigorous process of verification, applying both inductive and deductive logic, causal relationships governing physical phenomena are defined and tested in the form of hypotheses and models.

In fact it bears a striking similarity to the larger process of evolution itself and, the author maintains, is a subset of that process. The difference is one of complexity and scope. Evolution is a universal generic process of adaptation and optimisation, which selects and amplifies the most appropriate response of a system to its environment. The scientific method is a meta-method or methodology aimed at selecting the most appropriate theory or causal model within an experimental environment.

The key to better understanding the scientific method, is that it is not a single process but a number of sub-processes, which are continually evolving. These processes include- initial conceptual insights, general hypothesis formulation, experimental framework development, data access and capture, pattern analysis of information and evidence extraction, hypothesis modelling based on inference logic, algorithmic formulation, experimentation, testing and validation of procedures, hypothesis optimisation and refinement.

Each of these sub-processes within the overall framework of the method is therefore evolving as scientific advances continue. The scientific method was not invented or discovered in its present form, but evolved and continues to evolve, using processes common to all adaptive learning. It is therefore logical that a deeper understanding of evolution will also provide a deeper understanding of the future of science.
The author believes the two are intimately connected. A greater understanding of the evolutionary nature of the scientific method will lead to an acceleration of the process of new theory discovery, which in turn will accelerate the broader process of evolutionary knowledge discovery.

There is no end to this process. All current major theories, including Relativity and the Standard Model of Quantum Physics, are constantly being optimised, refined and sometimes radically reinterpreted.
In his book, The Future of Life: A Unified Theory of Evolution, the author has predicted the emergence of an imminent phase change in the evolutionary process- Meta-Evolution, or Evolution Mark 2.0.

As human society begins to achieve a deeper and more intimate understanding of the primary evolutionary imperative shaping its destiny, he predicts this new form of evolution will emerge, accelerating the already exponential pace of change to an extreme level.
The process will manifest in response to a deeper understanding by society of the implications of the evolutionary process itself. The resulting amplifying feedback will generate a glimpse of life’s true potential, accelerating the already massive momentum of evolution and its scientific counterpart, resulting in a rate of change that is forecast by a number of scientists and forecasters to reach hyper-exponential levels in the near future.

In addition, Meta-Evolution will be reflected in the evolutionary future of science itself. Science is already on the path to escaping its current institutional shackles, through the revolution in Citizen, Open, Networked and AI-based Science; harnessing the creative realignment of the physical and social sciences and the future Web 4.0’s massive computational intelligence. Combined with Meta-Evolution, this phase will attain astounding outcomes and benefits for society by mid-century, and provide possible redemption for the gross exploitation of our planet. The evidence is already visible in the genesis of a starburst of major projects such as FuturICT and Paradiso ICT.

This will be none too soon as mentioned, as the problems of global warming, ecological disasters, sustainability, economic collapse and endemic conflict ravage the planet. Meeting these challenges will involve active adaptation through problem solving – the heart of the evolutionary process.

Already an awareness of the pervasive and rapidly accelerating power of evolution is beginning to be felt through the enormous scientific, technological and social advances in our civilisation. This insight creates the evolutionary feedback loop- to actively engage evolution in helping meet today’s complex survival challenges. This further accelerates the knowledge discovery process, which in turn would generate further evolutionary insight and application.

A significant additional impetus would therefore be gained from a deeper understanding of the driving role of the evolutionary paradigm- a global awareness of the engine underlying life's progress. This would eventually create an explosive realisation of life’s future potential.
As this rate of knowledge acquisition increases to the point of incompatibility with the human capacity to absorb it, new social structures and modes of cognitive processing based on artificial intelligence techniques will emerge to help humans cope.

According to the author, this is already occurring. Even as the amount of information expands beyond human horizons, we are developing techniques to bring it under control. Like a fractal image, cybernetic life forms and intelligent machines are evolving in the same way as biological life- mutating to become increasingly intelligent. These act as proxies for humans, managing complex processes and roaming cyber-space- searching, filtering and processing data from an already overwhelming pool.

Hyper/Meta-Evolution can be expected to become part of a new global paradigm within the next twenty years- by around 2030- based on current rates of knowledge growth and coinciding with the evolution of the super-intelligent Web 4.0. This will rapidly transform all aspects of our culture and civilisation, including accelerating the rise of Science 3.0.

Saturday, December 17, 2011

The Future of Civilisation 3.0

The author argues that the combination of the recent triple major disaster in Japan, the GFC and looming world recession, together with the Arab Spring and global Occupy movement, have provided the final triggers for the rapid evolution of Civilization 3.0 in the fight for human survival.

These world-shaking events send a timely message to the rest of the world that the form of civilisation and the social norms we have become accustomed to and lived by over the last few centuries are over.

Civilisation 1.0 began over 15,000 years ago with the founding of the earliest settlements and villages around the world, as hunter gatherers settled down to take advantage of the rich sources of edible grasses and natural foods growing mainly around the fertile delta areas of the great rivers and coastal areas of the world. These early habitats evolved into the first towns, cities and eventually nation states. At the same time, the first writing and number systems evolved to keep track of the products and services that developed and were traded by modern humans. Also with the growth of towns and trade across the world and the use of wood for building and smelting, the clearing of the forests across Europe began.

Civilisation 2.0 then emerged, with more sophisticated means of production using wind and water; rapidly accelerating following the industrial revolution in the 18th century with the harnessing of steam and later electricity and combustion engine power. These innovations were dependent on the burning of fossil fuels- coal and oil on a massive scale, allowing the West to steal a march on the rest of the world; colonising its populations and exploiting its wealth.
The manufacture of goods and services then increased on a massive scale; everything from food, textiles, furniture, automobiles, skyscrapers, guns and railway tracks- anything requiring the use of steel, cement or timber for its production.

The major cities expanded on the same original settlement sites as Civilisation 1.0- coastal ports and fertile river-delta flood plains- regardless of the risk from subsidence and earthquakes.
The energy revolution was rapidly followed by the communications and information revolution- grid power, telephone, wireless, radio, television and eventually computers. Later, nuclear power was added to the mix.

Then in the second half of the 20th century the realization finally dawned that the planet’s resources really were finite, following the many dire predictions for decades previously.
Now at the current rate of consumption by a global population of 7 billion, projected to grow to 9 billion by mid-century, combined with the Armageddon of global warming, the planet is rapidly running out of fresh water, food and oil. At the same time the grim effects of escalating levels of carbon in the oceans and atmosphere have triggered more frequent and severe weather events- major droughts, floods and storms- adding to the impact of earthquakes in highly populated urban areas.

These problems are now bigger than any one nation can handle and can only be effectively addressed on a global basis. This will inevitably need to be coupled to a higher level of social awareness and democratic governance, in which everyone, not just politicians, is involved in the key decision processes affecting the planet.

Welcome to Civilisation 3.0.

Civilization 3.0 is just beginning, but is already being tested. From now through the rest of this century comes the hard part. Tinkering around the edges won’t do it for the planet and its life- including humans, any longer.
The planet’s climate is already in the throes of runaway warming, regardless of what forces caused it, because of the built-in feedback processes from the melting of the ice sheets in Greenland and Antarctica, to the release of huge methane reserves in the northern tundra and ocean floor.

But this is just the start of our problems.
Some areas may get a short term reprieve with local cooling, but overall the heating process appears to be unstoppable. The Faustian bargain that humans struck to establish Civilisation 1.0 and 2.0, when the planet was teeming with natural resources, is about to be redeemed. Humans are being called to account.

By 2020 the cost of solar, wind and biofuels is likely to be at a baseline level comparable to that of fossil fuels, due to major technological advances currently underway, such as artificial photosynthesis. But because of the flawed democratic process, major businesses and corrupt governments can still undermine the critical mindset needed for radical change, with calls for short-term profits drowning out the desperate call by future generations for long-term survival. It is therefore highly likely that we will still be emitting copious amounts of carbon by 2020 and starting to exceed the safe limits of temperature rise.
In addition, supplies of fresh food and water, particularly in developing countries are already dwindling, with the potential to create further malnutrition and conflict.

So Civilisation 3.0 has to get serious.

One of the major recent initiatives at the heart of the fight-back revolution is the concept of a smarter planet. The Japanese experience has now reinforced that concept. Every built object and operational process will eventually need to be embedded with sensors and its performance and integrity continuously monitored and assessed in relation to natural disasters and sustainability.

Everything from roads, transport, bridges, railways, buildings, dams, power plants, grids and information systems, as well as human knowledge and skill capacities, will need to be urgently upgraded. Even towns and cities will have to be redesigned to avoid future worst case natural and manmade disasters and provide a more sustainable living space for future generations.
In addition, the loss of critical ecosystems and species will compound the infrastructure problems of the planet, requiring re-prioritization of the value of the natural environment and fair re-allocation of its resources on a global scale.

The current level of risk and waste in the built environment is now seen as both unacceptable and avoidable. By applying new technologies already available such as smarter materials, safer engineering methods, improved communications and sophisticated computer modeling, risk can be dramatically reduced.

The new sustainability standards will need to be set much higher; at a much smarter level than previously accepted, in order to reduce carbon emissions, optimize performance and enable more responsive adaptation within a fast deteriorating physical and social environment. This will be mandatory as the escalating scale of the risk becomes apparent.

At the heart of this revolution will be the powerful mathematical algorithms and intelligence capable of making optimum decisions at a far greater speed and with less human intervention. In turn this will require instant access to the Intelligent Web’s global resources of specialized knowledge, artificial intelligence and massive grid computing power.

By 2030 however, panic will be building across the globe. The safe level of temperature rise of 2°C, expected to hold until the end of the century, will likely be breached and physical and social problems will escalate.

Any realistic solution for human survival will require living and working together cooperatively and peacefully as one species on one planet, finally eliminating the enormous destruction and loss of life that wars and conflict inevitably bring.
Although cooperation on a global scale will be vital, individual nations will be tempted to free ride, as populations react with violence and anarchy to shortages of basic necessities through rising prices and inadequate infrastructure, particularly in hard-hit developing economies.

A massive mind shift will be required across the planet to achieve this level of cooperation; a more collaborative and creative process will need to evolve and quickly, harnessing all human knowledge and technological resources. To achieve this level of cooperation non-democratic states will need to democratize or be excluded from the resulting benefits and the old forms of democracy will have to be upgraded to a more inclusive and participatory level if human civilization is to avoid slow annihilation.
The stress of the human fight for survival will also present myriad flow-on challenges in maintaining a cohesive social fabric. Democracy and justice are baseline requirements, but providing adequate levels of health, work and education will also get a lot harder. This will require adaptation on a vast scale.

By 2040 the trendlines will be set and through the social media, the risks will need to be openly and clearly relayed to all populations. This will be similar to the collective discipline and mindset required many times in the past by nations threatened by the fear of war and decimation.
It will now need to be replicated on a global scale.

Beyond increasing renewable energy and reducing waste, the fight for survival will require the implementation of other more radical innovations, including the eventual geo-engineering of the weather and climate. The science and technology needed to achieve such a complex outcome is unlikely to be achievable before 2050 and in the meantime our civilization may be in free fall. However it will probably be the only solution capable of reversing rather than just slowing the headlong rush to chaos.

Other radical solutions will involve the need to accelerate our level of knowledge generation. This is already taking place through advanced methods of automatic pattern analysis and algorithm discovery, applying artificial intelligence methods and the immense computational intelligence of the Web.

It will be a bootstrapping process. The faster the increase in knowledge acquisition, the more powerful the potential intelligence of the Web will become, which will then further accelerate the increase in life-saving expertise. This exponential process may be further accelerated by promoting higher levels of networked ‘swarm’ behavior, combining human intelligence on a grand scale across the planet. The benefits of collective intelligence acting like an advanced insect hive are already being realized, with research teams combining in larger and larger groups to solve more and more difficult problems. It has been demonstrated that an increase in synergy resulting from collective intelligence in complex self-organising systems allows ‘smarter’ problem solving as well as greater decision agility.
For example, 50 European universities have recently combined in the FuturICT project, a billion-euro EU flagship project to model, predict and solve future planetary and social problems. And this is only one collective project out of thousands, with increasing collaboration between US, European and Asian science and technology groups.
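The statistical core of the collective-intelligence claim is simple: when many independent, noisy estimates are pooled, their errors partially cancel, so the group estimate is far more accurate than any typical individual. A toy sketch, with all figures arbitrary and chosen only for illustration:

```python
import random
import statistics

random.seed(42)  # deterministic run for illustration

# Hypothetical scenario: 1000 agents independently estimate an
# unknown quantity, each with Gaussian noise around the true value.
TRUE_VALUE = 100.0
N_AGENTS = 1000

estimates = [random.gauss(TRUE_VALUE, 10.0) for _ in range(N_AGENTS)]

# Typical error of one agent vs. error of the pooled (mean) estimate.
individual_error = statistics.mean(abs(e - TRUE_VALUE) for e in estimates)
crowd_error = abs(statistics.mean(estimates) - TRUE_VALUE)

print(f"typical individual error: {individual_error:.2f}")
print(f"collective estimate error: {crowd_error:.2f}")
```

Because independent errors scale down roughly as one over the square root of the group size, the pooled estimate here is more than an order of magnitude closer to the truth than a typical individual guess; this variance-reduction effect, not any mystical hive property, is the baseline mechanism behind "wisdom of crowds" results.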

With all these initiatives, will Civilization 3.0 survive?

It will likely be a very close call, dependent largely on whether our increase in beneficial knowledge can outstrip the planet’s rapid descent into environmental and social oblivion- a potential runaway pre-Venusian scenario with no end in sight.

It is similar to the Red Queen scenario in Lewis Carroll’s Through the Looking-Glass, in which the red chess queen has to run faster and faster just to maintain her position. Humans will also have to become smarter and smarter just to stay ahead of the approaching Armageddon.

The odds in fact will be very similar to the climate bottleneck that almost eliminated our early Homo sapiens ancestors 20,000 years ago as they struggled to survive the last ice age. Only a small band of perhaps several hundred survived thousands of years of frozen hardship, finally regrouping and reaping the rewards that evolved following the great melt.

Modern humans can also reap a future cornucopia if they have the courage and skill to survive the looming crisis in our evolution.

Many other civilisations across our universe may well have faced a similar bottleneck. Those that survived will have gone on to reap the untold riches of Civilisation 4.0 with its mastery over the physical laws governing our world and galaxy. Along the way Civilisation 5.0 will emerge, possessing not only the immense scientific capability needed to solve any physical problem, but enough wisdom to avoid future social catastrophes.

The stakes couldn’t be higher. The Japanese catastrophe and many others, including the Indian Ocean earthquake and tsunami in 2004 leaving 300,000 dead, should have given us all a clarion call.
This is not a bad dream, from which we’ll all awake tomorrow with business as usual. The future of Civilisation 3.0 and our unique intelligent life-form really is in the balance. Let us hope ours will be one of the few or perhaps the only advanced civilisation to have survived such a test, so that our children and our children’s children can live to experience the untold wonders of our planet and universe.

But the Red Queen will have to run very fast indeed.

Friday, October 14, 2011

The Future of Diplomacy

The author argues that the current model of Diplomacy is out of synch with the new participatory model of democracy in the 21st century and needs to be replaced with a more inclusive vision.

The age-old art of Diplomacy was never going to cut it in the 21st cyber century.

Let’s take a closer look at the heart and soul of the traditional model of Diplomacy.

According to Wikipedia, Diplomacy is based on the art and practice of conducting negotiations between representatives of groups or states, including the conduct of international relations through the intercession of professional diplomats, usually relating to matters of peace-making, trade, war, economics, culture, the environment and human rights; with treaties usually negotiated by diplomats prior to endorsement by national politicians.

It also employs a number of techniques to gain a strategic advantage or find mutually acceptable solutions to a common challenge. This is basically a process of vetting, exchanging and assessing information with the overall aim of each major player extracting an advantage. One might sum this up in the modern context as the informal application of Game Theory to the social sciences.
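The Game Theory framing can be made concrete. The sketch below (all payoffs hypothetical) models a two-party negotiation as a prisoner's dilemma and enumerates its pure-strategy Nash equilibria by brute force: mutual defection turns out to be the only stable outcome, even though mutual cooperation would pay both sides more, which is exactly the trap that diplomatic back channels are meant to negotiate around.

```python
from itertools import product

# Hypothetical 2x2 "negotiation" game (a prisoner's dilemma):
# each state chooses to Cooperate (C) or Defect (D).
# Payoff tuples are (row player, column player); numbers are invented.
payoffs = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}
strategies = ["C", "D"]

def is_nash(row, col):
    """A profile is a pure Nash equilibrium if neither player
    gains by unilaterally switching strategy."""
    r_pay, c_pay = payoffs[(row, col)]
    best_row = all(payoffs[(r, col)][0] <= r_pay for r in strategies)
    best_col = all(payoffs[(row, c)][1] <= c_pay for c in strategies)
    return best_row and best_col

equilibria = [p for p in product(strategies, strategies) if is_nash(*p)]
print(equilibria)  # the sole equilibrium is mutual defection: [('D', 'D')]
```

The point of the sketch is structural, not numerical: whenever defection dominates cooperation for each party individually, the jointly worse outcome is the stable one, and escaping it requires exactly the repeated interaction, reputation and side agreements that diplomacy trades in.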

Diplomacy therefore acts as more of a back channel to the day-to-day negotiations carried out between politicians, expert advisers and bureaucrats, and still operates largely out of public view.
From its European beginnings in ancient Greece, such behind-the-scenes negotiations were considered beyond the remit of the general population and were conducted largely in secret. The criteria for deciding this cloaking of information interplay were inevitably fuzzy, depending on the perceived political sensitivities of the issues involved. This fuzziness continues today, ranging from outright censorship and obfuscation, to qualified and partial disclosure on a need-to-know basis, to the open and transparent release of all relevant information, albeit governed by a minimum time limit between an event and its disclosure; the rationale being that past decisions may with hindsight prove embarrassing or adversely impact ongoing resolution.

Of course the semantics of diplomacy have always been fuzzy - as indeterminate in fact as a government finds expedient. And it is argued that this is an inescapable fact of life because human behavior and politics are not precise sciences, but based to a large degree on the shifting sands of perception, uncertainty and personal bias.
Gleaning reliable information at a dinner party on the state of trade, build-up of arms or a country’s UN voting intentions is increasingly unlikely to yield any information of significant value, as the contents of CableGate have demonstrated. Such information, until recently, was supposedly kept safe from public or competitive gaze in encrypted government cyber vaults. Now however the credibility of this scenario is in tatters as ‘state secrets’ are increasingly being disinterred for public consumption by whistle-blower organisations such as WikiLeaks. But the sky hasn’t fallen in – primarily because of the limited relevance of this ‘top secret’ data menagerie.

But the primary reason for the rapidly approaching use-by date of diplomacy in its current form is its conflict with the increasing transparency of modern democracy.

At the end of the 20th century 119 of the world’s 192 nations were declared electoral democracies. In the current 21st century, democracy continues to spread throughout Africa and Asia and significantly also the Middle East, with over 130 states in various stages of democratic evolution.
Democracy is supposedly based on the principle that populations should have access to the reasoning behind national policy so as to better participate in the decisions that affect their lives. This principle should therefore extend to the practice and protocols of diplomacy as it provides part of the input to the decision-making process.

In other words, democracy although imperfect, should offer each individual a stake in the nation’s collective decision outcomes and on that basis, behind the scenes diplomatic maneuvering can act to subvert that process. For example 80% of the UK population was against involvement in the Iraq war. At the same time the UK Prime Minister Tony Blair was actively working behind the scenes with the US government to impose his own view and counter the groundswell of public opinion. The same counter-democratic process occurs over and over again on critical issues affecting government policy throughout the world, legitimising the covert and often regressive nature of diplomacy.

But democracy, as with all other processes engineered by human civilisation, is still evolving. A number of indicators are pointing to a major leap forward, encompassing a more publicly participatory model which harnesses the expert computational intelligence of the Web.
By the middle of the 21st century, such a global version of the democratic process will be largely in place.

However the cloak-and-dagger practices still embedded in the standard model of diplomacy are putting it at odds with this more open participatory model. The business of diplomacy is therefore increasingly out of synch with the modernisation of democracy: the gap is widening and the rumblings of disquiet are becoming louder.
The evolution of democracy can also be seen in terms of improved human rights. The United Nations Universal Declaration of Human Rights and several ensuing legal treaties, define political, cultural and economic rights as well as the rights of women, children, ethnic groups and religions. This declaration is intended to create a global safety net of rights applicable to all peoples everywhere, with no exceptions. It also recognises the principle of the subordination of national sovereignty to the universality of human rights; the dignity and worth of human life beyond the jurisdiction of any State.

Diplomacy has traditionally soft pedaled on such human rights issues in private forums, but it is increasingly seen as the wrong approach and the wrong forum. Issues relating to democratic rights are best delivered in the open forums of the UN or other public institutions open to the media. Even if diplomacy is restricted to pre-vote lobbying, the modus operandi should be made transparent to avoid the potential for a conflict of interest and corruption that has often surfaced for example in Olympic and world sporting venue lobbying.

The spread of democracy is now also irreversibly linked to the new cooperative globalisation model. The EU, despite its growing pains, provides a compelling template; complementing national decisions in the supra-national interest at the commercial, financial, legal, health and research sharing level, openly within the European parliament. While lobbying pressure from voting blocs is still commonly applied, the political and philosophical bias of the lobbying groups is in most cases transparent.

The raison d’etre of diplomacy therefore is to oil the decision wheels of democracy, and as such the basis of its operation should be equally available to all members of a democratic state. Diplomats as well as politicians have party allegiances and obligations that can and do create serious conflicts of interest and skew the best of democratic intentions.
The global spread of new technology and knowledge not only provides the opportunity for developing countries to gain a quantum leap in material wellbeing, but is an essential prerequisite for a stable democracy, limiting the value of traditional back channel diplomacy. Such cyber-based advances therefore presage a much more interactive and open public form of democracy and mark the next phase in its ongoing evolution.

Web 2.0’s social networking, blogging, messaging and video services have already significantly changed the way people discuss political issues and exchange ideas beyond national boundaries or political controls. In addition a number of popular sites exist as forums to actively harness individual opinions and encourage debate about contentious topics, funneling them to the political process. These are often coupled with online petitions, allowing the public to deliver requests to Government and receive a committed response. As this back channel explodes it leaves little space for former closed room hearsay.

In an age of Google and satellites, no information is sacred- not even the site of a nuclear facility in North Korea, a logging fire in the Amazon or an aircraft carrier in the South China Sea. There are also a plethora of specialized smart search engines and analytical tools aimed at locating and interpreting information about divisive and complex topics such as global warming and stem cell advances. These are increasingly linked to Argumentation frameworks and Game theory, aimed at supporting the logical basis of arguments, negotiation and other structured forms of group decision-making. New logic and statistical tools can also provide inference and evaluation mechanisms to better assess the evidence for a particular hypothesis.
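The evidence-weighing machinery mentioned here is, at bottom, Bayesian updating: a prior degree of belief in a hypothesis is revised by how strongly the observed evidence favours it over its negation. A minimal sketch, with every probability invented purely for illustration:

```python
# Hypothetical inputs (all figures chosen for illustration only):
p_h = 0.30              # prior probability that hypothesis H is true
p_e_given_h = 0.80      # probability of seeing the evidence if H is true
p_e_given_not_h = 0.20  # probability of seeing the evidence if H is false

# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) marginalises over both cases.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
posterior = p_e_given_h * p_h / p_e

print(f"posterior P(H|E) = {posterior:.3f}")  # ≈ 0.632
```

Evidence four times likelier under the hypothesis than under its negation lifts a 30% prior to roughly a 63% posterior; chaining such updates over many independent pieces of evidence is how automated inference tools of the kind the author describes would score competing hypotheses.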

By 2030 it is likely that such ‘intelligence-based’ algorithms will be capable of automating the analysis and advice provided to politicians, at a similar level of quality and expertise to that offered by the best human advisers, including diplomats.

It might be argued that there is still a need for foreign affairs apparatchiks and diplomatic staff to lobby, interpret, nuance and promote the national interest on behalf of their political masters. But this role too is rapidly becoming redundant in the modern networked world, where every policy decision, utterance and shift in body language among political leaders is recorded, analysed and spread instantly around the world via the social media.

Decisions are being made increasingly in real time and the reasons behind them are also exposed and discussed around the world in real time. The days of closed clandestine meetings are over. News feeds and aggregation, blog and video sites ravenously ingest and recycle every piece of political and economic information captured by smart phones and citizen journalists for popular consumption on an endless 24/7 cycle.

The main argument for maintaining the diplomatic status quo is that making such social substrate information generally available might act against the ‘national Interest’ by providing a non-complying country with a significant negotiating advantage. In other words, not all sovereign states might be equally willing to share their inside knowledge with others.
This of course is self-serving nonsense.

The National Interest is almost always best served by open discussion and debate within its population and peers, with all information cards on the table, leading to better decision-making whether relating to welfare or war as well as making the policy makers more accountable. The record shows that the opposite almost invariably leads to corruption as well as bad decision-making skewed by political conflicts of interest. Those that don’t comply with future global ethical standards in political or financial dealings, as already with corruption, will pay a heavy price in lost investment and prestige in regional and global forums and accelerate the scourge of global cyber-war.

We are also accelerating towards a globalised society in which all major decisions by any one state will affect all others, whether relating to global warming, financial governance or technology advances. In such a world there can be no free riders. The decisions of every nation will be scrutinized continuously to ensure conformance with global governance norms. The stakes will be too high and the interests too interrelated for it to be any other way.

Information is increasingly spread equally around the world by the ubiquitous power of the web invading traditional walled gardens, using a variety of tools such as video phones, Google Maps and citizen journalism; with the results spread globally within seconds by social media sites. Information captured in this way is far more relevant than recycling snippets of ‘who said what’ at a late night bar.

Of course this shift to the general public’s involvement in the diplomatic process is strongly objected to by the thousands of bureaucrats and technocrats in the diplomacy industry, threatening the mystique of their raison d’etre. But inevitably their future role is more likely to be restricted to implementing policy – not creating or influencing it.
As the expert knowledge and expertise from the 7 billion minds connected to the world’s continuously updated storehouse of knowledge on Web 4.0 begins to permeate all levels of society, the pervasive influence of yesterday’s diplomatic overtures will finally dwindle away.

The recent exposure of gigabytes of leaked diplomatic cables and emails, like the Pentagon Papers before it, is an expression of the fault lines between the old and new worlds of diplomacy. WikiLeaks however is also an expression of the power of real democracy at work, where the connivances and shallowness of the substratum of diplomatic knowledge have been excavated and the Emperor has been found to be stark naked. What it exposed was a mountain of largely banal and uncomplimentary cross-talk relating to foreign governments and their machinations, with little real substance. The exception was video footage of a number of US atrocities, including the cold-blooded gunning down of Reuters correspondents and many other obscenities that had previously been covered up in the name of maintaining diplomatic relations.

Certainly information of substance such as the strength of a nation’s economy or trade surplus could have been more accurately and cheaply obtained by trawling the Web and in the future will be infinitely more reliable, extracted from a variety of expert sources.
The same could be said of information relating to current or future military conflicts, using Google maps to pinpoint the size and strength of nuclear installations or clandestine buildup of forces. Information relating to sensitive military planning and deployments will need to be kept securely encrypted, much more rigorously than previously, but this is hardly the stuff of diplomacy. It should not be confused with the need in a true democracy for full public transparency and acceptance of the arguments for and against involvement in a conflict in the first place.

As demonstrated in the Vietnam, Iraq and Afghanistan wars, governments are not above applying spurious non sequitur arguments for home consumption to justify their own agendas. Human rights information about political activists is another casualty of the skewing of diplomatic double-speak, still distributed on many news channels. But the web is rapidly reaching the point of knowledge and inference maturity that can expose most diplomatic obfuscation. It is also equipped, as mentioned, with algorithms to weigh evidence and the probabilities of truth in future decision processes.

Diplomacy’s future role therefore, if it is to survive at all, is as a complementary subset of the new democratic model, for an open cyber society. As such it must keep in lock-step with democracy’s fast evolving shift towards a more inclusive model, avoiding the artificial disjunction between overt and covert social knowledge management with its divisive overtones and elitist origins.










Friday, July 29, 2011

The Future of Migration

The author contends that the future of Global Migration is governed by two major drivers- Global Warming and the laws of physics that control the flow of information and knowledge across borders, which will inevitably be followed by a flow of education and human skills on a global scale.

This latter scenario is based on the physics of the Least Action Principle, which postulates that any dynamical process, whether the trajectory of a ray of light or orbit of a planet, follows a path of least resistance or one which minimises the 'action' or overall energy expended.

Physicist Richard Feynman showed that quantum theory also incorporates a version of the Action Principle and underlies a vast range of processes from physics to linguistics, communication and biology. The evidence suggests a deep connection between this principle based on energy minimisation and self-organising systems including light waves, information flows and natural system topographies, such as the flow of a river.
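Stated formally, in the standard textbook form rather than the author's own words: for a system described by a Lagrangian L, the physically realised trajectory q(t) is the one that makes the action S stationary, from which the Euler-Lagrange equations of motion follow.

```latex
S[q] = \int_{t_1}^{t_2} L\bigl(q, \dot{q}, t\bigr)\, dt,
\qquad
\delta S = 0
\;\Longrightarrow\;
\frac{d}{dt}\,\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0
```

Feynman's path-integral formulation recovers this classical principle as the limit in which paths far from the stationary one interfere destructively; the extension from physical trajectories to information or migration flows, as made in this essay, is an analogy rather than an established result.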

Information is now flowing seamlessly to every corner of the planet and its populations, mediated by the Internet and Web; reaching even the poorest communities in developing countries via cheap PCs, wireless phones and an increasing variety of other mobile devices.

Half the population of the developing world in Asia and Africa now have access to the Web via inexpensive mobile phones. Individual local farmers and small businesses increasingly use them to transfer money, track commodity prices and supplier deliveries and keep in touch with relatives and their community. They are also the ideal medium for transferring knowledge as the basis of the education process.

In sync with the flow of information and knowledge there is now a global flow of educational material online, including open-access courseware resources. Courseware is a critical resource already offered by a number of prestigious tertiary institutions, including the Massachusetts Institute of Technology, Yale and Harvard, in addition to free knowledge reference sites such as Wikipedia.

The trend-lines in this open learning revolution are already evident and will become pervasive in the near future. They include online 24 hour access to the Web, open content via free courseware, and real-time wireless web delivery; making it much cheaper and easier for the flow of knowledge to reach previously illiterate societies and communities, particularly as a generational shift takes place.

At the same time the human learning process is being driven by the need to adapt to a fast changing work and social environment, to provide ongoing support for society’s needs in the new cyber-age. This shift in turn is being driven by the increasing rate of knowledge generation providing new opportunities.

By 2030 the full power of the Web will be deployed towards this new paradigm. At the same time work practices will become increasingly fluid, with individuals moving freely between projects, career paths and virtual organisations on a contract or part-time basis; adding value to each enterprise and in turn continuously acquiring new skills, linked to ongoing advanced learning programs.

And so by 2040, the flow of information followed by the continuous flow of educational courseware, together with improvements in standards of living, will have largely eliminated the inequalities of skills and training that currently exist between developed and developing nations.

The Action Principle will finally allow the developing world to achieve equal status with the developed world in terms of access to knowledge, training and the realisation of human potential and facilitate the free movement of human workers and their families between workplaces globally.

Already there is a large transfer of skills between countries like India, with a vast pool of engineering and computer science graduates, and the West’s need for such skills. This may be in the form of virtual outsourcing or physical transfers of a skilled labour force on short term contracts. The same process currently operates between EU countries to fill capacity shortages on a regular and continuing basis.

At the same time as the information/education/workflow convergence is occurring at a worldwide level, two other major drivers of global migration are accelerating - global warming and global conflict.

Planet earth is now reaching a catastrophic tipping point, where it is realised that humans have probably left their run too late to limit global temperature rise to the maximum safe 2 degrees centigrade and atmospheric carbon levels to less than 450 ppm.

The evidence is starting to become apparent from a number of sources: the melting of the Arctic and Antarctic ice sheets and of the mountain snows feeding the major river systems in Asia and Africa; the disintegration of the northern tundra, threatening the release of vast amounts of methane; the catastrophic loss of biodiversity; the disruption of most ecosystems, including the coral reefs and tropical forests; ocean warming, threatening the phytoplankton base of the food chain; and increases in extreme climate-related events- droughts, floods, rising ocean surges in coastal areas, tornados etc. These are already threatening to overwhelm even the wealthier nations’ capacity to rebuild damaged and obsolete infrastructure.

Rampant global warming will inevitably lead to major disruption of the world’s food and fresh water supply chains, seriously affecting at least half the world’s population. This will result in vast migration movements as the rivers and food bowls of China, India and Africa dry up and deadly tropical diseases such as malaria and dengue fever spread.

In turn these factors will result in increasing social chaos and conflict unless managed on a global basis.

To stabilise the situation, the 1951 UN convention on refugees will need to be strengthened and expanded to establish a world humanitarian body with the powers to override national sovereignty and mandate the number of climate and conflict refugees that each region will be required to accept, according to capacity and demand.

Migration has always been a routine way of coping with floods and droughts going back to the earliest civilisations, when there were few borders and the numbers affected were trivial in comparison with today’s 7 billion population and its vast infrastructure.

The magnitude and frequency of environmental hazards is now beginning to place enormous pressure on the capacity of many communities to survive. The recent IPCC / Stern Review of the economics of climate change estimates that climate refugees will reach 200 million by 2050.

An idea of the coming wave of human migration can be glimpsed from a sample of recent natural disaster statistics, which do not include earthquake, volcanic or tsunami events.

Mexico was a source of 1 million environmental refugees a year during the 1990s, with increased hurricanes and floods also a root cause of its economic crisis.

Large-scale government enforced relocation programs in Vietnam and Mozambique moved hundreds of thousands of people to cope with worsening floods and storms in 2000.

Six million environmental refugees in China have been created by the expanding Gobi desert. Migration in China and India has also been greatly amplified by development of projects such as China’s Three Gorges, which displaced 2 million people.

The 1998 monsoon floods in Bangladesh covered two thirds of the country and left 21 million homeless.

In 2008, floods following Cyclone Nargis in Burma forced hundreds of thousands to flee, with little assistance from the Burmese junta.

In 2010, record monsoon rains in Pakistan caused the Indus River to burst its banks, causing millions to relocate.

Although most of these events created internal rather than external migration, it is unlikely that this will continue to be the case, with rising temperatures forecast to force tens of millions to move from tropical to more temperate regions, due to ongoing droughts over the next twenty years.

There are also an increasing number of conflict refugees from autocratic and despotic regimes and failed states. Tribalism and fear and suspicion of the ‘other’ is still strongly embedded in the DNA of human evolution, leading to scapegoating of migrant groups in tough economic times. Examples include Muslim harassment in Christian countries, Neo-Nazism in Europe targeting African refugees and inter-religious conflict in Asia and the Middle East.

The refugee diaspora has greatly expanded in conflict zones across the globe over the past two years, driven by upheavals in Afghanistan, Pakistan, Iraq, Somalia, Sudan and the Democratic Republic of Congo and the Ivory Coast, as well as persecution of ethnic minorities in China, Burma and Bhutan. Criminal violence, as now endemic in Mexico, is likely to add to this misery.

It is estimated that almost a million people are smuggled and trafficked across international borders each year by criminal organisations using increasingly sophisticated methods, linked to a range of other crimes- identity theft, corruption, money laundering, and violence ranging from debt bondage to murder- a trade earning of the order of $10 billion.

By 2030 mounting humanitarian crises are likely to make assistance to all climate and conflict refugees mandatory, as it is realised that a piecemeal national approach will result in far worse disruption to society through the uncontrolled spread of violence in a very unstable time.

Any country that avoids its international obligations and attempts to free-ride the system will be ostracised and severely sanctioned.

Europe already contends with a growing number of refugees from North Africa- economic, climate, disaster and conflict refugees alike- but with the upturn in Middle East violence and difficult economic times it is also battling xenophobia in its member states.

By 2040/50 most of the new migration infrastructure will be in place and communities will have to adjust accordingly. In an already largely globalised multicultural world where most nations have already accepted other cultures for several generations, even if begrudgingly, this will not be as revolutionary a development as many might expect.

It is therefore likely that the paradigm of controlled but flexible migration worldwide will cease to be controversial, and will instead be endorsed and managed under the auspices of the UN, as a globalised One Planet philosophy gains traction.

It will be the only solution capable of managing cross-border refugee flows in a time of looming climate disruption, and also the most economical means of allocating valuable human resources to areas of greatest need in a globalised, educated world, as humans fight to save their planet.

Friday, April 29, 2011

The Future of Tourism

By 2015- the nature of traditional Tourism will have radically altered in many ways.
There has already been a reduction in overseas travel and this trend will accelerate, as more travellers become aware that air travel contributes 3%-4% of global carbon emissions. This will increase the popularity of local destinations in all countries, including exploring local wildernesses and heritage sites, as well as exotic city theme parks. Communities in city and country areas with common interests will also take advantage of local resources to a much greater degree, creating their own local travel themes independently of the larger operators.

Travel will also need to become more eco-friendly and socially responsible, with travel operators offering a choice of carbon offsets such as tree planting. And as tourists also contribute to the risk of damage to fragile archaeological sites and pristine wildernesses, they will be encouraged to volunteer their skills to remediate the environments they visit, as part of a holiday package.

By 2030- many ecosystems will have disappeared or be at risk- reefs, coastal areas, forests and glaciers etc, while thirty percent of animal and plant species will have disappeared or be endangered. Tourists will be banned from most national parks and will rush to visit the last great cultural sites and wildernesses on earth before they disappear or are closed to humans. Following today's trend, most wild animal species will be viewed solely in zoos and theme parks.

Major cities and surrounding areas will become the main tourist hubs, offering not only traditional entertainment and cultural experiences, but also previously outdoor physical activities such as surfing, skiing, fishing and golfing, now in controlled, managed environments.

By 2050 tourism will have fragmented into myriad exotic experiences, often transacted in virtual and augmented realities simulating extraordinarily realistic and immersive environments involving all the senses. Gradually such travel experiences will become indistinguishable from previous realities- allowing unlimited options- trips into space and under the oceans, back in time to historic events and forward into future civilisations.

The Future of Cars


By 2015 most cars will be powered by electricity, with advanced lightweight lithium batteries capable of being charged rapidly at power outlets, as well as by hydrogen fuel cells. The new electric vehicle infrastructure enabling simple recharging and replacement of batteries and liquid hydrogen storage will be well advanced in major cities.

Computer systems will increasingly control all vehicle functions as standard- including those already in use for navigation, entertainment, collision avoidance, adaptive cruise control, anti-collision radar, safety crash protection, stability and automatic parking.

By 2020 in most larger cities, small efficient electric cars including single and dual passenger variations will be available for flexible and inexpensive hire for local transport needs via smart phone managed pickup pools, servicing urban neighbourhoods (Ref Future of Cities).

The major advance however will be in the form of fully automated cars capable of navigating autonomously, guided by sensor- and processor-embedded smart roads and transit corridors; obeying traffic laws and avoiding collisions with other objects and vehicles. They will also be capable of interpreting traffic forecasts and communicating via local networks with other vehicles to reduce road congestion. In addition they will be responsive to passenger requirements, linked via the wireless Web to their activity profiles- appointment schedules, regular destinations such as schools, child minding centres and leisure centres etc.
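The kind of vehicle-to-vehicle coordination described above can be illustrated with a minimal sketch: each car broadcasts a beacon of its position and speed on the local network, and every car trims its own target speed according to how many vehicles it detects in the corridor ahead. The `Beacon` structure, the 200 m lookahead window and the 20% back-off factor are all hypothetical choices for illustration, not a real protocol.

```python
# Illustrative sketch: vehicles share position/speed beacons over a local
# network and slow down when the corridor ahead is congested.
from dataclasses import dataclass

@dataclass
class Beacon:
    vehicle_id: str
    position: float   # metres along the transit corridor
    speed: float      # metres per second

def advised_speed(own: Beacon, beacons: list[Beacon],
                  lookahead: float = 200.0, max_speed: float = 15.0) -> float:
    """Reduce the advised speed in proportion to the number of
    vehicles detected within the lookahead window ahead of us."""
    ahead = [b for b in beacons
             if b.vehicle_id != own.vehicle_id
             and 0 < b.position - own.position <= lookahead]
    # Each vehicle ahead trims the advised speed by 20%, floored at 2 m/s.
    return max(2.0, max_speed * (0.8 ** len(ahead)))

traffic = [Beacon("a", 0.0, 14.0), Beacon("b", 50.0, 12.0),
           Beacon("c", 120.0, 10.0), Beacon("d", 400.0, 15.0)]

# Car "a" sees two vehicles within 200 m ahead, so it eases off;
# car "d" has a clear road and keeps the maximum speed.
print(advised_speed(traffic[0], traffic))
print(advised_speed(traffic[3], traffic))
```

A real system would of course add authentication, latency handling and fusion with on-board sensors; the point is only that congestion smoothing emerges from each vehicle reacting to shared local state.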

The car of 2020 will also be capable of providing and monitoring in-vehicle entertainment and communication, emergency assistance, scheduling and payment services for power charging, parking, security etc. Automated transit control will facilitate traffic streaming and congestion management, with specialised car, bus and cycle transit lanes in operation throughout most urban areas.

By 2030 individual cars will have transformed into autonomous transport pods or capsules for individual passenger urban use. Pods will link seamlessly to other minimum carbon-emission forms of transport for local neighbourhood and inter-urban movement- light metro rail, electric cycles, scooters and bicycles. Pod streaming infrastructure will link to smart transport hubs, providing automated fast electric urban and intercity pod/light rail/bus services.

By 2040 the car as we know it today will cease to exist in the developed world’s urban areas. In its place will be multipurpose intelligent transit pods- systems seamlessly linked and customised to individual and community needs. Most ground-based vehicles except for bicycles will be totally autonomous and humans will become passengers only. All instructions managing human and urban infrastructure interaction, such as pick-up/destination location and schedule requirements, will be relayed by mobile links and automatically accessed by the pod system via the Intelligent Web 4.0 (Ref Future Web).

By 2050 the first vehicles to take advantage of 3D transport will emerge. Multilevel transit systems will be suspended above the transport routes of cities with lower levels restricted to bicycles, scooters and walking. All levels will link with major transport hubs and metro trains for super-fast autonomous intercity and new low energy system air travel. All service and logistical decisions will be managed by adaptive algorithms via dedicated secure virtual networks of the Intelligent Web.

Humans and their transport infrastructure will be seamlessly and permanently networked.

Sunday, April 24, 2011

Future of Education

David Hunter Tow, Director of the Future Planet Research Centre forecasts an education revolution over the next 30 years that will include whole of life learning and personalised instruction in any topic, anywhere and anytime; driven by the knowledge explosion, the new social media and Intelligent Web.

Over the coming decades the process of learning and education will undergo a profound shift, from the traditional classroom/face to face method of knowledge transfer to a much more abstract model, where teaching will be largely separated from its current physical infrastructure, such as classrooms and campuses, in much the same way as the content of a printed book is becoming abstracted from its physical medium in digitised eBook form.

The human learning process will be driven by the need to adapt to its social environment; primarily the increasing rate of change in the knowledge and skills required to support future civilisation’s technology, culture and services. This will occur in tandem with society’s immersion in an increasingly cyber environment.

The Education Revolution is therefore inextricably linked to two major drivers -

The Knowledge Revolution- the hyper-fast generation of information and knowledge processes;

The Cyber Revolution- the transformation of the world’s knowledge base, including all processes and services to digital form, distributed via the Web.

The trends in this revolution are already evident and will become pervasive in the near future on a global basis. They include online teaching access, open content, real-time wireless web delivery, independent courseware provision, virtual reality teaching environments and lifelong education at a personalised level.

By 2020- online education will dominate university and school learning. This will allow resources and investment to be more effectively applied to the quality and delivery of courseware- anywhere, anytime.

A growing acceptance of online learning in major universities and schools has already generated the development of hybrid curricula- a combination of online options and traditional face to face classroom teaching and tutoring. However some institutions, such as the global Phoenix University, already operate solely as virtual campuses, offering global online courseware.

This shift will have many beneficial flow-on effects for both individuals and communities, including reduced travel time and more flexible delivery of courseware, making learning more affordable and accessible, particularly for working and part time students.

Studies have already suggested that students in online learning environments perform as well as or better than those receiving face-to-face classroom instruction. Traditional teaching institutions therefore will find it increasingly difficult to compete with online cyber innovations on a cost, convenience and quality basis.

The Open Content movement is also generating momentum in the new educational universe. Open content allows all sectors of society, including poorer populations in developing nations, to benefit from the education revolution, using the Web to deliver the collective courseware of major teaching institutions across the globe, free of charge.

This valuable resource is already offered by a number of prestigious tertiary institutions including- the Massachusetts Institute of Technology, Yale, and Harvard. In addition, the Open Courseware Consortium now has 200 members, 4000 courses and 44 sources in 7 languages.

At the same time the development and provision of formal educational curricula will no longer be solely the preserve of schools and universities. Most larger business enterprises already provide in-house training and even accredited degree programs in specialised areas related to their own services, such as computer communications and hospitality. There is also a trend towards companies offering supplementary training on a modular basis, credited towards traditional graduate and post-graduate qualifications.

In addition, knowledge reference sites such as Wikipedia are providing free semi-structured online courses and books by aggregating existing reference material. This trend will continue, with independent courseware developers eventually dominating the growing market for expanding on-line content.

Developing countries such as India and China are also funding massive expansion programs in their schools and universities, for example graduating over 250,000 engineering and computer science students each year; rapidly catching up with the West in the quality and innovation of teaching methods.

To compete with the rise of the new breed of online training institutions, traditional institutions have linked with global partners, as well as fostering local community and business relationships. This trend reflects the shift in urban environments towards lifestyle autonomy, incorporating the full range of essential support services for local communities, including- education, health, leisure and knowledge access, primarily utilising the very high bandwidth of the Web. Such centres will typically combine online training with practical vocational and workplace experience.

Learning technology will be dominated largely by the new social media- social networks, augmented reality, virtual worlds, video gaming etc, delivered via smart mobile multi-purpose devices connected to the web.

By 2030 the full power of the web will be deployed towards this new learning paradigm, including powerful simulation training environments based on immersive virtual reality.

Augmented and virtual realities will allow procedures and knowledge to be absorbed within 3D virtual worlds and games, capable of simulating most services and applications; supporting the full range of training needs from trade apprenticeships to strategic management skills.

The benefits of teaching in such VR environments include the capacity to explore real life situations without risk, as currently practiced by pilots using flight simulators, and the more objective and automated monitoring and assessment of performance criteria.

The application of virtual worlds to education will become a standard function of school and university teaching and research in the near future. Users are already using such technology to socialise and connect through personalised avatars. These worlds can therefore be quickly adapted to provide learning support and feedback between students, with the added potential to create teaching avatar support. Gaming environments can also be adapted to offer student support for the solution of complex real world problems in areas such as conflict, climate change and economic analysis.

Traditional human teaching supervision will still be vital for young children, but games already play a large part in their development and this role will accelerate in the cyber age.

Social media sites are also creating environments where scientists can experiment with new research techniques, by applying intelligent agents to simulate interacting populations. These methods are increasingly being applied to studies of educational modalities between students and teachers. Worlds such as Second Life are already providing virtual campuses for some of the world's most prestigious universities such as Harvard and Stanford; offering a research environment in which to come to terms with the transition from the traditional to the new networked cyber campuses.

2030 will also see the acceptance of two additional trends- whole of life learning and personalised instruction. Both will be driven by the exponential rate of change in the social, work and cyber environments. Whole of life vocational learning will become both essential and natural as it is realised that secondary and tertiary learning is just the beginning of human cognitive challenges. This trend is already underway, facilitated by the web’s pervasive access to the world’s digitised knowledge storehouse.

Enterprise organisational boundaries and work practices will become increasingly fluid and porous, with individuals moving freely between projects, career paths and virtual organisations; adding value to each enterprise and in turn continuously allowing workers to acquire new skills, linked to ongoing advanced learning programs.

Work and education patterns will therefore gradually adapt to a cycle of seamless knowledge generation and acquisition which in turn will trigger the need for more personalized education. This will be facilitated by the Web’s pervasive social reach, providing flexibility of learning options- mixing and matching with an individual’s lifestyle and experience.
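One way this mixing and matching of personalised learning could work is sketched below: a learner's current skill gaps are matched against tagged courseware and the modules are ranked by how well each one covers what is missing. The course names, tags and Jaccard-style overlap score are hypothetical illustrations, not a description of any real platform.

```python
# Illustrative sketch (hypothetical data): rank courseware modules by how
# well their skill tags cover a learner's current skill gaps.

def rank_courses(skill_gaps: set[str],
                 catalogue: dict[str, set[str]]) -> list[str]:
    """Return course names ordered by Jaccard-style overlap between
    the learner's missing skills and each course's skill tags;
    courses covering none of the gaps are dropped entirely."""
    def score(tags: set[str]) -> float:
        return len(skill_gaps & tags) / len(skill_gaps | tags)
    return sorted((name for name, tags in catalogue.items()
                   if skill_gaps & tags),
                  key=lambda name: score(catalogue[name]),
                  reverse=True)

catalogue = {
    "intro-statistics":    {"probability", "inference"},
    "data-visualisation":  {"charts", "inference"},
    "welding-basics":      {"metalwork"},
}

# A learner missing probability and inference gets statistics first,
# visualisation second, and welding filtered out.
print(rank_courses({"probability", "inference"}, catalogue))
```

A production recommender would weight prior performance, prerequisites and career goals, but the core idea- scoring courseware against an individual's profile rather than a fixed syllabus- is the same.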

By 2040 it is anticipated there will be a scramble within public teaching institutions to fill emerging educational and training gaps in courseware at all levels- primary, high school, undergraduate and post graduate studies. But the knowledge explosion will now be too fast for traditional course development and management methods, demanding support for constantly emerging cross-discipline training, combined with an emphasis on general problem-solving and creative skills.

Most institutions will be under enormous pressure to keep up with the demands of the medical, legal, science and engineering professions, incorporating the latest technologies; with capacity seriously lagging the accelerating knowledge advances.

It is realised that catch-up can only be achieved by facilitating the development of Web-generated courseware, combined with continuing global open content accessibility; largely driven by automated methods and delivered on demand.

This period will mark the beginning of the end for the traditional one-track qualification pathway. Instead, performance assessment will be based on the quality, relevance and value of an individual’s skills and capabilities in relation to business, professional and social requirements; whether applied to medicine, astrophysics or cabinet-making.

Formal education will have reached a critical threshold as it is understood that ongoing personal learning is an essential driver for human survival and progress- and always has been. Education will be seen not as a sub-set of techniques applied only within a formal qualification factory, but as an essential component of each individual’s development regardless of age, position or ability within society.

Such an expanded view will also generate new modes of alternate education, nurturing creativity and encouraging learning for learning’s sake. For example the Free/Slow University of Warsaw (FSUW) offers an informal, non-profit centre for interdisciplinary studies; providing for student participation on the basis of the excitement of knowledge discovery and cultural experience for its own intrinsic sake, rather than for the pursuit of profit alone.

By 2050 the Intelligent Web 4.0 will be capable of autonomously generating knowledge in sync with new work patterns and individual requirements, delivered directly, eventually through neural as well as sensory interfaces.

The Web will generate, host and deliver best practice training material autonomously, as it is beginning to do for information resources in a service economy. Over time this will develop consistent quality outcomes competitive with most traditional options and encompass the complete spectrum of learning, distilling the best techniques from both human and artificial intelligence. This trend towards automatic algorithmic knowledge and pattern discovery is already evident in the sciences- biology, physics and astronomy.

By 2050 the abstraction of learning from its traditional infrastructure of classrooms and campuses will be complete in most advanced nations, with the major teaching institutions morphing into community support centres for research and services integration. The role of the Intelligent Web within the educational universe will have been transformed into that of a senior partner with humans, continuously forecasting new skill requirements and generating the required support programs; creating new courseware as a public resource, belonging to the global commons.

This will finally allow the developing world to achieve equal status with the developed world in terms of access to knowledge, training and the realisation of human potential.