Monday, November 25, 2013

The Future of Startups


The Director of the Future Planet Research Centre, David Hunter Tow, predicts that Startup culture will provide the key to a more productive, peaceful and sustainable planet, offering a new era of creative work and training with the potential to reduce poverty and conflict while focussing on solutions to the looming crisis of unstoppable climate change.

The Startup phenomenon can be likened to the first industrial revolution in the 18th and 19th centuries that radically transformed all aspects of world society by improving living standards and providing new ways to release the creative potential of new generations.

A whole new culture is emerging, based on supporting a new breed of Startup entrepreneur through business structures and processes that include incubators, accelerators, mentoring, training and equity partners. Startup enterprises now number in the tens of thousands and embrace virtually every significant social, business and industrial process. This nascent culture is rapidly evolving into a global force - gaining structure by coalescing around a number of Hubs, Networks, Ecosystems and Industry and Application Sectors.

Major Hubs are currently based in most of the world's larger cities including London, Berlin, Istanbul, Helsinki, Tel Aviv, Stockholm, Auckland, Singapore, Beijing, Bangalore, Sydney, Paris, Sao Paulo, Moscow, Reykjavik, Tallinn, Chicago, Manila, Milan, New York and Barcelona, as well as the iconic San Francisco/Silicon Valley nexus. At the same time a number of emergent hubs are gathering momentum in most urban regions of Africa, the Middle East, Asia and South and Central America.

No process or application, no matter how entrenched or fiercely guarded by its traditional custodians will be exempt from the impact of the Startup’s disruptive agenda, with every activity embedded in the operation of  modern civilisation likely to be transformed into a more streamlined and productive form- available primarily via inexpensive digital mobile platforms.

This re-engineering is occurring not just in the traditional online service sectors of retail, marketing, advertising, entertainment, travel and media, but increasingly in the professional service areas of knowledge management- education, healthcare, law, insurance, design and finance. Industrial sectors are also flexing up with smarter solutions supporting engineering, mining, manufacturing, agriculture, construction, energy, transport, distribution, supply and communications, in developing as well as developed countries.

But this shift to a smarter planet isn’t solely a big enterprise or big city initiative. In the near future every small town and regional community will also spawn its own Startup ecosystem. It will become a way of life offering a new form of work and play, with creativity the main currency.

It is gradually dawning on Government and the commercial and industrial establishment that this is going to be the future way to create a new generation of viable businesses and a new economy. In the process it will likely disrupt and displace the existing 20th century paradigm for building a civilisation, taking no prisoners. And if today's enterprises want to survive they will very quickly need to adapt to this new world order.

But for the larger enterprises in particular this will be an almost impossible task. Until now, social and technology commentators, big business and Governments have largely underestimated the significance of this revolution, seeing it as an add-on phenomenon, complementary but not essential to the functions of the traditional economy.

Big mistake.

Misreading the significance of past economic disruptions such as the explosion of small desktop computers and the Internet has led to the demise of many seemingly invulnerable organisations. Just ask IBM about its near-death experiences in these areas. The economy and our social fabric are undergoing the next wave in a series of rapid and radical global changes that will dwarf the original industrial and digital revolutions.

The Startup phenomenon is just the latest in a rolling wave of technology driven changes reshaping our relationship with the planet and triggering a whole new way of survival. And most significantly it is achieving this by releasing the full global potential of human creativity.

And the reason for this burgeoning hyper-growth cycle at the start of the 21st century is that it is meshing simultaneously with a number of other revolutions, including those of the sciences, arts, knowledge and artificial intelligence, education and work, as well as advanced digital technology and social development.

The primary mover and shaker- the heart and soul of all these revolutions- is the Internet/Web, with its payload of exponentially increasing information, now available to all via commoditised mobile portals. This represents the next phase in the democratisation of the world’s storehouse of precious knowledge, driven by the imperative to fulfil the potential of the vast under-educated populations of Africa, Asia and the Middle East that have previously missed out on our planet’s bounty. It creates a level data playing field by allowing any citizen with a mobile phone or tablet, regardless of location or income, to access a common knowledge universe.

Piggybacking on this pinnacle of human intellectual achievement is the education sector, which is now well on the way to delivering this vast treasure trove in easy-to-absorb bite-size chunks via virtually free MOOCs- Massive Open Online Courses- providing equal access to quality education and training at all levels across the planet.

And right on its heels, leveraging the benefits of this educational bounty is the revolution in work practice- now catalysed by the Startup industry.  

The nature of work is now undergoing a dramatic transformation, flexing up to allow the transfer of skills from cheaper as well as high-quality expatriate offshore sources of labour. But Startups have the potential to take this transfer to another level: to redress the global employment problem, eventually providing opportunities for skilled employment at the local community level.

A major Startup Hub- the Founder Institute, with chapters in 55 cities across 30 countries- has just declared that over 1000 companies with a total portfolio value of $5 billion have graduated from its program in the last four years. And the Institute and countless other incubators around the world are not just attracting the typical demographic of twenty to thirty year olds with computer science and software engineering backgrounds, but entrepreneurs of all age groups- middle aged executives, trade and factory workers and housewives- even retirees still chasing their lifetime dreams; all with the vision and wisdom of hindsight that only serious life experience can provide- ready to grasp the opportunities that a younger generation cannot yet conceive of.

Following a hobby or passion has always been an intrinsic part of human nature. It is no different in the digital age. The over fifties, sixties, seventies and even eighties now utilise the web as much as or more than the under thirties, and in a more active way than passively downloading music or videos. Surveys have shown they are also more astute at utilising social media for real benefit. The software skills required to transform that hobby or creative idea into a digital app are the simplest part of the equation- capable of being easily and inexpensively outsourced to an expert or automated app generator. After all, the technical skills required to design a blog or website used to be challenging for the average citizen. Not anymore. Now anyone can use a free template from Google and be up and running within ten minutes. The same is happening with app technology.

This is the new world where age is not a barrier but an advantage and where creative content and innovation are king.

The infrastructure required to support this new work/play revolution is also dirt cheap: an old warehouse with some discarded tables and chairs and cheap commodity smartphones and laptops or servers- sufficient even for graphics and game developers. For brainstorming or practical sessions with an engineering or financial expert with forty years' heavy-duty industrial experience, a comfortable coffee bar or a friend’s garage is sufficient.

Cities or precincts that were once derelict and dying, such as those in Detroit, Denver, East Berlin or devastated New Orleans, are finding a new lease of life through Startup communities; at the same time solving other endemic problems in society- unemployment and crime. Street kids, high school dropouts and jobless university graduates can be rapidly absorbed into this culture with some initial mentoring and training, offering creative opportunities and refuges no different from the arts and crafts sectors that have adopted similar supportive practices for decades. In fact there’s now a significant overlap and synergy between technology and arts communities, sharing creative spaces, ideas and marketing strategies.

No wonder established enterprises of all hues- from technology giants such as Google, Microsoft, Apple, Sony, Cisco, Verizon, Samsung, Yahoo, Amazon and IBM to Government agencies and big business in manufacturing, energy and banking- from NASA to Goldman Sachs, GE, Shell, Philips, Siemens, Panasonic, Ford and Toyota- are cashing in on this potential bonanza, supporting and mentoring Startup communities; not so much to make an immediate profit but to gain a footing in this ultra-competitive new survival game.

Most have either spun off their own internal Startup divisions, like IBM, or, like Google, are having a bet each way- aggressively offering to support other promising hubs such as the recently expanded Sydney incubator tapping into the network of Australian university students.

For those enterprises that don’t or can’t adapt to this new universe, the gig will be up- just as it was for the empires of ancient times- the Romans, Greeks, Persians and Chinese dynasties- and the later British, Portuguese, Dutch, French and Spanish colonisers. All thought they were masters of the universe with their new technologies of guns and ships, but eventually overreached and lost the plot, misreading the nationalist signals and new awareness of a changing world.

Now the new technologies keep exploding relentlessly, with the Cloud, mobile technology, virtual reality, the Internet of intelligent objects, big data, artificial intelligence, robotics,  massive bandwidth, software defined networks, more flexible database structures and open source software, setting the pace.

But just over the horizon lurks the next generation of technology: the intelligent Web with human-like intelligence, quantum computing and teleportation, direct thought transfer via sensory headbands, the Precog society where prediction is the norm, insect-sized drones and giant social observatories such as the original billion dollar EU FuturICT blueprint. Also emerging is the global human superorganism- the response to increasing globalisation in the face of intractable global problems requiring urgent solutions, such as climate change and conflict.

And each time the technology explodes it exposes more opportunities as well as existential risks to humanity. The current generation of dominant tech providers- Google, Microsoft, Apple, Amazon and Facebook- are already looking vulnerable: Google overreaching just like the ancient empires; Facebook, with its invasion of user privacy, likely to go the same way as Myspace; and Apple, past its innovative peak, likely to become another producer of commodity devices like Nokia. Even Microsoft is on the ropes, unable to make the paradigm shift needed to survive the new world order, with Bill Gates’ job as chairman on the line.

Big enterprises have a habit of believing their own rhetoric of infinite growth, with a delusional mantra of taking over the world in their market niche. Unfortunately they never studied physics and the limits of computation, information and energy, as the power of entropy inevitably dismantles their structures.

So the traditional notion of an individual's job and work-related role is already outdated. Work value in the future will be measured in terms of contributions to personal and organisational goals, together with social utility, whether for a two person startup or two thousand employee company.

By 2025 most tasks in heavy industry such as mining, construction, manufacturing and transport will be largely automated and robot-assisted. But such projects will also be increasingly managed and resourced on a real-time basis within the Web's global knowledge network- driven by innovative algorithms generated by next-gen apps.

By 2030 organisational boundaries and work practices will be fluid and porous, with individuals moving freely between projects, career paths and virtual organisational structures; adding value and in turn continuously acquiring new skills, linked to ongoing vocational programs.

And Startups will play the leading role in generating this new innovative world of work and play, as a hothouse for new ideas and skills. Opportunities for Startups will therefore abound. Why? Because every current major provider of products and services- big pharma, big banks, big media, big agriculture, big construction, big government or big cities- will be desperately in need of a makeover, with their clunky and inefficient 20th century legacy systems not cutting it in the 21st century's roller coaster, super-competitive world.

Likewise professional services in marketing, healthcare, travel, law, media and finance will be dominated by apps and algorithms generated by small agile second and third generation Startup companies.

A revolution in social development is also changing the way societies are coping with massively expanding populations and dwindling resource options- by returning to smaller self-sufficient and cooperative urban communities linked by high bandwidth communication and transport networks, which will facilitate work, food and water security and learning opportunities in a Startup age.

Although big factories using automated robotic processes for producing industrial components- steel, concrete, glass, cars, turbines, trains and solar panels- will still be essential, using a mix of advanced technologies such as 3D printing, the streamlined and flexible information services needed to manage, market and optimise such products are more likely to be created by the host of future creative Startups- not the few software goliaths still lingering from the 20th century.

This future downsizing of the enterprise, aligned with local community structures, augurs well for the nascent Startup industry with its naturally flatter, decentralised architecture, allowing a more flexible capacity to adapt to market signals rather than relying on rigid centralised control. Startups also have the capacity to scale up more flexibly, using cloud-based frameworks and forming cooperative networks rather than expanding centralised silos.

And Startups are not only leveraging new information technologies but also the new sciences of materials, biology, chemistry, physics and energy, including graphene- a possible replacement for silicon in electronics; artificial photosynthesis- the future hope for solar energy; optical physics- for invisibility cloaking and super lasers, quantum computing and information teleportation; and synthetic biology- for growing organs and creating organisms to clean up pollution. Even gene sequencing machines, atomic microscopes and analytic laboratory processes are being downsized to desktop level, closing the comparative cost differential between rich and poor countries and large and small enterprises.

And governments are loving it- because Startups are offering a silver bullet to generate prosperity- a low cost simple way to foster new industries and jobs without the burden of expensive infrastructure, offering the next generation entry to a better life.

The fight against corruption, bribery, price gouging and market cartels by big enterprise also benefits from a downsized, decentralised app society. There have been numerous recent exposures of the underlying level of corruption, bribery, conflict of interest and contempt for customers within the finance and banking industries, as well as major sectors of the mining and construction industries. But if government regulators have failed to prevent the misuse of shareholder and public funds, then agile Startup competitors offering cheaper, safer and more convenient services may do the job for them.

An example is the payments sector. Many smaller agile groups from technology- and infrastructure-poor African countries such as Kenya have taken the lead in these services of convenience and already provide perfectly viable mobile phone money transfer and business transaction services via text message and a PIN, bypassing expensive western banking services.
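
The mechanics of such a service are simple enough to sketch. Below is a minimal, purely illustrative Python sketch of a PIN-verified, text-style transfer flow; the account data and command format are hypothetical assumptions, and real services of this kind add agent networks, fraud detection, settlement and regulatory layers on top.

```python
# Minimal sketch of a PIN-verified, text-style money transfer flow.
# The in-memory ledger and message format are hypothetical and for
# illustration only.
import hashlib

accounts = {
    "+254700000001": {"pin_hash": hashlib.sha256(b"1234").hexdigest(), "balance": 5000},
    "+254700000002": {"pin_hash": hashlib.sha256(b"9876").hexdigest(), "balance": 1200},
}

def transfer(sender: str, pin: str, recipient: str, amount: int) -> str:
    """Validate the sender's PIN and move funds between phone-number accounts."""
    acct = accounts.get(sender)
    if acct is None or hashlib.sha256(pin.encode()).hexdigest() != acct["pin_hash"]:
        return "DECLINED: authentication failed"
    if amount <= 0 or amount > acct["balance"]:
        return "DECLINED: insufficient funds"
    if recipient not in accounts:
        return "DECLINED: unknown recipient"
    acct["balance"] -= amount
    accounts[recipient]["balance"] += amount
    return f"SENT {amount} to {recipient}. New balance: {acct['balance']}"

# A text command like "SEND 250 TO +254700000002 PIN 1234" would map to:
print(transfer("+254700000001", "1234", "+254700000002", 250))
```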

Both banks and private equity funds are now scrambling to join the Startup race. But the banks are slow to shed their conservative, risk-averse attitude to lending, and the large venture capital funds are hampered by their elitist attitude, refusing to get involved until they are sure the Startup is well on its way to stardom. In a future high-risk roller coaster world there is no such thing as certainty, and the professional funds are now at risk of being outflanked by the more nimble networks of crowdfunders and syndicates of wealthy Angel investors, happy to take a gamble, offering both seed capital for visionary ideas and serious follow-up investment for likely winners- gaining the advantage of an inside rails run to grab the major payoff prize.

But the Startup has a much more important role to play in today’s world.

The latest climate report predicts our climate will be irrevocably changed within thirty years if we don’t change direction – despite all the current advances in renewable energy technology and efficiency savings.

By focussing on innovations in sustainable energy and poverty reduction- rather than trying to emulate another superficial social media or marketing billionaire, today’s Startups can play an essential role in saving the planet and its human cargo, including themselves.

This is an indicator of the potential power of the maturing Startup industry, as a global phenomenon which also might just save the planet through the unleashing of an explosion of  innovation and idealism; designing more resilient and sustainable systems, reducing the pressure on the planet’s ecosystems and supporting more cohesive communities; at the same time generating new pathways to peace through cooperative globalisation- offering hope for future generations in a time of existential crisis.

Today's Startup is therefore not only a powerful force for change but also for survival. Startups are also beginning to gain the upper hand in the marketplace of ideas. A tipping point is already emerging: there is now more investment capital available than viable projects. No more demeaning cap-in-hand pleas for funding by desperate entrepreneurs, prostrating themselves in ridiculous speed-pitching marathons and often losing control of their IP in the race for assistance.

Now there are many more alternative funding options to tap, such as crowdfunding and Angel syndicates- more financial supply than startup demand. Universities such as Stanford, MIT and Sydney, as well as tech companies and government agencies, are also competing with established VC firms, with many lower-tier VC firms caught in the squeeze, at risk of going to the wall.

So now it’s the VC firms' turn to do the pitching and make concessions for a limited supply of viable Startups. How things change.

For the entrepreneurs and founders it means more control, more funding choices, and shorter lead times.

The centre of gravity of talented app developers and entrepreneurs is also shifting away from the US back to their countries of origin. Until recently, at all levels of science and technology, the US has been living on borrowed overseas intellectual capacity. For the last fifty years it succeeded beyond its wildest expectations in seducing the most talented of the world's minds to help achieve its scientific and technological dominance, with offers of scholarships, state of the art research facilities, career paths, permanent residency and financial packages an order of magnitude better than their own countries could offer. During that period hardly a research paper of any significance was published without input from a researcher of European or Asian origin, and the American economy prospered beyond all expectations.

But now the game is over, with governments across the world able to offer their talented graduates and entrepreneurs the necessary home grown incentives and facilities to pursue their careers in their own countries; at the same time contributing to their own national development.

So the Startups of tomorrow will be much more evenly distributed with a more level playing field and the world can look forward to an explosion in creative and innovative potential across all nation states. In tomorrow’s world there will be no alpha nation. Each Startup ecosystem will develop its own expertise in its own way, which it will then share with the world.

By the mid-forties the earth’s climate will have irredeemably changed to something much more violent and unpredictable if we stay on our current trajectory, even accounting for the growing use of renewable energy sources and greater efficiencies. The best we can now hope for is to slow Armageddon down, but we may not be able to reverse it.

Climate change triggered by global warming will dominate every business and social decision within the next decade. Every country, community and company will have to make it front and centre in their planning processes- what to produce, how to produce it and where to produce it- in order to minimise energy consumption and slow the release of carbon.

The Startup culture will play a pivotal role in this process- the key to the planet’s redemption. But only if its focus shifts to developing sustainable processes and products rather than infantile notions of  becoming the next billion dollar enterprise.

Let’s hope that the current and future generation of  founders don’t lose sight of the real priority facing planet Earth and have the wisdom to avoid being dazzled by ephemeral dollar signs.

Otherwise they too will be swept away by climate change's inevitable apocalyptic endgame.


Tuesday, April 9, 2013

The Future of Surveillance


Future Society – Future of Surveillance

The author, David Hunter Tow, predicts that by 2030 the equivalent of a global PreCognition machine will be in operation, with everyone a Person of Interest, as portrayed in the film Minority Report.

The state of surveillance and reconnaissance technology and its multiple applications is now evolving at warp speed, creating unprecedented Future Shock to civilisation’s social fabric.
Surveillance is already big business- very big business and is likely to continue to expand exponentially into the foreseeable future, attracting the good, the bad and the ugliest elements of society.

The problem is that without careful controls, the runaway consequences of such a pervasive and intrusive phenomenon are likely to be catastrophic for humanity.

The main technological and social components of the  global surveillance trendline are already emerging; woven together into a dense matrix from which there will be no easy escape.

They include-

 The Knowledge Web
The most important component is the Web/Internet itself- the core asset and artefact of our civilisation, leveraging the knowledge of our society.  

This massive information network is already evolving into something beyond society’s capacity to control- the means of generating and accessing all civilisation’s knowledge content and application. It now connects over 3 billion humans and in the near future trillions of computing devices, machines and sensors. It already allows  a dense interchange of information, expertise and ideas relating to the sciences, arts and social experience that support all aspects of human existence on planet earth.

All knowledge advances, including not just basic data but the algorithms, processes and techniques used to process information, are being funnelled at hyperspeed into its heart, like a giant black hole swallowing the energy of billions of suns.

And emerging from the other side just like a white hole is a whole new universe- the promise of a cornucopia of untold intellectual riches and wisdom. Giant science and social observatories are now being constructed- models containing trillions of variables to assist in forecasting the future; reducing the risks that could wipe out our world in the blink of an eye- catastrophic economic, environmental or existential collapse.
The Web itself is rapidly moving to the next level- becoming more intelligent and self-determining; adapting and learning with the computational intelligence of billions of human and cyberagent minds; rapidly taking on the characteristics of a living superorganism.

Once encapsulated, content can be mixed and matched, processed and recycled ad infinitum, just like matter, until it finally emerges in a form that in the best scenario will benefit humanity and allow it to survive and achieve its potential in the future.
But there is an alter ego- a dark side to the Internet/Web. In order to achieve this magical transformation, this perpetual knowledge generator at the heart and soul of our civilisation must also become a superb surveillance machine, with intelligent sensors acting as its eyes and ears- everywhere.

The following categories of sensors are now commonly used to support the Internet/Web:
Embedded Sensors-

Sensors are incredibly important because, without them to monitor the processes and systems of our planet, including our own bodies, our wonderful chocolate factory would quickly die. It can only operate as a supersystem if it is fed a continuous diet of up-to-date, relevant and reliable information.

By linking to a variety of intelligent sensors, some incorporating the distributed  ability to process signals using artificial intelligence, the Web can capture the raw material it requires to weave our social matrix and is already doing so in increasing volumes, as its appetite for problem solving expands.
Sensors must therefore also evolve to become smarter- more like multi-component systems, which can now be constructed in a vast variety of forms. For example: as force and field detectors embedded in the limbs of autonomous robots, capable of working on complex tasks with humans; as clouds of tiny artificial insects or smart dust that can automatically cooperate to monitor deadly environments without risking human lives; as nano-biosensors small enough to enter and navigate human cells to keep us alive; as the instrumentation of unmanned drones capable of locking on to a target and activating a kill switch against human beings; and as road location catseyes, continuously communicating with driverless cars to avoid accidents and gridlock.

But rapid climate and social change triggered by global warming will be the main driver for this technology in the future, requiring intelligent sensors embedded in every form of natural and man-made ecosystem; allowing constant adaptation and maintenance, utilising closed feedback loops linked to the Intelligent Web for solutions.
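
As a concrete illustration, the Python sketch below shows the shape of such a closed feedback loop: a sensor node samples its environment, applies simple local logic and drives an actuator. The soil-moisture scenario, thresholds and function names are illustrative assumptions, not a description of any particular deployed system.

```python
# Minimal sketch of a closed sensor-actuator feedback loop.
# The sensor, actuator and thresholds are hypothetical stand-ins.
import random
import time

TARGET_SOIL_MOISTURE = 0.35   # desired fraction of saturation (illustrative)
TOLERANCE = 0.05

def read_soil_moisture() -> float:
    """Stand-in for a real embedded sensor driver."""
    return random.uniform(0.1, 0.6)

def set_irrigation_valve(open_valve: bool) -> None:
    """Stand-in for a real actuator interface."""
    print("valve", "OPEN" if open_valve else "CLOSED")

def control_cycle(samples: int = 5) -> None:
    """Sample, decide locally, act; a networked node would also publish upstream."""
    for _ in range(samples):
        reading = read_soil_moisture()
        set_irrigation_valve(reading < TARGET_SOIL_MOISTURE - TOLERANCE)
        time.sleep(0.1)

if __name__ == "__main__":
    control_cycle()
```

A networked version of the same loop would publish each reading to a central model and accept revised setpoints in return- the essence of the closed feedback loops described above.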

Such smart sensor networks are already operating in every sphere of work and social activity including-
Maintaining engineered Infrastructure- embedded in roads, bridges, dams, pipelines, grids and power stations.

Monitoring ecosystems- natural systems such as forests, rivers, water, soil, air and energy resources, providing feedback to regulatory authorities to protect their integrity and survival.
Coordinating manufacturing and logistical facilities- factories, plants, container centres, warehouses, ports, airports, railways, traffic systems etc to efficiently manage the manufacture and delivery of products and services.

Personalising Health- advances in smartphones and mobile technology equipped with biosensors have opened up unlimited opportunities to monitor and support an individual’s health needs on an unprecedented personal basis, delivering just-in-time interventions linked to the latest diagnostic and treatment algorithms on the Web; also using nanosensors to track disease pathways at the cellular and molecular level.
Managing Disasters and Conflict- protecting those living in war and conflict zones, including law enforcement precincts in cities and urban areas, using a range of sensors to monitor the security of communities and public assets. These are increasingly delivered by smartphones as well as pervasive CCTV cameras, mobile robots and, in the future, small agile drones.

Satellites / Probes – Eyes in the Sky-
Sensor systems involving high resolution cameras and global positioning devices attached to space based telescopes, aircraft, balloons, unmanned drones, explorers and probes of all types are now widely used to detect the electromagnetic spectrum of the planet’s resources in most wavelengths- optical, infrared, ultraviolet, radio etc. The results are used to feed data to web based or smartphone apps for analysis covering weather forecasts, disaster interventions, animal distribution, ecosystem health, 24 hour communications and video news footage.

Military / Spy networks- satellites track the world’s most secret military and government installations and test sites using software that enables surveillance of the remotest areas on the planet. This information is also used for research, using images from Google Earth satellite maps to replace traditional archaeological methods; by Governments to monitor border integrity and by NGOs to safeguard wildlife against poaching in protected areas. Powerful probes and remote autonomous landers are increasingly used in space exploration to obtain fly-by views of planets, moons and asteroids and, in the future, mining options.
Drones / UAVs- these are likely to become common in the future, sharing airspace with piloted aircraft. They are currently used for surveillance, spying and kill missions, but in the future will be used for reconnaissance by most governments, NGOs and private corporations.

They can monitor a range of information sources, vastly reducing the operational risk in conflict areas; allowing surveillance by sensors that can record full motion video, infrared patterns, radio and mobile phone signals. They can also refuel on remote short airstrips, extending effective air range by thousands of kilometres.
Next-generation drones will be autonomous and smaller, able to navigate and eventually make target decisions, controlled by complex algorithms and Web feeds; eliminating human operators from the decision loop entirely. They will be used by every type of organisation- criminal networks, private security businesses, NGOs and social activist groups- providing a variety of logistical, security, news gathering and research services.

But many legal, ethical and regulatory issues remain to be resolved before UAVs will be able to operate in lockstep with human controlled vehicles. There is now fierce pushback by the community against another method of individual privacy invasion.

Intelligent Devices
With the imminent arrival of the Internet of Everything, the surveillance focus will extend to every object: machines, electronic devices and systems that can communicate with other machines as well as human users will be the first objects of interest to be caught in the net. These will include complex systems such as supersmart phones and robots as well as everyday home and office devices such as cameras, TVs, printers, video recorders, toys, game consoles, microwave ovens, toasters, fridges etc, all equipped with forms of embedded sensors and actuators including chipped product and ID codes. Eventually trillions of such active objects, including life forms- plants, animals and humans- will be linked to the Internet through a variety of communication protocols, including DNA sequences and brain interfaces.

Robots of all types will be pervasive in the home, workplace and industrial areas, including humanoids capable of interacting and cooperating with humans in work areas such as retail stores and factories, or performing home support services- initially cleaning, food delivery, health and companion support. They will eventually be capable of more sophisticated decision-making and autonomous operation equal to humans in every activity, finally acting in a surveillance / supervisory mode.

Social Networks / Media

Humans are also expanding their remit in the surveillance game in the form of citizen reporters, scientists and observers, using smartphones to gather information from their local environment and feeding it through social network media. Social networks such as Facebook and Twitter already provide feedback on the latest breaking news across the globe, particularly in entertainment, crime and disaster areas, often creating ad hoc networks to provide alternative coverage when standard communication fails, as in Haiti- offering critical on-the-ground support and impact assessment as first responders.

Phone cameras have already proved to be the single most important surveillance tool available to communities in times of crisis, and a tool for democracy that proved crucial in capturing proof of abuse during the Arab Spring. Citizen reporters and community activists equipped with such devices constantly feed the Web with realtime events, capturing evidence of illegal activities and promoting events of public interest through crowdsourcing. Social media therefore provide a significant back channel for disseminating realtime information around the globe like a Mexican wave, as well as signalling emerging trends such as disease epidemics and political developments.

In addition, activist NGOs, whistleblowers and mass movements- Greenpeace, Wikileaks and Occupy all contribute to this channel, providing background monitoring and surveillance of big business and  Government corruption; a form of ethical surveillance crucial to a democracy.

Cyber Espionage

Cyber espionage is now rife around the world. Serious cyber attacks are a daily occurrence, particularly between nations such as China, the US, Russia, Britain, Iran and Israel, with the intent of covert acquisition of national secrets, intellectual property, financial assets and personal information.
But cyber espionage is also a form of intrusive surveillance.

Current cyber malware such as Stuxnet, Flame, Duqu and Miniduke are all primarily surveillance and reconnaissance weapons capable of performing spy missions as well as crippling vital target infrastructure. This routinely involves copying critical screen images, websites, emails, documentation and network traffic in general- performing extensive data mining, copying, transmitting and deleting files for espionage purposes.
The Pentagon’s Plan X is a good example of the exploding surveillance syndrome now overtaking society. It aims to create a new surveillance and operations system to map the digital battlefield of cyberspace and define a playbook for deploying cyberweapons. It will provide a realtime graphical rendering of this cyberworld, showing ongoing operations and realtime flows of networked data around the world like a large scale computer game. This visualisation or surveillance model of cyberspace requires intensive reconnaissance of both friend and foe. But it is already out of date- a model more appropriate for the sci-fi films of the nineties. It will soon be superseded by a much bigger presence- a multi-dimensional cognitive model in which players are linked directly to the Intelligent Web.

The US is also assembling a vast intelligence surveillance apparatus to collect information about its own citizens as well as those overseas actors perceived as terrorist risks, integrating the resources of the Department of Homeland Security, the military, local police departments and the FBI. In the near future this will be expanded to encompass the whole range of US and overseas allied security agencies. This machine will collate information about thousands of US citizens and residents, many of whom have not been accused of any wrongdoing, to assist the FBI initially in its ongoing, eternal and surreal war against home-grown terrorism.
According to news reports there are now almost 4000 federal, state and local organisations working on domestic counterterrorism projects following the 2001 attacks. Obviously this is getting out of hand, making it virtually impossible to achieve a coordinated system.

There are also a number of legislative bills relating to Internet surveillance awaiting ratification, including SOPA, PIPA and CISPA. The first two speak to copyright protection of content on the web, threatening to close down any remotely implicated site- which opponents say infringes on the right to privacy and freedom of access to the Web; while the third relates to the monitoring of private citizen information, or spying on the general public, in the name of investigating hypothetical cyber threats and ensuring the security of networks against cyber attack.

All three have met with fierce opposition from advocacy groups such as the American Civil Liberties Union and the Electronic Frontier Foundation, for ignoring the legal rights of alleged infringers and for being excessively intrusive and draconian.

 Future Shock

While the benefits of the future Internet/Web are enormous in terms of greater knowledge leading to a higher quality of life and a safer existence for humanity, there are concurrent significant downsides which will quickly escalate, potentially leading to humanity losing control over its destiny.
The existential risk is that the transition to such an always-on and pervasive entity- a global surveillance machine monitoring a large proportion of the planet’s natural, engineered and cultural environment- could lead to a big brother society in which everyone is a person of interest.

The major disruptions already emerging relate to the inevitable erosion of citizen privacy and equitable access to the Internet in the name of security, with new US laws such as SOPA and CISPA due to be enacted. These purportedly aim to provide greater protection for intellectual property and personal rights but at the same time have the potential to erode democratic rights.
In other words the beneficial potential of the Internet/Web is at risk of being subverted, emerging instead as a vast spying or surveillance machine.

But this is just the beginning of a slippery slope in human rights attrition.
The surveillance mechanisms outlined will inevitably lead to much greater restriction of personal freedoms, which in turn will increase pressure for some form of predictive capacity to choke off dissent. This is likely to escalate no matter what legal safeguards are adopted.

In the paranoid world of the spy/surveillance agencies, networks will become impossibly entangled – much more so than in the current geopolitical/security maze. If there are 4000 domestic agencies in the US currently involved in covert surveillance, how many more are there internationally and how many will there be involved in the surveillance game when the cyberespionage paranoia really explodes?
Who is friend or foe when every nation and major organisation is spying on every other?

As mentioned, prediction/forecasting models are already in widespread use- and so they should be in a world threatened by global warming and economic collapse. Projects such as the FuturICT Social Observatory, although not gaining EU funding in the immediate future, will continue- monitoring vast amounts of information, searching for trends and elusive signals to save the planet.
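
At its simplest, the trend-hunting such observatories perform amounts to flagging data points that depart sharply from their recent baseline. The Python sketch below is a minimal illustration of that idea; the rolling window, threshold and synthetic data are assumptions for demonstration, not part of any FuturICT design.

```python
# Minimal sketch of weak-signal detection: flag points in a data stream that
# deviate sharply from a rolling baseline. Window and threshold are illustrative.
from statistics import mean, stdev

def flag_anomalies(series, window=30, z_threshold=3.0):
    """Return indices where a value departs strongly from its recent history."""
    flagged = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(series[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Example: a gently varying signal with one abrupt jump at index 60.
base = [10.0 + 0.1 * (i % 5) for i in range(60)]
stream = base + [25.0] + [10.0 + 0.1 * (i % 5) for i in range(20)]
print(flag_anomalies(stream))   # flags the spike at index 60
```

Real observatories obviously work with far richer models, but the principle of separating an elusive signal from a noisy baseline is the same.
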
It is good science when forecasting is applied to reduce risks to our civilisation. But when such mechanisms abuse power by tightening control over populations it is the beginning of an unravelling of democratic standards.

Autocratic and fascist states throughout history have applied such techniques to their people, punishing political enemies and dissidents in the process. The current surveillance technologies amplify this potential for misuse a thousandfold, exploiting the Web as civilisation’s greatest asset for potential benefit, turning it instead into a quasi Surveillance/Precog machine with the capacity to predict an individual’s movements and actions.
Governments have lost the ability to solve this problem.

Even if there is the will it has become too complex.

The Future is at a tipping point- and the outcome does not look promising.

Wednesday, February 29, 2012

Future of FuturICT

David Hunter Tow, Director of The Future Planet Research Centre, proposes that the major value of the European flagship FuturICT project lies not just in its capacity as a global Simulator and Forecasting tool, but as a stepping stone to a new class of prediction models capable of linking with the future Intelligent Web.
FuturICT is the project the world had to have and its timing is impeccable.
The planet and its population are in dire straits, with the latest IPCC draft report predicting ongoing temperature rises, triggering an increasing frequency of more extreme climate events- droughts, floods, hurricanes etc- into the foreseeable future.
Such physical events will have numerous disruptive impacts on all ecosystems and societies and in order to survive we will have to get much smarter at forecasting the effects of such outcomes and implementing adaptive survival strategies.
Planning for our future has always been an intrinsic part of human nature at the personal, community, state and more recently global level. And we’ve had some outstanding successes, such as forecasting the weather, technological progress, energy futures and even sometimes stock markets- although these have tended to be more problematic.
But despite a range of mathematical improvements in our foresight and modelling methods developed in tandem with a broader understanding of scientific and social principles, our capacity to forecast has been sadly lacking when outcomes don’t follow obvious trends and predictable scenarios or when the signals of emerging change are faint.
These outcomes are usually called Black Swan events- ones that seem to come out of nowhere. But they can also be immensely disruptive and we need to get a much better fix on them to ensure our survival. In recent times these elusive events have included the GFC, the Occupy Wall Street movement, the Arab Spring, the Fukushima meltdown and the Eurozone crisis- all predictable in hindsight but not so easy with foresight.
Now with the survival of our society on a knife edge, civilisation needs tools that are a lot more robust and models that can deliver much more dependable outcomes.
Enter FuturICT.
It represents the next phase in the evolution of models powerful enough to not just deliver probable predictions but accurately prioritise the resources needed to help us survive the onslaught of massive change.
FuturICT is a prototype of the next generation forecasting tool. It is massive – on a different scale to previous models and this is part of the problem.
But the main difference is that it is a project that has to deliver. No more luxury of extended research timelines; no more egos on parade at glamorous six star conference locations; no more addressing only a narrow elite of academic peers, while throwing a few crumbs to the social media. This is crunch time for FuturICT and for society.
Whatever the world’s current knowledge base – and it’s massive, covering advanced mathematical and AI techniques; new societal, physics, materials, engineering and computational sciences; a better understanding of emerging and emergent sciences such as network, complexity, evolutionary and social theory - all based on exploding data sets and increasingly complex algorithms- it now has to be marshalled, corralled and delivered, based on its initial promise.
And FuturICT is the vehicle chosen to do it- a major advance in the science of modelling the future at a cost of a billion euros over ten years; harnessing the best scientific minds from hundreds of research institutions across Europe; and that’s just the beginning.
Society is now asking for the payoff from its massive investment- billions and even trillions of euros - not just in the FuturICT project, but from research funding of multiple arcane disciplines over the last twenty years.
And make no mistake- it’s not the universities and research councils that have provided this largesse- it’s the humble taxpayer. And now the average citizen wants to know how their money can be applied to save their children’s future. There are no ifs and buts here- they expect a pay-off now, in their time of desperate need.
Numerous PR releases about this futuristic endeavour have already flooded both the old and new media. In essence it will involve developing the FuturICT Knowledge Accelerator and Crisis-Relief System including-
Crisis Observatories- scanning for emerging problems;
Planetary Nervous System- aggregating data streams from sensor systems- monitoring the pulse of civilisation around the globe;
The Living Earth Simulator- the heart and soul of the system; modelling the planet’s social, physical, biological and environmental phenomena- searching for insights into its future.
These components will apply state of the art techniques to mine Big Data using numerous mathematical, statistical, AI and logical inference techniques to discover patterns of significant interest.
But a lot of this is uncharted territory. This is a giant leap from models aimed at solving specific problems with hundreds or thousands of variables, such as next week’s weather, transport flows or even complex ecosystem interactions, to one that can be applied to a huge range of environmental and societal problems, encompassing tens of thousands if not millions of interweaving information channels, parameters and variables.
But modelling at this unprecedented level of real world complexity is just the beginning. Managing the thousands of data streams and the research outputs from hundreds of institutions is the really difficult part. It won’t be a neat jigsaw - but a constantly dynamic multi-dimensional network of knowledge links, feedback loops, algorithmic contortions and exponentially exploding potentials.
For a start, the constraints on the model’s variables will have to be severe if it is to be managed at all. At least 95% will have to be pruned or drastically culled and the techniques to do this at the correct prioritisation levels have barely begun to be explored let alone formalised for a model of this scale.
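One crude but concrete way to prioritise such pruning is to score every candidate input against the quantity being forecast and keep only the strongest few percent. The Python sketch below illustrates the idea with a simple correlation score; the function names and the cut-off are assumptions for illustration, and a real mega-model would rely on far richer relevance and causality measures.

```python
# Minimal sketch of variable pruning by relevance score.
# Uses statistics.correlation (Python 3.10+); thresholds are illustrative.
from statistics import correlation

def prune_variables(candidates: dict, target: list, keep_fraction: float = 0.05):
    """candidates maps variable name -> observation series aligned with target."""
    scores = {
        name: abs(correlation(series, target))
        for name, series in candidates.items()
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:keep]

# Tiny example with three synthetic candidate series.
target = [1.0, 2.0, 3.0, 4.0, 5.0]
candidates = {
    "temp":  [1.1, 2.0, 2.9, 4.2, 5.1],   # strongly related
    "noise": [3.0, 1.0, 4.0, 1.0, 5.0],   # weakly related
    "trend": [0.5, 1.0, 1.5, 2.0, 2.5],   # strongly related
}
print(prune_variables(candidates, target, keep_fraction=0.67))
# keeps the two strongly related series and drops "noise"
```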
The Centre’s own research suggests that evolutionary, network, complexity and decision theory will play key roles, but the challenges are myriad including-
The need to refine and integrate into the model a rigorous theory of social psychology. This vital field is still in its infancy.
The challenge of developing a radical new economic model after the disaster of the GFC. This appears to have been largely swept under the carpet by most economists as too hard, substituting another set of regulatory controls instead.
The challenge of smoothly combining the multiple disparate models of the system; managing the interdependent interfaces of feedback loops and input-outputs of an extremely complex and non-deterministic nature.
The problem of updating data inputs and algorithms in real or future time- extracting and extrapolating good models from past data alone is not enough and in fact could be disastrously misleading.
The core problem of using old mathematics in today’s 21st century models. Entirely new approaches may be required, such as replacing partial differential equations with a cellular automata approach, as Stephen Wolfram has argued (see the sketch below); also using coarse-grain rather than fine-grain forecasting to escape the problem of infinite regress.
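To make the cellular-automata alternative concrete, the short Python sketch below runs one of Wolfram's elementary automata (rule 30): complex, hard-to-predict global behaviour emerges from a trivially simple local update rule rather than from differential equations. The grid size and number of generations are arbitrary choices for display.

```python
# Minimal elementary cellular automaton (Wolfram rule 30) on a circular grid.
def step(cells, rule=30):
    """Apply one synchronous update of an elementary cellular automaton."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (centre << 1) | right   # neighbourhood as a 3-bit number
        nxt.append((rule >> index) & 1)               # look up the rule's output bit
    return nxt

# Evolve a single seeded cell and print a few generations.
cells = [0] * 31
cells[15] = 1
for _ in range(8):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```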
And then there’s the human element. Each group of researchers will be lobbying to maximise the application of their special insights and expertise to gain maximum kudos for themselves and their institutions. This is human nature, but if allowed to proliferate outside a disciplined framework could rapidly spiral out of control.
The data management and reporting system will also have to use a variety of standard tools to rigorously link results from multiple sources, particularly as funding will rely on the quality and transparency of the overall program- not individual or institutional progress.
Then there’s the next phase- providing advice to policy makers based on the project’s outcomes. As a critical EU Flagship project there will be a lot riding on the verification of results in the public domain- particularly in the Eurozone’s cash strapped times.
At the same time FuturICT will need to be integrated with the myriad other models which will be working full throttle over the next ten years from competing and cooperating projects, particularly in the US and Asia. No-one will expect FuturICT to operate as an island isolated in a sea of scientific progress.
And finally, integrating it with the full power of the Intelligent Web-Mark 4.0- because this will be the inevitable outcome.
Evolution towards a full Web-Human partnership will be the major paradigm shift of the 21st century, as policy makers and scientists alike find the causal relationships far too complex to comprehend.
The process of scientific research is expected to change more fundamentally over the next thirty years than in the previous three hundred years, towards an alternate global commons approach- a decentralised open marketplace of ideas, driven by a combination of the Web’s and human computational intelligence.
The epoch during which individual humans are able to conceptualise or understand increasingly complex phenomena is coming to an end. There will be just too many interacting variables for the human brain to get an intuitive understanding of the possible permutations and interactions.
With Big Data now a fact of life in all disciplines, combined with evolutionary discovery programs such as Eureqa, 95% of the traditional science will be handled autonomously by the Web.
Eureqa is already being applied as an ideal tool for disentangling and optimising systems that are too complicated or time consuming to analyse by traditional methods; for example aircraft wing design, network topology, financial forecasting and particle physics simulations.
But its significance goes well beyond this. It is being applied to discover new theorems increasingly beyond the cognitive capability of its human counterparts- beyond the limits of human knowledge.
Eureqa type software therefore could and will be applied in the future within all complex scientific disciplines- economics, biology, social sciences and climate science and even perhaps to solving the universal Theory of Everything. The combination of descendants of the Web and Eureqa could perhaps achieve this within the next several decades.
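The core idea behind such tools can be illustrated with a deliberately tiny sketch: propose candidate formulas, score them against the data, and keep the best. The Python below uses plain random search over quadratic coefficients purely for illustration; Eureqa itself evolves far richer symbolic expression trees, and nothing here reflects its actual implementation or API.

```python
# Toy illustration of data-driven formula discovery: search candidate models
# and keep the one that best reproduces the observations. Purely illustrative;
# real symbolic regression evolves whole expression trees.
import random

def candidate():
    """Random hypothesis y = a*x**2 + b*x + c with coefficients in [-3, 3]."""
    return [random.uniform(-3, 3) for _ in range(3)]

def error(coeffs, xs, ys):
    a, b, c = coeffs
    return sum((a * x * x + b * x + c - y) ** 2 for x, y in zip(xs, ys))

def fit(xs, ys, trials=20000):
    """Keep the lowest-error candidate found by blind random search."""
    return min((candidate() for _ in range(trials)), key=lambda m: error(m, xs, ys))

# Hidden law the search must rediscover from data alone: y = 2x^2 - 3x + 1.
xs = [x / 10 for x in range(-20, 21)]
ys = [2 * x * x - 3 * x + 1 for x in xs]
print(fit(xs, ys))   # roughly recovers the coefficients 2, -3, 1
```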
So if the challenges as defined may be almost impossible to overcome, why is FuturICT so vital?
Because it’s the next step in an essential learning curve that humanity has to experience in order to improve and refine its predictive capability. Such a process will be essential if we are to survive the approaching Armageddon of climate change and many other critical challenges.
This is the next major step – a proof of concept of a new era of mega-modelling. That is why it is so important – not whether it can produce perfect results, but whether we can learn enough to continue to make progress in this vital area for our future wellbeing.
There are no shortcuts in this endeavour – it will be and always has been a step by step evolutionary journey.
FuturICT will play a key role in eventually linking with the full computational intelligence of the Web to create a new societal decision framework never before contemplated by human society.
Accepting the decision capability of the Web as an equal and, in the future, senior decision partner, integrating up to 10 billion human minds, will be one of the defining paradigm shifts of our times. It will involve a very radical mind-shift. Large cooperative projects such as FuturICT are essential stepping stones towards this goal.
This is what FuturICT has to teach us.

Friday, December 23, 2011

The Future of Science

The author argues that a new paradigm for science is emerging in the early 21st century, driven by the larger process of Meta-Evolution and triggered by a number of global social developments including- Citizen Science, Open Science, Networked Science, AI-based and Computational Science.

The Knowledge Discovery process is exploding and, combined with new levels of science communication and education, is bringing the potential social ramifications and benefits into sharper focus for the general population.
There appears to be no end in sight to the process of knowledge discovery, with each new insight interweaving with and catalysing others.

The Evolution of Science
‘Science’ is not an object able to be easily defined, but a multilayered process and framework encompassing the causal understanding and quantification of relationships that have enabled life and human civilisation to develop.
But always underpinning it is a fiercely guarded reputation for veracity and rigor. The methodology that has built this reputation over the last five centuries is ‘the scientific method’.
The scientific method provides a rigorous and logical means of establishing causal relationships between phenomena, based on inductive as well as deductive logic and experimental evidence.
Today we literally live or die by its outcomes; whether in the shape of an aircraft wing that enables us to travel safely through air, the reliability of the electric motor controlling a skyscraper lift, the sterilisation process used to keep food fresh and vaccines that have saved millions of lives.
This explosion of knowledge over the past millennia has created a rich melting pot from which society extracts a continuous stream of invention and benefits related to all aspects of technology - transport, energy, communications, agriculture, medicine etc. Each strand is ultimately co-dependent on all others and in fact it can be argued that the evolution of civilisation is contingent on the accretion and blending of all strands of prior knowledge.

Science Mark 1.0 began a long time ago with the basic adaptive behavioural reward system of all life, eventually evolving in hominids into a more formalised trial and error process, underpinning attempts to discover survival strategies. Examples include testing different plants to discover remedies for illnesses or processes for improving edibility. Also testing forms of social interaction to achieve more effective decision making to maximise group rewards and mediate resource sharing and refining the tools and techniques needed for hunting, food preparation and construction of dwellings.

A greater understanding of the cause and effect of natural phenomena based on this substrate of trial and error emerged from sifting data for useful patterns and inferring relationships through the use of classification and prediction techniques, eventually evolving into today’s sophisticated scientific tools.

With the major scientific breakthroughs of the 16th and 17th Centuries, including the formulation of the laws of mechanics linking motion, mass, force and acceleration, together with the sophisticated analytic machinery of calculus, the insight was finally gained into the abstract underlying mechanism- the Scientific Method. This was enunciated, defined and applied by Newton, Bacon, Descartes and others.

The scientific method is therefore a framework or process for extracting patterns and relationships about the universe in which we exist. It is also an open system in itself, capable of adopting new modalities when required and upgrading its structure to achieve greater compatibility within its own environment- the world of knowledge. Science and technology are ultimately the outcomes of the scientific method and all branches are closely intertwined.

With the new power of the scientific method emerged the idea of Universal laws, including the realisation that this involved the formulation of quantitative as opposed to qualitative relationships. This basic quantitative process for testing and refining knowledge has remained virtually the same ever since. The changes that have occurred relate primarily to new developments in the tools for data gathering, analysis, measurement, applying logic and the testing of hypotheses.

But the scientific method requires first and foremost that a theory be predictive; that it must mirror reality to the point where real phenomena can be calculated with reasonable accuracy, such as the motions of the planets and stars. This predictive principle was tested by experimental means, primarily by early navigators and astronomers.

In 1609 Galileo built a telescope through which he observed the four largest moons of Jupiter, providing powerful evidence for the Copernican heliocentric theory. The nature and cause of these motions remained a mystery until 1687, when Isaac Newton published the law of universal gravitation, demonstrating that the equations encapsulating the mechanics of the sky were the same as the laws of mechanics on earth.
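To make that unification concrete (in standard modern notation, not part of the original argument), the same inverse-square law accounts for an apple falling on earth and a planet orbiting the sun:

$$F = \frac{G\,m_1 m_2}{r^2}, \qquad g = \frac{G M_{\oplus}}{R_{\oplus}^2} \approx 9.8\ \mathrm{m\,s^{-2}}, \qquad T^2 = \frac{4\pi^2}{G M_{\odot}}\,a^3$$

The first expression gives the attraction between any two masses, the second recovers the familiar acceleration of falling bodies at the earth's surface, and the third yields Kepler's relation between a planet's orbital period and its distance from the sun.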

There was also a new perception of the larger relationship of the earth to the Cosmos. Ptolemy's geocentric model of the heavens- the stars revolving in fixed spheres around the earth, derived from Greek cosmogonies eighteen centuries earlier- had been displaced by the Copernican heliocentric model. The earth was no longer at the centre of the universe, but revolved around the sun.

This was a unification of the underlying principles governing motion in the universe, demonstrating the capacity of the scientific method of knowledge discovery to deliver results of great truth and beauty.
Science Mark 2.0 emerged during the 19th and early 20th centuries, with the formulation of the great unifying laws of Electromagnetism, Thermodynamics, Relativity, Quantum Mechanics and Darwinian Evolution. By providing a new understanding of forces and matter at the molecular, atomic and sub-atomic level, Science 2.0 opened a cornucopia of advances in all branches of science, resulting in an explosion of technological progress across information technology, electronics, energy, genetics, medicine, economics, engineering, agriculture and construction.

Unifying frameworks such as the Standard Model, String Theory and Loop Quantum Gravity followed in quick succession in the latter part of the 20th century, paving the way for Science 3.0.
Now Science Mark 3.0 is emerging in a way that will dwarf what came before. This new scientific paradigm combines a deeper understanding of the sciences at the informational and sub-atomic level with the reintegration of the physical and social disciplines.
Major EU ICT driven flagship projects such as FuturICT and Paradiso are also predicated on this ‘interweaving’ principle, cross correlating multidisciplinary projects and combining information from both the social and physical sciences.

But in addition to this powerful methodology, humans have gained an ally in the scientific endeavour via the enormous computational and social power of the Web. Combined with the mind power of its soon-to-be three billion human acolytes, it has the potential to offer far more than a passive repository of human knowledge. It is also capable of discovering patterns in vast quantities of data, developing new theorems and algorithms and even unleashing the power of human creative thought, and is likely to emerge as a senior science partner with humans as early as 2020.

As the author has noted in previous blogs, this has the potential to create a super-organism with unparalleled problem-solving capacity; a global mind more powerful than any previous group of scientists.
And just in time, as global warming and extreme weather events, combined with the spread of new diseases and critical shortages of food and fresh water, threaten the very fabric of society and the survival of our planet.

The major social drivers powering the birth of Science 3.0 include Citizen, Open, Networked, AI-based and Computational Science.

Citizen Science-
The emergence of the Citizen Scientist is just beginning and it might be useful to start by distinguishing between those involved as science insiders and outsiders, or specialists and non-specialists, rather than professionals and amateurs. The line between professional and amateur science will become increasingly blurred, likely disappearing within the next twenty years, as the value an individual contributes to knowledge discovery, rather than a one-size-fits-all qualification, becomes the key criterion for recognition. With the heavy computational lifting largely done automatically by the Web, science will take on a more egalitarian flavour.

The contribution of the latter-day non-specialist to science is well documented, including a lead role in the following disciplines- Paleontology- contributions to fossil discovery; Astronomy- discovery of new supernovas, comets, meteors and even planets; Mathematics- results such as the immensely important statistical Bayes Theorem; Biology- the discovery of new species of both plants and animals; Archaeology- locating new sites with the help of Google Maps and the mapping and classification of rock art motifs; Ecology- Indigenous populations contributing to the understanding of the deep ecology of their environment.

Now this symbiosis has been taken to a new level through- Gaming, Crowdsourcing and Volunteering, expanding the support and creativity of relative outsiders in a variety of disciplines. This involves enlisting both professionals and non-professionals to help solve complex scientific problems, not just by donating spare computing power, as in the SETI@home project, but through the power of many minds working in tandem with computers and the Web.

The application of games to problem-solving is a technique in which many non-professionals excel and harnessing this mind power is a relatively new variation of citizen science. For example-
Phylo is a game that allows users to contribute to the science of genetics by aligning sequences of DNA, RNA and proteins to find functional similarities and to learn how they have evolved over time. Humans are better at solving such visual puzzles than computers, and Phylo represents these molecular sequences as rows of vertical coloured pieces to be aligned on screen. There are currently 16,000 registered users working to solve such puzzles, as well as a Facebook group for suggesting Phylo improvements.
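For readers unfamiliar with the underlying puzzle, the sketch below (illustrative only- not Phylo's actual engine, and with hypothetical scoring values) shows the classic dynamic-programming approach to aligning two short sequences, the task Phylo turns into a visual game:

import itertools  # not strictly needed; standard library only

# Minimal global sequence alignment (Needleman-Wunsch style), illustrative only.
# Scoring values are assumptions: +1 match, -1 mismatch, -2 gap.
MATCH, MISMATCH, GAP = 1, -1, -2

def align_score(a, b):
    n, m = len(a), len(b)
    # score[i][j] = best score aligning the first i letters of a with the first j of b
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * GAP          # aligning against an empty prefix costs gaps
    for j in range(1, m + 1):
        score[0][j] = j * GAP
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i-1][j-1] + (MATCH if a[i-1] == b[j-1] else MISMATCH)
            score[i][j] = max(diag, score[i-1][j] + GAP, score[i][j-1] + GAP)
    return score[n][m]

# Two short DNA fragments; the higher the score, the better the alignment.
print(align_score("GATTACA", "GCATGCA"))

Phylo's insight is that human pattern recognition often beats such brute-force scoring on the hard cases, which is why the game feeds its players the alignments computers struggle with.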

Foldit is a protein folding game, capable of solving puzzles that have challenged professional scientists for years, such as finding the optimum folding patterns of the chains of amino acids that make up enzymes and proteins. One notable success was deciphering the structure of an enzyme from an AIDS-like virus: it took the gamers only three weeks to create an accurate model of the solution.

Some 40,000 registered users of the game Planet Hunters have identified 69 potential new planets in data retrieved from NASA’s Kepler Space Telescope, which searches for habitable planets outside earth’s solar system.
A mix of professional and crowd-sourced volunteer astronomers has also helped with survey observations, such as those coordinated by ESA’s space hazards team, recently finding an asteroid that could pose an impact threat to earth- 2011 SF108. The project requires visual image evaluation by humans beyond automated computer analysis. A survey of 300,000 spiral galaxies is also currently underway, using superior human image discrimination to determine which of them have central bars.

This cornucopia of knowledge provided by science outsiders is in most cases no less rigorous or ‘scientific’ than that ‘discovered’ by insiders or professionals in a multimillion dollar laboratory or observatory. Today the technological gap between citizen and traditional science is also closing in several major ways, particularly in relation to the instrumentation and techniques now available to the outsider. These include- powerful computers, publicly available data sets and algorithms, Google maps, telemetry of all types, virtual reality simulations, virtual telescopes and observatories and inexpensive desktop laboratories- all harnessing the power of the Web.

The analogy is the ease with which computing amateurs can now use professional tools and templates to translate creative ideas into professional quality end-user applications.

Networked Science-
Using specialist online tools, massive research cooperation through the power of the Web is becoming a reality. Network science projects can amplify collective intelligence, linking relevant disciplines and skills to best solve problems across institutions and nations; activating latent expertise and dramatically speeding up the rate of discovery across all sciences.

Remote interdisciplinary collaborations- or virtual research teams, are also proving invaluable in the knowledge generation process. STEM - Science Technology Engineering and Maths is one such integrated field. Specialists meet at a number of university centres to tackle problems in areas of national interest such as sustainability, neuroscience and systems biology, piecing together a narrative between disconnected information databases.

In ecology, the Long Term Ecological Research network, composed of 26 research sites, has been operating for 30 years. Now a group of PhD students has set up a network of small-scale experiments- The Nutrient Network- to understand influences on the structure of grasslands, with scientists volunteering their services at 68 sites in 12 countries, without the need for major grants. This is an example of big science being carried out on a shoestring budget through global networks of volunteer scientists.
As we as a species begin to take on the characteristics of a super-organism through integration with the Web, the level of such networked problem-solving power will increase exponentially. As underlying laws and patterns are discovered the knowledge is also increasingly converted into algorithms- that can automatically fly a jet aircraft, analyse genome sequences in minutes, diagnose illnesses or run a chemical factory.

Open Science-
Science is increasingly seen as a public enterprise, not a separate world mediated by remote experts. Institutions such as the British Royal Society are playing a leading role in publicly advocating the disclosure and sharing of scientific information.
In addition science communication and education is booming, with knowledge readily available not only through a host of popular science publications such as New Scientist and Scientific American, but through Google and thousands of free sites such as Wikipedia and ScienceDaily.
Publication of professional science research is also flexing up, allowing more direct access channels to the public- most notably arXiv- outside the peer-reviewed journals. But the top-down attitude of many scientists still prevails: that the scientific process must involve only professional scientists, and that societal implications should be communicated to the public and policy makers as foregone expert conclusions open to minimal public debate. But the critical scientific decisions now faced by humans are also part of the fabric of a democratic society and must therefore be based on the free and open flow of information.

Open Science also manifests in more ethical science. There has been enormous pressure on researchers for positive results and a steady decline in published studies where findings have contradicted current scientific hypotheses. Negative findings attract fewer readers so scientific journals tend to reject these more often.

Government grant agencies such as the NSF and NIH also need to become more flexible and work with scientists to develop more open ways of sharing knowledge discovered with public support, for example by encouraging scientists to submit findings in forms beyond the published paper, including computational visualisations and popular science books, and by actively participating with the public in solving endemic social problems such as climate change, conflict and food production.

Artificial Intelligence-based Science-
There are many AI techniques in use today including- fuzzy logic, neural networks, decision networks and swarm intelligence.
But the most powerful of all is based on the generic version of the evolutionary process itself- the Evolutionary or Genetic Algorithm- EA.

Although this technique has been widely used for over ten years to optimise and discover design solutions, it has recently reached a new level in the form of easy-to-use software tools such as Eureqa.
Eureqa is being applied to search not just for new patterns, but for new laws of science. This is achieved by repeatedly combining and testing simple mathematical expressions to create candidate equations, selecting those that best reflect reality. To achieve law discovery it must also exploit invariance- identifying quantities in the laws of nature that remain constant as the system evolves over time.
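As a rough illustration of the underlying idea- a minimal sketch of an evolutionary search over symbolic expressions, not Eureqa's actual implementation- candidate formulas are randomly varied and the ones that best fit the observed data survive into the next generation:

import random

# A toy symbolic-regression loop (illustrative only): expressions are small
# trees over x and constants; fitness is the squared error against the data.
OPS = {'+': lambda a, b: a + b,
       '-': lambda a, b: a - b,
       '*': lambda a, b: a * b}
TERMINALS = ['x', 1.0, 2.0]

def random_expr(depth=2):
    # Grow a random expression tree.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(list(OPS)), random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, x):
    if expr == 'x':
        return x
    if isinstance(expr, float):
        return expr
    op, left, right = expr
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(expr, data):
    # Mean squared error against the observations (lower is better).
    return sum((evaluate(expr, x) - y) ** 2 for x, y in data) / len(data)

def mutate(expr):
    # Replace a randomly chosen subtree with a fresh random one.
    if isinstance(expr, tuple) and random.random() < 0.7:
        op, left, right = expr
        return (op, mutate(left), right) if random.random() < 0.5 else (op, left, mutate(right))
    return random_expr()

# 'Observations' secretly generated by the law y = x*x + 1, to be rediscovered.
data = [(x, x * x + 1.0) for x in (-2.0, -1.0, 0.0, 1.0, 2.0, 3.0)]

population = [random_expr() for _ in range(200)]
for generation in range(50):
    population.sort(key=lambda e: fitness(e, data))
    survivors = population[:50]                                 # selection
    population = survivors + [mutate(random.choice(survivors))  # variation
                              for _ in range(150)]

best = min(population, key=lambda e: fitness(e, data))
print(best, fitness(best, data))

Eureqa's search is of course far richer- larger operator sets, crossover between expressions and tests for invariance over time- but the select-and-vary loop above captures the evolutionary core the author describes.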

Eureqa is an ideal tool for researching systems that are too complicated to follow set rules and has already been applied to timetabling, aircraft wing design, network topology, financial forecasting and particle physics simulations.
A recent landmark example of law discovery was the rediscovery of the Law of Conservation of Energy from scratch- a result that originally consumed the intellectual energy of many scientists over hundreds of years. Working from the motion data of a two-arm (double) pendulum, Eureqa took only a day.
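For context, the invariant such a search converges on for an idealised double pendulum is the total mechanical energy- given below in its standard textbook form, not as the program's literal output:

$$E = \tfrac{1}{2}(m_1+m_2)\,l_1^2\dot\theta_1^2 + \tfrac{1}{2}m_2 l_2^2\dot\theta_2^2 + m_2 l_1 l_2\,\dot\theta_1\dot\theta_2\cos(\theta_1-\theta_2) - (m_1+m_2)g\,l_1\cos\theta_1 - m_2 g\,l_2\cos\theta_2 = \text{constant}$$

Here the θ are the arm angles, the l the arm lengths and the m the bob masses; in the absence of friction this combination stays fixed while everything else changes, which is exactly the kind of invariance the algorithm is rewarded for finding.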

But its significance goes well beyond this. It is being applied to discover new theorems increasingly beyond the cognitive capability of its human counterparts- beyond the limits of human knowledge. And it also has the potential to alter the balance between insiders and outsiders, because it empowers both groups. Eureqa can discover new patterns and relationships that the professional can then build on to prove new theories. The citizen scientists can do the same thing without the need to work in a traditional science framework- to move from analogical conception to solution, just like the gamers of the Foldit project.
It was recently applied to determine the existence of a deep law that controls cell differentiation. It produced a biological law of invariance equivalent to the conservation law in physics. It is also being applied to understand the decision network of a bacterium as it changes into a spore to determine which factors switch on genes and which genes control others, involving a huge network of probabilistic interactions between biomolecules.

Equally, Eureqa could and will be applied in the future to all complex scientific disciplines- economics, biology, social sciences and climate science and even perhaps to solving the universal Theory of Everything. The combination of descendants of the Web and Eureqa could achieve this within the next decade.

AI-based science is an extension of the scientific method, not something essentially different. The predictions of an EA-generated hypothesis are tested against real-world evidence to check how accurately they reflect reality, and each new generation is a refined variation of the original hypothesis.

Computational Science

The process of scientific research is expected to change more fundamentally over the next three years than in the previous three hundred. The epoch during which individual humans are able to conceptualise or understand increasingly complex phenomena- the human genome, for example- could be coming to an end, even though the power of computational visualisation is extending this capability. There are simply too many interacting variables for scientists to gain an intuitive understanding of, say, the potential interactions of cells with millions of chemical outcomes.

With Big Data now a fact of life in all disciplines, combined with a discovery program such as Eureqa, 90% of the traditional work can now be done by the Web.

Mathematics- the science of patterns in number systems, abstract shapes and transformations between mathematical objects- also requires capabilities beyond the unaided human brain to complement our intuition, for example to allow exploration of the properties of trillions of prime numbers.
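A trivial sketch of what that machine-assisted exploration looks like in practice (illustrative only): sieving several hundred thousand primes and tabulating the gaps between consecutive ones- a computation far beyond hand calculation but immediate for a machine.

# Sieve of Eratosthenes plus a tally of gaps between consecutive primes.
def primes_up_to(n):
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b'\x00\x00'                      # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(range(p * p, n + 1, p)))
    return [i for i, flag in enumerate(sieve) if flag]

primes = primes_up_to(10_000_000)                 # roughly 660,000 primes
gaps = {}
for a, b in zip(primes, primes[1:]):
    gaps[b - a] = gaps.get(b - a, 0) + 1

# The most common gap sizes- raw material for conjectures about prime gaps.
print(sorted(gaps.items(), key=lambda kv: -kv[1])[:5])

The point is not the particular result but the division of labour: the machine supplies exhaustive enumeration, the human supplies the conjecture and the proof.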

Meta-Evolution and the Rise of Science 3.0-

Great scientists throughout history have nurtured the ‘creative gene’ which allowed them to soar to new heights and great new conceptual horizons, which they then backed with rigorous mathematics and experimentation. Emulating the ‘creative gene’ of humans could be the next big step for the Web. But regardless, the discovery process is due to accelerate, as the global mind expands and each new insight catalyses countless others.

As previously defined, the scientific method is recognised as the primary knowledge generator of modern civilisation. Through the rigorous process of verification, applying both inductive and deductive logic, causal relationships governing physical phenomena are defined and tested in the form of hypotheses and models.

In fact it bears a striking similarity to the larger process of evolution itself and, the author maintains, is a subset of that process. The difference is one of complexity and scope. Evolution is a universal generic process of adaptation and optimisation, which selects and amplifies the most appropriate response of a system to its environment. The scientific method is a meta-method or methodology aimed at selecting the most appropriate theory or causal model within an experimental environment.

The key to better understanding the scientific method, is that it is not a single process but a number of sub-processes, which are continually evolving. These processes include- initial conceptual insights, general hypothesis formulation, experimental framework development, data access and capture, pattern analysis of information and evidence extraction, hypothesis modelling based on inference logic, algorithmic formulation, experimentation, testing and validation of procedures, hypothesis optimisation and refinement.

Each of these sub-processes within the overall framework of the method is evolving as scientific advances continue. The scientific method was therefore not invented or discovered in its present form, but evolved and continues to evolve, using processes common to all adaptive learning. It follows that a deeper understanding of evolution will also provide a deeper understanding of the future of science.
The author believes the two are intimately connected. A greater understanding of the evolutionary nature of the scientific method will lead to an acceleration of the process of new theory discovery, which in turn will accelerate the broader process of evolutionary knowledge discovery.

There is no end to this process. All current major theories, including Relativity and the Standard Model of Quantum Physics, are constantly being optimised, refined and sometimes radically reinterpreted.
In his book- The Future of Life: A Unified Theory of Evolution- the author has predicted the emergence of an imminent phase change in the evolutionary process: Meta-evolution, or Evolution Mark 2.0.

As human society begins to achieve a deeper and more intimate understanding of the primary evolutionary imperative shaping its destiny, he predicts this new form of evolution will emerge, accelerating the already exponential pace of change to an extreme level.
The process will manifest in response to a deeper understanding by society of the implications of the evolutionary process itself. The resulting amplifying feedback will generate a glimpse of life’s true potential, accelerating the already massive momentum of evolution and its scientific counterpart, resulting in a rate of change that is forecast by a number of scientists and forecasters to reach hyper-exponential levels in the near future.

In addition Meta-Evolution will be reflected in the evolutionary future of Science itself. Science is already on the path to escaping its current institutional shackles, through the revolution in Citizen Science, Open, AI and Networked Science; harnessing the creative realignment of the physical and social sciences and the future Web 4.0’s massive computational intelligence. Combined with Meta-evolution this phase will attain astounding outcomes and benefits for society by mid-century and provide possible redemption for the gross exploitation of our planet. The evidence is already visible in the genesis of a starburst of major projects such as FuturICT and Paradiso ICT.

As mentioned, this will be none too soon, with the problems of global warming, ecological disasters, sustainability, economic collapse and endemic conflict already ravaging the planet. Meeting these challenges will involve active adaptation through problem solving- the heart of the evolutionary process.

Already an awareness of the pervasive and rapidly accelerating power of evolution is beginning to be felt through the enormous scientific, technological and social advances in our civilisation. This insight creates an evolutionary feedback loop: actively engaging evolution to help meet today’s complex survival challenges further accelerates the knowledge discovery process, which in turn generates further evolutionary insight and application.

A significant additional impetus would therefore be gained from a deeper understanding of the driving role of the evolutionary paradigm- a global awareness of the engine underlying life's progress. This would eventually create an explosive realisation of life’s future potential.
As the rate of knowledge acquisition increases beyond the human capacity to absorb it, new social structures and modes of cognitive processing based on artificial intelligence techniques will emerge to help humans cope.

According to the author, this is already occurring. Even as the amount of information expands beyond human horizons, we are developing techniques to bring it under control. Like a fractal image, cybernetic life forms and intelligent machines are evolving in the same way as biological life- mutating to become increasingly intelligent. These act as proxies for humans, managing complex processes and roaming cyber-space- searching, filtering and processing data from an already overwhelming pool.

Hyper/Meta-evolution can be expected to become part of a new global paradigm within the next twenty years- by 2030- based on current rates of knowledge growth and coinciding with the evolution of the super-intelligent Web 4.0. This will rapidly transform all aspects of our culture and civilisation, including accelerating the rise of Science 3.0.

Saturday, December 17, 2011

The Future of Civilisation 3.0

The author argues that the combination of the recent triple disaster in Japan, the GFC and looming world recession, together with the Arab Spring and the global Occupy movement, has provided the final trigger for the rapid evolution of Civilisation 3.0 in the fight for human survival.

These world-shaking events send a timely message to the rest of the world: the form of civilisation and the social norms we have become accustomed to and lived by over the last few centuries are at an end.

Civilisation 1.0 began over 15,000 years ago with the founding of the earliest settlements and villages around the world, as hunter gatherers settled down to take advantage of the rich sources of edible grasses and natural foods growing mainly around the fertile delta areas of the great rivers and coastal areas of the world. These early habitats evolved into the first towns, cities and eventually nation states. At the same time, the first writing and number systems evolved to keep track of the products and services that developed and were traded by modern humans. Also with the growth of towns and trade across the world and the use of wood for building and smelting, the clearing of the forests across Europe began.

Civilisation 2.0 then emerged, with more sophisticated means of production using wind and water; rapidly accelerating following the industrial revolution in the 18th century with the harnessing of steam and later electricity and combustion engine power. These innovations were dependent on the burning of fossil fuels- coal and oil on a massive scale, allowing the West to steal a march on the rest of the world; colonising its populations and exploiting its wealth.
The manufacture of goods and services then increased on a massive scale; everything from food, textiles, furniture, automobiles, skyscrapers, guns and railway tracks- anything requiring the use of steel, cement or timber for its production.

The major cities expanded on the same original settlement sites as Civilization 1.0 - coastal ports, river delta fertile flood plains, regardless of the risk from subsidence and earthquakes.
The energy revolution was rapidly followed by the communications and information revolution- grid power, telephone, wireless, radio, television and eventually computers. Later, nuclear power was added to the mix.

Then in the second half of the 20th century the realization finally dawned, after decades of dire predictions, that the planet’s resources really were finite.
Now, at the current rate of consumption by a global population of 7 billion, projected to grow to 9 billion by mid-century, combined with the Armageddon of global warming, the planet is rapidly running out of fresh water, food and oil. At the same time the grim effects of escalating levels of carbon in the oceans and atmosphere have triggered more frequent and severe weather events- major droughts, floods and storms- adding to the impact of earthquakes in highly populated urban areas.

These problems are now bigger than any one nation can handle and can only be effectively addressed on a global basis. This will inevitably need to be coupled to a higher level of social awareness and democratic governance, in which everyone, not just politicians, is involved in the key decision processes affecting the planet.

Welcome to Civilisation 3.0.

Civilization 3.0 is just beginning, but is already being tested. From now through the rest of this century comes the hard part. Tinkering around the edges will no longer do- for the planet or its life, humans included.
The planet’s climate is already in the throes of runaway warming, regardless of what forces caused it, because of the built-in feedback processes from the melting of the ice sheets in Greenland and Antarctica, to the release of huge methane reserves in the northern tundra and ocean floor.

But this is just the start of our problems.
Some areas may get a short-term reprieve with local cooling, but overall the heating process appears to be unstoppable. The Faustian bargain that humans struck to establish Civilisation 1.0 and 2.0, when the planet was teeming with natural resources, is about to fall due. Humans are being called to account.

By 2020 the cost of solar, wind and biofuels is likely to reach a baseline comparable to that of fossil fuels, due to major technological advances currently underway, such as artificial photosynthesis. But because of the flawed democratic process, major businesses and corrupt governments can still undermine the critical mindset needed for radical change, with calls for short-term profits drowning out the desperate call by future generations for long-term survival. It is therefore highly likely that we will still be emitting copious amounts of carbon by 2020 and starting to exceed the safe limits of temperature rise.
In addition, supplies of fresh food and water, particularly in developing countries are already dwindling, with the potential to create further malnutrition and conflict.

So Civilisation 3.0 has to get serious.

One of the major recent initiatives at the heart of the fight-back revolution is the concept of a smarter planet. The Japanese experience has now reinforced that concept. Every built object and operational process will eventually need to be embedded with sensors and its performance and integrity continuously monitored and assessed in relation to natural disasters and sustainability.
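As a loose illustration of what that continuous monitoring might look like in software- a minimal sketch with hypothetical sensor names and thresholds, not any particular deployed system- each embedded sensor streams readings, and values drifting well outside the recent baseline raise an inspection alert:

from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Reading:
    sensor_id: str     # e.g. a strain gauge embedded in a bridge deck (hypothetical id)
    value: float       # reading in the sensor's own units

def is_anomalous(history, latest, sigma=3.0):
    # Flag a reading more than `sigma` standard deviations from the
    # sensor's recent baseline; too little history means no judgement.
    baseline = [r.value for r in history if r.sensor_id == latest.sensor_id]
    if len(baseline) < 10:
        return False
    mu, sd = mean(baseline), stdev(baseline)
    return sd > 0 and abs(latest.value - mu) > sigma * sd

# Hypothetical usage: a sudden jump in strain on 'bridge-07' warrants inspection.
history = [Reading("bridge-07-strain", 100.0 + (i % 3)) for i in range(50)]
print(is_anomalous(history, Reading("bridge-07-strain", 140.0)))   # True

Real structural-health monitoring is far more elaborate, but the principle is the same: instrument everything, compare against a learned baseline, and escalate the exceptions to humans.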

Everything from roads, transport, bridges, railways, buildings, dams, power plants, grids and information systems, as well as human knowledge and skill capacities, will need to be urgently upgraded. Even towns and cities will have to be redesigned to avoid future worst case natural and manmade disasters and provide a more sustainable living space for future generations.
In addition, the loss of critical ecosystems and species will compound the infrastructure problems of the planet, requiring re-prioritization of the value of the natural environment and fair re-allocation of its resources on a global scale.

The current level of risk and waste in the built environment is now seen as both unacceptable and avoidable. By applying new technologies already available such as smarter materials, safer engineering methods, improved communications and sophisticated computer modeling, risk can be dramatically reduced.

The new sustainability standards will need to be set much higher; at a much smarter level than previously accepted, in order to reduce carbon emissions, optimize performance and enable more responsive adaptation within a fast deteriorating physical and social environment. This will be mandatory as the escalating scale of the risk becomes apparent.

At the heart of this revolution will be the powerful mathematical algorithms and intelligence capable of making optimum decisions at a far greater speed and with less human intervention. In turn this will require instant access to the Intelligent Web’s global resources of specialized knowledge, artificial intelligence and massive grid computing power.

By 2030, however, panic will be building across the globe. The safe level of temperature rise of 2°C, expected to hold until the end of the century, will likely be on track to be breached, and physical and social problems will escalate.

Any realistic solution for human survival will require living and working together cooperatively and peacefully as one species on one planet, finally eliminating the enormous destruction and loss of life that wars and conflict inevitably bring.
Although cooperation on a global scale will be vital, individual nations will be tempted to free ride, as populations react with violence and anarchy to shortages of basic necessities through rising prices and inadequate infrastructure, particularly in hard-hit developing economies.

A massive mind shift will be required across the planet to achieve this level of cooperation; a more collaborative and creative process will need to evolve and quickly, harnessing all human knowledge and technological resources. To achieve this level of cooperation non-democratic states will need to democratize or be excluded from the resulting benefits and the old forms of democracy will have to be upgraded to a more inclusive and participatory level if human civilization is to avoid slow annihilation.
The stress of the human fight for survival will also present myriad ripple-on challenges relating to maintaining a cohesive social fabric. Democracy and justice are the baseline requirements, but providing adequate levels of health, work and education will also get a lot harder. This will require adaptation on a vast scale.

By 2040 the trendlines will be set, and through social media the risks will need to be openly and clearly relayed to all populations. This will be similar to the collective discipline and mindset required many times in the past by nations threatened by war and decimation.
It will now need to be replicated on a global scale.

Beyond increasing renewable energy and reducing waste, the fight for survival will require the implementation of other more radical innovations, including the eventual geo-engineering of the weather and climate. The science and technology needed to achieve such a complex outcome is unlikely to be achievable before 2050 and in the meantime our civilization may be in free fall. However it will probably be the only solution capable of reversing rather than just slowing the headlong rush to chaos.

Other radical solutions will involve the need to accelerate our level of knowledge generation. This is already taking place through advanced methods of automatic pattern analysis and algorithm discovery, applying artificial intelligence methods and the immense computational intelligence of the Web.

It will be a bootstrapping process. The faster the increase in knowledge acquisition, the more powerful the potential intelligence of the Web will become, which will then further accelerate the increase in life-saving expertise. This exponential process may be further accelerated by promoting higher levels of networked ‘swarm’ behavior, combining human intelligence on a grand scale across the planet. The benefits of collective intelligence acting like an advanced insect hive are already being realized, with research teams combining in larger and larger groups to solve more and more difficult problems. It has been demonstrated that an increase in synergy resulting from collective intelligence in complex self-organising systems allows ‘smarter’ problem solving as well as greater decision agility.
For example, 50 European universities have recently combined in the FuturICT project, a billion-dollar EU flagship project to model, predict and solve future planetary and social problems. And this is only one collective project out of thousands, with increasing collaboration between US, European and Asian science and technology groups.

With all these initiatives, will Civilization 3.0 survive?

It will likely be a very close call, dependent largely on whether our increase in beneficial knowledge can outstrip the planet’s rapid descent into environmental and social oblivion- a potential runaway pre-Venusian scenario with no end in sight.

It is similar to the Red Queen scenario in Lewis Carroll’s Through the Looking-Glass, in which the red chess queen has to run faster and faster just to maintain her position. Humans will also have to become smarter and smarter just to stay ahead of the approaching Armageddon.

The odds will in fact be very similar to those of the climate bottleneck that almost eliminated our early Homo sapiens ancestors 20,000 years ago as they struggled to survive the last ice age. Only a small band, perhaps several hundred strong, survived thousands of years of frozen hardship, finally regrouping and reaping the rewards that followed the great melt.

Modern humans can also reap a future cornucopia, if they have the courage and skill to survive the looming crisis in our evolution.

Many other civilisations across our universe may well have faced a similar bottleneck. Those that survived will have gone on to reap the untold riches of Civilisation 4.0 with its mastery over the physical laws governing our world and galaxy. Along the way Civilisation 5.0 will emerge, possessing not only the immense scientific capability needed to solve any physical problem, but enough wisdom to avoid future social catastrophes.

The stakes couldn’t be higher. The Japanese catastrophe and many others, including the Indian Ocean earthquake and tsunami of 2004, which left some 230,000 dead, should have sounded a clarion call for us all.
This is not a bad dream, from which we’ll all awake tomorrow with business as usual. The future of Civilisation 3.0 and our unique intelligent life-form really is in the balance. Let us hope ours will be one of the few or perhaps the only advanced civilisation to have survived such a test, so that our children and our children’s children can live to experience the untold wonders of our planet and universe.

But the Red Queen will have to run very fast indeed.