Guest Post by Leo Saraceno – The Game of Risk: How Insurance Behemoths, Disaster Recovery, and State Intelligence Laid the Foundations for the Planetary Computer

The following guest post was written by Leo Saraceno and expands on his previous work at Silicon Icarus and the series “Atomic Ecologies” he did with me and Jason Bosch last summer. Last year I started a relationship map to explore the intersecting themes of corporate environmentalism, biogeochemistry, natural capital, and energy economics as they pertain to public policy innovation and humanity’s increasingly digitized relationship to nature. The screenshot below features several players who are referenced in this post, including AXA, Colin Prentice, and IBIS. I also direct your attention to Howard Odum, whose “emergy” language of energetic exchange I sense has a lot to do with the shift to a decentralized electrical energy grid and digital programmable tokens framed as “community currencies.” In this way social exchanges will be remade as “choice” signals to inform risk modeling and digital governance. Leo lays out just such a scenario in this piece relating to Mitiga Solutions and their crossover work in the parametric insurance modeling and token engineering sectors. I am grateful to him for diving into the somewhat boring world of global insurance markets to bring us clarity around the planned role that natural disaster prediction will play in advancing the attempted creation of a globally-networked super-intelligence.

Interactive Map Link: https://embed.kumu.io/4c6ea2f6cbd20150420bb2f4232faf3d#untitled-map?s=bm9kZS1BSFBRZENTYw%3D%3D

 

The Game of Risk: How Insurance Behemoths, Disaster Recovery, and State Intelligence Laid the Foundations for the Planetary Computer

By Leo Saraceno

My previous writings, published at siliconicarus.org, introduce blockchain and web3 as forces that will drive social transformation. Picture waves combining to generate a storm surge with the power to transform natural life across the planet. Social impact finance, touted for its supposed sustainability and transparency, compels adoption of these cybernetic, steering technologies across many facets of our lives.

Smart contract protocols, the backbone of the next-generation internet, are more than a complicated machine. Freed from the constraints of display screens, the logic circuits of web3 catalyze a digitally-mediated form of guided evolution using a potent combination of complexity theory, free market economics, token engineering, and Internet of Things-enabled game mechanics. A choppy sea of loosely-connected organizations and individuals is moving us steadily, yet almost imperceptibly, forward into a future governed by real-time data streams fed into simulation modeling. The overlapping interests of global finance, industry, the military, state intelligence, and academia have flowed together to create a powerful current with a sinister undertow.

Natural disasters and environmental degradation are now being leveraged to create a sense of urgency around adopting automated technologies to manage risk in lieu of human-centered analysis and decision making. The insurance industry has long been a leader in the use of real-time data to inform prediction and risk. This post will discuss a division of the insurance industry that deals with the environment and climate, expanding upon a previous piece, Natural Asset Managers – How Decentralized Ledger Technology Will Drive the Ecosystem Services Sector, that I wrote in February 2023.

A key element in both articles is the concept of ecosystem services described below:

“The phrase ‘ecosystem services’ refers to all quantifiable benefits that life provides to humans, such as: water purification, natural pollination, air cleansed by forests, medicine etc. Quantification takes place at a scientific level, (measuring inputs and outputs using systems analysis, relevant chemical domains, and statistics) and is translated into monetary effects.”

The ecosystem services paradigm overlaps with the catastrophe insurance market. Both ground themselves in large scale environmental data modeling. In today’s globalized economy, it’s not difficult to imagine why massive multinational insurance companies would be keenly interested in enhancing their capacity to model economic activity and ecological outcomes at an ever more granular level of detail.

Ecological health and economic activity are being integrated into one another as ESG (Environmental, Social, Governance) portfolios aligned with the United Nations Sustainable Development Goals carve out larger and larger pieces of the emerging financial markets pie. The insurance industry has stepped into this space, serving as a bridge between the interests of institutional conservation and finance, a trusted facilitator to help realize the many “green new deals” that are just now peeking above the horizon. The concept of “natural capital,” with its emphasis on quantification, fits comfortably under the umbrella of insurance, since this sector excels at calculating the probability of events that may lead to claims and impact their bottom line.

Source: https://www.northropgrumman.com/corporate-responsibility/technology-for-conservation/

Having a comprehensive understanding of insured assets is vital to the insurance game of risk. By agreeing to embrace the data-driven “sustainability” metrics that are central to social-impact finance, we are tacitly agreeing to transform the substance of the Earth into a new asset class. That asset class is defined by assigned values logged in digital accounting ledgers that document the relative value of natural capital within an ecosystem services context. Each unit of life, from the smallest microbe to the mightiest river, is being transformed into a number that supposedly reflects its relative value to the continued functioning of life on Earth. To “save” the planet we will simulate it, and accurate simulations are only possible through the use of pervasive sensing technologies paired with sophisticated modeling software.

AXA, a French multinational insurance company that ranks among the largest in the world, helped launch the Taskforce on Nature-related Financial Disclosures (TNFD) in the summer of 2021. Some of the largest financial and corporate institutions in the world are backing this effort, which boasts a collective $20 trillion in assets. In their 2022 Climate and Biodiversity report, AXA published details from one of its hundred TNFD pilots. The project involved deployment of an “innovative biodiversity-specific data measurement tool” created for them by Iceberg Data Labs. The software measured the impact of AXA’s business on biodiversity.

The Corporate Biodiversity Footprint, a “data measurement tool,” involves four steps (sketched in code after the list):

  1. Assess the commodities and products purchased and sold by the company throughout its value chain based on IDL’s internal physical Input/Output and allocate the company’s product flows by sector (NACE(6) sectorization);
  2. Calculate the company’s environmental pressures identified by the CBF based on its product flow based on a Life-Cycle Analysis;
  3. Translate the pressures through pressure-impact functions (GLOBIO) into one and the same biodiversity impact unit;
  4. Aggregate the different impacts into an overall absolute impact at a company level.
    (emphasis added)
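To make these four steps concrete, here is a minimal sketch in Python of how such a pipeline might compose. Every name and number in it (sector codes, LCA factors, pressure coefficients) is an invented placeholder; IDL’s actual input/output tables and GLOBIO’s pressure-impact functions are proprietary and far more elaborate.

```python
# Hypothetical sketch of the four-step Corporate Biodiversity Footprint
# pipeline. Sector codes, LCA factors, and pressure-impact coefficients are
# invented placeholders, not IDL's or GLOBIO's actual figures.

# Step 1: allocate the company's product flows (EUR) to sectors (NACE-style codes).
product_flows = {"A01": 2_000_000.0, "C20": 5_500_000.0}  # sector -> revenue

# Step 2: life-cycle analysis factors converting sector revenue into
# environmental pressures (e.g. land use in m^2, greenhouse gases in tCO2e, per EUR).
lca_factors = {
    "A01": {"land_use": 0.80, "ghg": 0.0004},
    "C20": {"land_use": 0.05, "ghg": 0.0009},
}

# Step 3: pressure-impact functions translating each pressure into a single
# common biodiversity unit (here, MSA-weighted area lost, in km^2).
def pressure_to_msa_loss(pressure: str, amount: float) -> float:
    coefficients = {"land_use": 4.0e-7, "ghg": 1.2e-4}  # made-up coefficients
    return coefficients[pressure] * amount

# Step 4: aggregate every impact into one absolute company-level number.
footprint = sum(
    pressure_to_msa_loss(pressure, flow * factor)
    for sector, flow in product_flows.items()
    for pressure, factor in lca_factors[sector].items()
)
print(f"Corporate Biodiversity Footprint: {footprint:.3f} MSA.km^2 lost")
```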

The framework uses “pressure-impact functions” to transform the company’s physical activities into biodiversity metrics. The mathematics behind this standard comes from GLOBIO, a modeling framework developed by the Netherlands Environmental Assessment Agency (PBL). The biodiversity metric is called Mean Species Abundance (MSA). This is calculated by:

“…dividing the abundance of each species found in relation to a given pressure level by its abundance found in an undisturbed situation within the same study, truncating the values at 1, and then calculating the arithmetic mean over all species present in the reference situation.”
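In formula terms (my notation, not GLOBIO’s published one), for the S species present in the undisturbed reference situation:

```latex
\mathrm{MSA} \;=\; \frac{1}{S}\sum_{i=1}^{S}\min\!\left(1,\; \frac{A_i^{\text{pressure}}}{A_i^{\text{ref}}}\right)
```

where A_i^pressure is species i’s abundance under the given pressure and A_i^ref its abundance in the undisturbed reference. An undisturbed site scores 1, and the value falls toward 0 as abundances decline relative to the reference.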

GLOBIO’s biodiversity metric compares commercial sites, for example the construction of new roads or buildings, with comparable areas that have been identified as “undisturbed.” The scope of the pilot analysis, however, was limited by the fact that the environmental assessment, led by the Dutch agency PBL, only included warm-blooded vertebrates and plants. Overlooked were the invertebrates and microorganisms that are universally recognized as sacred, essential beings that make life on earth possible. In a classic Hegelian dialectic move, however, a critique of this approach may actually serve to legitimize an expanded program of digital life capture. What PBL overlooked is being addressed by the ARISE project, which is active in another corner of the Netherlands. ARISE is cataloging ALL multicellular life in the country to construct a “bio cloud” for computational use by local supercomputers.

The creation of markets in ecosystem services could not happen without substantial infusions of catalytic capital from entities and individuals that stand to benefit from their existence. AXA maintains an investment fund, the AXA Research Fund, that underwrites the scientific and policy research needed to justify the shift to a global governance / planetary computer paradigm. Narrative management is everything, and for a price, academics can be incentivized to become willing partners, eager to provide made-to-order research for such purposes. An example of AXA’s influence is seen in their sponsorship of the endowed chair at the Biosphere and Climate department of Imperial College London.

The position is currently occupied by Colin Prentice, an influential researcher and the second author of the 1996 paper that laid out the Integrated Biosphere Simulator (IBIS). That effort was carried out in partnership with Oak Ridge National Laboratory and the SAGE center at the University of Wisconsin. Oak Ridge National Laboratory became the birthplace of mathematical and computational models of ecological systems in the post-WWII era. For additional background, see our presentation series on Oak Ridge and “ecological cybernetics.” A follow-up paper on this topic is underway.

NASA describes IBIS as follows:

“a first step toward gaining an improved understanding of global biospheric processes and studying their potential response to human activity.”

Source: https://daac.ornl.gov/MODELS/guides/IBIS_Guide.html

Professor Prentice’s current project is called LEMONTREE (Land Ecosystem Models based On New Theory, Observations and Experiments). Its funding comes from Schmidt Futures, the philanthropic venture of Eric Schmidt, formerly of Google/Alphabet. The principal aim appears to be developing better ways to evaluate evapotranspiration (evaporation + transpiration). According to one of their citations, about 70% of all terrestrial precipitation returns to the atmosphere via evapotranspiration, making it a major factor in predicting the water cycle and its associated energetic ramifications.

LEMONTREE formalized a collaboration between researchers at several universities whose work centers on the development of next-generation vegetation models. The theories underpinning their current modeling work use optimality and natural selection principles. Optimality refers to tradeoffs such as plants maximizing carbon gained from photosynthesis while minimizing water loss. Prentice considers optimality principles to be essential for building universal land surface models:

“Optimality principles are the ‘missing law’ of biology for Earth System modeling. They have the potential to generate substantially simpler and more robust models that fully exploit the power of natural selection to eliminate suboptimal trait combinations, and the richness of relevant observational data that are now available to ecosystem scientists for the development and evaluation of theory and models.”

Source: https://www.imperial.ac.uk/news/217793/new-lemontree-project-aims-improve-earth/

Prentice is also the director of the Leverhulme Centre for Wildfires, Environment and Society, which claims to be the only research facility in the world devoted to the fundamental biophysical dynamics of wildfires. The Leverhulme Centre works with LEMONTREE and has also received grants from the AXA Research Fund. AXA funds another department chair in Turkey at Koç University, occupied by Özgür B. Akan. Akan’s research involves building an internet of things enabled by nanotechnology:

“The proposed research program aims to implement the very first artificial micro/nanoscale MC systems in liquid and air environments by fabricating MC-transceivers (MC-TxRxs) based on novel nanomaterials such as graphene, hydrogels, and xerogels. Given the peculiarities arising from nano-physical and biochemical processes, and the computational and energy-based limitations of nanomachines, the program will revisit conventional ICT tools and devise new ones for MC based on empirical knowledge gained through the fabricated MC system prototypes.

The project will focus on two applications: (i) a distributed molecular nano-sensor network for environmental monitoring, and (ii) an artificial gut-brain axis as a testbed for the development of novel ICT-based treatment techniques for psychological and intestinal diseases.” Source

Professor Akan received his PhD from Georgia Tech in 2004 under Ian Akyildiz. Akyildiz is one of the top researchers internationally on network technologies. He spent thirty-five years at Georgia Tech leading the Broadband Wireless Networking laboratory; is cited over 150,000 times; has established research centers all over the world, including in the UAE, Saudi Arabia, Spain, Russia, and India; and is founder/chief editor of the International Telecommunication Union Journal of Future and Evolving Technologies.

(This is Alison – I find it very interesting that Akan has affiliated himself with Cambridge University, as indicated by this video promoting his online “Internet of Everything” course. I recently shared a guest post by Lorraine Davison, who has been exploring Teilhard de Chardin and Julian Huxley’s role in advancing noetic convergence, and she specifically references Cambridge as being the incubator for this school of thought. You will note the family name Fry, which comes up later in this post relating to Lewis Fry Richardson, a pioneer in weather forecasting. I’m providing an excerpt below for convenient reference, but I do encourage you to check out Lorraine’s entire piece here after you finish Leo’s post.)

“Dear reader, congratulate yourself. You are a rational animal sitting quite near the pinnacle of evolutionary progress. I say “quite near,” because the actual pinnacle is occupied by a group of nineteenth-century British academics and their current intellectual heirs. These luminaries have raised evolution itself to the status of a god, and they are the self-appointed spokesmen for this new religion of progress. Refusal to march to the drum beat of this relentless and accelerating advance is now the only heresy. In fact, anyone who lacks utopian vision or exhibits distasteful attachments to past and family will likely be trodden underfoot. You must understand that this is not science or politics burnished with religious zeal; it is an entirely new global religion. We, my friends, have fallen into the hands of fanatical religious zealots.

The embryo of this new religion had already been implanted at the University of Cambridge as the turbulent seventeenth century dawned. The University had become a focus for religious and political dissent and went on to supply many of the Puritan settlers for the new England that was emerging across the Atlantic.

After the English Civil War (1642-1651) dissenters were required to either stay out of the university system or publicly subscribe to the beliefs of the Church of England. So, religious dissent outside of mainstream institutions became a fertile breeding ground for all manner of exploratory thought and a not insignificant factor in spawning the industrial revolution. In time, a shadow ruling class emerged from these dissenting thinkers who also went on to provide the impetus for the great philanthropic movements of the nineteenth century.

This powerful group was composed of three strands of the British upper-middle class: evangelical members of the Church of England (often known as the Clapham Sect), Quakers, and philosophical agnostics. These families came to form a discrete group as they tended to intermarry. Many of the Quakers and agnostics eventually entered the Church of England for reasons of pragmatism and self-interest. Here, they strengthened the evangelical faction that worked tirelessly for the abolition of slavery. 

These influential dynasties included the Stephens (including James, father of Virginia Woolf), the Darwins, the Huxleys, the Wedgwoods, the Frys, and the Gaskells. Their rise to positions of social and cultural importance was a defining moment for modern Britain, and they replaced many members of the aristocracy in influential positions in government, church, and academia. Their offspring were more likely to be educated at Cambridge than Oxford University and provided many of the fellows of the colleges. After the loosening of religious restrictions in 1871, those members of the group who had clung openly to their dissenting views were also able to enter the university system.” Lorraine Davison

It’s not much of a leap to see how the “distributed molecular nano-sensor network for environmental monitoring” imagined by Akan meshes with the data collection demands of the maturing ecosystem services paradigm. For a public with little to no comprehension of pending bio-digital convergence, the technological imperative to observe the environment and measure our impact on it seems straightforward and productive. Most people cannot imagine the nefarious secondary goals that underpin corporate environmentalism. Virtualization of life carried out in the name of saving the planet is incomprehensible. 

If, however, one considers ubiquitous computing and pervasive sensing within the context of our country’s long history of cybernetic research and consciousness studies, it is hard to discount the possibility that “green finance” could be a powerful tool to repurpose life on earth as networked nodes used to advance new forms of social computation, the nature of which is beyond our current understanding.

Insurance Risk Modeling Overview: Parametric Insurance and Global Digital Twins

While the previous section focuses on AXA, there’s a lifetime’s worth of research that could be done on similar organizations pursuing related endeavors. In this section I examine how the growth of the parametric insurance sector advances the larger program of forcing life on Earth into simulation modeling software. In parametric insurance, claims are paid automatically based on a pre-specified metric. Parametric structures appear most often in catastrophe insurance, where the pre-specified trigger can be a weather metric, such as wind speed, or a magnitude reading in the case of earthquakes. There is always a third-party company tasked with the environmental and economic modeling of the specific event. These modelers inform the terms of the insurance contract.
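To see how little machinery is involved, here is a toy sketch of a parametric payout in Python. It is not any insurer’s actual product; the metric name, thresholds, and linear payout curve are assumptions for illustration. In practice, the third-party modeler’s work lies precisely in choosing and calibrating these parameters.

```python
from dataclasses import dataclass

@dataclass
class ParametricPolicy:
    """Toy parametric contract: payout depends only on a measured index,
    not on an adjuster's assessment of actual losses."""
    metric: str            # e.g. "peak_wind_speed_kmh" (hypothetical name)
    trigger: float         # index value at which payments begin
    exhaustion: float      # index value at which the full limit is paid
    limit: float           # maximum payout

    def payout(self, observed: float) -> float:
        if observed < self.trigger:
            return 0.0
        if observed >= self.exhaustion:
            return self.limit
        # linear payout curve between trigger and exhaustion
        fraction = (observed - self.trigger) / (self.exhaustion - self.trigger)
        return round(self.limit * fraction, 2)

# Example: hurricane cover paying out on wind speed reported by the modeler.
policy = ParametricPolicy("peak_wind_speed_kmh", trigger=150.0,
                          exhaustion=250.0, limit=10_000_000.0)
print(policy.payout(120.0))  # 0.0 -- below trigger, no claim
print(policy.payout(200.0))  # 5000000.0 -- partial payout, paid automatically
print(policy.payout(260.0))  # 10000000.0 -- full limit
```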

Source: https://www.mitigasolutions.com/risk-modelling
Source: https://www.mitigasolutions.com/

I’ve already discussed Mitiga Solutions in “Blockchain, Digital Twins, and Global Brain Economics Part 1,” but I wanted to expand on a few points in this post. Mitiga Solutions was spun out of the Barcelona Supercomputing Center. Barcelona is a hot spot of advanced digital networks and the data economy, where Francesca Bria, former Chief Technology and Digital Innovation Officer for this supposed “rebel city,” implemented city-wide sensor networks and carried out AI computation on citizen communications. Alison’s piece, Mondragon, Focolare, and Inclusive Capitalism, opens with a discussion of Bria’s deployment of smart city technologies and how they were used to support lockdown measures.

Mitiga’s customers include multinational defense contractors, NASA, NATO, and the United Nations Development Programme (UNDP). Their services are also sought out by insurers like AXA and Willis Towers Watson. The firm appears to specialize in atmospheric sciences and has cultivated relationships with Europe’s major air traffic control systems.

Mitiga was founded by Alejandro Marti, a graduate of Rutgers University who worked for the state of New Jersey as a GIS specialist for several years before joining the Barcelona Supercomputing Center, where he worked for over a decade. Marti heads a United Nations focus group on AI for natural disasters and serves on the board of PRACE (Partnership for Advanced Computing in Europe). Mitiga joined Microsoft’s startup program in 2020, which led to an expanded partnership in 2022 for co-development of a “fully transactional” insurtech SaaS (software as a service) platform for risk management. Microsoft and Mitiga are in the process of evaluating Microsoft’s Planetary Computer platform. Parametric insurance is an important use case for the planetary computer.

Source: https://news.microsoft.com/es-es/2022/04/21/mitiga-solutions-and-microsoft-expand-collaboration-to-mitigate-the-effects-of-natural-disasters-caused-by-climate-change/

The Danish Red Cross hired Mitiga Solutions to be the third-party modeler for a blockchain-based parametric insurance contract for a potential volcano catastrophe. The creation of mathematical models of the environment based on quantifiable data mirrors the modeling used in ecosystem services use cases but is applied slightly differently in an insurance context. Parametric insurance modeling doesn’t simply evaluate the risk of a covered event occurring. The potential impact on insured assets is a second part of the model, which considers physical assets AND purely financial assets, such as securitized catastrophe bonds.

An article published by PRACE about Mitiga Solutions’ approach to epidemic modeling in sub-Saharan Africa, while not directly about insurance and ecosystem services, helps lay out the type of work they do. The three phases of their work involved community-based participatory surveillance, tokenized incentives that encouraged data-sharing, and a comparison of that data with institutional perspectives:

“…digital participatory surveillance, developed almost ten years ago as part of a European project on influenza. It involves gathering information from people about how they feel and symptoms they may be experiencing, which is then cross-referenced with information from hospitals and doctors on the ground.”

Blockchain-based “community currencies” fueled data collection:

“Mitiga is working on promoting a programme in Africa with the Red Cross that aims to incentivise people to provide information about their health. “This concept, known as a community inclusion currency, is based on blockchain technology and provides people with tokens that can be exchanged for certain good such as food, transport or education when they give information about their health status…”

Tokens were awarded when people shared information about their health status. Tokens and associated health surveys became “signals” used by Mitiga to run their models. Regardless of what intentions we believe Mitiga or their partners have, these models are meant to guide targeted populations towards pre-determined outcomes established by the program’s designers. Was informed consent given by participants? Were they made to understand the true nature of the socio-technical systems into which the data of their lives was being fed? While not stated in the article, the only Red Cross-partnered, blockchain-based community currency effort I’m aware of is Grassroots Economics. In a prior article series, I explain how the system works and describe the above-mentioned organizations’ relationships and significance.

Mitiga’s mathematical models drew on information gathered during participatory surveillance activities, tagged according to where the data was sourced. Responses to health questionnaires about how people felt were later cross-referenced with government data.

Source: https://prace-ri.eu/model-aggregating-for-epidemics/

This data is then integrated into agent-based models, programs that can be used to simulate the actions and interactions of individuals. Data for these computational models comes from a number of sources. Governments provide information on the economic activity of their people, which can be supplemented by information from mobile phone networks. Phone data provides valuable insights into the movement of individual people as well as their social networks. In some areas, such as sub-Saharan Africa, the data sought may be confused by shared ownership of mobile phones. In these cases, gamification can be used, asking representative selections of a population to play simple games where they provide details about what they do and where they go in a typical week. Notice how gamification on mobile devices can be used to identify the movements of individual people, agents in the simulation.
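Here is a deliberately tiny agent-based sketch of the kind of simulation being described: agents with weekly movement profiles (the very thing the gamified surveys elicit) are co-located day by day, and a contagion spreads through shared locations. The population, schedules, and transmission probability are all invented for illustration; Mitiga’s actual models are obviously far richer.

```python
import random

# Minimal agent-based epidemic sketch: agents move between locations inferred
# from (hypothetical) phone/gamification data, and infection spreads through
# co-location. All parameters are invented for illustration only.
random.seed(7)

LOCATIONS = ["home", "market", "clinic", "transit_hub"]

class Agent:
    def __init__(self):
        # a weekly movement profile, of the kind gamified surveys try to elicit
        self.schedule = random.choices(LOCATIONS, weights=[5, 3, 1, 2], k=7)
        self.infected = False

agents = [Agent() for _ in range(500)]
for a in random.sample(agents, 5):
    a.infected = True  # seed cases

TRANSMISSION_PROB = 0.03
for day in range(7):
    # group agents by where their schedule puts them today
    by_location = {loc: [] for loc in LOCATIONS}
    for a in agents:
        by_location[a.schedule[day]].append(a)
    # co-located agents may infect each other
    for loc, group in by_location.items():
        infectious = sum(a.infected for a in group)
        for a in group:
            if not a.infected and random.random() < 1 - (1 - TRANSMISSION_PROB) ** infectious:
                a.infected = True

print(f"Infected after one week: {sum(a.infected for a in agents)}/500")
```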

Tokenized community currencies are not given with charitable intent. Rather, the tokens, programmed for use at specific businesses, are tools that indirectly get people to freely offer up information about the patterns of their economic lives. This data can then be aggregated and used for prediction modeling of the larger community, which leads to the third phase, “business intelligence.” In this phase, signals are used to calculate the potential effect of an ongoing catastrophe on a given community.

“The final part of the strategy adds a layer of business intelligence that helps to reduce the financial impact of any disease outbreak, providing information about how supply and demand, transport, financial markets and other institutions might be affected. “Our mission is to stop epidemics from turning into pandemics while reducing negative social and financial impacts…” says Dr Marti.

Making such predictions requires reliable information about how these interconnected businesses operate. Payment systems, identity services, and supply chain tracking are methods by which such data can be generated, which helps explain why humanitarian NGOs are encouraging people to use a digital community currency in the first place. Every transaction from its point of origination can be tracked from person to person. Populations involved in these pilots are guinea pigs for behavioral surveillance and programming. Participants, remade as digital “agents,” have their actions shaped by the game mechanics embedded in their environments, which have been transformed into cyber-physical systems. Personal relationships and purchases situate each person within their larger socio-technical context. People, agents, computational nodes – all can be nudged based on the signals they emit in the form of health surveys and directed economic transactions. Widespread adoption of web3 technologies will make humanity machine-readable, and steerable, at a population level.

Now consider the Danish Red Cross’s volcano catastrophe bond. Substitute “volcanic eruption” for “epidemic.” Not only can Mitiga Solutions provide modeling to define the payment triggers for parametric insurance policies, they’re also positioned to assist clients in determining where best to allocate post-disaster funds.

“Mitiga Solutions has developed a state-of-the-art model using numerous data inputs to predict where funds will be needed, improving both the efficiency and effectiveness of humanitarian relief,” explained Alejandro Marti, CEO and Co-founder of Mitiga Solutions, Barcelona-based experts in predicting natural hazards.

Ecosystem services frame the contributions natural capital makes to life on Earth in terms of valuations that interface with the rise of the global “green” economy. Mitiga has built its business on risk modeling techniques linking ecological valuation not to “human flourishing,” but to the economic consequences of disasters. Central to both use cases is the idea of “risk management,” where predictions are made using global simulation technologies that draw upon mass surveillance and advanced software. The Mitiga case study is a useful example that shows how risk prediction extends simultaneously across the military, industry, insurance, and development aid sectors.

Source: https://www.rodekors.dk/sites/rodekors.dk/files/2021-08/Summary%20note%20Disaster%20Risk%20Insurance.pdf

VeRisk at the Center of Parametric Insurance’s Rise

While parametric insurance has existed in some form for over two hundred years, quantitative and probabilistic models gained popularity after Karen Clark, who founded Applied Insurance Research in 1987, accurately predicted the amount of damage that would result from Hurricane Andrew in 1992. Using the methods that underpin modern parametric insurance models, Clark’s best guess created a new field practically overnight. Her company eventually grew into Boston-based AIR Worldwide and became the dominant third-party modeler for catastrophe insurance. According to artemis.bm, the main news source for catastrophe insurance, AIR Worldwide’s modeling accounts for 70% of the catastrophe insurance market and covers over $25 billion in assets.

Source: https://ar.casact.org/hurricane-andrews-message-to-insurers-2/

Curiously, AIR Worldwide collaborated with Integrated Research on Disaster Risk (IRDR), an organization co-sponsored by the International Council for Science (now the International Science Council, ISC). The council was a principal designer of the International Geophysical Year (IGY), an international effort intended to increase our understanding of geophysical, atmospheric, and solar interactions. The IGY essentially spawned the space age; the USSR’s Sputnik was launched as part of the program.

New Jersey-based VeRisk (then operating as ISO) acquired AIR Worldwide in May 2002, rebranding it under the VeRisk name in 2022. New Jersey is a hub for the insurance industry, which operates within the financial services sector. It’s worth recalling that Dr. Alejandro Marti, founder and CEO of Mitiga Solutions, attended college at Rutgers University in New Jersey and worked for the state as a lead GIS specialist for eight years. VeRisk originates from the Insurance Services Office (ISO), a non-profit started in 1971 by Daniel J. McNamara.

ISO’s quasi-monopoly on insurance data and legal standards puts it in a unique position. From its inception, ISO has been intertwined with the government and likely the military. ISO consolidated over thirty distinct state rating and reporting agencies into a single organization that served nearly three hundred insurance companies. Each state differed in the requirements and formats it imposed on insurance companies for reports and compliance, so ISO was founded to provide those services. Similar to VISA, it operated as a decentralized consortium controlled by the insurance industry. ISO standardized paperwork, acted as an intermediary between the state and insurance companies, and reduced costs to the industry.

By operating at scale, McNamara’s organization had access to vast quantities of data, which increased its statistical power, improved its analysis, and led to the firm wielding considerable influence over the insurance sector. In October 2001, ISO was the organization that asked every state for permission to exclude terrorism damage from insurance contracts. ISO went public in 2009 under the new name VeRisk, which ended up being the largest IPO that year. Today Vanguard (~10%) and BlackRock (~5%) are the largest shareholders, with Fidelity owning about 2.29% as well.

ISO, an early example of the power of big data, is considered one of the first “insurtech” organizations. It became a private company and in 1984 developed the ISOTEL network, a computerized information system. This evolved into ISONET by 1999, which the company states was the first “web-accessible claims tool.” Around this time, after acquiring the American Insurance Services Group and the National Insurance Crime Bureau, ISO reportedly held the largest claims database in the insurance industry.

Source: https://www.verisk.com/50-years/a-timeline-of-our-growth-and-innovation/

ISO updates its information each month to reflect changes in public fire-protection capabilities that affect a specific risk, including changes in fire-station locations, district boundaries, and automatic-aid agreements. It sets Public Protection Classification codes for each fire district in the country based on its evaluation of the district’s fire-suppression capabilities. ISO also pioneered a GIS (Geographic Information System) capable of assessing distance to the ocean and other major bodies of water, windstorm exposure, California brush-fire hazard locations, home-to-work drive distances, crime, and personal and commercial auto rating territories. The system was developed in partnership with San Diego-based Vista Information Systems. At the time, Vista was considered the nation’s leading information provider for real estate; it was bought out by Fidelity in 2001.

In 2002, AIR Worldwide expanded into man-made catastrophe markets, including terrorism modeling. The firm designed its first terrorism loss estimation product with input from veterans of the FBI, CIA, Department of Defense, and Department of Energy. Risk management in the terrorism arena relies on advanced software to analyze social networks and identify potential conspirators.

“One year after the attacks of September 11, 2001, and at the request of our clients, AIR Worldwide released the first commercial catastrophe loss estimation model for terrorism. The model estimates the likelihood and financial impact of insured property and workers’ compensation losses from future terrorist attacks in the United States. Where natural catastrophe models are constructed based on decades of (albeit limited) historical data, AIR’s terrorism model incorporates the judgment of a team of experts—a “red team”— familiar both with the available historical data and current trends. The red team is comprised of counterterrorism specialists who have decades of experience in government organizations such as the FBI, CIA, Department of Defense, and the Department of Energy. With input from the team, AIR has developed a comprehensive database of potential targets, or landmarks, across the United States (which include many of the same buildings found in the Department of Homeland Security database) and a subset of “trophy targets” that carry a higher probability of attack. Team members use a software tool developed by AIR to perform social network analysis and probabilistic plot analysis of the steps involved in a successful terrorist operation.” Source

The insurance product released by AIR Worldwide in collaboration with a “red team” of veterans of the FBI, CIA, Department of Defense, and Department of Energy illustrates the degree to which the company maintains close ties with defense and Homeland Security interests. Four-star General Vincent Brooks joined VeRisk’s board of directors in 2020. In 2003, the Joint Chiefs of Staff tapped General Brooks as spokesperson for the violence in the Middle East. He rose through the ranks over the decades, most recently serving as commander of the US Armed Forces in Korea, before retiring in 2019. Brooks is a member of the Council on Foreign Relations.

Source: https://www.air-worldwide.com/publications/air-currents/2016/11/

Intriguingly, ISO also bought Atmospheric and Environmental Research (AER) in 2008. AER is another Massachusetts-based climate/geophysical software and sensor technology company, a major provider of sensors, satellites, and software to the United States government and industry. Right before the acquisition, AER was lauded as one of the top 25 defense contractors in Massachusetts. Most of its management team reports experience in either the military or other governmental organizations. One of AER’s managers, Jeff Stehr, served as a senior lead scientist for the private intelligence contractor Booz Allen Hamilton for six years.

Some of AER’s major projects include participation in the multi-billion-dollar GOES-R satellite system for NOAA/NASA, various contracts through the Air Force Research Laboratory at Kirtland, and aerosol research conducted for the Office of Naval Research. Their website features public national security work for the Army, Navy, and Air Force. They are also involved with analysis of operational climate data, spacecraft anomalies, satellite mission modeling, and other intelligence concerns. Clients in investment banking, commodities trading (including oil), and agricultural forecasting also use AER’s consulting services.

AER was awarded a $21 million contract in 2019 for:

“comprehensive, next-generation space environment model development, verification and validation, space environment related product development support, and design/prototyping of advanced space weather sensors.”

This work is being carried out through the Air Force Research Laboratory at Kirtland; AER itself is headquartered in Lexington, Massachusetts. The contract came on the heels of a smaller $9 million contract in 2015, also through Kirtland. The team, led by L3Harris, is building the next generation of the GOES-R satellite series.


The company was founded in 1977 by the husband-and-wife duo Cecilia and Dr. Nien Dak Sze. Dr. Sze is a senior researcher at Harvard and a contributor to the Harvard-China Project. Oddly enough, Dr. Sze is also heir to K.S. Sze & Sons, a hundred-year-old luxury jewelry company based in Hong Kong. Sze wrote his doctoral thesis on the atmospheric chemistry of Venus. Cecilia went on to serve as president and CEO of AER for many years after Dr. Sze received an invitation to advise Hong Kong’s environmental protection agency.

Source: https://ks-sze.com/pages/our-story

Sze also counsels the China-United States Exchange Foundation, founded by Chinese shipping billionaire Tung Chee-hwa. Tung served on the Hong Kong Executive Council, the territory’s highest regulatory office, from 1996-2015 and remains heavily involved in Chinese politics. Hong Kong is implementing blockchain and related technologies for social management through organizations called Shanzhai City and the Impact Data Consortium Chain, which Alison and I wrote about previously.

Source: https://www.cyberport.hk/enewsletter/v148/1480009.html#:~:text=In%20Hong%20Kong%2C%20Shanzhai%20City,in%20time%20and%20through%20blockchain.

The China-United States Exchange Foundation (CUSEF) is a registered foreign agent that funds think tanks and journalists to influence the United States political system. The organization collaborates with well-known organizations such as the Carnegie Foundation, the Brookings Institution, the Atlantic Council, and the Center for Strategic and International Studies. Ronnie Chan, a governing board member of CUSEF, is a member of the Berggruen Institute. Alison has written many words about the Berggruen Institute, but the short version is that it is an organization pushing tokenization, artificial intelligence, digital governance, and blockchain identity linked to Universal Basic Income.

Source: https://www.berggruen.org/people/ronnie-chan/
Source: https://web.archive.org/web/20210323174540/https://twitter.com/NBerggruen/status/1374417021744476173

Ross Hoffman, the recently retired Vice President of Research and Development for AER, published numerous papers and articles about weather manipulation in the early 2000s. Hoffman sat on the Committee on the Status of and Future Directions in U.S. Weather Modification Research and Operations, organized by the National Academies’ Board on Atmospheric Sciences and Climate. The committee included another AER researcher, Richard D. Rosen. In 2003, it published a 144-page report, Critical Issues in Weather Modification Research, focused on strengthening the fundamental atmospheric sciences and working towards “reproducible” methods for weather modification, concluding that:

“There still is no convincing scientific proof of the efficacy of intentional weather modification efforts. In some instances, there are strong indications of induced changes, but this evidence has not been subjected to tests of significance and reproducibility. This does not challenge the scientific basis of weather modification concepts. Rather it is the absence of adequate understanding of critical atmospheric processes that, in turn, lead to a failure in producing predictable, detectable, and verifiable results.”

The issue in their minds is not the ability to affect the weather, but rather the ability to create “predictable, detectable, and verifiable results.”

Obvious conflicts of interest exist when a multinational company such as VeRisk owns subsidiaries that are deeply involved in weather modification research. Considering their numerous governmental and military contracts to develop advanced sensor technology, individuals within the company must have different levels of security clearance. This means at least some high-level staff are likely to have access to classified information relating to weather modification research; and let’s not forget that VeRisk claims to have some of the largest private databases in the world.

Source: https://wrenchinthegears.com/wp-content/uploads/2023/08/Critical-Issues-in-Weather-Modification-Research-2003.pdf

Then there are Cecilia and Nien Dak Sze, who hold leadership positions in one of the most advanced satellite, sensor, and data analysis companies in the world while advising the Hong Kong government and the China-United States Exchange Foundation. Digital media feeds consumers narratives designed to guide viewers away from a more complex, nuanced understanding of the international dynamics embodied by this power couple and their relationship to the Berggruen Institute. The unstated and unknown implications of their business arrangements should be given serious consideration.

If we take a brief historical detour back to the origins of mathematical weather forecasting, we find ourselves in World War I. Lewis Fry Richardson, a British Quaker scientist who had served in a variety of positions as a physicist, chemist, and mathematician, refused to go into combat as a conscientious objector. (Note: the Fry family was referenced in Lorraine Davison’s piece on noetic convergence.) Richardson spent the next three years in an ambulance unit in France. In his spare time, he conducted experiments and composed the differential equations describing weather that form the basis of modern numerical weather prediction. At the time, weather prediction was considered an art and not practiced as a science. This was before computers, so all calculations had to be solved by hand.
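To get a feel for what each human “computer” would have been grinding through, here is a minimal finite-difference sketch in Python: a one-dimensional advection of temperature around a ring of grid cells. This is a drastically simplified stand-in for Richardson’s full primitive-equation scheme, with arbitrary grid and wind values.

```python
# Minimal finite-difference sketch in the spirit of Richardson's numerical
# weather prediction: advect a temperature field along a 1-D ring of grid
# cells. A toy stand-in for his primitive equations; all values are arbitrary.
N = 36                 # grid cells around a latitude circle
dx = 1_000_000.0       # cell width in metres
u = 10.0               # constant wind speed, m/s
dt = 3600.0            # time step: one hour (stable since u*dt/dx = 0.036 << 1)

# initial temperature field: a single warm anomaly in cell 0
T = [280.0] * N
T[0] = 290.0

for step in range(240):  # ten simulated days, one step per hour
    # upwind finite difference: dT/dt = -u * (T[i] - T[i-1]) / dx
    # (T[-1] wraps to the last cell, giving a periodic ring)
    T = [T[i] - u * dt / dx * (T[i] - T[i - 1]) for i in range(N)]

print(f"Warmest cell after 10 days: {max(range(N), key=lambda i: T[i])}")
```

The anomaly drifts downwind cell by cell, exactly the sort of bookkeeping Richardson imagined distributing across a hall full of human calculators.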

In his original 1922 publication “Weather Prediction by Numerical Process,” Richardson set forth the following fantasy:

“After so much hard reasoning, may one play with a fantasy? Imagine a large hall like a theatre, except that the circles and galleries go right round through the space usually occupied by the stage. The walls of this chamber are painted to form a map of the globe. The ceiling represents the north polar regions, England is in the gallery, the tropics in the upper circle, Australia on the dress circle and the Antarctic in the pit.

A myriad computers [people who compute] are at work upon the weather of the part of the map where each sits, but each computer attends only to one equation or part of an equation. The work of each region is coordinated by an official of higher rank. Numerous little “night signs” display the instantaneous values so that neighbouring computers can read them. Each number is thus displayed in three adjacent zones so as to maintain communication to the North and South on the map…”

In the 1980s an artist recreated Richardson’s fantasy. The world we are living in today is quite different in some respects, but there is a lot to say about how this image reflects our current circumstances and the drive towards a digitally mediated social evolution.

 

Source: https://www.irishtimes.com/news/science/lewis-fry-richardson-s-remarkable-weather-forecast-factory-1.2473954

Richardson’s work remained relatively unnoticed until around WWII and gained traction in the 1950s when high-performance computing became more widely available. Strikingly, within Weather Prediction by Numerical Process he included mathematical descriptions of several processes that are key to modern climate modeling, including the plant-atmosphere relationship. Richardson understood that longer-term weather predictions depended on observing these types of dynamics. Current work echoes his drive towards predictability and control of the climate. His intentions might have stemmed from care, but the weaponization of knowledge of this sort was inevitable.

Richardson is considered one of the first people to apply quantitative analysis to international conflict, and he also laid foundational work in the mathematics of fractals, which would be quoted in Benoit Mandelbrot’s early work “How Long Is the Coast of Britain? Statistical Self-Similarity and Fractional Dimension.” An ardent pacifist, he reportedly destroyed his unpublished research when he was informed that it would be beneficial to chemical weapons development.

Source: https://en.wikipedia.org/wiki/How_Long_Is_the_Coast_of_Britain%3F_Statistical_Self-Similarity_and_Fractional_Dimension

VeRisk and Web 3

Returning full circle to ecosystem services and natural capital, we arrive at Maplecroft, another subsidiary of VeRisk, located in Bath, England. This company started as an academic team conducting risk analysis for the mining industry. Today Maplecroft directs VeRisk’s ESG (Environmental, Social, Governance) analytics activities.

In a 2021 article, they highlighted biodiversity loss from mining operations and urged companies to join the Taskforce on Nature-related Financial Disclosures, described earlier in this post:

“Operators need to work out a way of measuring biodiversity risk across their portfolios and calculate their exposure to the threats of natural capital depletion in a way that satisfies investors. By participating in the Taskforce for Nature-related Financial Disclosures, known as the TNFDs, they can help shape what the global disclosure benchmark will look like.” 

While VeRisk appears quiet on web3 today, it has published several articles dating back to 2016 on the promise of these technologies. Ivelin Zvezdov, Assistant Vice President of the Research and Modeling Group on the Extreme Event Solutions team, wrote about blockchain’s potential in 2019:

“I suggested that a system can be designed where quantitative triggers and indices produced by a catastrophe model and a real-time catastrophe tracking software product will someday control the acceptance, rejection, modification, re/negotiation, and binding of smart contracts.”

Taking a deeper dive into the Lemonade Crypto Coalition or the ecosystem services markets, one can see how Zvezdov’s assessment of blockchain is playing out with respect to social and ecological projects. It is likely that VeRisk is already experimenting with blockchain, considering its forays into related technologies such as the Internet of Things and machine learning. As of yet, however, I have not found specific evidence. The Mitiga Solutions deal with the Danish Red Cross involves the blockchain-based securitization of the parametric insurance contract, but not the use of blockchain smart contracts for the actual event triggers.
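Zvezdov’s scenario, in which model outputs control the acceptance, binding, and settlement of contracts, can be sketched as a simple state machine. The following Python is illustrative pseudocode rather than an actual on-chain language, and all states and thresholds are invented:

```python
from enum import Enum, auto

class ContractState(Enum):
    PROPOSED = auto()
    BOUND = auto()
    REJECTED = auto()
    PAID = auto()

class ModelGatedContract:
    """Toy state machine for Zvezdov's scenario: outputs of a catastrophe
    model and a real-time tracking feed control the lifecycle of a smart
    contract. States and thresholds are invented for illustration."""

    def __init__(self, max_acceptable_annual_loss: float, payout_trigger: float):
        self.max_loss = max_acceptable_annual_loss
        self.payout_trigger = payout_trigger
        self.state = ContractState.PROPOSED

    def underwrite(self, modeled_average_annual_loss: float) -> None:
        # the catastrophe model's output decides acceptance vs. rejection
        if self.state is ContractState.PROPOSED:
            self.state = (ContractState.BOUND
                          if modeled_average_annual_loss <= self.max_loss
                          else ContractState.REJECTED)

    def on_tracking_update(self, live_index: float) -> None:
        # a real-time catastrophe tracking feed settles the bound contract
        if self.state is ContractState.BOUND and live_index >= self.payout_trigger:
            self.state = ContractState.PAID

contract = ModelGatedContract(max_acceptable_annual_loss=2e6, payout_trigger=7.0)
contract.underwrite(modeled_average_annual_loss=1.5e6)  # model says: bind it
contract.on_tracking_update(live_index=7.4)             # live feed says: pay out
print(contract.state)  # ContractState.PAID
```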

Source: https://www.verisk.com/newsroom/verisk-maplecroft-named-best-specialist-esg-ratings-provider-by-esg-investing/

Big Bets on Bio-Digital Convergence

There’s a race on to monitor, model, and steer life on Earth – to contain all of it within a networked planetary computer. Biodiversity metrics are being structured to support budding ecosystem services markets as multinational insurance giants like AXA throw their financial support behind researchers like Colin Prentice. Prentice and his colleagues are busy framing out new fields of inquiry to expand potential data capture, such as linking vegetation coverage to climate modeling. AXA’s parallel investments in Internet of Bio-Nano Things research support my hypothesis that life at every scale, from single-cell organisms to mega-fauna and towering trees, is at risk of swapping autonomy for remote digital mediation.

VeRisk is another major player with entanglements across the insurance, disaster management, and military intelligence sectors. Looking back at the origins of mathematical weather prediction, we see how the groundwork was laid. Decades later, we have a cacophony of complex parametric insurance schemes overseen by compromised players. The same organizations predicting, and betting on, natural disasters are also involved with weather modification research. It’s definitely a tangled web. The massive datasets VeRisk has consolidated mean the company is likely to wield an outsize influence over how the modern game of cybernetic influence proceeds.

Web3, financial innovation, and digital governance have combined to create a global socio-technical architecture with the potential to gamify and steer social outcomes, starting with disaster response. While they may sound good to the casual observer, digital community currencies deployed by organizations such as Grassroots Economics and Mitiga Solutions are in fact tools of subtle and not-so-subtle social engineering. Pervasive sensing technologies, from satellites to nanotechnology, capture data, including participatory surveillance from self-reported citizen surveys, to feed simulations run on supercomputers that inform where social impact finance will be directed.

The implications of bio-digital convergence as it pertains to humanitarian aid delivery, disaster response, and insurance are not well understood by the communities swept up in these pilot programs. Social commentary on the topic is not balanced, leaning heavily on the side of affirming benefits. There has thus far been little discussion about what problems may arise when social systems are linked to game mechanics, ubiquitous computing, and financial bets, and then automated to achieve some arbitrary “human flourishing” optimization metric. Much more informed public discussion is needed before these initial pilots, with their multinational insurance backers and collaborators in the finance and defense sectors, are brought to scale.

3 thoughts on “Guest Post by Leo Saraceno – The Game of Risk: How Insurance Behemoths, Disaster Recovery, and State Intelligence Laid the Foundations for the Planetary Computer”

  1. Amy Harlib says:

    It all boils down to TOTAL SLAVERY! Globalist technocrat predator control freak megalomaniacs disguise their evil intent with Orwellian newspeak and doublespeak.

    How to fight back against this TOTAL SLAVERY!

    RESIST! DO NOT COMPLY! DITCH THE DAMNED ‘SMART’ PHONES AND THE DAMNED QR CODES AND GO BACK TO LANDLINES OR FLIP PHONES AND USE CASH AS MUCH AS POSSIBLE! INSIST ON CASH! CBDC IS TOTAL SLAVERY!

  2. Brad says:

    One has to acknowledge that the entire working of this planet is rapidly being reduced to interconnected data and a giant simulation under the guise of saving the planet, while also generating profit for fun and ultimate control of everything that can be controlled. The planetary computer “UniComp” from the novel This Perfect Day, if not an actuality, exists in separate locations, busy finalizing its raison d’être.

  3. washington sean says:

    Excellent read Leo! Much appreciated for the time and work you put into composing this piece!

    I was particularly interested in your perspective on weather modification as it pertains to a new line of insurance products and asset classes – despite new and emerging tech (no doubt with hidden and nefarious intentions), we can be certain that fraudsters are still gonna fraud.

    Lots of new names for me to keep in mind as I trudge forward – and some familiar ones as well – such as Eric Schmidt, who seems to pop up all over the place the more I trudge down this investigative path. For example, in my work tracing the Lightstage from its inception at USC to its modern application via monthly subscription, I found a quote from Schmidt on OTOY’s website:

    “Six years ago, I predicted that 90% of computing would eventually reside in the web-based cloud. OTOY has created a remarkable technology which moves that last 10% – high-end graphics processing – to the cloud. This is a disruptive and important achievement. It marks the tipping point where the web replaces the PC as the dominant computing platform.”

    https://home.otoy.com/advisory-board/

    Also, Professor Akan’s work which you highlighted, focuses in part on “an artificial gut-brain axis as a testbed for the development of novel ICT-based treatment techniques for psychological and intestinal diseases.”

    This reminded me of Naveen Jain and his work with VIOME — which, in 2021, received FDA approval and validated: “Viome’s mRNA analysis technology and state of the art AI platform that powers Viome’s at-home Health Intelligence Test designed to offer consumers deeper health insights about their own gut microbiome health, cellular health, immune system health, stress response, and even biological aging. Viome’s AI platform analyzes an individual’s unique microbial and human gene expressions to provide them with personalized food recommendations and tailor-made, precision supplements designed to address the root cause of low-grade inflammation and improve these health metrics over time.”

    Great work Leo. Looking forward to your next piece!
