- Nobel Prize-winning expert in climate prediction (IPCC)
- Google’s former head of satellite data systems and operations
- Co-developer of Princeton Ocean Model
- NSF head of atmospheric science and world expert on hurricanes and cyclones
- Developer of the short-term flood prediction system for the Port Authority of New York & New Jersey
- CEO of Zip2, Internet GIS pioneer
- SAP’s VP of IoT and Cloud Marketing
- Hydroscientist who completed the NOAA-led post-Harvey Houston flooding assessment
- Principal investigator (PI) of an NSF project to containerize weather prediction
Leaders from U.S. agencies, including the National Oceanic and Atmospheric Administration (NOAA), the National Center for Atmospheric Research (NCAR), the National Science Foundation (NSF), NASA, the CIA, the Department of the Treasury and the Department of State
Renowned advisors, including the former Chief Climate Envoy of the United States, Deputy Secretary of the U.S. Treasury, Chief Data Officer of Goldman Sachs, President of The Hartford’s property and casualty companies, Head of Analytics for Google Search, developer of the Palantir data pipeline, and former senior executives and domain experts at Bridgewater and Verisk Insurance Solutions
Government agencies, asset owners, planners, developers and investors increasingly recognize the need to incorporate climate data into risk modeling for specific assets. Catastrophe risk modeling most often relies on models that project the future from past statistics, on the assumption that the climate is not changing. That assumption fails in a dynamic environment continually reshaped by changes to the built and natural landscape. Similarly, climate panels at the international, national, state and metropolitan levels use inconsistent methodologies, validation approaches and metrics, making it nearly impossible for the private sector to use their output without extensive custom work.
Today’s decision-makers need data that reflects ongoing change and provides accurate predictions. With the right information, they can make more informed decisions in areas such as building placement and design, insurance, zoning and building codes. The right decisions improve safety and reduce risks to critical infrastructure.
Jupiter’s founders believed that by incorporating every relevant factor into an integrated, dynamic model, they could deliver a risk-focused solution with accurate, actionable information, and that this approach could be designed to scale efficiently in the cloud.
In early 2017, Jupiter was born. Pilots are underway for flooding on the U.S. Atlantic coast, and the company will provide global coverage with multiple services by 2020. Jupiter offices are now open in San Mateo, CA, Boulder, CO and the New York City metropolitan area.
Jupiter’s ongoing work is funded by venture capital firms DCVC (Data Collective) and Ignition Partners. Together with university partners, the company is also leading the containerization work in the Big Weather Web project funded by the National Science Foundation (NSF). Investments in the company currently total approximately $10 million.
Data Collective (DCVC) is a venture capital fund that backs entrepreneurs applying deep tech to transform giant industries. DCVC and its principals have supported brilliant people changing global-scale businesses for over twenty years, helping create tens of billions of dollars of wealth while also making the world a markedly better place. DCVC incubated and then led the seed for Jupiter.
Ignition Partners helps entrepreneurs build innovative and category-defining businesses of lasting value. The company focuses on AI, machine learning and cloud application services for vertical industry reinvention and digital transformation. Ignition Partners invests in early-stage business-to-business software companies, and led Jupiter’s Series A funding.