Yifan Cheng and 7 more

The Arctic hydrological system, comprising permafrost, snow, glaciers, frozen soils, and inland river systems, is interconnected and experiencing rapid change. Permafrost degradation, trends toward earlier snowmelt, a lengthening snow-free season, soil ice melt, and warming frozen soils all challenge hydrologic simulation under climate change in the Arctic. In this study, we provide an improved representation of the hydrologic cycle across a regional Arctic domain using a generalizable optimization methodology and workflow for the community. We applied the Community Terrestrial Systems Model (CTSM) across the US state of Alaska and the Yukon River Basin at 4-km spatial resolution. We highlight several potentially useful high-resolution CTSM configuration changes. Additionally, we performed a multi-objective optimization using snow and river flow metrics within an adaptive surrogate-based model optimization scheme. Four representative river basins across our study domain were selected for optimization based on streamflow observations and snow water equivalent measurements at ten SNOTEL sites. Fourteen sensitive parameters were identified for optimization, half of which are not directly related to hydrology or snow processes. Across fifteen out-of-sample river basins, thirteen showed improved flow simulations after optimization, and the median Kling-Gupta Efficiency of daily flow increased from 0.40 to 0.63. In addition, we adapted the Shapley decomposition to disentangle each parameter’s contribution to changes in streamflow performance, with the seven non-hydrological parameters contributing non-negligibly to the performance gains. The snow simulation showed limited improvement, likely because snow simulation is influenced more by meteorological forcing than by model parameter choices.
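
Two of the quantities named above have standard, easily reproduced definitions. The sketch below (Python, illustrative only and not the authors' code) shows the Kling-Gupta Efficiency used to score daily streamflow, plus an exact Shapley attribution of a skill change to individual parameters; the `performance` callable and all names are assumptions introduced here for illustration.

```python
# Illustrative sketch (not the study's code): Kling-Gupta Efficiency for daily
# streamflow and an exact Shapley decomposition of a performance change across
# parameters. The `performance` callable is a hypothetical user-supplied function.
from itertools import combinations
from math import factorial

import numpy as np


def kge(sim, obs):
    """Kling-Gupta Efficiency (Gupta et al., 2009).

    KGE = 1 - sqrt((r - 1)^2 + (alpha - 1)^2 + (beta - 1)^2), where r is the
    linear correlation, alpha the ratio of standard deviations, and beta the
    ratio of means between simulated and observed flow.
    """
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)


def shapley_contributions(params, performance):
    """Exact Shapley attribution of a skill change to individual parameters.

    `params` is a list of parameter names; `performance(subset)` must return
    the skill score (e.g. KGE) obtained when only the parameters in `subset`
    take their optimized values and the rest stay at defaults. The returned
    contributions sum to performance(all params) - performance(empty set).
    """
    n = len(params)
    contributions = {}
    for p in params:
        others = [q for q in params if q != p]
        total = 0.0
        for k in range(n):
            for subset in combinations(others, k):
                # Standard Shapley weight |S|! (n - |S| - 1)! / n!
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (performance(set(subset) | {p})
                                   - performance(set(subset)))
        contributions[p] = total
    return contributions
```

An exact evaluation over all fourteen parameters would require 2^14 model runs, so in practice subsets are sampled or grouped; the snippet is only meant to make the attribution logic concrete.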

Joseph Hamman and 1 more

Climate data from Earth System Models are increasingly being used to study the impacts of climate change on a broad range of biogeophysical (forest fires, fisheries, etc.) and human systems (reservoir operations, urban heat waves, etc.). Before these data can be used to study many of these systems, post-processing steps commonly referred to as bias correction and statistical downscaling must be performed. “Bias correction” is used to correct persistent biases in climate model output, and “statistical downscaling” is used to increase the spatiotemporal resolution of the model output (e.g., from 1-degree to 1/16th-degree grid boxes). For our purposes, we’ll refer to both parts as “downscaling”. In the past few decades, the applications community has developed a plethora of downscaling methods. Many of these methods are ad hoc collections of post-processing routines, while others target very specific applications. The proliferation of downscaling methods has left the climate applications community with an overwhelming body of research to sort through, without much in the way of synthesis to guide method selection or applicability. Motivated by the pressing socio-environmental challenges of climate change, and with lessons learned from previous downscaling efforts in mind, we have begun working on a community-centered open framework for climate downscaling: scikit-downscale. We believe that the community will benefit from a well-designed open-source downscaling toolbox with standard interfaces, alongside a repository of benchmark data to test and evaluate new and existing downscaling methods. In this notebook, we provide an overview of the scikit-downscale project, detailing how it can be used to downscale a range of surface climate variables such as air temperature and precipitation. We also highlight how the scikit-downscale framework is being used to compare existing methods and how it can be extended to support the development of new downscaling methods.
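
To make the “standard interfaces” point concrete, here is a minimal sketch, in the scikit-learn fit/predict style that scikit-downscale builds on, of an empirical quantile-mapping bias corrector for a single grid point. The class name, arguments, and usage below are illustrative assumptions, not the library’s actual API.

```python
# Minimal sketch of a scikit-learn style downscaling estimator using empirical
# quantile mapping as the bias-correction step. Names are illustrative only and
# do not reflect the scikit-downscale API.
import numpy as np
from sklearn.base import BaseEstimator, RegressorMixin


class QuantileMappingCorrector(BaseEstimator, RegressorMixin):
    """Map modeled values onto the observed distribution at a single grid point."""

    def __init__(self, n_quantiles=100):
        self.n_quantiles = n_quantiles

    def fit(self, X, y):
        # X: modeled values over the training period, shape (n_samples, 1)
        # y: co-located observations over the same period, shape (n_samples,)
        q = np.linspace(0.0, 1.0, self.n_quantiles)
        self.model_quantiles_ = np.quantile(np.ravel(X), q)
        self.obs_quantiles_ = np.quantile(np.ravel(y), q)
        return self

    def predict(self, X):
        # Look up each modeled value's position in the modeled CDF and return
        # the observed value at the same quantile.
        return np.interp(np.ravel(X), self.model_quantiles_, self.obs_quantiles_)


# Hypothetical usage with daily temperature at one grid cell:
# corrector = QuantileMappingCorrector().fit(gcm_hist[:, None], obs_hist)
# corrected_future = corrector.predict(gcm_future[:, None])
```

Because every method exposes the same fit/predict surface, candidate downscalers become interchangeable in a benchmarking pipeline: the same training and evaluation code can loop over methods and score each against a common set of benchmark data.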