Features

What you can do with UQ[py]Lab

Bayesian inference for model calibration and inverse problems

Bayesian inference is a powerful tool for probabilistic model calibration and inverse problems. UQ[py]Lab offers a flexible and intuitive way to set up and solve Bayesian inverse problems.

  • Intuitive definition of prior knowledge, forward model and data
  • State-of-the-art Markov Chain Monte Carlo (MCMC) algorithms
  • Customizable discrepancy between model and measurements
  • Support for user-specified custom likelihood
  • Support for multiple forward models and multiple discrepancy models (joint inversion)
  • Fully integrated with UQLab (e.g. surrogate models, complex priors, etc.)
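
As a rough illustration of what an MCMC-based calibration does under the hood (this is plain NumPy, not the UQ[py]Lab API), the sketch below calibrates a single parameter of a made-up linear forward model from noisy observations with a random-walk Metropolis sampler; the prior, data, and proposal width are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: forward model y = theta * x, noisy observations of theta = 2
x_obs = np.linspace(0.0, 1.0, 10)
y_obs = 2.0 * x_obs + rng.normal(0.0, 0.1, size=x_obs.size)

def log_posterior(theta, sigma=0.1):
    # Gaussian prior N(0, 10^2) on theta plus Gaussian likelihood of the residuals
    log_prior = -0.5 * (theta / 10.0) ** 2
    residual = y_obs - theta * x_obs
    log_lik = -0.5 * np.sum((residual / sigma) ** 2)
    return log_prior + log_lik

# Random-walk Metropolis sampler
theta, chain = 0.0, []
log_p = log_posterior(theta)
for _ in range(5000):
    proposal = theta + rng.normal(0.0, 0.2)
    log_p_new = log_posterior(proposal)
    if np.log(rng.uniform()) < log_p_new - log_p:   # accept/reject step
        theta, log_p = proposal, log_p_new
    chain.append(theta)

print("Posterior mean of theta:", np.mean(chain[1000:]))  # discard burn-in
```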

Polynomial Chaos-Kriging

Polynomial Chaos-Kriging (PC-Kriging) combines the global approximation behavior of polynomial chaos expansions with the local accuracy of Kriging to provide a highly accurate surrogate model at a low computational cost.

  • Support for sequential and optimal construction of PC-Kriging
  • Full control on both levels of approximation: polynomial chaos expansions and Kriging directly use the corresponding dedicated UQLab modules
  • Support for sparse, adaptive and arbitrary polynomial chaos expansions
  • Gradient-based, global, and hybrid optimization methods for Kriging
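
For intuition only, here is a minimal plain-Python sketch of the PC-Kriging idea on a one-dimensional toy function: a low-degree polynomial expansion captures the global trend and a Gaussian process (via scikit-learn) models the residuals locally. The basis, kernel, and degree are illustrative assumptions, not UQ[py]Lab settings.

```python
import numpy as np
from numpy.polynomial import legendre
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(1)

# Toy experimental design on [-1, 1] and a test function
X = rng.uniform(-1.0, 1.0, size=20)
y = X * np.sin(5.0 * X)

# Global part: least-squares fit of a low-degree Legendre expansion
degree = 4
Psi = legendre.legvander(X, degree)            # regression matrix of basis polynomials
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)
trend = Psi @ coeffs

# Local part: Gaussian process (Kriging) on the residuals
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X.reshape(-1, 1), y - trend)

# Prediction: polynomial trend + GP correction
X_new = np.linspace(-1.0, 1.0, 5)
y_pred = legendre.legvander(X_new, degree) @ coeffs + gp.predict(X_new.reshape(-1, 1))
print(y_pred)
```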

Kriging (Gaussian process modeling)

Gaussian process modeling is a flexible and robust technique to build fast surrogate models based on small experimental designs.

  • Simple, ordinary, and universal Kriging
  • Highly customizable trend and correlation functions
  • Maximum-likelihood- and cross-validation-based hyperparameter estimation
  • Gradient-based, global, and hybrid optimization methods
  • Interpolation (noise-free response) and regression (noisy response) modes
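
For readers new to Gaussian process surrogates, the snippet below shows the basic workflow with scikit-learn rather than the UQ[py]Lab interface: a GP is trained on a small, noisy experimental design (regression mode) and returns both a mean prediction and a predictive standard deviation. The kernel and noise level are assumptions made for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

# Small experimental design with noisy responses (regression mode)
X = rng.uniform(0.0, 10.0, size=(15, 1))
y = np.sin(X).ravel() + rng.normal(0.0, 0.05, size=15)

# Squared-exponential correlation plus a noise term;
# dropping the WhiteKernel would give noise-free interpolation instead
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.05**2)
gp = GaussianProcessRegressor(kernel=kernel).fit(X, y)

# Surrogate predictions come with a built-in uncertainty estimate
X_new = np.linspace(0.0, 10.0, 5).reshape(-1, 1)
mean, std = gp.predict(X_new, return_std=True)
print(np.column_stack([mean, std]))
```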

Reliability analysis (rare event estimation)

When the performance of a system is affected by uncertainties in its characteristics and/or its environment, reliability can be assessed by computing probabilities of failure. UQLab offers state-of-the-art reliability algorithms and a powerful modular framework for active learning reliability.

  • FORM/SORM approximation methods
  • Sampling methods (Monte Carlo, importance sampling, subset simulation)
  • Modular framework to build custom active learning solution schemes
  • Kriging-based adaptive methods (AK-MCS, APCK-MCS)
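
The simplest of the listed approaches, crude Monte Carlo estimation of a failure probability, can be sketched in a few lines of NumPy; the limit-state function, input distribution, and sample size below are toy assumptions, with failure defined as g(x) <= 0.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical limit-state function: failure when g(x) <= 0
def g(x):
    return 6.0 - x[:, 0] - x[:, 1]

# Sample the (assumed) standard normal inputs and count failures
N = 10**6
X = rng.standard_normal(size=(N, 2))
failures = g(X) <= 0.0

pf = failures.mean()
cov = np.sqrt((1.0 - pf) / (pf * N))   # coefficient of variation of the estimator
print(f"Pf ≈ {pf:.3e} (CoV ≈ {cov:.2f})")
```

Because rare failures require very large sample sizes, methods such as importance sampling, subset simulation, or the adaptive Kriging-based schemes listed above aim to reach the same accuracy with far fewer model evaluations.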

Advanced probabilistic modelling

Probabilistic modelling lies at the core of uncertainty quantification. UQ[py]Lab provides full support for complex probabilistic models based on marginal and copula representations, as well as a powerful data-driven inference module.

  • Support for many common marginal distributions
  • Support for standard copulas, as well as complex vine-copula constructions
  • Fully automatic inference of both marginal distributions and copula
  • Advanced sampling schemes, including Latin hypercube sampling and quasi-random sequences
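
As a plain-Python illustration of the marginal-plus-copula viewpoint (not the UQ[py]Lab input module), the sketch below draws a Latin hypercube sample with SciPy, imposes dependence through a Gaussian copula, and maps the result to two assumed physical marginals.

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

# Latin hypercube sample of uniforms on [0, 1]^2
u = qmc.LatinHypercube(d=2, seed=4).random(n=1000)

# Impose dependence with a Gaussian copula of correlation 0.7:
# map uniforms to standard normals, correlate them, map back to uniforms
z = stats.norm.ppf(u)
L = np.linalg.cholesky([[1.0, 0.7], [0.7, 1.0]])
u_dep = stats.norm.cdf(z @ L.T)

# Map to the (assumed) physical marginals: lognormal and Gumbel
x1 = stats.lognorm.ppf(u_dep[:, 0], s=0.25, scale=np.exp(1.0))
x2 = stats.gumbel_r.ppf(u_dep[:, 1], loc=5.0, scale=0.8)

print(np.corrcoef(x1, x2)[0, 1])   # induced correlation in physical space
```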

State-of-the-art sensitivity analysis

Sensitivity analysis can identify the driving factors that most influence the response of a model.

  • Approximation- and simulation-based methods
  • Advanced variance decomposition techniques (e.g. Sobol' indices)
  • Support for metamodel-based sensitivity analysis
  • Support for sensitivity analysis of dependent input variables
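
To make the variance-decomposition idea concrete, here is a minimal NumPy estimate of first-order Sobol' indices for a toy additive model using a pick-freeze (Saltelli-style) Monte Carlo scheme; the model, input distributions, and sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy model: Y = X1 + 0.5 * X2**2 with independent standard normal inputs
def model(x):
    return x[:, 0] + 0.5 * x[:, 1] ** 2

N, d = 100_000, 2
A = rng.standard_normal((N, d))
B = rng.standard_normal((N, d))

yA = model(A)
var_y = yA.var()

# First-order index of X_i: freeze column i of A, redraw the other columns from B
for i in range(d):
    AB = B.copy()
    AB[:, i] = A[:, i]
    S_i = np.mean(yA * (model(AB) - model(B))) / var_y   # pick-freeze estimator
    print(f"S_{i + 1} ≈ {S_i:.2f}")
```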

Polynomial Chaos Expansions

Polynomial chaos expansions (PCE) are among the most powerful and versatile surrogate models available today.

  • Non-intrusive PCE facilities
  • State-of-the-art sparse regression solvers, including least angle regression, subspace pursuit, Bayesian compressive sensing, and more
  • Support for classical polynomials, as well as polynomials orthogonal to user-defined input distributions
  • Support for fully data-driven, arbitrary PCE
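
As a concrete, non-UQ[py]Lab illustration of a non-intrusive PCE, the sketch below fits a probabilists' Hermite expansion to a toy model of a standard normal input by least-squares regression; the model, truncation degree, and design size are made up for the example.

```python
import numpy as np
from numpy.polynomial import hermite_e

rng = np.random.default_rng(6)

# Experimental design: standard normal input, evaluated on a toy model
def model(x):
    return np.sin(x) + 0.1 * x**2

X = rng.standard_normal(200)
y = model(X)

# Regression matrix of probabilists' Hermite polynomials (orthogonal w.r.t. N(0, 1))
degree = 6
Psi = hermite_e.hermevander(X, degree)

# Non-intrusive PCE: coefficients estimated by ordinary least squares
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# The surrogate is cheap to evaluate, and its mean is simply the constant-term coefficient
X_new = np.array([-1.0, 0.0, 1.0])
print(hermite_e.hermevander(X_new, degree) @ coeffs)
print("PCE mean estimate:", coeffs[0])
```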