What you can do with UQ[py]Lab

Bayesian inference for model calibration and inverse problems

Bayesian inference is a powerful tool for probabilistic model calibration and inverse problems. UQ[py]Lab offers a flexible and intuitive way to set up and solve Bayesian inverse problems.

  • Intuitive definition of prior knowledge, forward model and data
  • State-of-the-art Markov Chain Monte Carlo (MCMC) algorithms
  • Customizable discrepancy between model and measurements
  • Support for user-specified custom likelihood
  • Support for multiple forward models and multiple discrepancy models (joint inversion)
  • Fully integrated with UQLab (e.g. surrogate models, complex priors, etc.)
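As a plain-Python illustration of the underlying idea — not the UQ[py]Lab API — the sketch below calibrates the slope of a toy linear model with a random-walk Metropolis-Hastings sampler, assuming synthetic data, a Gaussian prior, and a Gaussian discrepancy:

```python
import math
import random

random.seed(42)

# Synthetic data: y = theta_true * x + Gaussian noise
theta_true, sigma = 2.0, 0.5
xs = [0.5 * i for i in range(1, 11)]
ys = [theta_true * x + random.gauss(0.0, sigma) for x in xs]

def log_posterior(theta):
    """Gaussian prior N(0, 10^2) plus Gaussian likelihood, up to a constant."""
    log_prior = -theta ** 2 / (2.0 * 10.0 ** 2)
    log_lik = sum(-(y - theta * x) ** 2 / (2.0 * sigma ** 2)
                  for x, y in zip(xs, ys))
    return log_prior + log_lik

# Random-walk Metropolis-Hastings
theta = 0.0
lp = log_posterior(theta)
chain = []
for _ in range(5000):
    prop = theta + random.gauss(0.0, 0.2)                  # symmetric proposal
    lp_prop = log_posterior(prop)
    if random.random() < math.exp(min(0.0, lp_prop - lp)):  # accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)

post_mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
```

In practice the "forward model" would be an expensive simulator (possibly replaced by a surrogate), and the sampler one of the more advanced MCMC schemes listed above.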

Reliability-based design optimization

The reliability-based design optimization (RBDO) module offers a set of state-of-the-art algorithms to solve various types of optimization problems under probabilistic constraints. These include:

  • Reliability index approach (RIA)
  • Performance measure approach (PMA)
  • Single loop approach (SLA)
  • Sequential optimization and reliability assessment (SORA)

On top of these well-known algorithms, the modular design of the RBDO module allows the user to set up customized solution schemes by combining all of the reliability, surrogate modeling, and optimization techniques available in UQLab.
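The essence of RBDO — minimizing a cost under a probabilistic constraint — can be sketched with a generic double-loop scheme (grid search outside, Monte Carlo reliability inside). This is an illustrative toy with a made-up limit state, not one of the module's algorithms:

```python
import random

random.seed(1)

# Random load X ~ N(0,1); common random numbers reused across all designs
X = [random.gauss(0.0, 1.0) for _ in range(20_000)]

def failure_prob(d):
    """Monte Carlo estimate of P[g < 0] for the limit state g = d - (3 + X)."""
    fails = sum(1 for x in X if d - (3.0 + x) < 0.0)
    return fails / len(X)

# Double-loop RBDO: minimize cost(d) = d subject to P_f(d) <= 5%
target_pf = 0.05
candidates = [3.0 + 0.05 * i for i in range(61)]    # designs d in [3.0, 6.0]
feasible = [d for d in candidates if failure_prob(d) <= target_pf]
d_opt = min(feasible)  # cost grows with d, so the smallest feasible d wins
```

The analytic optimum here is d = 3 + 1.645 ≈ 4.65; the algorithms listed above avoid this brute-force nesting by approximating or reformulating the inner reliability problem.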

Support vector machines

Support vector machines (SVM) are machine-learning techniques for building predictive models from data. In the context of uncertainty quantification, SVM for regression (SVR) can be used as surrogate models of complex simulators based on designs of computer experiments, while SVM for classification (SVC) can be used in the context of reliability analysis.

  • L1-SVR and L2-SVR formulation for regression
  • Soft-margin classification
  • Anisotropic and user-defined kernels
  • Leave-one-out error and span approximations
  • Multiple optimization algorithms (grid search, BFGS, cross-entropy, CMA-ES, etc.)
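As a conceptual illustration of soft-margin classification — a generic Pegasos-style primal subgradient solver on toy 2-D data, not UQ[py]Lab's SVC implementation:

```python
import random

random.seed(0)

# Toy 2-D data: class +1 clustered near (2, 2), class -1 near (-2, -2)
data = []
for _ in range(50):
    data.append((( 2 + random.gauss(0, 0.5),  2 + random.gauss(0, 0.5)), +1))
    data.append(((-2 + random.gauss(0, 0.5), -2 + random.gauss(0, 0.5)), -1))

# Primal soft-margin SVM (no bias term; the toy data is centered about the
# origin): minimize lam/2 * ||w||^2 + mean hinge loss via subgradient steps.
lam = 0.01
w = [0.0, 0.0]
for t in range(1, 2001):
    eta = 1.0 / (lam * t)
    x, y = random.choice(data)
    margin = y * (w[0] * x[0] + w[1] * x[1])
    w = [wi * (1.0 - eta * lam) for wi in w]          # L2 shrinkage
    if margin < 1.0:                                   # hinge subgradient
        w = [wi + eta * y * xi for wi, xi in zip(w, x)]

accuracy = sum(1 for x, y in data
               if y * (w[0] * x[0] + w[1] * x[1]) > 0) / len(data)
```

Kernelized variants replace the inner product with a kernel evaluation, which is where the anisotropic and user-defined kernels listed above come in.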

Polynomial Chaos-Kriging

Polynomial Chaos-Kriging (PC-Kriging) combines the global approximation behavior of polynomial chaos expansions with the local accuracy of Kriging to provide a highly accurate surrogate model at low computational cost.

  • Support for sequential and optimal construction of PC-Kriging
  • Full control over both levels of approximation: the polynomial chaos expansion and Kriging components directly use the corresponding dedicated UQLab modules
  • Support for sparse, adaptive and arbitrary polynomial chaos expansions
  • Gradient-based, global, and hybrid optimization methods for Kriging

Canonical low-rank tensor polynomial approximations

Canonical low-rank approximations (LRA) are a powerful alternative to polynomial chaos expansions that are particularly effective in high dimensions.

  • Low-rank basis construction based on orthonormal polynomials
  • Adaptive identification of maximum rank and polynomial degree via cross-validation
  • Alternate least-square calculation of basis elements and coefficients
  • Polynomials orthogonal to arbitrary distributions (via Stieltjes construction)

Kriging (Gaussian process modeling)

Gaussian process modeling is a flexible and robust technique to build fast surrogate models based on small experimental designs.

  • Simple, ordinary, and universal Kriging
  • Highly customizable trend and correlation functions
  • Maximum-likelihood- and cross-validation-based hyperparameter estimation
  • Gradient-based, global, and hybrid optimization methods
  • Interpolation (noise-free response) and regression (noisy response) modes
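The prediction step of simple Kriging — a Gaussian process with a known zero mean — can be illustrated in a few lines of plain Python; the kernel, length scale, and data below are illustrative choices, not UQ[py]Lab defaults:

```python
import math

def solve(A, b):
    """Solve A a = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    a = [0.0] * n
    for r in range(n - 1, -1, -1):
        a[r] = (M[r][n] - sum(M[r][c] * a[c] for c in range(r + 1, n))) / M[r][r]
    return a

# Noise-free training design: f(x) = sin(x) observed at 5 points
X = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [math.sin(x) for x in X]

def k(a, b, theta=1.0):
    """Squared-exponential (Gaussian) correlation function."""
    return math.exp(-((a - b) / theta) ** 2)

# Simple Kriging with zero mean: predict(x*) = k(x*)^T K^{-1} y
# (a tiny nugget keeps the correlation matrix numerically invertible)
K = [[k(xi, xj) + 1e-10 * (i == j) for j, xj in enumerate(X)]
     for i, xi in enumerate(X)]
alpha = solve(K, y)

def predict(x_star):
    return sum(a * k(x_star, xi) for a, xi in zip(alpha, X))
```

In interpolation mode the predictor passes exactly through the observations; the regression mode listed above instead adds a noise variance on the diagonal of K.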

Reliability analysis (rare event estimation)

When the performance of a system is affected by uncertainties in its characteristics and/or its environment, reliability can be assessed by computing probabilities of failure.
UQLab offers state-of-the-art reliability algorithms and a powerful modular framework
for active learning reliability.

  • FORM/SORM approximation methods
  • Sampling methods (Monte Carlo, importance sampling, subset simulation)
  • Modular framework to build custom active learning solution schemes
  • Kriging-based adaptive methods (AK-MCS, APCK-MCS)
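The baseline behind all the sampling methods — crude Monte Carlo estimation of a failure probability — can be sketched as follows, using a toy resistance-load limit state (not the UQ[py]Lab API):

```python
import random
from statistics import NormalDist

random.seed(7)

def g(r, s):
    """Limit state: resistance minus load; failure when g < 0."""
    return r - s

# Resistance R ~ N(5, 1), load S ~ N(2, 1)
N = 100_000
fails = sum(1 for _ in range(N)
            if g(random.gauss(5.0, 1.0), random.gauss(2.0, 1.0)) < 0.0)
pf = fails / N                     # exact value: Phi(-3/sqrt(2)) ~ 0.017
beta = -NormalDist().inv_cdf(pf)   # generalized reliability index
```

For rare events (pf well below 1e-3) this brute-force estimator becomes prohibitively expensive, which motivates importance sampling, subset simulation, and the active learning schemes listed above.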

Advanced probabilistic modelling

Probabilistic modelling lies at the core of uncertainty quantification. UQ[py]Lab offers full support for complex probabilistic models based on marginal and copula representations, together with a powerful data-driven inference module.

  • Support for many common marginal distributions
  • Support for standard copulas, as well as complex vine-copula constructions
  • Fully automatic inference of both marginal distributions and copula
  • Advanced sampling schemes, including Latin hypercube sampling and quasi-random sequences
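As an illustration of one such scheme, here is a minimal Latin hypercube sampler in plain Python (one point per equal-probability stratum in every dimension, with independently shuffled strata across dimensions):

```python
import random

random.seed(3)

def latin_hypercube(n, dim):
    """n points in [0, 1)^dim with exactly one point per
    equal-probability stratum in each dimension."""
    columns = []
    for _ in range(dim):
        col = [(i + random.random()) / n for i in range(n)]  # jittered strata
        random.shuffle(col)                                   # decorrelate dims
        columns.append(col)
    return [tuple(col[i] for col in columns) for i in range(n)]

pts = latin_hypercube(8, 2)
```

Mapping these uniform samples through the inverse marginal CDFs (and a copula transform for dependent inputs) yields samples of an arbitrary probabilistic input model.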

State-of-the-art sensitivity analysis

Sensitivity analysis identifies the input factors that most influence the response of a model.

  • Approximation- and simulation-based methods
  • Advanced variance decomposition techniques (e.g. Sobol' indices)
  • Support for metamodel-based sensitivity analysis
  • Support for sensitivity analysis of dependent input variables
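A minimal sketch of variance-based sensitivity analysis: first-order Sobol' indices estimated with the standard pick-freeze Monte Carlo estimator, on a toy linear model whose exact indices are 0.8 and 0.2 (illustrative only, not the UQ[py]Lab API):

```python
import random

random.seed(5)

def model(x1, x2):
    """Toy model: Var(Y) = 5, first-order indices S1 = 0.8, S2 = 0.2."""
    return 2.0 * x1 + x2

# Two independent sample matrices with X1, X2 ~ N(0, 1)
N = 20_000
A = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
B = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
yA = [model(*a) for a in A]
yB = [model(*b) for b in B]

mean = sum(yA) / N
var = sum((y - mean) ** 2 for y in yA) / N

S = []
for i in range(2):
    # "Pick-freeze": A with column i replaced by the i-th column of B
    yABi = [model(*[b[j] if j == i else a[j] for j in range(2)])
            for a, b in zip(A, B)]
    Vi = sum(yb * (yabi - ya)
             for yb, ya, yabi in zip(yB, yA, yABi)) / N
    S.append(Vi / var)
```

Metamodel-based sensitivity analysis replaces these model calls with a surrogate — and for sparse PCE the Sobol' indices can even be read off the coefficients analytically.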

Polynomial Chaos Expansions

Polynomial chaos expansions (PCE) are one of the most powerful and versatile surrogate models available today.

  • Non-intrusive PCE facilities
  • State-of-the-art sparse regression solvers, including least angle regression, subspace pursuit, Bayesian compressive sensing, and more
  • Support for classical polynomials, as well as polynomials orthogonal to user-defined input distributions
  • Support for fully data-driven, arbitrary PCE
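As a conceptual sketch of non-intrusive PCE — Monte Carlo projection onto orthonormal Hermite polynomials, for a toy model whose exact expansion is known (not the UQ[py]Lab API, which uses the regression solvers listed above):

```python
import math
import random

random.seed(11)

def psi(k, x):
    """Orthonormal Hermite polynomials w.r.t. the standard Gaussian."""
    return [1.0, x, (x * x - 1.0) / math.sqrt(2.0)][k]

def f(x):
    """Toy model; its exact PCE is 1*psi_0 + 1*psi_1 + sqrt(2)*psi_2."""
    return x * x + x

# Non-intrusive projection: c_k = E[f(X) psi_k(X)], estimated by Monte Carlo
N = 100_000
X = [random.gauss(0.0, 1.0) for _ in range(N)]
coeffs = [sum(f(x) * psi(k, x) for x in X) / N for k in range(3)]

def pce(x):
    """Surrogate: truncated expansion evaluated from the estimated coefficients."""
    return sum(c * psi(k, c_x) for k, c in enumerate(coeffs)
               for c_x in [x]) if False else \
           sum(c * psi(k, x) for k, c in enumerate(coeffs))
```

By orthonormality, the mean and variance of the surrogate follow directly from the coefficients: the mean is c_0 and the variance is the sum of the squared higher-order coefficients.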