ATHENA: Parameter space reduction made easy in Python

How to enhance your numerical analysis pipeline

Marco Tezzele
SISSA mathLab
3 min read · Jan 6, 2021


Working in high-dimensional spaces can be undesirable: raw data are often sparse as a consequence of the curse of dimensionality, and analysing the data is usually computationally intractable. [Wikipedia]

ATHENA (Advanced Techniques for High dimensional parameter spaces to Enhance Numerical Analysis) [1] is a Python package implementing several parameter space reduction techniques, both linear, such as active subspaces [2], and nonlinear, such as kernel-based active subspaces [3] and nonlinear level-set learning [4]. It features a modern API design inspired by scikit-learn.
It is open source and available on GitHub under the MIT license.
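If you want to try it out, a quick way is to install the package with pip straight from the repository (check the README on GitHub for the up-to-date installation instructions, including any PyPI release):

pip install git+https://github.com/mathLab/ATHENA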

All the aforementioned methods are data-driven, relying only on input/output pairs. The active subspaces (AS) technique, for example, uncovers low-dimensional structure in the target function of interest by finding linear combinations of the input parameters, that is, by properly rotating the input domain.

Figure: linear dimensionality reduction through active subspaces.
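To make the idea concrete, here is a minimal NumPy sketch of the computation at the heart of AS (an illustration of the method from [2], not ATHENA's own API): estimate the covariance matrix of the gradients from samples, eigendecompose it, and keep the leading eigenvectors as the active directions.

import numpy as np

# Toy model: f(x) = sin(a . x), a ridge function varying only along direction a.
rng = np.random.default_rng(0)
dim, n_samples = 10, 1000
a = rng.standard_normal(dim)

x = rng.uniform(-1, 1, size=(n_samples, dim))  # input samples
grad_f = np.cos(x @ a)[:, None] * a[None, :]   # exact gradients of f

# Uncentered covariance of the gradients, C = E[grad f grad f^T]
C = grad_f.T @ grad_f / n_samples

# Eigendecomposition: a large spectral gap separates active from inactive directions
evals, evecs = np.linalg.eigh(C)
evals, evecs = evals[::-1], evecs[:, ::-1]     # sort in decreasing order

W1 = evecs[:, :1]                              # one-dimensional active subspace
y = x @ W1                                     # reduced (active) variable

print(evals[:3])  # one dominant eigenvalue, the rest ~0 for a ridge function

For a ridge function like this one the spectrum has a single dominant eigenvalue, signalling a one-dimensional active subspace along which a cheap surrogate of f can be built.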

A natural nonlinear extension is the kernel-based active subspaces (KAS) technique, which maps the parameters to a higher-dimensional space where the existence of an active subspace is much more probable. The benefit is particularly evident for radially symmetric model functions, which have no privileged direction in the input domain along which they vary the most. KAS performs better not only in such hard cases, where AS fails, but also in general, as shown in [3], at the cost of a greater computational load.
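A quick numerical experiment makes the failure mode visible: for a radially symmetric function the eigenvalues of the gradient covariance are all comparable, so there is no spectral gap for AS to exploit, whereas a ridge function shows a single dominant eigenvalue. A minimal sketch (plain NumPy again, not the KAS implementation in ATHENA):

import numpy as np

rng = np.random.default_rng(0)
dim, n = 10, 5000
x = rng.uniform(-1, 1, size=(n, dim))

def as_spectrum(grads):
    # Eigenvalues of the gradient covariance, in decreasing order
    C = grads.T @ grads / len(grads)
    return np.linalg.eigvalsh(C)[::-1]

# Radially symmetric f(x) = ||x||^2: gradient 2x, near-isotropic covariance
flat = as_spectrum(2.0 * x)

# Ridge f(x) = sin(a . x): gradient aligned with a, near rank-one covariance
a = rng.standard_normal(dim)
spiked = as_spectrum(np.cos(x @ a)[:, None] * a[None, :])

print(flat[:4])    # nearly equal eigenvalues: no privileged direction, AS fails
print(spiked[:4])  # one dominant eigenvalue: a clear 1D active subspace

Roughly speaking, KAS recovers a usable spectral gap by computing the same decomposition after mapping the inputs through a feature map into a higher-dimensional space.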

Another nonlinear methodology, called nonlinear level-set learning (NLL), exploits RevNets [5] to learn the target function's level sets and to build nonlinear coordinate transformations that reduce the number of active input dimensions of the model function.

Figure: the nonlinear level-set learning method applied to a cubic 2D function.
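The reversible building block behind NLL is simple to sketch: an additive coupling layer splits the coordinates into two halves and updates them in turn, so the transformation can be inverted exactly. Here is a minimal NumPy illustration of that idea (the generic RevNet coupling of [5] with hypothetical toy weights, not ATHENA's NLL code):

import numpy as np

rng = np.random.default_rng(0)
d = 4                                   # even input dimension, split in two halves
K1 = rng.standard_normal((d // 2, d // 2)) * 0.1
K2 = rng.standard_normal((d // 2, d // 2)) * 0.1

def forward(x):
    # Additive coupling: y1 = x1 + f(x2), then y2 = x2 + g(y1)
    x1, x2 = np.split(x, 2)
    y1 = x1 + np.tanh(K1 @ x2)
    y2 = x2 + np.tanh(K2 @ y1)
    return np.concatenate([y1, y2])

def inverse(y):
    # Exact inverse, recovered by undoing the two updates in reverse order
    y1, y2 = np.split(y, 2)
    x2 = y2 - np.tanh(K2 @ y1)
    x1 = y1 - np.tanh(K1 @ x2)
    return np.concatenate([x1, x2])

x = rng.standard_normal(d)
print(np.allclose(inverse(forward(x)), x))   # True: the map is bijective

In NLL the weights of such layers are trained so that the level sets of the target function align with some of the transformed coordinates, which can then safely be neglected.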

Applications

The need for parameter space reduction arises in many fields of computational science and engineering: high-dimensional optimisation, regression, inverse problems, sensitivity analysis, uncertainty quantification, model order reduction, and artificial neural networks, to name a few.

With this in mind, we developed ATHENA to enhance existing scientific numerical pipelines in data-scarce settings and to fight the curse of dimensionality.

In future stories we will present in detail some recent works made possible by ATHENA. Stay tuned!

References

[1] ATHENA: Advanced Techniques for High dimensional parameter spaces to Enhance Numerical Analysis. https://github.com/mathLab/ATHENA

[2] P.G. Constantine. Active subspaces: Emerging ideas for dimension reduction in parameter studies. SIAM Spotlights, 2015.

[3] F. Romor, M. Tezzele, A. Lario, and G. Rozza. Kernel-based Active Subspaces with application to CFD parametric problems using Discontinuous Galerkin method. arXiv preprint arXiv:2008.12083, 2020.

[4] G. Zhang, J. Zhang, and J. Hinkle. Learning nonlinear level sets for dimensionality reduction in function approximation. In Advances in Neural Information Processing Systems, 2019.

[5] B. Chang, L. Meng, E. Haber, L. Ruthotto, D. Begert, and E. Holtham. Reversible architectures for arbitrarily deep residual neural networks. In AAAI Conference on Artificial Intelligence, 2018.
