Abstract
Artificial neural networks, together with their associated computational libraries, provide a powerful framework for constructing both classification and regression algorithms. In this paper we use neural networks to design linear and non-linear discrete differential operators. We show that neural-network-based operators can be used to construct stable discretizations of initial-boundary-value problems by ensuring that the operators satisfy a discrete analogue of integration-by-parts known as summation-by-parts. Our neural-network approach with linear activation functions is compared and contrasted with a more traditional linear-algebra approach. An application to overlapping grids is explored. The strategy developed in this work opens the door to constructing stable differential operators on general meshes.
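To make the summation-by-parts (SBP) property referenced in the abstract concrete, the sketch below is an illustration only, not the paper's neural-network construction: it assembles the classical second-order finite-difference SBP first-derivative operator D = H^{-1} Q on a uniform grid and checks numerically that u^T H (Dv) + (Du)^T H v = u_N v_N - u_0 v_0, the discrete analogue of integration by parts. NumPy, the grid size, and the test functions are assumptions chosen for the demonstration.

```python
import numpy as np

def sbp_first_derivative(n, h):
    """Classical second-order SBP first-derivative pair (D, H) on a uniform
    grid of n points with spacing h, satisfying Q + Q^T = B with Q = H D
    and B = diag(-1, 0, ..., 0, 1). Illustrative example, not the paper's
    neural-network-designed operator."""
    # Diagonal norm (quadrature) matrix H
    H = h * np.eye(n)
    H[0, 0] = h / 2.0
    H[-1, -1] = h / 2.0

    # Central differences in the interior, one-sided stencils at the boundaries
    D = np.zeros((n, n))
    for i in range(1, n - 1):
        D[i, i - 1] = -1.0 / (2.0 * h)
        D[i, i + 1] = 1.0 / (2.0 * h)
    D[0, 0], D[0, 1] = -1.0 / h, 1.0 / h
    D[-1, -2], D[-1, -1] = -1.0 / h, 1.0 / h
    return D, H

n, h = 21, 0.05
D, H = sbp_first_derivative(n, h)
x = np.linspace(0.0, (n - 1) * h, n)
u, v = np.sin(x), np.cos(x)

# SBP property: u^T H (D v) + (D u)^T H v = u_N v_N - u_0 v_0
lhs = u @ H @ (D @ v) + (D @ u) @ H @ v
rhs = u[-1] * v[-1] - u[0] * v[0]
print(abs(lhs - rhs))  # ~1e-16: the identity holds to rounding error
```

Because the identity holds exactly for any grid functions u and v (not just smooth ones), an energy estimate for the discretized initial-boundary-value problem mimics the continuous one, which is the stability mechanism the abstract describes.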
| Original language | English |
|---|---|
| Article number | 109873 |
| Journal | Journal of Computational Physics |
| Volume | 424 |
| DOIs | |
| Publication status | Published - 1 Jan 2021 |
Keywords
- Discrete differential operators
- Neural networks
- Overlapping grids
- Stability
- Summation-by-parts
ASJC Scopus subject areas
- Numerical Analysis
- Modeling and Simulation
- Physics and Astronomy (miscellaneous)
- General Physics and Astronomy
- Computer Science Applications
- Computational Mathematics
- Applied Mathematics