
dc.date.accessioned2023-03-08T17:53:38Z
dc.date.available2023-03-08T17:53:38Z
dc.date.created2023-02-15T17:42:37Z
dc.date.issued2023
dc.identifier.citationGläser, Dennis; Koch, Timo; Peters, Sören; Marcus, Sven; Flemisch, Bernd. fieldcompare: A Python package for regression testing simulation results. Journal of Open Source Software (JOSS). 2023, 8(81)
dc.identifier.urihttp://hdl.handle.net/10852/101043
dc.description.abstractIn various research areas, such as engineering, physics, and mathematics, numerical simulations play an important role. A number of research simulation frameworks have been established, for instance, Dune (Bastian et al., 2008, 2021), DuMux (Flemisch et al., 2011; Koch et al., 2021), deal.II (Arndt et al., 2022), FEniCS (A. Logg, 2012; FEniCS, 2023), and VirtualFluids (Kutscher et al., 2022). Numerical software typically has high inherent complexity, as it solves complex physical model equations using advanced mathematical methods for partial differential equations. Beyond this, the model equations often involve parameters that are described by empirical constitutive relationships. Thus, a numerical simulation usually brings together various software components: the domain discretization, the discretization method for the equations, the physics, and a non-linear and/or linear solver to obtain a solution for the discretized equations. While each of these components can be unit tested, it is important to have system tests that verify that a particular type of simulation can be carried out successfully, meaning that the simulation produces the correct results. As sufficiently complex problems often lack analytical solutions, determining the correctness of numerical simulations poses a significant challenge. In the absence of an analytical solution, a common strategy is to compare against a trusted reference (e.g., data measured in experiments or results from previous publications). From the perspective of software quality assurance, it suffices to define a reference result as the correct one and to continuously verify that the code still reproduces it. In numerical software, such regression tests play a vital role at the level of system tests (Kempf & Koch, 2017): they ensure that developers notice when a change to the code affects the results produced by the simulations.
Whether the new results are better or worse has to be decided by the developers; in the former case, the reference results may be updated. To carry out regression tests, one must be able to detect significant deviations between newly computed and reference results. What constitutes a significant deviation also has to be decided by the developers, and adequate tolerances have to be chosen: large enough to avoid false negatives from machine-precision issues, but small enough to ensure that physically relevant deviations in the results are detected. Some numerical software packages, for instance Dune and DuMux (Flemisch et al., 2011), provide mechanisms to detect such deviations. However, this functionality is not provided independently of the frameworks themselves and is therefore only available to their users. Moreover, only those mesh file formats used by the frameworks are supported. Very recently, DuMux incorporated fieldcompare into its test suite in place of its in-house solutions.
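The tolerance-based comparison described in the abstract can be sketched generically. The snippet below is an illustrative example built on NumPy, not the actual fieldcompare API; the function name `fields_match` and the default tolerances are hypothetical choices for demonstration.

```python
import numpy as np

def fields_match(result, reference, rel_tol=1e-9, abs_tol=1e-14):
    """Check a newly computed field against a trusted reference field.

    A value is flagged as deviating only if its difference exceeds
    abs_tol plus rel_tol scaled by the reference magnitude: large
    enough to ignore machine-precision noise, small enough to catch
    physically relevant changes. (Tolerance defaults are illustrative.)
    """
    result = np.asarray(result, dtype=float)
    reference = np.asarray(reference, dtype=float)
    if result.shape != reference.shape:
        return False  # e.g., the mesh or the output layout changed
    return np.allclose(result, reference, rtol=rel_tol, atol=abs_tol)

# A regression test compares fresh output against stored reference data:
new = [1.0, 2.0, 3.0 + 1e-15]   # freshly computed values (hypothetical)
ref = [1.0, 2.0, 3.0]           # trusted reference values
assert fields_match(new, ref)                     # noise-level deviation passes
assert not fields_match([1.0, 2.1], [1.0, 2.0])   # significant deviation fails
```

In practice, the references would be read from mesh files and the per-field tolerances chosen by the developers, as discussed above.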
dc.languageEN
dc.publisherOpen Journals
dc.rightsAttribution 4.0 International
dc.rights.urihttps://creativecommons.org/licenses/by/4.0/
dc.titlefieldcompare: A Python package for regression testing simulation results
dc.typeJournal article
dc.creator.authorGläser, Dennis
dc.creator.authorKoch, Timo
dc.creator.authorPeters, Sören
dc.creator.authorMarcus, Sven
dc.creator.authorFlemisch, Bernd
cristin.unitcode185,15,13,15
cristin.unitnameMekanikk
cristin.ispublishedtrue
cristin.fulltextoriginal
cristin.qualitycode1
dc.identifier.cristin2126463
dc.identifier.bibliographiccitationinfo:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=Journal of Open Source Software (JOSS)&rft.volume=8&rft.spage=&rft.date=2023
dc.identifier.jtitleJournal of Open Source Software (JOSS)
dc.identifier.volume8
dc.identifier.issue81
dc.identifier.doihttps://doi.org/10.21105/joss.04905
dc.type.documentJournal article
dc.type.peerreviewedPeer reviewed
dc.source.issn2475-9066
dc.type.versionPublishedVersion
cristin.articleid4905

