Description

It would be good if we could automatically generate the distributions of several decay channels and check whether they agree with a set of reference histograms whenever we need to rebuild EvtGen after (major) code changes, within a continuous integration (CI) pipeline.
Revisions and Commits
rEVTGEN evtgen:
- rEVTGEN6052b4f913d3 Implemented JSON test framework for all decay models - closes T108
- rEVTGENa70475a8415f T108: Add testDecayModel which automatically generates distributions for decays…
- rEVTGEN7d4bc7efbce1 Add draft gitlab CI script
- rEVTGEN1815b83712e4 Some improvement to gitlab CI script
- rEVTGEN5a322ad69fff Bugfix for gitlab CI script
- rEVTGENc5a613effade Fix CI script build command
- rEVTGENdd90a034cbf6 Use external build of jsoncpp
Event Timeline
- Need to decide if including the json header & source file is compatible with our license; I think it'll be OK since we include its license in full.
- Also need to think about where we can store reference histograms for the CI pipeline. Can we use CERN EOS?
- Can ROOT be used in a CI pipeline? If not, we could create our own simple "histogram" class to store distributions so that they can be compared with references (a rough sketch of such a comparison is shown after this list).
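On the last point, if we did go with our own lightweight histogram class, the comparison against a reference could be as simple as the following sketch (purely hypothetical code, not anything that exists in EvtGen; the names are made up):

```cpp
#include <cstddef>
#include <stdexcept>
#include <vector>

// Hypothetical minimal "histogram": equal-width bins over a fixed range.
struct SimpleHisto {
    double xMin{0.0};
    double xMax{1.0};
    std::vector<double> binContents;
};

// Chi-squared per used bin between a generated and a reference histogram,
// assuming Poisson errors (sigma^2 ~ N); bins empty in both are skipped.
double chiSqPerBin( const SimpleHisto& gen, const SimpleHisto& ref ) {
    if ( gen.binContents.size() != ref.binContents.size() ) {
        throw std::runtime_error( "Histogram binning mismatch" );
    }
    double chiSq{0.0};
    int nUsed{0};
    for ( std::size_t i{0}; i < gen.binContents.size(); ++i ) {
        const double sumVar = gen.binContents[i] + ref.binContents[i];
        if ( sumVar <= 0.0 ) { continue; }
        const double diff = gen.binContents[i] - ref.binContents[i];
        chiSq += diff * diff / sumVar;
        ++nUsed;
    }
    return ( nUsed > 0 ) ? chiSq / nUsed : 0.0;
}
```

A warning (or a failing exit code) could then be triggered whenever the returned value exceeds some threshold.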
The idea is to run a test doing:
./testDecayModel jsonFiles/example.json
where the json file defines the decay model, particles, parameters and histograms, as well as a reference file for comparison. We could then test the chi-squared values and print out a warning message if there is a large difference.
This then allows us to use this single test program to compare many decay models, which could then replace evtgenlhc_test1 and do_tests, without writing additional C++ code (unless we want to add other histogram distribution types).
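Just to make the idea concrete, the json file could look something like this; the layout below is only a guess for illustration, and the actual keys expected by testDecayModel may well be different:

```json
{
    "parent": "B+",
    "daughters": ["K+", "pi+", "pi-"],
    "model": "PHSP",
    "parameters": [],
    "events": 100000,
    "histograms": [
        { "variable": "mKpi", "nbins": 100, "xmin": 0.6, "xmax": 5.2 }
    ],
    "reference": "refHistos/Bu_Kpipi_PHSP.root"
}
```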
This sounds great!
Presumably then we would have json files that define the tests for each model, plus some other json file that maps each model to the source and header files that it depends on (or probably the other way around). Then in the CI pipeline we would check what code has changed in the commit and decide which models need to be rerun based on those changes. Is that the idea?
Indeed I think this would allow us to remove a lot of the existing code in the test and validation directories.
I think this is fine from a licence perspective but the other question is whether it is necessary. jsoncpp is available in the LCG views, so we should just be able to use find_package in the test/CMakeLists.txt to locate it and compile against it from there.
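Something along these lines in test/CMakeLists.txt ought to do it, although the exact imported target name can vary between jsoncpp versions, so it would need checking against whatever is in the LCG view:

```cmake
# Pick up jsoncpp from the environment (e.g. an LCG view) instead of the bundled copy.
find_package(jsoncpp REQUIRED)

# Most installed versions export the shared-library target as jsoncpp_lib;
# newer releases also provide JsonCpp::JsonCpp.
target_link_libraries(testDecayModel PRIVATE jsoncpp_lib)
```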
> - Also need to think about where we can store reference histograms for the CI pipeline. Can we use CERN EOS?
I think this should be feasible, yes. We could maybe even request some EOS space specifically for EvtGen and if it's publicly readable then we shouldn't need to add anything special in the CI config. The output histograms from the CI can be uploaded to gitlab as temporary artefacts, or we could also save those to EOS (but that would require some special stuff in the CI config so the job has the right credentials).
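As a rough sketch, the test job in the .gitlab-ci.yml could declare its output histograms as short-lived artefacts along these lines (the job name, script and file patterns here are just placeholders):

```yaml
test_models:
  stage: test
  script:
    - ./testDecayModel jsonFiles/example.json
  artifacts:
    when: always        # keep the output even if the comparison fails
    expire_in: 1 week
    paths:
      - "*.root"
```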
> - Can ROOT be used in a CI pipeline? If not, we could create our own simple "histogram" class to store distributions so that they can be compared with references.
Yes it can - we can run any software that's on cvmfs (or that you can pull down in the docker container).
Should I start to add my CI stuff (which currently just runs a build) into this branch and we can start to expand on that and try things out?
> In T108#1685, @tlatham wrote:
> Presumably then we would have json files that define the tests for each model, plus some other json file that maps each model to the source and header files that it depends on (or probably the other way around). Then in the CI pipeline we would check what code has changed in the commit and decide which models need to be rerun based on those changes. Is that the idea?
Yes, this is something we can do.
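Purely as an illustration, such a mapping file could be as simple as this, going from source files to the test configurations that exercise them (the entries here are made up):

```json
{
    "EvtGenModels/EvtPHSP.cpp": [ "jsonFiles/Bu_Kpipi_PHSP.json" ],
    "EvtGenModels/EvtVub.cpp": [ "jsonFiles/Bu_Xulnu_VUB.json" ],
    "EvtGenBase/EvtParticle.cpp": [ "ALL" ]
}
```

A special token like "ALL" could flag core files whose changes should trigger every test.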
> jsoncpp is available in the LCG views, so we should just be able to use find_package in the test/CMakeLists.txt to locate it and compile against it from there.
Of course, we should do this instead.
> Should I start to add my CI stuff (which currently just runs a build) into this branch and we can start to expand on that and try things out?
Yes, Tom, please go ahead with this, as well as any other changes you think we will need, such as using jsoncpp from cvmfs. Thanks.
The CI build in the gitlab mirror of EvtGen is now working.
I've also committed the necessary changes to switch to using jsoncpp from LCG.
The build and test run successfully and you can see the details of the CI pipeline here:
https://gitlab.cern.ch/evtgen/evtgen/-/pipelines/1883808
I'm running everything in LCG_98 now because it includes the new versions of Photos and Tauola with HepMC3 support (btw it also includes EvtGen 2.0.0). There are now 4 builds:
- gcc9-opt
- gcc9-dbg
- gcc10-opt
- clang10-opt
and a test that uses the gcc9-dbg build. At present the test just runs your testDecayModel on one of the example json files you provided.
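For anyone following along, the shape of the .gitlab-ci.yml is roughly as below; this is a simplified sketch rather than the actual script, and details such as runner tags, paths and CMake options are omitted or guessed:

```yaml
.build_template:
  stage: build
  script:
    - source /cvmfs/sft.cern.ch/lcg/views/LCG_98/${LCG_PLATFORM}/setup.sh
    - cmake -B build -S .
    - cmake --build build -j4
  artifacts:
    paths: [ build ]
    expire_in: 1 day

build_gcc9_opt:
  extends: .build_template
  variables:
    LCG_PLATFORM: x86_64-centos7-gcc9-opt

build_gcc9_dbg:
  extends: .build_template
  variables:
    LCG_PLATFORM: x86_64-centos7-gcc9-dbg

# ... gcc10-opt and clang10-opt defined in the same way ...

test_gcc9_dbg:
  stage: test
  needs: [ build_gcc9_dbg ]
  script:
    - source /cvmfs/sft.cern.ch/lcg/views/LCG_98/x86_64-centos7-gcc9-dbg/setup.sh
    - ./build/test/testDecayModel jsonFiles/example.json
```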
One thing I've noted is that the lack of a reference file does not result in the test exiting with a failure code, so that's something we need to look at.
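A straightforward fix would be for testDecayModel to check up front that the reference file can be opened and return a non-zero exit code if not, e.g. (hypothetical snippet; the actual variable and function names in the test will differ):

```cpp
#include <cstdlib>
#include <fstream>
#include <iostream>
#include <string>

// Return false if the reference histogram file cannot be opened, so the
// caller can exit with a failure code and the CI job is marked as failed.
bool referenceFileExists( const std::string& refFileName ) {
    std::ifstream refFile{ refFileName };
    if ( !refFile.good() ) {
        std::cerr << "ERROR: cannot open reference file " << refFileName << std::endl;
        return false;
    }
    return true;
}

// In main():
//     if ( !referenceFileExists( refFileName ) ) { return EXIT_FAILURE; }
```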
The next step will be to analyse the contents of the push to see what files have changed and, based on that, work out which tests need to be run, etc.
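For the changed-file detection, GitLab's predefined variables should be enough; something like the following in a job script would produce the list to match against the mapping file discussed above (note that CI_COMMIT_BEFORE_SHA is all zeros for the first push of a new branch, and the clone depth needs to be large enough for the diff to work):

```yaml
decide_tests:
  stage: test
  script:
    - |
      # List files touched by this push; fall back to the last commit only
      # when there is no valid "before" SHA (e.g. a brand-new branch).
      if [ "$CI_COMMIT_BEFORE_SHA" = "0000000000000000000000000000000000000000" ]; then
        git diff --name-only HEAD~1 HEAD > changed_files.txt
      else
        git diff --name-only "$CI_COMMIT_BEFORE_SHA" "$CI_COMMIT_SHA" > changed_files.txt
      fi
    - cat changed_files.txt
```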