ROSS-Damaris integration support
The main purpose of this branch is to add support for the ROSS-Damaris integration. This involved adding an option to the `./configure` step in the build process: `--with-damaris`. No path needs to be specified here. If ROSS is built with Damaris enabled, it creates the necessary pkg-config files in the same location as the regular ROSS `.pc` file, so the CODES configure step will be able to find it.
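As a sketch, the configure invocation might look like the following; the install path is illustrative, and `PKG_CONFIG_PATH` is only needed if the ROSS pkg-config files are not already on the default search path:

```shell
# Illustrative only: point pkg-config at the ROSS install (which, when ROSS
# was built with Damaris, also contains the Damaris-related .pc files),
# then enable the integration. No path argument is given to --with-damaris.
PKG_CONFIG_PATH=/path/to/ross/install/lib/pkgconfig \
    ../configure --with-damaris
```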
The ROSS blog contains full details on dependencies, building, and running with Damaris enabled. CODES-specific details on using Damaris are documented here: https://xgitlab.cels.anl.gov/codes/codes/wikis/Using-ROSS-Instrumentation-with-CODES.
One thing to note is that using Damaris requires splitting `MPI_COMM_WORLD`. `MPI_COMM_CODES` existed previously (typically just set to `MPI_COMM_WORLD`), but `MPI_COMM_WORLD` was still being used directly in many places. This merge request fixes that as well. It also adds a function, `codes_comm_update()`, in `codes/src/util/codes-comm.c`, which must be called after `tw_init()` in the model's `main()`. It simply updates `MPI_COMM_CODES` to match `MPI_COMM_ROSS`. It must come after `tw_init()` because that is where the communicator is split. This only matters when actually running with Damaris turned on; if Damaris is not in use at runtime (whether or not ROSS/CODES were built with it enabled), there is no issue.
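The required ordering can be sketched as below. The function names come from the text above; the header names and the overall shape of `main()` are assumptions and will vary by model:

```c
/* Illustrative sketch only: shows where codes_comm_update() belongs
 * relative to tw_init(). Header paths are assumed, not confirmed. */
#include <ross.h>
#include <codes/codes.h>

int main(int argc, char **argv)
{
    /* When ROSS is running with Damaris, tw_init() is where
     * MPI_COMM_WORLD gets split, leaving MPI_COMM_ROSS as the
     * simulation-side communicator. */
    tw_init(&argc, &argv);

    /* Update MPI_COMM_CODES to match MPI_COMM_ROSS. Must come after
     * tw_init(); harmless when Damaris is not in use at runtime. */
    codes_comm_update();

    /* ... model setup, tw_run(), tw_end(), etc. ... */
    return 0;
}
```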
The other change in this merge request is not directly related to Damaris: it adds all of the model callback functions necessary to use every ROSS instrumentation mode with Dragonfly Plus. It also fixes the RNG reverse-computation issue in the DFP synthetic workloads that was recently found in the other network synthetic workloads.