Tpetra::Details::iallreduce does not work with complex, if MPI_VERSION < 3
Created by: mhoemmen
@trilinos/tpetra Story: #748 (closed)
Tpetra::Details::iallreduce currently assumes that, given a Teuchos::EReductionType and any MPI_Datatype, the built-in reduction operations MPI_SUM and MPI_MAX work for that MPI_Datatype. This is only true if the MPI_Datatype is built in; it is not true if the MPI_Datatype is custom (derived).
We currently conservatively assume that the MPI_C_FLOAT_COMPLEX and MPI_C_DOUBLE_COMPLEX built-in MPI_Datatypes exist only if MPI_VERSION >= 3. Otherwise, Tpetra::Details::MpiTypeTraits constructs a derived MPI_Datatype for Kokkos::complex. This means that Tpetra::Details::iallreduce does not work with Kokkos::complex Views when MPI_VERSION < 3. (MPI reports "invalid MPI_Op.")
As a temporary work-around, we could add a function that constructs the required custom MPI_Op on the fly for each MPI_Allreduce call (if MPI_VERSION >= 3, Tpetra::Details::iallreduce uses MPI_Iallreduce; otherwise, it uses MPI_Allreduce), and destroys it after use. It may make sense to cache the constructed MPI_Op; we could attach it to the MPI_Comm using MPI's (key, value) attribute-caching mechanism. The latter would also ensure correct deallocation when the communicator is freed.
This issue only matters if users want to try pipelined Krylov methods with complex Scalar types.