EllPETSc provides 3D elliptic solvers for the various classes of elliptic problems defined in EllBase. EllPETSc uses the "Portable, Extensible Toolkit for Scientific Computation" (PETSc) by Argonne National Laboratory. PETSc is a suite of routines and data structures that can be employed for solving partial differential equations in parallel. EllPETSc is called through the interfaces provided in EllBase.
This thorn provides sophisticated solvers based on the PETSc libraries. It supports all the interfaces defined in EllBase. At this point it is not optimized for performance; expect improvements as the elliptic solver arrangement develops.
This thorn provides: No Pizza, No Wine, peace.
This thorn supports three elliptic problem classes: LinFlat, for a standard 3D Cartesian Laplace operator, using the standard 7-point computational molecule; LinMetric, for a Laplace operator derived from the metric, using a 19-point stencil; and LinConfMetric, for a Laplace operator derived from the metric and a conformal factor, using a 19-point stencil. The code of the solvers differs between the classes and is explained in the following section.
PETSc needs to be installed on the machine, and the environment variables PETSC_ARCH and PETSC_DIR have to be set to compile EllPETSc. PETSc can be obtained for free at http://www-fp.mcs.anl.gov/petsc/. Cactus needs to be compiled with MPI. While PETSc can be compiled for single-processor mode (without MPI), Cactus has only been tested and used with the parallel version of PETSc, which requires MPI. For detailed information on how to install PETSc, refer to the PETSc documentation.
For this class only the standard 7-point stencil is employed. The stencil values are constant at each gridpoint.
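As an illustration, the sketch below (not the thorn's actual code; the function and variable names are made up for this example) shows the constant coefficients of the flat-space 7-point molecule on a uniform grid with spacings dx, dy, dz:

  /* Illustrative sketch only: constant 7-point stencil coefficients for
     the flat-space Laplacian on a uniform grid with spacings dx, dy, dz. */
  void flat_stencil(double dx, double dy, double dz,
                    double *center, double *xcoef, double *ycoef, double *zcoef)
  {
    *xcoef  = 1.0/(dx*dx);                      /* east/west neighbours   */
    *ycoef  = 1.0/(dy*dy);                      /* north/south neighbours */
    *zcoef  = 1.0/(dz*dz);                      /* up/down neighbours     */
    *center = -2.0*(*xcoef + *ycoef + *zcoef);  /* central point          */
  }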
For this class the standard 19-point stencil is initialized, taking the underlying metric into account. The values of the stencil functions differ at each gridpoint.
For this class the standard 19-point stencil is initialized, taking the underlying metric and its conformal factor into account. The values of the stencil functions differ at each gridpoint.
The main task when interfacing PETSc consists of transferring data from the Cactus parallel data structures (gridfunctions) to the parallel structures provided by PETSc.
Here we explain the main steps, to be read with the code at hand.
The indices imin, imax, … are calculated; they describe the starting/ending points in 3D local index space. Ghostzones are not included here.
A linear global index is calculated, describing the starting/ending points in linear global index space. Ghostzones are not included here.
A lookup gridfunction wsp is loaded, identifying each 3D local index with its linear global index. Values of zero indicate boundaries.
PETSc matrices and vectors are created, specifying the linear size: global endpoint minus global startpoint.
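A minimal sketch of this step is given below. It is not the thorn's code: the function name create_system and the use of the current PETSc calls VecCreateMPI and MatCreateAIJ are assumptions for illustration (the thorn itself was written against the older PETSc version referenced above), but the idea is the same: the local size of each object is the number of rows this processor owns.

  #include <petscksp.h>

  /* Sketch: create the distributed PETSc objects with nlocal rows on this
     processor (global endpoint minus global startpoint). */
  PetscErrorCode create_system(MPI_Comm comm, PetscInt nlocal,
                               Mat *A, Vec *b, Vec *x)
  {
    PetscErrorCode ierr;

    /* right-hand side and solution vectors */
    ierr = VecCreateMPI(comm, nlocal, PETSC_DETERMINE, b); CHKERRQ(ierr);
    ierr = VecDuplicate(*b, x); CHKERRQ(ierr);

    /* sparse coefficient matrix; 19 preallocated nonzeros per row cover the
       19-point molecule, the 7-point LinFlat case needs fewer */
    ierr = MatCreateAIJ(comm, nlocal, nlocal, PETSC_DETERMINE, PETSC_DETERMINE,
                        19, NULL, 19, NULL, A); CHKERRQ(ierr);
    return 0;
  }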
For the elliptic class LinFlat the stencil functions are initialized with the standard 7-point stencil; the classes LinMetric and LinConfMetric require a more sophisticated treatment, described later.
Looping over the processor-local grid points (in 3D local index space), the PETSc vectors and the coefficient matrix are loaded if no boundary is present (wsp[i,j,k] not equal to zero).
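The sketch below illustrates loading one row. The names load_row, cols, coeffs and rhs are illustrative; MatSetValues and VecSetValue are the standard PETSc insertion routines.

  #include <petscksp.h>

  /* Sketch: insert one matrix row and the matching right-hand-side entry.
     `row' and `cols' are linear global indices taken from the wsp lookup
     gridfunction; boundary points (wsp == 0) are skipped by the caller. */
  PetscErrorCode load_row(Mat A, Vec b, PetscInt row, PetscInt ncols,
                          const PetscInt cols[], const PetscScalar coeffs[],
                          PetscScalar rhs)
  {
    PetscErrorCode ierr;

    ierr = MatSetValues(A, 1, &row, ncols, cols, coeffs, INSERT_VALUES);
    CHKERRQ(ierr);
    ierr = VecSetValue(b, row, rhs, INSERT_VALUES); CHKERRQ(ierr);
    return 0;
  }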
The PETSc vector and matrix assembly is started, nested for performance as recommended by PETSc.
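Schematically, the nested assembly looks like this (sketch only; A, b and x stand for the coefficient matrix, right-hand side and solution vector):

  #include <petscksp.h>

  /* Sketch: nested assembly so that the communication for the matrix and
     the vectors can overlap, as recommended by PETSc. */
  PetscErrorCode assemble_system(Mat A, Vec b, Vec x)
  {
    PetscErrorCode ierr;

    ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    ierr = VecAssemblyBegin(b); CHKERRQ(ierr);
    ierr = VecAssemblyBegin(x); CHKERRQ(ierr);
    ierr = VecAssemblyEnd(x);   CHKERRQ(ierr);
    ierr = VecAssemblyEnd(b);   CHKERRQ(ierr);
    ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY); CHKERRQ(ierr);
    return 0;
  }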
Creation of the elliptic solver context and setting of options, followed by the call to the PETSc solver.
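A sketch of this step using the current PETSc KSP interface is shown below; it is not the thorn's code (the thorn was written against the older PETSc release referenced above, whose solver interface differs). The Krylov method and preconditioner chosen here mirror the defaults of the petsc_ksp_type and petsc_pc_type parameters.

  #include <petscksp.h>

  /* Sketch: create the solver context, set options and solve A x = b. */
  PetscErrorCode solve_system(MPI_Comm comm, Mat A, Vec b, Vec x)
  {
    PetscErrorCode ierr;
    KSP            ksp;
    PC             pc;

    ierr = KSPCreate(comm, &ksp); CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
    ierr = KSPSetType(ksp, KSPBCGS); CHKERRQ(ierr);   /* petsc_ksp_type default */
    ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
    ierr = PCSetType(pc, PCJACOBI); CHKERRQ(ierr);    /* petsc_pc_type default  */
    ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);     /* allow command-line overrides */
    ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
    return 0;
  }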
Upon completion of the solve, the PETSc solution has to be transferred back to the Cactus data structures.
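The sketch below indicates the idea; copy_solution, var, wsp and npoints are illustrative names, and the interior points are assumed to appear in the local part of the solution vector in the same order in which they were loaded.

  #include <petscksp.h>

  /* Sketch: copy the processor-local part of the PETSc solution back into
     a Cactus gridfunction.  Boundary points (wsp == 0) keep their values. */
  PetscErrorCode copy_solution(Vec x, double *var, const double *wsp,
                               int npoints)
  {
    PetscErrorCode     ierr;
    const PetscScalar *sol;
    int                i, n = 0;

    ierr = VecGetArrayRead(x, &sol); CHKERRQ(ierr);
    for (i = 0; i < npoints; i++)
    {
      if (wsp[i] != 0.0)          /* interior point: take the PETSc result */
      {
        var[i] = PetscRealPart(sol[n++]);
      }
    }
    ierr = VecRestoreArrayRead(x, &sol); CHKERRQ(ierr);
    return 0;
  }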
The sizes of the arrays Mlinear (for the coefficient matrix) and Nsource are passed to the solver. A storage flag is set if these variables have a size greater than 1; in that case the array can be accessed.
Use PETSc as normal; use the PUGH communicator if a routine needs a communicator. On the first pass, you need to call PetscSetCommWorld() and PetscInitialize() to set the PETSc communicator and initialize PETSc.
This could be a separate routine scheduled early in schedule.ccl, e.g. at BASEGRID. PetscInitialize() requires the command-line parameters as input; this allows you to pass through PETSc flags, etc. (I have not tried this feature.) Initialize the PETSc communicator with the Cactus communicator. You end up with code like this:
  /* The pugh Extension handle */
  pGH *pughGH;

  /* Get the link to the pugh Extension */
  pughGH = (pGH*)GH->extensions[CCTK_GHExtensionHandle("PUGH")];

  if (first_trip==0)
  {
    int argc;
    char **argv;

    /* Get the commandline arguments */
    argc = CCTK_CommandLine(&argv);

    /* Set the PETSc communicator to that of PUGH and initialize PETSc */
    ierr = PetscSetCommWorld(pughGH->PUGH_COMM_WORLD); CHKERRA(ierr);
    PetscInitialize(&argc,&argv,NULL,NULL);

    CCTK_INFO("PETSc initialized");
  }
You need to tell Cactus where to look for the PETSc includes. In the file make.code.defn, define the SRCS (sources) as explained in the documentation and add a line for SYS_INC_DIRS, which lets Cactus look for additional includes, e.g.:
  SYS_INC_DIRS += $(PETSC_DIR) $(PETSC_DIR)/include \
                  $(PETSC_DIR)/bmake/$(PETSC_ARCH)
This file (make.configuration.defn) is not created by default when you use Cactus to create a new thorn with "gmake newthorn". For a template PETSc configuration file, have a look at ./CactusElliptic/EllPETSc/src/make.configuration.defn.
The first section checks if PETSC_DIR/PETSC_LIB are set. If they are not, the configuration process is interrupted (otherwise you would have to wait until the end of the compilation to find out that your program won't link).
The second section specifies the standard PETSc libraries, e.g.:
  PETSC_LIB_DIR = $(PETSC_DIR)/lib/libg/$(PETSC_ARCH)
  PETSC_LIBS    = petscts petscsnes petscsles petscdm
The third section adds platform-dependent libraries, by checking PETSC_ARCH and assigning the appropriate libs.
In the end these variables are added to the variables that the Cactus make process uses (note the incremental assignment "+="):
  LIBDIRS    += $(PETSC_LIB_DIR) $(X_LIB_DIR)
  LIBS       += $(PETSC_LIBS) $(PLATFORM_LIBS) X11
  EXTRAFLAGS += -I$(PETSC_DIR)/include
petsc_coeff_to_one (BOOLEAN, scope: private)
  Description: Divide each line of the matrix by the central value?
  Default:     no
petsc_ksp_type (STRING, scope: private)
  Description: Which Krylov subspace method to use
  Default:     KSPBCGS
  Range:
    KSPCR          pcr
    KSPCG          cg
    KSPCGS         cgs
    KSPBCGS        bcgs
    KSPLSQR        lsqr
    KSPGMRES       gmres
    KSPTCQMR       tcqmr
    KSPTFQMR       tfqmr
    KSPCHEBYCHEV   chebyshev
    KSPCHEBYSHEV   chebyshev
    KSPRICHARDSON  richardson
petsc_nablaform (KEYWORD, scope: private)
  Description: PETSc nabla form
  Default:     down
  Range:       up, down
petsc_pc_type (KEYWORD, scope: private)
  Description: Which preconditioner method to use
  Default:     PCJACOBI
  Range:
    PCNONE     none
    PCJACOBI   jacobi
    PCBJACOBI  bjacobi
    PCICC      icc
    PCILU      ilu
    PCASM      asm
    PCLU       lu
petsc_reuse (BOOLEAN, scope: private)
  Description: Reuse parts of the PETSc structure
  Default:     no
petsc_verbose (KEYWORD, scope: private)
  Description: PETSc verbose output
  Default:     yes
  Range:
    no     No output
    yes    Some output
    debug  Tons of output
domain (KEYWORD, scope: shared from GRID)
Implements: ellpetsc

Inherits: ellbase, driver
Group: petscworkspace
  Variables:      wsp
  Description:    Workspace for the elliptic PETSc solver
  Group type:     GF
  Variable type:  REAL
  Dimensions:     3
  Timelevels:     1
  Distribution:   DEFAULT
  Compact:        0
Uses header: EllBase.h
This section lists all the variables that are assigned storage by thorn CactusElliptic/EllPETSc. Storage can either last for the duration of the run (Always means storage is assigned whenever this thorn is activated; Conditional means storage is assigned for the duration of the run only if some condition is met), or it can be turned on just for the duration of a scheduled function.
Always:
  petscworkspace
Scheduled at: CCTK_BASEGRID

ellpetsc_register: register the petsc solvers
  Language: C
  Type:     function