Samuel Tootle <tootle@itp.uni-frankfurt.de>

February 19, 2023


This thorn enables the use of initial data solutions computed using the Frankfurt University/KADATH (FUKA) initial data codes

1 Introduction

Accurate time evolution of systems in general relativity requires accurate initial data solutions on some initial time slice. For isolated objects in equilibrium, these solutions can be computed easily, either analytically or using efficient one- or two-dimensional solvers. Initial data for binary configurations, by contrast, is considerably more challenging to construct and, as such, considerably fewer codes have been dedicated to solving this problem, especially for mass-asymmetric and spinning configurations.

Frankfurt University/KADATH (FUKA) is a collection of initial data codes focused on providing a robust suite of solvers capable of computing precise initial data solutions for binary configurations involving black holes and/or neutron stars. The codes are currently limited to classical general relativity and impose a strict z-symmetry, so that component spins can only be (anti-)aligned with the orbital angular momentum. Furthermore, initial data involving neutron stars can be computed using either tabulated or polytropic equations of state. Details of the initial data solvers can be found in the accompanying publication [1].

This thorn provides a minimal interface to FUKA's built-in export routines in order to interpolate an initial data solution generated with FUKA onto the Einstein Toolkit grid. Initial data must be generated beforehand by a separately compiled version of FUKA. The latest development version can be found under the fuka branch at [2].

2 Using This Thorn

This thorn requires the KadathThorn, which handles building (if necessary) and linking the FUKA library when creating the Einstein Toolkit executable. The FUKA library currently requires a compiler that supports the C++17 standard and std::filesystem. However, the library can be compiled separately from the Einstein Toolkit and linked by setting


in the build configuration file. Here HOME_KADATH is a global environment variable which the user should have already set during the installation of FUKA. When linking against a pre-compiled version of FUKA, the ETK can be compiled with a lower C++ standard, since FUKA is linked as a statically compiled library.
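As a sketch, the relevant line in the build configuration (option list) might look like the following. The variable name KADATH_DIR is an assumption for illustration; the actual name expected by the KadathThorn should be checked against its configuration files:

```
# Hypothetical option-list fragment -- KADATH_DIR is an assumed name,
# verify the variable actually read by the KadathThorn build scripts.
KADATH_DIR = ${HOME_KADATH}
```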

Note: Linking a previously compiled version of FUKA when building the Einstein Toolkit has been known to result in undefined behavior with the Intel MKL library. Specifically, imported solutions contain garbage for all quantities (e.g., the lapse). FUKA works exceptionally well with MKL, but an ABI issue has been encountered on multiple occasions which is tied to the order in which scientific libraries are linked. In the case of Intel oneAPI, "-mkl" should appear at the end of the LDFLAGS. If you encounter and resolve such an error, please submit an issue ticket so it can be documented.

2.1 Obtaining This Thorn

This thorn can be obtained from [3].

2.2 Basic Usage

FUKA initial data solutions consist of two files: a configuration file (config) and a data file (dat). In the case of neutron stars, the solution additionally requires the equation of state file that describes the tabulated or polytropic configuration.

After activating the KadathImporter thorn, one needs to set KadathImporter::filename to the config file, e.g.

    KadathImporter::filename = "<dir>/id.info"

In the fictitious directory <dir>, we must ensure the config, dat, and, if applicable, EOS files are stored in the same place. Additionally, FUKA will also check for the desired EOS in


However, this requires that the HOME_KADATH environment variable be set in the evolution environment. If FUKA is built during the ETK build procedure, the directory that should be assigned to HOME_KADATH is listed in the build output.
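For example, the variable can be exported in the job or submission script before launching the evolution (the path below is purely illustrative):

```shell
# Illustrative only: point HOME_KADATH at the FUKA installation directory
# reported during the build, before launching the evolution run.
export HOME_KADATH=/path/to/fuka
```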

In addition to the filename, the initial data type must also be set, e.g.

    KadathImporter::type = "BNS"

where the possible types are

    STRING type "ID type"
    {
      "BBH"  :: "Binary black hole ID"
      "BH"   :: "Black hole ID"
      "BNS"  :: "Binary neutron star ID"
      "NS"   :: "Neutron star ID"
      "BHNS" :: "Black hole neutron star ID"
    } "BBH"

Excision filling parameters

FUKA computes black hole solutions by excising the interior of the black hole, where the excision surface is defined by an apparent horizon. However, for puncture evolutions the interior needs to be filled with smooth junk data. To do so, the FUKA exporter performs 3rd- to 7th-order Lagrange interpolation on radially equispaced grid points outside of the excision region. The resulting polynomial is then extrapolated to the requested grid point inside the horizon.
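The idea can be illustrated with a standalone sketch (this is not FUKA code): a Lagrange polynomial is fitted through samples at equispaced radii outside the excision surface and then evaluated at a radius inside it. All names and values here are illustrative.

```python
import numpy as np

def lagrange_extrapolate(r_pts, f_pts, r_target):
    """Evaluate the Lagrange polynomial through (r_pts, f_pts) at r_target."""
    result = 0.0
    for i in range(len(r_pts)):
        term = f_pts[i]
        for j in range(len(r_pts)):
            if j != i:
                term *= (r_target - r_pts[j]) / (r_pts[i] - r_pts[j])
        result += term
    return result

# Sample a smooth stand-in function on equispaced radii just outside the
# excision radius r_AH, then extrapolate to a point inside the horizon.
r_AH = 1.0
dr = 0.1 * r_AH                  # spacing, analogous to delta_r_rel * r_AH
npts = 8                         # 8 points -> a 7th-order polynomial
r_pts = r_AH + dr * np.arange(1, npts + 1)
f_pts = 1.0 / (1.0 + r_pts)      # stand-in for a smooth metric quantity
val = lagrange_extrapolate(r_pts, f_pts, 0.5 * r_AH)  # smooth fill value
```

Because the fill only needs to be smooth (not physical), mild extrapolation error inside the horizon is acceptable for puncture evolutions.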

Although the default parameters for filling the excision region have been tuned to be optimal, these settings can be modified by the user in the parfile:

    CCTK_INT interp_order "Interpolation order for smooth junk"
    {
      3 :: "Parabolic"
      5 :: "Quartic"
      7 :: "6th order polynomial"
      8 :: "7th order polynomial"
    } 8

    CCTK_REAL interpolation_offset "Interpolation offset (in units of r_AH)"
    {
      -1:* :: "Anything goes"
    } 0.

    CCTK_REAL delta_r_rel "Relative dr spacing for the interpolation polynomial"
    {
      -1:* :: "Anything goes"
    } 0.3

    BOOLEAN puncture_lapse "Set a puncture lapse alp = psi^-2 inside the hole"
    {
    } "no"
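For example, to lower the extrapolation order and activate the puncture lapse, one could add the following to the parfile (values illustrative):

```
KadathImporter::interp_order   = 5
KadathImporter::puncture_lapse = "yes"
```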

3 In the Background

This importer interfaces with the initial data exporters that are built into the FUKA library which is where all the actual “work” is done. The source code for these can be found in


These exporters provide a simple API that allows one to easily integrate FUKA initial data into a new evolution code. The KadathImporter passes the filename of the initial data along with vectors of points at which to interpolate the initial data solution. In return, the exporter passes back a vector containing the necessary ADMBase and HydroBase quantities.

Note: FUKA initial data involving neutron stars use cold equations of state and have no knowledge of temperature or electron fraction. These must be set by another thorn! For an example, see the KadathPizza thorn [3], where these quantities are initialized after initial data import, but before the FreeTHC initial data converter runs.

3.1 Interaction With Other Thorns

The KadathImporter extends the parameters of the ADMBase and HydroBase thorns so that it can be selected to set the ADMBase and HydroBase variables.

To import metric quantities from an initial data solution, the following ADMBase parameters need to be set to Kadath:

It is important to note that on import

In the case of initial data involving a neutron star, HydroBase::initial_hydro must also be set to Kadath.
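A sketch of the relevant parfile settings, assuming the standard ADMBase and HydroBase initial-data parameters (which parameters the thorn actually extends should be verified against its param.ccl):

```
# Assumed standard ADMBase selections -- verify against the thorn's param.ccl.
ADMBase::initial_data    = "Kadath"
ADMBase::initial_lapse   = "Kadath"
ADMBase::initial_shift   = "Kadath"

# Only for initial data involving a neutron star:
HydroBase::initial_hydro = "Kadath"
```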

3.2 Examples

A short test for verifying that the importer interface is working is available in the test directory. Example parameter files for a BBH and a neutron star are available in the par directory.

3.3 Support and Feedback

The current active developer and maintainer is

    Samuel Tootle - tootle@itp.uni-frankfurt.de

4 History

The original KadathThorn and KadathImporter thorns, predating their inclusion in the ETK, were developed by L. Jens Papenfort, Samuel Tootle, and Elias Most for binary black hole and binary neutron star initial data. Samuel Tootle later rewrote the thorns to encapsulate all initial data types available from FUKA, updated them to meet ETK inclusion standards, and wrote the documentation.

4.1 Acknowledgements

The authors are grateful for the continued support of Philippe Grandclément, author of the KADATH spectral library, throughout the development of the FUKA codes.


[1]   L. J. Papenfort et al., "New public code for initial data of unequal-mass, spinning compact-object binaries", Phys. Rev. D 104, 024057 (2021), doi:10.1103/PhysRevD.104.024057

[2]   Frankfurt University/KADATH (FUKA) initial data codes, https://bitbucket.org/fukaws/fuka

[3]   Frankfurt University/KADATH workspace, https://bitbucket.org/fukaws/

5 Parameters

Scope: private  REAL  delta_r_rel

Description: Relative dr spacing for the interpolation polynomial

Range: -1:* (anything goes)   Default: 0.3

Scope: private  STRING  filename

Description: Input config file name

Default: (none)

Scope: private  INT  interp_order

Description: Interpolation order for smooth junk

Range: 3 (parabolic), 5 (quartic), 7 (6th order polynomial), 8 (7th order polynomial)   Default: 8

Scope: private  REAL  interpolation_offset

Description: Interpolation offset (in units of r_AH)

Range: -1:* (anything goes)   Default: 0.

Scope: private  BOOLEAN  puncture_lapse

Description: Set a puncture lapse alp = psi^-2 inside the hole

Default: no

Scope: restricted  STRING  type

Description: ID type

Range: "BBH" (binary black hole ID), "BH" (black hole ID), "BNS" (binary neutron star ID), "NS" (neutron star ID), "BHNS" (black hole neutron star ID)   Default: "BBH"

Scope: shared from HYDROBASE  KEYWORD  initial_hydro

Extends ranges: "Kadath" :: "Kadath Initial Data"

6 Interfaces








7 Schedule

This section lists all the variables which are assigned storage by thorn Fuka/KadathImporter. Storage can either last for the duration of the run (Always means that if this thorn is activated storage will be assigned, Conditional means that if this thorn is activated storage will be assigned for the duration of the run if some condition is met), or can be turned on for the duration of a schedule function.



Scheduled Functions

  CCTK_PARAMCHECK (conditional)
    check parameters
    Type: function

  ADMBase_InitialData (conditional)
    set up binary black hole initial data
    Type: function

  ADMBase_InitialData (conditional)
    set up black hole initial data
    Type: function

  HydroBase_Initial (conditional)
    set up binary neutron star initial data
    Before: admbase_postinitial
    Type: function

  HydroBase_Initial (conditional)
    set up neutron star initial data
    Before: admbase_postinitial
    Type: function

  HydroBase_Initial (conditional)
    set up black hole neutron star initial data
    Before: admbase_postinitial
    Type: function