Custom scripts using CPL

CPL can be used to create custom CFD workflows that fall outside the capabilities of the command-line applications. The classes used to build the command-line applications can likewise be used to create custom Python scripts, as shown in the following example.

This tutorial mimics the workflow of the task file used for the VOF multiphase solver damBreak tutorial provided with Caelus. To follow along, it is recommended that the user download the custom CPL script. It is assumed the user is executing the script from within the $CAELUS_PROJECT_DIR/tutorials/multiphase/vof/vofSolver/ras/damBreak directory. To use CPL’s Python interface directly, the user needs to ensure CPL is installed, preferably in a conda or virtualenv environment (see: Installing Caelus Python Library (CPL)). Since this is a Python script, other non-CPL functionality can be used in coordination with CPL (e.g. matplotlib).

import os
import sys
import shutil

import matplotlib.pyplot as plt

Several CPL classes and methods are required. Refer to the CPL Python API docs (caelus) for a complete listing of modules and associated functionality.

from caelus.config.cmlenv import cml_get_version
from caelus.io import DictFile, DecomposeParDict
from caelus.run.cmd import CaelusCmd
from caelus.run.core import get_mpi_size
from caelus.post.logs import LogProcessor, SolverLog
from caelus.post.plots import CaelusPlot

An environment specifies the particular OpenFOAM or CML version and installation location. This example loads the default version (calling cml_get_version with no argument returns the default).

print("Searching for default caelus version...")
cenv_default = cml_get_version()

cenv = cenv_default
print("Using Caelus version: " + cenv.version)
print("Caelus path: " + cenv.project_dir)

Commands are run using CaelusCmd. The environment is passed to the job manager object. The command is executed by calling the object, which returns a status code for error checking. Here, the meshing application, blockMesh, is run.

status = 0
print("Executing blockMesh... ")
caelus_cmd = CaelusCmd("blockMesh", cml_env=cenv)
status = caelus_cmd()
if status != 0:
    print("ERROR generating blockMesh. Exiting!")
    sys.exit(1)

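The call-and-check pattern above is the same one used with any external command. As a minimal standard-library analogy (not CPL itself), a command can be run with subprocess and its return code checked the same way:

```python
import subprocess
import sys

# Hypothetical stand-in for a mesher/solver invocation; CaelusCmd wraps a
# similar subprocess call plus Caelus environment setup and logging.
proc = subprocess.run([sys.executable, "-c", "print('meshing done')"])
if proc.returncode != 0:
    print("ERROR running command. Exiting!")
    sys.exit(1)
```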
Use built-in Python modules for filesystem-related tasks.

shutil.copy2("0/", "0/alpha1")
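More involved file staging follows the same pattern. For example, a scratch copy of a case's 0 directory can be made with the standard library alone (the directory and file names below are illustrative, not the real tutorial files):

```python
import os
import shutil
import tempfile

# Build a throwaway case skeleton (illustrative names only)
case = tempfile.mkdtemp()
os.makedirs(os.path.join(case, "0"))
with open(os.path.join(case, "0", "alpha1"), "w") as fh:
    fh.write("// initial field\n")

# Back up the 0 directory before modifying fields in place
shutil.copytree(os.path.join(case, "0"), os.path.join(case, "0.orig"))
print(sorted(os.listdir(case)))  # → ['0', '0.orig']
```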

The solution is initialized using setFields, run with CaelusCmd as shown previously.

status = 0
print("Executing setFields... ")
caelus_cmd = CaelusCmd("setFields", cml_env=cenv)
caelus_cmd.cml_exe_args = "-latestTime"
status = caelus_cmd()
if status != 0:
    print("ERROR running setFields. Exiting!")
    sys.exit(1)

An automated way to detect and set up a parallel run is to check for a system/decomposeParDict file, use the DecomposeParDict class to retrieve the numberOfSubdomains parameter, and set the number of MPI ranks to run applications with.

decomp_dict = DecomposeParDict.read_if_present()

parallel = decomp_dict['numberOfSubdomains'] > 1
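DecomposeParDict handles the parsing; conceptually it amounts to extracting the keyword from the dictionary text, which can be sketched with a regular expression (a simplification of what CPL's parser actually does, using a fabricated dictionary fragment):

```python
import re

# Minimal decomposeParDict fragment (illustrative content)
dict_text = """
numberOfSubdomains  4;

method              scotch;
"""

# Pull out the integer value of the numberOfSubdomains entry
match = re.search(r"numberOfSubdomains\s+(\d+)\s*;", dict_text)
num_ranks = int(match.group(1)) if match else 1
parallel = num_ranks > 1
print(num_ranks, parallel)  # → 4 True
```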

status = 0
solver_cmd = CaelusCmd("vofSolver", cml_env=cenv)

if parallel:
    print("Executing decomposePar... ")
    decomp_cmd = CaelusCmd("decomposePar", cml_env=cenv)
    decomp_cmd.cml_exe_args = "-force"
    status = decomp_cmd()
    if status != 0:
        print("ERROR running decomposePar. Exiting!")
        sys.exit(1)
    solver_cmd.num_mpi_ranks = decomp_dict['numberOfSubdomains']
    solver_cmd.parallel = True
    print("Executing vofSolver in parallel on %d cores..." % solver_cmd.num_mpi_ranks)
else:
    print("Executing vofSolver...")

status = solver_cmd()
if status != 0:
    print("ERROR running vofSolver. Exiting!")
    sys.exit(1)

Finally, the SolverLog class is invoked to parse the log file and generate a plot of the residuals.

print("Processing logs... ")
clog = SolverLog(logfile="vofSolver.log")
cplot = CaelusPlot(clog.casedir)
cplot.plot_continuity_errors = True
cplot.plot_residuals_hist(plotfile="residuals.png")
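SolverLog extracts residual histories from the raw solver output. A simplified sketch of that kind of parsing (not CPL's implementation, run on a fabricated log snippet) looks like:

```python
import re

# A few lines mimicking the shape of solver output (fabricated values)
log_text = """\
Time = 0.001
smoothSolver:  Solving for Ux, Initial residual = 0.05, Final residual = 1e-06, No Iterations 3
Time = 0.002
smoothSolver:  Solving for Ux, Initial residual = 0.01, Final residual = 5e-07, No Iterations 2
"""

# Collect (field, initial residual) pairs from each solver line
residuals = []
for line in log_text.splitlines():
    m = re.search(r"Solving for (\w+), Initial residual = ([\deE.+-]+)", line)
    if m:
        residuals.append((m.group(1), float(m.group(2))))

print(residuals)  # → [('Ux', 0.05), ('Ux', 0.01)]
```

CPL's classes additionally track the time values and iteration counts, which is what allows CaelusPlot to draw full residual histories.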