{"id":26009860,"url":"https://github.com/ajaust/precice-parallel-solverdummies","last_synced_at":"2025-03-05T22:08:46.966Z","repository":{"id":48346273,"uuid":"364157474","full_name":"ajaust/precice-parallel-solverdummies","owner":"ajaust","description":"Very simple solver dummies for the different bindings of preCICE. These dummies can run in parallel and one can run arbitrary combinations of these bindings.","archived":false,"fork":false,"pushed_at":"2022-11-30T19:26:41.000Z","size":50,"stargazers_count":9,"open_issues_count":0,"forks_count":2,"subscribers_count":3,"default_branch":"main","last_synced_at":"2023-03-06T13:47:22.709Z","etag":null,"topics":["cpp","mpi","parallel","precice","python-bindings","solver-dummies"],"latest_commit_sha":null,"homepage":"","language":"Fortran","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"lgpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ajaust.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2021-05-04T06:12:30.000Z","updated_at":"2022-12-24T14:25:56.000Z","dependencies_parsed_at":"2022-08-30T00:20:32.757Z","dependency_job_id":null,"html_url":"https://github.com/ajaust/precice-parallel-solverdummies","commit_stats":null,"previous_names":[],"tags_count":null,"template":null,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ajaust%2Fprecice-parallel-solverdummies","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ajaust%2Fprecice-parallel-solverdummies/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ajaust%2Fprecice-parallel-solverdummies/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/
ajaust%2Fprecice-parallel-solverdummies/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ajaust","download_url":"https://codeload.github.com/ajaust/precice-parallel-solverdummies/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":242111438,"owners_count":20073433,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cpp","mpi","parallel","precice","python-bindings","solver-dummies"],"created_at":"2025-03-05T22:05:02.326Z","updated_at":"2025-03-05T22:08:46.959Z","avatar_url":"https://github.com/ajaust.png","language":"Fortran","readme":"# Parallel solver dummies for preCICE\n\nThis repository currently contains five solver dummies for the coupling library [preCICE](https://github.com/precice/precice) that can be executed in parallel. The solver dummies can also be mixed with each other, i.e., each solver can play any role in the coupling!\n\n
Currently included solver dummies:\n- C solver dummy: `c/solverdummy-c-parallel.cpp`\n- C++ solver dummy: `cpp/solverdummy-cpp-parallel.cpp`\n- Fortran solver dummy: `fortran/solverdummy-fortran-parallel.f90`\n- Julia solver dummy: `julia/solverdummy-julia-parallel.jl`\n- Python solver dummy: `python/solverdummy-python-parallel.py`\n\n## Requirements\n\n**All dummies**\n- preCICE v2 (only tested with v2.2.0)\n\n**C dummy**\n- CMake\n- A C compiler\n- MPI (only tested with OpenMPI)\n\n**C++ dummy**\n- CMake\n- A C++ compiler\n- MPI (only tested with OpenMPI)\n\n**Fortran dummy**\n- CMake\n- A Fortran compiler\n- MPI (only tested with OpenMPI)\n\n**Julia dummy**\n- Julia `v1.6` or above\n- [Julia bindings for preCICE](https://github.com/precice/PreCICE.jl)\n\n**Python dummy**\n- Python 3\n- mpi4py\n- [Python bindings for preCICE](https://github.com/precice/python-bindings)\n\nIf you have preCICE and the Python bindings installed, all requirements should already be met.\n\n## Compilation\n\n### C\n\nCompile the C solver dummy and copy it back to the `c` directory:\n\n```\ncd c\nmkdir build\ncd build\ncmake ..\nmake\ncp solverdummy-c-parallel ..\ncd ..\n```\n\n### C++\n\nCompile the C++ solver dummy and copy it back to the `cpp` directory:\n\n```\ncd cpp\nmkdir build\ncd build\ncmake ..\nmake\ncp solverdummy-cpp-parallel ..\ncd ..\n```\n\n### Fortran\n\nCompile the Fortran solver dummy and copy it back to the `fortran` directory:\n\n```\ncd fortran\nmkdir build\ncd build\ncmake ..\nmake\ncp solverdummy-fortran-parallel ..\ncd ..\n```\n\n## Test solvers\n\n`N` and `M` refer to the numbers of ranks one wants to use. Start with `1` to run serial computations and increase `N` and `M` afterwards for truly parallel computations.\n\n
Further information:\n- `N` and `M` may be different numbers.\n- The solvers have to run in different shells, or you have to send the first solver to the background.\n- You pass the participant's name and the corresponding mesh name on the command line. The name and mesh have to be one of the following:\n    - `SolverOne` and `MeshOne`\n    - `SolverTwo` and `MeshTwo`\n- Replace `mpirun` with the MPI wrapper suitable for your machine.\n- Step into the directory of the solver you want to use, e.g., `cd cpp/` for the C++ solver.\n- If you copy the solvers into other directories, please adapt the file `precice-config-parallel.xml` accordingly.\n\n**Note:** You can mix the different dummies arbitrarily with each other. Some examples are given below.\n\n### Setup 1: C++ and C++\n\n```\nmpirun -n N ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverOne\nmpirun -n M ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverTwo\n```\n\nExample:\n```\nmpirun -n 4 ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverOne\nmpirun -n 2 ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverTwo\n```\n\n### Setup 2: Python and Python\n\n```\nmpirun -n N python3 solverdummy-python-parallel.py ../precice-config-parallel.xml SolverOne\nmpirun -n M python3 solverdummy-python-parallel.py ../precice-config-parallel.xml SolverTwo\n```\n\nExample:\n```\nmpirun -n 4 python3 solverdummy-python-parallel.py ../precice-config-parallel.xml SolverOne\nmpirun -n 2 python3 solverdummy-python-parallel.py ../precice-config-parallel.xml SolverTwo\n```\n\n### Setup 3: C++ and Python\n\n```\nmpirun -n N ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverOne\nmpirun -n M python3 solverdummy-python-parallel.py ../precice-config-parallel.xml SolverTwo\n```\n\nExample:\n```\nmpirun -n 4 ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverOne\nmpirun -n 2 python3 solverdummy-python-parallel.py ../precice-config-parallel.xml SolverTwo\n```\n\n
### Setup 4: Fortran and C++\n\n```\nmpirun -n N ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverOne\nmpirun -n M ./solverdummy-fortran-parallel ../precice-config-parallel.xml SolverTwo\n```\n\nExample:\n```\nmpirun -n 4 ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverOne\nmpirun -n 2 ./solverdummy-fortran-parallel ../precice-config-parallel.xml SolverTwo\n```\n\n### Setup 5: C++ and C\n\n```\nmpirun -n N ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverOne\nmpirun -n M ./solverdummy-c-parallel ../precice-config-parallel.xml SolverTwo\n```\n\nExample:\n```\nmpirun -n 4 ./solverdummy-cpp-parallel ../precice-config-parallel.xml SolverOne\nmpirun -n 2 ./solverdummy-c-parallel ../precice-config-parallel.xml SolverTwo\n```\n\n### Setup 6: Julia and Julia\n\nThe Julia solver dummy uses Julia's native parallelization and not the MPI wrapper.\n\nYou need to add `\u003cmaster:sockets/\u003e` to the `precice-config-parallel.xml` file for both participants.\n\nExample:\n\n```xml\n\u003cparticipant name=\"SolverOne\"\u003e\n    \u003cmaster:sockets/\u003e\n    \u003cuse-mesh name=\"MeshOne\" provide=\"yes\"/\u003e\n    \u003cwrite-data name=\"dataOne\" mesh=\"MeshOne\" /\u003e\n    \u003cread-data  name=\"dataTwo\" mesh=\"MeshOne\" /\u003e\n\u003c/participant\u003e\n```\n\nFor convenience, the repository provides a `precice-config-parallel-julia.xml` configuration that can be used with the Julia solver dummy.\n\nRun the dummies with\n\n```shell\njulia solverdummy-julia-parallel.jl N ../precice-config-parallel-julia.xml SolverOne\njulia solverdummy-julia-parallel.jl M ../precice-config-parallel-julia.xml SolverTwo\n```\n\nExample:\n```shell\njulia solverdummy-julia-parallel.jl 4 ../precice-config-parallel-julia.xml SolverOne\njulia solverdummy-julia-parallel.jl 2 ../precice-config-parallel-julia.xml SolverTwo\n```\n\n
### Setup 7: Julia and C++\n\nAs with [Setup 6](#setup-6-julia-and-julia), you need to add `\u003cmaster:sockets/\u003e` to the `precice-config-parallel.xml` file for both participants, or you can use the provided `precice-config-parallel-julia.xml` configuration.\n\n```shell\nmpirun -n N ./solverdummy-cpp-parallel ../precice-config-parallel-julia.xml SolverOne MeshOne\njulia solverdummy-julia-parallel.jl M ../precice-config-parallel-julia.xml SolverTwo\n```\n\nExample:\n```shell\nmpirun -n 4 ./solverdummy-cpp-parallel ../precice-config-parallel-julia.xml SolverOne MeshOne\njulia solverdummy-julia-parallel.jl 2 ../precice-config-parallel-julia.xml SolverTwo\n```\n\n## Remarks\n\n- The code is heavily based on the solver dummies of [preCICE](https://github.com/precice/precice) and the corresponding [Python bindings](https://github.com/precice/python-bindings).","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fajaust%2Fprecice-parallel-solverdummies","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fajaust%2Fprecice-parallel-solverdummies","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fajaust%2Fprecice-parallel-solverdummies/lists"}