Problem using FDS for MPI parallel runs in Xshell

sh/bash/dash/ksh/zsh and other shell scripts
lizijiang
Posts: 5
Joined: 2021-03-30 21:03
System: linux

Problem using FDS for MPI parallel runs in Xshell

#1

Post by lizijiang » 2021-03-30 21:19

The FDS User's Guide suggests using the following shell script for MPI parallel runs.

Code: Select all

#!/bin/bash
#PBS -N job_name
#PBS -e <pwd>/job_name.err
#PBS -o <pwd>/job_name.log
export OMP_NUM_THREADS=2
export I_MPI_PIN_DOMAIN=omp
cd <pwd>
mpiexec -n 8 <full_path>/fds job_name.fds
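
For readers unfamiliar with the template, here is a minimal sketch with the placeholders filled in. The job name, run directory, and fds binary location below are assumptions for illustration only, not values from the guide:

#!/bin/bash
#PBS -N my_case                             # job name (hypothetical)
#PBS -e /home/user/run/my_case.err          # <pwd> replaced by the run directory (assumed path)
#PBS -o /home/user/run/my_case.log
export OMP_NUM_THREADS=2                    # 2 OpenMP threads per MPI process
export I_MPI_PIN_DOMAIN=omp                 # pin each MPI rank to its own OpenMP domain
cd /home/user/run                           # <pwd>: the directory containing my_case.fds
mpiexec -n 8 /opt/fds/bin/fds my_case.fds   # <full_path>: where the fds binary lives (assumed)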
I wrote the following script based on the number of nodes I need, but it reports errors when run.

Code: Select all

#!/bin/bash
#PBS -N detailed_MPI22
#PBS -e /THL6/home/openfoam/FDS/FDSmodel/single_building/detailed_MPI22/detailed_MPI22.err
#PBS -o /THL6/home/openfoam/FDS/FDSmodel/single_building/detailed_MPI22/detailed_MPI22.log
#PBS -l nodes=11:ppn=2
#PBS -l walltime=24:0:0
export OMP_NUM_THREADS=8
export I_MPI_PIN_DOMAIN=omp
cd /openfoam/FDS/FDSmodel/single_building/detailed_MPI22
mpiexec -n 22 /THL6/home/openfoam/FDS/FDSmodel/single_building/detailed_MPI22/fds detailed_MPI22.fds
The error output after running:
[openfoam@th-hpc1-ln0 ~]$ mpiexec -np 22 FDS_MPI.pbs
[proxy:0:0@th-hpc1-ln0] HYD_spawn (../../../../../src/pm/i_hydra/libhydra/spawn/intel/hydra_spawn.c:128): execvp error on file FDS_MPI.pbs (No such file or directory)
(the same execvp error line is repeated 20 times, once per spawned process)

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 14 PID 6040 RUNNING AT th-hpc1-ln0
= KILLED BY SIGNAL: 9 (Killed)
===================================================================================

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 20 PID 6046 RUNNING AT th-hpc1-ln0
= KILLED BY SIGNAL: 9 (Killed)
===================================================================================

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 21 PID 6047 RUNNING AT th-hpc1-ln0
= KILLED BY SIGNAL: 9 (Killed)
===================================================================================

I connect to the supercomputer remotely. Does anyone know how I should modify this, or how to allocate the MPI processes in Xshell, so that FDS can run in parallel?
I am a complete Linux novice and don't know any of these shell commands, but unfortunately running fds requires them. Please advise!
astolia
Forum Moderator
Posts: 6455
Joined: 2008-09-18 13:11

Re: Problem using FDS for MPI parallel runs in Xshell

#2

Post by astolia » 2021-03-31 9:30

Since you are going to use a Linux system and write shell scripts, take some time to learn the shell basics. If you have a background in any other programming language it is quick to pick up; squeezing out half a day is enough.

The command you ran is mpiexec -np 22 FDS_MPI.pbs, where FDS_MPI.pbs is a file path. On Linux, any file path that does not start with / is a relative path, that is, a path relative to the current working directory. Judging from your prompt [openfoam@th-hpc1-ln0 ~], your current directory is the current user's home directory ~. In other words, you asked mpiexec to run the file FDS_MPI.pbs in your home directory, but that file does not exist there, hence the error No such file or directory.

Figure out where you actually put FDS_MPI.pbs and write the path correctly. Also, since mpiexec will execute FDS_MPI.pbs, don't forget to give FDS_MPI.pbs executable permission.
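
For example, a minimal sketch of the checks described above; the location of FDS_MPI.pbs below is an assumption for illustration:

pwd                                       # show the current directory, e.g. /THL6/home/openfoam
ls -l FDS_MPI.pbs                         # a relative path is looked up in the current directory only
ls -l /THL6/home/openfoam/FDS_MPI.pbs     # an absolute path starts with / and works from anywhere
chmod +x /THL6/home/openfoam/FDS_MPI.pbs  # mpiexec executes the script, so it needs the executable bit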
lizijiang
Posts: 5
Joined: 2021-03-30 21:03
System: linux

Re: Problem using FDS for MPI parallel runs in Xshell

#3

Post by lizijiang » 2021-03-31 9:38

astolia wrote: 2021-03-31 9:30 Since you are going to use a Linux system and write shell scripts, take some time to learn the shell basics. If you have a background in any other programming language it is quick to pick up; squeezing out half a day is enough.

The command you ran is mpiexec -np 22 FDS_MPI.pbs, where FDS_MPI.pbs is a file path. On Linux, any file path that does not start with / is a relative path, that is, a path relative to the current working directory. Judging from your prompt [openfoam@th-hpc1-ln0 ~], your current directory is the current user's home directory ~. In other words, you asked mpiexec to run the file FDS_MPI.pbs in your home directory, but that file does not exist there, hence the error No such file or directory.

Figure out where you actually put FDS_MPI.pbs and write the path correctly. Also, since mpiexec will execute FDS_MPI.pbs, don't forget to give FDS_MPI.pbs executable permission.
Thank you very much, I'll try modifying it.
lizijiang
Posts: 5
Joined: 2021-03-30 21:03
System: linux

Re: Problem using FDS for MPI parallel runs in Xshell

#4

Post by lizijiang » 2021-03-31 15:43

With everyone's help, I modified the shell script to:
#!/bin/bash
#PBS -N detailed_MPI22
#PBS -e /THL6/home/openfoam/FDS/FDSmodel/single_building/detailed_MPI22/detailed_MPI22.err
#PBS -o /THL6/home/openfoam/FDS/FDSmodel/single_building/detailed_MPI22/detailed_MPI22.log
#PBS -l nodes=6:ppn=2
#PBS -l walltime=24:0:0
export OMP_NUM_THREADS=8
export I_MPI_PIN_DOMAIN=omp
cd /THL6/home/openfoam/FDS/FDSmodel/single_building/detailed_MPI22
mpiexec -n 12 fds detailed_MPI22.fds

Then I gave the script executable permission and ran it:
chmod +x ./FDS_MPI.pbs
mpirun -np 12 FDS_MPI.pbs
The FDS file then ran successfully in parallel.

Note: the .pbs file is saved in /THL6/home/openfoam, while the .fds input to run is saved in /THL6/home/openfoam/FDS/FDSmodel/single_building/detailed_MPI22.
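
For anyone reproducing this, a quick sketch to verify the layout described above (paths taken from this post):

ls -l /THL6/home/openfoam/FDS_MPI.pbs     # the job script sits in the home directory
ls -l /THL6/home/openfoam/FDS/FDSmodel/single_building/detailed_MPI22/detailed_MPI22.fds  # the FDS input
cd /THL6/home/openfoam                    # run mpirun from here so the relative path FDS_MPI.pbs resolves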
I hope this helps other beginners who later need to run fds in parallel with MPI; what the user guide says is really too brief.