orca 4.1

jwk
Posts: 73
Joined: Sat Jun 20, 2020 3:37 am
Full Name: John Keller
Organization: University of Alaska Fairbanks
Subdiscipline: Chemistry

orca 4.1

Post by jwk »

I installed ORCA 4.1.2 on a CentOS 7.6 WebMO remote compute server, and the test job ran fine under the local WebMO user's account. Looking at the WebMO 19 Enterprise Interface Manager for ORCA, I see a box "ORCA MPI setup script" with a default entry "/share/apps/orca3/openmpi165/bin/mpivars.sh". What should this box contain for the current version of ORCA? (Searching the old forum, I found an entry from July 20, 2017 asking the same question, but no answer was recorded.)

jwk
Posts: 73
Joined: Sat Jun 20, 2020 3:37 am
Full Name: John Keller
Organization: University of Alaska Fairbanks
Subdiscipline: Chemistry

Re: orca 4.1

Post by jwk »

More info: I left this box blank and submitted the changes. ORCA now shows up in the Compute Engines list for that server, but a job fails, with the .out file containing the following lines near the end:

-------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
-------------------------------------------------------
/usr/local/orca_4_1_2_linux_x86-64_shared_openmpi313/orca_gtoint_mpi: error while loading shared libraries: libmpi.so.40: cannot open shared object file: No such file or directory
....
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

Process name: [[52237,1],0]
Exit code: 127
--------------------------------------------------------------------------

ORCA finished by error termination in GTOInt
Calling Command: mpirun -np 4 /usr/local/orca_4_1_2_linux_x86-64_shared_openmpi313/orca_gtoint_mpi input.int.tmp input
[file orca_tools/qcmsg.cpp, line 458]:
.... aborting the run
--------------------------------------------------------------------------
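
For reference, libmpi.so.40 is the soname provided by OpenMPI 3.x, the version this shared-library ORCA build (openmpi313) was linked against. A minimal way to check whether the dynamic loader can actually find it, using standard tools and the install path from the error above:

$ /sbin/ldconfig -p | grep libmpi.so.40
$ ldd /usr/local/orca_4_1_2_linux_x86-64_shared_openmpi313/orca_gtoint_mpi | grep "not found"

If the first command returns nothing and the second lists libmpi.so.40, the library is simply not on the loader's search path.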

jwk
Posts: 73
Joined: Sat Jun 20, 2020 3:37 am
Full Name: John Keller
Organization: University of Alaska Fairbanks
Subdiscipline: Chemistry

Re: orca 4.1

Post by jwk »

Still more info on this CentOS 7.4 system:
I installed openmpi3 and openmpi3-devel using yum and re-ran /sbin/ldconfig. The job still failed with the same error.

jwk
Posts: 73
Joined: Sat Jun 20, 2020 3:37 am
Full Name: John Keller
Organization: University of Alaska Fairbanks
Subdiscipline: Chemistry

Re: orca 4.1

Post by jwk »

If "!PAL2" (no quotes) is inserted as line 1 in h2o.inp, the test job fails with the same messages as above. So this is a problem with the ORCA installation for parallel processing, not WebMO. On this system, jack is the local WebMO user:

$ which mpirun
/usr/lib64/openmpi/bin/mpirun
$ echo $PATH
/usr/lib64/openmpi/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/home/jack/.local/bin:/home/jack/bin
$ echo $LD_LIBRARY_PATH
/usr/lib64/openmpi/lib::/usr/local/mopac2016
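
For comparison, note that PATH and LD_LIBRARY_PATH above point at the stock /usr/lib64/openmpi tree, not the /usr/lib64/openmpi3 tree installed earlier with yum. A minimal check of which soname each tree actually provides (the shared ORCA build needs libmpi.so.40):

$ ls /usr/lib64/openmpi/lib/libmpi.so.*     # stock CentOS OpenMPI; older soname
$ ls /usr/lib64/openmpi3/lib/libmpi.so.*    # OpenMPI 3.1.x; should include libmpi.so.40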

jwk
Posts: 73
Joined: Sat Jun 20, 2020 3:37 am
Full Name: John Keller
Organization: University of Alaska Fairbanks
Subdiscipline: Chemistry

Re: orca 4.1

Post by jwk »

If you choose *1 processor* in an ORCA job in WebMO, it runs normally, with no error messages.

schmidt
Posts: 83
Joined: Sat May 30, 2020 3:00 pm
Full Name: JR Schmidt
Organization: WebMO, LLC

Re: orca 4.1

Post by schmidt »

The "mpivars.sh" (or whatever you choose to enter here!) is a script that will be "sourced" prior to have WebMO run ORCA. In particular, this script needs to setup the environment (e.g. LD_LIBRARY_PATH, etc.) to provide access to the MPI libraries you use on your system (and more specially, those for which ORCA was complied against). If you use "modules" on your system, you could also do a "module load" in this script to get things setup correctly.

ORCA is failing because it cannot find the MPI libraries; a proper setup script fixes that.
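
For example, a minimal setup script might look like the sketch below. The paths are illustrative only; point them at whatever OpenMPI installation matches the openmpi313 build of ORCA, or use a module load instead if your cluster provides environment modules (the module name shown is site-specific):

#!/bin/bash
# Sourced by WebMO before it launches ORCA.
# Illustrative path; substitute your actual OpenMPI 3.1.x installation.
export PATH=/path/to/openmpi-3.1.3/bin:$PATH
export LD_LIBRARY_PATH=/path/to/openmpi-3.1.3/lib:$LD_LIBRARY_PATH
# Alternative on module-based systems (module name is site-specific):
# module load openmpi/3.1.3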

jwk
Posts: 73
Joined: Sat Jun 20, 2020 3:37 am
Full Name: John Keller
Organization: University of Alaska Fairbanks
Subdiscipline: Chemistry

Re: orca 4.1

Post by jwk »

OK, parallel processing is now working on several machines running either CentOS 7.6 or 8.1. Expanding on J.R.'s comment:

1. As su, download and expand orca_4_1_2_linux_x86-64_shared_openmpi313.tar.zst as described in the WebMO Instructions. ORCA is installed in the default directory /usr/local/orca_4_1_2_linux_x86-64_shared_openmpi313/
2. Install openmpi3 version 3.1.3 from the base repository ("yum install openmpi3-devel"), which CentOS places in the /usr/lib64/openmpi3/ directory.
3. Use vim to create a 2-line bash script setuporca.sh and save it to the orca_4_1.. directory:
export PATH=/usr/lib64/openmpi3/bin:$PATH
export LD_LIBRARY_PATH=/usr/lib64/openmpi3/lib:$LD_LIBRARY_PATH
4. Finally, in the WebMO Interface Manager for ORCA, enter in the top box
/usr/local/orca_4_1_2_linux_x86-64_shared_openmpi313
and in the next box
/usr/local/orca_4_1_2_linux_x86-64_shared_openmpi313/setuporca.sh
Set the number of cores, submit, and quit.
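
As a quick sanity check before submitting through WebMO (using the default install path above), the script can be sourced in a shell to confirm the right MPI is picked up:

$ source /usr/local/orca_4_1_2_linux_x86-64_shared_openmpi313/setuporca.sh
$ which mpirun        # should now be /usr/lib64/openmpi3/bin/mpirun
$ mpirun --version    # should report Open MPI 3.1.3
$ ldd /usr/local/orca_4_1_2_linux_x86-64_shared_openmpi313/orca_gtoint_mpi | grep libmpi
# libmpi.so.40 should now resolve to /usr/lib64/openmpi3/lib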
