Hi Everyone,
I have a batch script which looks like this:
-----------------------------------------------------------------------------------------------------------
#!/bin/bash
#SBATCH -c 1 # Number of CPUs per task
#SBATCH --mem=2G # Memory per node (here 2 GB) ##change memory
#SBATCH -J LS-DYNA # Name of Job
#SBATCH -o %N.%j.%a.out # Output file name
#SBATCH -e %N.%j.%a.err # Error file name
#SBATCH --time=01:00:00 ##change time
#SBATCH --mail-type=end
#SBATCH --mail-user=----- ##change email address
# Remove all loaded modules
module purge
# Load Ansys/2022.r1
module load ansys/2022.r1
# Memory granted by Slurm for the node, in MB
memory=$SLURM_MEM_PER_NODE
# Run command: i= specifies the input file, memory= the memory available, and ncpu= the number of CPUs used
lsdyna -dp i=Main_Revised.k memory=${memory}M ncpu=$SLURM_CPUS_PER_TASK
----------------------------------------------------------------------------------------------------------------------
This file works fine when I run small meshes on the HPC. However, with large meshes I get an error message asking for more memory to be allocated (in words). Could anyone explain how to do that and which lines I should change?
P.S: I'm doing implicit analyses using SMP double precision solver.
A lot of the time this error is due to unstable simulations where contacts fail or parts blow up. Can you try running a simulation you know works, like an example from the LS-DYNA examples site?
https://www.dynaexamples.com/implicit/basic-examples
I'm sure you've already tried just adding more memory.
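If it really is a memory-sizing issue, here is a minimal sketch of one way to derive a `memory=` value in words from what Slurm grants. This assumes the double-precision solver (where one word is 8 bytes), that `SLURM_MEM_PER_NODE` is reported in MB, and an arbitrary ~10% headroom factor; the variable names are illustrative, not part of your script:

```shell
#!/bin/bash
# Sketch: size LS-DYNA's memory= argument in words from Slurm's grant.
# Assumption: dp solver, so 1 word = 8 bytes; SLURM_MEM_PER_NODE is in MB.
mem_mb=${SLURM_MEM_PER_NODE:-2048}     # MB granted by Slurm (2048 if unset)
words=$(( mem_mb * 1024 * 1024 / 8 ))  # total dp words in the allocation
words=$(( words * 9 / 10 ))            # leave ~10% headroom for the OS
echo "memory=${words}"
# The run line would then pass the plain word count, e.g.
# lsdyna -dp i=Main_Revised.k memory=${words} ncpu=$SLURM_CPUS_PER_TASK
```

Note the contrast with your current line, which passes `memory=${memory}M`: with an `M` suffix LS-DYNA reads the number as millions of words, so 2048M would request far more than the 2 GB Slurm actually gave you.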