daa-meganizer was killed during the "Binning reads: Analyzing alignments" phase

I am meganizing a DAA file and I'm encountering this error: tools/daa-meganizer: line 44: 17077 Killed.

Version MEGAN Community Edition (version 6.25.1, built 31 Aug 2023)
Author(s) Daniel H. Huson
Copyright (C) 2023 Daniel H. Huson. This program comes with ABSOLUTELY NO WARRANTY.
Java version: 20.0.1; max memory: 117.2G
Functional classifications to use: EC, EGGNOG, GTDB, INTERPRO2GO, SEED
Loading ncbi.map: 2,396,736
Loading ncbi.tre: 2,396,740
Loading ec.map: 8,200
Loading ec.tre: 8,204
Loading eggnog.map: 30,875
Loading eggnog.tre: 30,986
Loading gtdb.map: 240,103
Loading gtdb.tre: 240,107
Loading interpro2go.map: 14,242
Loading interpro2go.tre: 28,907
Loading seed.map: 961
Loading seed.tre: 962
Meganizing: …/Mayur_Compost_R1_pair.daa
Meganizing init
Annotating DAA file using FAST mode (accession database and first accession per line)
Annotating references
10% 20% 30% 40% 50% 60% 70% 80% 90% 100% (218,968.3s)
10% 20% 30% 40% 50% 60% 70% 80% 90% 100% (8,767.2s)
Binning reads Initializing…
Initializing binning…
Using ‘Naive LCA’ algorithm for binning: Taxonomy
Using Best-Hit algorithm for binning: SEED
Using Best-Hit algorithm for binning: EGGNOG
Using ‘Naive LCA’ algorithm for binning: GTDB
Using Best-Hit algorithm for binning: EC
Using Best-Hit algorithm for binning: INTERPRO2GO
Binning reads…
Binning reads Analyzing alignments
10% 20% 30% 40% 50% 60% 70% 80% 90% /home/stud/tgangar/megan/tools/daa-meganizer: line 44: 17077 Killed $java $java_flags --module-path=$modulepath --add-modules=megan megan.tools.DAAMeganizer $options

While installing, I gave it 8000 MB of memory; later I changed the memory to 120 GB by editing the vmoptions file:

#Enter one VM parameter per line
#For example, to adjust the maximum memory usage to 512 MB, uncomment the following line:
#To include another file, uncomment the following line:
#-include-options [path to other .vmoption file]
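The snippet above shows only the comment lines, not the memory line itself. For reference, a maximum Java heap in a .vmoptions file is set with a single `-Xmx` entry; a generic example of the syntax (illustrative, not the exact contents of my file) would be:

```
# Example only: cap the maximum Java heap at 120 GB
-Xmx120G
```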

I have used the following SLURM script to run it, instead of invoking the bash script directly as described here.

#!/bin/bash
#SBATCH --job-name=meganizer
#SBATCH --account="biotechnology"
#SBATCH --nodes=5
#SBATCH --ntasks-per-node=8
#SBATCH --time=7-00:00:00
#SBATCH --error=job.%J.err
#SBATCH --output=job.%J.out
#SBATCH --mem=60G

xvfb-run --auto-servernum --server-num=1 /home/stud/tgangar/megan/tools/daa-meganizer -mdb …/megan-map-Feb2022.db -i …/Mayur_Compost_R1_pair.daa
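One thing worth noting about the script above (my inference, not confirmed by the log): it requests `#SBATCH --mem=60G`, while the MEGAN log reports a Java max heap of 117.2 GB, so the job can exceed its SLURM memory allocation and be killed. Also, daa-meganizer runs as a single Java process, so requesting five nodes leaves four idle. A sketch of a header that keeps the allocation above the Java heap (illustrative values; `$MEGAN_DIR`, `$MAP_DB`, and `$INPUT_DAA` are placeholders for the actual paths):

```shell
#!/bin/bash
#SBATCH --job-name=meganizer
#SBATCH --nodes=1                # daa-meganizer is a single Java process
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=8
#SBATCH --mem=130G               # must exceed the Java heap (117.2G in the log) plus overhead
#SBATCH --time=7-00:00:00

xvfb-run --auto-servernum "$MEGAN_DIR"/tools/daa-meganizer -mdb "$MAP_DB" -i "$INPUT_DAA"
```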

I can't figure out the problem. Should I redownload the mapping file and redo all the steps?

This time the binning has completed, but the run is still getting killed at certain steps. The error code is 12956. This is my first time using MEGAN; can anyone give me a hand with this?

At what step will the files be fully meganized and ready to use in MEGAN?

MEGAN is having a hard time with this dataset…
Since the program does not appear to report an "Out of memory" error, I don't think that giving it 120 GB is necessary, or a good idea: the program uses additional memory beyond the 120 GB Java heap (the SQLite database runs outside of Java), so the process may be using too much memory in total. So, please try running with


say, and see whether that runs to completion.
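The concrete setting suggested here is missing from the post. Purely as an illustration of the idea (a smaller Java heap, leaving headroom for the out-of-Java SQLite memory), the vmoptions entry could be reduced to something like:

```
# Illustrative value only, not the exact suggestion from the reply
-Xmx64G
```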

BTW: the number 12956 is not an error code; it is the process ID.
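This can be checked with a small bash experiment (not MEGAN-specific): the number bash prints before "Killed" is the process ID, and a process terminated by SIGKILL (e.g. by the kernel's OOM killer) exits with status 137, i.e. 128 + 9:

```shell
#!/usr/bin/env bash
# Start a long-running command, kill it with SIGKILL, and inspect the result.
sleep 30 &
pid=$!                 # this is the kind of number bash shows before "Killed"
kill -9 "$pid"
wait "$pid" 2>/dev/null           # collect the exit status of the killed process
echo "PID was $pid, exit status $?"   # 137 = 128 + SIGKILL(9)
```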


Thank you @Daniel for the support and tips. It worked, and binning has completed 100%.