Daa-meganizer was killed during writing phase

Hi all,

I’m trying to meganize one of my DIAMOND DAA files, but the run is always killed during the writing phase. I tried increasing the memory assigned to MEGAN, but that didn’t help. Here is the verbose output:

Version MEGAN Ultimate Edition (version 6.24.20, built 5 Feb 2023)
Author(s) Daniel H. Huson
Copyright (C) 2023 Daniel H. Huson. This program comes with ABSOLUTELY NO WARRANTY.
Java version: 18.0.2.1
Functional classifications to use: EC, EGGNOG, GTDB, INTERPRO2GO, KEGG, SEED
Loading ncbi.map: 2,396,736
Loading ncbi.tre: 2,396,740
Loading ec.map: 8,200
Loading ec.tre: 8,204
Loading eggnog.map: 30,875
Loading eggnog.tre: 30,986
Loading gtdb.map: 240,103
Loading gtdb.tre: 240,107
Loading interpro2go.map: 14,242
Loading interpro2go.tre: 28,907
Loading kegg.map: 25,480
Loading kegg.tre: 68,226
Loading seed.map: 961
Loading seed.tre: 962
Meganizing: ~/Documents/daa_origin/sample10.daa
Meganizing init
Annotating DAA file using FAST mode (accession database and first accession per line)
Annotating references
10% 20% 30% 40% 50% 60% 70% 80% 90% 100% (149.8s)
Writing
10% 20% 30% 40% 50% 60% 70% 80% 90% ~/megan/tools/daa-meganizer: line 38: 581947 Killed $java $java_flags --module-path=$modulepath --add-modules=megan6u megan6u.tools.DAAMeganizer $options

Can anyone guide me on how to fix this? Many thanks!

This looks like a memory issue. What is the size of the DAA file? How much memory did you assign to MEGAN? How much physical memory does the machine have? If you have a very large DAA file, say 50G, then you might need to give MEGAN, say, 32G, but additional memory must also be available for the parts of the program that run “outside” of that assigned amount, at least another 32G, and more is better.
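For what it’s worth, a plain “Killed” message from the shell (as in the line 38 output above) usually means the Linux out-of-memory killer terminated the process, rather than Java running out of heap. A minimal way to check, assuming a Linux machine and a standard MEGAN installation under ~/megan (the exact vmoptions path and filename may differ for your installation):

    # Look for an OOM-killer entry in the kernel log
    dmesg -T | grep -i -E "killed process|out of memory" | tail
    # or, on systemd-based systems:
    journalctl -k | grep -i "oom" | tail

    # The command-line tools take their Java heap size from the vmoptions file;
    # check (and, if necessary, adjust) the -Xmx line there
    grep Xmx ~/megan/MEGAN.vmoptions

If the kernel log does show an oom-kill of the Java process, then the heap (-Xmx) plus the memory used outside the heap exceeded what the machine could provide, and either lowering -Xmx or freeing memory on the machine should help.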

Hi Daniel,

Thanks for your reply. The size of my DAA file is 27GB, and I assigned 500GB to MEGAN, which is within my machine’s physical memory. The strange thing is that DAA files as large as 94GB have been meganized successfully on my workstation, yet this 27GB DAA file cannot be meganized; it is always killed during the writing phase.
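One thing I could still check, in case the number of reads and alignments, rather than the raw file size, is what matters during the writing step, is how many alignments this file contains compared with the 94GB files that worked. Roughly (assuming DIAMOND is available on the PATH):

    # Count alignment records in the DAA file (tabular output, one line per alignment)
    diamond view --daa ~/Documents/daa_origin/sample10.daa | wc -l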

I have reported a similar issue before:

which I was never able to solve (even when testing with 1 TB of RAM). My idea was to use DIAMOND+MEGAN to classify contigs from a large metagenome assembly, which would really help since other annotation tools are very slow. However, in the end this was not possible because daa-meganizer always gets stuck or fails at the writing step, so I had to go back to tools like PROKKA for metagenome assembly annotation.