Meganizing a DAA file

Dear all, I am trying to meganize a DAA file (9.2 GB) created using DIAMOND blastx. However, the meganizer stops at a certain point, as shown below. Any comments?

$MEGAN/tools/daa-meganizer -i s1merged_matches.daa -a2t prot_acc2tax-Jun2018X1.abin

Version MEGAN Community Edition (version 6.10.6, built 20 Dec 2017)
Copyright © 2017 Daniel H. Huson. This program comes with ABSOLUTELY NO WARRANTY.
Java version: 1.8.0_144
Loading 1,601,128
Loading ncbi.tre: 1,601,131
Opening file: prot_acc2tax-Jun2018X1.abin
Meganizing: s1merged_matches.daa
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOfRange(
at java.lang.String.(
at java.lang.StringBuilder.toString(
at megan.daa.DAAReferencesAnnotator.apply(
at megan.daa.Meganize.apply(

How much memory did you specify when installing MEGAN?
Edit the file MEGAN.vmoptions to make sure MEGAN gets enough memory; 8 GB would be good, 16 GB would be perfect.
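For example, the memory line in MEGAN.vmoptions might look like this (the exact file location depends on your installation; -Xmx is the standard JVM option that sets the maximum Java heap size):

```
-Xmx16G
```

After editing the file, restart MEGAN (or re-run the command-line tool) so the new setting takes effect.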

I set 16 GB of memory in the MEGAN.vmoptions file, but it still did not work. So I used -k 5 in DIAMOND to get a smaller output file, 2.6 GB (versus 8.6 GB using the default value in DIAMOND), and I am now meganizing it on the cluster, where everything is working fine. Please do let me know if you think I am doing something silly with the DIAMOND parameters.
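For reference, a DIAMOND run along those lines might look like the following sketch (the database and query file names here are placeholders, not from the original post; -k / --max-target-seqs limits the number of target sequences reported per query, and -f 100 selects DAA output):

```
diamond blastx -d nr -q s1merged.fastq -o s1merged_matches.daa -f 100 -k 5
```

Lowering -k from its default (25) mainly drops lower-scoring hits per read, which shrinks the DAA file, but it can shift MEGAN's LCA assignments if the dropped hits span different taxa, so it is worth spot-checking a few assignments against a run with the default.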

You probably need to give MEGAN even more memory?