Meganizing a DAA file

Dear all, I am trying to meganize a DAA file (9.2 GB) created using DIAMOND blastx. However, the meganizer stops at a certain point, as shown below. Any comments?

$MEGAN/tools/daa-meganizer -i s1merged_matches.daa -a2t prot_acc2tax-Jun2018X1.abin

Version MEGAN Community Edition (version 6.10.6, built 20 Dec 2017)
Copyright © 2017 Daniel H. Huson. This program comes with ABSOLUTELY NO WARRANTY.
Java version: 1.8.0_144
Loading ncbi.map: 1,601,128
Loading ncbi.tre: 1,601,131
Opening file: prot_acc2tax-Jun2018X1.abin
Meganizing: s1merged_matches.daa
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOfRange(Arrays.java:3664)
at java.lang.String.<init>(String.java:207)
at java.lang.StringBuilder.toString(StringBuilder.java:407)
at megan.daa.io.InputReaderLittleEndian.readNullTerminatedBytes(InputReaderLittleEndian.java:178)
at megan.daa.io.DAAHeader.loadReferences(DAAHeader.java:206)
at megan.daa.DAAReferencesAnnotator.apply(DAAReferencesAnnotator.java:52)
at megan.daa.Meganize.apply(Meganize.java:66)
at megan.tools.DAAMeganizer.run(DAAMeganizer.java:227)
at megan.tools.DAAMeganizer.main(DAAMeganizer.java:56)

How much memory did you specify when installing MEGAN?
Edit the file MEGAN.vmoptions to make sure MEGAN gets enough memory; 8 GB would be good, 16 GB would be perfect.
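
For reference, a minimal sketch of the relevant line in MEGAN.vmoptions (the file takes one JVM option per line; -Xmx is the standard Java flag for the maximum heap size, and 16G is simply the value suggested above, adjust to your machine):

-Xmx16G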

I set 16 GB of memory in the MEGAN.vmoptions file, but it still did not work. So I used -k 5 in DIAMOND to get a smaller output file, 2.6 GB (versus 8.6 GB with DIAMOND's default value), and I am now meganizing it on the cluster, where everything is running fine. Please let me know if you think I am doing something silly with the DIAMOND parameters.
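
In case it helps others, here is a sketch of the kind of DIAMOND call described above, with hypothetical database and query file names (-f 100 writes the DAA format that MEGAN reads, and -k / --max-target-seqs caps the number of reported hits per query, which is what shrank the output file):

diamond blastx -d nr.dmnd -q s1merged.fasta -o s1merged_matches.daa -f 100 -k 5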

You probably need to give MEGAN even more memory?