This looks like a memory issue. What is the size of the DAA file? How much memory did you assign to MEGAN? How much physical memory does the machine have? If you have a very large DAA file, say 50G, then you might need to give 32G to MEGAN, say, but additional memory must also be available for parts of the program that run "outside" of that assigned amount, at least another 32G, and more is better.
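For reference, here is a minimal sketch of how the assigned memory is usually raised, assuming a standard MEGAN install where the Java heap is set via the `-Xmx` line in the `MEGAN.vmoptions` file (the path and the exact values below are placeholders, adjust for your setup):

```shell
# The .vmoptions file in the MEGAN install directory holds the heap setting
# that MEGAN (and the bundled command-line tools) start with:
cat megan/MEGAN.vmoptions

# Raise the heap, e.g. from the current value to 64G
# (keep well below physical RAM so the "outside" memory is still available):
sed -i 's/^-Xmx.*/-Xmx64G/' megan/MEGAN.vmoptions
```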
Thanks for your reply. The size of my DAA file is 27GB, and I assigned 500GB to MEGAN, well within my physical memory. The strange thing is that even DAA files as large as 94GB could be successfully meganized on my workstation, yet this 27GB DAA file cannot; it is always killed during the writing phase.
which I was never able to solve (even when testing with 1 TB RAM). My idea was to use DIAMOND+MEGAN to classify contigs from a large metagenome assembly, which would really help here since other annotation tools are very slow. However, in the end this was not possible, because daa-meganizer always gets stuck or fails at the writing step. So I had to go back to tools like PROKKA for metagenome assembly annotation.
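For anyone hitting the same wall, this is the kind of two-step workflow being described, sketched with placeholder file names (the database paths and thread count are assumptions, not taken from the post above):

```shell
# Step 1: align contigs against a protein database with DIAMOND,
# writing the binary DAA format that MEGAN consumes:
diamond blastx --db nr.dmnd --query contigs.fasta --daa contigs.daa --threads 16

# Step 2: "meganize" the DAA file in place with the command-line tool;
# --mapDB points at a MEGAN mapping database (filename is a placeholder):
tools/daa-meganizer --in contigs.daa --mapDB megan-map.db
```

It is this second step, specifically its final writing phase, where the failures described in this thread occur.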