MEGAN toolkit crashes with “Segmentation fault (core dumped)” on large datasets

Hey Guys… :wave:

I am using MEGAN (v1.11.0) to process metagenomic data, and whenever I load a dataset larger than ~20 GB, the tool crashes with:

megan: Segmentation fault (core dumped)

What I’ve tried:

  • Running on a machine with 64 GB RAM and 16 cores
  • Increasing the Java heap size via -Xmx to 48 GB (see the snippet after this list)
  • Using both GUI and command-line modes
  • Testing a smaller dataset (~10 GB) which works fine
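
For reference, this is roughly how I set the heap. I am assuming the standard install layout here; the exact file name and path may differ depending on how MEGAN was installed. I edited the MEGAN.vmoptions file next to the MEGAN launcher, which lists one JVM option per line, and changed only the maximum-heap entry:

    # MEGAN.vmoptions (location may vary by installation)
    # only the maximum-heap line was changed
    -Xmx48G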

The crash always occurs shortly after the initial data indexing step. No crash log or Java error trace appears—just the segmentation fault.

I also checked this thread: How many diamond target hits are needed for MEGAN? Has anyone encountered this crash? Any ideas on memory limits, configuration tweaks, or known bugs that could trigger it?

Thanks in advance!

Hi @joved81744,

You might simply need more memory for this step. Could you try increasing the Java heap (-Xmx) to 60 GB?
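
If you are setting the heap through the MEGAN.vmoptions file (assuming the usual install layout; adjust accordingly if you launch MEGAN differently), that would mean changing the -Xmx line to something like:

    # maximum Java heap for MEGAN
    -Xmx60G

Make sure only one -Xmx entry is present, and restart MEGAN afterwards so the new setting takes effect.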

Best regards,
Anupam