unite-train

A pipeline to build Qiime2 taxonomy classifiers for the UNITE database.

Download a pre-trained classifier here! 🎁


What is this?

If you are interested in Fungi 🍄🍄‍🟫, you can use their genomic fingerprints to identify them. Affordable PCR amplification and sequencing of the ITS region gives you these nucleic acid fingerprints, and the UNITE team provides a database to give these sequences a name.

We can predict the taxonomy of our fungal fingerprints using an old-school machine learning method: a supervised k-mer based naive Bayes classifier. But first, we need to prepare our database in a process called ‘training.’
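
For context, that training step boils down to a QIIME 2 command along these lines (file names are placeholders for illustration; the pipeline manages the real inputs and outputs):

```bash
# Fit a naive Bayes classifier on UNITE reference sequences and taxonomy.
# File names below are placeholders, not the pipeline's actual artifacts.
qiime feature-classifier fit-classifier-naive-bayes \
  --i-reference-reads unite-ref-seqs.qza \
  --i-reference-taxonomy unite-ref-taxonomy.qza \
  --o-classifier unite-classifier.qza
```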

This pipeline trains classifiers on the UNITE ITS taxonomy database for use with Qiime2. You can run this pipeline yourself, but you don’t have to! I’ve provided ready-to-use pre-trained classifiers so you can simply run `qiime feature-classifier classify-sklearn`.
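
For example, classifying your ITS reads with a downloaded classifier looks something like this (artifact names are examples only):

```bash
# Assign taxonomy to your representative sequences with the pre-trained classifier.
# Artifact names are examples; substitute your own files.
qiime feature-classifier classify-sklearn \
  --i-classifier unite-classifier.qza \
  --i-reads rep-seqs.qza \
  --o-classification taxonomy.qza
```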

If you have questions about using Qiime2, ask on the Qiime2 forums.

If you have questions about the UNITE ITS database, contact the UNITE team.

If you have questions about this pipeline, please open a new issue!


Running the Snakemake workflow

Set up:
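
A minimal setup, assuming you clone this repository and install Snakemake into a fresh conda environment (the repository URL, environment name, and channels below are illustrative, not prescribed by this README):

```bash
# Clone the workflow and install Snakemake via conda (illustrative only).
git clone https://github.com/<owner>/unite-train.git
cd unite-train
conda create -n snakemake -c conda-forge -c bioconda snakemake
conda activate snakemake
```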

Configure:
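
Configuring means reviewing the workflow’s config file before launching a run; the exact file name and keys are defined in this repository, so treat this as a sketch only:

```bash
# Open the Snakemake configuration for editing (path follows the standard
# Snakemake layout; check the repository for the actual file and keys).
$EDITOR config/config.yaml
```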

Run:

```bash
snakemake --cores 8 --use-conda --resources mem_mb=10000
```

Training one classifier takes 1-9 hours on an AMD EPYC 75F3 Milan, depending on the size and complexity of the data.

Run on a Slurm cluster: More specifically, the University of Florida HiPerGator supercomputer, with access generously provided by the [Kawahara Lab](https://www.floridamuseum.ufl.edu/kawahara-lab/)!

```bash
screen     # We connect to a random login node, so we may not be able...
screen -r  # to reconnect with this later on.
snakemake --jobs 24 --slurm \
  --rerun-incomplete --retries 3 \
  --use-envmodules --latency-wait 10 \
  --default-resources slurm_account=kawahara-b slurm_partition=hpg-milan
```
Run with Docker: Say, in 'the cloud' using [FlowDeploy](https://flowdeploy.com/).

```bash
snakemake --jobs 12 \
  --rerun-incomplete --retries 3 \
  --use-singularity \
  --default-resources
```

Reports:

```bash
snakemake --report results/report.html
snakemake --forceall --dag --dryrun | dot -Tpdf > results/dag.pdf
```