Update link to protocols.io

Hello folks,

We just ran the ARTIC protocol on some samples, but noticed there is a V2 version of the protocol that isn’t easy to find from the website. The V1 version yielded lots of double-barcode events and an unambiguous (non-N) consensus sequence in our non-template control, both of which I suspect would be resolved by the V2 modifications.

Would it be possible to update the link to point to the most recent version?

Cheers,

Martin

We should be making the V2 protocol the recommended one fairly soon.

Could you clarify a few things in your note?

What do you mean by “double barcode events”? We expect double barcoding (i.e. multiplex adaptors at both ends) with both the V1 and V2 protocols; the V2 protocol should increase the rate of double barcoding.

What do you mean by “a non-ambiguous consensus sequence in our non-template control”? Do you mean you get reads in your negative control? This should not happen with V1 or V2, assuming you follow the SOP and have no contamination.

Double-barcode events = two different barcodes (e.g. BC12 and BC07) on reads that passed filtering. Specifically, BC12 was our non-template control, and quite a few reads demuxed as BC12 were evidently amplicons from other samples that, in addition to the proper barcodes, had additional BC12 sequences ligated on both ends. Some stats below.

These reads passed all the bioinformatics filtering steps and were sufficiently abundant to introduce non-N calls in the NTC consensus FASTA.
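A quick way to count those calls (my addition; the filename is illustrative):

grep -v '^>' ntc.consensus.fasta | tr -d 'Nn\n' | wc -c

i.e. strip the header lines, delete Ns and newlines, and count what remains.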

I suspect these demultiplexing artifacts would be drowned out when assigned to a ‘positive’ sample, but it does suggest the need for more stringent barcode filtering (at least with the V1 lab protocol).

For further clarification, the rate of double barcoding was assessed with the following sophisticated bioinformatics pipeline:

grep barcode_A_sequence demuxed.fastq | grep barcode_B_sequence | wc -l

so the true rate is in all likelihood much higher, given the nanopore error rate.
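A slightly more careful count would restrict matching to sequence lines and check both orientations (a sketch; BC_A and BC_B are placeholders for the actual barcode sequences from the kit):

BC_A="BARCODE_A_SEQUENCE"
BC_B="BARCODE_B_SEQUENCE"
BC_A_RC=$(echo "$BC_A" | rev | tr ACGT TGCA)   # reverse complement
BC_B_RC=$(echo "$BC_B" | rev | tr ACGT TGCA)
awk 'NR % 4 == 2' demuxed.fastq | grep -E "$BC_A|$BC_A_RC" | grep -Ec "$BC_B|$BC_B_RC"

Taking only every fourth line (NR % 4 == 2) avoids spurious matches in read headers and quality strings.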

Ah! That explains the problem.

It is vital that during demultiplexing you ensure that reads have barcodes at both ends, and that they are the same.

This is achieved either with artic demultiplex (V1.0.0 SOP), which calls Porechop with the --require_two_barcodes option, or, in the latest V1.1.0 SOP, with guppy_barcoder and the --require_barcodes_both_ends option (https://artic.network/ncov-2019/ncov2019-bioinformatics-sop.html).
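For the V1.0.0 route, a minimal Porechop sketch (the paths are illustrative, not from the SOP):

porechop --require_two_barcodes -i run.fastq -b ./demuxed/

where -b bins reads into one file per barcode and --require_two_barcodes discards reads without a strong barcode match at each end.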

With these options enabled you should not see mapping reads in your negative control barcode.
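A quick sanity check (my suggestion; ntc.sorted.bam stands for whatever alignment your pipeline produced for the NTC barcode):

samtools view -c -F 4 ntc.sorted.bam   # count mapped reads; expect ~0 in the NTC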

Should not see them, but still do. The SOP was followed to the letter, including the both_ends flag during demux. Specifically:

/home/apps/ont-guppy/guppy-3.5.2/bin/guppy_barcoder \
-x auto \
--require_barcodes_both_ends \
-i ./fastq/guppy_hac-3.5.2/ \
-s ./fastq/demuxed/guppy_hac-3.5.2/ \
--arrangements_files "barcode_arrs_nb12.cfg barcode_arrs_nb24.cfg"

More stringent filtering required? Or perhaps the GPU implementation doesn’t apply the both_ends flag (the -x auto parameter is the only difference from the SOP). I’ll have a look using CPU only ASAP.
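That is, rerunning the same command without -x auto (assuming guppy_barcoder falls back to CPU when no device is specified; the output path is renamed here only to keep the runs separate):

/home/apps/ont-guppy/guppy-3.5.2/bin/guppy_barcoder \
--require_barcodes_both_ends \
-i ./fastq/guppy_hac-3.5.2/ \
-s ./fastq/demuxed-cpu/guppy_hac-3.5.2/ \
--arrangements_files "barcode_arrs_nb12.cfg barcode_arrs_nb24.cfg"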

Very strange. And you filtered out reads < 400 and > 700 with artic guppyplex too?
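That is, the size-filtering step, something like (a sketch; the directory and output names are illustrative):

artic guppyplex --min-length 400 --max-length 700 \
--directory ./fastq/demuxed/guppy_hac-3.5.2/barcode12 \
--output barcode12.fastq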

Our local experience is that anywhere from zero to a small handful of mapped reads come through our negative controls with this protocol.

Yes, all recommended steps were followed. Using V2 of the laboratory protocol generated no false positives, though. Hence I’d like to reiterate that the most recent protocol should be easier to find for new users.
