This was not an issue when we were re-processing all data de novo, but now that we store sequences and metadata in a local database, there is a risk that a sample originally labelled lineage X is later re-labelled lineage Y. The pipeline would then treat the re-labelled sample as a new record and attempt to insert it, but since the accession number is the unique key, the insert would collide with the existing record under the same key and crash the pipeline. This should be easy to fix by modifying the database insert statement, e.g. by replacing the plain insert with an upsert, as sketched below.
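A minimal sketch of that fix, assuming a SQLite-backed store (the table name, column names, and accession format here are hypothetical, not taken from the actual schema). The `ON CONFLICT ... DO UPDATE` clause makes the insert idempotent: a re-labelled sample updates the existing row instead of colliding with it.

```python
import sqlite3

conn = sqlite3.connect("samples.db")  # hypothetical local database file
conn.execute("""
    CREATE TABLE IF NOT EXISTS samples (
        accession TEXT PRIMARY KEY,  -- unique key that currently causes the collision
        lineage   TEXT NOT NULL,
        sequence  TEXT NOT NULL
    )
""")

# Upsert: insert a new record, or update lineage/sequence if a record
# with this accession already exists (e.g. a sample re-labelled X -> Y).
conn.execute(
    """
    INSERT INTO samples (accession, lineage, sequence)
    VALUES (?, ?, ?)
    ON CONFLICT(accession) DO UPDATE SET
        lineage  = excluded.lineage,
        sequence = excluded.sequence
    """,
    ("EPI_ISL_0000001", "Y", "ACGT..."),  # placeholder values
)
conn.commit()
```

Whether to overwrite or to keep both labels (e.g. with a lineage-history table) is a separate design decision; the upsert above simply prevents the crash.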