COVID-19 EQA Satisfaction Survey

In February 2021, CMPT launched a new program to provide proficiency testing (EQA) for units providing rapid testing for COVID-19.  This was novel for CMPT and for others on many counts.

From CMPT's perspective, this was the first time we had been involved in EQA for a viral product.  Around the year 2000 we experimented with a program for Herpes simplex virus, but that program was never fully developed. This time we had access to a brilliant product produced by Dr. Martin Petric at the BC Centre for Disease Control.  Much of the research and development was completed before we received the material.  For our purposes we had to ensure that the product could be used appropriately for EQA: that it was stable and homogeneous and could simulate human nose swabs.

From the clients' perspective, most of the testers were not laboratory technologists, and most had never heard of proficiency testing, much less External Quality Assessment.

From a societal perspective, this was an evaluation of new tests for a new virus in a new pandemic.  Information was being gathered and decisions were being made based on test results obtained through hands-on performance that had not been thoroughly examined.

With all this newness, what could possibly go wrong!

With a little trial and error on all sides, the program got started, but it was important for us to know whether it was working out for the testing clients.  A satisfaction survey was created and sent to participants, and the response rate was about 30 percent.

When asked whether they had ever participated in quality assessment, about two thirds of respondents said they had.  The other third said that while this was a new experience for them, the concept of Quality Assessment made sense to them.

When CMPT sent out the samples, they were accompanied by an instruction sheet that explained (a) how to prepare and test the samples and (b) how to report the results. Of the people who reported back, one (12.5%) said they did not find the instruction sheet very helpful, while six (75%) said they did.  One reported that they were not the person who actually did the test, and so could not comment on the value of the instruction sheet.

With respect to the samples, it is important that EQA samples look and act like typical samples; otherwise they are not a real test of performance.  When asked about the CMPT material and swabs and their typical appearance, six (86%) of participants indicated that the CMPT samples were very typical, with only subtle variation, while one (14%) did not think the samples were very realistic at all.

That participant's concern was that while CMPT was testing the testing process, we were not testing the collection of the nose swab.  That was true, but outside the scope of what we could achieve.

That being said, all the testers said that once they had a swab in their hands, whether from a nose or from CMPT, they processed it exactly the same way.

Finally, CMPT had set up a special mechanism so that as soon as testers entered their results into the computer, they received an immediate report telling them whether they had achieved the correct answer.  Seventy-five percent found this very helpful, while 25 percent did not.  Part of the explanation was that the person who did the testing was commonly not the same person, or even in the same place, as the person who entered the results into the computer, so for them the immediate response was of little value.

In summary, the survey showed that for a new program there were a lot of plusses.  While the survey results were not perfect, there was a lot for us to feel good about and plenty to think about and learn from.

That is an OK place to start.

Michael Noble

 
