Vahid Posted November 29, 2023
Hi everyone, mycoplasma detection has been a major challenge for me lately. I do cell culture and use qPCR to detect mycoplasma. In addition to the positive and negative controls and the NTC, I use three different mycoplasma strains as standards (M. orale, M. pneumoniae and M. fermentans). For a few weeks now, some of the standard Cts have been extremely high (sometimes 45!), even though I count the samples with a flow cytometer before DNA extraction and at least 13×10^6 cells are counted. The samples and wells in which the Ct comes out higher than expected (I consider ≤35 an acceptable Ct, and that is what I used to get) appear to be entirely random. I have tried different setups, and the results are uninterpretable. We have two PCR instruments, and the problem is the same on both. All samples, controls and standards are pipetted into two wells (duplicate determination). Does anyone have any ideas?
CharonY Posted November 29, 2023
The first thing to do is to talk to your supervisor and check what kind of quality control you are using and what the expected results are. From your description it is not clear, for example, whether your standards are extracted DNA of known quantity. A Ct of 45 is pretty much unspecific, and 35 is close to what is generally the detection limit (roughly <10 genomic copies). It is advisable to start with pure DNA standards and establish a calibration curve, or at least a detection limit and PCR efficiency, so that you have an idea what to expect. Then work your way backward to the isolation steps.
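To illustrate the calibration-curve suggestion above, here is a minimal Python sketch of how PCR efficiency and an approximate detection limit can be derived from a dilution series. The copy numbers and Ct values below are made-up placeholders, not data from this thread.

```python
import numpy as np

# Hypothetical 10-fold dilution series of a synthetic standard (copies/reaction)
# and the Ct measured for each dilution (illustrative numbers only).
copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
ct     = np.array([21.1, 24.5, 27.9, 31.4, 34.9])

# Standard curve: Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), ct, 1)

# Amplification efficiency from the slope; a slope of -3.32 corresponds to 100 %.
efficiency = 10 ** (-1.0 / slope) - 1.0

print(f"slope: {slope:.2f}, intercept: {intercept:.1f}")
print(f"PCR efficiency: {efficiency * 100:.0f} %")

# Rough detection limit: where the curve crosses Ct 35
# (with these example numbers, around 10 copies per reaction).
lod_copies = 10 ** ((35.0 - intercept) / slope)
print(f"approx. copies at Ct 35: {lod_copies:.0f}")
```

Running such a series with each new batch of master mix or probe would show whether efficiency drifts between batches, which matters most when working this close to the detection limit.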
Vahid Posted December 1, 2023 (Author)
Thanks CharonY. We work under GMP protocols. In this case, we buy cell lines infected with mycoplasma and count them. If there are at least 13 million cells, we extract DNA. After DNA extraction we run the PCR, and according to the approved matrix the standards should show a Ct below 35 (per the approved protocol). If we get an approved Ct (≤35), the standards are released for use. What is interesting is that a Ct below 30 is rarely obtained at all. But the main problem now: why do we get Cts above 35, and why do they appear at random in almost every run (about 90% of the time)?
CharonY Posted December 1, 2023
OK, you are doing purity checks, so high cutoffs make sense here. I assumed you had the issue even when using pure standards at higher concentration. Ct values beyond 40 are generally unspecific signals, i.e. non-targets could be amplified, probes break down, etc. The first thing is to check the amplification curves for shape: do the thresholds make sense? If running SYBR, check the melt curves. One can also run a gel to see what has been amplified, or send it for sequencing.
Vahid Posted December 6, 2023 (Author)
What should I do to find out whether the thresholds make sense or not? I use the Sartorius kit (FAM & ROX detection). The strange thing is that this kit used to work well with the approved method, and then at some point the number of unexplained errors became very high. I am looking for an idea of why this happens. (Working under GMP principles, it is not easy to change anything, and a reason for any change must be provided.)
CharonY Posted December 7, 2023
Normally some standard DNA/control is in place for QA/QC to ensure that the qPCR works as expected. However, rather than just looking at pos/neg, it is often worthwhile to have a more concentrated standard (e.g. synthetic target DNA) and run a dilution series to establish PCR efficiency. This is more important for quantitative approaches, but variation between batches of master mixes or probes is not that rare. So if you have minor changes in PCR efficiency, and you are operating near the detection limit anyway, it might not be that unusual to dip in and out of the detection range (and anything >40 is almost certainly a false positive). In that context it is important to consider that we are not looking at a normal distribution but rather a Poisson distribution, which is limited by sample volume (roughly speaking, you might have somewhere between 1-20 copies in your reaction). Since you use FAM you won't have melt curves, but you could plot the fluorescence data to see whether you have real amplification or perhaps issues with noise that might justify a shift in thresholding, i.e. check whether you can see an exponential signal and where it sits relative to the noise. Considering that you are using a fixed amount of starting material and have established protocols, one would expect fairly consistent results. But if the Ct of your actual samples is also around 30-35, it suggests that out of 13 million host cells you get somewhere around 1-100 mycoplasmas, if I understand you correctly. Is that expected?
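To make the Poisson point concrete: if the aliquot pipetted into a well carries only a handful of target copies on average, a noticeable fraction of replicate wells will by chance receive zero (or too few) copies and drop out of the detection range, even with perfect pipetting. A quick sketch, assuming purely Poisson sampling and an illustrative detection limit of about 3 copies per well:

```python
from math import exp, factorial

def poisson_pmf(k: int, mean: float) -> float:
    """Probability of drawing exactly k copies when the average is `mean`."""
    return mean ** k * exp(-mean) / factorial(k)

def p_below_detection(mean_copies: float, min_detectable: int = 3) -> float:
    """Chance that a well receives fewer copies than the assumed detection limit."""
    return sum(poisson_pmf(k, mean_copies) for k in range(min_detectable))

# If a well receives on average 2, 5 or 10 copies of template:
for mean in (2, 5, 10):
    print(f"mean {mean:>2} copies/well -> "
          f"P(no template at all) = {poisson_pmf(0, mean):.1%}, "
          f"P(< 3 copies) = {p_below_detection(mean):.1%}")
```

At an average of 2 copies per well, roughly two thirds of wells fall below a 3-copy limit purely by chance; at 10 copies per well the dropout probability becomes negligible. This is one way seemingly random replicate failures can arise near the detection limit.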
Alysdexic Posted December 8, 2023 (edited)
Check DNA extraction efficiency and test for PCR inhibitors. Consider primer/probe degradation and pipetting errors; use fresh batches of primers/probes and regularly calibrate pipettes. Verify thermal cycler calibration and reagent consistency. Assess the quality of the standards. If problems persist, consider alternative mycoplasma detection methods (e.g. ELISA). Rebuild the standards, test the primer/probe stocks for degradation, and make fresh batches. Quantify the DNA extracts, check template quality, and run them on a gel to confirm integrity. Include appropriate controls and mycoplasma-rich samples as extraction controls. Validate thermal cycler and pipette calibration, ensure reagent integrity, and prepare fresh reagents. Optimize conditions, broaden the use of reference strains, and strengthen the positive controls. Carry out a melt curve analysis for specificity. Again, if the trouble persists, think about other mycoplasma detection methods, such as ELISA, to back up the qPCR results.
Edited December 8, 2023 by Alysdexic
Vahid Posted December 8, 2023 (Author)
Maybe explaining the steps of the procedure will make the issue clearer:
- We buy cells infected with the desired mycoplasma strains.
- Thawing
- Counting (≥13×10^6 cells)
- DNA extraction
- Ct measurement of the extracted standards (usually at this stage the Cts are between 28 and 31) → released for use
- Approved for use as standards (in the measurement of mycoplasma contamination of cell cultures)
- Storage in the freezer at -20 °C

My pipetting layout is also fixed (two wells each):
A1, A2 = PK (positive control)
B1, B2 = Standard 1 (M. orale)
C1, C2 = Standard 2 (M. pneumoniae)
D1, D2 = Standard 3 (M. fermentans)
E1, E2 = NK (negative control)
F1, F2 = NTC

Correct results are obtained from wells A1, A2, E1, E2, F1, F2. Both wells of a standard are pipetted from the same microtube (for example: microtube A containing 25 µl of extracted M. orale DNA, of which 10 µl goes into B1 and 10 µl into B2). The two other standards are handled the same way. The strange thing is: when the two wells contain the same sample, how can they show two Cts with a big difference (sometimes one below 35 and the other above 35, even up to 45)? It is even stranger that I cannot find any pattern in the results (the results are not interpretable). For example: one of the standards is outside the acceptable range, or one of the wells, or one of the rows... It seems that every time at least one of the six standard wells is out of range and invalidates the run, so that a run is only valid by chance (about 10% of the time). I have two instruments and I see this happening on both. Thermo Fisher recently recalibrated our PCR instruments. Note: we work under GMP principles, and all equipment and products are constantly calibrated and checked.
CharonY Posted December 8, 2023
6 hours ago, Vahid said: "The strange thing is: when the two wells contain the same sample, how can they show two Cts with a big difference (sometimes one below 35 and the other above 35, even up to 45)?"
Just to clarify: what you list as standards are the extracts from the counted cells? And the difference you see is e.g. between B1 and B2, which contain template from the same sample? If the values are all >35, you are likely looking at noise (again, check the curve to verify). However, if they are e.g. 30 and 35, and nothing is suspicious with your MM, then the most likely candidate is a pipetting error of the template. Is there a trend (i.e. is the first well typically higher/lower than the other) or is it random? As part of your SOP, are you using low-binding filter tips?
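One simple way to check for the trend asked about above is to collect the paired Ct values of past runs and look at the signed differences between the first and second replicate well. A minimal sketch; the well pairs and Ct values below are placeholders to be replaced with exported results:

```python
# Paired Ct values from past runs: (first well, second well) for the same standard.
# The numbers below are placeholders, not data from this thread.
pairs = [(31.2, 34.8), (30.5, 30.9), (36.1, 30.8), (32.0, 38.5), (29.9, 30.2)]

diffs = [second - first for first, second in pairs]
n_second_higher = sum(d > 0 for d in diffs)

print(f"second well higher in {n_second_higher}/{len(pairs)} pairs")
print(f"mean difference (well 2 - well 1): {sum(diffs) / len(diffs):+.2f} cycles")

# If the second well is consistently higher (or lower), that hints at a systematic
# pipetting or evaporation effect; if the sign flips randomly, it points more
# toward stochastic template sampling or well-position/sealing issues.
```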
Vahid Posted December 11, 2023 (Author)
Here are the answers:
"What you indicate as standard in your list are the extracts from counted cells?" Exactly.
"The difference you see is e.g. between B1 and B2, which contain template from the same sample?" Yes.
"Is there a trend or is it random?" Random.
"As part of your SOP, are you using low-binding filter tips?" I uploaded a picture of them. I use Microsart AMP Mycoplasma (Sartorius) to detect the mycoplasma (FAM/ROX kit).
What do you mean by MM?
Attached are pictures of one of the invalid tests.
CharonY Posted December 11, 2023
I noticed that your wells say 10 CFU MO. Is your template diluted to about 10 CFU? If so, the Ct range makes much more sense to me. But as you can see in your multicomponent plot, you have technical issues. See the ROX signal? In some wells the signal increases, which could be an evaporation issue, and in two it actually drops into the negative (i.e. below the initial baseline). There are really only two samples that show the expected flat line. Considering the very low amount of template, these issues can compound the result a fair bit. Also, to check the amplification curve shape, it often helps to plot the raw signal over the cycle number rather than the delta Rn. But even there you can see that it barely passes the threshold, aside from the positive controls. First, though, I would check the plates/tubes: are they properly sealed, for instance, are there droplets/bubbles, etc.? Also, is the heated lid (cover heating) activated?
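For the curve-shape check suggested above, one option is to export the raw/multicomponent data from the instrument software to CSV and plot the FAM and ROX traces per well over the cycle number. A minimal sketch with pandas and matplotlib; the file name and column names (`Well`, `Cycle`, `FAM`, `ROX`) are assumptions here and depend on how your software exports the data:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Export of the multicomponent data; file and column names depend on your
# instrument software and are only assumptions here.
df = pd.read_csv("multicomponent_export.csv")  # assumed columns: Well, Cycle, FAM, ROX

fig, (ax_fam, ax_rox) = plt.subplots(2, 1, sharex=True, figsize=(8, 6))
for well, grp in df.groupby("Well"):
    ax_fam.plot(grp["Cycle"], grp["FAM"], label=well)
    ax_rox.plot(grp["Cycle"], grp["ROX"], label=well)

ax_fam.set_ylabel("FAM (raw)")   # positives should show a clear exponential rise
ax_rox.set_ylabel("ROX (raw)")   # reference dye: should stay roughly flat in every well
ax_rox.set_xlabel("Cycle")
ax_fam.legend(fontsize="small", ncol=3)
plt.tight_layout()
plt.show()
```

Wells where the ROX trace climbs, sags below baseline, or sits near zero stand out immediately in such a plot, and the raw FAM panel makes it easier to judge whether an apparent Ct comes from a real exponential signal or from noise near the threshold.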
Vahid Posted December 12, 2023 (Author)
We buy lyophilized 10 CFU standards; they are certified by the manufacturer. How should I check whether the heated lid is activated? In addition, I seal the plate carefully and press down the edges. We always reverse-pipette (which means no bubbles).
CharonY Posted December 12, 2023
OK, this result is really weird, as it might suggest that ROX depends on your sample rather than on the MM. Not sure if it is related, but I do wonder whether the position plays a role: your Positivkontrolle is at the top and your NTC at the bottom of the plate (and the increase affects all samples in the middle). Perhaps a silly question, but in your method, is ROX selected as the reference dye rather than as a target? Generally speaking, an increase in the reference tends to indicate evaporation and concentration of the MM. Quenching could affect ROX too, in theory, but in your case I fail to see how that would work. I am especially unsure why you see all three effects in the same run in a seemingly well- or template-dependent way. One thing to check is whether the ROX traces are similar in all your runs. Also, ROX seems a bit on the low end (close to 0), though I don't know whether your software normalizes the values somewhat. At this point I would check your reference runs/SOP to see what the expected raw fluorescence levels are and/or discuss the issue with the manufacturer (after checking that the software settings are correct, of course).