StringJunky Posted June 29, 2015

First, as StringJunky indicated, ABs do not create resistance, they select for it. Resistances emerge through mechanisms such as mutation and horizontal gene transfer, i.e. resistance has to be present in the population to begin with. Without ABs, however, these genes confer no fitness advantage: they stay at low levels (which reduces the risk of multiple resistances, the real issue here) and do not spread further, or may vanish again over time.

At lethal doses the selection is strong, killing off all sensitive strains while only the resistant survive (there are exceptions, such as persisters and biofilms, which I will ignore for now). However, at sub-lethal doses (what that level is may differ from strain to strain), the antibiotic still creates a selective advantage for the already resistant, but also gives the sensitive strains a chance to gain resistance (e.g. by horizontal gene transfer) or to accumulate enough mutations to become resistant spontaneously. So while the selection is weaker, it still exists. And considering that the pool is much larger, since such concentrations are found in wastewater, aquifers and soil, whereas lethal doses occur mostly at the point of application, the overall impact is likely to be higher (as newer research starts to suggest). For example, the amount of resistance genes in soil increases significantly after applying manure. It has also been known for years that wastewater sludge is a massive pool of resistance plasmids that are able to spread resistance rapidly. These plasmids have been found to confer resistance to a wide range of ABs at the same time, which, again, is the real issue here.

The growth improvement in livestock seems to be due to the action of ABs on the gut biota, so as long as they affect bacteria, they will affect resistance. The only way to stop it is to use something else (such as hormones, which is also not ideal).
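The weak-but-persistent selection at sub-lethal doses can be put in rough numbers. Here is a toy Python sketch (my own illustration, not from any study; the selection coefficient `s` and the starting fraction are made-up values) of how even a tiny per-generation advantage eventually takes over a population:

```python
# Toy model: two strains growing exponentially, where a sub-lethal
# antibiotic dose gives the resistant strain a small per-generation
# fitness edge s. Even a weak advantage shifts the population toward
# resistance, given enough generations.

def resistant_fraction(generations, s=0.02, start=1e-6):
    """Fraction of resistant cells after n generations.

    start: initial resistant fraction (rare, as in an unexposed population)
    s:     per-generation selective advantage under sub-lethal exposure
    """
    sensitive = 1.0 - start
    resistant = start
    for _ in range(generations):
        sensitive *= 1.0          # baseline growth (normalised to 1)
        resistant *= 1.0 + s      # slight edge under weak selection
        total = sensitive + resistant
        sensitive /= total        # renormalise to fractions
        resistant /= total
    return resistant

for n in (0, 250, 500, 1000):
    print(n, resistant_fraction(n))
```

The point is that the strength of selection mostly sets the timescale, not the outcome: starting from one resistant cell in a million, a 2% edge takes the resistant strain from negligible to dominant within on the order of a thousand generations, and any positive edge wins eventually.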
Also, there are no good regulations in place that I know of. In Germany, for example, the use of ABs for livestock growth promotion was banned, yet actual use has not decreased one bit: farmers simply declare it as therapeutic use and continue to use it to fatten up livestock. The amount of ABs in effluents has increased in most countries rather than decreased.

And yes, it has been used for years. That is one of the reasons why we suddenly find multi-resistant strains popping up in many places while we slowly and steadily run out of antibiotics to use. There are some last-line antibiotics that are only used if the patient does not respond to any other treatment (such as vancomycin), but recently strains with resistance against those have been found, too. From a disease standpoint, this is nothing short of a disaster.

About 10-20 years ago we were in good spirits and talked about the arms race between chemists and bacteria, developing ever-new ABs to combat resistances. The arrival of multi-resistant strains, coupled with high genetic mobility, has shown over the last few years that we are losing. Badly. In the 90s about 10% of infections were resistant to treatment; today we are easily above 50%, and last-line antibiotics have seen a rise in use (which means they are not last-line anymore). At the same time, the development and production of new antibiotics have been declining, for many reasons. Although crucial, it has become harder and harder to develop antibiotics to which bacteria are sensitive but which do not harm the patient too much. The Obama administration has initiated funding for the development of new drugs (mostly claimed by small companies), but most people working in this field know it is just a band-aid. If we continue to do what we are doing, we are promoting resistances at a vastly higher rate than we can hope to cope with.
Alternatives to ABs are being developed, but nothing is a killer treatment yet (and even then we do not know how resistances may evolve and how we may screw it up again). We had the means to overcome bacterial diseases in our hands, and we managed to turn it into shit (almost literally). If that sounds like gloom and doom to you, it is because I have been following the literature for years. In contrast to things like Ebola scares, the numbers, unfortunately, add up. Almost all areas show increasing resistance, and we see little incentive to restrict use to crucial (i.e. therapeutic) applications.

It is a sad fact that people will most likely only start thinking about it once we actually monitor resistance-related deaths in more detail; current clinical practice does not always collect this data. But estimates in the US are on the order of 20k deaths (2013) associated with resistant strains, of which over half involve Clostridium difficile. What is worse, in non-fatal diseases the number of reported multiple resistances is also increasing, not only locally but globally (the WHO issued a report last year, I think). Unfortunately, I think the time to react was about 15 years ago.

Very well put, and it puts into focus what I've gradually come to realise as I read more. Perhaps the future lies somewhere in understanding the optimal microbiota hierarchies for us and our livestock and repairing them back to the optimal model.
jajrussel Posted June 30, 2015 Author (edited)

Very well put, and it puts into focus what I've gradually come to realise as I read more. Perhaps the future lies somewhere in understanding the optimal microbiota hierarchies for us and our livestock and repairing them back to the optimal model.

Your idea of an optimal model and the grower/consumer's optimal model might not be the same, mainly for economic reasons. How would you achieve an optimal model while giving them, at minimum, what they have now?

To CharonY - you said that we had the means to overcome bacterial diseases in our hands. What means was that? Also, I have noticed that something as simple as a label requirement has encouraged the practice of not using antibiotics, but to what degree I do not know. It may still be allowed that ABs be used under certain conditions, which could in truth be abused.

Edited June 30, 2015 by jajrussel
CharonY Posted June 30, 2015 (edited)

To CharonY - you said that we had the means to overcome bacterial diseases in our hands - What means was that?

Antibiotics. Bacterial diseases were a major killer; even small wounds could be fatal or lead to loss of limbs. The use of antibiotics made us mostly safe from bacteria (at least compared to what used to be the case). Now we have managed to throw them around so much that they may very likely stop working in the relatively near future. Most of us have had to use antibiotics at some point in our lives, and even those who did not benefited from the fact that the people around them were not chronically sick (a bit like the herd effect for immunizations).

Labeling seems to have no effect whatsoever, considering that use has increased over the years. I do think it is likely that more regulations (and actual enforcement) will come. But I am pretty sure that until public opinion changes (i.e. consumers vote with their wallets), the food industry will fight tooth and nail. I am not even blaming the farmers, as on their end Ag is a cut-throat business; not using any advantage they can get could sink them.

Edited June 30, 2015 by CharonY
Delta1212 Posted July 3, 2015

There is one thing that remains unanswered in this thread... It is not obvious to me how low concentrations of antibiotics can create bugs resistant to high concentrations of antibiotics. There are hints in this thread that this is an obvious consequence, but I don't see it clearly. I know very little about antibiotics, so I use the following analogy: if some bacteria do very well at 25 degrees Celsius, but I increase the temperature to 40 degrees Celsius, forcing them to develop coping mechanisms and change themselves - how can this create a super bug that can live at autoclave temperatures? My understanding is that organisms change only as much as needed, not more. To be clear, I do expect that higher-than-natural concentrations of antibiotics will somewhat increase the probability of super-bug creation. How much this probability is increased is an important factor in the problem discussed in this thread. If added antibiotics increase the likelihood of super-bug creation only marginally (in comparison to the natural likelihood), then maybe it is worth it.

Another thing comes to mind: if, as I understood this thread, farmers give antibiotics to animals not for therapeutic effects but for side effects, wouldn't it be worth investigating this issue a bit further? Can we create a poor antibiotic with nice side effects for farming use (because 'gain weight faster while eating less' sounds like useful magic)? And how come those farmers give the whole spectrum of various antibiotics to their animals - why don't they just settle on the cheapest antibiotic type so that they don't rubbish all of them?

Temperature isn't a great analogy, but let's see if I can run with it anyway. First, understand that we're never talking about "autoclave temperature" strength antibiotics. Antibiotics are essentially poisons that kill off bacteria without doing too much damage to the person taking them.
If they are too "effective", you're going to wind up causing serious problems for the patient. It's more like finding that bacteria mostly die off at 45 C while people do relatively ok at that temperature. So the bacteria have an optimal temperature of 25 C with, let's say, 10% doing ok at 40 C and 1% doing ok at 45 C. If you crank the temperature up to 45 C, pretty much everything dies off and, especially in a patient, whatever is left is at low enough levels for the immune system to deal with. If you crank it up to 40 C, a lot more are left, and if that batch regrows into a full colony, its temperature resistance is going to be derived from that initial batch that all did ok at 40 C. From that start, and just from natural mutations that are likely to produce both higher and lower temperature tolerances in the descendants of that colony, you get a range spreading out from 40 C upper tolerances as the norm rather than the upper 10%. So 45 C tolerance maybe goes from 1% of the population to 25% or 50% of the population, which is a real problem when you try to use that temperature to cure somebody. And you can't just keep cranking the temperature higher, because eventually you'll be cooking your patient along with the infection.

Our antibiotic treatments have been exploiting the fact that bacteria have had no need to resist them, not the fact that they are inherently deadly. We have swords that bacteria haven't bothered developing armor against. We don't have nukes. If we always killed off every bacterium every time we came into conflict with them, the effect would be the same as having nukes. But by running programs that are effectively equivalent to cutting bacteria and then letting them escape, over and over, we give them enough time and experience to figure out how to defend against said swords, and we wind up facing armored veterans rather than always facing a fresh batch that has never heard of swords before.
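That treat-and-regrow ratchet can be sketched as a little simulation. This is my own toy model, not anything from the thread (the numbers just mirror the analogy above): each cell carries a heritable "tolerance" value, a sub-lethal treatment culls everything below the dose, and the survivors repopulate with small mutational noise.

```python
# Toy sketch of the "40 C bottleneck": a population of cells, each
# with a heritable tolerance level. A sub-lethal treatment culls
# everything below the dose; survivors regrow with small mutational
# noise. Repeated cycles ratchet the mean tolerance upward.
import random

random.seed(1)  # fixed seed so the run is reproducible

def cycle(population, dose, size=1000, noise=0.5):
    """One treat-and-regrow cycle."""
    survivors = [t for t in population if t >= dose]   # selection bottleneck
    if not survivors:
        return []                                      # lethal dose: wiped out
    # Regrow to full size; offspring inherit a survivor's tolerance
    # plus a little mutational noise in either direction.
    return [random.choice(survivors) + random.gauss(0, noise)
            for _ in range(size)]

pop = [random.gauss(25, 5) for _ in range(1000)]       # happiest around "25"
for _ in range(10):
    pop = cycle(pop, dose=30)                          # sub-lethal: ~top 16% survive round 1
print(round(sum(pop) / len(pop), 1))                   # mean tolerance, now above the dose
```

After a handful of cycles the mean tolerance sits above the dose, i.e. what used to be the upper 10% has become the norm, which is the shift the analogy describes. Raising the dose to a level no cell survives (the "nuke") ends the ratchet, but only if nothing escapes.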