Everything posted by CharonY

  1. Not necessarily. Thyroid cancer is one of those types where mortality is fairly low, even without treatment. The recommendation is to monitor the tumors and treat only when necessary. In particular, tiny nodules are recommended not to be biopsied unless other factors put the patient into a high-risk category. Diagnosing more typically does not lead to a better prognosis in these cases. I have to say that I am not familiar with the data on young adults and how much that may affect overall outcome, but even in childhood cases of thyroid cancer, extensive monitoring is typically recommended first, afaik.
  2. The likelihood of detection over a time series depends a lot on the properties of the group (even without external input). I would need precise data on that, but assuming that the same people are screened over time, the numbers are expected to increase, as the likelihood of an incident is cumulative for each person. If you want to assume an equilibrium due to the age bracket, you would have to know the age distribution in each case. In either case, the samples are clearly not independent (as the same individuals are likely being retested a year later). I am not sure about the Corsica data, but to my knowledge the Chernobyl-related events were only reliably detected in a longer time series. While the effect started after around 4 years, the change was only reliably detectable when embedded in a longer time series.
Note that the two numbers are not a study but just data collection. A study would have to explain the contribution of the population change between the two time points (did people move out, were the same children, now older, included, what was the gender distribution, how many were newborn, etc.). One issue is of course the sampling, as in Fukushima many more tests are being conducted. Ideally, the same frequency data would have existed before the incident. Failing that, a crude way of normalization would be to perform similar screens in another area that has the same characteristics (minus the meltdown, obviously) and run the same analysis. The important bit is to monitor the incidence growth over time (and again, you cannot properly estimate the distribution from two points) and then look at whether there are significant differences (while controlling for potential differences in population structure).
It is always a dangerous thing to try to extract conclusions from a very limited data set. As they are continuously collecting, one could start doing a proper time-series analysis. The major limiting factor is the lack of an appropriate reference set, though.
Aside from that, thyroid specialists are a bit wary of the screening efforts, as they fear overdiagnosis and overtreatment. This is mostly because in many cases thyroid cancer does not progress, and many people die with it (often undiagnosed) rather than from it.
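To make the point about repeated screening of the same cohort concrete, here is a toy simulation (all numbers are hypothetical assumptions, not real Fukushima data). It shows that cumulative detections grow over time even when the underlying rate never changes, so rising counts across two time points are not by themselves evidence of an effect:

```python
import random

random.seed(42)

# Hypothetical parameters, not real Fukushima data.
N = 100_000    # children screened in each region
P_YEAR = 3e-5  # assumed constant annual probability of a detectable nodule
YEARS = 5

def cumulative_detections(n, p, years):
    """Screen the same cohort every year; detected cases stay in the
    cumulative count, so the numbers can only grow over time even
    though the underlying rate never changes."""
    detected, healthy, counts = 0, n, []
    for _ in range(years):
        new_cases = sum(1 for _ in range(healthy) if random.random() < p)
        detected += new_cases
        healthy -= new_cases
        counts.append(detected)
    return counts

exposed = cumulative_detections(N, P_YEAR, YEARS)
reference = cumulative_detections(N, P_YEAR, YEARS)
print("region A (cumulative):", exposed)
print("region B (cumulative):", reference)
```

A real analysis would additionally have to model migration, aging, and screening intensity, as noted above; this sketch only isolates the cumulative-count artifact.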
  3. Changes over short time frames are not usually very informative, especially for diseases that take time to establish. One would need a much longer time frame to establish any connection. I believe screens were offered starting around 2011, and one would have to model the values until now to see any aberrations from the norm. Two-point comparisons easily lead to false positives (to put it carefully). Another aspect is that it does not seem clear whether the same sample population was used (i.e. whether the studies were independent). If there were follow-up analyses, it is obvious that the number will increase over time.
A final issue is that there is no good reference value to look at, as the high-resolution screening was only done after the incident and only in that prefecture. Obviously the detection rate would be higher than average population estimates (especially as thyroid cancer can be quite asymptomatic). Ideally, a larger data set obtained from a larger population would be needed to establish whether the values in Fukushima really deviate from the norm. I.e., you would need to establish confidence intervals for the norm to identify outlying trends. As it stands, one would need more data to actually figure out what is going on (or conversely, the data presented in the OP do not allow any conclusions).
The only study I actually found is a 30-month evaluation, but I would need a bit more time to read the methodology in detail. The approach is different, though: instead of concentrating on the rare event (i.e. cancer), they looked at the correlation between physiological factors such as thyroid volumes and related blood values and iodine-131 deposition at the subjects' residence (doi: 10.1371/journal.pone.0113804). Looking for small effects in a population is actually extremely tricky, as there are always a lot of confounding factors.
And unfortunately the honest answer is that we need more data (and possibly a longer time frame) and a decent study design to find answers.
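As a rough illustration of what testing against "confidence intervals for the norm" could look like, here is a sketch using a simple Poisson model (the background rate, person-years, and observed count are made up for the example, not actual figures):

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam), via the complement of the CDF."""
    cdf = sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k))
    return 1.0 - cdf

# Made-up numbers for illustration only.
reference_rate = 3e-6        # assumed background cases per person-year
person_years = 300_000 * 3   # screened population times years of follow-up
expected = reference_rate * person_years   # expected background cases

observed = 8                 # hypothetical observed count
p_value = poisson_sf(observed, expected)
print(f"expected {expected:.1f} cases, observed {observed}, "
      f"P(X >= {observed}) = {p_value:.4f}")
```

Even a tiny tail probability from such a test would be meaningless if the reference rate stems from symptom-based diagnosis while the observed count comes from intensive ultrasound screening; that is exactly the overdiagnosis caveat discussed above.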
  4. What is silly is that it is a report on a report on a scientific article. And now we are discussing this, instead of the original research.
  5. Minus typos and liberal use of grammar, I hope.
  6. The part that it was not a new flagellum was basically what the lead author described (and what I mentioned) and which got blown out of proportion by the media (as usual). The rest is his usual drivel and mix of bad analogies and agenda. Basically he is attacking a straw man that was partially assisted by media exaggeration (which is why I dislike certain types of simplification; it gives people with an agenda the material to build bogus arguments). What is relevant, and what Behe conveniently forgets (which to some point may also apply to posters here), is that the paper describes the dynamics of regulation and the fact that small changes in regulatory mechanisms can have large physiological impact. Many laymen assume that you need genes for everything and are confused why there is so little variation between a bacterium and an elephant. Truth is that the coding genes are only part of the story; for the actual activity, their expression and the coordination of expression reign supreme (well, and quite a few other interactions, but at some point the catchy phrase turns into an essay).
  7. That is a good point. However, especially in the US I found that degrees are often seen as an investment of sorts. While I do not necessarily agree, it is good to have an initial focus, even if it may shift over time.
  8. Virologist is a job description (or field), but not a job per se. You should look around and see what kind of positions there are. Unfortunately, academic and research institute positions are highly competitive and one should have a plan B in place.
  9. Some clarification: postdoctoral research comes after having a PhD (i.e. it is not a program, more like a waiting list). Whether you get admitted to a PhD program with a bachelor's depends a lot on the school (and country). Some allow it, with an honours degree, high GPA, and letters of recommendation, for example. The requirements will vary, so you should look them up for each program. Often, additional credits have to be taken during the program. In countries or schools where it is rare, faculty may be more reluctant to take you on, though (mostly true for training-intensive labs). You should also consider what kind of jobs you are looking at after the PhD, as that may also have an impact on your choice.
  10. While we are at it, what is your favorite type of cake and do you have a recipe?
  11. That pretty much seems to be the case. It is an inherently cultural/societal phenomenon that changes with societal changes (food availability, travel/leisure as indicators of wealth, etc.) and has little to do with biology. Except, of course, that these cultural traits can result in mate selection preferences. Though these are likely to shift once the cultural preferences change. One especially striking example is probably the preference for heavier women in past societies, or in societies with little to no incidence of obesity, as there it is associated with health. Skinniness, as Phi mentioned, is a rather modern indicator of wealth, for example (same as the association of a tan with vacation rather than labor).
  12. Theoretically yes, and I believe for certain enzymes mutations in the active centers have resulted in loss of specificity in vitro. In addition, some transferases have been found to be less specific with respect to non-naturally occurring stereoisomers of certain nucleotides. If the goal is to e.g. create a protein using the stereoisomers, it would be very difficult, as it is a process requiring several proteins which are stereospecific. Chemically it should be possible, but it would be very, very difficult to realize.
  13. What about it? Why do you think it has anything to do with chirality? (Note that some people like to use the term because it sounds novel, which it isn't, but functionally it is "just" another layer of regulation.) As for the rest, it is not about tRNA per se, but rather all the mechanisms involved in polymerization. All of them recognize just one stereoisomer, and that is where the specificity comes from. However, during regular metabolism, especially (but not exclusively) in prokaryotes, both isomers are routinely produced. Further pathways allow their conversion, but disruption of these pathways can result in a change of the ratio of the isomers. As in the example in the link above (and there are some more), this can be used by the cell in one way or another to sense certain issues.
  14. First, please do not copy-paste articles in their entirety (copyright infringement and all that). Second, the article is a bit misleading, as organisms do produce amino acids and sugars of both chiralities. They just use one. Enzymes such as racemases and epimerases change the molecules to the right configuration. Specific to the question, enzymes involved in building polymers, such as aminoacyl-tRNA synthetases (which load the tRNA with the cognate amino acids), only work with L amino acids, thus ensuring that only one type of peptide/protein can ever be formed. If mutated, the wrong isomer could in theory be transported to the ribosome, at which point elongation of the peptide would be stalled by its presence.
  15. Oh, sorry, my bad. I overlooked the post in between (and frankly, am sleep deprived). Carry on.
  16. And do not forget the subsequent taste analysis!
  17. I am not going to excuse his behavior; however, nowadays we have a huge workforce of scientists who only have temporary jobs that are dependent on external funding. Also, if he had connections he would not be in a dead-end job to begin with. Assuming that this was his sole source of funding, and further assuming that he did not hire anyone else, the reported number would not be much higher than a high school teacher's salary (a bit higher than the median). If you are an established researcher, you have a much better standing when negotiating the terms of these contracts. When I get corporate contracts for a study, it is par for the course to put something in that allows me to freely disseminate the data (they may get a first look, but will have no veto power). Even if they cut off funding after that, I do not care, as my salary is independent of that money. There is the side issue that universities have cut down on faculty positions and instead employ adjuncts for teaching and unpaid soft-money researchers to bring external funding in. Considering that their salaries are comparatively low, they are easy targets for groups with deep pockets.
  18. Good idea, or different yeast varieties with various optimum temps.
  19. What position would that be? One with no power and the constant threat of unemployment (you did read the article, didn't you)? Note that his actions are deplorable, but equating the situation with being surrounded by cookie jars is about as far from reality as it gets (unless you are saying that cookies are his only food...). I should note that it is not 100% certain whether he is getting money from the Smithsonian, as the article only states that he is a part-timer. But typically these positions tend to only provide affiliation, and you have to get your whole salary from third parties.
  20. How about getting a home brewing kit?
  21. I am familiar with the concepts, and I do agree (and tried to express) that the wave speed is not an issue. However, I still think there is a bit of a misunderstanding about the properties of the ear (specifically, sampling frequency is more limiting than transmission speed). How about you explain how the signal transduction works and at which point transmission will be limiting, and we can try to discuss that. I think the key may be the fact that one has to apply multiplexed sampling (and curiously, timing information is coded by a slightly different mechanism: phase-locked spikes).
  22. That is not correct, as the speed of sound propagation is a parameter (largely) independent of frequency. In order to hear a high-frequency sound you do not need faster sampling; you need sensory neurons that start firing once they sense sound of that given frequency. This is in large part modulated by the stiffness of the basilar membrane within the cochlea as well as the organization of the sensory neurons (essentially as indicated by String and Acme). So no, if your friend's example is based on a wave with the given frequency, the explanation is wrong (or you misunderstood him). As I tried to explain earlier, it is wrong to think of our sensory organs as the equivalent of a simple ADC or amplifier or anything of that sort, really. The transmission speed mostly only determines how fast we notice something, but we have means to increase the resolution of detection beyond that. One of the key points is that we parallelize the detection. If you must use an analogy, think of several mics that work in parallel. However, all of them are tuned to a different band and have different filters. These are all processed in parallel, and only at the end does a computer create a sound out of it. Moreover, the mics sample at staggered times, so that while each may only sample, say, every 5 ms, at almost any given time there are some that are done transmitting and are able to send data. The computer then has a much higher time resolution than the sample rate.
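The staggered-mic analogy can be sketched numerically (a toy model, not actual auditory physiology; the channel counts and the 5 ms period are just illustrative):

```python
# Toy model of the staggered-sampling idea: each "channel" samples
# only once every `period_ms`, but the channels' sample times are
# offset against each other, so the pooled system pins down a
# stimulus onset far more precisely than any single channel.

def first_detection(channels, period_ms, stimulus_ms):
    """Channel i samples at offset_i, offset_i + period, ...;
    return the earliest sample time at or after stimulus onset."""
    offsets = [i * period_ms / channels for i in range(channels)]
    hits = []
    for off in offsets:
        t = off
        while t < stimulus_ms:
            t += period_ms
        hits.append(t)
    return min(hits)

period = 5.0   # ms between samples of a single channel
onset = 12.3   # ms, true stimulus onset

for n in (1, 5, 25):
    t = first_detection(n, period, onset)
    print(f"{n:2d} channel(s): detected at {t:.2f} ms (lag {t - onset:.2f} ms)")
```

With one channel the detection lag can be nearly a full 5 ms period; with 25 staggered channels it shrinks to a fraction of a millisecond, even though no single channel got any faster.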
  23. If I understand you correctly, you are talking about sampling rate (or the equivalent)? Now this is a hugely interesting aspect, but my knowledge is very limited. You may think that the transmission speed is the limiting factor in distinguishing time-dependent differences, yet it is not so. To simplify a lot (and keep in mind that this is outside my expertise, so it may be incorrect), the brain is able to integrate information with a higher resolution than the transmission speed would suggest. The trick is that, unlike an amplifier, it does not do so in real time. Instead, the brain integrates information and then uses cues in the signal to backdate the stimulus to when it happened. Of course you would not be able to react that fast, but you would be able to distinguish two stimuli that were given shortly after each other. Data suggest that we are able to distinguish visual stimuli 5 ms apart, which is close to the theoretical maximum firing speed of neurons (I believe). However, this is only possible if sufficient cells are stimulated, indicating that the response of the cell population has to be taken into account and not just the activation time. Hence, for complex stimuli the required time can be longer (something like 20-30 ms for faces, I believe). However, the latency of the afferent pathways (i.e. from eyes to the visual cortex) seems to be around 30-100 ms, IIRC. In other words, the discriminatory power is higher than the speed of transfer (and much faster than the ability to respond with an action). This is some of the stuff that I have referred to as trickery, because biology does not like to do things the easy way (bloody nature).
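The population point can be illustrated with a toy calculation: if each neuron reports the onset with a few milliseconds of jitter, pooling many such reports recovers the timing far more precisely than any single neuron (all numbers invented for illustration, not measured physiology):

```python
import random
import statistics

random.seed(1)

# Toy model of population-level timing: each neuron reports the
# stimulus onset with a few milliseconds of independent jitter.
# Averaging across the population recovers the onset much more
# precisely than any single neuron could.

def estimate_onset(n_neurons, true_onset_ms, jitter_ms=2.5):
    latencies = [true_onset_ms + random.gauss(0, jitter_ms)
                 for _ in range(n_neurons)]
    return statistics.mean(latencies)

TRUE_ONSET = 100.0  # ms
for n in (1, 100, 10_000):
    est = estimate_onset(n, TRUE_ONSET)
    print(f"{n:6d} neuron(s): estimated onset {est:.2f} ms")
```

The error of the pooled estimate shrinks roughly with the square root of the population size, which is one way a system built from sluggish, noisy units can still resolve millisecond differences.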
  24. While this is true, it offers no explanation of the properties of the system, merely that it is fast enough to deal with that particular example. I would agree if the question were whether the transmission speed is too slow to deal with external stimuli. However, the question stated in the OP is actually complicated, as it conflates several properties of the stimulus, including information depth, which technically is not related to signal speed at all (except maybe for localization). The evolutionary comment was not directed specifically at your comment, though; on this board many explanations of physiology are waved away with "we evolved that way" or "there was selective pressure X that made things that way," which still does not explain how it works. Related to the question by the OP, the relevant mechanism is the fact that our sensory neurons are tuned to specific properties of the stimulus in question (wavelength, amplitude, frequency, etc.), and the combined action of these, as well as processing on the way to the brain and in the brain itself, provides the dynamic range that we are able to perceive. Granted, this is also a very simple explanation, but the discussion of specifics would easily fill a lecture. My main point is thus: read up on physiology. It is an absolutely fascinating topic.
  25. In the oldest organisms, signaling occurs chemically, where diffusion is the time-limiting step. Once distances became longer, different solutions were needed. Neurons arose rather early in the evolutionary tree and have remained mostly unchanged. (I am not sure why you think transmission speed would be significant, unless you are misunderstanding my point about signal transmission pre-neurons.) My objection to trying to explain mechanisms based on evolutionary usefulness is that it creates an interesting narrative, which usually is not testable and typically is more of a distraction. Ultimately such narratives are also not terribly useful, as the mechanism itself and its properties tell us much more.