CharonY
Moderators · Posts: 13260 · Days Won: 149
Everything posted by CharonY
-
Wait, who is the negative control?
-
Also note that frying mostly disinfects the surface, but not necessarily the interior.
-
Edit: I was rambling a bit. To clarify: if you did not denature your protein during isolation, denaturing it in urea with subsequent renaturation does not make much sense. The question is whether the proteins really are denatured. A native gel gives only limited hints, depending on the protocols used. The question would be how much of a difference you would expect between the native protein and an SDS-denatured one. Conformational states are harder to estimate, but you could check the expected charge under the given conditions (unless your native protocol introduces a charge externally, of course). The lack of activity is, for me, a more compelling argument. However, I assume the protocol used is known to work and not to affect the activity of the protein? Also, if an MS is available, you may want to verify that your protein is really the correct one.

If it really is denatured, you should try to trace back, find out what caused it, and remove that step; denaturing it with urea and then removing the urea again does not sound too promising. There are several ways to proceed; the easiest is simple removal of the denaturing agent and rebuffering. One thing is to determine the right conditions in which it should refold (often it is a Tris buffer with various oxidizing or reducing additives). Depending on the properties of the denaturant, you could simply use spin columns to remove the original buffer and replace it with renaturation buffer. If the whole isolation process somehow caused the denaturation non-chemically, it is going to be very tricky, as this may have changed the properties of the protein irreversibly.
-
Very nice post DH. This null clearly reflects the way prosecution (which in turn does the categorizing) works. It is important and interesting to point out that the hypothesis is in fact the reverse of the way it should work (innocent until proven guilty). iNow, unfortunately it is not as easy as that. The closer we get to zero for type I, the bigger the impact is likely to be on the type II error (think of intersecting normal distributions). In many analytical settings one would instead increase the sample size, so that the distributions (in this case of innocent and guilty) overlap less, but in the justice system this is not likely to be easily feasible.
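The tradeoff above can be sketched with a toy model (all numbers here are invented for illustration, not real data): treat the "evidence score" of innocent and guilty defendants as two overlapping normal distributions, convict above some threshold, and see what raising the threshold or averaging more independent observations does to the two error rates.

```python
from math import erf, sqrt

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF via the error function (standard library only)."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# Hypothetical evidence-score distributions for the two populations:
mu_innocent, mu_guilty, sigma = 0.0, 2.0, 1.0

def error_rates(threshold, n=1):
    """Type I / type II rates when averaging n independent observations;
    the sampling distribution of the mean has spread sigma/sqrt(n)."""
    s = sigma / sqrt(n)
    type_i = 1.0 - norm_cdf(threshold, mu_innocent, s)  # innocent convicted
    type_ii = norm_cdf(threshold, mu_guilty, s)         # guilty acquitted
    return type_i, type_ii

# Raising the conviction threshold lowers type I but raises type II:
a1, b1 = error_rates(1.0)
a2, b2 = error_rates(1.8)
assert a2 < a1 and b2 > b1

# More evidence per case (larger n) shrinks BOTH errors at a fixed threshold,
# because the two sampling distributions overlap less:
a3, b3 = error_rates(1.0, n=4)
assert a3 < a1 and b3 < b1
```

Nothing here depends on the particular numbers; any two overlapping normals show the same qualitative behavior.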
-
Once denatured, it is often not easy, or even possible, to have them refold properly again. To be honest, the best bet would be to try to re-isolate it under non-denaturing conditions.
-
This is a typical problem of type I and II errors. Essentially there are four different outcomes: innocent and convicted (false positive), innocent and not convicted (true negative), guilty and convicted (true positive), and guilty and not convicted (false negative). The question is what the relation between these factors is, and what impact tweaking the system, e.g. towards fewer false positives (i.e. lowering the type I error rate), will have on the false negative rate. However, estimating the respective rates is tricky, as only an unknown fraction of false positives will actually be determined to be so (by default they are assumed to be true positives, as they have been convicted). Only those who unequivocally prove their innocence will be counted, and fewer than 100% of those will be able to do so in the absence of evidence. The same problem applies to the false negatives. And finally, the form of correlation between those two errors is not really clear; it will likely vary a lot depending on the type of crime, too. Also, certain elements within the legal system will surely have stronger effects on the errors than others. Based on the statistics of exonerated convicts, being black carries a higher chance of being falsely convicted.
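The four outcomes above are just a confusion matrix. As a toy illustration (the case list is entirely hypothetical), one can tally (truth, verdict) pairs and read off the observed error rates:

```python
# Hypothetical (truth, verdict) pairs; labels are illustrative, not real data.
cases = [
    ("innocent", "convicted"),   # false positive (type I error)
    ("innocent", "acquitted"),   # true negative
    ("guilty",   "convicted"),   # true positive
    ("guilty",   "acquitted"),   # false negative (type II error)
    ("guilty",   "convicted"),   # another true positive
]

counts = {"FP": 0, "TN": 0, "TP": 0, "FN": 0}
for truth, verdict in cases:
    if truth == "innocent":
        counts["FP" if verdict == "convicted" else "TN"] += 1
    else:
        counts["TP" if verdict == "convicted" else "FN"] += 1

# Observed rates: type I = FP / (FP + TN), type II = FN / (FN + TP)
type_i = counts["FP"] / (counts["FP"] + counts["TN"])
type_ii = counts["FN"] / (counts["FN"] + counts["TP"])
```

The catch discussed above is exactly that in the real system the `truth` column is unobservable, so these rates cannot actually be tallied like this.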
-
Furthermore, the defendant maliciously accuses an innocent bystander of fiddling with his links. Links that he now admits to be his. (Interesting talk, btw.)
-
Depends on the concentration, though most DNA extraction methods yield relatively pure DNA. On the other hand, PCR is relatively robust, and if the detection limit is not too much of an issue, even relatively dirty samples yield products. I assume you mean parasite eggs? If you need it pure, you can use cleanup methods after the flotation and subsequent disruption.
-
Yesterday a paper was published that describes the use of genotyping arrays to detect genetic determinants of autism. Here is the abstract: Pinto et al., Nature (2010) (online). One interesting finding is that no single locus was found to be responsible for the condition, but that many pathways are involved. However, the identification of these pathways may prove crucial for the understanding of the underlying mechanisms.
-
Uh, it is not a dating method per se (i.e. you do not use it to date something); rather, the inverse is true. You use (calibrated) mutation rates to estimate, e.g., when two sequences diverged. This is also referred to as a molecular clock. The critical point is generally the means of calibration.
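As a back-of-the-envelope sketch of how a calibrated rate is used (the rate and divergence below are invented for illustration; real analyses are far more involved):

```python
# Assumed calibration: substitutions per site per year for this lineage.
rate = 1.0e-8

# Observed divergence: fraction of sites differing between the two sequences.
divergence = 0.02

# Both lineages accumulate substitutions independently since the split,
# so the total divergence reflects 2 * rate * time; solve for time:
t_years = divergence / (2 * rate)   # roughly one million years
```

The hard part, as noted above, is not this arithmetic but justifying the calibrated `rate` in the first place (e.g. from fossil-dated splits).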
-
A point about money: you get money for research, not for restating opinions. A climate scientist would submit a grant e.g. proposing to apply a certain model to explain certain climate patterns; the results of that approach remain to be seen. Also, you generally do not get funded if your results are expected to be no different from what is already known. Either the approach has to be novel or the outcome has to have some impact. In fact, research that goes against common knowledge has a chance of getting funded IF there is very strong preliminary data pointing in that direction. It would be new and cool and, if presented convincingly, it would open up new directions (one of my funded grants was a bit like that).
-
Are you sure that they are tenured? Though community colleges may be quite a different breed.
-
Interesting that everyone is speculating without actually knowing what the OP meant. I will continue to assume that it refers to gender differences in the science area. I think a couple of studies exist showing that in males the differences, e.g. in IQ or similar measures, are more spread out than in females (i.e. the variance is larger), while the averages are pretty close. While for the individual this is not that interesting, it can have an impact on the gender composition of highly competitive areas (e.g. science careers), even if one excludes social biases.
-
I did not have sound, but based on the video I do not think that this will work. First, hay absorbs water as readily as oil, so only a fraction of the overall capacity of the hay will actually absorb oil. I would need to crunch some numbers, but I am pretty sure that the amount of hay needed would be enormous.
-
Info of Culture (Bacteria) manufacturers needed
CharonY replied to fthnm2005's topic in Biochemistry and Molecular Biology
Try looking up the American Type Culture Collection.
-
Well, most patents on genes (that I am aware of) cover a bit more than the mere discovery (otherwise HUGO would have patented everything by now). They require proof that the gene has some kind of relevance (e.g. being a suitable marker for a disease). Then, there has to be some kind of application. For instance, one could patent a disease marker together with the associated detection methods (e.g. simply primers specific for the detection of the particular allele) and demonstrate that the whole package has an application, e.g. as a diagnostic method. It is, to my knowledge, not sufficient to simply clone out a gene and patent it. However, in the end I do not really know what the minimum requirement for a gene patent is, as I have only limited personal experience with patents in general.
-
Well, the method is to determine different (known) variants of a given gene, so in fact there is no prediction involved.
-
I assume that the test is a simple PCR (the whole story is a tad off-topic, though).
-
Well, technically it should be feasible. Of course it is going to be trickier with longer and low-abundance RNA, but there is nothing inherently impossible about it (it depends a little bit on the protocols used, though). For immunoprecipitation it is important that the binding is highly specific, of course, as otherwise more abundant targets may dilute out your low-abundance targets.
-
Maybe you could also check craigslist and ebay.
-
The mutation estimation is fairly easy. Assuming no preferred site, the chance of a mutation hitting any given locus is 1/(number of loci). To get the time you just have to adjust for the mutation rate (i.e. the number of mutations in a given time frame). The tricky (and more interesting) bit is actually estimating which mutations will persist.
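A minimal sketch of that arithmetic with toy numbers (both rates below are invented for illustration, not measured values):

```python
# Assuming no preferred site, a mutation is equally likely to hit any locus:
n_loci = 1000
p_locus = 1 / n_loci                # chance a given mutation hits this locus

# Assumed genome-wide mutation rate (toy value):
mutations_per_generation = 0.5
hits_per_generation = mutations_per_generation * p_locus

# Expected waiting time (in generations) until the locus of interest
# is hit once, i.e. the reciprocal of the per-generation hit rate:
expected_generations = 1 / hits_per_generation   # about 2000 generations
```

As the post says, this only gives the expected time to a mutation appearing; whether it persists in the population is a separate (and harder) question.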
-
That I do not know. I have not met a single lecturer or similar without a PhD, so it would surprise me a bit. I think I heard that at community colleges there were a few, but all of them were something like adjuncts (and I do not know whether they still exist).
-
Hmm, for starters, one of those takes great pains to appear less hairy (excluding, for some reason, the head). They also get paid less (on average) in positions with negotiable wages.
-
All known biochemical processes relevant to organisms require an aqueous environment. As such, yes, biological activities require water. However, certain organisms can survive in a desiccated state. They do need water to resume activity, though.
-
First of all, in order not to be a hand-waving alarmist, it is necessary to point out elements that could pertain to a specific phenomenon (or not), instead of claiming to be the source of total truth. The point of science is throwing out new ideas in journals and allowing for discussion.

Second, while I am not knowledgeable enough to discuss it in more detail, it appears to me that you are arguing a different point than the authors (the journal is Nature Geoscience, btw). According to the paper, they talk about a timeline of around 1,000 years for the methane, and less than that for the extinction prior to it. Also, afaik it has not been established that the methane drop was the sole cause of the ice age, and the authors did not make a strong claim in that direction. Of course the paper is not perfect (none are), and one could argue about the numbers provided. But it appears to me that what you propose (a 70-year drop) is in sharp contrast to what they did (around 1,000 years based on ice drillings). Normally one would try to figure out where that discrepancy came from rather than dismissing it.

Though overall, I do not really understand what the point of the whole discussion is. I do not see any revolutionary claim in that paper (which is, interestingly, one of the criticisms you provide).