
Glider

Everything posted by Glider

  1. Nah, that's just a hobby.
  2. What? 'Emotional logic'? That's a bit of an oxymoron. What is a 'good emotional conclusion'? What are you talking about? You present your points as though they were long established and widely accepted. To me, it feels like you are just making stuff up. Can you support any of this?
  3. Not really. Do you have a point?
  4. As far as I know, there is no such test in Psychology. I can't even imagine what such a thing would test for.
  5. So, ignore the most obvious explanation and look for something more obscure. Why?
  6. Emotion plays a significant part in all our decisions (here I'm talking about the pre-attentive affective-motivational state, i.e. emotion before it becomes conscious). In many cases, what we call a 'decision' is more a rationalised explanation for something we have already done or a path we have already committed to. Mr Skeptic is quite right. Our emotional brains are constantly scanning our environments for emotionally valenced information. Our affective-motivational state (pre-conscious) is determined by the net valence of that information, and that state determines the broad direction of decisions we are likely to make. This automatic processing is the way our brains determine, on a basic level, the probability of harm or benefit in any given environment. When something bad happens, the media report it and we will be exposed to it in every newspaper, all TV channels, radio, internet etc. Even though it's a single event, we are exposed to it multiple times over a period of days or weeks. This is enough to tip the balance and our behavioural motivation will automatically compensate for the increased probability of harm signalled by the frequency of exposure to this (one) incident.
  7. That reminds me of something that happened last academic semester. A student complained about my 'going on' about alcohol in one of the seminars of the module 'Psychobiology and Clinical Neuroscience'. The seminar title was 'Mechanisms of Action of Psychoactive Drugs'. Given that alcohol is one of the most commonly abused psychoactive drugs, it's quite hard to avoid the topic. That student will be taking Health Psychology this coming semester. I fully expect another complaint as alcohol, addiction and other related topics are covered in much more depth in that module.
  8. There may well be a dedicated centre for 'moral behaviour', and the dorsolateral prefrontal cortex seems a reasonable candidate. My point about circuitry was simply that higher-order areas modulating behaviour or perception tend not to be discrete functional units as such, but rather areas in which information from a number of more basic, dedicated centres is integrated and modulated, a bit like a mixer board where information from more specific (dedicated) channels is mixed and modulated to provide a unified output that is different to the input from any particular channel. If damage occurs to any of the areas generating input (channels), the result would be a difference in output. Thus, the output is a function of all the areas involved rather than any one discrete area. In the case of the dorsolateral prefrontal cortex, there are feeds into it from a number of different areas: the thalamus, the basal ganglia, the hippocampus and the orbitofrontal cortex, as well as cortical association areas. In any case, dedicated centre or otherwise, the underlying processes driving the development of neurological structures associated with moral behaviour would be the same: natural variations that provided some advantage being selected for and so increasing in frequency within a population. You're welcome.
  9. If we assume a dedicated neurological 'centre for morality', then it is likely to have come about because it provided an advantage to the individuals that showed a propensity for 'moral' behaviour (in this case I mean behaviours supportive of, or at least not detrimental to, other members of an ingroup), and thus provided an advantage to the group as a whole. Basic motivated behaviours (basic survival and reproductive behaviours) are the function of the hypothalamus. Groups of animals of the same species driven only by these basic drives would not do well as groups, as every other member represents competition for resources (except when breeding). However, where these basic drives are modulated, significant advantages can be gained.
Take meerkats as an example. Every individual has the basic hypothalamic drive to feed, and the females also have a basic drive to protect their young. Some individuals forgo the former in favour of the latter and act as nurse to a 'creche' of young, including the young of other females. This provides an advantage as it allows the others to forage more effectively, unhindered by their own young, whilst the vulnerable young are still being protected. Each advantage, however small, increases the chances of successful reproduction. The more cooperative groups get to reproduce more, and so the genetic propensity for these behaviours becomes reinforced with each generation.
These are evolutionary processes, and as such, one wouldn't expect to find 'morality' suddenly extant in all its glory. It would have come about incrementally, over time, with each advantageous behaviour being passed on and built upon by successive generations. It is not unreasonable to suppose that the origin of the meerkat's 'creche' instinct was something as small as one individual whose natural propensity to protect her young, through natural variation, was not confined solely to her own offspring, but extended to the young of other individuals (i.e. she would not have rebuffed the approach of the young of other females as so many other animals do). As noted, this would have provided an instant advantage to the group, freeing up other individuals for hunting and foraging whilst still providing protection for the young. 'Advantage' translates as increased probability of reproduction and successful raising of young. Thus, this particular behavioural variation would have a higher probability of being passed on, and so the advantage to the group would also be passed down through generations.
Natural variations in behaviour can be adaptive (providing advantage), maladaptive (providing disadvantage) or neutral (making no real difference). Maladaptive behaviours would be deselected by virtue of their decreasing the probability of successful reproduction. Adaptive behaviours would be selected for (and so increase in frequency in the overall population) simply by virtue of their increasing the probability of successful reproduction. By selecting for these specific, advantageous variations in behaviour, the process is obviously selecting for the underlying neurological mechanisms that produce them. Thus, with each successful generation, the underlying neurological mechanisms that provide these advantages become reinforced within the population. Modern meerkats are renowned for their cooperative group behaviours, but these would have manifested incrementally over a very long time.
Each slight adaptive variation in behaviour (and the concomitant slight neurological changes) increased the group advantage and so was reinforced within the population, increasing in frequency and complexity down the generations.
  10. What led you to this conclusion? What you say is not obvious at all; in fact, I strongly disagree with it. There are different levels of morals: there are those that the species has evolved, which are pretty much universal and quite basic (these can be 'overwritten' by subsequent socially acquired ethics), and there are cultural morals, which are learned. There are three main approaches to ethics:
Relativism: maintains that what is right or wrong depends on the particular culture concerned. What is right in one society may be wrong in another, this view argues, and so no objective standards exist by which a culture may be judged right or wrong.
Objectivism: claims that there are objective standards of right and wrong which can be discovered and which apply to everyone equally.
Subjectivism: states that all moral standards are subjective matters of taste or opinion.
Foodchain expresses a relativistic approach to morals whilst you appear to take an objectivistic view. Real-world observation supports the relativistic view (in practice). Evidence for this is that morals both differ between cultures and change over time. A good example is the current social attitude towards paedophilia, compared to 400-500 years ago (in Britain) when girls of 13 were of marriageable age and it was perfectly acceptable for them to marry men in their 30s or 40s. In some cultures, that is still acceptable and practised.
I am familiar with the story of Phineas Gage. There is a lot of myth involved in that story as it is usually told, including the precise path of the tamping rod through the prefrontal areas. This was only ever estimated from the damage to the skull, but there was significant deformation of the skull which made it almost impossible to tell accurately (e.g. studies of the skull show that the entry hole is too narrow to accommodate the rod, and a vertical fracture shows there must have been a 'hingeing' movement of the skull, which makes a precise mapping of the path very difficult). This is discussed in some detail in the current issue of The Psychologist (the monthly journal of the British Psychological Society). In any event, it seems unlikely that such trauma and the subsequent decades of post-hoc storytelling could result in a precise suggestion of a 'moral centre' of the brain being taken out in Gage's case. However, there are more modern conditions that produce the behaviour changes commonly reported for Gage. Many stroke patients who suffer prefrontal lesions undergo changes in personality. They show impairment of social judgement and (acceptable) response selection. They can become aggressive and confrontational, but mainly they become impulsive and just 'do' what most others would only think, i.e. they act without the benefit of an internal 'social editor'. So, an attractive nurse is quite likely to find herself being physically molested by such patients, where other male patients might just imagine it. This condition is known by nurses as being 'frontal', as in "watch that guy, he's frontal!" It would seem that, as with most brain functions, socially acceptable behaviour is a function of circuitry rather than a dedicated centre. It would involve the hypothalamus (associated with motivated behaviours: sex, feeding etc.), the amygdala (automatic processing of the environment for emotionally valenced stimuli), the anterior cingulate gyrus (response selection), and the posterior cingulate and hippocampus (memory), and feeds from all these (and other) areas are brought together in large cortical association areas.
Damage to any one of these regions, or to the larger association areas, would result in altered social behaviour (assuming the damage was survivable). By now, I have completely forgotten where I was going with this, but I've done too much to delete it all. Suffice to say that (in order): humans do have instincts; much of human behaviour is instinctive; and morals (ethics), in their most basic form, are instinctive and evolved from behaviours (or behavioural drives) that were adaptive and provided an advantage to animals living in social groups. Modern societal morals overwrite these basic instinctive drives, but in most cases they do not extinguish them. Rather, modern societal morals are adaptations of our instinctive predispositions. This is why, although morals differ between cultures and change over time, certain basic, universal traits remain relatively stable: the abhorrence of killing family or 'in-group' members, for example. Of course, one can always find examples of behaviours that are contrary to our basic instinctive morality, but where these are carried out (in the absence of any psychopathology), they are usually accompanied by a great deal of cognitive gymnastics in order to justify them. For example, under Islamic law it is illegal to kill a virgin (whatever her [it usually tends to be a 'her'] crime). In order to carry out a death sentence, the woman must first be raped (usually by the prison guards), after which, no longer being pure, it's fine to kill her. In this exercise, the 'pure' woman is valued in Islam, so, almost by definition, a virgin is 'ingroup'. After she has been defiled, however, she is 'outgroup', and they don't count for much. That last statement applies universally, not just to Muslims. For example, the same principle applies to the extremist Christian pro-life movement. The sanctity of life they preach applies only to the ingroup. Doctors performing abortions are not ingroup, and so the sanctity that pro-lifers preach does not apply to them (which is how they justify to themselves killing these doctors, something that has to be at least as bad as aborting foetuses). I don't want to get into religious debate here; suffice it to say that in order to get people to behave in a way contrary to their instinctive morality, it takes the logic of something like religion to define and shift the lines of ingroup-outgroup identity in such a way as makes it possible. Whilst not alone in this, religion is expert in it and does have a lot to answer for. I strongly recommend the book 'Mistakes Were Made (But Not By Me)' (Tavris, C. & Aronson, E. (2007). Harcourt, Inc.) for an excellent insight into self-rationalisation and justification.
  11. No, you're not too old. There is nothing stopping you from gaining a new skill. You're not too old to learn and you're not old enough for your joints to have any serious problems. Playing an instrument is about fine motor skills and there are only two things that could hinder you: 1) Flexibility. Those who have been playing the guitar or piano since childhood will have developed very flexible hands. You might need to work a bit harder for a while to gain the reach and strength needed for certain chords, but as long as you don't have arthritis or anything, that just means a little more effort for the first few months. 2) Practice. Playing an instrument is primarily about fine motor skills. The only way to gain these is through constant repetition (practice). This is the main thing that will determine whether or not you ever learn to play an instrument.
  12. An instinct (instinctive behaviour) is a hard-wired (automatic) behaviour that is universal to a species (i.e. it is innate and every member of the species is born with it). Humans have many instincts. Some extinguish naturally after a time. For example, the innate ability to swim, which is universal to human neonates, disappears after about 6 months and we have to learn all over again, unless the parents take the infant swimming before they reach 6 months, as some do, in which case the behaviour is reinforced and the child never has to learn how to swim. The same with the rooting instinct (except that most people don't actually learn that one again). The main difference with human instincts is that learning (particularly vicarious learning, which is another instinct and not unique to humans) can override most of them (e.g. fear of snakes, spiders and heights). PS. The sociobiological approach suggests that morals and ethics, or a 'sense of morality', have evolved with us, developing from social behaviours advantageous to group survival. This argument has been supported to a large extent by the observations of ethologists such as Desmond Morris and Konrad Z. Lorenz studying the behaviours of social animals. An example of this is the observation that wolves engaged in dispute will threaten and even fight, but rarely will any damage be done. At the point of defeat, the subordinate will signal surrender through an evolved set of behaviours. The dominant wolf at this point becomes apparently unable to press his attack. This behaviour is said to have evolved to protect the pack from decimation through casualties of mating or pack-rank disputes. Other behaviours such as mutual grooming, mutual removal of parasites and mutual protection are commonly observed among social animals. Parental care, co-operative foraging/hunting and reciprocal kindness have all been observed and recorded within such social groups.
  13. I'm afraid I can't. I never paid enough attention to it to remember.

  14. I think you're right. Yes, that can happen, but it is discouraged. Good (high-impact) journals tend to avoid it. Lecturers absolutely have to if they want to stand any chance at all of not putting students into a coma. Therein lies the balance. The language should be as simple as possible, but no simpler, or you quickly lose precision. Exactly. This is precisely the point. In my experience, that tends to be the case. Another problem lies with the difference between the perceptions of the writers and those of the readers. Many writers have been involved in their fields and have been writing for years. However, many readers are students and people new to science. I will try to illustrate this difference using an anecdote: I was teaching a class in research methods. The students had been split into groups, and each group was required to design and implement a simple cognitive experiment for their coursework. One group had decided to conduct an experiment on memory. They had chosen to test the hypothesis that familiarity would aid recall. To test this, they decided to put together two lists of 20 words. One list would consist of common words in daily use and the other would consist of more obscure or archaic words that were less commonly used. The idea was that they would present these lists to two groups of people (the 'common' list to one group, the 'obscure' list to the other), and see which group would remember the greatest number of words after a two-minute exposure to the list (and a one-minute distraction task to prevent rehearsal). The students took a week to prepare the two word lists and showed them to me the following week. I checked them and, for the life of me, I couldn't tell the difference. Even after a year I still remember one of the words in their 'obscure' word list. It was 'sill' (as in 'window sill'). I probed them about this (gently, as my aim was not to humiliate them... most unlike me... but I was intrigued. Oh, I just remembered: 'intrigue' was another word on their 'obscure' list), and all four of them honestly believed that the 'obscure' list contained obscure and archaic words that they were convinced would not be familiar to most people. The difference was absolutely clear to them. As I said, though, I could see no difference between the lists. After this thread, I can't help wondering if these students might represent those people who complain the loudest about the 'obscure' terminology used in journal articles.
  15. If you find writers who do that, then they are bad writers because the words in each of the pairs you present mean different things: Enhance = to increase the quality of (i.e. improve) Vs. Increase = to make greater in size/volume/amplitude etc.; Prevarication = to avoid giving a direct answer Vs. Procrastination = to delay an action; Attenuate = make weaker Vs. Ameliorate = make better. If writers are using these words interchangeably, then that's simply bad English and a strong argument for specificity in writing to be more strongly encouraged (e.g. by teachers and reviewing editors). Was that your point? However, if they are using the words correctly, then they are simply being precise and saving at least one word, per example. There are bad writers, nobody is denying that. But for them to improve, it's not enough to say 'you are a bad writer'. There also needs to be a standard that can be presented as an ideal and to which they can be directed. This is the source of the status quo you mention. It's there for a reason. Yes, there are changes, and language itself changes over time. I have no problem with that. But, when writing a paper or a lecture, I have to conform to (and teach my students) what is correct now. I have no doubt that by the time they are writing papers themselves, things will be different (at least slightly), but I can adapt. I do use both examples. The term itself (short) and the explanation of it in simple terms (long) to teach them what it means and how to use it. That is teaching. That quote goes some way to explaining why it's hard for writers to take into account non-native speakers when writing in their own language.
  16. Why? Would it be reasonable of me to ask a Spanish (or Greek, or Danish, or German etc.) author to modify their use of language so I could understand it? Also, how? Is it for an author to guess the standard of English of a non-native speaker? The world is full of people with ego; that's a universal problem. I'm not sure how it would manifest in a journal article though. Jargon is universally discouraged (as I said in a previous post). Jargon tends to be made-up shorthand, peculiar to a specific area, and so does not help with clarity. As for 'highfalutin' language, I'm not entirely sure what that means. If an author uses English that is grammatically correct and terms that are recognised within the discipline (excluding jargon), then I'm not sure where the problem lies. If I write a lecture (essentially the same in function: to disseminate information) describing the 'substantia gelatinosa as constituting laminae II & III of the dorsal horn', is that pompous? Would it make the students' lives easier if I talked about 'the jelly-like stuff that makes up the second and third layers of the sticky-out, grey bits in the back part of the spinal cord'? Probably, in the short term, but not in the long term. 'Jelly-like substance' could describe snot. Substantia gelatinosa describes specifically the region in which primary afferent inhibition takes place. If I talk about 'pain detectors' instead of 'nociceptors', is that clearer? It might appear so, superficially, but in fact it's completely misleading. The term 'pain detectors' suggests that pain exists as an objective entity and that we have receptors to detect it. That's completely untrue. Pain and nociception are completely different things. It would not serve my purposes, or my students, to allow them to infer from my sloppy use of language that it was otherwise. When writing about these things, I have to be specific. Specificity denies ambiguity. To be specific, I have to use the specific terms that are an accepted part of the discipline. I could simplify the language easily, but so much would be lost. For example, I could say that action potentials (how would you simplify that term?) are sent along nerve fibres, or flow down nerve fibres, instead of 'action potentials are propagated along nerve fibres'. However, the first options ignore the fact that the movement of action potentials along a nerve fibre is an active process of propagation. It suggests they simply move down axons like a signal down a telephone line. Again, that's not true. You have to be specific if you want to be clear (and accurate). In any case, I would suggest that as long as a writer is using correct grammar and conventional terminology (in whatever language they're writing), then any difficulty in understanding is not their problem. The onus is on the author to make their meaning clear, but that is not to say that readers have no responsibility at all, do not need to make any effort, and have an innate right to just sit back and be spoon-fed 'Sesame Street Science'.
  17. True, in some cases. But to reinforce my point, the two words you use refer to brevity and precision, and neither means concision. Only together do they broadly suggest concision, and even then you have introduced ambiguity and the possibility of misinterpretation. For example, 'brief' in that context might be taken to imply 'superficial', where 'concise' certainly cannot. I agree, we're probably not on the same topic. I'm speaking from the perspective of one who has to read and write these articles, and (perhaps more pertinent) to teach others to do so (the latter is really the tricky part, as standards of written English seem to fall year by year). But these articles are in a completely different area. I don't think I've ever seen an engineering article. Cool & funky. I do take your point. Simplicity is an important part of writing articles, but there is a fine balance between keeping things simple and writing for children, which, as you acknowledge, would also increase the word count and (as I maintain) would introduce unacceptable levels of ambiguity. It is also very irritating to read. We do have to remember that the articles are targeted at other professionals in the area. If the articles are in English but your first language isn't English, then I can see how that might present problems. But that situation is not the fault of the authors. Seminal papers tend to get professionally translated for international publication though. If I had to read papers in my area, but from a Spanish journal, I couldn't really complain that the Spanish authors were using tricky Spanish words. I would consider it my obligation to improve my Spanish vocabulary. Neither do I, but again, that's not down to the authors. That's down to the publishers. See above. I wouldn't dream of trying to improve people's English here. As you say, I would have to quit my other job (where I can actually make a difference). It's just a mix of facetiousness and habit on my part. Different fields and journals have different conventions for the layout and presentation of articles, even for referencing style. As Klaynos says, you only have to look up a word once. Thereafter, you'll never have trouble with it again. Consider it personal growth (as I tell my students). Yes I do, and yes I am. I have to read them completely, because in my case (I don't know about engineering) it's not just about knowing what the authors are presenting. It's about evaluating the authors' rationale and their methods, and whether or not their conclusions are supported by their data, and so on. Critically evaluating the paper, as opposed to just absorbing it. Understanding is good though. That, when all is said and done, is what it's all about. But I love the discussion. Apologies for length. No worries. Good discussion. Apologies for concision (I have lectures to write).
  18. I don't see why. These are all transitional stages; lay people training to become professionals and, as such, not the target audience. Given the role of undergraduates (learning), I would have thought that the onus was on them to learn enough about their area so that papers relevant to their area become clear to them. In short, it is for undergraduates to grow, learn and adapt, not for researchers to 'dumb down' journal articles to cater to them. How would that encourage them to learn? Undergraduates are not the target audience and publishing journal articles is not all about them. It is their role to learn enough to be fluent in their area. However, research papers are not written deliberately to be impenetrable to them. As I said, they are written for people as qualified as the author. It is not the author's fault that undergraduates aren't, and it is the whole purpose of being an undergraduate to become as qualified as these authors. Yes, grant proposals are different and serve a different purpose. That is why they are written differently. They are designed to give a brief and comparatively superficial impression of the proposed research to a non-specialist purse-holder (whose main interest will be deliverables anyway). My point exactly. There you go, you just learned a new word. Now, when you see it again, you will know instantly what is meant and the author won't have to write "the process of presenting as much information in the fewest words possible" just to make it clear to you. Not really. It's more a matter of clarity. You should try reading undergraduate essays. The arguments they are presenting are too often buried in superfluous text and they can take a page to present a point that would be better presented in a few lines. It takes a long time to work out what it is they are trying to say, even when what they are saying is correct. As I have said, this is part of the purpose of their being undergraduates: to learn how to express themselves and their arguments clearly. But nobody wants to read terabytes of data. You will find, should you ever choose to write an article for a journal, that journals have word limits for articles, whether they are to be published online or not. Rambling and long-winded articles tend not to get published. When writing an article, the onus is on the author to make their points clear, not on the reader to have to trawl through pages of verbiage in order to find the point. When engaged in research, researchers have to read many papers. It eases the process if the authors of those papers just get to the point and present only what is necessary, clearly and precisely. Paragraphs are used appropriately. If you mean the spaces between them, then it's usually the publishers that limit them to indents to save page space. It does tend to contract things, but you get used to it. It also introduces ambiguity, which is a bad thing. It would be better for people to learn how to read in their field. This is particularly good for students, who should be encouraged to grow. It would be a bad thing to encourage authors to dumb things down for them. By the way, 'instead' is one word, not two. Listing things at the end would make it harder to navigate. A naive reader would constantly be turning to the index. It is better, particularly with abbreviations, to write the full term in the first instance, followed by the abbreviation in parentheses, and to use the abbreviation thereafter.
For example, "levels of salivary SIgA were measured using Enzyme-Linked ImmunoSorbent Assay (ELISA). Results of the ELISA show....". That way, naive readers have all they need in that sentence, without having to turn to the back to find out what ELISA means. Anyway, a lot of this just sounds like "Awww...but I don'wanna learn the big words...". Well, it's all about the learning. We sometimes have students here who complain about the same thing; 'Why do these writers make it so hard for us to understand stuff?' The logic that blames the qualified for the ignorance of the unqualified seems strange to me. Surely these people know why they are students?
  19. The main objective of writing articles for publication is clarity, but this involves sub-criteria: concision and precision. 'One of a pair of molecules' = 5 words; 'moiety' = 1 word. 'Broken down' = 2 words; 'ablate' = 1 word. So the use of these words increases concision. Precision is also important, particularly in more theoretical areas. Once published, your peers will be waiting to pull the paper apart. Imprecise language provides them with more opportunities to do that than is warranted by the subject matter, and a lot of time can be wasted re-explaining your meaning to people who got the wrong end of the stick. You also have to remember who the target audience is. It is usually others involved in the same area of research, which is why papers get published in particular journals. But the most basic consideration of the audience is that they are at least as qualified as you (although not necessarily in your particular field). So, it's fine to use words that are generally understood, but not words or jargon that are specific to your field (without explanation). Articles are written to disseminate information accurately, precisely and concisely, to people who are qualified. There are examples of overblown pomposity out there, but they tend to be as frowned upon as sloppy, casual prose (examples of which can also be found). As a general rule, scientific articles aim to provide maximum information, as precisely as possible, using the fewest words possible.
  20. Lower oesophageal sphincter. Or are you thinking of the other end, where the stomach joins the duodenum? That would be the pylorus.
  21. Blood alcohol refers to (and is a measure of) levels of alcohol in the blood, not acetone, ketones or aldehydes. If blood alcohol tests responded to, or included, metabolites of alcohol, they would be completely useless as tests of blood alcohol levels. Then it's a possibility and cannot be ignored. As I said, alcohol can freely cross cell membranes, so alcohol in the system will eventually find equilibrium throughout the body, where it has a universally negative/inhibitory effect on all cells and systems. Being 'drunk' is simply the effect of alcohol on neurons, but it affects other cells and systems just as much; it's just that we are not aware of those effects. For example, you are more likely to fall victim to an opportunistic infection when you have been drinking (colds, sore throat etc.) as alcohol suppresses immune function. Likewise, cells responsible for healing (e.g. cells in the basal cell layer of the skin) are also affected. The only reason we drink it is for its psychological effects, but it affects everything. We're just not aware of the other effects. We can make the assumptions; we just know that it would be foolish to do so. By doing so, as ecoli says, we are ruling out all the simplest and most obvious explanations (without testing them) and beginning with the assumption that the explanation must be something extraordinary and complex. That is bad science. The solution may be extraordinary, but you can't just ignore the ordinary to begin with.
  22. You have made your point well. I understand the analogy. The point I am trying to make is that it is fundamentally flawed. 45% of whole blood is cells, the rest is water, anionic proteins (albumin), glucose, carbon dioxide, clotting factors, hormones etc., and in this case, alcohol. By comparison to everything else, the cells are huge. You can’t consider blood cells females and plasma males. It would be more sensible to think of it as neutrally buoyant basketballs suspended in muddy water, where the basketballs are cells and the muddy water is plasma and its dissolved constituents (including alcohol). Now, in the case of burns trauma, according to the paper you provide, “Increased capillary permeability is the most significant physiological alteration”. It goes on to say “It allows the sieving or leaking of protein-rich plasma from the vascular into the interstitial spaces. One study revealed that proteins with a molecular weight of 300,000 daltons can escape from the vascular bed following thermal injury. (The molecular weight of albumin is 69,000 and IgG, one of the immunoglobulins is 160,000 daltons.)” Albumin at 69,000 daltons is tiny compared to a red cell. One transmembrane protein in a red cell is 95,000 daltons. More to the point, compared to alcohol (CH3CH2OH) albumin is truly immense! So, as I keep saying, even if alcohol was not able to cross capillary walls freely under normal conditions (which it is), under conditions which result in increased capillary permeability (such as burns trauma), wherever plasma leaks to, alcohol is going to go with it. Thus, your analogy of alcohol as ‘aliens’ which implies they are particles equivalent to blood cells rather than a completely miscible substance which, once in the system, forms a completely dissolved constituent of plasma, is flawed. As for the martians ‘refusing’ to go, that just makes no sense at all. You might just as well pour a cup of coffee through a garden sieve and expect the coffee to be caught whilst only the water passes through. You mean ‘haematocrit’ (or ‘hematocrit’ for those in the US) and no, it isn’t similar to any such thing. This explanation would make sense only if alcohol molecules were the same size as erythrocytes (or at least greater than 300,000 daltons). They’re not. Really, you have to believe that. They’re very much smaller and wherever plasma goes, alcohol dissolved in it will go also. We don’t know how it could be. That’s what we’ve been saying. What I am saying is that your suggestion as to how it might be, does not work. It contradicts what is already known about physiology and chemistry. That’s all I’m saying. I get the feeling that you very much want it to be true, but I’m afraid it isn’t. You will have to look for an alternative. I can’t think of one. It does, but it does not support your case.
  23. Trauma does indeed cause a change in blood chemistry, but it can't add something that the body does not naturally produce in measurable quantities, like alcohol. In this case, as plasma shifted from the central compartment to the peripheral around the trauma site, any alcohol it contained would go with it, as I have said. Thus, the alcohol concentration of the plasma remaining in the central compartment would remain the same as before the loss in volume. There is no reason at all why all the alcohol would shift. That would be like all the dye in a tank of water spontaneously moving to one end: not impossible, but so improbable as to be discounted completely. It wouldn't happen, for the same reason as above. Alcohol is soluble in both water and lipid and can move freely across plasma membranes. Thus, it tends to find equilibrium between blood plasma in the central compartment and interstitial fluid. There is no physical reason that all the alcohol would spontaneously move from the central to the peripheral compartment, or vice versa, trauma or no trauma. Yes, the endogenous blood chemistry would change, because an endogenous change in blood chemistry can only put into the blood what the body synthesises, stores or in some way uses. There would be changes in pH, elevations in potassium, endorphins, prostaglandins, histamine (associated with the inflammatory response) and so on, and probably a shift in electrolyte balance (calcium, sodium etc.), but all these changes involve natural physiological mechanisms and involve substances the body itself synthesises, stores or uses. The body does not naturally synthesise, store or use alcohol; it only metabolises it as a toxin to remove it from the system. Alcohol has to be introduced from the outside. How so? This is the crux. You need to show how it is a mistake. As far as I know, it isn't. You are jumping the gun. Any explanation of why that would happen is predicated on the assumption that it does happen. Before you seek explanations concerning why, you need to establish that it does, or at least that it can, happen. As far as I know it doesn't, and there is nothing about a burns trauma that would cause alcohol to appear in the blood where previously there was none (or more, where previously there was less). Alcohol in solution will find equilibrium and its movement in solution is not hindered by cell membranes. If a person has a BAC of .04 and loses half his blood volume from the central compartment, the remaining blood will have a BAC of .04 (there is a short worked sketch of this after this list). To know this, you would need to have taken a test prior to, or immediately after, the accident. Did you?
  24. I'm afraid I have no idea why that could be the case. Blood tests are acknowledged as being the most accurate method of testing for blood alcohol. There are a few things that can affect the reading in blood tests (fermentation, clotting and contamination), but these days samples aren't usually left too long before testing. Also, they're usually taken using pre-prepared tubes (e.g. Vacutainer) containing EDTA (anticoagulant), and such phlebotomy tubes are also pre-sealed and sterile and not opened until they reach the lab. I have taken blood tests for alcohol and there is a specific procedure. For example, the venipuncture site should never be swabbed using Sterets (they contain isopropyl alcohol) or any alcohol swab, for obvious reasons. We use iodine-based pre-injection swabs in these cases. There is a possibility of messed-up results if: 1) the sample was taken in A&E (ER) by an idiot house officer or SHO who used an alcohol swab; 2) they used a standard syringe and decanted into a plain tube with no anticoagulant; or 3) they left the sample lying around for a long time at room temperature or higher before testing. However, it is often police medics who take such samples and they 'should' know what they're doing. Other than that, I know blood chemistry is affected by trauma, but it is unlikely the effects would alter the concentration of alcohol. However, I really don't know enough about it to comment. If all procedures were followed correctly, and all other things being equal, that would leave two probable situations: 1) the person drank more than they said they did; 2) the person drank exactly what they said they did, but had drunk a lot more the previous night and the 6-12oz beers were simply 'topping up' already elevated levels. After that, it gets less probable, such as: the person has a liver disease and cannot metabolise alcohol effectively.
  25. It is unlikely that blood plasma could diffuse from the central compartment and leave alcohol behind (and so increase relative concentration). The nature of ethyl alcohol (soluble in both water and lipids) means that it can diffuse through the endothelial cells that form capillary walls. Hence, for example, the blood brain barrier is no barrier to alcohol, whereas even glucose requires active transport across it. So, wherever plasma can go, alcohol can go with it.
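
As a rough illustration of the arithmetic behind posts 22 and 23 above, here is a minimal sketch. The choice of Python and the round numbers are mine: ethanol's molecular weight (~46 daltons) is basic chemistry, and the protein figures are the ones quoted from the burns paper discussed above, so treat this as illustrative rather than a physiological model.

```python
# Illustrative sketch only: why losing plasma volume leaves the blood
# alcohol concentration (BAC) of the remainder unchanged, and how small
# ethanol is compared with the proteins that leak through damaged capillaries.

def bac_after_volume_loss(bac: float, fraction_lost: float) -> float:
    """Alcohol is dissolved uniformly in plasma, so plasma lost from the
    central compartment carries alcohol with it in proportion. Removing a
    fraction of a well-mixed solution removes the same fraction of the
    solute, leaving the concentration unchanged."""
    starting_volume = 1.0                       # arbitrary units
    starting_alcohol = bac * starting_volume
    remaining_volume = starting_volume * (1 - fraction_lost)
    remaining_alcohol = starting_alcohol * (1 - fraction_lost)
    return remaining_alcohol / remaining_volume

print(bac_after_volume_loss(0.04, 0.5))   # -> 0.04, as stated in post 23

# Rough molecular weights in daltons. Ethanol (C2H5OH) is ~46 Da; the
# albumin and sieving figures are those quoted in the burns paper.
ETHANOL_DA = 46
ALBUMIN_DA = 69_000
SIEVING_LIMIT_DA = 300_000

print(ALBUMIN_DA / ETHANOL_DA)        # albumin is ~1,500 times heavier than ethanol
print(SIEVING_LIMIT_DA / ETHANOL_DA)  # the quoted leak threshold is ~6,500 times heavier
```

The point is only the proportionality: wherever leaking plasma goes, dissolved ethanol goes with it, so a volume shift on its own cannot concentrate alcohol in what remains.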