
Genecks

Senior Members
  • Posts

    1488
  • Joined

  • Last visited

Posts posted by Genecks

  1. I'm reading Biochemistry - 4th edition - Voet and Voet.

     

    It says this:

     

     

    Although prokaryotes
    lack the membranous subcellular organelles characteristic
    of eukaryotes (Section 1-2), their plasma membranes may
    be infolded to form multilayered structures known as
    mesosomes. The mesosomes are thought to serve as the
    site of DNA replication and other specialized enzymatic
    reactions.

     

    Uhhhhhh....

     

    I thought mesosomes were artifacts... Are they not?

    Did something change in the realm of biology in the past couple of years?

  2. Carey - Organic Chemistry (most recent edition). Going about three editions back, you'll get similar stuff.

    Practice reaction mechanisms a fair amount, and learn how to generalize information about them: Learning how to abstract a specific substituent to an "R" group is great.

    Use notecards and use them as flashcards. Practice, practice, practice. Keep all of your information organized, and practice recall a fair amount.
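    The notecard routine above can be sketched as a minimal self-quiz script. This is an illustrative sketch, not a real tool; the deck entries and the `answer_fn` callback are made-up examples standing in for a student's own cards.

```python
import random

# Hypothetical deck in the spirit of reaction-mechanism notecards:
# prompt -> expected answer.
deck = {
    "In SN2, the nucleophile attacks which center?": "the electrophilic carbon, backside",
    "Markovnikov addition places H on which carbon?": "the carbon bearing more hydrogens",
}

def quiz(deck, answer_fn):
    """Drill every card once, in shuffled order; return the number correct.

    answer_fn maps a prompt to the student's answer (e.g. input() at a
    real terminal, or a lookup function when testing).
    """
    correct = 0
    cards = list(deck.items())
    random.shuffle(cards)
    for prompt, answer in cards:
        if answer_fn(prompt) == answer:
            correct += 1
    return correct
```

    At a terminal you could pass `answer_fn=input` to quiz yourself interactively; the point is that constant recall practice, not the tooling, does the work.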

  3. I'm reading through a couple analytical chemistry books. They look awfully similar to first-year chemistry books. I don't understand where the difference is. Supposedly, analytical chemistry can be like a third-year university course, but somehow the book material is the same as a first-year chemistry course. Is there something I'm not understanding here? Because it looks like nothing new is really learned.

  4. I could only think that microorganisms would have developed, and that they would have maintained their position in the world's oceans and water. That comes with the assumption that macromolecules would only have developed in the world's water supply, thus never really reaching land. It's an interesting question that goes back to the development of the cell. Assuming that evolution could still occur, would photosynthesis be necessary for eukaryotes to have developed on Earth?

     

    There may be more to that homework question, but that's what I think. It's a broad question, whereby you would have to develop an argument based on evidence and beliefs of how the world's ecosystem developed.

     

    http://en.wikipedia.org/wiki/Timeline_of_evolutionary_history_of_life

     

     

  5. The reason I have created this topic is because I was recently recruited for a program that allows you to attend college early, more specifically, three years early. I am very interested in a wide range of scientific topics to pursue for a career, from cosmology to philosophy; however, the more I think about it, the more I realize I might be way out of my league. Even though I am passionate about different sciences, I am not entirely sure that I am intelligent enough in this field to seek a career in it. I don't want to wake up one day and realize that I didn't try to achieve my dream career because I was intimidated, but I also don't want to invest time in something that I am utterly in over my head about. If you are going to reply by lecturing me about how socially stunted I will be if I basically skip high school, go ahead, but I would really appreciate some advice pertaining to a career in science.

     

    ~Thanks

     

    You don't necessarily need a background in science to pursue a science career. What I mean is that experience working with professionals can build up a skill set so that you can break into further research. However, without a theoretical understanding of the mechanics of what you're working with, you will be blind to much of it, leaving you little more than a machine (a "bench monkey"). Also, being educated in doing research on your own makes a person a valuable asset, because there is less teaching required to make a lab member independent.

     

    In reference to being a social person, you would definitely learn to become one in college or university as you learn to work with others, develop camaraderie, and work in teams in order to accomplish a goal. And, in many ways, that's a utilitarian perspective on social relationships. There is a whole realm of social psychology and sociology devoted to what it means to have a social relationship with one or more people. In general, I've adopted a business-like philosophy, which not many people like; but it helps develop connections with people. I do not believe you will become a socially stunted person by leaving high school early to pursue academia and knowledge, as I consider academia to be a social atmosphere with mature people and common goals. High school tends to have a lot of flaky people who don't care so much about intelligence, knowledge, and progress: The people are simply watching after themselves, making ends meet, and living life without striving for excellence.

     

    A career in science can be experience-based or educational-based. It's all about networking (who you know) and what you know. Sometimes it's more about who you know. However, if you start making connections really early, you'll quickly develop the "who you know" part.

     

    A thing to keep in mind is to do your best and seek to do better, which is part of doing your best.

  6. I've focused my studies toward regenerative medicine as I've gone on in my biology career. I have come to believe that I could just as well have done molecular biology with plants and taken those skills to do molecular biology with brains. Proteins are proteins. Proteins are found in plants and brains. And I do not doubt that you can use immunoprecipitation on a plant just as you can on a brain. The required antibodies would be different, but the technique would be similar. Preservation of material would also be different, but I doubt it would be so difficult to adjust one's skill set accordingly.

     

    I don't have a Masters or Doctoral degree. However, I agree with CharonY on the skill set being of high importance. If you can easily take your skills and use them in a different field to solve a different problem or find some new knowledge, then there should not be much difficulty in a transition.

     

    I'd suggest looking at the skill sets between what you know and what you would need to know. Compare and contrast.

  7. It's all a bunch of memorization. School courses tend to have a level of abstraction where the things you've memorized must be applied to things that are the same but look different from what you've studied, thus necessitating the application of memorized information to new situations. However, who is to say that any of that is useful, anyway? The real truth is that anatomy is more a course of memorization than of abstraction, so an individual would be just fine finding Gray's Anatomy for Students and reading it cover to cover while making notecards and constantly quizzing one's knowledge and recall of the physical parts involved. It saves a fair amount of money to do so, too.

     

    Cover Hundreds of Medical Topics Spanning Over 3000+ Pages

     

    I bet it's watered-down like the Rosetta Stone scam of a language teaching package. You don't see that on TV anymore. Once people start to wise up and notice something is a scam, they don't buy into it as much.

  8. First off, if you want to argue about this, we could request a thread to ourselves.

    This person wasn't really asking how to be a multidisciplinary neuroscientist.

    The field appeared to be narrowed to molecular neuroscience.

     

    And in my experience, an individual is often better off establishing him or herself quite well in one realm of neuroscience work and only moving on to another research program due to lack of grant funds, etc. Yes, there is definitely a way to be a multidisciplinary neuroscientist, and I had advised the calculus course because there have been introductions of engineering and computer science into the realm of neuroscience. Even the forgetting curve, which has its basis in psychology, involves calculus and relates to neuroscience.
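    The forgetting curve mentioned above is a case in point. A minimal sketch of the Ebbinghaus-style exponential decay follows; the stability values used here are illustrative assumptions, not fitted data.

```python
import math

def retention(t_hours, stability):
    """Ebbinghaus-style forgetting curve: R(t) = exp(-t / S).

    S (memory stability) is larger for better-consolidated memories,
    so retention decays more slowly as time t passes.
    """
    return math.exp(-t_hours / stability)

# A weak memory (S = 10) fades faster over a day than a strong one (S = 50).
weak = retention(24, stability=10)
strong = retention(24, stability=50)
```

    The calculus connection is direct: R(t) solves dR/dt = -R/S, which is why working with the curve quickly turns into a differential-equations exercise.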

     

    Alright. My view on this is that even if computer technologies were developed and various nanotechnologies came about, their employment would require an understanding of the biochemical, anatomical, and physiological processes they would be working with. I understand that many people hold an appreciation for neural engineering and bioengineering technologies. However, from my experience in studying and reviewing the technologies, and from my transhumanist ambitions, the technologies are generally worthless, and thus regenerative medicine is a more realistic approach for generating therapeutic technologies. The utility of the engineering technologies is fruitful when they can be used for regeneration in a controlled manner that will not harm the organism.

     

    Otherwise, moving toward the harder physics involved with neuroscience and neural engineering is mostly a fool's game for a misguided futurist, or the ambition of a nerd who lacks the understanding that grants might not be coming his way for such technologies yet still attempts to develop them. I don't find much utility in these technologies from my current understanding of the neurosciences, thus I often disregard such nanotechnologies. I'm not saying they don't have their place, however. My strong view on the progress of the neurosciences is the belief that things will focus on regeneration along with a person retaining a sense of consciousness, cognition, and memory. A person with a neurodegenerative disease in the brain more than likely holds an appreciation for regeneration of brain tissue, even if that means memories will fade with time: There is paper and pen if anything important does not want to be forgotten.

     

    If something is focusing on regenerative medicine or therapeutic medicine in the realm of the neurosciences, I foresee grants and advancement in it. Nonetheless, for nanotechnologies to be of use, the molecular neurobiology of the nervous system needs to be understood. If your argument is that an individual is better off being part of the nanotech and computer programming side that builds upon advancements in molecular neurobiology, thus furthering the advancement of technologies using advanced physics, then I can see your argument and the fruitfulness of the research program. However, the limiting agent in such a research program will be the available knowledge of molecular neurobiology and biochemistry. Yes, I can see how, as more and more of molecular neurobiology and biochemistry is understood, individuals will eventually progress to a new research program, because there will be little more to study in molecular neurobiology. However, I do not think such a diminishing return in molecular neurobiology will occur by the years you have projected.

     

    In 30 to 40 years, I could see that. However, in the next 10? Nah.

     

    If I remember correctly, one of the major hurdles in bridging things into advancement is the crystallography process. If that can be simplified and conducted much easier, things would get done much faster. (http://www.cell.com/trends/pharmacological-sciences/abstract/S0165-6147%2812%2900041-7)

     

    I was working in an Alzheimer's lab, and there was still much more to explore in the realm of molecular biology. Even if you're able to regenerate tissue with engineered components, there still may be proteins that have not been explored that can lead to disease and breakdown. Furthermore, with a lack of knowledge of the molecular biology of those components, they may have toxic and cumulative effects that the technology does not account for. Similar to neural network programming, things are only as good as that which you have accounted for. As such, one could hope that people could push forward with neural engineering technologies once that which has not been accounted for is insignificant, such as being able to be taken care of by glial cells, thus reducing whatever levels of toxicity may come about.

  9. Efficient coding uses fewer system resources. However, as I understand it, such efficient coding requires more time on behalf of the programmer or programmers. This is what I've seen throughout the 80s, 90s, and 2000s. However, as of late, such efficient coding is sometimes unnecessary, as the system resources are capable of handling code that demands more of them. I think your discussion needs information about system resources. I feel like you're lacking a historical perspective on things. You may want to research how flight simulator programs are coded and the importance of system resources. I think a brief paragraph or two on system resources and past coding languages in similar games may be useful. Video game programming and hardware have evolved throughout time: From the Commodore 64 to whatever is out right now.
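    The resource trade-off described above can be illustrated with a tiny Python example: the hash-set version spends extra memory up front so each lookup is cheap, while the list version is memory-light but scans linearly. The sizes and repetition counts are arbitrary choices for the demonstration.

```python
import timeit

N = 100_000
data_list = list(range(N))   # compact, but membership tests scan linearly
data_set = set(data_list)    # extra memory for a hash table, near-constant lookups

# Worst case for the list: the element sits at the very end.
slow = timeit.timeit(lambda: (N - 1) in data_list, number=50)
fast = timeit.timeit(lambda: (N - 1) in data_set, number=50)
# The set wins on time, paid for with the memory of the hash table.
```

    The same pattern plays out at every hardware generation: when resources are scarce (as on a Commodore 64), the programmer pays with development time; when resources are abundant, the cheaper-to-write version is often good enough.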

     

    http://en.wikipedia.org/wiki/Video_game_console

     

    So, you would have to define what efficient coding is, and describe its usage in modern video games. You also need to define realistic video games. Do you mean realistic violence, whereby if someone shot another person, then the person dies (this occurs in old 2D video games)? Do you mean the level of interactivity and fluidity, such as that found on the Nintendo Wii platform?

     

    I've been playing Call of Duty: Black Ops, and I really don't care for the physics. I've shot guns before, and the physics are off. My main problem is that I feel like the level of control is "unrealistic," because I believe increased muscle memory would occur with increased gameplay, thus resulting in better shooting. However, I feel like the more I adapt to the physical flaws of the game, the better I do.

     

    And then you have fighting video games. You ever notice how a punch occurs, despite physical contact not being made? Yeah, that's a flaw in the game.

  10. My first idea, too, was to make a cake. Maybe buy some dinosaur figurines and place them on the cake.

     

    Another idea that comes to mind, especially if exact or approximate proportions are not of concern, is to find some books and stack them on top of each other. Maybe find books with different colored covers with different spine thicknesses. Maybe stick a toy dinosaur inbetween them to show where a dinosaur skeleton may be found. And possibly use the bottom or top of blue BIC pens to represent water.

  11. ...

     

     

    Okay so the main 3 issues which essentially give way to all others are:

     

    Can the biotech be created as 100% safe? So that internal IC's or nanotech cant be hacked or modified.

     

    How do we ensure that any physical improvements are for the benefit of mankind and not a single nation, avoiding much death.

     

    If we modify our cognition and have the full scope of human knowledge installed, how human are we? how far do we push it? how can we predict its effects on society and the mind?

     

     

    ...

     

    First off, it's only safe to the point that it will not malfunction on its own. Otherwise, there would be outside tampering. If someone were to set off an EMP bomb near a group of cybernetic humans, the cybernetic humans could malfunction. However, that would be considered a form of battery, which may lead to an attempted murder charge if not a murder charge. Simply the act of bringing an EMP bomb next to a cybernetic human could be considered a form of assault.

     

    If a person were hacked by someone else, there would be legal implications. Otherwise, if a company made bad parts for people, there could be a negligence lawsuit.

     

    In medical practice, touching a part of an individual's body without that individual's consent is considered battery. Likewise, for a person to tamper with a chip in an individual's body, such as a GPS chip, would be a form of battery were there no consent.

     

    So, can they be made 100% safe?

    No.

     

    Even if defects could be controlled for, there is the possibility of outside tampering. As such, various laws that involve punishment for violence would apply. However, if people were given all of human knowledge, there is the possibility that significantly fewer people would be willing to conduct physical violence against another, as they would understand the potential legal implications and punishments involved (excluding any discussion of ignorance of the law being no excuse; in this case, individuals cannot be ignorant). As such, there would be heightened potential for increased punishment to prevent re-offense.

     

    If such wetware had wireless networking capabilities, then there is the possibility that an individual tampers with the data stored for another cybernetic person. However, this might leave behind forensic evidence. As such, one might argue that if individuals were to be given cybernetic capabilities, then there would be a level of regulation so that evidence can exist and be left behind, allowing crimes to be prosecuted. If there were no regulation, then there would be a level of anarchy without the ability to punish others. So, I'd say something similar to MAC addresses and IP addresses, that is, identifiers physically configured in the hardware that cannot be changed without an individual skilled in the medical technology, would be used as a way of allowing evidence to be left behind.

     

    However, this becomes extremely problematic if terrorists begin persuading people to kill themselves, thus preventing evidence from being left behind. However, such fringe scenarios would require advanced control of an individual's motor cortex, so safeguards could be created ahead of time to prevent control over another individual's motor cortex.

     

    I think the human race is quite some time away from developing such technologies without the addition of large masses of individuals into the field. Even then, there would be heightened competition for resources to develop such technologies. There appears to be no seriously large need for the cybernetic technology at the moment. However, various bioengineered parts are useful for individuals who have developed loss of a physical part of their bodies.

  12. There are a variety of cross-disciplinary things that go on in graduate school, often encountered by working in different labs. If you're interested in organic chemistry, I would suggest studying it after you get your bachelor's degree unless you desire to be a pre-med student. Becoming a pre-med, I believe, would increase difficulties beyond getting a B.S. in Psychology unless the workload in the B.S. Psychology program is so lax that you have an extra 40+ hours a week to devote to organic chemistry, studying nucleophiles and electrophiles, along with memorizing, generalizing, and abstracting reaction mechanisms. Looking back, I could have made an A in my first semester of organic chemistry. However, I'm not too sure, even with my advanced skills in studying, that I could have made more than a B in my second semester of organic chemistry. The workload was ridiculous. Then again, one of my problems was that I had all of my gen eds out of the way and was taking degree-major related courses, thus turning up the level of general difficulty of the things I dealt with.

  13. That sounds like a good plan. Balancing one's academic courseload is important.

    You may also be interested in biophysics.

     

    http://en.wikipedia.org/wiki/Biophysics

     

    The human brain is binary in some ways, whereby there is an all-or-none action potential. Either a neuron is firing or it is not: This is more like a binary system. Also, there are graded potentials, which are of importance: These are unlike the binary, all-or-none feature of other neurons. I would not focus so much on quantum physics during your undergraduate education for biology.
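    The all-or-none firing described above can be sketched as a simple threshold test. The -55 mV threshold used here is a typical textbook figure, assumed for illustration; real neurons vary.

```python
def fires(membrane_potential_mv, threshold_mv=-55.0):
    """All-or-none rule: a spike occurs only if the membrane potential
    reaches threshold. A sub-threshold graded potential depolarizes the
    cell but triggers nothing, which is the binary aspect in question."""
    return membrane_potential_mv >= threshold_mv

# A graded potential lifting the cell from -70 mV to -60 mV stays silent;
# reaching -50 mV crosses threshold and produces a spike.
sub_threshold = fires(-60.0)
spike = fires(-50.0)
```

    The graded potentials themselves are the analog part of the story: they sum continuously, and only the crossing of threshold is binary.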

     

    Might I suggest you get a tutor if you feel that your math skills are not up-to-par. I often did such in my undergraduate education for calculus. Prior to that, I hadn't taken a hard math class for over three years.

  14. Brain death is used in a modern medical context to help medically define what it means for someone to be dead. As such, an individual who is dead has brain damage that is irreversible. There are other problems, of course, such as whether or not a person in a vegetative state holds a sense of consciousness, whereby the individual can perceive the outside world but have no control over his or her body.

     

    When I think of brain death, I think of a lack of consciousness, an inability to awake from a state of rest, no sense of personhood, and no memories to be recalled. The individual would be little more than a central nervous system attached to various muscles: A puppet with a motor cortex. If the person did hold memories, however, and the "sense" of consciousness could be re-instated, it would make for quite an argument in resuscitating individuals who have encountered traumatic brain injuries, despite whatever impairment may be present. So, I'm saying a brain-dead person is little more than a puppet. You could probably keep them alive, physically, but the generally accepted neuronal functions that involve personhood are gone.

     

    If a person is arguing for particular criteria that meet the definition of brain-death, such as the above stated criteria, then the person would have to argue that he or she held an inability to awake from a state of rest. However, the individual did have the ability to awake from a state of rest. A state of rest implies that consciousness is hibernating. A person could argue that there was no state of rest and consciousness had "left the body." However, I think it is better argued, in that case, that the person had a death experience rather than a "near-death" experience.

     

    The person died and came back to life. However, I think if there was a level of mysticism involved with these claims, and I were to follow with the mechanics and logic of the individuals, then I don't see why possession by some other worldly consciousness would not be possible: for example, demon possession. However, there haven't really been many arguments about such, to the best of my knowledge. Most arguments have been people attempting to argue about their so-called near-death experiences as evidence for there being a life beyond the Earthly realm.

     

    So, I'd say the guy is wrong. If he was near death, then he didn't die.

     

    There is a whole philosophy to this stuff. My argument has often been that a person has to claim that he or she feels to be the same person after coming back from a true brain death to claim that he or she is the same person. As such, the problem becomes, again, consciousness.

     

    What is consciousness?

     

    That's the philosophical question. Identity often relates to memories of past experiences. And then you have all kinds of sociological theories that describe how a person holds an identity in society. If the person had brain damage, that person may have part of his or her personality affected by loss of neurons or memories. You would have to examine their past and most recent memories, in line with Ribot's law. I believe Ribot's law relates mostly to neuroanatomy.

     

    Yeah, I wrote the above without looking through the Internet. Most of what I've read on wikipedia states similar arguments. I didn't include the fact that imaging is not done, but I considered it. Another thing that would need to be considered is neuroimaging, but that is expensive. If the study of death experience was of such high importance, then people would be doing more neuroimaging research. It may not be worthwhile, especially for scientists who consider the near-death experiences to be from brain injuries or brain damage. There would be arguments, etc. etc..

  15. There has been a lot of research on inflammation and stress. Furthermore, there are a multitude of factors that relate to brain aging. Estrogen is another one, which may explain why women live longer than men. As such, the reasoning would mean a man would take estrogen supplements... which might be more worthwhile if a guy is in his older years and no longer wants to reproduce.

     

    Those experiments were on mice rather than humans or primates. Furthermore, it would take the right equipment to consistently inject drugs into one's brain... so, there are all kinds of problems with that.

  16. Well, mutation, right?

     

    http://www.biology-online.org/2/8_mutations.htm

     

    Genetic inversion?

     

    If I'm looking at this from a genetic standpoint, whereby the code is changed, thus leading to a changed protein, then the DNA was incorrectly set up during repair, replication, etc. For the protein to be translated, the start codon would still need to be present.
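    That start-codon requirement is easy to check programmatically. A minimal sketch follows; the sequence shown is a made-up example, and a real check would also verify the reading frame runs to a stop codon.

```python
def first_start_codon(coding_strand):
    """Return the index of the first ATG on a DNA coding strand,
    or -1 if no start codon exists and translation cannot begin."""
    return coding_strand.upper().find("ATG")

# Hypothetical sequence: ATG begins at index 3, so a reading frame exists.
idx = first_start_codon("ggcATGtttTAA")
```

    If a mutation destroyed every ATG, the function would return -1, matching the point above that the transcript could not be translated into protein.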

     

    In reference to a polypeptide being altered, protein degradation and repair come to mind. However, if there were such a mutation, I would assume that the repair mechanism is busted and that the protein will either be replaced quite quickly (thus making it hard to notice through a bioinformatics search, I assume), or the organism/cell will die.

     

    From the protein code you gave, I would think that the protein would not be functional.

  17. Thanks guys, I have been meaning to reply but have just been so busy with school.

     

    I've already registered for the fall semester but if I play this right I should be able to squeeze in a semester of general physics and pre calc (a pre req. for calc). Not optimal, no, but I suppose it's better than nothing.

     

    Oh, you don't have pre-calc out of the way? Yes, then definitely increase your maths, I would argue.

     

    I would not suggest taking general (algebra-based) physics if you do not have pre-calculus out of the way. It is not impossible to understand physics without a strong algebra background. However, I reason you will spend a fair amount of time understanding the equations, the different situations they are used in, and learning how to apply them in an abstract way to physical scenarios that deviate from the ones you may commonly encounter while studying physics. If you can understand the equations and quickly understand how to apply them to new and abstract situations, then good. Make sure you understand what the physics instructor is discussing, work through various problems, and understand whatever abstractions of concepts the instructor may present. If you're taking algebra-based physics, you'll be competing against pre-medical students, so you may have some increased difficulty getting a good grade.

     

    For the higher mathematical fields I have studied, learned equations are eventually used in abstract situations. As such, the ability to use equations in abstract scenarios (scenarios that deviate from previously encountered situations in which the formulas can be used) helps a person think more abstractly. Abstract thinking becomes more important. I think that is something I did not come across too often while studying psychology. I don't recall often using abstract reasoning unless I was developing a psychological experiment. More often in psychology I would see a set of criteria and attempt to diagnose what disorder/condition fits those criteria. That is a kind of abstract reasoning similar to math and physics problems, but using math can be different due to the increased need to manipulate mathematical variables and visualize the physical situation so that the math can be manipulated and equations used.
