Everything posted by Marat
-
I located a source which discusses these issues, though the results seem contradictory: blind patients restored to sight are sometimes able to recognize what objects are without touching them, while at other times they have to feel them first in order to identify them and make the correlation to their blind experience of them. One case reported describes a life-long blind man of 52 who was restored to sight by a cornea transplant. The experimenters report: "We showed him a simple lathe (a tool he had wished he could use) and he was very excited. We showed it him first in a glass case, at the Science Museum in London, and then we opened the case. With the case closed, he was quite unable to say anything about it, except that the nearest part might be a handle (which it was -- the transverse feed handle), but when he was allowed to touch it, he closed his eyes and placed his hand on it when he immediately said with assurance that it was a handle. He ran his hands eagerly over the rest of the lathe, with his eyes tight shut for a minute or so; then he stood back a little, and opening his eyes and staring at it said: 'Now that I've felt it I can see.'" -- R. L. Gregory, 'Eye and Brain' (New York: McGraw-Hill, 1972), pp. 197-198.

It seems from this that the blind person has learned to acquaint himself with things via touch and then to apply this tactile information to his ability to organize his experience, label things, and identify them linguistically, but that he has still not learned how to apply the same knowing process to visual material. With other objects, in contrast, he could label them correctly from sight alone, such as block capital letters, which he had learned to identify by touch in blind school, though he had never been taught them by sight. Perhaps the difference between his experience with block capital letters and the lathe lay in the degree of spatial complexity each exhibited -- though letters can be quite complicated. More likely he had simply had more practice with block capitals in blind school than with lathes, and he still had difficulty using visual information creatively to put together a coherent idea of a new thing. This seems supported by the fact that he could not identify lower-case letters, even though he knew upper-case letters by both touch and sight.
-
Logically there is a large variety of ways that the economy can respond to consumer goods getting more expensive because of a carbon tax. Consumers can buy fewer goods (which is in part what cigarette and alcohol taxes, as well as import duties, are designed to encourage) or just pay more for the same number of goods. Industry can absorb the costs if it believes it can still make an acceptable profit and cannot optimize profits by selling at increased prices, or it can try to optimize profits by selling at higher prices to meet a relatively inelastic demand. All these things will depend, as noted above, on the size of the carbon tax, the nature of the things being sold, the technological adaptations in response to the tax, etc. Since most sales taxes are inflationary, the carbon tax, which will essentially amount to a sales tax, will probably have the same effect.

It is also an open question what the government will do with the proceeds of a carbon tax. If it uses the extra cash to invest in non-productive activities, as it is often tempted to do, like making more useless aircraft carriers to sail around the world at $4 billion each without ever producing anything of value beyond employing a few sailors, or invading Iraq at $1.5 trillion just to 'produce' destruction in someone else's country for no tangible benefit, then clearly the effect of the tax will be just to diminish the productive capacity of the economy. If God demanded of us that we produce something of no intrinsic value, merely to serve religious needs, like a Tower of Babel, then that would generate some employment but it would ultimately be economically detrimental, since it would represent the investment of scarce labor and capital resources in making something which is unproductive rather than in answering tangible needs for improved infrastructure, industrial capacity, housing, etc.

Similarly, if we now have to invest labor and capital resources in buying something which is not directly valuable, like less CO2 in the atmosphere, then those scarce resources are being squandered, at least in terms of the traditional needs of a consumer society, since the investment neither provides additional consumer goods nor increases the capacity to produce consumer goods, and that is a real cost to our economy.
-
I'm only asking questions about the regularities noted. Most historians partly explain the disastrous performance of the Austro-Hungarian forces in World War I by the high death rate among sergeants in the very first phase of the fighting, a result of the initial failure to understand how the increase in firepower required a change in military tactics, which was only fully worked out around 1917. If anyone could explain the high proportion of sergeants on the roster above, it would be interesting. Also, what does account for the high number of Anglo names? I thought the U.S. military was now increasingly Hispanic, but the death list does not reflect that.
-
The proper way to address all 'slippery slope' arguments, where something which is in itself good is rejected on the speculation that it might, if extended, lead to something bad, is just to endorse only the good part and recommend clear legal restrictions to prevent the bad part. "We shouldn't permit driving, otherwise blind people may get behind the wheel and kill people!" Well, we could just license driving so that blind people wouldn't be allowed to drive. Similarly, we could make incitement to suicide illegal (as it already is, if done on a personal basis rather than in a theoretical discussion) where the reasons given for urging someone to consider seriously whether he should die are that his death would have a utilitarian benefit to society, rather than a benefit to his own interests. Most of these suicide-persuasion danger arguments seem to suppose that people are much more easily persuaded to do very difficult things they don't naturally want to do than they really are. Have you ever tried to persuade an aged relative to make an obviously rational choice about what to do with his own money, even if the benefits to him are overwhelmingly clear? "Garl darn it, I just don't trust banks, and that's all there is to it." -- "But dad, don't you realize how much interest you could be collecting?" -- "Ma mind's made up! Now git, or I'll cut you outta ma will!"
-
It might be difficult rigorously to distinguish the health benefits of running or of any sort of exercise from the health benefits of being healthy enough to begin with to be able to run, or to want to run rather than lie on the couch, trying to catch your breath as a result of your cardiomyopathy, for example. How do we know whether it is only the healthy who can run, and this accounts for the link between running and health, or whether running actually improves baseline health? Any test you could conduct would have to begin by disqualifying sick people from participating, since otherwise it would be unethical to expose them to the medical risks of running. Also, if I lose 2% of my productive time by running every day, and as a result I only gain 1% additional lifespan or can only extend the productive fraction of my existing life expectancy by 1%, then running is unprofitable. I've never seen anyone do a rigorous cost/benefit analysis of running, since they always seem to presuppose that the experience itself is valuable, rather than valuable only for its side-effects on health.
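To make the arithmetic behind that claim explicit, here is a minimal back-of-the-envelope sketch in Python. The 2% time cost and 1% lifespan gain come from the post; the 50-year baseline is purely an illustrative assumption.

```python
# Hypothetical break-even check for the time cost of daily running.
# The baseline of 50 remaining productive years is an assumption for illustration.

baseline_years = 50.0        # remaining productive years without running (assumed)
daily_time_cost = 0.02       # fraction of productive time spent running (from the post)
lifespan_gain = 0.01         # fractional gain in productive lifespan (from the post)

productive_without = baseline_years
productive_with = baseline_years * (1 + lifespan_gain) * (1 - daily_time_cost)

print(f"Without running: {productive_without:.2f} productive years")
print(f"With running:    {productive_with:.2f} productive years")
# With these numbers the runner ends up with ~49.5 productive years versus 50,
# which is the sense in which the post calls running 'unprofitable'.
```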
-
This reminds me of an old movie in which Dracula, suddenly aware that he is too well known under that name, changes his name to 'Count Alucard' to fool people, but is caught when someone sees his new name on the door of his carriage reflected in a mirror. Why on earth someone trying to hide his identity would do so by spelling his real name backwards, rather than just calling himself 'Dr. Smith' or 'Count Jones,' is beyond me. Similarly, why court exposure by playing on the possible associations between 'Marat' and 'Tar'? It would be so much easier just to call myself, say, 'Lemur.'
-
Philippe Rushton, a professor at the University of Western Ontario, along with Professor Watson of the famous Watson and Crick team, as well as Professor Herrnstein, all believe in the theory that IQ can be shown to differ by race. IQ is certainly at least partially heritable, as can be demonstrated in the relation between the IQ of children and their parents. This is actually fairly generally admitted, but critics now try to attack its significance by arguing that IQ only measures a narrow skill set which should not be described as 'intelligence.' I have always suspected that if true geniuses like da Vinci and Goethe had been given IQ tests, these exams couldn't really have measured what was special about their creativity and inventiveness, since they in fact only measure a certain type of cleverness. It is also argued that individual variation in IQ within each race is great, so no individual should be tagged with either the negative or positive IQ measures of his race without being given a chance to demonstrate his own personal abilities.

It is often said that race has to do only with skin color, but this is not true, since countless significant biological features, such as stress hormone responses, disease susceptibility, HLA groups, blood types, immunity, blood pressure, etc., vary with race. So the notion that the biological basis of intelligence could also be associated with race can hardly be said to be false a priori. The problem is that we have a conflicting imperative from political and moral theory which posits that all people are equal, and since we rate people very much by their intelligence or IQ, we cannot morally permit it to be true that IQ varies by race, or that one race may be smarter than another, regardless of what any empirical data or objective tests may show. If the data or tests ever dare to refute our moral commitment, we feed the pressure of this conflict between value and fact onto the tests -- calling them non-objective; or the methodology -- calling it flawed; or the scientists -- calling them racist. In this we are like the Roman Catholic Church telling Galileo his empirical results couldn't be true because they conflicted with the dominant value system of the time.

The proper response to all of this seems to me to be to recognize the separation of value from fact and affirm the importance of each in its proper realm. We strongly insist, ethically, that all people are legally and morally equal, and this requires us always to treat everyone with equal concern and respect. But we can also admit that it is childish to expect empirical data necessarily to correspond to our moral posits, so we should just accept whatever science demonstrates, while refusing to let it shake us from our ethical determination to treat people equally. It never was a fact -- even before the IQ controversy got going -- that all people are equal in every talent and by every measure, but it has always (at least since 1789) been a value that everyone be treated as an equal.
-
The problem with imposing a tax on carbon dioxide emissions is the same as with taxing anything that industry has traditionally just dumped into the environment as a free externality of its profit-making processes. The tax would, as stated in the OP, inflate the price (but not the value) of the goods produced, which would promote stagflation, since per-widget prices would rise (as during a fuel shortage) without economic activity increasing, since the public would just be able to afford fewer widgets after industry priced the carbon dioxide tax cost into them. The ultimate result would be just a lower standard of living. I can't think of any way to force the economy to pay for something -- like protecting the environment or avoiding global warming emissions -- which in itself is not a fungible commodity, without reducing the net wealth of the economy.

It would be the historical tragedy of 17th-century Spain all over again. Despite having assembled more silver and gold than the rest of the world combined as a result of its conquest of most of South America, instead of investing that wealth in productive enterprises, useful commodities, and capital, Spain just wasted it on better cathedrals and on paying monks to say more prayers for the souls of the dead in Purgatory. As a result, the world's richest nation in 1600 was one of Europe's poorest by 1800. Similarly, paying large sums to 'buy' things which are not commodities and which cannot produce commodities, like less CO2, mushier wetlands for the snowy owl, and cleaner water, is just a drain on the economy. Even if it amounts to a small drain, such as 2% or 3% a year, having a surplus that size for investment in research and development, reinvestment in infrastructure, or building up new productive capital is what makes an economy flourish or -- if that surplus is not available because it is disappearing into the black hole of the environment -- flounder.

Of course, cleaning up industrial processes so they pollute less will produce the much-vaunted 'green jobs,' but that will just inflate the price of the goods produced, since the same widget which used to require 10 workers to make it and dump the pollution into the fields will now require 10 to produce it and 5 to clean up the pollution, so its price has to cover 15 wages instead of 10. There will be more people employed because of the green jobs, but instead of having the whole labor force employed in making widgets that give us a higher quality of life, we will have a large fraction of the labor force employed in producing the intangible of 'less CO2 emission,' which is about as useful a commodity for making our lives better (in the way that our consumer society immediately experiences and rates value) as what the Spanish monks produced with their prayers for the dead.
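As a rough illustration of that wage arithmetic, here is a minimal sketch; the headcounts are the post's hypothetical 10 production workers and 5 cleanup workers, and the wage figure is arbitrary.

```python
# Illustrative per-widget labour-cost comparison using the post's hypothetical numbers.

wage = 1.0                    # arbitrary wage per worker per widget
workers_before = 10           # production workers only
workers_after = 10 + 5        # production workers plus cleanup ('green job') workers

cost_before = workers_before * wage
cost_after = workers_after * wage
increase = (cost_after - cost_before) / cost_before

print(f"Labour cost per widget rises from {cost_before:.0f} to {cost_after:.0f} wages "
      f"(a {increase:.0%} increase), while the number of widgets produced stays the same.")
```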
-
Half-asleep rats look wide awake
Marat replied to thinker_jeff's topic in Anatomy, Physiology and Neuroscience
What about the relation of this phenomenon in rats to highly sophisticated activity by human sleep-walkers who are not in the least conscious of what they have done when they are later woken up? There is even an English case, R. v. Harvey, of someone who murdered his wife while sleep-walking, put the body in the trunk of his car, drove off to the woods, hid the body in the forest, and drove back home, undressed, and got back into bed without even knowing what he had done. The court accepted his sleep-walking defense as proved, despite the best efforts by the Crown to convict.
-
Stonehenge was clearly built way before the pyramids, as long as we rule out those strange theories about the base of the Sphinx showing environmental degradation indicating that it was built 15,000 years ago.
-
I was surprised at the predominance of Anglo-Saxon names, which seem to make up a greater proportion of that total than I am accustomed to seeing on other rosters, such as the names of students registered to take a class. Is that because so many soldiers are Black? Also, it is striking what a disproportionate number of casualties occur among sergeants, who lead the troops directly yet are not far enough away from the front to be safe, as the officers are. But what view to take of the casualties is another question. If you see America's current wars as truly unavoidable humanitarian interventions or necessary actions in self-defense, then the roster of those who have died in these struggles seems quite different from how it appears if you see America's wars as imperialist adventures in service of America's global self-aggrandizement. But even if the latter is the correct interpretation, many of those who die even in a theoretically all-volunteer army are more victims of circumstance than international plunderers, since poverty, unemployment, lack of alternative opportunities, and limited education with its concomitant narrowing of critical perspective and intellectual horizon have all contributed to pushing them into military service.
-
Only when I have diarrhoea. Seriously, though, there is so much to do in life and so little time, I simply can't afford to run unless I can find a way to mount a brace on my chest to hold a book and turn the pages so I can get reading done while I run.
-
When I first went to university I bought all black socks so I wouldn't have to waste any valuable time sorting socks (yes, I am that crazy). Anyway, since I have also always been plagued by the missing sock problem, I doubt that buying socks all of one type will help, though it does reduce to zero the problem of mismatches created by the random loss of socks.

A similar puzzle arises with respect to the single shoe one often sees by the side of the expressway while driving along. Why is it always just one shoe, given the very small percentage of one-legged persons whose prosthetic limb does not have an artificial foot which also requires a shoe, peg-legs being out of fashion since the early 19th century? Can it really be the case that someone loses just one shoe and continues walking along the expressway without having noticed anything unusual about his gait? Is someone just hurling shoes out the car window? A similar case has occurred recently with single disembodied feet washing up in running shoes along the coast of British Columbia and Washington state. Almost all of these have been right feet, and seven or eight have appeared over the last few years. Other, matching body parts have not appeared. If only these feet had been shoeless, perhaps at least the single-shoe-by-the-highway problem could have been solved.

The sock problem, which has been quite generally observed, for some reason doesn't count as empirical evidence against the conservation of mass. It just shows how we are so strongly committed to our scientific theories that we refuse even to consider good empirical evidence against them.
-
In the old days of medicine the doctor was supposed to take a fiduciary approach to the patient and not tell him about any frightening diagnoses if the news would harm the patient's health. Now there is more emphasis on patient autonomy, and thus on the duty of the doctor to share his diagnosis, no matter how psychiatrically damaging the news might be. But if you take the view that the doctor has a greater duty to spare the patient from harm than to respect the patient's autonomy, and the doctor knows that the patient will respond to a fatal diagnosis by stupidly clinging to life out of an irrational survival instinct and thus endure unspeakable horrors, then perhaps you could say that the doctor's duty is to kill the patient by some disguised and painless method so that the patient can avoid a fruitless and horrific dying process. Wouldn't this also be a kind of medicine, taking medicine in the sense of the effort to diminish human suffering rather than to prolong life?

While I agree that it would be better not to think all these dark thoughts, the difficult thing about human existence is that we cannot force our minds to think only those thoughts which are most beneficial to our flourishing. Instead we feel called upon to understand everything, whether it is harmful or beneficial. I think this is part of the message of the Ancient Greeks' myth of Silenus, since the wise satyr begs the humans not to force him to tell them the ultimate truth of life, but they make him do so anyway, and then discover the awful truth that life is so terrible that it is better never to have been born. Perhaps all of the artificial business of life -- the compulsive shopping, self-medication with alcohol, tobacco, drugs, sex, and trinkets, the pointless accumulation of money past all genuine need for material things, the obsessive curiosity about minuscule academic points -- is just a gigantic effort at distraction from the insight gained in the myth of Silenus.

The practical point of these thoughts would be to lead people to stop having children, so that at least this misfortune would not continue indefinitely, even though we who already live are trapped in it. But whenever I have discussed this idea with people considering having children, they just look at me as though I have two heads, or, more accurately, as though I have lost my head. I am constantly puzzled that no one seems able to understand all this.
-
Perhaps, as Mr. Sceptic suggests, clothing is essentially an extension of most species' desire to retreat and hide for protection against enemies during their reproductive processes. We then subconsciously extend this urge to protect ourselves in a nest during sex and childbirth by wearing clothes to cover up all the parts related to these activities, thus carrying our protective nest with us, at least symbolically. Society may also find male arousal threatening, and so the less the public can see of it the better. It is actually a criminal act of public indecency in some jurisdictions for a male to be seen in public fully clothed but with an obvious erection under his trousers. Since this act can be involuntary, it is a crime of strict liability which does not require criminal intent. Of course it is also no longer regarded as an act of public indecency for women to show their breasts in public while nursing, so we may wonder why the voluntary choice of a woman to do something that disturbs the public environment is legalized while the involuntary act of a man which similarly disturbs the public environment is criminalized.
-
cultural normativity vs. scientific rationalism
Marat replied to lemur's topic in General Philosophy
The sociologist Max Weber characterized modern society as distinguished by its greater rationalization than previous eras, so the force of objectivity is now relatively greater against the power of cultural norms. In earlier times, by contrast, norms could survive much longer against the pull of conflicting scientific evidence. For example, there were Jesuit universities throughout Europe in the 18th century which, a century after Newton's 'Principia,' persisted in teaching Aristotelian physics, since it was held at the time to be more congenial to Roman Catholic doctrine. Now those who cling to beliefs because of their normative associations even though their basis has been scientifically disproved are social outsiders, such as modern Creationists. However, it is worth keeping in mind that both facts and values are real for us. You could say that it is just as objectively certain that murder is wrong as that the boiling point of water at sea level is 100 degrees C. Kant had the proper solution to the relation between science and moral value when he described them as parallel interpretations of reality, each offering a valid perspective in its own terms, but each confined to its own terms, so that they could not conflict. Thus both law and morality stipulate that all people are equal even though, as a matter of fact, they always differ. We can affirm both truths simultaneously, though from differing perspectives.
-
If you look at the profoundly cumbersome way the Ancient Egyptians handled fractions; the remains of intertwined ropes which testify to the extremely ordinary way they moved the blocks which composed the pyramids; the standard templates they employed for drawing prescribed shapes to compose images of humans and animals on temple walls; the stupidly expressionless faces of people they depicted; the bizarre mix of a little empirical insight and a lot of sheer magic in their medical texts; and their way of indicating fear in depicted figures by showing them defecating, they don't seem very smart, for all their practical engineering skill.

There have been recent attempts to try to attribute all of Ancient Greek wisdom to Egyptian influence (e.g., the book 'Black Athena'), arising out of the odd notion that this would somehow empower oppressed Blacks against modern Whites by its remote implication of Black racial superiority (though the Egyptians depicted themselves as tan, and depicted Sudanese as Black in contrast), but the uniquely critical, sceptical, and enquiring Greek style of thinking just doesn't match with that of the more superstitious Egyptians.
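As an aside on just how cumbersome purely unit-fraction arithmetic becomes, here is a small Python sketch of the classic greedy decomposition (Fibonacci's method, offered only as an illustration, not as a reconstruction of any actual Egyptian procedure): it expresses an ordinary fraction as a sum of distinct unit fractions of the kind Egyptian scribes worked with.

```python
from fractions import Fraction
from math import ceil

def unit_fraction_decomposition(frac: Fraction) -> list:
    """Greedily decompose a fraction in (0, 1) into distinct unit fractions."""
    parts = []
    while frac > 0:
        denom = ceil(1 / frac)          # smallest n such that 1/n <= frac
        parts.append(Fraction(1, denom))
        frac -= Fraction(1, denom)
    return parts

# Even simple fractions turn into awkward sums, e.g. 2/7 = 1/4 + 1/28,
# and 5/121 explodes into unit fractions with enormous denominators.
for f in (Fraction(2, 7), Fraction(5, 121)):
    print(f, "=", " + ".join(str(t) for t in unit_fraction_decomposition(f)))
```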
-
I agree that the human persistence in life is illogical, given its actual and potential horrors, and the fact that the threat of its potential horrors always exists as a background terror to everything that happens. Unfortunately, we are trapped by our animalistic drive to continue living no matter what the cost, and this is perhaps our greatest misfortune. You would be overwhelmed to see old people dying in the hospital from the most terrifying medical conditions -- with more pieces missing than present, and the remaining pieces swollen, deformed, and rotting with decay -- but nonetheless clinging to their utterly meaningless life-as-perpetual-torture. How do we keep ourselves from finding ourselves in that predicament someday, when our own stupid, tasteless, but overpowering will to live becomes our own worst enemy? I have often thought that terrible medical diagnoses should always be delivered to the patient not as an actual verbal diagnosis, but instead as a massive bolus of morphine secretly delivered before the patient notices what is happening, so that there would be no need for the patient to experience the terrifying news, to undergo the slow and horrifying decline into death, and finally to die via the usual ('life sustaining') medical torture delivered at the end of life.

At 18 it might be tempting to say that you can make of life what you want by choosing how to view it. This was the idea of the Ancient Stoics, encapsulated in the line from Shakespeare's 'Hamlet': "Nothing is good or bad, but thinking makes it so." However, the sad fact is that because we are existing beings, actually thrust into the reality of life, rather than just thinking beings contemplating life from the outside, whatever happens to us grabs us with such intimate, visceral force that we cannot think it away into the interpretation that we want.
-
I think the original Biblical sense of the nudity and then the shame of Adam and Eve over their nakedness was that as they became more morally sophisticated after eating of the tree of the knowledge of good and evil, they became capable of understanding shame and so adopted clothing. I.e., metaphorically, civilization brings both knowledge and awareness of self, and with awareness of self comes shame, and with that the concern about good and evil, salvation and sin. In its own way, the Old Testament offers an interesting, condensed, mythological version of historical anthropology. Modern clothing seems important for reasons other than temperature control, since even in extremely hot climates people still feel the need to wear clothes. I lived almost right on the Equator for a while, and few people were willing to go around in anything less than short pants and thongs, and women also always wore shirts. A more curious case is that of genuinely primitive tribespeople, among whom the males still wear a codpiece even though they are otherwise nearly naked. What is this for? Are they hiding their shame, avoiding accidental sexual stimulation, or trying to protect themselves from injury?
-
Apart from this statistical cat-fighting, we can at least distinguish the reasoning of atheists about the possibility of alien intelligent life existing on other planets from that of theists about the possibility of God existing. Atheists 'believe' in alien beings as a possibility, not as a certainty to which they are unshakeably committed. The use of the common word 'belief' for the two types of mental orientation confuses the issue, because for atheists with respect to alien beings 'belief' means 'having some confidence that it could be possible as an ordinary physical reality,' while for theists with respect to God, 'belief' means 'having an unshakeable moral commitment to the reality of something whose existence could never be empirically demonstrated, for which no ordinary, objective, empirical evidence can be cited, which contains supernatural aspects, and which appears self-contradictory in its infinite power and goodness while presiding over an evil world.' The atheist's belief in aliens is more like believing that there might be a new and as yet undiscovered species of giraffe in Africa than like believing that a supernatural, metaphysical entity unlike anything else we have ever known to have empirical reality actually exists.
-
This is like that old saw of philosophy classes: is it better to be a happy pig or a miserable Socrates? Eating of the tree of the knowledge of good and evil is a fall in the sense of entering a more sophisticated type of existence where one's moral status is at stake, as opposed to persisting in a minimally conscious, barely self-aware, animalistic state. But it is also an improvement, since one becomes a more important type of entity, able to make significant moral choices, though having to face the consequences. I think almost everyone would recognize the more sophisticated state as the better option, and the fact that God punishes people for choosing the better option reflects poorly on his own sense of value. Another way to concretize this choice is to imagine a world where nearly the whole human population could be permanently hooked up to a device which constantly stimulated the pleasure center of everyone's brain, so that we would all be perpetually and deliriously happy. Only a tiny subset of the population would be exempted from this state and assigned to maintain the pleasure-inducing machinery. The price for living in such a world, however, would be that no one would ever write a Beethoven symphony again; there would never be another Kant or Einstein; and we would all just be happy pigs. On the other hand, the enormous misery and terror of life would also be avoided. If you love Beethoven's music and you've worked in a cancer ward, this question never goes away, but you become increasingly inclined to answer it in favor of the happy pig option.
-
Tar, how do you distinguish your definition of love from the definition of profound empathy? Or do you not want to make a distinction?
-
I experience this phenomenon of dream anticipation of future reality all the time, but it is of no real use in waking experience, since the connections between my dream material and my waking life only become clear after the corresponding events occur in the waking state. But I wonder if there is a natural way to explain these apparently mysterious coincidences. For example, of the thousands of things we dream in any given week, we forget all those which are not confirmed by later, waking experience, but we do remember those which are, so we seem to be getting more hits than is actually the case. Also, there are no clear rules about what counts as a hit, so a merely symbolic link between dream and waking experiences, or a link between the two on the basis of some very minor detail common to both, is also allowed to count, and this may exaggerate the apparent prevalence of hits.

Another factor to consider, first noted by Poisson, is that there are always many more pairs in any collection of items than the number of items would suggest, and this too statistically inflates apparent coincidences. Thus in a group of just 23 people there is a 50% chance that two of them will share a birth date (month and day), even though there are 365 possible birthdays to be matched. Also, since there are no clear rules about when the dream 'predictions' have to be confirmed, and no set time span after which the non-occurrence of a confirmation counts as a disconfirmation, the system is prejudiced in favor of finding confirmations. Finally, given the vagueness of remembered dream images, perhaps we count too many near misses as hits, since the permissible breadth of the similarity space for matching dream images with real-world images is not precisely fixed.
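The 23-person birthday figure mentioned above is easy to verify directly; here is a minimal sketch of the standard calculation, assuming 365 equally likely birthdays and ignoring leap years.

```python
# Probability that at least two of n people share a birthday (month and day).

def shared_birthday_probability(n: int, days: int = 365) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (days - i) / days
    return 1.0 - p_all_distinct

print(f"n = 23: {shared_birthday_probability(23):.3f}")   # ~0.507, just over 50%
# The number of distinct pairs grows as n*(n-1)/2, which is why coincidences
# become likely far sooner than intuition about 365 possible dates suggests.
```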
-
Long-term effects of electrocution
Marat replied to kej's topic in Anatomy, Physiology and Neuroscience
You don't say how old you are, which could be important in suggesting other possible diagnoses. The fact that you complain of racing thoughts keeping you awake suggests there might be some emotional issues that are also disturbing your thought processes. Falling asleep and staying asleep require a lot of positive activity from a healthy nervous system, so there may be some sort of background neuropathy operative in your case. Keep in mind that many brain/neurological issues don't have their ultimate cause in the nervous system, but instead in apparently unrelated medical problems, such as metabolic disturbances altering body chemistry, which in turn have a neurological impact. Obviously only a clinical exam can settle all these issues.
-
Something now often proposed as an alternative to patent protection is public bounties for new inventions, treatments, or drugs. Just as England in the 18th century offered a bounty to anyone who could develop a method for accurately determining longitude at sea, so too modern governments could offer a bounty for developing an effective but non-toxic immunosuppressive drug, a better treatment for cancer, a cure for viral diseases, etc. This way the public interest could better control which products are developed and could also manage the cost parameters.

In general, though, the limit to the free market in medicine is that medicine has to serve goals of humanity which cannot be adequately represented by profit mechanisms. Thus it has long been the practice of drug companies to make and store so-called 'orphan drugs,' that is, drugs for rare diseases which are simply not financially worthwhile to produce. They do this because they are pressured by governments to act in the public interest in this way, and they agree since it is not too expensive for them, though it is unprofitable. If drug companies were fully socialized, however, the entire development of new treatments could be 100% in the public interest, rather than only a fraction of it as it is now. As long as the system is private, we get absurdities such as have occurred over the last decade, when new drug development has largely been an expensive effort to circumvent other companies' patent protections by making copy-cat drugs which represent absolutely no added benefit to humanity.