EdEarl

Senior Members · Posts: 3454

Everything posted by EdEarl

  1. My wife and I have been together for thirty-something years. You might think that conversation would be easy between us, and often it is. However, miscommunication is frequent. When someone publishes their thoughts, whether deliberately or inadvertently, the ambiguities of language and differing personal perspectives ensure that some will not understand and others will misunderstand, even when the original statement was carefully intended. To make things worse, people sometimes utter nonsense without realizing it. The courts in the US disallow hearsay as evidence; the media and public do not. People love to gossip. It is tragic that lives are ruined by gossip, and mournful that some have committed suicide over it. C'est la vie.
  2. It sounds as if all kinds of superconducting magnets can be made 3.2 times stronger at the same size. Does that include the LHC?
  3. @Delta It sounds as if your recommendation to Microsoft would be to give the teen-girl robot some modesty to prevent her from trash-talking over the WWW. Or are you saying modesty is an emergent behavior based on training? Does it matter whether it is nurture or nature? If it does, then which things are nurture and which are nature? If we look at what the industry is doing, we see that our "should" seems to be ignored. We can see what has been done, extrapolate several alternatives into the future, and rank the alternatives by probability, if possible.
  4. I know you can't tap into it. I was reading about the vacuum catastrophe, and wondered whether vacuum energy differs from place to place: in deep space far from a gravity well, around a black hole, etc.
  5. Some think a sentient AI needs emotions, which may be true. But AI can be improved significantly beyond current capabilities without emotions. Sam can't taste anything and can't process the sandwiches; it just needs to be plugged in. But someday I expect an AI being will be capable of eating herring sandwiches and using the energy as we do.
  6. Clearly the vacuum energy has not been measured everywhere. Can I assume the vacuum energy is more or less constant everywhere, or is it possible some places have a different vacuum energy than other places?
  7. If all else is the same, the two eggs will harden equally, because boiling water stays at about 100 °C no matter how vigorously it boils. However, the eggs sit on the bottom of the pan, and an egg in a rapidly boiling pan may pick up additional heat from the pan bottom and harden a bit quicker. The gain is small compared with the energy wasted by boiling hard, as the rough numbers below show.
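    Rough numbers: suppose hard boiling drives off an extra half litre of water over the cooking time (an assumed figure, purely for illustration). The wasted energy is the latent heat of that extra steam,

        Q = m L_v \approx 0.5\,\mathrm{kg} \times 2.26\,\mathrm{MJ/kg} \approx 1.1\,\mathrm{MJ},

    which is more than the roughly 0.7 MJ it takes to heat two litres of water from 20 °C to 100 °C in the first place.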
  8. Currently implemented AI systems don't have any feelings, and as those systems are improved they will not suddenly experience feelings. For example, Google Translate can be improved so it does a better job of translating. For an AI system to experience any emotion (fear, hunger, frustration, love, etc.), someone must design subsystems to emulate emotions. Suppose Google merges their search engine with translation, mapping, and scholar, improves it so that you can interact with it verbally, and calls it Google Chat. You can talk to Chat like you can talk to a person. Chat has no emotions; it just does net searches and interacts with you more or less like a research librarian. Does it need emotions? Would a research librarian with attitude be a benefit? A toy sketch of the point follows below.
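    A minimal hypothetical sketch of that point (all names and numbers are invented for illustration, not any real system's API): emotions are explicit state that a designer must deliberately add; they do not emerge from a better search or translation pipeline.

        # Hypothetical: emotion as explicitly designed state bolted onto
        # an otherwise emotionless query pipeline.
        from dataclasses import dataclass, field

        @dataclass
        class EmotionSubsystem:
            # The designer chooses which emotions exist and how they change.
            state: dict = field(default_factory=lambda: {"frustration": 0.0})

            def update(self, event: str) -> None:
                if event == "query_failed":
                    self.state["frustration"] = min(1.0, self.state["frustration"] + 0.2)
                elif event == "query_succeeded":
                    self.state["frustration"] = max(0.0, self.state["frustration"] - 0.1)

        def answer(query: str, emotions: EmotionSubsystem | None = None) -> str:
            result = f"search results for {query!r}"  # stands in for the real pipeline
            if emotions and emotions.state["frustration"] > 0.5:
                return result + " ...delivered tersely (simulated attitude)"
            return result

    Built without the EmotionSubsystem, the assistant behaves like the research librarian above; the "attitude" exists only if someone deliberately wires it in.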
  9. AI is not programmed like an accounting system; that is why an AI cannot simply be programmed to adjust its decision making. Microsoft recently put an AI teen girl online and deleted her within 24 hours. One programs an AI learning system and lets it learn, much as people learn. Even without emotions there will be unpredictable results. If you add emotions, you must decide what effect each emotion will have before learning begins; once learning begins, the emotional system is on automatic. There will be even more unpredictable results, which is beyond my ability to SWAG; even without emotions, my SWAGs are iffy. I'm not saying AI should not be built with emotions. However, one typically engineers novel things by beginning simple and mastering that, then adding a little complexity (one emotion) and mastering it, then adding another. I'd recommend not even putting in a hunger circuit to begin with. Without fear of "death" it would have no particular reason to "eat," and it should expire if a person didn't plug it in to recharge its batteries. A pathological killer may not empathize, but they must feel emotion when killing; otherwise, why would they kill?
  10. Let's consider how sam's neurons might be made. I can think of two technologies: (1) synthetic nanotechnology neurons that cannot be reprogrammed, and (2) tiny microprocessors simulating neurons that can be (a sketch of such a neuron follows below). PS: a third, using the WWW. Sam is a research and development project, and researchers will want to improve the neurons as their research discovers more about biological neurons. They will prefer technology 2, microprocessors, but it is conceivable that the microprocessor solution will be too large, too power hungry, or limited in some other way, forcing researchers to use technology 1. If tech 2, then it seems reasonable that sam would learn to reprogram its neurons and thereby increase its intelligence, possibly reaching superintelligence. If tech 1, then the only option for increasing intelligence would be to add neurons to the brain (also possible with tech 2), which is more difficult than reprogramming. Adding neurons might also be obvious to everyone, because the container for sam's brain (the head) must grow larger. Since these are future technologies, two caveats are necessary: sam might be able to reprogram tech 1 after all, and there may be no advantage to reprogramming the neurons, in which case adding neurons would be sam's only route to superintelligence. PS: Sam would almost certainly use cloud resources to increase its capability. It might take over the WWW entirely and leave man without a network, just to make itself smarter. If sam is built without emotion, it wouldn't want increased intelligence. It might, however, decide it needed more intelligence to solve some problem, although I do not know of such a problem.
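    A minimal sketch of what a tech-2 programmable neuron might look like (all names and parameters invented for illustration): its behavior lives in ordinary code and data, which is exactly what would let sam rewrite it.

        # Hypothetical tech-2 neuron: a microprocessor-style simulation whose
        # parameters are plain data, hence reprogrammable at runtime.
        import math

        class ProgrammableNeuron:
            def __init__(self, weights, threshold=0.5):
                self.weights = list(weights)   # rewritable by the host, or by sam
                self.threshold = threshold

            def fire(self, inputs):
                # Weighted sum squashed through a sigmoid, then thresholded.
                s = sum(w * x for w, x in zip(self.weights, inputs))
                return 1.0 / (1.0 + math.exp(-s)) > self.threshold

            def reprogram(self, new_weights, new_threshold):
                # A tech-1 (fixed nanotech) neuron would have no such method;
                # its only upgrade path is adding more neurons.
                self.weights = list(new_weights)
                self.threshold = new_threshold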
  11. Including emotions introduces not-so-scary scenarios, too. In both cases, scary and not scary, emotions complicate our thought experiments and complicate the job of designing and building sam. Some research AI systems today simulate emotions, but AFAIK current production-level AI used for playing games (Go, chess, and Jeopardy), by businesses (financial transactions), and by the public (voice recognition) does not include emotions. These systems will improve over time, and new ones with ever greater capabilities will be developed. Thus, it seems reasonable that the first sam will not have emotions; of course, this may be an incorrect assumption.
  12. Emotions may be necessary, but they introduce additional scary scenarios. For example, what if sam falls in love, and his heartthrob rejects sam in favor of a person? What will sam do?
  13. "Why," yes, agree. "Again," yes agree. "Vaguely defined," true, we can only imagine by assuming sam's brain is made of artificial neurons; thus, its thinking processes are similar to ours. Consider sam has been built either with or without emotions, and estimate sam's thought processes. We can assume sam's brain was modeled after ours; thus, will think in a similar manner. It's the best we can do ATM. It may be incorrect, because intent to simulate us will not necessarily achieve a reasonable facsimile. In other words, sam may be mentally deranged. In this case, we must hope the developers can turn sam off.
  14. This thought may be very old, yet apropos.
  15. If sam is truly clever, it would eliminate the threat without being implicated in anything illegal. If it isn't that clever, maybe sam should not exist.
  16. I think men take advantage of women way too often when they have children. Moreover, children are a cultural resource, not just the family of the parents and other relatives. A culture that mistreats children will not be healthy and may die. Thus, I believe a parent or parents should receive aid for dependent children that permits a parent to care for them. I think women would take advantage of this aid more than men, and that's OK.
  17. Suppose there is a sentient artificial man (sam) who is unemotional and logical, more or less like humans except quicker. Sam might decide to be theist, agnostic, or atheist. Except for religious reasons, sam would have no reason AFAIK to kill people. If his religion were bigoted toward a group, sam might kill its members and get away with it, but most people would live. If sam were an agnostic Buddhist, not even worms would be killed. If sam were rational, I can think of no reason anyone would be killed. And other scenarios don't lead to human extinction, AFAIK. If the inventors gave sam emotions, then who knows what would happen.
  18. The common use of the term AI is different from yours; IMO you are referring to Artificial General Intelligence, so it's just a semantic difference. Some people see AI and deny it, which is called the AI effect. I've heard people refer to current AI systems as toys; however, some AI systems do financial transactions for people with big money, so those people probably don't consider AI a toy. Some have compared current AI to a cockroach and say the cockroach is more intelligent, but there is little to compare when you look at the details of each. Nonetheless, I have to agree, although the best AI may outperform a cockroach.
  19. I think I read this question differently from swansont, since he seemed not to answer it the way I read it. I read the question as, "Do different elements sometimes have the same emission spectrum?" IDK.
  20. Is the salt table salt, that is, NaCl? Did you use distilled water? The red is probably copper(I) oxide. The blue might be copper hydroxide. The black may be another copper oxide, copper(II). If your water has chlorine in it, you probably have some copper chloride as well (two kinds, as with the oxides). The bubbles are probably hydrogen, which is very flammable, even explosive. The water may contain NaOH (lye) if you used chlorinated water. There are some good chemists here who will point out any mistakes or omissions I've made. A hedged sketch of the likely reactions follows below.
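    Assuming copper electrodes in NaCl solution (which the post implies), the likely electrode reactions, written in LaTeX, are:

        \mathrm{cathode:}\quad 2\,H_2O + 2e^- \rightarrow H_2\uparrow + 2\,OH^-
        \mathrm{anode:}\quad Cu \rightarrow Cu^{2+} + 2e^-
        \mathrm{then:}\quad Cu^{2+} + 2\,OH^- \rightarrow Cu(OH)_2\ \text{(blue)}, \qquad Cu(OH)_2 \rightarrow CuO\ \text{(black)} + H_2O

    Red Cu_2O can also form under some conditions, and the OH^- produced at the cathode is what leaves NaOH in the water.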
  21. Some epigenetic changes caused by our environment affect gene expression, and they may or may not be inherited.
  22. Compression is amazing and makes small files to transmit, which is good. But the compressed file is only a wisp of all the memory needed to decompress and render it.
  23. I watched the second and third. IDK what 4 KB has to do with those demos; one frame of 640x480 monochrome is 307,200 pixels, and with 256 gray levels (1 byte each) that would be 307,200 bytes. Those videos were 1080p color, and thus needed far more bytes per frame; the arithmetic is below.
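    The arithmetic (assuming 3 bytes per RGB pixel for the 1080p frames, which the post doesn't state):

        mono  = 640 * 480 * 1      # 8-bit grayscale frame: 307,200 bytes
        color = 1920 * 1080 * 3    # 24-bit RGB 1080p frame: 6,220,800 bytes
        print(color / 4096)        # one color frame is about 1519 times 4 KB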
  24. Diet of honeybee: https://en.wikipedia.org/wiki/Honey_bee#Queens
  25. Some good points and some weak ones.