fredrik
Senior Members
  • Posts: 532
  • Joined
  • Last visited

Everything posted by fredrik

  1. I'm no biologist, but from what I know the general relation between health, physiological fitness and food source is quite complex, and sometimes what the best food is even depends on the individual's specific endeavors, physical activities and environment. Also, in general, like ecoli said, an organism's preferred food need not have the same composition as its own biomass. I guess you can "live on" eating humans, but exactly how eating only such things would affect your health and general fitness must be hard to predict? The meat and fat certainly give energy and nitrogen. About all the vitamins and such, though, I am not sure. But asking what the optimum food is seems like a very hard question? Can anyone answer that? /Fredrik
  2. Is that popular in the US, btw? Blue Moon? From what I recall, in the Illinois area there were a lot of beer signs in restaurants marketing Blue Moon? /Fredrik
  3. If I had to pick one favourite type of beer it would be the Belgian abbey ales. I visited the US a couple of years ago and got hold of this beer called Blue Moon, supposedly a Belgian-style wheat beer, but I sense it was a disgrace to Belgian beer. /Fredrik
  4. A simple, specific living example of osmosis in relation to cells: draining the cell of water is obviously a physiological stress condition, and for example brewers' yeast is regularly stressed in high-gravity fermentations, because the extracellular sugar concentration is so high. Part of the physiological stress response to "fight" the osmosis is that the yeast starts to synthesise glycerol in the cytosol, which counteracts the osmosis. This is part of why high-gravity wines and beers tend to have higher residual levels of glycerol, giving an additional sweetness. But as with most stresses in cells, there are several responses, both long-term and short-term. One long-term response to osmotic stress is also a change in the fatty acid composition of the membrane lipids, as this changes the physical properties of the membrane. The chemistry and biology of various membranes is one of many key topics in living organisms. Since my other hobby is beer fermentation and brewers' yeast, I've spent some time analyzing the logic of these things a bit. /Fredrik
  5. I visited the Oktoberfest in Munich last year for the first time, and I never pissed that much during a single night. They only had the 1-litre glasses, and as soon as one was emptied there was another one. It was an interesting and unforgettable experience to see an entire city revolve around beer. On our way back to the hotel the subway was jammed due to all the people that had been to the beer feast. OTOH, for enjoying a beer at its best I still prefer a regular German beer restaurant. The Oktoberfest was more nice German culture than beer IMO. I won't forget the 2000 Germans in the tent I was in, standing up on the tables and dancing synchronously, all of them having consumed a few litres of beer each /Fredrik
  6. Yes, given that you have a semipermeable membrane, the concentration gradient can be said to cause the osmotic flow, but it's the solvent concentration or "activity" that matters. Equilibrium is when the solvent is equally likely to diffuse in either direction and there is thus no net flow. Chemists would say the chemical potential of the solvent is equal on both sides of the membrane. Or rather, it's the "solvent activity" that matters, because there can be multiple solutes, and they can differ on the two sides of the membrane. So the solvent diffusion is determined not by the solute gradient, but by the "solvent gradient". Passive transport, diffusion and osmosis across membranes are controlled by what's thermodynamically and kinetically favourable. Compare with "communicating buckets": there is flow until the levels are equal. There is also a kind of facilitated transport that may change the kinetics but not the thermodynamics. Active transport is the transport of molecules against an otherwise thermodynamic gradient, or transport of molecules that simply cannot permeate the membrane itself. Living organisms can power such active membrane transport channels in various ways. Regardless of the mechanics, these active transports often make up a significant part of the cell's "energy bill". This means you can pump uphill by using a powered pump, or just complement the passive transport in cases where time is expensive. Yeasts, for example, can utilize various sugars, but each sugar has its own transport cost. This often means that sugars requiring expensive transport mechanisms give lower biomass yields, and their uptake is often repressed in the presence of "cheaper" food. /Fredrik
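To pin down "equal chemical potential of the solvent on both sides" in symbols, here are the standard textbook relations (my addition, not something from the thread): the solvent chemical potential in terms of its activity a_w, the equilibrium condition across the membrane, and the dilute-limit van 't Hoff estimate of the osmotic pressure, with c the excess solute concentration:
[math] \mu_w = \mu_w^{\circ} + RT \ln a_w, \qquad \mu_w^{(1)} = \mu_w^{(2)} \;\; \text{(equilibrium)}, \qquad \Pi \approx c\,RT [/math]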
  7. Do you think string theory (if proven right) would qualify as such a "new field"? /Fredrik
  8. If you were looking for a number... In the "quantum gravity domain", for which there is not yet a satisfactory theory, there is also a scale (the Planck scale) below which the question of "being made of" is thought to no longer make sense. This is of the order of 10^-35 m. The basic "idea" is that the energy of the radiation needed to probe such small distances is enough to create a microscopic black hole - which would certainly blur things, because probing the interior of a black hole is not thought to make sense. It is not possible to distinguish or define the notion of points or structure inside a black hole, hence the supposed limit of resolution regardless of the amount of energy supplied. This is also part of the motivation for some theories that suggest that any finite volume of spacetime is discrete: because points far less than a Planck length apart are thought to be effectively indistinguishable to us, and thus to any other object in the universe interacting there, spacetime is "effectively discrete". Some theoretical speculations also exist that elementary particles can be identified with some of these small quanta. All of this is just speculation though. Some people have toyed with the idea that an electron might be a small black hole with charge and spin. /Fredrik
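As a rough sanity check of the 10^-35 m figure quoted above (my own sketch, not part of the original post), the Planck length l_P = sqrt(hbar*G/c^3) can be computed directly from the usual constants:
[code]
import math

hbar = 1.054571817e-34   # J*s, reduced Planck constant
G    = 6.67430e-11       # m^3 kg^-1 s^-2, Newtonian gravitational constant
c    = 2.99792458e8      # m/s, speed of light

l_planck = math.sqrt(hbar * G / c**3)
print(f"Planck length ~ {l_planck:.2e} m")   # ~1.6e-35 m, the 10^-35 m order of magnitude
[/code]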
  9. Sometimes people complain about Wikipedia, and while I'm sure there is as much crap on Wikipedia as there is in papers, books and TV, I really like the philosophy of Wikipedia. I think it is a great initiative and resource, and anyone who wants to propagate their understanding can contribute. There is no need for everything to revolve around money, and I think being expensive by no means equals quality. I think commercialisation can sometimes even be detrimental to quality, because the driving measure is profit, and you don't know what the purpose of the author is. Even in cases where things aren't plain false, they can be heavily distorted and presented in whatever way serves the purposes of the author - be it political change or profit. In the age of the internet, the importance of making our own evaluations and consistency checks of information quality can't be overstated. This applies to free as well as non-free media, and I think it ranges from science to politics. If I am to complain about anything, it's the general lack of evaluation of the information itself (beyond evaluating the source: is it free or expensive? is the author famous?). The fact that people trust others without forming their own opinion is far more dangerous than the fact that people propagate odd or sometimes false information. /Fredrik
  10. Are you thinking of quantized changes? There are some ideas related to this: http://en.wikipedia.org/wiki/Spin_network I think the basis for the thinking is that a surface - a 2-dimensional set - can either be thought of as a connection or correlation between two different 3-dimensional sets, or as a deviation between two 1-dimensional sets. For any given confidence level, there is a minimum confidence interval. Typically 100% requires infinitely many samples, just like determining position exactly in QM requires infinite momentum. In a certain sense there is an analogy between sample size/confidence and momentum; I think it can even sort of be identified with the confidence. So the confidence level is implied in your prior, in which case it almost seems trivial. This is interesting stuff... there are, I think, several opinions on this and most of it is technically speculation, but there's little doubt that it will mature eventually. /Fredrik
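A small illustration (my own, with made-up numbers) of the "100% requires infinitely many samples" remark: for a sample mean with known sigma, the normal-approximation interval half-width is z*sigma/sqrt(n); it shrinks like 1/sqrt(n), while the critical value z blows up as the confidence level approaches 100%:
[code]
from statistics import NormalDist

sigma = 1.0
for conf in (0.90, 0.99, 0.999999):
    z = NormalDist().inv_cdf(0.5 + conf / 2)          # two-sided critical value
    for n in (10, 1000, 100000):
        print(f"conf={conf}, n={n}: half-width = {z * sigma / n**0.5:.4f}")
[/code]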
  11. So, everything is physics? Yes - how would my brain, as a subset of the physical world(?), be able to devise a question that is non-physical? If that is your opinion, I think it indirectly answers my first question too. Thanks. But some people tend to have stricter thinking; I'd be interested to hear the other side as well. /Fredrik
  12. This isn't a speculation, it's merely that I am curious to hear how people think of the future of "physics". It seems that different people think of the future of (theoretical) physics in different ways and have different visions that are more or less abstract. For example, chemistry is explained by the underlying physics, but will the future grand theory that explains physics as we know it still be physics, or will it give birth to a new field, or fit better into another field relating to physics, a bit like chemistry relates to physics? Considering many of the current speculations... quantum foam relating to a collection of almost indistinguishable neighbourhoods, black holes being equated to maximum-entropy objects... speculations about relations between the quantum foam and virtual black holes... new attempts at random dynamics, where many believe the first principles of physics will be more like principles of a general stochastic information and learning dynamics, based on a lot of probabilistic and very basic generic first principles... How many of you here believe that the future of physics will be explained in terms of such "kinds of" first principles, which might as well fit into other fields of science like artificial intelligence or stochastic information processing? Or do you think that all of that may be interesting but is really not "physics", and rather just toying, and that future first principles will explain, more or less "hands on", what the material world we experience really IS? As usual I apply the universal apology that this may seem fuzzy, but for those who think the question makes sense I am curious to hear from everyone, also the experts here, what they think. I'm more interested in the attitude than in the specifics of various approaches - what kind of shape do we expect from the next generation of major unification? /Fredrik
  13. I sensed the article was too focused on the "hands-on visual spacetime" for my taste. The last section on scale relativity was very thin, and I never looked into it. I am also not sure I share the obsession with fractals - the assumption that whatever you come up with needs to be in the language of fractals. If it turns out to be fractal, I will certainly accept it. If not, I'll accept that too. Until we know, anything remains possible, right? But right now I see no reason to spend the rest of my life doing nothing but fractal studies, trying to find the fractal patterns in the universe. It seems the author needs an underlying first-principles argument that goes beyond this: one that argues WHY the universe "MUST" be fundamentally fractal and defines more specifically what a fractal construction of the universe means, and in what setting. It's apparently a fuzzy formulation, and this argumentation seems to be missing? I think there is a set of universal and possibly recursive rules for how the universe evolves. Whether this turns out fractal-like in some abstract way, or just shares fractal-like properties, I don't know. /Fredrik
  14. Did you try thiosulfate or bisulfite to reduce the chlorine and chloramines into chloride? This is also used by brewers to treat brewing water, because chlorine can otherwise form extremely potent flavour-active chlorophenols that are detectable as chlorine/plastic/phenolic flavours at the ppb range. /Fredrik
  15. I haven't seen the article, but a wild guess is maybe something like an "information fractal"? No matter what scale you're at, it can be given an abstract information interpretation that gives you the distinct déjà vu feeling that you've been there before, though you really haven't - the abstraction of the place is repeating. /Fredrik
  16. In the everyday world, and in the classical physics domain, the relations that sort of define spacetime are known with such confidence that spacetime appears almost rigid and "classical". In that case, treating spacetime as a kind of background is sufficiently accurate. Since humans live in this domain, human "common sense" sometimes has a hard time even imagining that the frames of reference we thought to be rigid are in fact not so. If we go down the confidence scale, at some point the uncertainty in our references becomes so massive that some of them start to break down, and at some point we simply throw them away. What is the use of a meter stick that only gives close to random readings anyway? You might as well throw it away and say that the concept of distance is really useless, or close to useless. /Fredrik
  17. One of the issues many people realize when it comes to attempts at quantum gravity is that the concept of some background spacetime really doesn't make much sense. In a limited sense, and mathematically, it could technically be treated as such, but there doesn't seem to be a way to distinguish any particular choice of background from first principles. What would the logical basis be for choosing a specific background? I think it makes sense. Spacetime can be identified with the connections or relations themselves. Spacetime IS the abstraction of relations. If the relations are unclear or chaotic, so is spacetime. Relations themselves can be thought of as correlations. In that sense, the way I see it, the one sensible choice of reference that comes to mind is chaos, or zero information, because this can IMO be philosophically quite well motivated without needing to rely on ad hoc stuff. This way spaces can intuitively be understood to emerge out of chaos, as the relations between whatever patterns and objects we start to distinguish become more and more confident. /Fredrik
  18. Here is a link defining temperature (or beta = 1/kT) in terms of an energy derivative of the number of microstates possible given the macroscopic constraints: http://en.wikipedia.org/wiki/Thermodynamic_beta Also check the various "statistical ensembles". /Fredrik
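In symbols, the definition on that page reads roughly as follows (my own transcription; Omega(E) is the number of microstates compatible with the macroscopic constraints):
[math] \beta \equiv \frac{1}{k_B T} = \left( \frac{\partial \ln \Omega(E)}{\partial E} \right)_{N,V} [/math]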
  19. Interesting questions. Statistical mechanics is good stuff, and I think as a student any effort you put into it is going to pay off later; when you get into QM and QFT, these things come back. There are various interesting ideas in different fields of research where many things can be given statistical interpretations "analogous to temperature", in terms of information fields. The concept of disorder is, I think, very fundamental in general, so hold that thought. It might be worth spending a few thoughts even on gravity and time in these terms, and deciding for yourself whether you think it makes sense or not. /Fredrik
  20. I see. I am not very up to date on all the experiments that have been done on that. But if one is to exploit anything that doesn't come with a 100% confidence level (which never happens in reality anyway), I guess "proof" is a strong word. This is even one reason for my previously expressed attitude on the matter, so I think you may have a point. Lately I haven't spent that much time with the Bell stuff because I didn't consider it that much of an issue, but some issues that I think have been beaten over several times are that the original formulation of the Bell inequalities is, first of all, unclear about the probabilistic framework. They start to talk about probabilities without defining the settings and priors. Some papers also assume that the conditional probabilities of the detections on the hidden variable are integrable. There are a lot of things that could be questioned, and I think there are many papers on that. For sure I know there are papers rectifying Bell's original use of probability. ( def. P(x|y) := the probability of x, given that we know y ) Some papers start out like this: [math] P(A \cap B) = \int P(A|\lambda, B) P(B|\lambda) P(\lambda) \, d\lambda [/math] The locality assumption is that [math] P(A|\lambda, B) = P(A|\lambda) [/math] giving [math] P(A \cap B) = \int P(A|\lambda) P(B|\lambda) P(\lambda) \, d\lambda [/math] Now where is the prior assumption of entanglement? The whole construction relies on there being a relation between A and B, so that once A is determined, B is too, because they are entangled. Let's call this relation R(A,B). I'd like to write then [math] P(A \cap B \,|\, R(A,B)) = \int P(A|\lambda, B, R(A,B)) P(B|\lambda, R(A,B)) P(\lambda) \, d\lambda [/math] To now suggest that [math] P(A|\lambda, B, R(A,B)) = P(A|\lambda, R(A,B)) [/math] doesn't make sense, does it, taking the entanglement constraint into account? The locality assumption defined as above seems inconsistent with the entanglement constraint. So it seems either you believe in the existence of entanglement or you don't. In the nature of the probability concept there is also the incompleteness that in reality we never reach complete confidence; there is always a possibly arbitrarily big, but still finite, amount of data. This is the reason why I like to keep doors open. Anyway, I do not have any references to the experiments at hand. Maybe someone else has. /Fredrik
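To make the factorized "local" form above concrete, here is a toy Monte Carlo sketch (my own illustration, not Bell's argument): outcomes depend only on the local setting and a shared hidden variable lambda, and the resulting CHSH combination S stays at the classical bound of 2 (up to Monte Carlo noise), whereas the quantum singlet prediction reaches 2*sqrt(2). The particular response functions and angles are just one convenient choice:
[code]
import math, random

def A(a, lam):                     # local response of detector A, in {-1, +1}
    return 1 if math.cos(lam - a) >= 0 else -1

def B(b, lam):                     # local response of detector B (anti-correlated)
    return -1 if math.cos(lam - b) >= 0 else 1

def E(a, b, n=200_000):            # correlation E(a,b) = <A*B> over random shared lambda
    lams = (random.uniform(0, 2 * math.pi) for _ in range(n))
    return sum(A(a, lam) * B(b, lam) for lam in lams) / n

a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"local hidden-variable model: |S| = {abs(S):.3f}  (classical bound is 2)")
print(f"quantum singlet prediction : |S| = {2 * math.sqrt(2):.3f}")
[/code]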
  21. IMO, both time and motion can be derived from the slightly more abstract "change", or uncertainty, which is a connected thing, because even without time the status quo may be uncertain. The relative change of the clock device compared to other changes would be a measure of time. Then this can be used to define motion. A valid question is also: where the heck does this clock device come from? I imagine it as a more or less well-distinguished substructure. Consider a series of snapshots of our information of reality (fuzzy, I know) - can the time ordering be recovered in case someone permuted the shots? Usually the arrow of time reveals itself in that the entropy increases(*). (In the case of periodic phenomena...) So that is a basis for the "direction of change"; then we only need to define the units that parametrise our world line along the "direction of change", and that's what the clock device is for. One may now object that, as is well known from chemistry, for kinetic reasons reactions do not always proceed in the direction of maximum global entropy increase. But what prevents us from extending this statistical reasoning to include the kinetic prior? The state of the system can be considered to diffuse into a new state. Diffusion of what, within what, is the obvious question. But I think of this diffusion as being random disturbances of our information state - constrained by our prior information (kinetic constraints included). This is an extension of the similarities between QFT and statistical mechanics, but it seems the full exploitation remains to be seen. As was noted a long time ago, the QM equations look like diffusion equations, and usually there are some lame, supposedly appealing argumentations that suggest you just "replace the temperature T with 1/it" in the partition function and there you go. This is IMO not convincing, but I think there is a proper way to find the formal connection. I do not have a thorough enough background in the details to know if someone has worked out this connection satisfactorily, but so far I have not seen a full treatise. (*) One may wonder: how is it that the macroscopically obvious arrow of time seems so hard to find at the microscopic scale? A possibility is that at this noisy level we are unable to see the arrow of time. The arrow of time is just not distinguishable, and what isn't distinguishable can by definition not have any impact. Accordingly, the notion of time at this scale is either completely random/chaotic or pointless, which is effectively the same thing. It's hard to explain properly, and I wish I had some proper links to supply, but I have not seen a single paper that IMO makes a fundamental treatise of this. Some of the modern research papers in quantum gravity touch on this, but it's often unclear to the outsider because a single paper tends to be a small piece of a research program that is presented elsewhere. Often the papers also come out as mathematical exercises to someone who isn't deeply involved in the specific research program, which I am not either. I have just recently started to attempt a fundamental treatise of this from scratch, but it will surely take me some time. Meanwhile I suggest searching for "random dynamics". http://www.nbi.dk/~kleppe/random/qa/qa.html is one site which attempts to outline a research program - not the best, but OTOH a pretty decent overview, and also clearly a young project.
Ariel Caticha ( http://www.albany.edu/physics/ariel_caticha.htm ) has also written a series of relevant papers touching on the subject. For those who don't like the math, the above pages are comparatively readable. I feel like I have written the same thing several times now. Do the posts come across as annoying? Too many words? Too little math? Or too fuzzy? /Fredrik
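For reference, the "replace T with 1/it" remark above is usually stated as the Wick rotation t -> -i*tau (standard textbook material, added by me; it is exactly the formal substitution the post finds unconvincing): the free Schrödinger equation turns into a diffusion equation with diffusion constant hbar/2m, and the evolution operator turns into a Boltzmann factor under t -> -i*hbar*beta:
[math] i\hbar\,\partial_t \psi = -\frac{\hbar^2}{2m}\nabla^2\psi \;\;\xrightarrow{\;t \to -i\tau\;}\;\; \partial_\tau \psi = \frac{\hbar}{2m}\nabla^2\psi, \qquad e^{-iHt/\hbar} \;\xrightarrow{\;t \to -i\hbar\beta\;}\; e^{-\beta H} [/math]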
  22. Bascule, I think we may share some basic interests, but you seem to see things from another angle and I don't exactly understand your thinking. Before commenting on the wrong thing, I am curious what you are working on. What are you looking for, so to speak? > Rule 30 moves no faster than "c" (one cell per second). Are you identifying time flow with data flow? How do you distinguish data from useful information? I.e. data with obvious rules - easily compressed - versus hard-to-reduce random data? Would the flow of time not have any relation to this? /Fredrik
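For readers who haven't seen it, here is a minimal Rule 30 sketch (my own, just to illustrate the "one cell per step" speed limit bascule mentions): each cell is updated from itself and its two neighbours only, so the disturbed region around a single seed can grow by at most one cell per side per step:
[code]
WIDTH, STEPS = 61, 20
row = [0] * WIDTH
row[WIDTH // 2] = 1                      # single seed cell in the middle

for t in range(STEPS):
    print("".join("#" if c else "." for c in row))
    # Rule 30 update: new cell = left XOR (center OR right)
    row = [row[(i - 1) % WIDTH] ^ (row[i] | row[(i + 1) % WIDTH])
           for i in range(WIDTH)]
[/code]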
  23. I was mainly referring to intuitive human understanding being dependent on philosophy, and the evolution of science, in a certain sense, being philosophy-dependent. Not that a certain philosophy is required, but rather that the philosophy can make it more, or less, natural. Perhaps we disagree here? Of course it is possible to define the terms better and use a highly formal, less ambiguous language here, but then it gets cumbersome for a forum and not everyone would follow the abstractions anyway. Sometimes the main message can be propagated without math. Calculations, then, have nothing to do with philosophy of course, which is why computers can successfully do them. But devising the calculations is another story IMO. /Fredrik
  24. I agree with you that we want to understand the world we live in, but I think one may need to decide on a starting point. I'm not suggesting that the concept of time is completely arbitrary. I'm suggesting that the choice of units of time is a convention, but that the direction of time, so to speak, can be given a probabilistic interpretation. In this interpretation time is defined in a context, which means there is a specific observer who possesses a specific set of information about the world. Without this "context", time is ill-defined as far as I understand. *This is formally speculative*, but in my personal opinion, by a similar token, space can probably be built as well. Starting with one dimension, the second may evolve, and finally the third dimension can evolve as a way to resolve noise in the lower dimensions. In this respect, time and space can be said to have similar origins. Unfortunately it's hard to explain this without starting out with a lot of definitions and constructs. I personally think it's somewhat intuitive, but it all depends on your philosophy. It is fundamentally incompatible with the old deterministic ideals. Everything I have tried to explain is based on probability theory, and the world at microscopic levels is sort of just chaos. Anyway, good luck in your quest. I sometimes think we all have to find our own answers to our own questions. Keep looking and progress is unavoidable. /Fredrik
  25. In the case of the chemical reaction you actually make two measurements in parallel: you measure the concentrations of products and reactants, and you measure your clock device. The reading of the clock device has no observer-invariant meaning; it is only a relative pace-keeper for your chemical reaction. Of course, in this construction it is assumed that you are able to distinguish the clock device from your chemical reaction, so as not to mix the readings up. Like I tried to explain, I think the best view is to think of time as relative change. So a certain number of seconds means that if you bring your standardised clock device with you during your observations, time is simply a measure of how much the clock changes relative to your overall change. So how come different things happen at different rates as compared to your clock device? It can be interpreted intuitively as the probability of a specific change in the clock device being large or small compared to the probability of your overall change. A simplistic analogy... If you are sort of in empty space and don't see anything (i.e. nothing happens), the only thing that happens is that your clock is ticking (let's ignore the human biology for simplicity). That's about as exciting as it gets. So the change that, so to speak, "peaks" the probability of possible futures is that your clock will proceed with another "tick". However, if you are with your clock in a much more active environment, the most probable change is no longer a plain clock tick; many other things are likely to happen at a higher rate. I'm sorry if this is unreadable, but IMO this is an intuitive view of time in terms of probabilistic reasoning that I find to be very simple yet powerful. /Fredrik