Everything posted by fredrik

  1. The way I see it the connection to experiment actually gets tighter, and information is given a more fundamental status. There is no way we're replacing "real input" with "theory". The theory just makes sure we _process the given input_ in the (ideally) optimal way, and tells us how to update our predictive models accordingly. The idea is that this concept is deeper than it first seems. It does not end with human theories! The fundamental interactions of particles might also reflect the same logic. Two "particles" interacting are nothing but two systems exchanging information and responding accordingly (dynamics). This also has the potential to eventually reveal the true nature of the observer-system relation. These ideas may also have a major impact on AI models, since the philosophy considers information to be more fundamental than physical things. The reason is that our knowledge of physical things is always mediated by information exchange. The notion that unknown data has a definite value even though it's beyond our information leads to awkward conclusions. A system responds to given information only; anything else does not make sense, does it? IMO the better way to think of it is that we may receive information that was previously beyond us, and then and only then do we respond, not sooner. The desire to think of things as having a definite nature even when it's beyond our information is IMO almost religious, and it leads to weird conclusions. When information is incomplete (as is always the case), our response is not definite, but it's not arbitrary either. It's only arbitrary or chaotic within the given constraints/priors. If reality then proves not to agree with our predictions, it means we have received new information, and we will act accordingly and also update our priors for future processing. I think it's as beautiful as I can reasonably expect. The approach just needs work.
Constraints/priors means:
- current information geometry
- current information of boundary conditions
- current information of various relational patterns such as momentum and energy spectra
How to add stochastic processing on top of those combined constraints will be a complex mathematical task, under current work, but it will nonetheless result in dynamics. However, there is another complexity that will prove nasty, and that's the third point. I personally look for the optimal inference method for the choice of patterns. This is, I think, the key to a proper understanding of structure and dimensionality. I have not given this that much thought yet, and I haven't seen much progress elsewhere either. I'm still fighting with the former steps. /Fredrik
  2. I consider it a sign of problems, and I don't consider it logically legitimate to just remove it. I think when the theories are better understood and improved, it shouldn't have to be removed like that. It seems we can do it and not get punished, but that does not justify it from a logical point of view. I think we're lucky, and that there is a better, more proper resolution. It seems to be the fact that the background spacetime is assumed to be a rigid and fixed prior reference (thus with "infinite" inertia) that implies these infinite energies. This is just an approximation anyway until we know better; it isn't logically satisfactory as is. /Fredrik
  3. > Realism is dying It was just a matter of time. Cheers to the 21st century. Here is a quote from someone with what I consider a more modern philosophy of physics. "The point of view that has been prevalent among scientists is that the laws of physics mirror the laws of nature. The reflection might be imperfect, a mere approximation to the real thing, but it is a reflection nonetheless. The connection between physics and nature could, however, be less direct. The laws of physics could be mere rules for processing information about nature. If this second point of view turns out to be correct one would expect many aspects of physics to mirror the structure of theories of inference. Indeed, it should be possible to derive the “laws of physics” appropriate to a certain problem by applying standard rules of inference to the information that happens to be relevant to the problem at hand." from "The Information Geometry of Space and Time", Ariel Caticha /Fredrik
  4. There isn't much doubt IMO that eventually there will be unification. When you look at it, there are major flaws in the logical foundation first of all. I think the new theory will definitely also resolve some of the present QM issues at the same time. Many physicists seem to think that physics should stay away from philosophical questions, and instead end up viewing physics theories as a kind of "computational procedure" that either works or doesn't, where the question of why it works and what it means is irrelevant and left to philosophers as less important. But this attitude also seems to dodge the problem of how a model comes into being, which in my view is one of the key points. It's not a step we can omit. Sometimes when I read papers from certain physicists I have sensed an undertone that their philosophical strategy is to blindly fit observations into already existing and well understood mathematical structures, and then see if they can be fiddled to comply with data, using some kind of mathematical beauty as major guidance. That approach never complied with my thinking. I think the key is not how to support or disprove a statement or theory; the key is how to systematically resolve an inconsistency, and here deviations are your friend. My personal expectation from the future unification is a proper answer to at least the following issues. 1) The relation and dynamics between the observer's nature and the system. Sometimes the current formulation contains an asymmetry that contradicts the obvious fact that an interaction is symmetric and can be described in several ways. 2) A more first-principles view of basic concepts: mass, energy and space. 3) What is the key mechanism behind time evolution? What about the arrow of time in relation to the apparent lack of an arrow in microphysics? I see plenty of clues and I'm optimistic. I'm currently working on some ideas that will address all the above points.
I think that the laws of physics will come out as stochastic patterns subject to given constraints. So far it looks promising, but there are plenty of problems that I'm trying to digest. The current issue on my mind is to unify the relation between "inertial mass" and "relative confidence". There is a deeper connection that I think will give new insight into GR. The dynamics of GR can probably be seen as a simple form of dynamical evolution of inconsistencies. The connection between a large mass and high confidence is obvious, but I am still wondering what the proper formalism is. I've started out from a clear intuitive feeling, and now I just need to translate that into a formalism that says the same thing. This is close to some other entropy methods many are working on. I wish more people were working on it, because I haven't found that many papers, as compared to string papers for example. /Fredrik
  5. An idea is that some of these speculated extra dimensions are not observed because our everyday resolution of observations can't resolve them. A popular "standard analogy" used to present the concept is to picture observing something constrained to exist on the surface of a long garden hose. To position something on the hose you need to know the position along the hose as well as the position along the circumference, i.e. 2 dimensions. But if you watch this system from a distance, you see a long hose with something seeming to move along it, so you may only be able to resolve the most significant degree of freedom - the position along the hose (1 dimension). The position along a given cross section of the hose possibly isn't that significant, and ignorance about it yields an extra uncertainty that might not be resolved within the overall uncertainty, so the system appears 1-dimensional, while on a closeup it would seem 2-dimensional. /Fredrik
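The hose analogy can be made quantitative with a minimal sketch (my own illustrative numbers, not from any specific model): a point has coordinates (x, theta) on the hose, but an observer whose position resolution is much coarser than the hose radius sees a spread that is indistinguishable from 1-dimensional noise.

```python
import math
import random

random.seed(0)

# Illustrative parameters: the hose radius r is the small "compactified"
# dimension; sigma is the observer's (much coarser) position resolution.
r = 0.001
sigma = 0.1

def observe(x, theta):
    """Apparent 2D position, blurred by the observer's finite resolution."""
    transverse = r * math.cos(theta)            # offset from the axis, at most r
    return x + random.gauss(0, sigma), transverse + random.gauss(0, sigma)

# Sample many positions on the hose: the transverse spread is dominated by
# measurement noise, so the extra dimension is invisible at this resolution.
samples = [observe(random.uniform(0, 10), random.uniform(0, 2 * math.pi))
           for _ in range(10000)]
ys = [y for _, y in samples]
spread = math.sqrt(sum(y * y for y in ys) / len(ys))
print(spread)   # close to sigma: the contribution of r is lost in the noise
```

Shrinking sigma toward r (a "closeup") would let the circumferential structure reappear in the transverse spread.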
  6. There is one thing that I have been thinking about but never found a satisfactory answer to: when people take common NSAID drugs, it obviously typically reduces symptoms and makes the patient feel less discomfort, but to what extent does this symptom reduction compromise the body's efficiency at fighting the problem? I am aware that sometimes the body may overreact, so these symptom-reducing drugs can be used and the body still has no problem fighting the problem. But what are the exceptions to this rule? Is there any research on what impact the typical NSAID drugs have on the immune system? I'm not talking about the COX-1 related side effects; I'm more curious whether reducing the inflammatory response may at some point worsen things - like disarming the body while it's fighting a battle? For example cancers? Would COX-2 inhibitors have any impact whatsoever on the body's intrinsic healing power in such a case? What about other diseases? Can someone give a direct response, or know of links to papers on this? /Fredrik
  7. Maybe this discussion starts to diverge from the original purpose, and it's true that there are different versions of entropy measures, but I think entropy is still a universal abstraction, so here is another comment. The generalisation would be a kind of generalized relative entropy which includes all priors. If you have a filter, this filter is defined as part of your prior. Otherwise you are not respecting the relative nature of information. How to actually represent such a general entropy mathematically, as a function of an arbitrary filter, would be quite complex and is the subject of current work. But if you consider this relative entropy there is indeed (I claim without proof) an analogous version of the 2nd law of thermodynamics, and the general form does not forbid that the entropy decreases; it just concludes that it's correspondingly unlikely. The filter itself can be seen as a constraint. This constraint will be added to the stochastic process. So the outcomes of such a process - given that the constraints are respected - should still obey a generalized form of the 2nd law. The apparent violation of the law is IMHO caused by not considering the proper relative entropy (meaning considering relative information; the existence of a filter is definitely part of the information). I still agree with Swansont that there are different uses of entropy, and it's important to see the difference. I just wanted to add that I think at another abstraction layer there is again a unification to be found! /Fredrik
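For concreteness, the standard formalization of "relative entropy" is the Kullback-Leibler divergence; a minimal sketch (the distributions are my own toy numbers, standing in for "prior/filter" and "updated state"):

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p || q): expected log-likelihood ratio of p vs q.
    Zero iff p == q; otherwise positive (information gained updating q -> p)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

prior = [0.25, 0.25, 0.25, 0.25]      # the "filter"/prior state of knowledge
posterior = [0.7, 0.1, 0.1, 0.1]      # after new information arrives

print(kl_divergence(posterior, prior))  # positive: information was gained
print(kl_divergence(prior, prior))      # 0.0: no update, no information
```

The key "relative" property the post insists on is visible here: the value depends on the prior q, not on the posterior p alone.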
  8. First, I don't know much about creationist beliefs, but I can imagine they argue that the 2nd law of thermodynamics suggests there's no way humans could have evolved. But that's an incorrect application of the law; no such conclusion applies to subsystems. Biosynthesis and microbial growth are all well in compliance with the second law. The balancing entropy of the universe comes from degrading food (oxidizing carbs etc). Touching on what Dak said, I have long suspected that there exists a deeper information-theoretic duality between the observed universe and the observer himself, which may be given a creational interpretation, all in compliance with scientific principles. I am not sure how much is written about it, but I'd expect that some people who have worked on the observer issue in QM have elaborated on this. This is pretty fuzzy, but I think eventually more insight into this will come without the need for magic. /Fredrik
  9. Maybe it depends on the context, but I'd say it is not exactly the same, though close. Plain diffusion is a kind of passive transport. But in biology contexts one usually makes distinctions between different types of passive transport: osmosis, facilitated diffusion and "straight/simple" diffusion. Even though they are related from a very basic point of view, the differences between the transport types are important when talking about cells. The classification based on *energy requirement* is passive vs active transport. Then there are different types of passive transport, the same way there are different types of active transport; this further classification is based on the transport mechanism. Osmosis, plain diffusion and facilitated diffusion are all in a sense "diffusion", but to a cell there is quite a difference between them. Plain diffusion through a cell membrane means the diffusing species go right through the membrane, and regulation means regulating the membrane. Facilitated diffusion means the cell has synthesised special (typically membrane) proteins acting as tunnels for diffusion, and regulation means regulating these transport proteins. Such tunnels can be powered like pumps (active transport) or non-powered (facilitated diffusion). /Fredrik
  10. I agree with the others. Judging from my different textbooks there is clearly a huge overlap, but IMO it's mainly the focus and perspective that's a bit different (at least judging from textbooks). Molecular biology has more of the organism perspective, and studies the topic from the point of view of life, like cell physiology. Biochemistry is more the opposite (chemistry of life, rather than life of chemistry): you try to reduce life into chemistry and analyse it in detail. I don't know if it's a coincidence, but in the books I have, the chemistry parts - reaction steps, enzyme analysis and such - are more detailed in the biochemistry book than in the molecular bio book, but OTOH it lacks parts on the connection to life. Which is the reason I have several books even though big parts of the two books are redundant. /Fredrik
  11. It looks like a standard design called the "binocular" type. I'm not sure what tools you have, but many things have an impact - clearly the choice of material in the beam, of course - and there is a general balancing act when making load cells: 1) On one hand you want high strain at the position of the transducer, to get a good signal-to-noise ratio. Things affecting this are the material properties and dimensions. There are "special" expensive strain gages with a high gage factor for applications where the material dimensions simply aren't a variable, but most standard strain gages have a typical gage factor of around 2. 2) OTOH you do not want too much strain, because it will increase non-linearity and decrease the lifetime of both the material and the strain gage itself. Lifetimes are typically measured in load cycles at a particular load level. What's the expected lifetime of the product? In general I see two ways: either you just use your experience, make a prototype and run a measurement series so you can estimate the transducer quality, and adjust if needed... or you try to make the theoretical strain/load calculations on the beam, apply material data and estimate the optimum dimensions. If you are looking for sample designs, I'd suggest googling "binocular load cell design"; perhaps you can even find some standard formulas for that type of design. But of course there are plenty of other design types for load cells. /Fredrik
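To see why point 1) matters, here is a minimal sketch of the signal budget for a single gage in a quarter Wheatstone bridge, using the small-strain approximation V_out = V_ex * GF * strain / 4 (the gage factor and excitation voltage below are illustrative, not from the original post):

```python
def quarter_bridge_output(strain, gage_factor=2.0, excitation_v=10.0):
    """Bridge output voltage for one active gage in a quarter bridge,
    small-strain approximation: V_out = V_ex * GF * strain / 4."""
    return excitation_v * gage_factor * strain / 4.0

# 1000 microstrain with a standard GF ~2 gage and 10 V excitation:
v_out = quarter_bridge_output(1000e-6)
print(v_out * 1000, "mV")   # 5.0 mV - small signals, so high strain helps SNR

# A high-gage-factor (e.g. semiconductor) gage at the same strain:
print(quarter_bridge_output(1000e-6, gage_factor=100.0) * 1000, "mV")  # 250.0 mV
```

The millivolt-level output of a standard gage is exactly why the design must balance strain level (point 1) against fatigue and non-linearity (point 2).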
  12. I'm not sure I understand what you are saying and what the problem is. Are you talking about a mechanical load cell (force transducer)? A common load cell design is based on strain gages, which are usually glued inside somewhere depending on the design. If you modify the mechanics (the flexing parts), the general thing I can imagine happening is that you change the factory calibration characteristics of the transducer (Newton to microstrain rating). But it's hard to tell without knowing how it looks and where the active transducer element is mounted. OTOH there is nothing that stops you from making your own calibration. /Fredrik
  13. Diffusion is a kind of passive transport, so the term "passive transport" is normally meant more widely than just "diffusion". The meaning of passive is that the transport does not require an active energy supply; the transport is driven by the natural thermodynamic (downhill) gradient of the transported compound itself. In biology, so-called facilitated diffusion is common, and it usually means that the cell membrane contains transport channels (that don't require energy - i.e. NOT pumps) that simply stimulate diffusion by their presence. The degree of facilitation can be regulated, but there is no power supply involved. The opposite - active transport - requires an energy supply of some sort and can thus transport stuff against the natural thermodynamic gradient (uphill). /Fredrik
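The downhill/uphill distinction can be sketched with a discrete Fick's-law flux (the numbers and the permeability parameter are my own illustration): facilitated diffusion only raises the effective permeability, it never reverses the direction of net flow.

```python
def passive_flux(c_out, c_in, permeability):
    """Net inward flux ~ P * (c_out - c_in): a discrete Fick's-law step.
    Positive means net flow into the cell, i.e. down the gradient."""
    return permeability * (c_out - c_in)

c_out, c_in = 10.0, 2.0
print(passive_flux(c_out, c_in, 0.1))   # plain diffusion: slow, downhill
print(passive_flux(c_out, c_in, 1.0))   # facilitated: same direction, faster
print(passive_flux(2.0, 10.0, 1.0))     # reversed gradient: flux reverses too
```

Moving solute against the gradient (uphill) would require a pump term with its own energy supply, which is exactly the active-transport case the post contrasts with.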
  14. I'm not sure if I missed your point here, but we discussed this earlier in the thread when we talked about muon decay. The observed half-life of "high speed muons" is longer. (But the half-life of the muon in its own rest frame is always the same.) There is nothing magic about clock devices as such, so if you prefer, you can loosely think of decaying muons as moving "muon clocks". Thus one can picture it so that speeding clocks "run slower" when compared _on the fly_ to a resting clock. /Fredrik
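A quick numerical check of the muon example: the lab-frame half-life is the rest-frame half-life stretched by the Lorentz factor gamma (the speed 0.99c is an illustrative choice; the rest-frame half-life is approximate):

```python
import math

def gamma(beta):
    """Lorentz factor for speed v = beta * c."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

muon_half_life_rest = 1.52e-6   # seconds, in the muon's own frame (approx.)

beta = 0.99                      # a fast cosmic-ray muon, ~0.99c
observed = muon_half_life_rest * gamma(beta)
print(gamma(beta))               # ~7.09
print(observed)                  # ~1.08e-5 s: the moving "muon clock" runs slow
```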
  15. I'm glad you said this; this is exactly what it should ideally be like, and why I think we need a good fundamental framework that can support it. A framework that is *designed* for intelligent data processing to start with, and does just what you suggest. Trying to manually "add information" to existing knowledge based on static models will result in a gigantic manual patchwork, allow for arbitrariness, and might not even be possible to finish. The model itself should be able to tell us how the optimum modification should be done. This has traditionally been done by hand and human intellect. I feel that this is primitive and old-fashioned, and I definitely see room for improvement. A new framework doesn't mean we need to redo history or all experiments; a modern model should be readily trained to adapt to current knowledge at an estimated confidence level. The thing is that such a model places fairly high demands on logical consistency and on founding all concepts ultimately in terms of data. And the current physics models simply won't do. Their philosophy and structure are old-style and seem unsuitable for such things. QM and general relativity certainly have elements that will live on, but there is not yet a consistent unified logic - set aside the lack of experimental data in the relevant domains. This is what we need solved. And in that sense the suggestion that the world is made out of excited strings simply isn't sufficient. In fact it's no improvement at all from the described point of view, and is I think a wrong focus. I expect a fundamental change to analyse, in detail, the concept and nature of measurement, information and knowledge. The foundations of physics need to be more firmly attached to reality. Reality means handling uncertainties and noise. If we hide behind distant mathematical abstractions without clearly defining their observable connection, I think we risk missing the whole point. /Fredrik
  16. To not run into issues of unphysical assumptions, one thing is clear in a very general sense. 1) Take two clock devices that are, from your point of view, as identical as possible. 2) Synchronize them. 3) Then separate the clocks, and simply let the two clock devices exist under non-identical conditions (in a wide sense). Sending the clock away can also be thought of as a transformation of the clock, because technically it's not the same clock in the information-theoretic sense. 4) Then after a while collect the clock devices and bring them back to one place for comparison. In the general case these two clocks will now be out of sync, which can be expected because they have evolved in different environments. But the exact amount will depend on the exact differences between the two clocks. I.e. a clock in motion is clearly different from a clock at rest, because you will in general not mix them up. The cases where they agree should IMO be considered a special case, or exception, due to the symmetries of the transformations. The general conclusion is that a clock device measures time relative to its own references. If you send a clock on a journey, the general case is that the clocks will disagree upon a later comparison, except for cases where there is symmetry in the journey. /Fredrik
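In standard relativity, steps 3-4 above can be sketched numerically: each clock accumulates its own proper time along its worldline, d_tau = sqrt(1 - (v/c)^2) dt, and the reunion comparison exposes the difference (the speeds and durations below are my own illustrative choices):

```python
import math

C = 299792458.0  # speed of light, m/s

def proper_time(v_of_t, t_total, steps=100000):
    """Numerically accumulate proper time for a clock whose lab-frame
    speed is v_of_t(t), over lab-frame duration t_total."""
    dt = t_total / steps
    tau = 0.0
    for i in range(steps):
        v = v_of_t(i * dt)
        tau += math.sqrt(1.0 - (v / C) ** 2) * dt
    return tau

t_total = 1.0                                        # one lab-frame second
stay_home = proper_time(lambda t: 0.0, t_total)      # clock left at rest
traveler = proper_time(lambda t: 0.8 * C, t_total)   # clock sent off at 0.8c
print(stay_home)   # ~1.0 s
print(traveler)    # ~0.6 s: out of sync on reunion, as the post describes
```

A journey whose speed profile is symmetric between the two clocks (the "special case" in the post) would give equal integrals and clocks that still agree.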
  17. Going from point to string is described as "radical". From a pure mathematical point of view the idea may be worth some exploration (as would other things), but I can't see how it could possibly qualify as fundamental radicalism. Point to string, and string to brane. And then imagine that all previously known particles are various states of this extended object. Clearly any model based on higher-dimensional objects is expected to have more adaptive power, but I fail to see the radical nature of this starting point - it is not fundamental, not in my eyes. A string is not much more of an answer to my questions than a point is, except for some extra mathematical features. I see no fundamental motivation for this; it seems to be an attempt at a mathematical exploit only, resting on the same old foundations of physical reality and spacetime as before. It is clear from the beginning that the old concept of a point is a mathematical abstraction and idealisation due to limited resolution and simplified theories. In reality we could probably not distinguish a point from a neighbouring point anyway. And I am not sure I find the string much more sensible; from a philosophical point of view I see similar problems as with points. I'd like to see a logical prescription on how to identify a string (in principle), as opposed to identifying a fuzzball that could be a string with only a low level of confidence, but could also be something else within a reasonable level of confidence. Unless the string can be identified, string theory seems to me to be some sort of "hidden variable" approach that uses unobservable constructed objects to come up with a mathematical model of what we see. I for one will not accept such a fundamental solution. I'll keep looking, because I am convinced that there is something much better that describes the nature of reality in a context of observations.
Instead of starting from an ad hoc starting point, hoping to reproduce what we know, and then suggesting that as a fundamental explanation, would it not be more natural to start from what we know, and prescribe how to move forward by analyzing real data and allowing the data to suggest what new constructs we need? Anyway, I think the most important thing is for everyone to try to think for themselves. The one reason why I feel motivated to express my opinion (which is no more valid than anyone else's) is to counterbalance what, in my personal experience, has been dominating beyond its motivation. /Fredrik
  18. I'd personally encourage anyone with an interest in physics to think for themselves and evaluate information on their own. If you think string theory is the radical change we all need, go ahead and blow the critics away. But I think new students in particular (who I think tend to have an innocent mind and make the mistake of assuming the teacher is always right) should do themselves and everyone else the favour of not taking anyone else's word for what the future of physics is. Use your own judgement; there is no acceptable replacement. Considering the fact that, to a certain extent at least, some string physicists have discouraged other thinking among students and forced them either to align or to pursue other careers - making them think that unless you like string theory this business isn't for you - I think it's appropriate that alternatives become visible, in case someone missed them. As someone philosophically inclined (which btw doesn't equal ignorant of math), it's beyond me how so many people can settle for the almost nonexistent improvement on the philosophical and logical foundations. That's why string theory smells like a mathematical game, lacking depth. This is a line that I find amusing: "String theory is a rather radical generalization of quantum field theory whereby the fundamental objects are extended, one-dimensional lines or loops." -- http://arxiv.org/PS_cache/hep-th/pdf/0207/0207142v1.pdf /Fredrik
  19. I think Swansont said it best. Science is not just "knowledge"; it's something even better! It's the key to a systematic and efficient acquisition of knowledge. Because knowledge is alive, relative and changing, so is the truth. The world is full of inconsistencies and faulty predictions; that is what we thrive on. I think ignorance and apparent disorder are food for scientists. Deviations need not equal failure; they might as well be a golden opportunity to progress if handled correctly. /Fredrik
  20. A general (non-string-specific) reason for this particular scale is that, according to what is currently thought about the unification of general relativity and quantum mechanics, beyond this scale identities themselves start to blur. The idea is that if you were to focus the high energy that QM requires for such "resolution" into an object smaller than that, the energy density would be high enough to - according to GR - create a microscopic black hole. And it is not thought to make sense to probe the interior of a black hole, because focusing energy into it will only enlarge it, blurring things even more. This is one of the bases for the expectation that, whatever tomorrow's theory will be, there are logical reasons why we cannot probe infinitely deep into matter. It's not *merely* a technical or financial problem; there are thought to be more profound reasons. /Fredrik
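The scale where this argument bites is the Planck length, l_P = sqrt(hbar * G / c^3), which combines exactly the QM (hbar) and GR (G, c) ingredients described above. A quick computation with SI constants:

```python
import math

# CODATA-style SI constants
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # gravitational constant, m^3 / (kg s^2)
c = 2.99792458e8         # speed of light, m/s

planck_length = math.sqrt(hbar * G / c ** 3)
print(planck_length)     # ~1.6e-35 m: the scale where identities start to blur
```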
  21. I tend to agree with Dak. Whoever (dictators or majority-chosen governments) applies constraints on society - laws and rules - should also have some kind of moral responsibility for their consequences. This includes ensuring that there exists a reasonable solution for everyone, meaning that everyone should at least have a reasonable chance at a decent life. If not, the result will be tension, people suffering, and maybe even crime. The worst-case scenario is if someone feels they have received an unfair chance, because this may power criminal solutions and make people lose respect for the constraints. And I'm not talking about your neighbour making more money than you; I'm talking about people that risk their life and health because they don't have money to pay for medical care, or don't have money for food, or people that live in constant horror - and where a solution for their future might not be possible while respecting the constraints. And from a moral point of view I would hold whoever is responsible for the rules partly responsible for this. I would not blame someone who is starving for stealing food, etc. At some point I'm sure I'd do the same, if I saw that as my best solution. For those who say that life isn't fair, that nature isn't fair nor "perfect" etc.: that is certainly true, but it is no excuse for not doing our best to make it as good as possible. Doing my best is as far as my own morals go. Inasmuch as I am completely aware that nature is imperfect, I am also aware that people who suffer are less likely to align with the laws of society, and that eventually it will come back and hit me in the face. /Fredrik
  22. The basic idea of how it should work is simple and goes something like this: relative to the current state of knowledge, we have an uncertainty - things we do not know. Now if we could quantify our uncertainty into a set of equally likely outcomes (relative to our prior information), then evolution would be playing dice on what we don't know, constrained to what we do know. The problem is how to define some measure of the unknown so that we can define the outcomes, and here the entropy principle can help out. We consider infinitesimal disturbances and calculate the increase in entropy (or just the number of micro degrees of freedom, like the microcanonical ensemble; the exact entropy definition should not matter). Then we can assign probabilities to each possible disturbance, and a more likely disturbance will appear more frequently. This will also be self-correcting, because if deviations are observed, our priors will update - but not instantly! Here comes also the correspondence with inertia: the resistance to updating your opinion when exposed to contradictory information. The "inertial mass" can be given an interpretation in terms of how fast the prior is updated. Here something like the particle's information capacity will be a fundamental variable. Something that can store only, say, one bit of information will have minimal inertia and will align instantly. But a more complex object will consume the deviations and adapt according to its own inertia. This is some of the philosophy behind the method. And the point is that it starts from very raw first principles, and I see good chances that it will be able to *explain* previously fundamental things in terms of the new framework. Another advantage is that it will be closely related to intelligent processing algorithms. I think it may (eventually) provide a unification of physics and artificial intelligence models, probably also bringing more insight into the human brain from another perspective.
My own starting point has been to consider how the prior emerges from data consumption, and the basic rules for prior updates. And the next thing is how new dimensions are born out of deviations. I'm also trying to reevaluate the choice of formalism - whether the wavefunction formalism is really the ultimate choice or not. /Fredrik
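One conventional way to make "playing dice on what we don't know" concrete - a rough sketch of my own, not a worked-out piece of the program above - is to weight each candidate disturbance by the entropy change it would produce, p(i) proportional to exp(dS_i), so that entropy-increasing disturbances dominate without entropy-decreasing ones being strictly forbidden:

```python
import math

def disturbance_probabilities(delta_s):
    """Normalize exp(dS) weights over a set of candidate disturbances.
    Larger entropy increase -> more probable, but nothing has probability 0."""
    weights = [math.exp(ds) for ds in delta_s]
    total = sum(weights)
    return [w / total for w in weights]

# Three hypothetical infinitesimal disturbances: entropy up, flat, down.
p = disturbance_probabilities([2.0, 0.0, -2.0])
print(p)   # a decreasing-entropy move is unlikely, not impossible
```

This matches the qualitative claim in post 7 as well: a generalized 2nd law that makes entropy decrease improbable rather than forbidden.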
  23. Hello Atheist. First, it should be noted that this is, to my knowledge, not yet a completed approach and there are open wires, but it's what I'm working on, and a number of others are working on similar ideas, with variations. Yes, it is similar to the action principle, and the infinitesimal action would correspond to a transition probability. The idea is that the transition probability can be driven by microscopically submerging everything in noise; then it boils down to a matter of stability. Unstable and easily excited transitions are more likely - with respect to the current state. Transition probabilities are fundamentally relative, not absolute. Moreover, the transition amplitudes are estimates from the current state of information, in a spirit similar to maximum entropy methods. This means the laws of time evolution look exactly like a learning algorithm based on inference methods. It also means that stuff like mass and energy will receive a new interpretation in a more abstract setting. I have not found a single source yet which lays out a satisfactory approach in detail, and I resumed working on this myself just a month ago after a 10-year break. But here are some links relating to the ideas. See Ariel Caticha's page http://www.albany.edu/physics/ariel_caticha.htm - this guy has started to address some fundamental questions and has had ideas on deriving GR from more basic first principles, but I have not seen any recent papers from him. Whether he has succeeded or not, I share his ideas on this. If he hasn't done it, someone else will. RANDOM DYNAMICS http://www.nbi.dk/~kleppe/random/rel.html - I didn't analyse this completely and I am not sure I agree with their exact approach, but no doubt the general spirit is right.
Some other keywords for searches are "information geometry", "entropy dynamics", "bayesian inference". The nice thing about this approach is that it starts off from first principles, and stuff like inertia and relativity will come naturally. And due to the probabilistic foundations I think the extension to the QM domain will be natural. I'd be interested in your comments on these ideas, in particular negative critique. But if you search the existing papers I wouldn't get hung up on all the details, because the subject seems to be young and I've seen no complete paper yet. I'll be happy to get back with details as I get more work done myself. Feedback would be nice, but this is a hobby for me and it's slow. /Fredrik
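The "time evolution as a learning algorithm" phrase can be illustrated with the simplest textbook case of Bayesian inference - a conjugate Beta update, where the prior's pseudo-count total plays the role the posts assign to inertia (this is my own toy illustration, not taken from the linked pages):

```python
def beta_update(alpha, beta, successes, failures):
    """Conjugate Bayesian update of a Beta(alpha, beta) prior on a rate.
    alpha + beta acts like an 'information capacity' / inertia of the prior."""
    return alpha + successes, beta + failures

def mean(alpha, beta):
    """Posterior mean estimate of the rate."""
    return alpha / (alpha + beta)

evidence = (9, 1)   # new data: 9 successes, 1 failure

light = beta_update(1.0, 1.0, *evidence)      # weak prior: low inertia
heavy = beta_update(100.0, 100.0, *evidence)  # confident prior: high inertia

print(mean(*light))   # ~0.83: the estimate moved most of the way to the data
print(mean(*heavy))   # ~0.52: barely moved - the heavy prior resists updating
```

The same data shifts the light prior a long way and the heavy prior almost not at all, which is the "large mass = high confidence = slow update" correspondence from post 4.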
  24. I read again and I suspect we are discussing two things here. I can identify at least two questions: 1) What is time? Or, more specifically, how is time properly defined? 2) How do different times compare, when defined in different settings? I was talking about (1), but I now suspect perhaps you were thinking of (2) - the relativistic time dilation effects. A clock in motion and a non-moving clock are different. The concept I tried to explain is that time is defined by the change of your information relative to the information of your clock device. In the general case the information of other clock devices is not a valid replacement, because you can certainly distinguish a clock at rest from a moving clock. So they are two different clocks, and to make a proper comparison you need to know their relation. This relation is what Einstein solved. But Einstein did IMO not explain (1) in detail - what is the proper definition of time in the local frame? In terms of my fuzzy talk in the previous posts, the relativistic time dilation (the relation between different observers) can be understood loosely like this: the flow of time that appears to slow down in a near-light-speed frame or in a gravity field can be explained if that environment is in general higher in order, which explains why the similar "translated" event is less likely there as compared to your reference, because the flow of time is related to *relative* order/disorder. And if your reference is more ordered, your relative disorder is higher, and the probabilistic driving force of "time" is lower. So there is no such thing as absolute disorder, only relative disorder and relative time. But that's not to imply it's not real. Maybe this complicated things further; if so, ignore it for now. /Fredrik
  25. > I seem to have contradicting answers now from swansont and fredrik I'm not sure I see the contradiction? Perhaps we made different interpretations here or I misunderstood your intention. The discussion was a bit fuzzy after all (these "counters" and all) ? /Fredrik