fredrik

Senior Members
  • Posts

    532
  • Joined

  • Last visited

Everything posted by fredrik

  1. It's still unclear to me how you define "better smelling". I can imagine two ways: either the subjective question of what You think smells better, and then interpret freely; or to consider the biological value of finding food with a certain smell (good, healthy food), which may somehow define some prior preferences for flavours. Maybe the traits for recognizing good food and dangerous food are preserved, and there is some positive feedback to the brain upon smelling something "good". But how to calculate and quantify that seems very difficult, and probably still fuzzy, since there may be individual-to-individual variation anyway. If I were you, I'd start doing some research on the genes and try to find papers on each sensor type. Here is one paper, "The human olfactory receptor gene family": http://www.pnas.org/cgi/reprint/101/8/2584.pdf I'm not sure about humans, but for other organisms one can find genome databases and links to papers related to certain genes. About the practical side of things: esters you make in the lab typically appear in solution in equilibrium with the acid and alcohol, so there would possibly be undertones of the respective acids and alcohols in a real test. Each ester reaction has a different equilibrium point, and the various flavours have different odour thresholds. I'm not sure it's interesting, but I wrote some notes on this in the Home Brew Digest (hbd.org) some time ago in the context of beer aging, where there is a balance between esters, alcohols and carboxylic acids and their thresholds, and whether the aging improves or worsens the flavour. Generally the ester smells good and the carboxylic acid does not. Some acids have fairly low thresholds too. Some people argued that harsh higher alcohols mellow as they are esterified, but thermodynamic reasoning shows that this is usually unfavourable. The more likely candidates for harshness might be the acids, which have low thresholds and are, relatively speaking, more reduced during aging in some cases. 
hbd seems temporarily down so I can't supply the link, but search for "Thermodynamics of Esterification" and you may find some examples. /Fredrik
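The thermodynamic reasoning above can be sketched numerically: the equilibrium constant of an esterification follows from the standard free-energy change via K = exp(-ΔG°/RT). A minimal Python sketch; the ΔG° value is an illustrative assumption, roughly the textbook figure for ethyl acetate formation, not a measured number from the post:

```python
import math

def equilibrium_constant(delta_g_j_per_mol, temp_kelvin=298.15):
    """K = exp(-dG/RT) for a reaction at the given temperature."""
    R = 8.314  # gas constant, J/(mol*K)
    return math.exp(-delta_g_j_per_mol / (R * temp_kelvin))

# Illustrative assumption: dG ~ -3.4 kJ/mol gives K ~ 4, which is why
# lab esters sit in equilibrium with leftover acid and alcohol instead
# of reacting to completion.
K = equilibrium_constant(-3400.0)
```

A K of order unity means substantial amounts of acid and alcohol remain at equilibrium, consistent with the "undertones" point above.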
  2. For example, which smells "better", banana or pear? And how do you prove it? /Fredrik
  3. The question of which smells "better" is obviously a subjective one. Two people will in general disagree on which smells better, depending on associations and other things, or past experience that certain smells are associated with stale food, fresh fruit etc. So smell is a tool to find good food and avoid bad food. What's "best" is bound to be subjective in any case. So I'm not sure what you're asking? /Fredrik
  4. Sensations are complex and have several components. The primary transducer is an olfactory receptor neuron, which from what I recall contains transmembrane proteins that bind to specific molecules. The binding changes the proteins and, via a series of reactions, in the end causes the neuron to create an electric signal that is sent to the brain via the olfactory nerve. The olfactory receptor proteins clearly have different affinities for different odour molecules, and in general these affinities are probably also regulated in response to several other things. This is the case with taste receptors as well. So on the transducer side there is a complex regulatory network that determines which molecules or functional groups tend to bind to the receptors. The second question, probably even more complicated, is what happens to the electric nerve signal in the brain. The brain has to "decode" and interpret the electrical signals from specific neurons in context, meaning the perception of a single molecule may vary with the general context, both ambient and in the rest of the body, and in comparison with the memory associations you have had with this signal before. So there are two parts: the mechanism by which a molecule generates an electric nerve signal to the brain, and how the brain interprets/handles this signal in context. Both parts are complex, and there is not yet complete knowledge of how this all works in detail. For the first part, one usually identifies a specific receptor protein and the gene that encodes it; then one can study the regulation of this gene, as well as the regulation of the protein itself. I haven't looked much into smell, but I did some reading on salt and sweet receptors and how they respond differently to various ions, and the conclusion is that it's complex; on top of that you have the brain's treatment, of which I think there is even less knowledge. /Fredrik
  5. Try to make friends with them; they are part of life on earth just like humans. They are your friends, if you get to know them. Try to study them from a biological point of view, and the threat they pose to you may be seen as an opportunity and turn into curiosity. These bugs are "kids" looking up to you. No need to be afraid. Get a biology book and try to learn everything you can about them: what food they like, and how they fight to survive and reproduce, and you may get some sympathy for them. /Fredrik
  6. Even though these things are far from settled, I would say that your intuitive association of evolution and entropy is a very good one. Keep it. There are some approaches in physics working on a strong information approach, where all the laws of physics may be explainable in terms of a generalised entropy principle. I expect a lot out of this in the future, and it has a clear potential to unify biological evolution as a natural extension of the supposed evolution of the various elementary particles, then atoms, then molecules. I am convinced that there is a uniform logic to be found. Some associations... evolution ~ adaptation ~ learning ~ equilibration. Entropy as a measure of missing information is a key concept there. /Fredrik
  7. When you talk to different people, of different backgrounds, or even similar backgrounds but different mindsets, it's clear that everyone has their own preferred way of thinking. Some people are very mathematical, and if you ask something philosophical they don't understand it. But there is also the opposite case. Here is one of those classic engineering jokes: "Engineers think that equations approximate the real world. Scientists think that the real world approximates equations. Mathematicians are unable to make the connection." Gib, I've seen your great philosophical questions in various threads. Check this paper out and tell me if it makes any sense, or triggers any ideas in your mind: "Change, Time and Information Geometry" http://arxiv.org/PS_cache/math-ph/pdf/0008/0008018v1.pdf It does contain a few basic equations, but there are also some passages that I think are fairly readable. Information geometry IMO yields a very intuitive understanding of geometry, even in the abstract case where visualisations stall. The basic concept is the association of "shortest path" (straight line) with a sort of minimum transition probability, or simply the "most probable path". The obvious question is of course what measure defines "most probable", but one can IMO find some very plausible argumentation for that. I figure it's a matter of mindset whether you like it or not. It talks about maximum entropy principles, and one obvious objection may be that entropy can be defined in different ways. But it can be shown that the exact definition/choice of this measure doesn't matter, within limits. Also, there is seemingly a way to derive this principle without touching the concept of entropy in the first place. I am currently working on that, but I don't yet have anything readable. It will come. I am also curious to see what Ariel Caticha will come up with; it seems he is still working on deriving GR from these principles to make the intuitive connection explicit. 
Check out his other papers too. I was told he is currently writing a book on information physics. If you have a basic idea of probability theory in the context of learning (which can be very intuitive in the first place), they might excite your imagination. You may note that a lot of stuff is still missing, which is true, and it seems there are not many papers to be found on this. /Fredrik
  8. Like ajb mentions, generalized geometrical methods are very popular in physics, supposedly because they can sometimes be given intuitive geometric interpretations, and because they are well studied by mathematicians. So once you've got a geometric formulation, you've got an arsenal of theorems from mathematics to play with. However, I wouldn't say that alone "explains" anything in its proper fundamental sense. Many physicists seem to be guided by mathematical beauty; some like geometry and may think that a geometric model is more likely to describe nature than anything else. I don't share that view. You can certainly often formulate the same information in several more or less equivalent ways. Pick the one of your choice, but either way it needs qualifying experimental support. There are also geometric interpretations of probability theory, where probability distributions can be exploited to define a distance in distribution space (information space). So there seem to be a few equivalent ways to describe things. It would mean that the distance between two points is associated with the probability that they are mixed up. So statistical models can actually map out geometries too, if you prefer that interpretation. But some of the arrived-at geometries are abstract ones, and not something as universal as space. /Fredrik
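The idea that probability distributions define a distance in distribution space can be made concrete with the Hellinger distance, one standard "statistical distance" between discrete distributions: two distributions that are easily "mixed up" sit close together, identical ones are at distance zero. A minimal sketch; the example distributions are made up for illustration:

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete distributions:
    H(p, q) = (1/sqrt(2)) * sqrt(sum_i (sqrt(p_i) - sqrt(q_i))^2)."""
    s = sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q))
    return math.sqrt(s) / math.sqrt(2)

# Identical distributions are at distance 0; similar distributions
# (easily confused with each other) are close but nonzero.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
d_same = hellinger(p, p)
d_diff = hellinger(p, q)
```

This metric (like the related Fisher information metric) is how a statistical model "maps out a geometry" on the space of distributions.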
  9. So that is why that pic is in my profile. Those are my friends. /Fredrik
  10. Hmm, I do some experiments. But like I said, every day is an "experiment" to me, whether I asked for it or not. Yes, the pic is mine: it's a microscope shot of an iodine stain (for glycogen testing) of a culture of Windsor brewer's yeast, where the darker cells have higher glycogen pools. I made an experiment 3 years ago which had two purposes. Starting from more or less no bio background, I tried to understand and learn about yeast cells in a beer fermentation setting. During that process, which was interesting in itself, I tried to study myself and how I went about making progress. What struck me during this process is that it led me to abstractions first of all, and those abstractions showed similarities with the abstractions you typically end up with when trying to solve almost anything. Eventually I was looped back to physics by a connected series of reasoning, so at the moment I have put the yeast stuff aside because I have no time for it. The idea I started with was to make a computer simulation of the yeast, using a technique of metabolic network simulation, but I realized that since I could not model DNA transcription etc. in detail, I was going to need a measure that "optimizes the cell". This led me to ask: what optimizes a cell? Growth rate? If so, the mean or the peak? Survival rate? Anyway, this ended up basically being a generic learning model. (As a note, much success has been had with these ideas: metabolic network simulations where the gene expression is estimated by optimization routines (on a defined measure combining growth rate and synthesis of precursors) have been reported to show remarkable agreement with gene expression analysis of real bacterial cultures. Actually, the real bacterial culture converged to the computer simulation's suggestion after only a few generations.) Anyway, this grew from a special case to the general case all on its own, and then, when I was trying to picture how the brain can learn to decode the signals in nerves 
from just a collection of electric signals into a perception of reality, I got a deja vu feeling from 10 years ago. This was 3 months ago, and I resumed the physics project again. My cellular friends have taught me a lot, but I will have to ignore them for a few years at least. So I've done some silly experimenting, but not to any significant extent, and it's more of a desktop experiment. Nothing near the billion dollar particle physics labs that are out there. /Fredrik
  11. What I appeal to here is what I think is normal human intuition. Humans do not have intuitive first-hand experience with electrons. But we do happen to live in the same universe; we are governed by the same laws. And in the relational information approach I believe in, I can actually extract deep information about reality from a seemingly distinct system, by trying to make the appropriate abstractions. The nice part about studying yourself is that you have unlimited access to data: every day is an experiment. So forget about the school-days Newtonian world, because it's obviously a severe simplification. Think deeper: how do You act? If the information inference approach is to work, all we need to find is the induction step. And why look where we cannot see, when I've got prime access to one of the most amazing systems on earth? That's my idea behind the old question: can physics come from thought alone? Well, it can't come from thought alone, but _some_ of the principles might. This is how I get most of my personal inspiration anyway. I do not have a lab. I've never seen an electron IRL, and I probably never will. /Fredrik
  12. Another intuitive brain comparison is this: at first your memory fills up. You will see some kind of pattern - associate this with p(x). You may be tempted to think that you've got it now. But as more data is processed, you may see that the pattern is not stable. So you may initiate some kind of online processing and keep track of whether there is perhaps (second best) a pattern in the changes of the first pattern. For example, take the Fourier transform and look for frequency patterns. Associate this with momentum. Now you have learned to see a pattern in the changes of a transformation of the original pattern. This imposes *an expected* constraint on changes in general. Note that we're just guessing, based on experience. But humans are masters here, and this has undoubtedly proven to be successful. Next you may find that there are still fluctuations in the transformed pattern; you may repeat the strategy and decode more, until you can't resolve further patterns - either because there are none, or because you're sort of out of memory or whatever (imagine trying to store information about the entire universe; there clearly has to be a limit, unless you're a black hole eating the universe). Anyway, now consider that you have two pieces of information, x and a related variable p. The suggested construction implies that there is a connection between these two which means it does not make sense to know x and p exactly at the same time: you can't specify a pattern and its variation at the same time. It makes no sense. This is philosophically a bit fuzzy, but I've got the impression that Gib is a philosopher in the first place. I'm working on formalisms for this, and I think it will reveal some of the logic used. /Fredrik
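The strategy of looking for a frequency pattern in the changes of a signal can be sketched with a naive discrete Fourier transform (stdlib only, no FFT library). The signal here is a made-up example, a pure tone whose "hidden pattern" the transform recovers:

```python
import cmath
import math

def dft_magnitudes(signal):
    """Naive discrete Fourier transform; returns |X_k| for each bin k.
    O(n^2), fine for a small illustrative signal."""
    n = len(signal)
    mags = []
    for k in range(n):
        x = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        mags.append(abs(x))
    return mags

# A pattern of 5 cycles over 64 samples should show up as a peak
# at frequency bin 5 - the "pattern in the changes" made explicit.
n = 64
signal = [math.sin(2 * math.pi * 5 * t / n) for t in range(n)]
mags = dft_magnitudes(signal)
peak_bin = max(range(1, n // 2), key=lambda k: mags[k])
```

The position-side data (the raw samples) and the frequency-side data (the bins) are two descriptions of the same information, which is the x/p association the post is gesturing at.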
  13. The historical perspective may be interesting, but setting that aside, I think that if you want to understand quantum mechanics better, some key questions should be: 1) What proper support do my presumed classical facts have? When does an extremely qualified expectation transform into a hard fact? 2) Try to mentally define things in terms of experiments, and some things become clearer. For example, momentum is defined in terms of changes in position. It means the definition of momentum requires more than one position datapoint - you need change, implying you need several datapoints so you can make comparisons. This is one intuitive understanding of the HUP: if x is exactly peaked, the momentum concept makes no sense, and the momentum is either undefined or "any momentum is as likely as any other". Momentum is a measure of variation in position, or in our information about position (see 1). Note that I mention variation in a neutral way, without specific reference to time. When you reference time, you get the energy concept. Implicit in this is a connection between x and p, and between E and t, that is made explicit in QM; this contains the wave-particle thing and implies the HUP. Superposition is just the idea that all possibilities are accounted for in the expectation process. The expectations we have of an observable are obviously a superposition of the possibilities, right? But upon observation, only one of the possibilities is observed. It doesn't have to be more strange than that? One may be tempted to think, in hidden variable terms, that perhaps the variable had that value all along and I just didn't know about it. Well, the point would be that what you don't know never impacts your action. This principle should apply to particles as well: a particle responds to information that is available, not to any hidden information, 
just like a poker player acts on the information he has, not on the information he could have had (which is a completely ambiguous way of reasoning in the first place). How do You do it? When you make a decision, your brain considers all the alternatives, right? You do some "averaging", trying to find the "best" action, but eventually the response is only one of the options - the one that is, from your point of view, the estimated best. Suppose I were to model YOU. I would of course assume that your dynamics is effectively a function of the information you get: you respond to the information you get, compare it with your memory, and make a decision. This wasn't very well written, but maybe it gives some ideas. /Fredrik
  14. In the quantum description, the classical conservation laws apply to expectation/mean values. This means energy is not really conserved, but its expectation value is, which in turn means that fluctuations in the "classical conservation" are allowed. That is to say, energy is on average conserved, but there may be fluctuations; the further from the mean you get, the more unlikely such a fluctuation is to be observed. So when you compare classical energy with quantum energy, the more accurate comparison is to compare the classical energy to the quantum expectation value. [math]E_{classical} = \langle E \rangle_{QM} = \sum_{i} p(i)E_{i}[/math] [math]\sum_{i} p(i) = 1[/math] where the sum is over the possible energy states. This is not unlike thermodynamics, where the temperature T in a sense measures the mean energy per particle, but the actual energy of any particular molecule will vary according to a distribution. Chemical reactions proceed even though the mean energy (temperature) is well below the activation energy. In this sense, QM is really more "humble", by realising that what we think we "know" is nothing but our expectations. In the classical domain, the expectations are stable enough to qualify as effective facts. /Fredrik
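The expectation value above is straightforward to compute. A minimal sketch for a hypothetical two-level system (the probabilities and energies are made-up illustrative numbers in arbitrary units); the standard deviation quantifies the size of the "allowed fluctuations" around the conserved mean:

```python
def expectation(probs, energies):
    """<E> = sum_i p(i) * E_i, the quantum expectation value."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * e for p, e in zip(probs, energies))

def fluctuation(probs, energies):
    """Standard deviation of E: the scale of fluctuations around <E>."""
    mean = expectation(probs, energies)
    var = sum(p * (e - mean) ** 2 for p, e in zip(probs, energies))
    return var ** 0.5

# Hypothetical two-level system with energies 0 and 1.
probs, energies = [0.75, 0.25], [0.0, 1.0]
mean_E = expectation(probs, energies)
sigma_E = fluctuation(probs, energies)
```

Only mean_E obeys the classical conservation law; individual measurements scatter around it with spread sigma_E.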
  15. I'm sorry... I'm not sure where to start or focus, though. Also, it gets a bit philosophical, and that's always fuzzy. > The "information loss" refers to the fact that, *assuming the Hawking radiation to be thermal*, total probability seems not to be conserved - i.e. unitary evolution seems violated. This means the states seem to kind of step outside of the Hilbert space - in violation of QM foundations. One of the postulates of quantum mechanics is that the state of the system - our information about the system - is described by one possibility out of a defined set of possibilities. One assumes as a starting point that there exists such a set containing all the possible states, and that this set is known. This means that no matter what happens, if we sum over all possibilities we should find the state somewhere, and time evolution is considered as just moving the state around within this given set. I mean, regardless of what happens, the state of the system is not supposed to leave this set; it's constrained to this set by assumption. But if some interaction seems to suggest that suddenly the state is found nowhere in this set, then we have an inconsistency and something is wrong. Either we don't understand the interaction (note that the black hole stuff is mostly gedanken experiments and thus subject to possible fallacies), or our postulated set of possibilities is incomplete, or the assumption that the set of possibilities is a non-dynamical quantity in the first place might be wrong. (Hilbert space is a function space, part of the mathematical formalism of QM.) Suppose you play dice, expecting outcomes in {1..6}, but suddenly you get a 7; you'd be perplexed. Did someone manipulate the dice? Did the dice evolve another face? Is the dice alive? Or is something wrong with your vision? Maybe it will never happen again? And more importantly, how can you tell the difference? 
I'm just trying to provoke some questions, since you asked what the issue was to start with. /Fredrik
  16. Note that this is not a question of whether the Hilbert space is complete in the mathematical sense. The question is: can we squeeze reality into a particular Hilbert space? Which QM does by its axioms/postulates/assumptions, depending on which view you take. /Fredrik
  17. The "information loss" refers to that *assuming the hawking radiation to be thermal* total probability seems to not be conserved - ie. unitary evolution seems violated. Which means the states seem to kind of step outside of the hilbert spaces - in violation with QM foundations. The core of QM is to assume that the space of statefunctions is complete in the sense than any state of reality can be described by a "vector" in this space. Which clearly boils down to the assignment of the event space in the first place? this boils down to the postultes of quantum mechanics, which IMO, as far as reality is concerned, technically are assumptions. Probably the best ones we've got, but still their status of assumptions, or expectations should not be lost. Does it make sense to assume that, when you are to learn something new, you have an accurate list of every possible option you may ever come across? IMO that is doubtful. But that's not to say that an estimate of such a list may be provide an excellent basis for progression. But we must not be so stubborn to think that the list we made in "in all our ignorance" is going to stay accurate forever, it may need revision. I think this "list" (space) is in the general case also possibly dynamic. Of course it is fixed, life gets much easier It's a natural first guess, but nothing I'd like to see carved in stone. When the hawking radiation first hits the object, it is beeing informed, and it responds accordingly. It does not need to be priorly informed. Hawking radiation of course have energy coming from the supposedly evaporating black hole. If the presumed "thermal" hawking radiation has actually information coded in it, the information is nevertheless effectively lost, at least until the information is decoded (ie gaining the information about coding). 
Just like energy isn't conserved, rather just it's expectation values, I think this is only the first expansion in a general kind of learning expansion, you can similarly see that expectation itself may fluctuate too. But you would not invoke such variations unless you have evidence suggesting it. The trick seems to be that of choosing the right patterns for conserved quantities to limit the expansion. I define my information about x, as my ability to predict x. Of course, I do not actually know until in retrospect wether my predictions were correct, but the learnin strategy seems convergent enough to be successful, so it means I can't KNOW how good I can predict x, I can again only estimate it. But reality repetadly proves the success of this strategy. Often a qualified guess, while not perfect, is extremtly successful. I consider the process of learning to be somewhat similar to that of reaching equilibrium. When there is nothing more to learn, I think you reach some kind of residual unresolvable uncertainty. And then, and only then, does the original QM assumptions of completenss make sense. Others are free to disagree of course. Anyway, I forgot what we were talking about? I guess I responded to your question on information, and I consider it to be quite fundamental, although subjective and relative, but still. Sorry for the ramblings. /Fredrik
  18. > That doesn't sound like "information" to me. That sounds like characteristics. Reality and information are different words, and perhaps different in a way, but what I suggest is that some qualifying information is so entangled with this supposed reality (being described by its characteristics) that they can't be distinguished. /Fredrik
  19. This is philosophy, but here are my "associations". Physical reality is coded in information; we inform ourselves about physical reality, and the qualifying evidence for "physical reality" is the pieces of information we consume. I.e. we have seen evidence supporting reality, or even forming it. Physical interactions <-> mutual information exchange; in this interpretation, an interaction is usually conflicting information making a mutual exchange; if there were no conflict, the interaction would be trivial (i.e. preservation of the previous state). Why doesn't a neutral particle respond to an electric field? From the point of view of the neutral particle, the electric field simply can't be seen. It's unaware of it, and thus won't respond. A neutral particle, by definition, has no way of *relating* to an electric field. So from the direct point of view of this neutral particle, there is no such thing as an electric field. Consider a poker game: except for the element of bluffing, all pro players play their cards based on the information they have, and from that try to predict the other players' actions by guessing what information they have. Etc. IMO the analogy is clear, and its implications possibly profound. I could be wrong, but this is my view. If you consider physical reality to be independent of information, then I am curious how you determine it, out of the infinite number of "possible realities". If there is qualifying information to pinpoint the actual reality, this qualifying information is seemingly deeply entangled with whatever this true reality is, right? No? /Fredrik
  20. Dstebbins, I think I know what you're saying. But the fact that we are humans, and that anything we do is sort of conditional on that fact, is nothing we can do anything about. But the information concept can certainly be generalized beyond the human brain. After all, the human brain is also part of the universe, right? And it happens to be one of the more complex dynamical systems we know of, which we also have a very good intuitive feeling about, so why not exploit it? > "the laws of physics would exist without humans, just without a language" This is true; however, they have been discovered by means of the human brain. We have learned. If you do not consider that fundamental, I am not sure what is. How do you think a particle learns about its environment? How does an electron know that it must deflect in a magnetic field? Couldn't it ignore it? Or is the difference that the human brain has a choice, and the electron doesn't? Well, that would IMO be an inconsistent view, because then the only one having a choice would be ME, not any other humans; from my point of view, all other humans must simply obey the laws of nature, complex as they are! And how does the human brain's dynamics work? Well, it responds to given information, action is taken upon that, and the result may lead to new data that is fed back into the brain via our senses. IMO the only consistent view is that information, or a generalized concept thereof, must be fundamental and have relational meaning beyond the human world: a particle needs to somehow be made "aware of", or be informed of, the magnetic field in order to respond. If the particle were completely uninformed about this field, it would ignore it - violating the laws of physics. So, intuitively, x having information about y must have some sort of connection to the physical interactions between x and y. /Fredrik
  21. > But where in the blue world did we get the scientific law that particles had information IN THEM in the FIRST PLACE? I always considered information to be intrinsically relational: a relation between the system and the observer. A relation between two particles reflects their mutual information about each other. In my thinking, I can't possibly make any sensible distinction between the qualifying information of x and x itself. I mean, it's kind of circular, and IMO ultimately the same thing, just put differently. /Fredrik
  22. Gib, I didn't respond to your threads because I think these concepts are difficult enough to ponder on your own, and even more difficult to discuss with others, as there is no common terminology, and my experience from long ago is that the discussions easily turn into some kind of thrashing where all effort is spent on trying to understand the meaning of words and concepts, with minimal progress made. That doesn't mean I think it's not interesting. Back in high school I read a book called "The Quantum Self"; it started out interesting and fun, but then got very fuzzy, so I think I never finished it. But no matter what I think about it (I don't remember much anyway), perhaps it could be of interest to you. I read (half of) this book before I had studied the QM formalism, so I never read it in the light of my current, improved state, and I doubt I would consider it worth the time now, since I'm onto other projects I rate as more promising. /Fredrik
  23. I guess I didn't build all that much really; I just tried to find cheap do-it-yourself solutions for a few things, mostly related to my yeast/fermentation projects on an as-needed basis, whose emphasis was really the computer simulation, and a learning study of myself. The whole point was that I had avoided all biology in my previous life, so it was an interesting journey where I learned a lot, set aside the yeast itself. I did consider trying to build some kind of chromatograph by making a home-made column and detector, but never continued the testing. I exploited a simple diabetes blood sugar meter for beer analysis: the enzyme assay based on GD/PQQ, designed to measure blood glucose, is also sensitive to other reducing sugars like maltose and maltotriose, which happens to be just what I want. So I've made very simple beer analyses with a plain blood sugar meter. It has test sticks; you put them in the digital meter and get a reading in a few seconds. I did a series and concluded that the standard deviation of the method was bad, but with a larger series it was usable. Easier than titrations, as it takes seconds. I've toyed with staining as well: a qualitative estimation of the glycogen level in the yeast cells was made with iodine staining. Then you can hold a normal digital camera over the microscope and do RGB analysis on the image. I also had some ideas to derive measurements from simple conductivity and capacitance measurements, but things got in the way, so I never completed it. Of course, all the normal sensors are easy to hook up: pressure and temperature. You can also build your own scales with strain gauges; commercially, the most expensive part is the conditioning amplifier, unless you build your own. I got some IR sensors that I was hoping to do something fun with, but never got to it either. I think I got one of those magnetic sensors as well, same story. Check your electronic component vendor for fun, usable components. 
But now, since I resumed the physics project, I have no time for the fermentation project, so it's frozen at the moment. /Fredrik
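The statistics of such a measurement series can be sketched as follows. The readings are made-up numbers, not actual meter data; the point is that averaging n readings shrinks the random error by roughly sqrt(n), which is why a larger series made a noisy method usable:

```python
import math

def mean_and_stdev(readings):
    """Sample mean and sample standard deviation of repeated readings."""
    n = len(readings)
    mean = sum(readings) / n
    var = sum((x - mean) ** 2 for x in readings) / (n - 1)
    return mean, math.sqrt(var)

def stderr_of_mean(readings):
    """Standard error of the mean: noise shrinks by sqrt(n) on averaging."""
    _, s = mean_and_stdev(readings)
    return s / math.sqrt(len(readings))

# Hypothetical series of meter readings (arbitrary units).
readings = [5.2, 4.8, 5.5, 4.9, 5.1, 5.3, 4.7, 5.0]
mean, s = mean_and_stdev(readings)
sem = stderr_of_mean(readings)
```

With 8 readings the uncertainty on the mean is already under half the single-reading scatter, and a longer series improves it further.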
  24. Here is an old pic of my simple home-made bubble counter that I made for my beer project. I also made an acoustic prototype, which was basically a microphone; the characteristics of the sound pulse could distinguish the sound of a bubble from ambient noise (human talk etc.). Both prototypes worked, but they were sensitive and a hassle to calibrate, so I ended up finding a pro mass flow meter on eBay instead. However, the beer projects are now put on ice, as they take too much time. The flow meter was to track the beer fermentation process and get gas flow vs. time. In parallel, I am/was working on a computer simulation of the fermentation process, including yeast growth and metabolism. Usually the fun part is solving the problem; once the problem is solved, it's boring and you never use it again. /Fredrik
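The core logic of such a bubble counter can be sketched as a threshold detector with rising-edge detection, so one bubble produces one count even if it stays above the threshold for several samples. The trace below is a made-up example, not data from the actual prototype:

```python
def count_pulses(samples, threshold):
    """Count rising edges: a pulse is registered each time the signal
    crosses the threshold from below. Staying above the threshold for
    several samples still counts as one pulse."""
    count, above = 0, False
    for s in samples:
        if s > threshold and not above:
            count += 1
            above = True
        elif s <= threshold:
            above = False
    return count

# Hypothetical sensor trace: three bubble events over a noisy baseline.
trace = [0.1, 0.2, 0.9, 0.8, 0.1, 0.2, 1.0, 0.1, 0.0, 0.7, 0.9, 0.2]
n_bubbles = count_pulses(trace, threshold=0.5)
```

A real version would add hysteresis (separate rise/fall thresholds) to cope with the calibration sensitivity mentioned above.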
  25. Usually there are different components. First you have some kind of transducer that converts a "physical" quantity into an electric quantity. Then you have a signal conditioner and an amplifier that tune the signal up into a voltage that goes into the A/D converter. Then you usually have a computer interface so you can log it. I've done a lot of toying around, and there is a lot you can do at home. First I'd get a simple PC-connected A/D converter, say a USB device or something. Then you can toy around with the conditioning and transducers. I've made pulse counters (from diodes and photosensors) to be used as flow meters, and I've made simple photometers using the same components, with which you can measure absorbance. I've also hooked up various gas sensors, for example the type of sensor that senses EtOH vapours in alcohol meters; these you can buy for $2 and hook up. I was able to get amazing resolution just hovering one over a glass of beer. Of course, to handle long-term stability and temperature drift you may need fancier electronics and quality components. You can hook up home-made electrochemical cells. Electronic component vendors usually have a section for single-chip sensors, which are fun: gas sensors, photosensors, humidity sensors. Those you can usually buy for a few $, and if you are handy and have a soldering iron you can have fun with little money. There is a lot you can toy with at home for not a lot of money, but I don't know of any guides, though. Good luck. /Fredrik
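The absorbance calculation behind such a home-made photometer is the Beer-Lambert relation A = -log10(I / I0), comparing the light reaching the photosensor through the sample (I) against a blank reference (I0). A minimal sketch with hypothetical sensor readings:

```python
import math

def absorbance(intensity, reference_intensity):
    """Beer-Lambert absorbance: A = -log10(I / I0)."""
    return -math.log10(intensity / reference_intensity)

# Hypothetical photosensor counts: a blank reading of 1000 and a sample
# reading of 100 means 10% transmission, i.e. an absorbance of 1.0.
A = absorbance(100.0, 1000.0)
```

Since absorbance is proportional to concentration (at fixed path length and wavelength), calibrating against a couple of known standards turns the raw photosensor voltage into a concentration measurement.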