Everything posted by Severian
-
My Halo alias is obvious. Unfortunately I am completely crap at it.
-
Dark matter research (Edinburgh conf. this week)
Severian replied to Martin's topic in Astronomy and Cosmology
Unfortunately I could not go to all of the conference because I am in the middle of moving house, so I was just there for the day of my talk. The transparencies for the talks are on the web site too. If you want an introductory overview, I would recommend the talks of the first day, and in particular Keith Olive and Leszek Roszkowski.

You should be aware, though, that DM researchers have a bad habit of placing exclusions at 2 sigma. The WMAP data then rules out a lot of the models for DM. This is very unfair in my opinion, since 2 sigma is far too low. Saying a model is ruled out at 2 sigma is only saying that it is ruled out at 95.45% confidence; statistically, you would expect a correct model to be 'ruled out' in one of every 20 measurements! In contrast, in particle physics exclusion (ruling something out) is done at 3 sigma (99.73%), and discovery (definitively saying something has been discovered) is announced with 5 sigma (99.99994%) confidence. Changing the exclusion limits on the plots from 2 to 3 sigma would make a substantial difference.
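These confidence levels are just the two-sided coverage of a Gaussian, so they are easy to check yourself. A minimal sketch using only the Python standard library (my own illustration, not from the conference talks):

```python
from math import erf, sqrt

def two_sided_confidence(n_sigma):
    """Fraction of a Gaussian distribution lying within +/- n_sigma of the mean."""
    return erf(n_sigma / sqrt(2))

# the exclusion/discovery thresholds discussed above
for n in (2, 3, 5):
    print(f"{n} sigma -> {two_sided_confidence(n):.5%}")
```

Running this reproduces the 95.45%, 99.73% and 99.99994% figures quoted above.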
Space-time and the thermodynamic arrow of time.
Severian replied to Sorcerer's topic in Modern and Theoretical Physics
Yes - possibly. The baryon asymmetry (the dominance of matter over antimatter) is not properly understood. Our models don't have enough CP violation to account for it (this is one of the BIG questions in physics right now, since the Big Bang should have produced matter and antimatter in equal amounts). And K0 decays are indeed one place where CP violation is seen.

The world would be P symmetric if you could take the laws of physics and substitute x -> -x, y -> -y, z -> -z (where x, y and z are the usual Cartesian coordinates) and get exactly the same laws back. So F=ma, for example, is parity conserving. As swansont said, neutrinos violate parity because they are all left handed - there are no right handed neutrinos. A parity transformation would map a left handed neutrino onto a right handed one, so clearly the universe is not P conserving.

I am not sure what you mean by that. A GUT does not include gravity though - a GUT is a union of the three lower energy forces (the weak nuclear force, the strong nuclear force and electromagnetism). A theory of everything (TOE) would include gravity.
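The parity invariance of F=ma can be checked mechanically: flipping every Cartesian component flips the sign of both the acceleration and the force, so the law keeps exactly the same form. A toy sketch (my own illustration, arbitrary numbers):

```python
m = 2.0                                  # an arbitrary mass
a = (1.0, -0.5, 3.0)                     # an arbitrary acceleration vector
F = tuple(m * ai for ai in a)            # F = ma in the original coordinates

# parity: (x, y, z) -> (-x, -y, -z), so both vectors flip sign
a_mirror = tuple(-ai for ai in a)
F_mirror = tuple(-Fi for Fi in F)

# the law has exactly the same form in the mirrored coordinates
assert F_mirror == tuple(m * ai for ai in a_mirror)
print("F = ma is parity invariant")
```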
Yes, the electron and positron are virtual - they are not on mass-shell, so [math] E_e^2-p_e^2c^2 \neq m_e^2 c^4[/math]. As swansont says (sort of), they are allowed to be off-shell only for a short time (inversely proportional to how off-shell they are). Try the math if you like: you will see that you cannot have [math]\gamma \to e^+e^-[/math] while keeping [math]E_{\gamma}=p_{\gamma} c[/math] and [math]E_e^2-p_e^2c^2 = m_e^2 c^4[/math]. For your HE gamma rays, you have [math] \gamma \gamma \to e^+e^-[/math]. The extra photon lets you keep all the (external) particles on shell.
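A numerical sketch of that exercise (natural units with c = 1, energies in GeV, approximate electron mass; my own illustration): the invariant mass of any on-shell e+e- pair is at least 2m_e, while a single on-shell photon has invariant mass exactly zero, so the four-momenta can never balance.

```python
from math import sqrt

M_E = 0.000511  # electron mass in GeV (approximate)

def four_momentum(m, px, py, pz):
    """On-shell four-momentum (E, px, py, pz) with E^2 = m^2 + |p|^2."""
    return (sqrt(m * m + px * px + py * py + pz * pz), px, py, pz)

def invariant_mass(particles):
    """sqrt(E^2 - |p|^2) of the summed four-momentum."""
    E, px, py, pz = (sum(c) for c in zip(*particles))
    return sqrt(max(E * E - px * px - py * py - pz * pz, 0.0))

# any on-shell e+ e- pair has invariant mass >= 2 m_e > 0 ...
pair = [four_momentum(M_E, 0.1, 0.0, 0.0), four_momentum(M_E, 0.0, 0.2, 0.0)]
print(invariant_mass(pair))

# ... but a single on-shell photon has invariant mass exactly 0, so one
# photon cannot become an on-shell pair while conserving four-momentum
print(invariant_mass([four_momentum(0.0, 0.3, 0.0, 0.0)]))
```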
-
I find that surprising. I had thought one of the major problems of quantum gravity was that it had too many non-renormalizable divergences. There are very few theories which need no renormalization (N=4 supersymmetry is one). Maybe I should go read up on LQG... (strangely enough it is a subject which I hear very little about). That sounds very like renormalization under a different name....
-
I need to get a new laptop, and I was thinking about a Dell Latitude D800. Has anyone tried this machine? Anyone have any advice? I have decided against having a truly dinky laptop since I will be doing the majority of my work on this. It doesn't really need to be terribly fast - I don't do anything graphics intensive. Battery life is important but 2-3hrs is probably enough.
-
Space-time and the thermodynamic arrow of time.
Severian replied to Sorcerer's topic in Modern and Theoretical Physics
Entropy is a macroscopic statistical quantity. You can't talk about statistics for one particle, so one particle doesn't have a defined entropy. Entropy only makes sense when talking about ensembles of particles.

When you discuss macroscopic quantities you are talking about averages of the individual particle properties. So even though these properties are quantum mechanically uncertain, the law of large numbers tells you that this uncertainty will become very small when you take the average of a large number of states. A lot of uncertain quantum properties give you your macroscopic classical measurement. Statistically, multiparticle states become more disordered (because phase space opens up) and entropy increases. But you are right: in principle, all the particles in a large ensemble could suddenly line up and dance the tango (so to speak) - the probability will be very low but not necessarily zero. (The probability of all the oxygen molecules in the air around you tunnelling into the next room and suffocating you is also non-zero.) Things only become more disordered on average, but since the universe contains a heck of a lot of particles, the increase of entropy with time is a very very good macroscopic premise. So the entropy 'arrow of time' is inescapably linked to the space-time definition of time as the forward (or backward) light cone.

Incidentally, you have missed the obvious quantum level 'arrow of time': CP violation. We know that CPT is conserved (i.e. that a charge conjugation (C) (swapping matter for antimatter) followed by a parity switch (P) (looking in a mirror) followed by reversing the direction of time (T) leaves the laws of physics unchanged), so CP violation (which has been observed in nature) implies that T is not conserved. The laws of nature are not the same in the reverse time direction.
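That averaging is easy to simulate. In this sketch (my own illustration, plain Python standard library), the fluctuation of the mean of N random +/-1 'spins' shrinks roughly like 1/sqrt(N), which is why macroscopic averages look sharp even though each microscopic value is uncertain:

```python
import random
random.seed(1)

def mean_fluctuation(n_particles, n_trials=2000):
    """Standard deviation of the average of n_particles random +/-1 spins."""
    means = []
    for _ in range(n_trials):
        total = sum(random.choice((-1, 1)) for _ in range(n_particles))
        means.append(total / n_particles)
    mu = sum(means) / n_trials
    return (sum((m - mu) ** 2 for m in means) / n_trials) ** 0.5

print(mean_fluctuation(10))    # roughly 1/sqrt(10), i.e. ~0.3
print(mean_fluctuation(1000))  # roughly 1/sqrt(1000), i.e. ~0.03
```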
As you point out, the nice thing about natural units is that you simplify equations. But presumably one cannot simplify all equations at once, so it is best to simplify the ones which are most fundamental. So, for example, you still have c=1, which is good.

But G is not really very fundamental. It is the gravitational coupling constant. It may seem fundamental when using GR, but eventually we will have a quantum theory of gravity where it becomes a renormalised coupling constant. We see this already in your beloved Loop Quantum Gravity. Renormalisation introduces an extra energy (or distance) scale into the equations (something called dimensional transmutation) and then coupling constants change with energy. The electromagnetic coupling [math]\alpha[/math] changes from 1/137 at eV scales to about 1/128 at the Z mass (around 90 GeV). So I expect that G changes too. So even if you pick one value of 'G' to set to one (or 8 pi or whatever) you will still generate numbers in your equations. I don't see the point in this.

The gravitational constant is [math]G = 6.707 \times 10^{-39} \hbar c ({\rm GeV}/c^2)^{-2} = (1.221 \times 10^{19} {\rm GeV})^{-2}[/math] where I have set [math]\hbar c=1[/math]. This defines the Planck length and Planck energy (so G is the Planck energy to the power of -2). But why should gravity be the force that we use for units? Why not use the Fermi constant, [math]G_F = 1.16639 \times 10^{-5} (\hbar c/{\rm GeV})^2 = (292.8 {\rm GeV})^{-2}[/math], which describes the strength of electroweak interactions (at a particular scale)? In fact, our current energy unit (the eV) is nice because it is fundamental to two of the forces: electromagnetism (obviously) but also QCD, where coincidentally 1 GeV is the hadronisation scale. Once we have a theory of everything, we may then have one force with one characteristic distance scale, but until then, isn't any choice we make rather arbitrary?
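Since each coupling is an inverse energy squared in natural units, the two characteristic scales quoted above can be checked directly (a trivial sketch, values copied from the post):

```python
from math import sqrt

# couplings in natural units (hbar = c = 1), energies in GeV
G_NEWTON = 6.707e-39   # gravitational constant, GeV^-2
G_FERMI = 1.16639e-5   # Fermi constant, GeV^-2

planck_energy = 1 / sqrt(G_NEWTON)   # characteristic scale of gravity
fermi_scale = 1 / sqrt(G_FERMI)      # characteristic electroweak scale

print(f"Planck energy ~ {planck_energy:.4g} GeV")    # ~1.221e19 GeV
print(f"Electroweak scale ~ {fermi_scale:.4g} GeV")  # ~292.8 GeV
```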
-
-
This seems especially ironic coming from someone with Arnie as his avatar. (But then again, who am I to talk -- I have a cow! mooo!)
-
I always get pissed off when America describes itself as a democracy. It is not a democracy - it is a federal republic. A democracy is where the decisions of government are made by the people through a democratic vote - not where the people vote for a candidate to represent them in government. The closest we have right now to a true democracy is Switzerland. It may be possible to make a 'true' democracy in the future by setting up some kind of internet voting on issues, but would we really want that? Is democracy really the best form of government? Even though Hitler's election was a sham, he did have vast popular support in Germany. The majority is not always right. I don't like having democracy itself as a goal, a principle to achieve, or even a desire. I think good government should be the desire, and if democracy is part of that, then fine, but if it isn't, that's fine too.
-
I wouldn't be surprised if Kerry was arrested for this. It is just the sort of thing that Bush would do to 'win' the election (we all saw what he got up to last time). Bush is way ahead in the polls now. I have already stated that I will not visit the US while Bush is in power, so it looks like I will not be going there for quite some time. Frankly, if the American public is stupid enough to vote Bush back in, they deserve whatever they get....
-
Space-time and the thermodynamic arrow of time.
Severian replied to Sorcerer's topic in Modern and Theoretical Physics
The thermodynamic arrow of time is just an observation that phase-space opens up when particles interact. In other words: if I have one particle in the universe, the only quantities which are important are its internal quantum numbers, since there is no external reference to compare the momentum (or even mass) of the particle with. If the particle decays into 2 particles, they will be back-to-back in the CM frame and have a pre-determined magnitude of momentum and energy, but the direction they fly off in is random. So the system needs at least one more variable to describe it. The 'phase-space' for the new state is bigger than the old one - the number of possible configurations of the universe increases with each interaction. While it is possible (statistically speaking) for the universe to return to the initial state, it does not happen because there are so many other possible states opening up that the probability of returning to the initial state becomes (effectively) zero.
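The pre-determined momentum magnitude comes from the standard CM-frame two-body decay formula. A small sketch (natural units with c = 1, arbitrary illustrative masses of my own choosing):

```python
from math import sqrt

def decay_momentum(M, m1, m2):
    """|p| of either daughter in the two-body decay M -> m1 + m2,
    evaluated in the CM frame with c = 1 (standard kinematic formula)."""
    return sqrt((M**2 - (m1 + m2)**2) * (M**2 - (m1 - m2)**2)) / (2 * M)

# e.g. a parent of mass 1.0 decaying to two daughters of mass 0.1 each:
p = decay_momentum(1.0, 0.1, 0.1)
E1 = sqrt(0.1**2 + p**2)   # each daughter's energy
print(p, 2 * E1)           # momentum magnitude and total energy are fixed;
                           # only the flight direction remains free
```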
People here seem to have too rigid an idea of mass. The photon, for example, is continually shifting between a normal photon and an electron-positron pair: [math]\gamma \to e^+e^- \to \gamma \to e^+e^- \to \gamma \to e^+e^- \to \gamma[/math] etc. So in a way, we are changing energy into mass (and back) all the time.

In fact, normally this sort of shifting would give a quantum correction to the particle mass. For example, the Higgs boson does [math]H \to t \bar t \to H \to t \bar t \to H \to t \bar t \to H \to t \bar t \to H[/math] etc. When we measure the Higgs boson mass (in the future!) this would cause a shift in the effective Higgs mass, since one cannot in principle distinguish the Higgs boson from a highly virtual top--anti-top pair. In the case of the photon though, when one calculates the quantum correction to the mass, one finds that it is zero, keeping the photon massless. At first sight this is surprising, but on deeper examination one realises that the photon is kept massless by a symmetry: the local U(1) symmetry of electromagnetism. Since the original system is symmetric under U(1), the quantum corrections must be too and the photon remains massless. In fact, pretty much any massless particle must have an associated symmetry, since otherwise quantum corrections would give it a finite mass. The gluon for example is massless because of the SU(3) color symmetry of QCD.
-
OK - I understand the 100 GeV bound now. This is for the NMSSM, not the MSSM. In next-to-minimal models the Higgs boson can be considerably lighter than the LEP experimental 'bounds' because the coupling to the Z boson can be much less. This is because the main LEP channel for producing a Higgs would be [math]e^+e^- \to Z^* \to ZH[/math]. By reducing the ZZH coupling, you reduce the cross-section of the above reaction (known as Higgs-strahlung) and therefore LEP would not have seen it. Coincidentally, I pointed this out first in hep-ph/0403137 (so now you know who I am). I can't believe these idiots didn't cite my paper - they definitely know about it because Hugonie has cited it before - they are just being jerks. Anyway, politics aside, they are saying that in these 'interesting scenarios' where the Higgs is so light, the lightest neutralino is in the mass range 50-100 GeV. In other words, they are not saying that it is an upper bound on the neutralino - just that it has to be light if you want such a light Higgs boson (this is basically because it is the scale [math]\mu[/math] which sets both masses).
-
Yes - that is true. It is unobserved. The reason I believe it to be dark matter is mainly because I believe supersymmetry exists. And if supersymmetry with R-parity exists, then the LSP will contribute to dark matter. The fact that a neutralino of 100 GeV or so would provide almost exactly the right amount of dark matter to explain the WMAP data is extra evidence. Can this just be coincidence?

The lower bound is easy - since we have been looking for it and not found it, it must be heavier than 50 GeV or so. If it were lighter than this we would have seen it at LEP. The upper bound is more tricky to explain. Firstly there is the WMAP data: if the neutralino is too heavy, then it gives too much mass to the universe and disagrees with the WMAP 'relic density' measurements. But if I recall correctly, this is not 100 GeV but quite a bit more (you can play tricks like having the neutralino be half the mass of a heavy Higgs boson, to increase its annihilation rate and thus reduce its contribution to the dark matter density), maybe 200 GeV or so. Then there are theoretical prejudices. For example, if one believes in unification of masses at the GUT scale (very high energies), then the LSP neutralino becomes quite light in comparison to other susy particles (this is because it is mainly a bino, so doesn't feel the SU(3) or SU(2) forces which would push the mass up). Then, if you want supersymmetry to solve phenomenological problems at low energies, there is an upper limit on the susy masses, placing an upper limit on the neutralino mass. As I said though, these are prejudices and don't have to be so. Where did you get the 100 GeV figure?

Hmm... I wasn't 100% happy about it, but it went OK I suppose.
-
Thales talked about a neutrino, rather than a neutralino, and he is correct - the neutrino would not explain the dark matter sitting in the halo of galaxies. Neutrinos could only explain part of the dark matter which is needed to explain the WMAP data (which is a lot more than galactic halos or MACHOs can explain).

Neutralinos on the other hand are (as Martin said) heavy. They are actually supersymmetric particles. If supersymmetry is true then every particle gets a partner which differs by spin 1/2. So the spin 1 gauge bosons (the W, Z, gluon and photon) all get partners which are spin 1/2 called gauginos, and the Higgs bosons get partners which are spin 1/2 called higgsinos. Now the neutral gauginos and higgsinos all have the same quantum numbers, so what you see in reality(?) is a mixture of them called neutralinos -- part higgsino and part gaugino. In minimal supersymmetry, there are 4 neutralinos: 2 gauginos (partners of the Z and the photon) and 2 higgsinos (partners of the Higgs bosons h and H). In minimal supergravity, the lightest neutralino is mainly the gaugino partner of the photon (to be more precise it is usually mainly the partner of the U(1) hypercharge boson, called the bino).

Supersymmetry has an odd property called R-parity, which is needed to stop protons decaying. R-parity says that if a supersymmetric particle decays, it must produce another supersymmetric particle. Since the lightest neutralino is often the lightest supersymmetric particle (LSP), it cannot decay and is stable. This is what makes it a good (cold) dark matter candidate.

The paper that Martin quotes is not a very good one for reading up on this because it is about a non-minimal model called the NMSSM (Next-to-Minimal Supersymmetric Standard Model). In that model, there is an extra Higgs boson and its susy partner (a higgsino) is the LSP. A higgsino LSP has different properties from a bino-like LSP, so this is not very standard. It is an interesting model though.
It also has problems with domain walls in the early universe. You can read up on this theory in a slightly more pedagogic form in hep-ph/0304049 and hep-ph/0407209. Alternatively you can read up about Cold Dark Matter at the LHC in hep-ph/0406147 and hep-ph/0403047.
-
Dark matter is probably a neutralino [math]\tilde \chi_1^0[/math]. (I am giving a talk on this at the 5th International Workshop on the Identification of Dark Matter, tomorrow.)
-
You have quark and lepton the wrong way round. In my opinion, it shouldn't be based on post count. If you post a particularly interesting scientific point, you should be awarded a point by a mod. Then people who post science discussions, as is the intention of the forum, would be rewarded, while spam and chit-chat wouldn't be. I think the mods here know enough and are open-minded enough for that to work.
-
Very true. The 5th International Workshop on the Identification of Dark Matter is on in Edinburgh next week.
-
10-20 eV/c^2 if you insist.
-
Should Russia declare an all out war on Chechen Rebels
Severian replied to bloodhound's topic in Politics
They don't need to invade Chechnya. They are already there!