Anamitra Palit Posted December 19, 2020

The covariant derivative of a rank-one contravariant tensor is a mixed tensor of rank two. Considering its transformation, we arrive at a conflicting result [a discrepancy] in the enclosed paper. Equation (3) should not be valid for an arbitrary tensor.

Transformation_Covariant_Derivative.pdf
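For readers who do not open the attachment, the object under discussion is the standard covariant derivative of a contravariant vector (textbook definition, not a reproduction of the paper's equations):

\begin{equation}
\nabla_{\nu}A^{\mu}=\frac{\partial A^{\mu}}{\partial x^{\nu}}+\Gamma^{\mu}_{\;\nu\lambda}A^{\lambda}
\end{equation}

With one free upper and one free lower index, it transforms as a rank-two mixed tensor.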
joigus Posted December 19, 2020

There is no conflict in the operation of taking covariant derivatives of any tensor, at any order of derivation. Again, can you explain here, instead of linking to a document, please?
Markus Hanke Posted December 19, 2020

1 hour ago, Anamitra Palit said:
Considering its transformation we arrive at a conflicting result [discrepancy]

As joigus said, there are no conflicts or discrepancies at all. As you correctly say, the covariant derivative yields a tensor, so it automatically has the correct transformation properties. This is so pretty much by definition, because otherwise it wouldn't be a covariant derivative at all!
Anamitra Palit Posted December 19, 2020 (Author)

The covariant derivative is indeed a tensor. The example in the attached paper considers the usual transformation of a rank-two mixed tensor, and this leads to an equation revealing a discrepancy [equation (3) of the paper]. Thus standard theory has been used to expose a discrepancy, so something must be wrong with the standard theory itself. Otherwise, why should the discrepancy be there?

Edited December 19, 2020 by Anamitra Palit
studiot Posted December 19, 2020

1 hour ago, Anamitra Palit said:
The covariant derivative is indeed a tensor. The example in the attached paper considers the usual transformation of a rank two mixed tensor. And this leads to an equation revealing a discrepancy [equation (3) of the paper]. Thus standard theory has been used to project a discrepancy. Therefore something must be wrong with the standard theory itself. Otherwise why should the discrepancy be there?

If you are able to work with tensors, you should be able to take a tensor and post here (not a link to something) a simple example of the alleged discrepancy.
Markus Hanke Posted December 20, 2020

20 hours ago, Anamitra Palit said:
Otherwise why should the discrepancy be there?

There is no “discrepancy”, so the point is moot.

20 hours ago, Anamitra Palit said:
Thus standard theory has been used to project a discrepancy. Therefore something must be wrong with the standard theory

Non-sequitur. If you arrive at some kind of “discrepancy”, then that means you did something wrong, plain and simple. Tensor calculus is not a “theory”; it's a mathematical framework that has been extensively developed, and is fully self-consistent.
Anamitra Palit Posted December 20, 2020 (Author)

Markus Hanke remarked: "Non-sequitur. If you arrive at some kind of “discrepancy”, then that means you did something wrong, plain and simple. Tensor calculus is not a “theory”; it's a mathematical framework that has been extensively developed, and is fully self-consistent."

The important point is that the conflict exists as projected in the paper. Can he refute the contradiction shown in the paper by finding errors in my calculation? It is clear that Markus Hanke has not found any error in my calculations, which, incidentally, are not of a lengthy nature. If standard theory happens to be correct everywhere, as Markus Hanke believes, he should be able to point directly to errors in my calculations. If he had really found such errors, he would have been vocal about them.

Edited December 20, 2020 by Anamitra Palit
joigus Posted December 20, 2020

2 hours ago, Markus Hanke said:
There is no “discrepancy”, so the point is moot. Non-sequitur. If you arrive at some kind of “discrepancy”, then that means you did something wrong, plain and simple. Tensor calculus is not a “theory”; it's a mathematical framework that has been extensively developed, and is fully self-consistent.

I stand by every word here. There are many sources of possible mistakes. A very common one is forgetting that \( g^{\mu \nu} \) is the inverse of \( g_{\mu \nu} \). It is always the best idea to go over the calculation again, instead of believing you've found a shortcut to the Nobel Prize. It's a natural rite of passage. You do it in the abstract, with indices; you do it with polar coordinates on the plane; you do it with hyperbolic polar coordinates. You convince yourself that it's correct. You turn the contravariant-to-covariant "switch" on and off. It still works... Oh my. It must be true. That's the path.
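As an illustration of the polar-coordinate check joigus describes, here is a minimal SymPy sketch (SymPy assumed available; the vector field is an arbitrary choice, purely for illustration) verifying that the covariant derivative of a vector field on the flat plane transforms as a rank-two mixed tensor between Cartesian and polar coordinates:

```python
# Minimal SymPy sketch of the polar-coordinate check: verify that the covariant
# derivative of a vector field on the flat plane transforms as a rank-two
# mixed tensor between Cartesian and polar coordinates.
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
q = [r, th]

# Cartesian coordinates as functions of the polar ("barred") ones, and the Jacobians.
x_of_q = sp.Matrix([r*sp.cos(th), r*sp.sin(th)])
J = x_of_q.jacobian(q)        # dx^alpha / d xbar^mu
Jinv = J.inv()                # d xbar^mu / dx^alpha

# Arbitrary smooth vector field, given by its Cartesian components (illustrative choice).
x, y = x_of_q
V_cart = sp.Matrix([x*y, x - y])

# In Cartesian coordinates the Christoffel symbols vanish, so the covariant
# derivative is just the partial derivative; use the chain rule since V is
# written in terms of (r, theta):  dV^alpha/dx^beta = (dV^alpha/dq^nu)(dq^nu/dx^beta).
T_cart = sp.simplify(V_cart.jacobian(q) * Jinv)

# Polar components of the vector:  Vbar^mu = (d xbar^mu / dx^alpha) V^alpha.
V_pol = sp.simplify(Jinv * V_cart)

# Polar metric g = diag(1, r^2) and its Christoffel symbols.
g = sp.diag(1, r**2)
ginv = g.inv()
Gamma = [[[sp.simplify(sum(ginv[l, s]*(sp.diff(g[s, m], q[n]) + sp.diff(g[s, n], q[m])
                                       - sp.diff(g[m, n], q[s]))/2 for s in range(2)))
           for n in range(2)] for m in range(2)] for l in range(2)]

# Covariant derivative in polar coordinates:
# Tbar^mu_nu = d Vbar^mu / d xbar^nu + Gamma^mu_{nu lambda} Vbar^lambda.
T_pol = sp.Matrix(2, 2, lambda m, n: sp.diff(V_pol[m], q[n])
                  + sum(Gamma[m][n][l]*V_pol[l] for l in range(2)))

# Rank-two mixed tensor transformation law applied to the Cartesian result:
# (d xbar^mu / dx^alpha)(dx^beta / d xbar^nu) T^alpha_beta.
T_transformed = Jinv * T_cart * J

print(sp.simplify(T_pol - T_transformed))   # expect the zero matrix
```

The difference comes out as the zero matrix, i.e. computing the covariant derivative in polar coordinates and transforming the Cartesian result give the same object, as the tensor law requires.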
Markus Hanke Posted December 21, 2020

22 hours ago, Anamitra Palit said:
Can he refute the contradiction shown in the paper by finding errors with my calculation?

I will not be opening files from untrustworthy sources, and you have failed to present it here, so I haven't seen what you did. However, the details are irrelevant, because the very fact that you arrived at a contradiction is the error. When you start with a 4-vector - which is a rank-1 tensor - and take the covariant derivative, the result is a rank-2 tensor. Since both of these objects are tensors, they are subject to the same general transformation law, so there cannot be any contradiction. You're simply mapping a rank-n tensor into a rank-(n+1) tensor. If you arrived at anything else, then the error is logically yours; there is nothing to be refuted here.
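Written out explicitly, the transformation law both objects obey is the standard one (barred quantities referring to the new coordinates):

\begin{equation}
\bar{\nabla}_{\nu}\bar{V}^{\mu}=\frac{\partial\bar{x}^{\mu}}{\partial x^{\alpha}}\,\frac{\partial x^{\beta}}{\partial\bar{x}^{\nu}}\,\nabla_{\beta}V^{\alpha}
\end{equation}

One Jacobian factor per index, exactly as for any other rank-two mixed tensor.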
Anamitra Palit Posted December 22, 2020 (Author)

Difficulties with the theory of tensors: the transformation of a rank-two contravariant tensor has been considered. We then consider a diagonal tensor [off-diagonal components are zero: A^(mu nu) = 0 for mu not equal to nu] to bring out a result that all tensors should be null tensors. A link to the Google Drive file is provided below; a file has also been attached using the attachment facility provided by the forum.

https://drive.google.com/file/d/10z63Xidgs3m8p04_C6ZiGh-8Q6KTwPsh/view?usp=sharing

Incidentally, I tried the LaTeX with the code button, but I am not getting the correct preview. Example:

\begin{equation}\bar{A}^{\mu\nu}=\frac{\partial \bar{x}^{\mu}}{\partial x^{\alpha}}\frac{\partial \bar{x}^{\nu}}{\partial x^{\beta}}A^{\alpha \beta}\end{equation}

\[{\ bar{A}}^{\mu\nu}=\frac{\partial {\bar{x}}^{\mu}}{\partial x^{\alpha}}\ frac{\partial {\bar{x}}^{\nu}}{\partial x^{\beta}}\]

For a long time now I have not been a frequent user of LaTeX, thanks to the equation editor of MS Word. But, like many others, I do appreciate the use of LaTeX in various forums. Help regarding LaTeX is requested from the forum.
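As a concrete check of the quoted transformation law (independent of the linked file, which is not reproduced here), this short SymPy sketch applies it to a tensor that is diagonal in Cartesian coordinates, re-expressed in plane polar coordinates; the components 1 and 2 are an arbitrary illustrative choice:

```python
# Apply the rank-two contravariant transformation law to a tensor that is
# diagonal in Cartesian coordinates, re-expressing it in plane polar coordinates.
import sympy as sp

r, th = sp.symbols('r theta', positive=True)

x_of_q = sp.Matrix([r*sp.cos(th), r*sp.sin(th)])
J = x_of_q.jacobian([r, th])      # dx^alpha / d xbar^mu
Jinv = J.inv()                    # d xbar^mu / dx^alpha

A_cart = sp.diag(1, 2)            # diagonal in Cartesian coordinates (illustrative values)

# Abar^{mu nu} = (d xbar^mu / dx^alpha)(d xbar^nu / dx^beta) A^{alpha beta}
A_polar = sp.simplify(Jinv * A_cart * Jinv.T)
print(A_polar)                    # off-diagonal entries are generally non-zero
```

The result generally has non-zero off-diagonal components, so diagonality is a coordinate-dependent property rather than something preserved by the transformation law; whether this bears on the argument in the linked file cannot be checked here.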
Markus Hanke Posted December 23, 2020

On 12/22/2020 at 7:13 AM, Anamitra Palit said:
The transformation of a rank two covariant tensor has been considered. Then we proceed to consider a diagonal tensor [off diagonal components are zero: A^(mu nu)=0 for mu not equal to nu] to bring out a result that all tensors should be null tensors.

I'm sorry, but that doesn't make any sense at all. The tensor transformation law applies irrespective of the value of individual components in a particular coordinate basis, so saying that all tensors should be null tensors on account of their transformation law is an entirely meaningless statement. To pick just one random example, the Minkowski tensor evidently transforms as a rank-2 tensor, but it most certainly isn't null, regardless of what coordinate basis you choose.
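To make that example explicit, here is a quick NumPy sketch (the boost speed v = 0.6c is a hypothetical value chosen purely for illustration) showing that the Minkowski tensor keeps its non-null components under a Lorentz boost, exactly as the rank-2 transformation law demands:

```python
# Check that the Minkowski tensor transforms as a rank-2 tensor under a
# Lorentz boost and keeps its non-null components.
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])       # Minkowski tensor, signature (-,+,+,+)

v = 0.6                                    # boost speed in units of c (illustrative value)
gamma = 1.0 / np.sqrt(1.0 - v**2)
L = np.identity(4)                         # Lorentz boost along the x-axis
L[0, 0] = L[1, 1] = gamma
L[0, 1] = L[1, 0] = -gamma * v

# eta'_{mu nu} = Lambda^alpha_mu Lambda^beta_nu eta_{alpha beta}
eta_transformed = L.T @ eta @ L
print(np.allclose(eta_transformed, eta))   # True: same, manifestly non-null, components
```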
joigus Posted December 23, 2020

2 hours ago, Markus Hanke said:
I'm sorry, but that doesn't make any sense at all. The tensor transformation law applies irrespective of the value of individual components in a particular coordinate basis, so saying that all tensors should be null tensors on account of their transformation law is an entirely meaningless statement. To pick just one random example, the Minkowski tensor evidently transforms as a rank-2 tensor, but it most certainly isn't null, regardless of what coordinate basis you choose.

Enough said.
studiot Posted December 23, 2020

On 12/22/2020 at 7:13 AM, Anamitra Palit said:
Difficulties with the theory of tensors: the transformation of a rank two covariant tensor has been considered. Then we proceed to consider a diagonal tensor [off diagonal components are zero: A^(mu nu)=0 for mu not equal to nu] to bring out a result that all tensors should be null tensors. A link to the google drive file has been provided. A file has also been attached considering the file attachment facility that has been provided by the forum.
https://drive.google.com/file/d/10z63Xidgs3m8p04_C6ZiGh-8Q6KTwPsh/view?usp=sharing
Incidentally I tried the Latex with the code button. But I am not getting the correct preview. Example
\begin{equation}\bar{A}^{\mu\nu}=\frac{\partial \bar{x}^{\mu}}{\partial x^{\alpha}}\frac{\partial \bar{x}^{\nu}}{\partial x^{\beta}}A^{\alpha \beta}\end{equation}
\[\bar{A}^{\mu\nu}=\frac{\partial \bar{x}^{\mu}}{\partial x^{\alpha}}\frac{\partial \bar{x}^{\nu}}{\partial x^{\beta}}\]
Currently for a long time I am not a frequent user of Latex thanks to the equation bar of MS Word. But I do appreciate, like many others, the application of Latex in various forums. Help is being requested from the forum regarding Latex.

You also started another thread to ask this question about LaTeX. I explained how to post tensors at SF in that thread for you, but I don't seem to have an answer.