Freeman Posted August 17, 2005 OK, after reading books on linear algebra and vector calculus, I have come to tensor transformations. Huzzah! However, I do not have any book to study from(!) so I am rather "sol". I do understand that covariant tensors have subscript lettering and contravariant tensors have superscript lettering. How do I transform a given vector under covariant and contravariant tensor transformations? I googled it and there were a number of terrible websites. Any help would be great!
timo Posted August 18, 2005 The question of how to transform a tensor of course depends on how you want to transform. Assume you have the following transformation for a covariant vector: [math] \bar v_i = T_i{}^j v_j [/math]. For this matrix T the inverse [math] T^i{}_k : T^i{}_k \, T_i{}^j = \delta^j{}_k [/math] exists. Since scalars such as the product of a covariant and a contravariant vector are invariant under coordinate transformations, one can construct the transformation rule for a contravariant vector out of the transformation for a covariant one and the demand for invariance: [math] c = \bar v_j \bar v^j = \bar v_j \delta^j{}_k \bar v^k = \underbrace{\bar v_j T_i{}^j}_{:= v_i} \underbrace{T^i{}_k \bar v^k}_{\rightarrow v^i} = v_i v^i = c [/math]. Now that you know the transformations for covariant and contravariant vectors, the same method can be used to determine the transformation of any tensor. The key point is realizing that a tensor with n covariant and m contravariant indices maps n contravariant and m covariant vectors to a scalar. The example of a rank-1 contravariant tensor is just the one above. For higher ranks, you simply include more deltas. In the end you come up with the result that each covariant index gets contracted with the same matrix you'd use for a covariant vector, and each contravariant index with the one you'd use for a contravariant vector. EDIT: Nah, that's incoherently written, at least. I have to go to work now so I can't fix it. The key point still is that the tensor transformations follow from the vector transformation and the demand for invariance of a scalar.
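To see the invariance argument concretely, here is a minimal numpy sketch (the matrix and vectors are arbitrary random placeholders, not anything specific to the post above): if the covariant components transform with some invertible matrix T, the contravariant components must transform with the inverse transpose of T, and the contraction of the two comes out the same before and after.

```python
import numpy as np

rng = np.random.default_rng(0)

T = rng.normal(size=(3, 3))     # a generic (invertible with probability 1) transformation matrix
w = rng.normal(size=3)          # components w_i of a covariant vector
u = rng.normal(size=3)          # components u^i of a contravariant vector

w_bar = T @ w                       # covariant components transform with T
u_bar = np.linalg.inv(T).T @ u      # contravariant components transform with the inverse transpose

# The scalar u^i w_i is unchanged by the transformation.
print(np.dot(u, w), np.dot(u_bar, w_bar))
assert np.isclose(np.dot(u, w), np.dot(u_bar, w_bar))
```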
MetaFrizzics Posted August 18, 2005 Quick Intro to Tensor Analysis:

An alternative method for dealing with the problems of measurement does exist, one that does not require transforming to a Cartesian system but receives its information directly from the metric components [math]g_{ij}[/math]; it does, however, require the development of a different, less pictorial formalism. Such a formalism is provided by a field of mathematics in which a dazzlingly elegant notation is associated with an approach where basis vectors are hardly, if ever, mentioned. It is called tensor analysis.

Differences between Vectors and Tensors

One should not make the mistake of thinking that tensor analysis is merely an extension of vector analysis because, even though a closely related mathematical universe is addressed, the fundamental approaches of the two formalisms are radically different. Vector analysis concentrates on properties of geometrical quantities that can be specified without regard to coordinate systems or components. By contrast, the ethos of tensor analysis is to talk primarily about components: the "underlying geometrical object itself" tends to move into the background. The fact that a relation among tensor quantities is independent of space distortions is then customarily shown, not by avoiding the use of components, but by demonstrating that the corresponding relation among components maintains its form under general coordinate transformations. As a result, tensor analysis, while still profoundly geometrical, is not particularly pictorial at all.

Other Objects that Show Up

In this new formalism, the role of arrows, sheaves, etc. is taken by a large genus of quantities called tensors. They include, for example, not merely covariant vector capacities (i.e. like thumbtacks) but also covariant vector densities (for which no simple picture exists). Also, whereas vector components are numbered by a single index which ranges over the dimensions of the space, a tensor component may carry any number of indices having that same range. Each index can be covariant (written as a subscript) or contravariant (written as a superscript). One should also mention that there are a number of players on the stage of tensor analysis which are not, in fact, tensors, although they carry similar subscripts and superscripts.

Purpose of Tensors

Tensor analysis has many applications. One is the classical differential geometry of curves and surfaces in ordinary space, as well as the more important generalizations to spaces of higher dimensions, e.g. Riemannian geometry. Another application is mathematical physics. Here tensor analysis lets us easily express, in terms of curvilinear coordinates, the fundamental equations of subjects such as hydrodynamics, elasticity, electricity, and magnetism. It can also help formulate natural laws, since laws expressed this way are independent of any particular coordinate system. This independence of the coordinate system allows us to choose whatever system is convenient or appropriate for describing or solving a given problem.
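As a small, concrete illustration of those metric components (a Python/sympy sketch assuming plane polar coordinates, which are not mentioned in the post above): the g_ij can be read off directly from the coordinate map, without ever drawing a basis vector.

```python
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)

# Cartesian coordinates expressed in terms of the curvilinear (polar) coordinates
x = r * sp.cos(theta)
y = r * sp.sin(theta)

q = [r, theta]
J = sp.Matrix([[sp.diff(x, qi) for qi in q],
               [sp.diff(y, qi) for qi in q]])   # Jacobian of the map (r, theta) -> (x, y)

g = sp.simplify(J.T * J)   # g_ij = sum over k of (dx^k/dq^i)(dx^k/dq^j)
print(g)                   # Matrix([[1, 0], [0, r**2]])
```

The result diag(1, r²) is just the familiar polar line element ds² = dr² + r² dθ² in component form.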
MetaFrizzics Posted August 18, 2005 By the way, here is the perfect free online textbook: Quick Introduction to Tensor Analysis. But I would get a copy of Geometrical Vectors by Gabriel Weinreich as well, for a good grounding in vectors too.
Freeman Posted August 18, 2005 Thanks... I have a few questions. Is there any real difference between covariant and contravariant tensors other than that they use different scripts? I know this one is stupid, but can you graph out a rank-2 (or any higher-ranked) tensor? How? Are tensors just extensions of the Jacobian?
MetaFrizzics Posted August 18, 2005 The Jacobian lets you convert from Cartesian to curvilinear coordinates. Then, in the new system of coordinates, you apply tensor analysis. One other thing you might be missing here is a good grounding in partial differential equations and in going from Cartesian coordinates to curvilinear coordinates. A good book on partial differential equations is Partial Differential Equations: Basic Theory by Taylor (Springer, 1996).
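For instance, here is a minimal numpy sketch of that conversion, assuming the standard polar-to-Cartesian map x = r cos θ, y = r sin θ (the numerical values are arbitrary): the Jacobian evaluated at a point carries the polar components of a small displacement over to Cartesian components.

```python
import numpy as np

def jacobian_polar_to_cartesian(r, theta):
    """d(x, y)/d(r, theta) for x = r*cos(theta), y = r*sin(theta)."""
    return np.array([[np.cos(theta), -r * np.sin(theta)],
                     [np.sin(theta),  r * np.cos(theta)]])

r, theta = 2.0, np.pi / 3
J = jacobian_polar_to_cartesian(r, theta)

v_polar = np.array([1.0, 0.5])   # contravariant components (dr, dtheta) of a small displacement
v_cart = J @ v_polar             # the same displacement in Cartesian components (dx, dy)
print(v_cart)
```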
timo Posted August 18, 2005

Thanks...I have a few questions. Is there any real difference between covariant and contravariant tensors other than that they use different scripts?

You have a vector space V and a reciprocal vector space D (also called the dual vector space). Covariant vectors are physical vectors. Contravariant vectors are elements of the dual space. But since the dual space of the dual space is the original space, and since one can obtain corresponding vectors by contracting with the metric tensor, one usually doesn't pay much attention to the distinction. So: in principle there is a huge difference. For practical purposes there's usually none. The same applies to tensors, of course: a contravariant index works on an element of V and a covariant index works on an element of D.

I know this one is stupid, but can you graph out a rank 2 (or any other higher ranked) tensor? How?

I know this sounds stupid, but what do you mean by "graph out"?

Are tensors just extensions of the Jacobian?

Actually, I have never thought of that before. By definition, a tensor of rank (a,b) is a multilinear map [math] V^a \times D^b \rightarrow \mathbb{R} [/math]. So there's not much connection to be seen in the first place. However, the two typical tensors I'd think of now - the energy-momentum tensor and the curvature tensor - are in fact related to derivatives. So one could find an analogy there. I would be careful about restricting yourself to "it's just an extension of the Jacobian", but it might be an interesting analogy from time to time. Of course, the meaning of "extension" is arbitrarily extendable.
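As a concrete reading of that definition, here is a short numpy sketch (the component values are arbitrary placeholders, not anything from the discussion above): a rank-(1,1) tensor, stored as a plain matrix of components, takes one covariant and one contravariant vector and returns a number, and contracting with a metric is what lets you trade one kind of index for the other.

```python
import numpy as np

rng = np.random.default_rng(1)

T = rng.normal(size=(3, 3))   # components T^i_j of a rank-(1,1) tensor
u = rng.normal(size=3)        # components u^j of a contravariant vector
w = rng.normal(size=3)        # components w_i of a covariant vector

# Feeding the tensor one covariant and one contravariant argument yields a scalar:
scalar = np.einsum('i,ij,j->', w, T, u)   # w_i T^i_j u^j
print(scalar)

# Contracting with a metric converts one kind of index into the other.
g = np.eye(3)                   # Euclidean metric: co- and contravariant components coincide
u_lower = g @ u                 # u_i = g_ij u^j
print(np.allclose(u_lower, u))  # True, which is why the distinction is often ignored in practice
```

With a non-trivial metric the raised and lowered components would differ, so the distinction only disappears in flat Cartesian coordinates.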
MetaFrizzics Posted August 18, 2005 If you're ready to look at special tensors, here are some more links:
Metric Tensor/Extensor
The Energy-Momentum Tensor
Energy-Momentum Tensor as Form
ElectroMagnetic Field Tensor
Freeman Posted August 19, 2005 Thanks for all the help everyone, it's great stuff.

I know this sounds stupid, but what do you mean by "graph out"?

Given a rank-2 tensor and, say, Cartesian coordinates, how would one go about putting the matrix on the coordinates?