Tensor Structure

From GM-RKB

A Tensor Structure is a mathematical structure that describes a (multi)linear relationship between geometric vectors, scalars, and other tensors, and that, relative to a chosen basis, can be represented as a multidimensional array of components obeying a covariant/contravariant transformation law under a change of basis.



References

2018

  • (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/tensor Retrieved:2018-1-22.
    • In mathematics, tensors are geometric objects that describe linear relations between geometric vectors, scalars, and other tensors. Elementary examples of such relations include the dot product, the cross product, and linear maps. Geometric vectors, often used in physics and engineering applications, and scalars themselves are also tensors. A more sophisticated example is the Cauchy stress tensor T, which takes a direction v as input and produces the stress T(v) on the surface normal to this vector as output, thus expressing a relationship between these two vectors.

      Given a reference basis of vectors, a tensor can be represented as an organized multidimensional array of numerical values. The order (also degree or rank) of a tensor is the dimensionality of the array needed to represent it, or equivalently, the number of indices needed to label a component of that array. For example, a linear map is represented by a matrix (a 2-dimensional array) in a basis, and therefore is a 2nd-order tensor. A vector is represented as a 1-dimensional array in a basis, and is a 1st-order tensor. Scalars are single numbers and are thus 0th-order tensors. The collection of tensors on a vector space forms a tensor algebra.

      Because they express a relationship between vectors, tensors themselves must be independent of a particular choice of basis. The basis independence of a tensor then takes the form of a covariant and/or contravariant transformation law that relates the array computed in one basis to that computed in another one. The precise form of the transformation law determines the type (or valence) of the tensor. The tensor type is a pair of natural numbers (n, m), where n is the number of contravariant indices and m is the number of covariant indices. The total order of a tensor is the sum of these two numbers.

      Tensors are important in physics because they provide a concise mathematical framework for formulating and solving physics problems in areas such as stress, elasticity, fluid mechanics, and general relativity. In applications, it is common to study situations in which a different tensor can occur at each point of an object; for example, the stress within an object may vary from one location to another. This leads to the concept of a tensor field. In some areas, tensor fields are so ubiquitous that they are simply called "tensors".

      Tensors were conceived in 1900 by Tullio Levi-Civita and Gregorio Ricci-Curbastro, who continued the earlier work of Bernhard Riemann, Elwin Bruno Christoffel, and others as part of the absolute differential calculus. The concept enabled an alternative formulation of the intrinsic differential geometry of a manifold in the form of the Riemann curvature tensor.
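      To make the order/array correspondence above concrete, here is a minimal NumPy sketch (an editorial illustration, not part of the quoted article; NumPy and the variable names are assumptions of this sketch):

```python
import numpy as np

# A scalar is a 0th-order tensor: a 0-dimensional array, no indices needed.
scalar = np.array(3.0)

# A vector is a 1st-order tensor: a 1-dimensional array, one index.
vector = np.array([1.0, 2.0, 3.0])

# A linear map in a fixed basis is a matrix, i.e. a 2nd-order tensor: two indices.
linear_map = np.eye(3)

# The order (degree, rank) equals the dimensionality of the representing array.
print(scalar.ndim, vector.ndim, linear_map.ndim)  # 0 1 2
```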


2015

  • (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/tensor#As_multidimensional_arrays Retrieved:2015-1-17.
    • Just as a vector with respect to a given basis is represented by an array of one dimension, any tensor with respect to a basis is represented by a multidimensional array. The numbers in the array are known as the scalar components of the tensor or simply its components. They are denoted by indices giving their position in the array, as subscripts and superscripts, after the symbolic name of the tensor. In most cases, the indices of a tensor are either covariant or contravariant, designated by subscript or superscript, respectively. The total number of indices required to uniquely select each component is equal to the dimension of the array, and is called the order, degree or rank of the tensor. For example, the entries of an order-2 tensor T would be denoted [math]\displaystyle{ T_{ij} }[/math], [math]\displaystyle{ T_i^{\,j} }[/math], [math]\displaystyle{ T^i_{\,j} }[/math], or [math]\displaystyle{ T^{ij} }[/math], where i and j are indices running from 1 to the dimension of the related vector space. When the basis and its dual coincide (i.e. for an orthonormal basis), the distinction between contravariant and covariant indices may be ignored; in these cases [math]\displaystyle{ T_{ij} }[/math] or [math]\displaystyle{ T^{ij} }[/math] could be used interchangeably.
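      As a small illustration of the indexing convention just described (again a hedged editorial sketch in NumPy, not part of the quoted text), note that a plain array records only the positions of the indices, not whether they are covariant or contravariant; that distinction is bookkeeping carried by the transformation law discussed next:

```python
import numpy as np

# Components of an order-2 tensor on a 3-dimensional vector space.
T = np.arange(9.0).reshape(3, 3)

# Selecting one scalar component requires exactly two indices,
# matching the order (degree, rank) of the tensor.
# (NumPy indices run from 0 to dim-1 rather than 1 to dim.)
i, j = 0, 2
component = T[i, j]

print(T.ndim)     # 2 -> order of the tensor
print(component)  # 2.0 -> the component in position (i, j)
```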

      Just as the components of a vector change when we change the basis of the vector space, the entries of a tensor also change under such a transformation. Each tensor comes equipped with a transformation law that details how the components of the tensor respond to a change of basis. The components of a vector can respond in two distinct ways to a change of basis (see covariance and contravariance of vectors), where the new basis vectors [math]\displaystyle{ \mathbf{\hat{e}}_i }[/math] are expressed in terms of the old basis vectors [math]\displaystyle{ \mathbf{e}_j }[/math] as,

       :[math]\displaystyle{ \mathbf{\hat{e}}_i = \sum_j R^j_i \mathbf{e}_j = R^j_i \mathbf{e}_j, }[/math]

      where [math]\displaystyle{ R^j_i }[/math] is a matrix and in the second expression the summation sign was suppressed (a notational convenience introduced by Einstein that will be used throughout this article). The components, [math]\displaystyle{ v^i }[/math], of a regular (or column) vector v transform with the inverse of the matrix R,

       :[math]\displaystyle{ \hat{v}^i = (R^{-1})^i_j v^j, }[/math]

      where the hat denotes the components in the new basis, while the components, [math]\displaystyle{ w_i }[/math], of a covector (or row vector) w transform with the matrix R itself,

       :[math]\displaystyle{ \hat{w}_i = R_i^j w_j. }[/math]
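      The two rules above can be checked numerically. The following sketch (an editorial NumPy illustration; the particular matrix R, the vectors, and the storage convention R[j, i] = R^j_i are assumptions of the sketch, not the article's) transforms a column vector with the inverse of R and a covector with R itself, and verifies that the pairing w_j v^j is basis independent:

```python
import numpy as np

# Change-of-basis matrix, stored so that R[j, i] holds R^j_i
# (new basis vectors: e_hat_i = R^j_i e_j).  Chosen to be invertible.
R = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
R_inv = np.linalg.inv(R)

v = np.array([1.0, 2.0, 3.0])    # contravariant components v^j (column vector)
w = np.array([0.5, -1.0, 2.0])   # covariant components w_j (covector / row vector)

# Contravariant components transform with the inverse of R: v_hat^i = (R^-1)^i_j v^j.
v_hat = R_inv @ v
# Covariant components transform with R itself: w_hat_i = R^j_i w_j.
w_hat = w @ R

# The scalar w_j v^j is unchanged by the change of basis.
print(np.allclose(w @ v, w_hat @ v_hat))  # True
```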

      The components of a tensor transform in a similar manner with a transformation matrix for each index. If an index transforms like a vector with the inverse of the basis transformation, it is called contravariant and is traditionally denoted with an upper index, while an index that transforms with the basis transformation itself is called covariant and is denoted with a lower index. The transformation law for an order-m tensor with n contravariant indices and m − n covariant indices is thus given as,

       :[math]\displaystyle{ \hat{T}^{i_1,\ldots,i_n}_{i_{n+1},\ldots,i_m}= (R^{-1})^{i_1}_{j_1}\cdots(R^{-1})^{i_n}_{j_n} R^{j_{n+1}}_{i_{n+1}}\cdots R^{j_{m}}_{i_{m}}T^{j_1,\ldots,j_n}_{j_{n+1},\ldots,j_m}. }[/math]

      Such a tensor is said to be of order m, or of type (n, m − n).
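      Specialised to a type (1, 1) tensor (one contravariant and one covariant index, e.g. a linear map), the law above contributes one factor of R^{-1} and one factor of R. A hedged NumPy sketch of this special case, reusing the storage convention R[j, i] = R^j_i from the previous example:

```python
import numpy as np

R = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])       # change-of-basis matrix, R[j, i] = R^j_i
R_inv = np.linalg.inv(R)

T = np.diag([1.0, 2.0, 3.0])          # components T^k_l of a type (1, 1) tensor

# General law with one upper and one lower index:
#   T_hat^i_j = (R^-1)^i_k  R^l_j  T^k_l
T_hat = np.einsum('ik,lj,kl->ij', R_inv, R, T)

# For a type (1, 1) tensor this is the familiar similarity transform of a matrix.
print(np.allclose(T_hat, R_inv @ T @ R))  # True
```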

      This discussion motivates the following formal definition: a tensor of type (n, m − n) is an assignment of a multidimensional array [math]\displaystyle{ T^{i_1,\ldots,i_n}_{i_{n+1},\ldots,i_m}[\mathbf{f}] }[/math] to each basis [math]\displaystyle{ \mathbf{f} = (\mathbf{e}_1,\ldots,\mathbf{e}_N) }[/math] of the vector space such that, if we apply the change of basis [math]\displaystyle{ \mathbf{f} \mapsto \mathbf{f}\cdot R }[/math], then the multidimensional array obeys the transformation law

       :[math]\displaystyle{ T^{i_1,\ldots,i_n}_{i_{n+1},\ldots,i_m}[\mathbf{f}\cdot R] = (R^{-1})^{i_1}_{j_1}\cdots(R^{-1})^{i_n}_{j_n} R^{j_{n+1}}_{i_{n+1}}\cdots R^{j_{m}}_{i_{m}}T^{j_1,\ldots,j_n}_{j_{n+1},\ldots,j_m}[\mathbf{f}]. }[/math]

      The definition of a tensor as a multidimensional array satisfying a transformation law traces back to the work of Ricci. Nowadays, this definition is still used in some physics and engineering textbooks.
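      A more general editorial sketch under the same assumptions (NumPy; R stored as R[j, i] = R^j_i; a hypothetical helper named change_basis): the transformation law of the formal definition applied to an array whose first n axes are contravariant and whose remaining axes are covariant:

```python
import numpy as np

def change_basis(T, R, n_contra):
    """Apply the tensor transformation law to the component array T.

    T        : components T^{i_1..i_n}_{i_{n+1}..i_m}[f]; the first n_contra
               axes are contravariant, the remaining axes covariant.
    R        : change-of-basis matrix with R[j, i] = R^j_i.
    n_contra : number n of contravariant (upper) indices.
    Returns the components of the same tensor in the basis f.R.
    """
    R_inv = np.linalg.inv(R)
    out = T
    for axis in range(T.ndim):
        if axis < n_contra:
            # Contravariant index: contract with (R^-1)^i_j over the old index j.
            out = np.tensordot(R_inv, out, axes=([1], [axis]))
        else:
            # Covariant index: contract with R^j_i over the old index j.
            out = np.tensordot(R.T, out, axes=([1], [axis]))
        # tensordot puts the new index first; move it back into place.
        out = np.moveaxis(out, 0, axis)
    return out

# Example: for an order-2, type (1, 1) tensor this reproduces R^-1 T R.
R = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
T = np.arange(9.0).reshape(3, 3)
print(np.allclose(change_basis(T, R, n_contra=1), np.linalg.inv(R) @ T @ R))  # True
```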



