Rotation matrix
A rotation of a 3-dimensional rigid body is a motion of the body that leaves one point, ''O'', fixed. By Euler's theorem it follows that not only the point is fixed, but also an axis (the ''rotation axis'') through the fixed point. Write <math>\hat{n}</math> for the unit vector along the rotation axis and φ for the angle over which the body is rotated; the rotation is then written as <math>\mathcal{R}(\varphi, \hat{n})</math>.
Erect three Cartesian coordinate axes with the origin in the fixed point ''O'' and take unit vectors <math>\hat{e}_1,\, \hat{e}_2,\, \hat{e}_3</math> along the axes; the rotation matrix <math>\mathbf{R}(\varphi, \hat{n})</math> is then defined by its elements <math>R_{ij}(\varphi, \hat{n})</math>:
:<math>
\mathcal{R}(\varphi, \hat{n})(\hat{e}_j) = \sum_{i=1}^3 \, \hat{e}_i\, R_{ij}(\varphi, \hat{n}), \qquad j=1,2,3.
</math>
In a more condensed notation this equation is written as
:<math>
\mathcal{R}(\varphi, \hat{n})\left(\hat{e}_1,\, \hat{e}_2,\, \hat{e}_3\right) = \left(\hat{e}_1,\, \hat{e}_2,\, \hat{e}_3\right)\, \mathbf{R}(\varphi, \hat{n}).
</math>
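As a numerical illustration (a minimal NumPy sketch, not part of the original text; the ''z''-axis, the angle 0.3, and the helper name <code>R_map</code> are arbitrary example choices), the matrix of a rotation map is assembled column by column from the images of the basis vectors:
<pre>
import numpy as np

# Illustrative example: the map that rotates vectors by phi about the z-axis.
phi = 0.3
def R_map(v):
    x, y, z = v
    return np.array([x*np.cos(phi) - y*np.sin(phi),
                     x*np.sin(phi) + y*np.cos(phi),
                     z])

# Column j of the matrix R holds the components of the image of the basis vector e_j.
R = np.column_stack([R_map(e) for e in np.eye(3)])
print(np.round(R, 6))
# [[ 0.955336 -0.29552   0.      ]
#  [ 0.29552   0.955336  0.      ]
#  [ 0.        0.        1.      ]]

# The condensed relation in matrix form: R_map(v) equals R @ v for every v.
v = np.array([1.0, 2.0, 3.0])
print(np.allclose(R_map(v), R @ v))   # True
</pre>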
Given a basis of a linear space, the association between a linear map and its matrix is one-to-one.
==Properties of the matrix==
Since rotation conserves the shape of a rigid body, it leaves angles and distances invariant. In other words, for any pair of vectors <math>\vec{a}</math> and <math>\vec{b}</math> in <math>\mathbb{R}^3</math> the inner product is invariant,
:<math>
\mathcal{R}(\varphi, \hat{n})(\vec{a}\,)\cdot \mathcal{R}(\varphi, \hat{n})(\vec{b}\,) = \vec{a}\cdot\vec{b} .
</math>
A linear map with this property is called ''orthogonal''. It is easily shown that a similar vector/matrix relation holds. First we define the component columns
:<math>
\mathbf{a} \equiv \begin{pmatrix} a_1 \\ a_2 \\ a_3 \end{pmatrix}, \quad
\mathbf{b} \equiv \begin{pmatrix} b_1 \\ b_2 \\ b_3 \end{pmatrix}
\qquad\hbox{with}\qquad
\vec{a} = \sum_{i=1}^3 a_i\, \hat{e}_i, \quad \vec{b} = \sum_{i=1}^3 b_i\, \hat{e}_i,
</math>
and observe that, by virtue of the orthonormality of the basis vectors, the inner product becomes
:<math>
\vec{a}\cdot\vec{b} = \sum_{i,j=1}^3 a_i b_j\, (\hat{e}_i\cdot\hat{e}_j) = \sum_{i=1}^3 a_i b_i = \mathbf{a}^\mathrm{T}\, \mathbf{b} .
</math>
The invariance of the inner product under <math>\mathcal{R}(\varphi, \hat{n})</math> leads to
:<math>
\left(\mathbf{R}\,\mathbf{a}\right)^\mathrm{T}\left(\mathbf{R}\,\mathbf{b}\right) = \mathbf{a}^\mathrm{T}\,\mathbf{R}^\mathrm{T}\,\mathbf{R}\,\mathbf{b} = \mathbf{a}^\mathrm{T}\,\mathbf{b} ;
</math>
since this holds for any pair '''a''' and '''b''', it follows that a rotation matrix satisfies
:<math>
\mathbf{R}^\mathrm{T}\,\mathbf{R} = \mathbf{E},
</math>
where '''E''' is the 3×3 identity matrix. For finite-dimensional matrices one shows easily
:<math>
\mathbf{R}^\mathrm{T} \mathbf{R} = \mathbf{E} \quad \Longleftrightarrow\quad\mathbf{R}\mathbf{R}^\mathrm{T} = \mathbf{E}.
</math>
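The following minimal sketch (an illustration only; the matrix is an arbitrary rotation about the ''z''-axis and the test vectors are arbitrary) checks numerically that such a matrix leaves the inner product invariant and satisfies '''R'''<sup>T</sup>'''R''' = '''E''':
<pre>
import numpy as np

phi = 0.3
# Example rotation about the z-axis.
R = np.array([[np.cos(phi), -np.sin(phi), 0.0],
              [np.sin(phi),  np.cos(phi), 0.0],
              [0.0,          0.0,         1.0]])

a = np.array([1.0, -2.0, 0.5])
b = np.array([0.3,  4.0, 2.0])
print(np.isclose((R @ a) @ (R @ b), a @ b))   # True: inner product is invariant
print(np.allclose(R.T @ R, np.eye(3)))        # True: R^T R = E
</pre>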
A matrix with this property ('''R'''<sup>T</sup>'''R''' = '''E''') is also called ''orthogonal''. Writing out the two matrix products it follows that both the rows and the columns of the matrix are orthonormal (normalized and orthogonal). Indeed,
:<math>
\begin{align}
\mathbf{R}^\mathrm{T} \mathbf{R} &= \mathbf{E} \quad\Longrightarrow\quad
\sum_{k=1}^{3} R_{ki}\, R_{kj} =\delta_{ij} \quad\hbox{(columns)} \\
\mathbf{R} \mathbf{R}^\mathrm{T} &= \mathbf{E} \quad\Longrightarrow\quad
\sum_{k=1}^{3} R_{ik}\, R_{jk} =\delta_{ij} \quad\hbox{(rows)}
\end{align}
</math>
where δ<sub>ij</sub> is the [[Kronecker delta]].
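The entry-wise sums can be verified with a short sketch (again for the example ''z''-axis rotation, which stands in for any rotation matrix):
<pre>
import numpy as np

phi = 0.3
R = np.array([[np.cos(phi), -np.sin(phi), 0.0],
              [np.sin(phi),  np.cos(phi), 0.0],
              [0.0,          0.0,         1.0]])

# sum_k R_ki R_kj = delta_ij  (columns orthonormal)
print(np.allclose(np.einsum('ki,kj->ij', R, R), np.eye(3)))  # True
# sum_k R_ik R_jk = delta_ij  (rows orthonormal)
print(np.allclose(np.einsum('ik,jk->ij', R, R), np.eye(3)))  # True
</pre>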
Orthogonal matrices come in two flavors: proper (det = 1) and improper (det = −1) rotations. Invoking some properties of determinants, one can prove
:<math>
1=\det(\mathbf{E})=\det(\mathbf{R}^\mathrm{T}\mathbf{R}) = \det(\mathbf{R}^\mathrm{T})\det(\mathbf{R})
= \det(\mathbf{R})^2 \quad\Longrightarrow \quad \det(\mathbf{R}) = \pm 1.
</math>
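A determinant check, with an improper example obtained by appending the reflection ''z'' → −''z'' (an illustrative construction, not taken from the text):
<pre>
import numpy as np

phi = 0.3
R_proper = np.array([[np.cos(phi), -np.sin(phi), 0.0],
                     [np.sin(phi),  np.cos(phi), 0.0],
                     [0.0,          0.0,         1.0]])
# Improper orthogonal matrix: the proper rotation followed by the reflection z -> -z.
R_improper = np.diag([1.0, 1.0, -1.0]) @ R_proper

print(round(np.linalg.det(R_proper), 6))    #  1.0  (proper)
print(round(np.linalg.det(R_improper), 6))  # -1.0  (improper)
print(np.allclose(R_improper.T @ R_improper, np.eye(3)))  # True: still orthogonal
</pre>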
===Compact notation===
A compact way of presenting the same results is the following. Designate the columns of '''R''' by '''r'''<sub>1</sub>, '''r'''<sub>2</sub>, '''r'''<sub>3</sub>, i.e.,
:<math>
\mathbf{R} = \left(\mathbf{r}_1,\, \mathbf{r}_2,\, \mathbf{r}_3 \right) .
</math>
The matrix '''R''' is ''orthogonal'' if
:<math>
\mathbf{r}_i \cdot \mathbf{r}_j = \delta_{ij}, \quad i,j = 1,2,3 .
</math>
The matrix '''R''' is a ''proper rotation matrix'' if it is orthogonal ''and'' if '''r'''<sub>1</sub>, '''r'''<sub>2</sub>, '''r'''<sub>3</sub> form a right-handed set, i.e.,
:<math>
\mathbf{r}_i \times \mathbf{r}_j = \sum_{k=1}^3 \, \varepsilon_{ijk}\, \mathbf{r}_k .
</math>
Here the symbol × indicates a [[cross product]] and <math>\varepsilon_{ijk}</math> is the antisymmetric [[Levi-Civita symbol]],
:<math>
\begin{align}
\varepsilon_{123} &= \varepsilon_{312} = \varepsilon_{231} = 1 \\
\varepsilon_{213} &= \varepsilon_{321} = \varepsilon_{132} = -1
\end{align}
</math>
and <math>\varepsilon_{ijk} = 0</math> if two or more indices are equal.
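The right-handedness condition can be checked directly with the Levi-Civita symbol (a sketch, again using the example ''z''-axis rotation):
<pre>
import numpy as np

phi = 0.3
R = np.array([[np.cos(phi), -np.sin(phi), 0.0],
              [np.sin(phi),  np.cos(phi), 0.0],
              [0.0,          0.0,         1.0]])

# Levi-Civita symbol eps[i, j, k]
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0    # even permutations
    eps[i, k, j] = -1.0   # odd permutations

r = [R[:, 0], R[:, 1], R[:, 2]]   # columns r_1, r_2, r_3
ok = all(
    np.allclose(np.cross(r[i], r[j]), sum(eps[i, j, k] * r[k] for k in range(3)))
    for i in range(3) for j in range(3)
)
print(ok)   # True: the columns form a right-handed orthonormal set
</pre>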
The matrix '''R''' is an ''improper rotation matrix'' if its column vectors form a left-handed set, i.e.,
:<math>
\mathbf{r}_i \times \mathbf{r}_j = - \sum_{k=1}^3 \, \varepsilon_{ijk}\, \mathbf{r}_k \; .
</math>
The last two equations can be condensed into one equation,
:<math>
\mathbf{r}_i \times \mathbf{r}_j = \det(\mathbf{R}) \sum_{k=1}^3 \;
\varepsilon_{ijk}\, \mathbf{r}_k ,
</math>
by virtue of the fact that the determinant of a proper rotation matrix is 1 and of an improper rotation −1. This was proved above; an alternative proof is the following. The determinant of a 3×3 matrix with column vectors '''a''', '''b''', and '''c''' can be written as a [[scalar triple product#Triple product as determinant|scalar triple product]],
:<math>
\det\left(\mathbf{a},\,\mathbf{b},\, \mathbf{c}\right) =
\mathbf{a} \cdot (\mathbf{b}\times\mathbf{c}) .
</math>
It was just shown that for a proper rotation the columns of '''R''' are orthonormal and satisfy
:<math>
\mathbf{r}_1 \cdot (\mathbf{r}_2 \times \mathbf{r}_3 ) = \mathbf{r}_1 \cdot\left(\sum_{k=1}^3 \,
\varepsilon_{23k} \, \mathbf{r}_k \right) = \varepsilon_{231} = 1 ,
</math>
so that indeed det('''R''') = 1.
Likewise the determinant is −1 for an improper rotation.
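The triple-product argument can be checked numerically as well (sketch; the improper example simply flips the third column of the example rotation):
<pre>
import numpy as np

phi = 0.3
R = np.array([[np.cos(phi), -np.sin(phi), 0.0],
              [np.sin(phi),  np.cos(phi), 0.0],
              [0.0,          0.0,         1.0]])
r1, r2, r3 = R[:, 0], R[:, 1], R[:, 2]

triple = np.dot(r1, np.cross(r2, r3))          # scalar triple product of the columns
print(np.isclose(triple, np.linalg.det(R)))    # True: triple product equals det(R)
print(round(triple, 6))                        # 1.0 for this proper rotation

# Flipping one column gives a left-handed (improper) set with triple product -1.
R_imp = R.copy()
R_imp[:, 2] *= -1.0
print(round(np.dot(R_imp[:, 0], np.cross(R_imp[:, 1], R_imp[:, 2])), 6))   # -1.0
</pre>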
==Explicit expression==
Let <math>\overrightarrow{OP} \equiv \vec{r}</math> be a vector pointing from the fixed point ''O'' of a rotating rigid body to an arbitrary point ''P'' of the body. A rotation of this arbitrary vector around the unit vector <math>\hat{n}</math> over an angle φ can be written as
:<math>
\mathcal{R}(\varphi, \hat{n})(\vec{r}\,) = (\hat{n}\cdot\vec{r}\,)\; \hat{n} + \left[ \vec{r} -(\hat{n}\cdot\vec{r}\,)\; \hat{n}\right] \cos\varphi
+ (\hat{n} \times \vec{r}\,) \sin\varphi,
</math>
where • indicates an [[inner product]] and the symbol × a [[cross product]]. The first term is the component of <math>\vec{r}</math> along the rotation axis; it is unchanged by the rotation, while the remaining two terms rotate the component perpendicular to the axis.
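A minimal implementation of this formula (the helper name <code>rotate</code>, the axis, and the angle are illustrative choices, not from the text) shows that it preserves lengths and that the matrix assembled from the images of the basis vectors is a proper rotation:
<pre>
import numpy as np

def rotate(r, n, phi):
    """Rotate the vector r about the unit axis n over the angle phi."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)             # ensure the axis is a unit vector
    par = np.dot(n, r) * n                # (n . r) n : component along the axis
    return par + (r - par) * np.cos(phi) + np.cross(n, r) * np.sin(phi)

# Illustrative axis and angle.
n, phi = [1.0, 1.0, 0.0], 2 * np.pi / 3
v = np.array([0.3, -1.2, 2.5])

w = rotate(v, n, phi)
print(np.isclose(np.linalg.norm(w), np.linalg.norm(v)))   # True: length preserved

# Matrix built from the images of the basis vectors: orthogonal with determinant 1.
R = np.column_stack([rotate(e, n, phi) for e in np.eye(3)])
print(np.allclose(R.T @ R, np.eye(3)))        # True
print(round(np.linalg.det(R), 6))             # 1.0
</pre>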