Derivative Operations on Vectors and on Vector Dot and Cross Products
2022-07-30 07:50:00 【sunset stained ramp】
Purpose: I have recently been writing optimization code in which the variables of a function need to be differentiated to obtain their Jacobian matrices. This requires taking derivatives of vectors and matrices.
A vector can be written as $Y=[y_1,y_2,\ldots,y_m]^T$.
Basic vector derivatives fall into the following cases:
1) Derivative of a vector $Y=[y_1,y_2,\ldots,y_m]^T$ with respect to a scalar $x$:
$$\cfrac{\partial Y}{\partial x}=\begin{bmatrix} \cfrac{\partial y_1}{\partial x} \\ \cfrac{\partial y_2}{\partial x} \\ \vdots \\ \cfrac{\partial y_m}{\partial x} \end{bmatrix}$$
If $Y=[y_1,y_2,\ldots,y_m]$ is a row vector, the derivative is
$$\cfrac{\partial Y}{\partial x}=\begin{bmatrix} \cfrac{\partial y_1}{\partial x} & \cfrac{\partial y_2}{\partial x} & \cdots & \cfrac{\partial y_m}{\partial x} \end{bmatrix}$$
2) Derivative of a scalar $y$ with respect to a vector $X=[x_1,x_2,\ldots,x_m]^T$:
$$\cfrac{\partial y}{\partial X}=\begin{bmatrix} \cfrac{\partial y}{\partial x_1} \\ \cfrac{\partial y}{\partial x_2} \\ \vdots \\ \cfrac{\partial y}{\partial x_m} \end{bmatrix}$$
If $X=[x_1,x_2,\ldots,x_m]$ is a row vector:
$$\cfrac{\partial y}{\partial X}=\begin{bmatrix} \cfrac{\partial y}{\partial x_1} & \cfrac{\partial y}{\partial x_2} & \cdots & \cfrac{\partial y}{\partial x_m} \end{bmatrix}$$
3) Derivative of a vector $Y=[y_1,y_2,\ldots,y_m]^T$ with respect to a vector $X=[x_1,x_2,\ldots,x_n]$:
$$\cfrac{\partial Y}{\partial X}=\begin{bmatrix} \cfrac{\partial y_1}{\partial x_1} & \cfrac{\partial y_1}{\partial x_2} & \cdots & \cfrac{\partial y_1}{\partial x_n} \\ \cfrac{\partial y_2}{\partial x_1} & \cfrac{\partial y_2}{\partial x_2} & \cdots & \cfrac{\partial y_2}{\partial x_n} \\ \vdots & & & \vdots \\ \cfrac{\partial y_m}{\partial x_1} & \cfrac{\partial y_m}{\partial x_2} & \cdots & \cfrac{\partial y_m}{\partial x_n} \end{bmatrix}$$
The derivative of a vector with respect to a vector is known as the Jacobian matrix; it appears constantly in optimization.
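As a concrete illustration, here is a minimal sketch (assuming Python with NumPy, which the original post does not use) that approximates the Jacobian of definition 3) by central finite differences; the example function `f` is made up:

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Approximate dY/dX for Y = f(X) by central differences.

    Returns an (m x n) matrix whose (i, j) entry is dy_i/dx_j,
    matching definition 3) above.
    """
    x = np.asarray(x, dtype=float)
    y0 = np.asarray(f(x))
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (f(x + dx) - f(x - dx)) / (2 * eps)
    return J

# Example: Y = [x0*x1, sin(x2), x0 + x2]
f = lambda x: np.array([x[0] * x[1], np.sin(x[2]), x[0] + x[2]])
print(numerical_jacobian(f, [1.0, 2.0, 0.5]))
```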
The same definitions extend to matrices. When $Y$ is a matrix, it is written as:
$$Y=\begin{bmatrix} y_{11} & y_{12} & \cdots & y_{1n} \\ y_{21} & y_{22} & \cdots & y_{2n} \\ \vdots & & & \vdots \\ y_{m1} & y_{m2} & \cdots & y_{mn} \end{bmatrix}$$
When $X$ is a matrix:
$$X=\begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1n} \\ x_{21} & x_{22} & \cdots & x_{2n} \\ \vdots & & & \vdots \\ x_{m1} & x_{m2} & \cdots & x_{mn} \end{bmatrix}$$
Two kinds of matrix derivatives are used, as follows:
1) Derivative of a matrix $Y$ with respect to a scalar $x$:
$$\cfrac{\partial Y}{\partial x}=\begin{bmatrix} \cfrac{\partial y_{11}}{\partial x} & \cfrac{\partial y_{12}}{\partial x} & \cdots & \cfrac{\partial y_{1n}}{\partial x} \\ \cfrac{\partial y_{21}}{\partial x} & \cfrac{\partial y_{22}}{\partial x} & \cdots & \cfrac{\partial y_{2n}}{\partial x} \\ \vdots & & & \vdots \\ \cfrac{\partial y_{m1}}{\partial x} & \cfrac{\partial y_{m2}}{\partial x} & \cdots & \cfrac{\partial y_{mn}}{\partial x} \end{bmatrix}$$
2) Derivative of a scalar $y$ with respect to a matrix $X$:
$$\cfrac{\partial y}{\partial X}=\begin{bmatrix} \cfrac{\partial y}{\partial x_{11}} & \cfrac{\partial y}{\partial x_{12}} & \cdots & \cfrac{\partial y}{\partial x_{1n}} \\ \cfrac{\partial y}{\partial x_{21}} & \cfrac{\partial y}{\partial x_{22}} & \cdots & \cfrac{\partial y}{\partial x_{2n}} \\ \vdots & & & \vdots \\ \cfrac{\partial y}{\partial x_{m1}} & \cfrac{\partial y}{\partial x_{m2}} & \cdots & \cfrac{\partial y}{\partial x_{mn}} \end{bmatrix}$$
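A quick example of the matrix-by-scalar case 1), common in geometric code: differentiating a 2-D rotation matrix entry by entry with respect to its angle. This is a sketch assuming NumPy; `rot` and `drot` are illustrative names:

```python
import numpy as np

def rot(theta):
    """2-D rotation matrix R(theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

def drot(theta):
    """Analytic dR/dtheta: differentiate each entry, per definition 1)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[-s, -c],
                     [ c, -s]])

theta, eps = 0.7, 1e-6
numeric = (rot(theta + eps) - rot(theta - eps)) / (2 * eps)
print(np.allclose(numeric, drot(theta)))  # True
```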
These are the basic definitions of vector and matrix derivatives. From them, together with a few elementary rules, we obtain compound formulas that are very useful when programming geometric algorithms.
In practice, a formula usually involves several vectors that depend on one another, so when differentiating we would like a chain rule analogous to the scalar one.
Suppose the vectors depend on one another as $U \rightarrow V \rightarrow W$, i.e., $V$ is a function of $U$ and $W$ is a function of $V$.
Then the partial derivatives satisfy:
$$\cfrac{\partial W}{\partial U}=\cfrac{\partial W}{\partial V}\,\cfrac{\partial V}{\partial U}$$
Proof: it suffices to expand the derivative element by element:
$$\cfrac{\partial w_i}{\partial u_j} = \sum_{k}\cfrac{\partial w_i}{\partial v_k}\,\cfrac{\partial v_k}{\partial u_j} = \cfrac{\partial w_i}{\partial V}\,\cfrac{\partial V}{\partial u_j}$$
Hence $\cfrac{\partial w_i}{\partial u_j}$ equals the inner product of the $i$-th row of $\cfrac{\partial W}{\partial V}$ and the $j$-th column of $\cfrac{\partial V}{\partial U}$, which is exactly the definition of matrix multiplication.
It can easily be generalized to scenarios with multiple layers of intermediate variables.
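A quick numerical check of this chain rule, as a sketch assuming NumPy; the functions `V(U)` and `W(V)` are made up for illustration:

```python
import numpy as np

def jac(f, x, eps=1e-6):
    """Central-difference Jacobian of f at x."""
    x = np.asarray(x, dtype=float)
    y0 = np.asarray(f(x))
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):
        d = np.zeros_like(x); d[j] = eps
        J[:, j] = (f(x + d) - f(x - d)) / (2 * eps)
    return J

V = lambda u: np.array([u[0] * u[1], u[1] + u[2]])         # V(U)
W = lambda v: np.array([np.sin(v[0]), v[0] * v[1], v[1]])  # W(V)

u = np.array([0.3, 1.2, -0.5])
lhs = jac(lambda u_: W(V(u_)), u)   # dW/dU computed directly
rhs = jac(W, V(u)) @ jac(V, u)      # (dW/dV)(dV/dU)
print(np.allclose(lhs, rhs, atol=1e-5))  # True
```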
The case most often encountered in practice is that $f$ is a real-valued scalar and all intermediate variables are vectors, with the dependency chain:
$$X \rightarrow V \rightarrow U \rightarrow f$$
Chaining the Jacobian matrices gives:
$$\cfrac{\partial f}{\partial X} = \cfrac{\partial f}{\partial U}\,\cfrac{\partial U}{\partial V}\,\cfrac{\partial V}{\partial X}$$
Since $f$ is a scalar, this is written as:
$$\cfrac{\partial f}{\partial X^T} = \cfrac{\partial f}{\partial U^T}\,\cfrac{\partial U}{\partial V}\,\cfrac{\partial V}{\partial X}$$
For the dimensions to line up in computation, the derivatives of the scalar $f$ above must be taken with respect to the row vectors $U^T$ and $X^T$, so that $\partial f/\partial U^T$ and $\partial f/\partial X^T$ are row vectors. This point is very important.
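The same check for the scalar case, where the gradients are row vectors chained right to left as in the formula (a sketch under the same NumPy assumption; `V`, `U`, and `f` are made-up examples):

```python
import numpy as np

def jac(f, x, eps=1e-6):
    """Central-difference Jacobian; a scalar f yields a 1 x n row vector."""
    x = np.asarray(x, dtype=float)
    y0 = np.atleast_1d(f(x))
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):
        d = np.zeros_like(x); d[j] = eps
        J[:, j] = (np.atleast_1d(f(x + d)) - np.atleast_1d(f(x - d))) / (2 * eps)
    return J

V = lambda x: np.array([x[0] + x[1], x[1] * x[2]])   # X -> V
U = lambda v: np.array([v[0] * v[1], v[0] - v[1]])   # V -> U
f = lambda u: u[0] ** 2 + np.sin(u[1])               # U -> f (scalar)

x = np.array([0.5, -1.0, 2.0])
# (df/dU^T)(dU/dV)(dV/dX): a 1 x 3 row-vector gradient
grad_row = jac(f, U(V(x))) @ jac(U, V(x)) @ jac(V, x)
print(np.allclose(grad_row, jac(lambda x_: f(U(V(x_))), x), atol=1e-5))  # True
```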
The following are the most commonly used compound formulas for differentiating vector expressions. They fall into two categories.
1) Derivative of the dot product of two column vectors $U$ and $V$ with respect to $W$:
$$\cfrac{\partial (U^T V)}{\partial W} = \left(\cfrac{\partial U}{\partial W}\right)^T V + \left(\cfrac{\partial V}{\partial W}\right)^T U \qquad (4)$$
The proof of the dot-product rule is as follows.
Proof: let $U=\begin{bmatrix} u_0 \\ u_1 \\ u_2 \end{bmatrix}$ and $V=\begin{bmatrix} v_0 \\ v_1 \\ v_2 \end{bmatrix}$ be three-dimensional vectors. Their dot product is the scalar $f=U^T V = u_0v_0+u_1v_1+u_2v_2$. Differentiating with respect to $W$:
$$\begin{aligned} \cfrac{\partial f}{\partial W} &= \cfrac{\partial (u_0v_0+u_1v_1+u_2v_2)}{\partial W} \\ &= \cfrac{\partial u_0}{\partial W}v_0 + \cfrac{\partial v_0}{\partial W}u_0 + \cfrac{\partial u_1}{\partial W}v_1 + \cfrac{\partial v_1}{\partial W}u_1 + \cfrac{\partial u_2}{\partial W}v_2 + \cfrac{\partial v_2}{\partial W}u_2 \\ &= \left(\cfrac{\partial u_0}{\partial W}v_0 + \cfrac{\partial u_1}{\partial W}v_1 + \cfrac{\partial u_2}{\partial W}v_2\right) + \left(\cfrac{\partial v_0}{\partial W}u_0 + \cfrac{\partial v_1}{\partial W}u_1 + \cfrac{\partial v_2}{\partial W}u_2\right) \\ &= \left(\cfrac{\partial U}{\partial W}\right)^T V + \left(\cfrac{\partial V}{\partial W}\right)^T U \end{aligned}$$
The same argument works in any dimension. This completes the proof.
If $W$ is a scalar, formula (4) can be applied directly. If $W$ is a vector, however, it is generally converted to a row vector in computation, because the Jacobian matrix is defined as the derivative of a column vector with respect to a row vector. Writing $W$ as the row vector $W^T$, formula (4) becomes:
$$\cfrac{\partial (U^T V)}{\partial W^T} = \left(\cfrac{\partial U}{\partial W^T}\right)^T V + \left(\cfrac{\partial V}{\partial W^T}\right)^T U$$
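Formula (4) can be checked numerically in this row-vector convention. A sketch assuming NumPy, with made-up functions `U(W)` and `V(W)`; the right-hand side below is the transpose of (4), i.e. $V^T(\partial U/\partial W^T) + U^T(\partial V/\partial W^T)$:

```python
import numpy as np

def jac(f, x, eps=1e-6):
    """Central-difference Jacobian; a scalar f yields a 1 x n row vector."""
    x = np.asarray(x, dtype=float)
    y0 = np.atleast_1d(f(x))
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):
        d = np.zeros_like(x); d[j] = eps
        J[:, j] = (np.atleast_1d(f(x + d)) - np.atleast_1d(f(x - d))) / (2 * eps)
    return J

U = lambda w: np.array([w[0] * w[1], np.sin(w[2]), w[0]])  # U(W)
V = lambda w: np.array([w[2], w[0] + w[1], w[1] * w[2]])   # V(W)

w = np.array([0.4, -0.8, 1.3])
lhs = jac(lambda w_: U(w_) @ V(w_), w)        # d(U^T V)/dW^T, a 1 x 3 row
rhs = V(w) @ jac(U, w) + U(w) @ jac(V, w)     # V^T (dU/dW^T) + U^T (dV/dW^T)
print(np.allclose(lhs, rhs, atol=1e-5))  # True
```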
2) Derivative of the cross product of two column vectors $U$ and $V$ with respect to $W$:
$$\cfrac{\partial (U \times V)}{\partial W} = -\mathrm{Skew}(V)\,\cfrac{\partial U}{\partial W} + \mathrm{Skew}(U)\,\cfrac{\partial V}{\partial W} \qquad (5)$$
where
$$\mathrm{Skew}(U) = \begin{bmatrix} 0 & -U_3 & U_2 \\ U_3 & 0 & -U_1 \\ -U_2 & U_1 & 0 \end{bmatrix}$$
Here $\mathrm{Skew}(\cdot)$ is the skew-symmetric matrix that converts the cross product into an ordinary matrix-vector product, $U \times V = \mathrm{Skew}(U)\,V$. This is easy to verify by simply expanding the product.
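A sketch of the $\mathrm{Skew}$ operator and the identity $U \times V = \mathrm{Skew}(U)\,V$ (assuming NumPy; indices in the code are 0-based, unlike the 1-based subscripts in the matrix above):

```python
import numpy as np

def skew(u):
    """Skew-symmetric matrix such that skew(u) @ v == np.cross(u, v)."""
    return np.array([[    0, -u[2],  u[1]],
                     [ u[2],     0, -u[0]],
                     [-u[1],  u[0],     0]])

u = np.array([1.0, 2.0, 3.0])
v = np.array([-0.5, 0.4, 2.0])
print(np.allclose(skew(u) @ v, np.cross(u, v)))  # True
```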
When several cross products are chained, it helps to rewrite the formula in product-rule form, using the fact that the cross product is distributive over addition (the cross product with a Jacobian matrix is understood column by column):
$$\cfrac{\partial (U \times V)}{\partial W} = \cfrac{\partial U}{\partial W} \times V + U \times \cfrac{\partial V}{\partial W} \qquad (6)$$
Formulas (5) and (6) express the same result; they differ only in notation, with (5) obtained from (6) by rewriting the cross products through $\mathrm{Skew}$. The proof is as follows.
Proof: let $U=\begin{bmatrix} u_0 \\ u_1 \\ u_2 \end{bmatrix}$ and $V=\begin{bmatrix} v_0 \\ v_1 \\ v_2 \end{bmatrix}$ be three-dimensional vectors.
$$U \times V = \begin{vmatrix} i & j & k \\ u_0 & u_1 & u_2 \\ v_0 & v_1 & v_2 \end{vmatrix} = (u_1v_2 - u_2v_1)\,i + (u_2v_0 - u_0v_2)\,j + (u_0v_1 - u_1v_0)\,k$$
This is itself a vector; expanded into components:
$$U \times V = \begin{bmatrix} u_1v_2 - u_2v_1 \\ u_2v_0 - u_0v_2 \\ u_0v_1 - u_1v_0 \end{bmatrix}$$
Differentiating with respect to $W$ and expanding each component gives:
$$\begin{aligned}
\cfrac{\partial (U \times V)}{\partial W} &= \begin{bmatrix} \cfrac{\partial (u_1v_2 - u_2v_1)}{\partial W} \\ \cfrac{\partial (u_2v_0 - u_0v_2)}{\partial W} \\ \cfrac{\partial (u_0v_1 - u_1v_0)}{\partial W} \end{bmatrix}
= \cfrac{\partial (u_1v_2 - u_2v_1)}{\partial W}\,i + \cfrac{\partial (u_2v_0 - u_0v_2)}{\partial W}\,j + \cfrac{\partial (u_0v_1 - u_1v_0)}{\partial W}\,k \\
&= \left(\cfrac{\partial u_1}{\partial W}v_2+\cfrac{\partial v_2}{\partial W}u_1-\cfrac{\partial u_2}{\partial W}v_1-\cfrac{\partial v_1}{\partial W}u_2\right)i
+ \left(\cfrac{\partial u_2}{\partial W}v_0+\cfrac{\partial v_0}{\partial W}u_2-\cfrac{\partial u_0}{\partial W}v_2-\cfrac{\partial v_2}{\partial W}u_0\right)j
+ \left(\cfrac{\partial u_0}{\partial W}v_1+\cfrac{\partial v_1}{\partial W}u_0-\cfrac{\partial u_1}{\partial W}v_0-\cfrac{\partial v_0}{\partial W}u_1\right)k \\
&= \left[\left(\cfrac{\partial u_1}{\partial W}v_2-\cfrac{\partial u_2}{\partial W}v_1\right)i + \left(\cfrac{\partial u_2}{\partial W}v_0-\cfrac{\partial u_0}{\partial W}v_2\right)j + \left(\cfrac{\partial u_0}{\partial W}v_1-\cfrac{\partial u_1}{\partial W}v_0\right)k\right]
+ \left[\left(\cfrac{\partial v_2}{\partial W}u_1-\cfrac{\partial v_1}{\partial W}u_2\right)i + \left(\cfrac{\partial v_0}{\partial W}u_2-\cfrac{\partial v_2}{\partial W}u_0\right)j + \left(\cfrac{\partial v_1}{\partial W}u_0-\cfrac{\partial v_0}{\partial W}u_1\right)k\right] \\
&= \cfrac{\partial U}{\partial W} \times V - \cfrac{\partial V}{\partial W} \times U
= -V \times \cfrac{\partial U}{\partial W} + U \times \cfrac{\partial V}{\partial W}
= -\mathrm{Skew}(V)\,\cfrac{\partial U}{\partial W} + \mathrm{Skew}(U)\,\cfrac{\partial V}{\partial W}
\end{aligned}$$
In the last step we used the anticommutativity of the cross product: for any vectors $a$ and $b$,
$$a \times b = -b \times a$$
Unlike the dot-product rule, this argument is specific to three dimensions, where the cross product is defined. This completes the proof.
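Finally, formula (5) can be verified numerically along the same lines; a sketch assuming NumPy and the `skew` helper from above, with made-up `U(W)` and `V(W)`:

```python
import numpy as np

def jac(f, x, eps=1e-6):
    """Central-difference Jacobian of f at x."""
    x = np.asarray(x, dtype=float)
    y0 = np.asarray(f(x))
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):
        d = np.zeros_like(x); d[j] = eps
        J[:, j] = (f(x + d) - f(x - d)) / (2 * eps)
    return J

def skew(u):
    return np.array([[    0, -u[2],  u[1]],
                     [ u[2],     0, -u[0]],
                     [-u[1],  u[0],     0]])

U = lambda w: np.array([w[0] * w[1], w[1], np.cos(w[2])])  # U(W)
V = lambda w: np.array([w[2], w[0] ** 2, w[1] + w[2]])     # V(W)

w = np.array([0.6, -1.1, 0.9])
lhs = jac(lambda w_: np.cross(U(w_), V(w_)), w)            # d(U x V)/dW, 3 x 3
rhs = -skew(V(w)) @ jac(U, w) + skew(U(w)) @ jac(V, w)     # formula (5)
print(np.allclose(lhs, rhs, atol=1e-5))  # True
```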