Kronecker Delta

\delta_{ij}

If i = j, then \delta_{ij} = 1.

If i \neq j, then \delta_{ij} = 0.

We will demonstrate this with a three-dimensional example:

 \begin{bmatrix} \delta_{11} & \delta_{12} & \delta_{13} \\ \delta_{21} & \delta_{22} & \delta_{23} \\ \delta_{31} & \delta_{32} & \delta_{33} \end{bmatrix}

 \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}

The Kronecker Delta refers to a single element of the Identity Matrix; the matrix as a whole is not a Kronecker Delta. If you build a matrix whose entries are Kronecker Deltas, you get the Identity Matrix.
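The relationship between the delta and the Identity Matrix can be checked numerically. A minimal sketch (the function name `kronecker_delta` is just a label chosen here):

```python
import numpy as np

# The Kronecker delta: 1 when the indices match, 0 otherwise.
def kronecker_delta(i, j):
    return 1 if i == j else 0

# Assemble every delta_{ij} into a 3x3 matrix.
n = 3
matrix = np.array([[kronecker_delta(i, j) for j in range(n)]
                   for i in range(n)])

# The result is exactly the identity matrix.
print(np.array_equal(matrix, np.identity(n)))  # True
```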

Appendix A

It helps to see what happens when a 3×3 matrix constructed with Kronecker Deltas is multiplied against an old vector v to make a new vector v'.

 \begin{bmatrix} \delta^1_1 & \delta^1_2 & \delta^1_3 \\ \delta^2_1 & \delta^2_2 & \delta^2_3 \\ \delta^3_1 & \delta^3_2 & \delta^3_3 \end{bmatrix} \begin{bmatrix} v^1 \\ v^2 \\ v^3 \end{bmatrix} = \begin{bmatrix} v'^1 \\ v'^2 \\ v'^3 \end{bmatrix}

This lets us see the following:

v'^1 =\delta^1_1 v^1 + \delta^1_2 v^2 + \delta^1_3 v^3

v'^2 =\delta^2_1 v^1 + \delta^2_2 v^2 + \delta^2_3 v^3

v'^3 =\delta^3_1 v^1 + \delta^3_2 v^2 + \delta^3_3 v^3

We can change over to partial-derivative notation for the coefficients as follows:

v'^1 = \dfrac {\partial x'^1} {\partial x^1} v^1 +  \dfrac {\partial x'^1} {\partial x^2}  v^2 +  \dfrac {\partial x'^1} {\partial x^3}  v^3

v'^2 = \dfrac {\partial x'^2} {\partial x^1} v^1 +  \dfrac {\partial x'^2} {\partial x^2}  v^2 +  \dfrac {\partial x'^2} {\partial x^3}  v^3

v'^3 = \dfrac {\partial x'^3} {\partial x^1} v^1 +  \dfrac {\partial x'^3} {\partial x^2}  v^2 +  \dfrac {\partial x'^3} {\partial x^3}  v^3
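Because the delta matrix is the identity, this transformation leaves the vector unchanged: each component v'^i picks up only the one term where i = j. A quick numeric check, using an arbitrary example vector:

```python
import numpy as np

# Any 3-component vector would do; these values are arbitrary.
v = np.array([4.0, -1.0, 7.0])

# The matrix of Kronecker deltas is the identity, so v' = I v = v.
identity = np.identity(3)
v_prime = identity @ v

print(v_prime)  # same components as v
```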

Appendix B

The Kronecker Delta is sometimes called the Substitution Tensor.

\delta^i_j w_i = w_j

  • Let i run from 1 to 4.
  • Let j=3
\delta^i_j w_i = (0)w_1 + (0)w_2 + (1)w_3 + (0)w_4 = w_3

For this example there was only one term where i = j. However, we could have had more going on. For example, if we had had another index, L, running from 1 to 4, then we would have had four instances where i = j and thus four terms getting that (1) coefficient.
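The substitution behavior can be sketched numerically. Here the values of w and the fixed index j = 3 come from the example above; the summation loop plays the role of the implied sum over i:

```python
# The Kronecker delta as a substitution tensor: delta^i_j w_i = w_j.
def kronecker_delta(i, j):
    return 1 if i == j else 0

w = [5.0, 6.0, 7.0, 8.0]   # w_1 .. w_4 (arbitrary example values)
j = 3                       # the fixed index from the example

# Sum over i from 1 to 4; only the i == j term survives.
result = sum(kronecker_delta(i, j) * w[i - 1] for i in range(1, 5))
print(result)  # 7.0, which is w_3
```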


Appendix C

In order to see what is happening, it may help to look at the matrix multiplication shown below:

 \begin{bmatrix} 0 & 3 \\ 2 & 0 \end{bmatrix}  \begin{bmatrix} 1 \\ 10 \end{bmatrix} =  \begin{bmatrix} 30 \\ 2 \end{bmatrix}

For the new vector, the first component is completely blind to what the first component of the old vector was: it becomes three times the value of the old vector's second component. Likewise, the new second component depends only on the old first component.

v'^1 = \dfrac {\partial x'^1} {\partial x^1} v^1 +  \dfrac {\partial x'^1} {\partial x^2}  v^2

v'^2 = \dfrac {\partial x'^2} {\partial x^1} v^1 +  \dfrac {\partial x'^2} {\partial x^2}  v^2

v'^1 = 0 v^1 +  3  v^2

v'^2 = 2 v^1 +  0  v^2
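The same multiplication, carried out numerically with the matrix and vector from the example:

```python
import numpy as np

# The 2x2 example: the matrix is not the identity, so components mix.
m = np.array([[0, 3],
              [2, 0]])
v = np.array([1, 10])

# v'^1 = 0*v^1 + 3*v^2 = 30;  v'^2 = 2*v^1 + 0*v^2 = 2.
v_prime = m @ v
print(v_prime)  # [30  2]
```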

The expression \dfrac {\partial x'^i} {\partial x^j} can be read in two ways. First, it is the (i, j) component of the transformation matrix. Second, it is a slope: the rate of change of x'^i as x^j varies, with every coordinate other than x^j held constant.
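The slope reading can be checked with a finite difference. Here the transform x'^1 = 0\,x^1 + 3\,x^2 is taken from the 2×2 example above; the function name `x_prime_1` is just a label chosen for this sketch:

```python
# x'^1 as a function of the old coordinates, from the 2x2 example.
def x_prime_1(x1, x2):
    return 0 * x1 + 3 * x2

# Hold x^1 constant and nudge x^2: the slope is the matrix entry 3.
h = 1e-6
x1, x2 = 1.0, 10.0
slope = (x_prime_1(x1, x2 + h) - x_prime_1(x1, x2)) / h
print(slope)  # approximately 3
```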
