# Derivation in Algebra

It is well known that the derivative is an operation on functions which is linear

$\displaystyle \begin{array}{c} \left(f+g\right)'=f'+g'\\ \left(\lambda f\right)'=\lambda f'\end{array}$

and satisfies the Leibniz rule

$\displaystyle \left(fg\right)'=f'g+fg'$

In fact, these two properties are purely algebraic and can be generalized with no reference to infinitesimal calculus. This approach has become common in modern mathematics, since it allows profitable interactions between multiple domains such as algebra, analysis, and geometry.

DERIVATION

Let A be an algebra over a commutative field K. A derivation of A is a linear map ${D\colon A\to A}$ satisfying the Leibniz rule for all ${f,g\in A}$

$\displaystyle D\left(fg\right)=D\left(f\right)g+fD\left(g\right)$

If A is unital then ${D\left(1\right)=0}$, because ${D\left(1\right)=D\left(1\cdot1\right)=D\left(1\right)\cdot1+1\cdot D\left(1\right)=D\left(1\right)+D\left(1\right)}$.

We denote by ${\mathrm{Der}\left(A\right)}$ the set of all derivations of A.

Clearly, the sum of two derivations is a derivation, and so is a scalar multiple, so ${\mathrm{Der}\left(A\right)}$ is a vector space over K. One might be tempted to go further and give it a module structure over A. Unfortunately, this fails in general. Indeed, if ${a\in A}$ and ${D\in\mathrm{Der}\left(A\right)}$, then for every ${f,g\in A}$

$\displaystyle \left(aD\right)\left(fg\right)=aD\left(f\right)g+afD\left(g\right)$

whereas the Leibniz rule for aD would require ${\left(aD\right)\left(fg\right)=aD\left(f\right)g+faD\left(g\right)}$. The two expressions agree only when ${af=fa}$. Thus ${\mathrm{Der}\left(A\right)}$ is not an A-module in general; it is one when A is commutative.

The product of two derivations is not a derivation, apart from trivial cases, but it is straightforward to show that the commutator ${\left[d_{1},d_{2}\right]=d_{1}d_{2}-d_{2}d_{1}}$ of two derivations is a derivation. Thus ${\mathrm{Der}\left(A\right)}$ has a Lie algebra structure:

$\displaystyle \left[d_{1},d_{2}\right]+\left[d_{2},d_{1}\right]=0$

$\displaystyle \left[d_{1},\left[d_{2},d_{3}\right]\right]+\left[d_{2},\left[d_{3},d_{1}\right]\right]+\left[d_{3},\left[d_{1},d_{2}\right]\right]=0$
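Both facts can be observed numerically. The sketch below uses two inner derivations of ${M_{3}\left(\mathbb{R}\right)}$ built from randomly chosen matrices (an arbitrary test case, not a proof): the composition fails the Leibniz rule, while the commutator satisfies it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two inner derivations of M_3(R): d_i = [M_i, .] (random test matrices).
M1 = rng.standard_normal((3, 3))
M2 = rng.standard_normal((3, 3))

def d1(x):
    return M1 @ x - x @ M1

def d2(x):
    return M2 @ x - x @ M2

def prod(x):     # the composition d1 ∘ d2
    return d1(d2(x))

def bracket(x):  # the commutator [d1, d2]
    return d1(d2(x)) - d2(d1(x))

f = rng.standard_normal((3, 3))
g = rng.standard_normal((3, 3))

# The composition is generally not a derivation...
print(np.allclose(prod(f @ g), prod(f) @ g + f @ prod(g)))           # typically False
# ...but the commutator always is.
print(np.allclose(bracket(f @ g), bracket(f) @ g + f @ bracket(g)))  # True
```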

The commutator also allows us to associate an operator to each ${f\in A}$

$\displaystyle g\mapsto\left[f,g\right]$

It is easy to check that ${\left[f,\cdot\right]}$ is a derivation

• ${\left[f,g+h\right]=\left[f,g\right]+\left[f,h\right]}$
• ${\left[f,kh\right]=k\left[f,h\right]}$
• ${\left[f,gh\right]=\left[f,g\right]h+g\left[f,h\right]}$

Such a derivation ${\left[f,\cdot\right]}$ is called an inner derivation, since it arises from the algebra itself.
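The three properties above can be verified numerically in a matrix algebra; the sketch below picks random 2×2 real matrices as arbitrary test data.

```python
import numpy as np

rng = np.random.default_rng(1)
f, g, h = (rng.standard_normal((2, 2)) for _ in range(3))
k = 3.0  # a scalar from K

def ad_f(x):  # the inner derivation [f, .]
    return f @ x - x @ f

print(np.allclose(ad_f(g + h), ad_f(g) + ad_f(h)))          # True (additivity)
print(np.allclose(ad_f(k * h), k * ad_f(h)))                # True (homogeneity)
print(np.allclose(ad_f(g @ h), ad_f(g) @ h + g @ ad_f(h)))  # True (Leibniz)
```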

EXAMPLES

• Let ${C^{\infty}\left(\mathbb{R}\right)}$ be the algebra of smooth functions on $\mathbb{R}$. Then the operator ${f\mapsto f'}$ is a derivation.
• Let ${M_{n}\left(K\right)}$ be the algebra of square matrices of order n over a field K. Then given a matrix ${M\in M_{n}\left(K\right)}$ the operator ${N\mapsto\left[M,N\right]}$ is an inner derivation.
• Let ${C^{\infty}\left(M\right)}$ be the algebra of smooth functions on a differentiable manifold M. Then a ${C^{\infty}}$ vector field X gives rise to a derivation defined by ${\left(Xf\right)\left(p\right)=X_{p}f}$, where f is any smooth function and ${X_{p}}$ is the directional derivative at the point ${p\in M}$. The Leibniz rule is satisfied since ${X\left(fg\right)=\left(Xf\right)g+f\left(Xg\right)}$.
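The last example can be illustrated symbolically. The sketch below uses SymPy with a hypothetical vector field ${X=x\,\partial_{x}+y^{2}\,\partial_{y}}$ on ${\mathbb{R}^{2}}$ (any choice of smooth coefficient functions would do) and checks the Leibniz rule on two sample functions.

```python
import sympy as sp

x, y = sp.symbols('x y')

# A hypothetical vector field X = x d/dx + y^2 d/dy on R^2,
# acting on smooth functions as a derivation of C^inf(R^2).
def X(h):
    return x * sp.diff(h, x) + y**2 * sp.diff(h, y)

# Two sample smooth functions.
f = sp.sin(x) * y
g = sp.exp(x + y)

# Leibniz rule: X(fg) - (X(f) g + f X(g)) should simplify to 0.
print(sp.simplify(X(f * g) - (X(f) * g + f * X(g))))  # 0
```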