# Differential operator

In mathematics, a **differential operator** is an operator defined as a function of the differentiation operator. It is helpful, as a matter of notation first, to consider differentiation as an abstract operation that accepts a function and returns another function (in the style of a higher-order function in computer science).

This article considers mainly linear differential operators, which are the most common type. However, non-linear differential operators also exist, such as the Schwarzian derivative.

The most common differential operator is the action of taking the derivative. Common notations for taking the first derivative with respect to a variable *x* include:

$$\frac{d}{dx}, \quad D, \quad D_x, \quad \text{and} \quad \partial_x.$$

When taking higher, *n*th-order derivatives, the operator may be written:

$$\frac{d^n}{dx^n}, \quad D^n, \quad \text{or} \quad D_x^n.$$

The derivative of a function *f* of an argument *x* is sometimes given as either of the following:

$$[f(x)]' \quad \text{or} \quad \frac{df}{dx}.$$

The *D* notation's use and creation is credited to Oliver Heaviside, who considered differential operators of the form

$$\sum_{k=0}^{n} c_k(x) D^k$$

in his study of differential equations.
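An operator of this form can be applied mechanically: multiply each coefficient by the corresponding derivative and sum. A minimal sympy sketch (the coefficient functions and the test function are illustrative choices, not from the text):

```python
# Apply an operator of the form c0(x) + c1(x)*D + c2(x)*D**2 to a function.
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x)

# Illustrative coefficient functions: c0 = x, c1 = 1, c2 = x**2.
coeffs = [x, 1, x**2]

# (sum_k c_k(x) D^k) f  =  sum_k c_k(x) * (k-th derivative of f)
result = sum(c * sp.diff(f, x, k) for k, c in enumerate(coeffs))

# result equals x*sin(x) + cos(x) - x**2*sin(x)
assert sp.simplify(result - (x*sp.sin(x) + sp.cos(x) - x**2*sp.sin(x))) == 0
```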

One of the most frequently seen differential operators is the Laplacian operator, defined by

$$\Delta = \nabla^2 = \sum_{k=1}^{n} \frac{\partial^2}{\partial x_k^2}.$$

Another differential operator is the Θ operator, or theta operator, defined by^{[1]}

$$\Theta = z \frac{d}{dz}.$$

This is sometimes also called the **homogeneity operator**, because its eigenfunctions are the monomials in *z*:

$$\Theta(z^k) = k z^k, \qquad k = 0, 1, 2, \ldots$$

In *n* variables the homogeneity operator is given by

$$\Theta = \sum_{k=1}^{n} x_k \frac{\partial}{\partial x_k}.$$

As in one variable, the eigenspaces of Θ are the spaces of homogeneous polynomials.
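The eigenfunction property of the theta operator Θ = *z* d/d*z* can be verified symbolically. A minimal sympy sketch:

```python
# Check that the monomials z^k are eigenfunctions of Theta = z * d/dz,
# with eigenvalue k.
import sympy as sp

z = sp.symbols('z')

def theta(f):
    """Apply the theta (homogeneity) operator z * d/dz to f."""
    return z * sp.diff(f, z)

for k in range(6):
    # Theta(z^k) - k*z^k should vanish identically.
    assert sp.simplify(theta(z**k) - k * z**k) == 0
```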

In writing, following common mathematical convention, the argument of a differential operator is usually placed on the right side of the operator itself. Sometimes an alternative notation is used: the result of applying the operator to the function on its left, the result of applying it to the function on its right, and the difference obtained by applying it to both sides are denoted by arrows as follows:

$$f \overleftarrow{\partial_x} g = g \cdot \partial_x f$$
$$f \overrightarrow{\partial_x} g = f \cdot \partial_x g$$
$$f \overleftrightarrow{\partial_x} g = f \cdot \partial_x g - g \cdot \partial_x f.$$

Such a bidirectional-arrow notation is frequently used for describing the probability current of quantum mechanics.

The differential operator del, also called *nabla*, is an important vector differential operator. It appears frequently in physics in places like the differential form of Maxwell's equations. In three-dimensional Cartesian coordinates, del is defined as

$$\nabla = \hat{x} \frac{\partial}{\partial x} + \hat{y} \frac{\partial}{\partial y} + \hat{z} \frac{\partial}{\partial z}.$$

Del defines the gradient, and is used to calculate the curl, divergence, and Laplacian of various objects.
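These four operations can be sketched with sympy's vector module; the scalar and vector fields below are illustrative choices, not from the text:

```python
# Gradient, divergence, curl, and Laplacian in Cartesian coordinates.
import sympy as sp
from sympy.vector import CoordSys3D, gradient, divergence, curl

N = CoordSys3D('N')                  # right-handed Cartesian frame

f = N.x**2 * N.y                     # a scalar field
v = N.x*N.i + N.y*N.j + N.z*N.k      # a vector field (the position field)

grad_f = gradient(f)                 # vector field: del f
div_v = divergence(v)                # scalar: del . v  (here equals 3)
curl_v = curl(v)                     # vector: del x v  (zero for this field)
lap_f = divergence(gradient(f))      # Laplacian as div(grad f), here 2*y

assert sp.simplify(div_v - 3) == 0
assert sp.simplify(lap_f - 2*N.y) == 0
```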

In the functional space of square-integrable functions on a real interval (*a*, *b*), the scalar product is defined by

$$\langle f, g \rangle = \int_a^b \overline{f(x)}\, g(x)\, dx.$$

Integrating by parts (assuming boundary contributions vanish), one can transfer derivatives from one function to the other; the operator *T*^{*} satisfying ⟨*Tf*, *g*⟩ = ⟨*f*, *T*^{*}*g*⟩ for all such *f* and *g* is called the **formal adjoint** of *T*.

A (formally) **self-adjoint** operator is an operator equal to its own (formal) adjoint.

If Ω is a domain in **R**^{n}, and *P* a differential operator on Ω, then the adjoint of *P* is defined in *L*^{2}(Ω) by duality in the analogous manner:

$$\langle Pf, g \rangle_{L^2(\Omega)} = \langle f, P^{*} g \rangle_{L^2(\Omega)}$$

for all smooth *L*^{2} functions *f*, *g*. Since smooth functions are dense in *L*^{2}, this defines the adjoint on a dense subset of *L*^{2}: P^{*} is a densely defined operator.
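The defining duality can be checked symbolically in the simplest case: for functions decaying at ±∞, ⟨*Df*, *g*⟩ = −⟨*f*, *Dg*⟩, so the formal adjoint of *D* = d/d*x* is −*D*. A sympy sketch (the two Gaussian-type test functions are illustrative choices):

```python
# Verify <Df, g> = -<f, Dg> on the real line for rapidly decaying functions,
# i.e. the formal adjoint of D = d/dx is -D.
import sympy as sp

x = sp.symbols('x', real=True)
f = sp.exp(-x**2)          # decays at +-infinity, so no boundary terms
g = x * sp.exp(-x**2)

lhs = sp.integrate(sp.diff(f, x) * g, (x, -sp.oo, sp.oo))   # <Df, g>
rhs = -sp.integrate(f * sp.diff(g, x), (x, -sp.oo, sp.oo))  # -<f, Dg>

assert sp.simplify(lhs - rhs) == 0
assert sp.simplify(lhs) != 0   # the agreement is not trivial
```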

The Sturm–Liouville operator is a well-known example of a formally self-adjoint operator. This second-order linear differential operator *L* can be written in the form

$$Lu = -\frac{d}{dx}\left[\,p(x)\frac{du}{dx}\,\right] + q(x)u = -(pu')' + qu.$$

This operator is formally self-adjoint, i.e. *L*^{*} = *L*; this property can be proven using the formal adjoint definition above.

This operator is central to Sturm–Liouville theory, where the eigenfunctions (analogues of eigenvectors) of this operator are considered.
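As a concrete illustration (an example chosen here, not taken from the text): with *p*(*x*) = 1 − *x*² and *q* = 0 one obtains the Legendre operator, whose eigenfunctions on [−1, 1] are the Legendre polynomials, with eigenvalues *n*(*n* + 1). A sympy sketch:

```python
# The Sturm-Liouville operator with p(x) = 1 - x**2, q(x) = 0 is the
# Legendre operator; the Legendre polynomials are its eigenfunctions.
import sympy as sp

x = sp.symbols('x')

def L(u):
    """Sturm-Liouville operator L u = -(p u')' + q u with p = 1 - x**2, q = 0."""
    return -sp.diff((1 - x**2) * sp.diff(u, x), x)

for n in range(5):
    u = sp.legendre(n, x)              # n-th Legendre polynomial
    # L u = n(n+1) u, the Sturm-Liouville eigenvalue relation
    assert sp.expand(L(u) - n * (n + 1) * u) == 0
```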

Any polynomial in *D* with function coefficients is also a differential operator. We may also compose differential operators by the rule

$$(D_1 \circ D_2)(f) = D_1(D_2(f)).$$

Some care is then required: first, any function coefficients in the operator *D*_{2} must be differentiable as many times as the application of *D*_{1} requires. To get a ring of such operators we must assume derivatives of all orders of the coefficients used. Second, this ring is not commutative: an operator *gD* is not in general the same as *Dg*. For example, we have the relation, basic in quantum mechanics:

$$Dx - xD = 1.$$
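The non-commutativity of *D* (differentiation) with *x* (multiplication by *x*) follows from the product rule, and can be checked for an arbitrary symbolic function. A sympy sketch:

```python
# Applied to any smooth f, (Dx - xD) f = f, so Dx - xD is the identity.
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')(x)        # an arbitrary smooth function of x

Dx_f = sp.diff(x * f, x)       # D(x f) = f + x f'  (product rule)
xD_f = x * sp.diff(f, x)       # x D(f) = x f'

commutator = sp.simplify(Dx_f - xD_f)
assert commutator == f         # (Dx - xD) f = f for every f
```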

The subring of operators that are polynomials in *D* with constant coefficients is, by contrast, commutative. It can be characterised another way: it consists of the translation-invariant operators.

The same constructions can be carried out with partial derivatives, differentiation with respect to different variables giving rise to operators that commute (see symmetry of second derivatives).

In differential geometry and algebraic geometry it is often convenient to have a coordinate-independent description of differential operators between two vector bundles. Let *E* and *F* be two vector bundles over a differentiable manifold *M*. An **R**-linear mapping of sections *P* : Γ(*E*) → Γ(*F*) is said to be a ***k*th-order linear differential operator** if it factors through the jet bundle *J*^{k}(*E*). In other words, there exists a linear mapping of vector bundles

$$i_P : J^k(E) \to F$$

such that *P* = *i*_{P} ∘ *j*^{k}, where *j*^{k}: Γ(*E*) → Γ(*J*^{k}(*E*)) is the prolongation that associates to any section of *E* its *k*-jet.

This just means that for a given section *s* of *E*, the value of *P*(*s*) at a point *x* ∈ *M* is fully determined by the *k*th-order infinitesimal behavior of *s* at *x*. In particular, this implies that *P*(*s*)(*x*) is determined by the germ of *s* at *x*, which is expressed by saying that differential operators are local. A foundational result is the Peetre theorem showing that the converse is also true: any (linear) local operator is differential.

This characterization of linear differential operators shows that they are particular mappings between modules over a commutative algebra, allowing the concept to be seen as a part of commutative algebra.

The conceptual step of writing a differential operator as something free-standing is attributed to Louis François Antoine Arbogast in 1800.^{[2]}