An imaginary number is a real number multiplied by the imaginary unit i, which is defined by its property i² = −1. The square of an imaginary number bi is −b². For example, 5i is an imaginary number, and its square is −25. By definition, zero is considered to be both real and imaginary.
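These defining identities can be checked directly in any language with built-in complex arithmetic; the sketch below uses Python, where the imaginary unit is written 1j.

```python
# Python writes the imaginary unit as 1j.
i = 1j

# The defining property: i squared is -1.
print(i ** 2)       # (-1+0j)

# The square of an imaginary number bi is -b²: here (5i)² = -25.
print((5j) ** 2)    # (-25+0j)
```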
Originally coined in the 17th century by René Descartes as a derogatory term and regarded as fictitious or useless, the concept gained wide acceptance following the work of Leonhard Euler (in the 18th century) and Augustin-Louis Cauchy and Carl Friedrich Gauss (in the early 19th century).
An imaginary number bi can be added to a real number a to form a complex number of the form a + bi, where the real numbers a and b are called, respectively, the real part and the imaginary part of the complex number.
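The decomposition of a complex number into its real and imaginary parts can be illustrated with Python's built-in complex type, using illustrative values a = 3 and b = 4:

```python
# Form the complex number a + bi from the real numbers a and b.
a, b = 3.0, 4.0
z = complex(a, b)   # represents 3 + 4i

# .real and .imag recover the real part a and the imaginary part b.
print(z.real)       # 3.0
print(z.imag)       # 4.0

# Equivalently: adding the imaginary number bi to the real number a.
print(a + b * 1j)   # (3+4j)
```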
Although the Greek mathematician and engineer Hero of Alexandria is noted as the first to present a calculation involving the square root of a negative number, it was Rafael Bombelli who first set down the rules for multiplication of complex numbers in 1572. The concept had appeared in print earlier, for instance in the work of Gerolamo Cardano. At the time, imaginary numbers and negative numbers were poorly understood and were regarded by some as fictitious or useless, much as zero once was. Many other mathematicians were slow to adopt imaginary numbers, including René Descartes, who wrote about them in his La Géométrie, where he coined the term imaginary and intended it to be derogatory. The use of imaginary numbers was not widely accepted until the work of Leonhard Euler (1707–1783) and Carl Friedrich Gauss (1777–1855). The geometric significance of complex numbers as points in a plane was first described by Caspar Wessel (1745–1818).
In 1843, William Rowan Hamilton extended the idea of an axis of imaginary numbers in the plane to a four-dimensional space of quaternion imaginaries in which three of the dimensions are analogous to the imaginary numbers in the complex field.
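Hamilton's quaternion imaginaries i, j, k satisfy i² = j² = k² = ijk = −1. These relations can be verified with a minimal sketch of the Hamilton product; the helper function `qmul` below is purely illustrative, not any particular library's API.

```python
# A quaternion is represented as a 4-tuple (w, x, y, z) = w + xi + yj + zk.
def qmul(p, q):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    )

# The three imaginary units of the quaternion algebra.
I, J, K = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)

# Each unit squares to -1, analogous to the complex imaginary unit.
print(qmul(I, I))           # (-1, 0, 0, 0)
print(qmul(J, J))           # (-1, 0, 0, 0)

# Hamilton's famous relation: ijk = -1.
print(qmul(qmul(I, J), K))  # (-1, 0, 0, 0)
```

Note that, unlike complex multiplication, the Hamilton product is not commutative: qmul(I, J) gives k while qmul(J, I) gives −k.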