In mathematics, orthogonal, as a simple adjective not part of a longer phrase, is a generalization of perpendicular. It means "at right angles". The word comes from the Greek ὀρθός (orthos), meaning "straight", and γωνία (gonia), meaning "angle". Two streets that cross each other at a right angle are orthogonal to one another. In recent years, "perpendicular" has come to be used more in relation to right angles outside of a coordinate plane context, whereas "orthogonal" is used when discussing vectors or coordinate geometry.

## Explanation

Formally, two vectors x and y in an inner product space V are orthogonal if their inner product $\langle x, y \rangle$ is zero. This situation is denoted $x \perp y$.

Two vector subspaces A and B of a vector space V are called orthogonal subspaces if each vector in A is orthogonal to each vector in B. The largest subspace that is orthogonal to a given subspace is its orthogonal complement.

A linear transformation $T : V \rightarrow V$ is called an orthogonal linear transformation if it preserves the inner product. That is, for all pairs of vectors x and y in the inner product space V,

$\langle Tx, Ty \rangle = \langle x, y \rangle.$

This means that T preserves the angle between x and y, and that the lengths of Tx and x are equal.
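As a quick numerical illustration (a minimal pure-Python sketch, not tied to any particular library), a 2×2 rotation matrix is an orthogonal linear transformation, and one can verify that it leaves the inner product unchanged:

```python
import math

def rotation(theta):
    # 2x2 rotation matrix, stored as nested lists: a standard example
    # of an orthogonal linear transformation of the plane.
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

def apply(T, v):
    # Matrix-vector product T v.
    return [T[0][0] * v[0] + T[0][1] * v[1],
            T[1][0] * v[0] + T[1][1] * v[1]]

def dot(u, v):
    # Standard inner product on R^2.
    return sum(a * b for a, b in zip(u, v))

T = rotation(0.7)                 # arbitrary angle
x, y = [3.0, 1.0], [-2.0, 5.0]

# <Tx, Ty> = <x, y>: the transformation preserves inner products,
# hence both angles and lengths.
assert abs(dot(apply(T, x), apply(T, y)) - dot(x, y)) < 1e-12
```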

A term rewriting system is said to be orthogonal if it is left-linear (each variable occurs only once on the left-hand side of each reduction rule) and non-ambiguous. Orthogonal term rewriting systems are confluent: whenever a term can be rewritten in more than one way, the results can be rewritten further to a common term.

The word normal is sometimes also used in place of orthogonal. However, normal can also refer to unit vectors. In particular, orthonormal refers to a collection of vectors that are both orthogonal and normal (of unit length). So, using the term normal to mean "orthogonal" is often avoided.

## In Euclidean vector spaces

In 2- or 3-dimensional Euclidean space, two vectors are orthogonal if their dot product is zero, i.e. they make an angle of 90° or π/2 radians. Hence orthogonality of vectors is a generalization of the concept of perpendicular. In terms of Euclidean subspaces, the orthogonal complement of a line is the plane perpendicular to it, and vice versa. Note however that there is no correspondence with regard to perpendicular planes, because vectors in subspaces start from the origin.

In 4-dimensional Euclidean space, the orthogonal complement of a line is a hyperplane and vice versa, and that of a plane is a plane.

Several vectors are called pairwise orthogonal if any two of them are orthogonal, and a set of such vectors is called an orthogonal set. Such a set is an orthonormal set if all its vectors are unit vectors. Non-zero pairwise orthogonal vectors are always linearly independent.
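These definitions are easy to check mechanically. The short pure-Python sketch below (illustrative only) verifies pairwise orthogonality for a sample set of vectors and then normalizes it into an orthonormal set:

```python
def dot(u, v):
    # Standard inner product on R^n.
    return sum(a * b for a, b in zip(u, v))

def pairwise_orthogonal(vectors, tol=1e-12):
    # True if every distinct pair has (numerically) zero inner product.
    return all(abs(dot(vectors[i], vectors[j])) < tol
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

vs = [(1, 3, 2), (3, -1, 0), (1/3, 1, -5/3)]
assert pairwise_orthogonal(vs)

# Dividing each vector by its length turns the orthogonal set
# into an orthonormal set.
unit = [tuple(c / dot(v, v) ** 0.5 for c in v) for v in vs]
assert all(abs(dot(u, u) - 1) < 1e-12 for u in unit)
```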

## Orthogonal functions

It is common to use the following inner product for two functions f and g:

$\langle f, g\rangle_w = \int_a^b f(x)g(x)w(x)\,dx.$

Here we introduce a nonnegative weight function w(x) in the definition of this inner product.

We say that those functions are orthogonal if that inner product is zero:

$\int_a^b f(x)g(x)w(x)\,dx = 0.$

We write the norms with respect to this inner product and the weight function as

$\|f\|_w = \sqrt{\langle f, f\rangle_w}$

The members of a sequence { fi : i = 1, 2, 3, . . . } are:

• orthogonal if
$\langle f_i, f_j \rangle=\int_{-\infty}^\infty f_i(x) f_j(x) w(x)\,dx=\|f_i\|^2\delta_{i,j}=\|f_j\|^2\delta_{i,j}$
• orthonormal if
$\langle f_i, f_j \rangle=\int_{-\infty}^\infty f_i(x) f_j(x) w(x)\,dx=\delta_{i,j}$

where

$\delta_{i,j}=\left\{\begin{matrix}1 & \mathrm{if}\ i=j \\ 0 & \mathrm{if}\ i\neq j\end{matrix}\right.$

is the Kronecker delta. In other words, any two of them are orthogonal, and the norm of each is 1 in the case of the orthonormal sequence. See in particular orthogonal polynomials.
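These relations can be checked numerically. The sketch below approximates the weighted inner product with a simple midpoint rule (illustrative, not a rigorous integration) and verifies the orthogonality of a few members of the trigonometric family {1, sin nx, cos nx} on [0, 2π] with unit weight:

```python
import math

def inner(f, g, a, b, w=lambda x: 1.0, n=10000):
    # Midpoint-rule approximation of the weighted inner product
    # <f, g>_w = integral from a to b of f(x) g(x) w(x) dx.
    h = (b - a) / n
    return sum(f(x) * g(x) * w(x)
               for x in (a + (k + 0.5) * h for k in range(n))) * h

a, b = 0.0, 2.0 * math.pi

# Distinct members of {1, sin nx, cos nx} are orthogonal on [0, 2*pi].
assert abs(inner(math.sin, math.cos, a, b)) < 1e-6
assert abs(inner(math.sin, lambda x: math.sin(2 * x), a, b)) < 1e-6

# A nonzero function is not orthogonal to itself: integral of sin^2 is pi.
assert abs(inner(math.sin, math.sin, a, b) - math.pi) < 1e-6
```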

## Examples

• The vectors (1, 3, 2), (3, −1, 0), (1/3, 1, −5/3) are orthogonal to each other, since (1)(3) + (3)(−1) + (2)(0) = 0, (3)(1/3) + (−1)(1) + (0)(−5/3) = 0, and (1)(1/3) + (3)(1) − (2)(5/3) = 0. Observe also that the dot product of a vector with itself is the square of its norm, so to check for orthogonality we need only check the dot products with every other vector.
• The vectors (1, 0, 1, 0, . . . )T and (0, 1, 0, 1, . . . )T are orthogonal to each other. Clearly the dot product of these vectors is 0. We can then make the obvious generalization to consider the vectors in $\mathbb{Z}_2^n$:
$\mathbf{v}_k = \sum_{i=0\atop ai+k < n}^{n/a} \mathbf{e}_{ai+k}$
for some positive integer a, and for 0 ≤ k ≤ a − 1; these vectors are orthogonal because their supports are disjoint. For example (1, 0, 0, 1, 0, 0, 1, 0)T, (0, 1, 0, 0, 1, 0, 0, 1)T, (0, 0, 1, 0, 0, 1, 0, 0)T are orthogonal.
• Take the polynomials 2t + 3 and 5t² + t − 17/9. These functions are orthogonal with respect to a unit weight function on the interval from −1 to 1. Their product is 10t³ + 17t² − (7/9)t − 17/3, and now,
$\int_{-1}^{1} \left(10t^3+17t^2-{7\over 9}t-{17\over 3}\right)\,dt = \left[{5\over 2}t^4+{17\over 3}t^3-{7\over 18}t^2-{17\over 3}t\right]_{-1}^{1}$
$=\left({5\over 2}(1)^4+{17\over 3}(1)^3-{7\over 18}(1)^2-{17\over 3}(1)\right)-\left({5\over 2}(-1)^4+{17\over 3}(-1)^3-{7\over 18}(-1)^2-{17\over 3}(-1)\right)$
$={19\over 9}-{19\over 9}=0.$
• The functions 1, sin(nx), cos(nx) : n = 1, 2, 3, . . . are orthogonal with respect to Lebesgue measure on the interval from 0 to 2π. This fact is basic in the theory of Fourier series.
• Various eponymously named polynomial sequences are sequences of orthogonal polynomials. In particular:
• The Hermite polynomials are orthogonal with respect to the normal distribution with expected value 0.
• The Legendre polynomials are orthogonal with respect to the uniform distribution on the interval from −1 to 1. Note that people sometimes refer to the more general associated Legendre polynomials as simply Legendre polynomials.
• The Laguerre polynomials are orthogonal with respect to the exponential distribution. Somewhat more general Laguerre polynomial sequences are orthogonal with respect to gamma distributions.
• The Chebyshev polynomials of the first kind are orthogonal with respect to the weight function $1/\sqrt{1-x^2}$ on the interval from −1 to 1.
• The Chebyshev polynomials of the second kind are orthogonal with respect to the Wigner semicircle distribution.
• In quantum mechanics, two eigenstates of a wavefunction, ψm and ψn, are orthogonal if they correspond to different eigenvalues. This means, in Dirac notation, that $\langle \psi_m | \psi_n \rangle = 0$ unless ψm and ψn correspond to the same eigenvalue. This follows from the fact that Schrödinger's equation is a Sturm–Liouville equation (in Schrödinger's formulation) or that observables are given by Hermitian operators (in Heisenberg's formulation).
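The polynomial orthogonality worked out symbolically in the third example above can also be double-checked numerically. The sketch below (illustrative only) approximates the unit-weight inner product on [−1, 1] with a midpoint rule:

```python
def inner(f, g, a, b, n=10000):
    # Midpoint-rule approximation of the integral from a to b
    # of f(t) g(t) dt (unit weight function).
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h

f = lambda t: 2 * t + 3
g = lambda t: 5 * t**2 + t - 17/9

# <f, g> over [-1, 1] vanishes, confirming the orthogonality shown above.
assert abs(inner(f, g, -1.0, 1.0)) < 1e-6

# Neither function is orthogonal to itself.
assert inner(f, f, -1.0, 1.0) > 0
```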

## Derived meanings

Other meanings of the word orthogonal evolved from its earlier use in mathematics.

### Art

In art, the imagined perspective lines pointing to the vanishing point are referred to as "orthogonal lines".

### Computer science

Orthogonality is a system design property facilitating feasibility and compactness of complex designs. Orthogonality guarantees that modifying the technical effect produced by a component of a system neither creates nor propagates side effects to other components of the system. The emergent behavior of a system consisting of components should be controlled strictly by formal definitions of its logic and not by side effects resulting from poor integration, i.e. non-orthogonal design of modules and interfaces. Orthogonality reduces testing and development time because it is easier to verify designs that neither cause side effects nor depend on them.

For example, a car has orthogonal components and controls (e.g. accelerating the vehicle does not influence anything else but the components involved exclusively with the acceleration function). On the other hand, a non-orthogonal design might have its steering influence its braking (e.g. Electronic Stability Control [1]), or its speed tweak its suspension. Consequently, this usage is seen to be derived from the use of orthogonal in mathematics: one may project a vector onto a subspace by projecting it onto each member of a set of basis vectors separately and adding the projections if and only if the basis vectors are mutually orthogonal.

An instruction set is said to be orthogonal if any instruction can use any register in any addressing mode. This terminology results from considering an instruction as a vector whose components are the instruction fields. One field identifies the registers to be operated upon, and another specifies the addressing mode. An orthogonal instruction set uniquely encodes all combinations of registers and addressing modes.
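The "vector of fields" view can be made concrete. The sketch below uses a small hypothetical ISA (the opcodes, registers, and addressing modes are invented purely for illustration) in which every field combination is a valid, uniquely encoded instruction:

```python
from itertools import product

# A toy, hypothetical instruction set used only for illustration.
# Orthogonality means every (opcode, register, addressing mode)
# combination is a valid instruction.
OPCODES = ["LOAD", "STORE", "ADD"]
REGISTERS = ["R0", "R1", "R2", "R3"]
MODES = ["immediate", "direct", "indirect", "indexed"]

# The full Cartesian product of the instruction fields:
# 3 opcodes * 4 registers * 4 modes = 48 distinct instructions.
instructions = list(product(OPCODES, REGISTERS, MODES))
assert len(instructions) == 48
assert len(set(instructions)) == 48  # each combination encoded uniquely
```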

In radio communications, multiple-access schemes are orthogonal when an ideal receiver can completely reject arbitrarily strong unwanted signals using different basis functions than the desired signal. One such scheme is TDMA, where the orthogonal basis functions are non-overlapping rectangular pulses ("time slots").

Another scheme is orthogonal frequency-division multiplexing (OFDM), which refers to the use, by a single transmitter, of a set of frequency-multiplexed signals with the exact minimum frequency spacing needed to make them orthogonal so that they do not interfere with each other. Well-known examples include the a and g versions of 802.11 Wi-Fi; WiMAX; DVB-T, the terrestrial digital TV broadcast system used in most of the world outside North America; and DMT, the standard form of ADSL.

### Statistics, econometrics, and economics

When performing statistical analysis, variables that affect a particular result are said to be orthogonal if they are uncorrelated. [2] That is to say that by varying each separately, one can predict the combined effect of varying them jointly. If correlation is present, the factors are not orthogonal. In addition, orthogonality restrictions are necessary for valid inference in regression analysis. This meaning of orthogonality derives from the mathematical one, because orthogonal vectors are linearly independent.
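As a small illustration (the two sequences are hand-picked toy data, not drawn from any real study), the sketch below computes a sample Pearson correlation and shows two factors that are uncorrelated, i.e. "orthogonal" in this sense:

```python
import math

def corr(xs, ys):
    # Sample Pearson correlation coefficient of two equal-length sequences.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Two mean-centred toy sequences whose cross-products cancel exactly.
xs = [1.0, -1.0, 1.0, -1.0]
ys = [1.0, 1.0, -1.0, -1.0]

assert abs(corr(xs, ys)) < 1e-12        # uncorrelated: orthogonal factors
assert abs(corr(xs, xs) - 1.0) < 1e-12  # a variable is fully correlated with itself
```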

### Taxonomy

In taxonomy, an orthogonal classification is one in which no item is a member of more than one group, that is, the classifications are mutually exclusive.

### Combinatorics

In combinatorics, two n×n Latin squares are said to be orthogonal if their superimposition yields all possible n² combinations of entries.
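A short sketch (illustrative only) that verifies this condition for a pair of 3×3 Latin squares:

```python
def is_latin(square):
    # Each symbol 0..n-1 occurs exactly once in every row and column.
    n = len(square)
    symbols = set(range(n))
    rows_ok = all(set(row) == symbols for row in square)
    cols_ok = all({square[r][c] for r in range(n)} == symbols
                  for c in range(n))
    return rows_ok and cols_ok

def are_orthogonal(A, B):
    # Superimposing A and B must yield all n^2 ordered pairs of entries.
    n = len(A)
    pairs = {(A[r][c], B[r][c]) for r in range(n) for c in range(n)}
    return len(pairs) == n * n

A = [[0, 1, 2],
     [1, 2, 0],
     [2, 0, 1]]
B = [[0, 1, 2],
     [2, 0, 1],
     [1, 2, 0]]

assert is_latin(A) and is_latin(B)
assert are_orthogonal(A, B)
```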

### Chemistry

In chemistry, orthogonal protection is a strategy allowing the deprotection of functional groups independently of each other.