In constrained least squares one solves a linear least squares problem with an additional constraint on the solution. That is, the unconstrained equation $\mathbf{X}\boldsymbol{\beta} = \mathbf{y}$ must be fit as closely as possible (in the least squares sense) while ensuring that some other property of $\boldsymbol{\beta}$ is maintained.
There are often special-purpose algorithms for solving such problems efficiently. Some examples of constraints are given below:
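One common such constraint is a linear equality constraint, which can be solved exactly by augmenting the normal equations with Lagrange multipliers (the KKT system). The sketch below is illustrative only: the data `X`, `y` and the constraint `A`, `c` (here, "coefficients sum to one") are made-up examples, not taken from this article.

```python
import numpy as np

# Minimize ||X b - y||_2 subject to the equality constraint A b = c,
# via the KKT (Lagrange multiplier) system. All data here is synthetic.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))
y = rng.standard_normal(8)
A = np.array([[1.0, 1.0, 1.0]])  # example constraint: coefficients sum to 1
c = np.array([1.0])

p = X.shape[1]
m = A.shape[0]
# KKT system:  [ X^T X   A^T ] [ b      ]   [ X^T y ]
#              [ A       0   ] [ lambda ] = [ c     ]
K = np.block([[X.T @ X, A.T],
              [A, np.zeros((m, m))]])
rhs = np.concatenate([X.T @ y, c])
sol = np.linalg.solve(K, rhs)
beta = sol[:p]          # constrained least-squares estimate
lam = sol[p:]           # Lagrange multiplier(s)
print(beta.sum())       # satisfies the constraint, so very close to 1.0
```

The same pattern extends to any full-rank linear equality constraint; inequality or sign constraints instead require iterative solvers such as non-negative least squares.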
If the constraint only applies to some of the variables, the mixed problem may be solved using separable least squares by letting $\mathbf{X} = [\mathbf{X}_1 \; \mathbf{X}_2]$ and $\boldsymbol{\beta}^{\mathrm{T}} = [\boldsymbol{\beta}_1^{\mathrm{T}} \; \boldsymbol{\beta}_2^{\mathrm{T}}]$ represent the unconstrained (1) and constrained (2) components. Then substituting the least-squares solution for $\boldsymbol{\beta}_1$, i.e.

$$\hat{\boldsymbol{\beta}}_1 = \mathbf{X}_1^{+} (\mathbf{y} - \mathbf{X}_2 \boldsymbol{\beta}_2)$$

(where $^{+}$ indicates the Moore–Penrose pseudoinverse) back into the original expression gives (following some rearrangement) an equation that can be solved as a purely constrained problem in $\boldsymbol{\beta}_2$:

$$\min_{\boldsymbol{\beta}_2} \left\| \mathbf{P} (\mathbf{y} - \mathbf{X}_2 \boldsymbol{\beta}_2) \right\|$$

where $\mathbf{P} := \mathbf{I} - \mathbf{X}_1 \mathbf{X}_1^{+}$ is a projection matrix. Following the constrained estimation of $\hat{\boldsymbol{\beta}}_2$, the vector $\hat{\boldsymbol{\beta}}_1$ is obtained from the expression above.
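The separable reduction can be sketched numerically as follows. For concreteness this sketch assumes the constraint on $\boldsymbol{\beta}_2$ is non-negativity (chosen only so that an off-the-shelf constrained solver, SciPy's `nnls`, can play the role of the "purely constrained problem"); the matrices and data are synthetic.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic mixed problem: columns of X1 are unconstrained,
# columns of X2 carry the constraint (here: beta_2 >= 0).
rng = np.random.default_rng(1)
X1 = rng.standard_normal((20, 2))
X2 = rng.standard_normal((20, 3))
y = rng.standard_normal(20)

X1_pinv = np.linalg.pinv(X1)       # Moore-Penrose pseudoinverse X1^+
P = np.eye(20) - X1 @ X1_pinv      # projection P := I - X1 X1^+

# Purely constrained problem in beta_2:  min || P (y - X2 b2) ||  s.t.  b2 >= 0
beta2, _ = nnls(P @ X2, P @ y)

# Recover beta_1 from the substitution formula  beta_1 = X1^+ (y - X2 beta_2)
beta1 = X1_pinv @ (y - X2 @ beta2)
print(beta1, beta2)
```

Any other constrained solver (equality-constrained, box-constrained, etc.) can be dropped in for `nnls`; the projection step is what removes $\boldsymbol{\beta}_1$ from the problem.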