
nLinearCoeffs

Number of nonzero linear coefficients in discriminant analysis classifier

Description


ncoeffs = nLinearCoeffs(Mdl) returns the number of nonzero linear coefficients in the linear discriminant analysis classifier Mdl.

ncoeffs = nLinearCoeffs(Mdl,delta) returns the number of nonzero linear coefficients for the threshold parameter delta.

Examples


Find the number of nonzero coefficients in a discriminant analysis classifier for various Delta values.

Create a discriminant analysis classifier from the fisheriris data.

load fisheriris
obj = fitcdiscr(meas,species);

Find the number of nonzero coefficients in obj.

ncoeffs = nLinearCoeffs(obj)
ncoeffs = 4

Find the number of nonzero coefficients for delta = 1, 2, 4, and 8.

delta = [1 2 4 8];
ncoeffs = nLinearCoeffs(obj,delta)
ncoeffs = 4×1

     4
     4
     3
     0

The DeltaPredictor property gives the values of delta where the number of nonzero coefficients changes.

ncoeffs2 = nLinearCoeffs(obj,obj.DeltaPredictor)
ncoeffs2 = 4×1

     4
     3
     1
     2

Input Arguments


Mdl - Trained discriminant analysis classifier, specified as a ClassificationDiscriminant model object trained with fitcdiscr, or a CompactClassificationDiscriminant model object created with compact.

delta - Linear coefficient threshold, specified as a numeric scalar or numeric vector. For details, see Gamma and Delta.

Example: delta = [1 2 3]

Data Types: single | double

Output Arguments


ncoeffs - Number of nonzero linear coefficients in the discriminant analysis classifier Mdl, returned as a nonnegative integer.

If you call nLinearCoeffs with the delta argument, ncoeffs is the number of nonzero linear coefficients for the threshold parameter delta. If delta is a vector, ncoeffs is a vector with the same number of elements.

If Mdl is a quadratic discriminant analysis classifier, ncoeffs is the number of predictors in Mdl.
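
For example, the following short sketch (using the fisheriris data; the variable name qobj is illustrative) trains a quadratic discriminant analysis classifier and calls nLinearCoeffs on it:

load fisheriris
qobj = fitcdiscr(meas,species,'DiscrimType','quadratic');  % quadratic discriminant analysis classifier
ncoeffs = nLinearCoeffs(qobj)  % equals the number of predictors (4 for fisheriris)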

More About


Gamma and Delta

Regularization is the process of finding a small set of predictors that yield an effective predictive model. For linear discriminant analysis, there are two parameters, γ and δ, that control regularization as follows. cvshrink helps you select appropriate values of the parameters.
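
For example, the following minimal sketch cross-validates a fisheriris classifier over a small grid of gamma and delta values (the grid sizes passed to NumGamma and NumDelta are arbitrary choices):

load fisheriris
obj = fitcdiscr(meas,species);
% err contains the cross-validated misclassification rates and numpred the
% number of predictors retained at each grid point, which together show the
% tradeoff between error and sparsity.
[err,gamma,delta,numpred] = cvshrink(obj,'NumGamma',5,'NumDelta',5);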

Let Σ represent the covariance matrix of the data X, and let X̂ be the centered data (the data X minus the mean by class). Define

D = \operatorname{diag}\left(\hat{X}^{T}\hat{X}\right).

The regularized covariance matrix Σ̃ is

\tilde{\Sigma} = (1-\gamma)\,\Sigma + \gamma D.

Whenever γ ≥ MinGamma, Σ̃ is nonsingular.
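
For example, in this brief sketch the value 0.1 is an arbitrary illustrative choice:

load fisheriris
obj = fitcdiscr(meas,species);
obj.MinGamma                        % smallest gamma that keeps the regularized covariance nonsingular
obj.Gamma = max(0.1,obj.MinGamma);  % set gamma by dot notation, staying at or above MinGamma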

Let μk be the mean vector for those elements of X in class k, and let μ0 be the global mean vector (the mean of the rows of X). Let C be the correlation matrix of the data X, and let C̃ be the regularized correlation matrix:

\tilde{C} = (1-\gamma)\,C + \gamma I,

where I is the identity matrix.
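
The following rough numerical sketch mirrors these definitions (the variable names Xhat, SigmaTilde, and Ctilde and the value gamma = 0.2 are illustrative; the scaling used internally by the toolbox may differ):

load fisheriris
X = meas;
[grp,~,g] = unique(species);                   % class labels and a class index for each row
Xhat = X;
for k = 1:numel(grp)
    rows = (g == k);
    Xhat(rows,:) = X(rows,:) - mean(X(rows,:));  % center each class at its own mean
end
D = diag(diag(Xhat'*Xhat));                    % D = diag(Xhat'*Xhat)
Sigma = cov(Xhat);                             % covariance estimate from the centered data
gamma = 0.2;
SigmaTilde = (1-gamma)*Sigma + gamma*D;        % regularized covariance matrix
C = corrcov(Sigma);                            % correlation matrix
Ctilde = (1-gamma)*C + gamma*eye(size(C,1));   % regularized correlation matrix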

The linear term in the regularized discriminant analysis classifier for a data point x is

(x-\mu_0)^{T}\,\tilde{\Sigma}^{-1}(\mu_k-\mu_0) = \left[(x-\mu_0)^{T} D^{-1/2}\right]\left[\tilde{C}^{-1} D^{-1/2}(\mu_k-\mu_0)\right].

The parameter δ enters into this equation as a threshold on the final term in square brackets. Each component of that bracketed vector is set to zero if it is smaller in magnitude than the threshold δ. Therefore, for class k, if component j is thresholded to zero, component j of x does not enter into the evaluation of the posterior probability.
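
As an illustrative sketch, setting the Delta property of a fisheriris model by dot notation applies this threshold, and thresholded coefficients appear as exact zeros in the Coeffs structure (the threshold value 4 and the class pair (2,3) are arbitrary choices):

load fisheriris
obj = fitcdiscr(meas,species);
obj.Delta = 4;           % apply a coefficient threshold of delta = 4
obj.Coeffs(2,3).Linear   % linear coefficients for class 2 versus class 3; entries below the threshold are zeroed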

The DeltaPredictor property is a vector related to this threshold. When δ ≥ DeltaPredictor(i), all classes k have

\left|\left[\tilde{C}^{-1} D^{-1/2}(\mu_k-\mu_0)\right]_{i}\right| \le \delta.

Therefore, when δ ≥ DeltaPredictor(i), the regularized classifier does not use predictor i.
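
The following brief sketch (the threshold value 2 is arbitrary) checks this relationship; exact agreement at values of delta that coincide with DeltaPredictor entries may depend on how ties are handled:

load fisheriris
obj = fitcdiscr(meas,species);
d = 2;
sum(obj.DeltaPredictor > d)  % number of predictors the regularized classifier still uses
nLinearCoeffs(obj,d)         % expected to match the count above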

Version History

Introduced in R2011b