examples/notebooks/multivariate_ls.ipynb

Multivariate Linear Model - MultivariateLS

This notebook illustrates some features of the multivariate linear model estimated by least squares. The example is based on the UCLA stats example at https://stats.oarc.ucla.edu/stata/dae/multivariate-regression-analysis/ .

The model assumes that a multivariate dependent variable depends linearly on the same set of explanatory variables.

Y = X * B + u

where

  • the dependent variable (endog) Y has shape (nobs, k_endog),
  • the matrix of explanatory variables including constant (exog) X has shape (nobs, k_exog), and
  • the parameter matrix B has shape (k_exog, k_endog), i.e. coefficients for explanatory variables in rows and dependent variables in columns.
  • the disturbance term u has the same shape as Y, (nobs, k_endog), and is assumed to have mean zero and to be uncorrelated with the exog X.

Estimation is by least squares. Because all dependent variables share the same explanatory variables, the parameter estimates correspond to separate OLS estimates for each endog. The main advantage of the multivariate model is that we can make inferences that involve several dependent variables jointly, taking the correlation of the disturbances across equations into account.
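The equivalence of the joint least-squares fit and per-equation OLS can be verified on simulated toy data (a numpy-only sketch; the dimensions and noise scale are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
nobs, k_exog, k_endog = 100, 3, 2

# X includes a constant column; B stacks coefficients with shape (k_exog, k_endog)
X = np.column_stack([np.ones(nobs), rng.normal(size=(nobs, k_exog - 1))])
B = rng.normal(size=(k_exog, k_endog))
Y = X @ B + 0.1 * rng.normal(size=(nobs, k_endog))

# joint multivariate least squares: all endog columns at once
B_hat = np.linalg.lstsq(X, Y, rcond=None)[0]

# separate OLS, one endog column at a time
B_cols = np.column_stack(
    [np.linalg.lstsq(X, Y[:, j], rcond=None)[0] for j in range(k_endog)]
)
assert np.allclose(B_hat, B_cols)
```

The point estimates are identical; the multivariate model adds the joint covariance structure needed for cross-equation inference.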

python
import os
import numpy as np

import pandas as pd

from statsmodels.base.model import LikelihoodModel
from statsmodels.regression.linear_model import OLS
from statsmodels.multivariate.manova import MANOVA
from statsmodels.multivariate.multivariate_ols import MultivariateLS

import statsmodels.multivariate.tests.results as path

dir_path = os.path.dirname(os.path.abspath(path.__file__))
csv_path = os.path.join(dir_path, "mvreg.csv")
data_mvreg = pd.read_csv(csv_path)
python
data_mvreg.head()
python
formula = "locus_of_control + self_concept + motivation ~ read + write + science + prog"
mod = MultivariateLS.from_formula(formula, data=data_mvreg)
res = mod.fit()

Multivariate hypothesis tests mv_test

The mv_test method by default performs the hypothesis test that each term in the formula has no effect on any of the dependent variables (endog). This is the same as the MANOVA test.
Note, MANOVA in statsmodels is implemented as tests on the coefficients of the multivariate model and is not restricted to categorical variables. In the current example, we have three continuous and one categorical explanatory variables, in addition to the constant.

Consequently, using mv_test in MultivariateLS and in MANOVA produces the same results. However, MANOVA only provides the hypothesis tests, while MultivariateLS provides the usual regression model results.

More general versions of mv_test allow hypotheses of the form L B M = C. Here L is a matrix of restrictions on the explanatory variables, M is a matrix of restrictions on the dependent (endog) variables, and C is a matrix of constants for affine restrictions. See the docstrings for details.
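As a numerical sketch of the L B M form: L selects (combinations of) exog rows of B, and M selects (combinations of) endog columns. The selection matrices below are hand-built with numpy for illustration only, not part of the statsmodels API:

```python
import numpy as np

rng = np.random.default_rng(1)
nobs, k_exog, k_endog = 50, 4, 3
X = np.column_stack([np.ones(nobs), rng.normal(size=(nobs, k_exog - 1))])
Y = X @ rng.normal(size=(k_exog, k_endog)) + rng.normal(size=(nobs, k_endog))
B_hat = np.linalg.lstsq(X, Y, rcond=None)[0]

# hypothesis "coefficient of exog 2 on endog 0 is zero", i.e. L B M = 0
L = np.zeros((1, k_exog))
L[0, 2] = 1.0
M = np.zeros((k_endog, 1))
M[0, 0] = 1.0

# L B_hat M picks out exactly that single coefficient
assert np.allclose(L @ B_hat @ M, B_hat[2, 0])
```

More rows in L or columns in M combine several such restrictions into one joint hypothesis.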

python
mvt = res.mv_test()
mvt.summary_frame
python
manova = MANOVA.from_formula(formula, data=data_mvreg)
manova.mv_test().summary_frame

The core multivariate regression results are displayed by the summary method.

python
print(res.summary())

The standard results attributes for the parameter estimates, like params, bse, tvalues and pvalues, are two-dimensional arrays or dataframes with explanatory variables (exog) in rows and dependent (endog) variables in columns.

python
res.params
python
res.bse
python
res.pvalues

General MV and Wald tests

The multivariate linear model allows multivariate tests in the L B M form as well as standard Wald tests on linear combinations of parameters.

The multivariate tests are based on eigenvalues or the trace of the relevant matrices. Wald tests are standard tests based on the flattened (column-stacked) parameter array and its covariance; hypotheses are of the form R b = c, where b is the column-stacked parameter array. The two kinds of tests are asymptotically equivalent under the model assumptions but differ in small samples.
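How the column stacking works can be sketched in numpy: vec(B) stacks the endog columns of B into one vector, and a restriction matrix R then picks coefficients out of the flattened vector. This is a toy illustration of the layout, not the statsmodels internals:

```python
import numpy as np

k_exog, k_endog = 4, 3
B = np.arange(k_exog * k_endog).reshape(k_exog, k_endog)

# column-stacked parameter vector b = vec(B): endog equation by endog equation
b = B.ravel(order="F")

# R b = 0 for "exog 2 has no effect on any endog": one row per endog equation
R = np.zeros((k_endog, k_exog * k_endog))
for j in range(k_endog):
    R[j, j * k_exog + 2] = 1.0

assert np.allclose(R @ b, B[2, :])
```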

The linear restrictions can be defined either as hypothesis matrices (numpy arrays) or as strings or lists of strings.

python
yn = res.model.endog_names
xn = res.model.exog_names
yn, xn
python
# test for an individual coefficient

mvt = res.mv_test(hypotheses=[("coef", ["science"], ["locus_of_control"])])
mvt.summary_frame
python
tt = res.t_test("ylocus_of_control_science")
tt, tt.pvalue

We can use either mv_test or wald_test for the joint hypothesis that an explanatory variable has no effect on any of the dependent variables, that is, that all coefficients of that explanatory variable are zero.

In this example, the p-values agree to 3 decimals.

python
wt = res.wald_test(
    ["ylocus_of_control_science", "yself_concept_science", "ymotivation_science"],
    scalar=True,
)
wt
python
mvt = res.mv_test(hypotheses=[("science", ["science"], yn)])
mvt.summary_frame
python
# t_test provides vectorized results for each of the simple hypotheses

tt = res.t_test(
    ["ylocus_of_control_science", "yself_concept_science", "ymotivation_science"]
)
tt, tt.pvalue

Warning: the naming pattern for the flattened parameter names used in t_test and wald_test might still change.

The current pattern is "y{endog_name}_{exog_name}".

Examples:

python
[f"y{endog_name}_{exog_name}" for endog_name in yn for exog_name in ["science"]]
python
c = [
    f"y{endog_name}_{exog_name}"
    for endog_name in yn
    for exog_name in ["prog[T.general]", "prog[T.vocational]"]
]
c

The previous restriction corresponds to the MANOVA type test that the categorical variable "prog" has no effect.

python
mant = manova.mv_test().summary_frame
mant.loc["prog"]  # ["Pr > F"].to_numpy()
python
res.wald_test(c, scalar=True)

Note: The degrees of freedom differ across hypothesis test methods. The model can be considered as a multivariate model with nobs=600 in this case, or as a stacked model with nobs_total = nobs * k_endog = 1800.

For within-endog restrictions, inference is based on the same covariance of the parameter estimates in MultivariateLS and OLS. The degrees of freedom in a single-output OLS are df_resid = 600 - 6 = 594. Using the same degrees of freedom in MultivariateLS preserves the equivalence with the analysis of each endog. Using the total df_resid for hypothesis tests would make them more liberal.

Asymptotic inference based on normal and chisquare distribution (use_t=False) is not affected by how df_resid are defined.

It is not yet decided whether there will be additional options to choose different degrees of freedom in the Wald tests.
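The two degrees-of-freedom conventions in this example can be spelled out as simple arithmetic (the stacked alternative is shown only for comparison):

```python
nobs, k_exog, k_endog = 600, 6, 3  # dimensions of the example model

df_resid_per_eq = nobs - k_exog                    # 594, matches single-endog OLS
nobs_total = nobs * k_endog                        # 1800, stacked-model view
df_resid_stacked = nobs_total - k_exog * k_endog   # larger df, hence more liberal tests

assert df_resid_per_eq == 594
assert nobs_total == 1800
```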

python
res.df_resid

Both mv_test and wald_test can be used to test hypotheses on contrasts between coefficients.

python
c = [
    f"y{endog_name}_prog[T.general] - y{endog_name}_prog[T.vocational]"
    for endog_name in yn
]
c
python
res.wald_test(c, scalar=True)
python
mvt = res.mv_test(
    hypotheses=[("diff_prog", ["prog[T.general] - prog[T.vocational]"], yn)]
)
mvt.summary_frame

Example: hypothesis that coefficients are the same across endog equations.

We can test that the differences between the parameters of the latter two equations and those of the first equation are zero.

python
mvt = res.mv_test(
    hypotheses=[
        (
            "diff_prog",
            xn,
            ["self_concept - locus_of_control", "motivation - locus_of_control"],
        )
    ]
)
mvt.summary_frame

In a similar hypothesis test, we can test that the equations have the same slope coefficients while allowing for different constants.

python
xn[1:]
python
mvt = res.mv_test(
    hypotheses=[
        (
            "diff_prog",
            xn[1:],
            ["self_concept - locus_of_control", "motivation - locus_of_control"],
        )
    ]
)
mvt.summary_frame

Prediction

The regression model and its results instance have methods for prediction and residuals.

Note, because the parameter estimates are the same as in the OLS estimates for the individual endog, the predictions are also the same between the MultivariateLS model and the set of individual OLS models.

python
res.resid
python
res.predict()
python
res.predict(data_mvreg)
python
res.fittedvalues

The predict methods can take user-provided data for the explanatory variables, but currently cannot automatically create sets of explanatory variables for interesting effects.

In the following, we construct a dataframe that we can use to predict the conditional expectation of the dependent variables for each level of the categorical variable "prog", at the sample means of the continuous variables.

python
data_exog = data_mvreg[["read", "write", "science", "prog"]]

ex = pd.DataFrame(data_exog["prog"].unique(), columns=["prog"])
mean_ex = data_mvreg[["read", "write", "science"]].mean()

ex.loc[:, ["read", "write", "science"]] = mean_ex.values
ex
python
pred = res.predict(ex)

pred.index = ex["prog"]
pred.columns = res.fittedvalues.columns
print("predicted mean by 'prog':")
pred

Outlier-Influence

This is currently a draft version.

resid_distance is a one-dimensional residual measure based on the Mahalanobis distance of each sample observation. The hat matrix in the MultivariateLS model is the same as in OLS; the diagonal of the hat matrix is temporarily attached as results._hat_matrix_diag.
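The idea behind resid_distance can be sketched with numpy: a squared Mahalanobis distance of each residual row with respect to the estimated residual covariance. This runs on random stand-in data; the exact centering and scaling used by statsmodels may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(2)
resid = rng.normal(size=(100, 3))  # stand-in for the (nobs, k_endog) residuals

cov_inv = np.linalg.inv(np.cov(resid, rowvar=False))
centered = resid - resid.mean(axis=0)

# squared Mahalanobis distance, one value per observation
d2 = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
assert d2.shape == (100,)
```

Large values of this one-dimensional measure flag observations that are outlying jointly across the endog components.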

Note, individual components of the multivariate dependent variable can be analyzed with OLS and are available in the corresponding post-estimation methods like OLSInfluence.

python
res.resid_distance[:5]
python
res.cov_resid
python
import matplotlib.pyplot as plt
python
plt.plot(res.resid_distance)
python
plt.plot(res._hat_matrix_diag)
python
plt.plot(res._hat_matrix_diag, res.resid_distance, ".")