Free Statistics

of Irreproducible Research!

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 13 Dec 2016 10:48:20 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/13/t1481622763dxfgj6xcb4qjt6e.htm/, Retrieved Sun, 05 May 2024 00:49:39 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=299024, Retrieved Sun, 05 May 2024 00:49:39 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 104
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [] [2016-12-13 09:48:20] [94ac3c9a028ddd47e8862e80eac9f626] [Current]
Dataseries X:
22	15	13
24	13	16
21	14	17
21	13	NA
24	12	NA
20	17	16
22	12	NA
20	13	NA
19	13	NA
23	16	17
21	12	17
19	12	15
19	13	16
21	16	14
21	15	16
22	12	17
22	NA	NA
19	NA	NA
21	15	NA
21	12	NA
21	15	16
20	11	NA
22	13	16
22	13	NA
24	14	NA
21	14	NA
19	14	16
19	15	15
23	16	16
21	16	16
21	16	13
19	13	15
21	13	17
19	14	NA
21	13	13
21	14	17
23	12	NA
19	17	14
19	14	14
19	15	18
18	13	NA
22	14	17
18	15	13
22	19	16
18	14	15
22	13	15
22	12	NA
19	NA	15
22	14	13
25	15	NA
19	15	17
19	12	NA
19	14	NA
19	11	11
21	12	14
21	10	13
20	NA	NA
19	14	17
19	14	16
22	15	NA
26	15	17
19	13	16
21	15	16
21	16	16
20	12	15
23	17	12
22	15	17
22	NA	14
22	12	14
21	16	16
21	15	NA
22	15	NA
23	12	NA
18	13	NA
24	10	NA
22	14	15
21	11	16
21	12	14
21	14	15
23	12	17
21	14	NA
23	12	10
21	13	NA
19	13	17
21	14	NA
21	12	20
21	15	17
23	13	18
23	13	NA
20	11	17
20	12	14
19	16	NA
23	11	17
22	13	NA
19	12	17
23	17	NA
22	14	16
22	15	18
21	8	18
21	13	16
21	13	NA
21	15	NA
22	14	15
25	13	13
21	14	NA
23	12	NA
19	19	NA
22	15	NA
20	14	NA
21	14	16
25	15	NA
21	13	NA
19	15	NA
23	14	12
22	11	NA
21	17	16
24	13	16
21	9	NA
19	12	16
18	13	14
19	17	15
20	14	14
19	13	NA
22	16	15
21	14	NA
22	14	15
24	14	16
28	10	NA
19	12	NA
18	13	NA
23	14	11
19	18	NA
23	14	18
19	14	NA
22	13	11
21	13	NA
19	16	18
22	NA	NA
21	13	15
23	14	19
22	8	17
19	13	NA
19	13	14
21	16	NA
22	14	13
21	13	17
20	14	14
23	12	19
22	16	14
23	18	NA
22	16	NA
21	15	16
20	18	16
18	15	15
18	14	12
20	14	NA
19	15	17
21	9	NA
24	17	NA
19	11	18
20	15	15
19	NA	18
23	15	15
22	13	NA
21	NA	NA
24	15	NA
21	15	16
21	14	NA
22	16	16
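For readers who want to reproduce the computation, the three tab-separated columns above can be loaded into plain R as sketched below. This is not the module's own input routine: the column names Age, Epsum and tvsum are assumed from the output tables further down, and dataseries_x is a hypothetical character string holding the block above. Rows containing NA are dropped, mirroring the na.omit() step in the module code, which leaves the 100 complete cases used in the regression.

# Minimal sketch under assumed names: 'dataseries_x' is a hypothetical string
# containing the "Dataseries X" block above; column names are taken from the output tables.
dat <- read.table(text = dataseries_x, sep = "\t",
                  col.names = c("Age", "Epsum", "tvsum"))
dat <- na.omit(dat)   # drop incomplete rows; 100 complete observations remain
nrow(dat)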




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 6 seconds
R Server: Big Analytics Cloud Computing Center
Source: https://freestatistics.org/blog/index.php?pk=299024&T=0



Multiple Linear Regression - Estimated Regression Equation
Age[t] = +20.6376 - 0.00459084 Epsum[t] + 0.0268931 tvsum[t] + e[t]
Source: https://freestatistics.org/blog/index.php?pk=299024&T=1
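The equation above is an ordinary least-squares fit of Age on Epsum and tvsum. A minimal sketch of the same fit in plain R, assuming the data frame dat from the sketch above (the module itself calls lm() on a reordered data frame):

mylm <- lm(Age ~ Epsum + tvsum, data = dat)   # OLS: Age[t] = b0 + b1*Epsum[t] + b2*tvsum[t] + e[t]
coef(mylm)                                    # approximately 20.6376, -0.00459084, 0.0268931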



Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter    S.D.      T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +20.64       1.933     +1.0680e+01                  4.69e-18         2.345e-18
Epsum         -0.004591    0.08999   -5.1020e-02                  0.9594           0.4797
tvsum         +0.02689     0.09055   +2.9700e-01                  0.7671           0.3836
Source: https://freestatistics.org/blog/index.php?pk=299024&T=2
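This coefficient table is the summary() output of the fit; the 1-tail p-value column is simply the 2-tail p-value halved, as in the module code below. Continuing the sketch above:

mysum   <- summary(mylm)
ctab    <- mysum$coefficients   # Estimate, Std. Error, t value, Pr(>|t|)
onetail <- ctab[, 4] / 2        # the module reports Pr(>|t|)/2 as the 1-tail p-value
cbind(ctab, onetail)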



Multiple Linear Regression - Regression Statistics
Multiple R                     0.0309
R-squared                      0.0009548
Adjusted R-squared            -0.01964
F-TEST (value)                 0.04635
F-TEST (DF numerator)          2
F-TEST (DF denominator)        97
p-value                        0.9547
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation    1.689
Sum Squared Residuals          276.7
Source: https://freestatistics.org/blog/index.php?pk=299024&T=3
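These statistics also come from the summary() object; in particular, the overall p-value is obtained from the F statistic and its degrees of freedom with pf(), exactly as the module code below does. Continuing the sketch:

mysum$r.squared                  # R-squared, 0.0009548
mysum$adj.r.squared              # adjusted R-squared, -0.01964
f <- mysum$fstatistic            # F value and degrees of freedom: 0.04635 on 2 and 97 df
1 - pf(f[1], f[2], f[3])         # overall p-value, about 0.9547
mysum$sigma                      # residual standard deviation, about 1.689
sum(residuals(mylm)^2)           # sum of squared residuals, about 276.7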



Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 22 20.92 1.082
2 24 21.01 2.992
3 21 21.03-0.03055
4 20 20.99-0.9899
5 23 21.02 1.979
6 21 21.04-0.03973
7 19 20.99-1.986
8 19 21.01-2.008
9 21 20.94 0.05931
10 21 21 0.0009343
11 22 21.04 0.9603
12 21 21 0.0009343
13 22 21.01 0.9918
14 19 21-2.004
15 19 20.97-1.972
16 23 20.99 2.006
17 21 20.99 0.005525
18 21 20.91 0.0862
19 19 20.98-1.981
20 21 21.04-0.03514
21 21 20.93 0.07243
22 21 21.03-0.03055
23 19 20.94-1.936
24 19 20.95-1.95
25 19 21.05-2.053
26 22 21.03 0.9695
27 18 20.92-2.918
28 22 20.98 1.019
29 18 20.98-2.977
30 22 20.98 1.019
31 22 20.92 1.077
32 19 21.03-2.026
33 19 20.88-1.883
34 21 20.96 0.04095
35 21 20.94 0.05866
36 19 21.03-2.031
37 19 21-2.004
38 26 21.03 4.974
39 19 21.01-2.008
40 21 21 0.0009343
41 21 20.99 0.005525
42 20 20.99-0.9859
43 23 20.88 2.118
44 22 21.03 0.974
45 22 20.96 1.041
46 21 20.99 0.005525
47 22 20.98 1.023
48 21 21.02-0.01743
49 21 20.96 0.04095
50 21 20.98 0.02324
51 23 21.04 1.96
52 23 20.85 2.149
53 19 21.04-2.035
54 21 21.12-0.1204
55 21 21.03-0.02596
56 23 21.06 1.938
57 20 21.04-1.044
58 20 20.96-0.9591
59 23 21.04 1.956
60 19 21.04-2.04
61 22 21 0.9963
62 22 21.05 0.9471
63 21 21.09-0.08499
64 21 21.01-0.008247
65 22 20.98 1.023
66 25 20.93 4.072
67 21 21-0.003657
68 23 20.9 2.104
69 21 20.99 0.01012
70 24 21.01 2.992
71 19 21.01-2.013
72 18 20.95-2.954
73 19 20.96-1.963
74 20 20.95-0.9499
75 22 20.97 1.032
76 22 20.98 1.023
77 24 21 2.996
78 23 20.87 2.131
79 23 21.06 1.943
80 22 20.87 1.126
81 19 21.05-2.048
82 21 20.98 0.01865
83 23 21.08 1.916
84 22 21.06 0.9419
85 19 20.95-1.954
86 22 20.92 1.077
87 21 21.04-0.03514
88 20 20.95-0.9499
89 23 21.09 1.906
90 22 20.94 1.059
91 21 21 0.0009343
92 20 20.99-0.9853
93 18 20.97-2.972
94 18 20.9-2.896
95 19 21.03-2.026
96 19 21.07-2.071
97 20 20.97-0.9722
98 23 20.97 2.028
99 21 21 0.0009343
100 22 20.99 1.006

Source: https://freestatistics.org/blog/index.php?pk=299024&T=4
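The table above pairs each actual value with its in-sample fitted value (interpolation) and residual. It can be reproduced from the fitted model, under the same assumptions as the sketches above:

interp <- data.frame(Actuals       = dat$Age,
                     Interpolation = fitted(mylm),      # in-sample forecast
                     Residuals     = residuals(mylm))   # prediction error
head(round(interp, 4))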



Goldfeld-Quandt test for Heteroskedasticity
p-values for each Alternative Hypothesis
breakpoint index   greater   2-sided   less
6 0.6666 0.6668 0.3334
7 0.7929 0.4142 0.2071
8 0.8092 0.3816 0.1908
9 0.7155 0.5689 0.2845
10 0.6112 0.7776 0.3888
11 0.537 0.9259 0.463
12 0.4343 0.8686 0.5657
13 0.3638 0.7276 0.6362
14 0.4238 0.8475 0.5762
15 0.4478 0.8957 0.5522
16 0.4579 0.9157 0.5421
17 0.377 0.7539 0.623
18 0.3038 0.6077 0.6962
19 0.3035 0.6069 0.6965
20 0.2372 0.4744 0.7628
21 0.195 0.3901 0.805
22 0.1465 0.2931 0.8535
23 0.1634 0.3269 0.8366
24 0.1574 0.3147 0.8426
25 0.2063 0.4125 0.7937
26 0.1761 0.3523 0.8239
27 0.2278 0.4555 0.7722
28 0.1916 0.3831 0.8084
29 0.2692 0.5383 0.7308
30 0.2616 0.5233 0.7384
31 0.2713 0.5426 0.7287
32 0.2996 0.5991 0.7004
33 0.2766 0.5532 0.7234
34 0.2344 0.4688 0.7656
35 0.1995 0.399 0.8005
36 0.2182 0.4363 0.7818
37 0.2296 0.4592 0.7704
38 0.6785 0.643 0.3215
39 0.6941 0.6118 0.3059
40 0.6399 0.7201 0.3601
41 0.5827 0.8346 0.4173
42 0.5415 0.9171 0.4585
43 0.587 0.826 0.413
44 0.5482 0.9035 0.4518
45 0.5261 0.9477 0.4739
46 0.4672 0.9345 0.5328
47 0.4337 0.8674 0.5663
48 0.3805 0.7611 0.6195
49 0.3304 0.6609 0.6696
50 0.2798 0.5595 0.7202
51 0.2982 0.5963 0.7018
52 0.3346 0.6692 0.6654
53 0.3545 0.7089 0.6455
54 0.3024 0.6047 0.6976
55 0.2531 0.5062 0.7469
56 0.2672 0.5344 0.7328
57 0.2386 0.4772 0.7614
58 0.2113 0.4226 0.7887
59 0.2217 0.4433 0.7783
60 0.2436 0.4872 0.7564
61 0.2123 0.4246 0.7877
62 0.1837 0.3673 0.8163
63 0.1503 0.3005 0.8497
64 0.118 0.2361 0.882
65 0.09853 0.1971 0.9015
66 0.258 0.5159 0.742
67 0.2104 0.4207 0.7896
68 0.2347 0.4693 0.7653
69 0.1896 0.3792 0.8104
70 0.2818 0.5635 0.7182
71 0.3032 0.6065 0.6968
72 0.423 0.8461 0.577
73 0.428 0.856 0.572
74 0.3843 0.7686 0.6157
75 0.3452 0.6905 0.6548
76 0.3021 0.6042 0.6979
77 0.4298 0.8597 0.5702
78 0.5014 0.9971 0.4986
79 0.5227 0.9547 0.4773
80 0.5334 0.9333 0.4666
81 0.5908 0.8184 0.4092
82 0.5174 0.9653 0.4826
83 0.4973 0.9946 0.5027
84 0.4723 0.9447 0.5277
85 0.4276 0.8552 0.5724
86 0.4808 0.9617 0.5192
87 0.3914 0.7829 0.6086
88 0.3075 0.6149 0.6925
89 0.3952 0.7903 0.6048
90 0.3891 0.7782 0.6109
91 0.3015 0.603 0.6985
92 0.2812 0.5623 0.7188
93 0.3714 0.7428 0.6286
94 0.5061 0.9878 0.4939

Source: https://freestatistics.org/blog/index.php?pk=299024&T=5
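The breakpoint-by-breakpoint p-values above are produced by running gqtest() from the lmtest package at every admissible breakpoint, from k+3 to n-k-3 (here 6 to 94), once for each alternative hypothesis; the meta-analysis table that follows counts how many 2-sided p-values fall below each significance level. A sketch under the same assumptions as above:

library(lmtest)
n <- nrow(dat); k <- length(coef(mylm))     # n = 100 observations, k = 3 parameters
bps <- (k + 3):(n - k - 3)                  # breakpoints 6 through 94
gq <- t(sapply(bps, function(bp)
  sapply(c("greater", "two.sided", "less"),
         function(alt) gqtest(mylm, point = bp, alternative = alt)$p.value)))
rownames(gq) <- bps
sum(gq[, "two.sided"] < 0.05)               # significant 2-sided tests at the 5% level (0 here)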



Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   % significant tests   OK/NOK
1% type I error level    0                     0                     OK
5% type I error level    0                     0                     OK
10% type I error level   0                     0                     OK

Source: https://freestatistics.org/blog/index.php?pk=299024&T=6



Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.8269, df1 = 2, df2 = 95, p-value = 0.06419
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.1555, df1 = 4, df2 = 93, p-value = 0.3356
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.83459, df1 = 2, df2 = 95, p-value = 0.4372
Source: https://freestatistics.org/blog/index.php?pk=299024&T=7
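These are the three resettest() calls from the lmtest package that appear in the module code, applied to the fitted model mylm:

library(lmtest)
resettest(mylm, power = 2:3, type = "fitted")     # powers of the fitted values
resettest(mylm, power = 2:3, type = "regressor")  # powers of the regressors
resettest(mylm, power = 2:3, type = "princomp")   # powers of the first principal component of the regressors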




Variance Inflation Factors (Multicollinearity)
> vif
   Epsum    tvsum 
1.002903 1.002903 
Source: https://freestatistics.org/blog/index.php?pk=299024&T=8
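The variance inflation factors are computed with vif() from the car package, as in the module code. With only two regressors, both VIFs equal 1/(1 - r^2), where r is the correlation between Epsum and tvsum, so values this close to 1 indicate essentially no multicollinearity:

library(car)
vif(mylm)                                 # about 1.002903 for both regressors
1 / (1 - cor(dat$Epsum, dat$tvsum)^2)     # equivalent expression for a two-regressor model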




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
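# Note on the listing below: this is the module's server-side script. It expects
# objects supplied by the FreeStatistics R engine (the data matrix 'y', the
# table-building helpers loaded from the file 'createtable' such as table.start,
# table.row.start and table.element, and the RC.texteval function), so it does
# not run as standalone R.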
par5 <- '0'
par4 <- '0'
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y)) # transpose the uploaded data matrix and keep complete cases only
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) { # Goldfeld-Quandt analysis requires more than n25 (= 25) observations
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
# build the estimated regression equation string from the fitted coefficients
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')