Free Statistics

of Irreproducible Research!

Author's title

Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 06 Dec 2016 14:08:12 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/06/t14810306045akn3lg2t9zp65c.htm/, Retrieved Sat, 04 May 2024 13:55:01 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=297825, Retrieved Sat, 04 May 2024 13:55:01 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords: Multiple regression (paper)
Estimated Impact: 75
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Blend Krasniqi] [2016-12-06 13:08:12] [e302b41a790d997d9c99fd21b0cdfda2] [Current]
Dataseries X:
5	4	4	4	13
5	4	4	4	16
4	3	3	2	17
4	3	3	3	NA
5	4	4	3	NA
5	3	4	3	16
5	4	2	3	NA
5	4	2	4	NA
5	2	2	4	NA
5	1	2	4	17
4	4	3	2	17
5	4	3	2	15
5	4	5	4	16
5	5	4	5	14
4	4	3	4	16
5	1	4	4	17
3	4	4	2	NA
5	2	3	2	NA
5	3	4	5	NA
5	3	4	4	16
2	2	3	1	NA
3	1	3	5	16
4	3	2	3	NA
4	2	2	4	NA
4	4	3	4	NA
5	4	3	2	16
4	4	3	4	15
5	2	4	2	16
4	3	4	3	16
5	4	3	4	13
4	4	4	4	15
4	4	3	4	17
4	3	4	4	NA
5	4	3	4	13
5	4	3	4	17
5	4	3	5	NA
5	4	3	4	14
2	3	2	4	14
4	3	5	3	18
4	4	3	4	NA
4	2	1	4	17
5	3	2	3	13
5	4	2	2	16
5	4	3	5	15
4	3	2	4	15
4	2	3	3	NA
5	3	5	4	15
5	3	4	4	13
5	4	5	4	NA
4	3	2	3	17
4	3	4	4	NA
5	3	3	4	NA
5	3	3	4	11
5	3	2	4	14
4	5	3	5	13
5	4	2	4	NA
5	4	4	2	17
4	3	4	4	16
4	4	3	5	NA
5	4	1	2	17
5	1	1	3	16
4	4	3	4	16
4	3	3	3	16
5	3	2	4	15
3	4	3	4	12
3	2	4	4	17
5	4	3	5	14
4	5	4	3	14
4	4	4	4	16
5	4	3	4	NA
5	4	4	4	NA
4	4	4	4	NA
5	4	3	4	NA
4	2	3	4	NA
4	4	5	4	15
4	2	2	4	16
5	5	4	4	14
4	5	3	3	15
4	2	3	3	17
4	4	3	2	NA
4	3	4	2	10
4	3	4	2	NA
2	3	3	3	17
4	4	5	4	NA
4	4	3	4	20
5	3	4	4	17
4	3	3	4	18
5	4	5	4	NA
4	4	4	4	17
4	2	4	4	14
3	3	4	2	NA
4	3	4	3	17
2	3	2	2	NA
4	4	3	3	17
5	4	4	4	NA
3	4	3	5	16
4	4	3	4	18
5	5	5	5	18
2	4	3	3	16
5	3	1	5	NA
5	4	3	4	NA
5	4	4	5	15
4	2	2	2	13
4	3	3	3	NA
5	3	4	4	NA
5	3	4	5	NA
4	4	4	4	NA
4	4	4	5	NA
5	4	5	5	16
5	4	4	5	NA
5	3	3	4	NA
4	3	3	4	NA
5	3	3	4	12
4	2	3	4	NA
5	3	4	4	16
4	2	2	4	16
5	4	5	5	NA
5	5	2	5	16
4	3	2	5	14
4	3	2	4	15
4	3	3	4	14
5	2	3	4	NA
5	3	4	5	15
4	3	4	4	NA
4	3	4	4	15
5	4	3	4	16
5	4	4	4	NA
4	3	4	2	NA
4	4	3	4	NA
4	1	3	2	11
4	5	5	4	NA
5	4	4	3	18
5	3	3	5	NA
4	5	3	2	11
4	4	3	4	NA
4	3	3	3	18
3	4	3	3	15
4	4	2	4	19
5	3	4	5	17
4	2	4	3	NA
4	4	4	2	14
5	3	5	5	NA
3	3	2	4	13
4	4	2	4	17
1	2	3	2	14
5	3	3	5	19
4	4	2	3	14
5	4	4	3	NA
3	3	2	3	NA
4	4	3	4	16
4	4	4	4	16
4	3	3	4	15
4	2	3	4	12
5	4	4	4	NA
5	2	2	4	17
5	3	5	5	NA
5	4	4	3	NA
4	3	3	3	18
5	2	5	4	15
5	4	2	4	18
4	1	4	5	15
3	5	4	3	NA
4	4	4	4	NA
4	3	3	2	NA
5	4	5	5	16
4	4	3	4	NA
4	3	3	3	16
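
The fifth column of the series (the one containing the NA entries) is the dependent variable TVDCSUM; the module drops every row with a missing value before estimating the model, which leaves 103 complete cases (this is why the F-test below has 98 denominator degrees of freedom and the residual table has 103 rows). A minimal sketch of that preprocessing step, assuming the block above were saved to a hypothetical tab-delimited file dataseries.txt and that the first four columns are KVDD1 to KVDD4:

# hypothetical file name; column order assumed from the estimated equation below
df <- read.table('dataseries.txt', sep = '\t', na.strings = 'NA',
  col.names = c('KVDD1', 'KVDD2', 'KVDD3', 'KVDD4', 'TVDCSUM'))
df <- na.omit(df)   # keep complete cases only
nrow(df)            # 103 observations remain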




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: Big Analytics Cloud Computing Center

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code) \tabularnewline
Raw Output & view raw output of R engine \tabularnewline
Computing time & 7 seconds \tabularnewline
R Server & Big Analytics Cloud Computing Center \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297825&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code)[/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine[/C][/ROW]
[ROW][C]Computing time[/C][C]7 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]Big Analytics Cloud Computing Center[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297825&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297825&T=0









Multiple Linear Regression - Estimated Regression Equation
TVDCSUM[t] = 14.7087 + 0.0734196 KVDD1[t] - 0.00667169 KVDD2[t] - 0.012973 KVDD3[t] + 0.139941 KVDD4[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TVDCSUM[t] =  +  14.7087 +  0.0734196KVDD1[t] -0.00667169KVDD2[t] -0.012973KVDD3[t] +  0.139941KVDD4[t]  + e[t] \tabularnewline
 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297825&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]TVDCSUM[t] =  +  14.7087 +  0.0734196KVDD1[t] -0.00667169KVDD2[t] -0.012973KVDD3[t] +  0.139941KVDD4[t]  + e[t][/C][/ROW]
[ROW][C][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297825&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297825&T=1
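
The equation above is an ordinary least squares fit of TVDCSUM on the four KVDD regressors. A minimal sketch of the same fit, assuming the complete-case data frame df from the note under the data series:

mylm <- lm(TVDCSUM ~ KVDD1 + KVDD2 + KVDD3 + KVDD4, data = df)
coef(mylm)   # roughly 14.71, 0.073, -0.0067, -0.013, 0.140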









Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	+14.71	1.341	+1.0970e+01	9.547e-19	4.774e-19
KVDD1	+0.07342	0.2481	+2.9600e-01	0.7679	0.3839
KVDD2	-0.006672	0.1955	-3.4120e-02	0.9728	0.4864
KVDD3	-0.01297	0.2071	-6.2630e-02	0.9502	0.4751
KVDD4	+0.1399	0.2157	+6.4880e-01	0.518	0.259

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & +14.71 &  1.341 & +1.0970e+01 &  9.547e-19 &  4.774e-19 \tabularnewline
KVDD1 & +0.07342 &  0.2481 & +2.9600e-01 &  0.7679 &  0.3839 \tabularnewline
KVDD2 & -0.006672 &  0.1955 & -3.4120e-02 &  0.9728 &  0.4864 \tabularnewline
KVDD3 & -0.01297 &  0.2071 & -6.2630e-02 &  0.9502 &  0.4751 \tabularnewline
KVDD4 & +0.1399 &  0.2157 & +6.4880e-01 &  0.518 &  0.259 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297825&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]+14.71[/C][C] 1.341[/C][C]+1.0970e+01[/C][C] 9.547e-19[/C][C] 4.774e-19[/C][/ROW]
[ROW][C]KVDD1[/C][C]+0.07342[/C][C] 0.2481[/C][C]+2.9600e-01[/C][C] 0.7679[/C][C] 0.3839[/C][/ROW]
[ROW][C]KVDD2[/C][C]-0.006672[/C][C] 0.1955[/C][C]-3.4120e-02[/C][C] 0.9728[/C][C] 0.4864[/C][/ROW]
[ROW][C]KVDD3[/C][C]-0.01297[/C][C] 0.2071[/C][C]-6.2630e-02[/C][C] 0.9502[/C][C] 0.4751[/C][/ROW]
[ROW][C]KVDD4[/C][C]+0.1399[/C][C] 0.2157[/C][C]+6.4880e-01[/C][C] 0.518[/C][C] 0.259[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297825&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297825&T=2
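
None of the slope estimates comes anywhere near conventional significance (the smallest 2-tail p-value is 0.518); only the intercept differs from zero. The 1-tail p-value column is simply half of the 2-tail value. A sketch that reproduces these columns from the fitted model object (same assumed mylm as above):

mysum <- summary(mylm)
tab <- mysum$coefficients            # estimate, standard error, t value, Pr(>|t|)
cbind(tab, '1-tail p' = tab[, 4]/2)  # append the one-sided p-value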









Multiple Linear Regression - Regression Statistics
Multiple R: 0.07737
R-squared: 0.005986
Adjusted R-squared: -0.03459
F-TEST (value): 0.1475
F-TEST (DF numerator): 4
F-TEST (DF denominator): 98
p-value: 0.9637
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.905
Sum Squared Residuals: 355.5

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R &  0.07737 \tabularnewline
R-squared &  0.005986 \tabularnewline
Adjusted R-squared & -0.03459 \tabularnewline
F-TEST (value) &  0.1475 \tabularnewline
F-TEST (DF numerator) & 4 \tabularnewline
F-TEST (DF denominator) & 98 \tabularnewline
p-value &  0.9637 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation &  1.905 \tabularnewline
Sum Squared Residuals &  355.5 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297825&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C] 0.07737[/C][/ROW]
[ROW][C]R-squared[/C][C] 0.005986[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]-0.03459[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C] 0.1475[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]4[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]98[/C][/ROW]
[ROW][C]p-value[/C][C] 0.9637[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C] 1.905[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C] 355.5[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297825&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297825&T=3
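
The model explains essentially none of the variation in TVDCSUM: R-squared is 0.006, the adjusted value is negative, and the overall F-test (p = 0.9637) cannot reject the hypothesis that all four slopes are zero. The reported statistics are mutually consistent, as this check with the printed values (n = 103 observations, p = 4 regressors) shows:

n <- 103; p <- 4; R2 <- 0.005986
1 - (1 - R2)*(n - 1)/(n - p - 1)     # adjusted R-squared: -0.03459
(R2/p) / ((1 - R2)/(n - p - 1))      # F statistic: 0.1475
1 - pf(0.1475, p, n - p - 1)         # p-value: 0.9637
sqrt(355.5/(n - p - 1))              # residual standard deviation: 1.905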









Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1 13 15.56-2.557
2 16 15.56 0.443
3 17 15.22 1.777
4 16 15.42 0.5762
5 17 15.6 1.397
6 17 15.22 1.783
7 15 15.29-0.2901
8 16 15.54 0.4559
9 14 15.69-1.69
10 16 15.5 0.5034
11 17 15.58 1.423
12 16 15.56 0.4363
13 16 15.58 0.4169
14 16 15.29 0.7099
15 15 15.5-0.4966
16 16 15.29 0.7095
17 16 15.35 0.6497
18 13 15.57-2.57
19 15 15.48-0.4836
20 17 15.5 1.503
21 13 15.57-2.57
22 17 15.57 1.43
23 14 15.57-1.57
24 14 15.37-1.369
25 18 15.34 2.663
26 17 15.54 1.464
27 13 15.45-2.45
28 16 15.3 0.6969
29 15 15.71-0.7099
30 15 15.52-0.5162
31 15 15.55-0.5507
32 13 15.56-2.564
33 17 15.38 1.624
34 11 15.58-4.577
35 14 15.59-1.59
36 13 15.63-2.63
37 17 15.28 1.723
38 16 15.49 0.5097
39 17 15.32 1.684
40 16 15.48 0.524
41 16 15.5 0.5034
42 16 15.36 0.6367
43 15 15.59-0.5896
44 12 15.42-3.423
45 17 15.42 1.576
46 14 15.71-1.71
47 14 15.34-1.337
48 16 15.48 0.5164
49 15 15.47-0.4706
50 16 15.52 0.4771
51 14 15.55-1.55
52 15 15.35-0.35
53 17 15.37 1.63
54 10 15.21-5.21
55 17 15.22 1.784
56 20 15.5 4.503
57 17 15.56 1.436
58 18 15.5 2.497
59 17 15.48 1.516
60 14 15.5-1.497
61 17 15.35 1.65
62 17 15.36 1.643
63 16 15.56 0.4369
64 18 15.5 2.503
65 18 15.68 2.323
66 16 15.21 0.7902
67 15 15.7-0.697
68 13 15.24-2.243
69 16 15.68 0.316
70 12 15.58-3.577
71 16 15.56 0.4363
72 16 15.52 0.4771
73 16 15.72 0.2838
74 14 15.66-1.656
75 15 15.52-0.5162
76 14 15.5-1.503
77 15 15.7-0.7036
78 15 15.49-0.4903
79 16 15.57 0.43
80 11 15.24-4.237
81 18 15.42 2.583
82 11 15.21-4.21
83 18 15.36 2.637
84 15 15.28-0.2832
85 19 15.51 3.49
86 17 15.7 1.296
87 14 15.2-1.204
88 13 15.44-2.443
89 17 15.51 1.49
90 14 15.01-1.01
91 19 15.72 3.283
92 14 15.37-1.37
93 16 15.5 0.5034
94 16 15.48 0.5164
95 15 15.5-0.5033
96 12 15.51-3.51
97 17 15.6 1.404
98 18 15.36 2.637
99 15 15.56-0.5574
100 18 15.58 2.417
101 15 15.64-0.6436
102 16 15.68 0.316
103 16 15.36 0.6367

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 &  13 &  15.56 & -2.557 \tabularnewline
2 &  16 &  15.56 &  0.443 \tabularnewline
3 &  17 &  15.22 &  1.777 \tabularnewline
4 &  16 &  15.42 &  0.5762 \tabularnewline
5 &  17 &  15.6 &  1.397 \tabularnewline
6 &  17 &  15.22 &  1.783 \tabularnewline
7 &  15 &  15.29 & -0.2901 \tabularnewline
8 &  16 &  15.54 &  0.4559 \tabularnewline
9 &  14 &  15.69 & -1.69 \tabularnewline
10 &  16 &  15.5 &  0.5034 \tabularnewline
11 &  17 &  15.58 &  1.423 \tabularnewline
12 &  16 &  15.56 &  0.4363 \tabularnewline
13 &  16 &  15.58 &  0.4169 \tabularnewline
14 &  16 &  15.29 &  0.7099 \tabularnewline
15 &  15 &  15.5 & -0.4966 \tabularnewline
16 &  16 &  15.29 &  0.7095 \tabularnewline
17 &  16 &  15.35 &  0.6497 \tabularnewline
18 &  13 &  15.57 & -2.57 \tabularnewline
19 &  15 &  15.48 & -0.4836 \tabularnewline
20 &  17 &  15.5 &  1.503 \tabularnewline
21 &  13 &  15.57 & -2.57 \tabularnewline
22 &  17 &  15.57 &  1.43 \tabularnewline
23 &  14 &  15.57 & -1.57 \tabularnewline
24 &  14 &  15.37 & -1.369 \tabularnewline
25 &  18 &  15.34 &  2.663 \tabularnewline
26 &  17 &  15.54 &  1.464 \tabularnewline
27 &  13 &  15.45 & -2.45 \tabularnewline
28 &  16 &  15.3 &  0.6969 \tabularnewline
29 &  15 &  15.71 & -0.7099 \tabularnewline
30 &  15 &  15.52 & -0.5162 \tabularnewline
31 &  15 &  15.55 & -0.5507 \tabularnewline
32 &  13 &  15.56 & -2.564 \tabularnewline
33 &  17 &  15.38 &  1.624 \tabularnewline
34 &  11 &  15.58 & -4.577 \tabularnewline
35 &  14 &  15.59 & -1.59 \tabularnewline
36 &  13 &  15.63 & -2.63 \tabularnewline
37 &  17 &  15.28 &  1.723 \tabularnewline
38 &  16 &  15.49 &  0.5097 \tabularnewline
39 &  17 &  15.32 &  1.684 \tabularnewline
40 &  16 &  15.48 &  0.524 \tabularnewline
41 &  16 &  15.5 &  0.5034 \tabularnewline
42 &  16 &  15.36 &  0.6367 \tabularnewline
43 &  15 &  15.59 & -0.5896 \tabularnewline
44 &  12 &  15.42 & -3.423 \tabularnewline
45 &  17 &  15.42 &  1.576 \tabularnewline
46 &  14 &  15.71 & -1.71 \tabularnewline
47 &  14 &  15.34 & -1.337 \tabularnewline
48 &  16 &  15.48 &  0.5164 \tabularnewline
49 &  15 &  15.47 & -0.4706 \tabularnewline
50 &  16 &  15.52 &  0.4771 \tabularnewline
51 &  14 &  15.55 & -1.55 \tabularnewline
52 &  15 &  15.35 & -0.35 \tabularnewline
53 &  17 &  15.37 &  1.63 \tabularnewline
54 &  10 &  15.21 & -5.21 \tabularnewline
55 &  17 &  15.22 &  1.784 \tabularnewline
56 &  20 &  15.5 &  4.503 \tabularnewline
57 &  17 &  15.56 &  1.436 \tabularnewline
58 &  18 &  15.5 &  2.497 \tabularnewline
59 &  17 &  15.48 &  1.516 \tabularnewline
60 &  14 &  15.5 & -1.497 \tabularnewline
61 &  17 &  15.35 &  1.65 \tabularnewline
62 &  17 &  15.36 &  1.643 \tabularnewline
63 &  16 &  15.56 &  0.4369 \tabularnewline
64 &  18 &  15.5 &  2.503 \tabularnewline
65 &  18 &  15.68 &  2.323 \tabularnewline
66 &  16 &  15.21 &  0.7902 \tabularnewline
67 &  15 &  15.7 & -0.697 \tabularnewline
68 &  13 &  15.24 & -2.243 \tabularnewline
69 &  16 &  15.68 &  0.316 \tabularnewline
70 &  12 &  15.58 & -3.577 \tabularnewline
71 &  16 &  15.56 &  0.4363 \tabularnewline
72 &  16 &  15.52 &  0.4771 \tabularnewline
73 &  16 &  15.72 &  0.2838 \tabularnewline
74 &  14 &  15.66 & -1.656 \tabularnewline
75 &  15 &  15.52 & -0.5162 \tabularnewline
76 &  14 &  15.5 & -1.503 \tabularnewline
77 &  15 &  15.7 & -0.7036 \tabularnewline
78 &  15 &  15.49 & -0.4903 \tabularnewline
79 &  16 &  15.57 &  0.43 \tabularnewline
80 &  11 &  15.24 & -4.237 \tabularnewline
81 &  18 &  15.42 &  2.583 \tabularnewline
82 &  11 &  15.21 & -4.21 \tabularnewline
83 &  18 &  15.36 &  2.637 \tabularnewline
84 &  15 &  15.28 & -0.2832 \tabularnewline
85 &  19 &  15.51 &  3.49 \tabularnewline
86 &  17 &  15.7 &  1.296 \tabularnewline
87 &  14 &  15.2 & -1.204 \tabularnewline
88 &  13 &  15.44 & -2.443 \tabularnewline
89 &  17 &  15.51 &  1.49 \tabularnewline
90 &  14 &  15.01 & -1.01 \tabularnewline
91 &  19 &  15.72 &  3.283 \tabularnewline
92 &  14 &  15.37 & -1.37 \tabularnewline
93 &  16 &  15.5 &  0.5034 \tabularnewline
94 &  16 &  15.48 &  0.5164 \tabularnewline
95 &  15 &  15.5 & -0.5033 \tabularnewline
96 &  12 &  15.51 & -3.51 \tabularnewline
97 &  17 &  15.6 &  1.404 \tabularnewline
98 &  18 &  15.36 &  2.637 \tabularnewline
99 &  15 &  15.56 & -0.5574 \tabularnewline
100 &  18 &  15.58 &  2.417 \tabularnewline
101 &  15 &  15.64 & -0.6436 \tabularnewline
102 &  16 &  15.68 &  0.316 \tabularnewline
103 &  16 &  15.36 &  0.6367 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297825&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C] 13[/C][C] 15.56[/C][C]-2.557[/C][/ROW]
[ROW][C]2[/C][C] 16[/C][C] 15.56[/C][C] 0.443[/C][/ROW]
[ROW][C]3[/C][C] 17[/C][C] 15.22[/C][C] 1.777[/C][/ROW]
[ROW][C]4[/C][C] 16[/C][C] 15.42[/C][C] 0.5762[/C][/ROW]
[ROW][C]5[/C][C] 17[/C][C] 15.6[/C][C] 1.397[/C][/ROW]
[ROW][C]6[/C][C] 17[/C][C] 15.22[/C][C] 1.783[/C][/ROW]
[ROW][C]7[/C][C] 15[/C][C] 15.29[/C][C]-0.2901[/C][/ROW]
[ROW][C]8[/C][C] 16[/C][C] 15.54[/C][C] 0.4559[/C][/ROW]
[ROW][C]9[/C][C] 14[/C][C] 15.69[/C][C]-1.69[/C][/ROW]
[ROW][C]10[/C][C] 16[/C][C] 15.5[/C][C] 0.5034[/C][/ROW]
[ROW][C]11[/C][C] 17[/C][C] 15.58[/C][C] 1.423[/C][/ROW]
[ROW][C]12[/C][C] 16[/C][C] 15.56[/C][C] 0.4363[/C][/ROW]
[ROW][C]13[/C][C] 16[/C][C] 15.58[/C][C] 0.4169[/C][/ROW]
[ROW][C]14[/C][C] 16[/C][C] 15.29[/C][C] 0.7099[/C][/ROW]
[ROW][C]15[/C][C] 15[/C][C] 15.5[/C][C]-0.4966[/C][/ROW]
[ROW][C]16[/C][C] 16[/C][C] 15.29[/C][C] 0.7095[/C][/ROW]
[ROW][C]17[/C][C] 16[/C][C] 15.35[/C][C] 0.6497[/C][/ROW]
[ROW][C]18[/C][C] 13[/C][C] 15.57[/C][C]-2.57[/C][/ROW]
[ROW][C]19[/C][C] 15[/C][C] 15.48[/C][C]-0.4836[/C][/ROW]
[ROW][C]20[/C][C] 17[/C][C] 15.5[/C][C] 1.503[/C][/ROW]
[ROW][C]21[/C][C] 13[/C][C] 15.57[/C][C]-2.57[/C][/ROW]
[ROW][C]22[/C][C] 17[/C][C] 15.57[/C][C] 1.43[/C][/ROW]
[ROW][C]23[/C][C] 14[/C][C] 15.57[/C][C]-1.57[/C][/ROW]
[ROW][C]24[/C][C] 14[/C][C] 15.37[/C][C]-1.369[/C][/ROW]
[ROW][C]25[/C][C] 18[/C][C] 15.34[/C][C] 2.663[/C][/ROW]
[ROW][C]26[/C][C] 17[/C][C] 15.54[/C][C] 1.464[/C][/ROW]
[ROW][C]27[/C][C] 13[/C][C] 15.45[/C][C]-2.45[/C][/ROW]
[ROW][C]28[/C][C] 16[/C][C] 15.3[/C][C] 0.6969[/C][/ROW]
[ROW][C]29[/C][C] 15[/C][C] 15.71[/C][C]-0.7099[/C][/ROW]
[ROW][C]30[/C][C] 15[/C][C] 15.52[/C][C]-0.5162[/C][/ROW]
[ROW][C]31[/C][C] 15[/C][C] 15.55[/C][C]-0.5507[/C][/ROW]
[ROW][C]32[/C][C] 13[/C][C] 15.56[/C][C]-2.564[/C][/ROW]
[ROW][C]33[/C][C] 17[/C][C] 15.38[/C][C] 1.624[/C][/ROW]
[ROW][C]34[/C][C] 11[/C][C] 15.58[/C][C]-4.577[/C][/ROW]
[ROW][C]35[/C][C] 14[/C][C] 15.59[/C][C]-1.59[/C][/ROW]
[ROW][C]36[/C][C] 13[/C][C] 15.63[/C][C]-2.63[/C][/ROW]
[ROW][C]37[/C][C] 17[/C][C] 15.28[/C][C] 1.723[/C][/ROW]
[ROW][C]38[/C][C] 16[/C][C] 15.49[/C][C] 0.5097[/C][/ROW]
[ROW][C]39[/C][C] 17[/C][C] 15.32[/C][C] 1.684[/C][/ROW]
[ROW][C]40[/C][C] 16[/C][C] 15.48[/C][C] 0.524[/C][/ROW]
[ROW][C]41[/C][C] 16[/C][C] 15.5[/C][C] 0.5034[/C][/ROW]
[ROW][C]42[/C][C] 16[/C][C] 15.36[/C][C] 0.6367[/C][/ROW]
[ROW][C]43[/C][C] 15[/C][C] 15.59[/C][C]-0.5896[/C][/ROW]
[ROW][C]44[/C][C] 12[/C][C] 15.42[/C][C]-3.423[/C][/ROW]
[ROW][C]45[/C][C] 17[/C][C] 15.42[/C][C] 1.576[/C][/ROW]
[ROW][C]46[/C][C] 14[/C][C] 15.71[/C][C]-1.71[/C][/ROW]
[ROW][C]47[/C][C] 14[/C][C] 15.34[/C][C]-1.337[/C][/ROW]
[ROW][C]48[/C][C] 16[/C][C] 15.48[/C][C] 0.5164[/C][/ROW]
[ROW][C]49[/C][C] 15[/C][C] 15.47[/C][C]-0.4706[/C][/ROW]
[ROW][C]50[/C][C] 16[/C][C] 15.52[/C][C] 0.4771[/C][/ROW]
[ROW][C]51[/C][C] 14[/C][C] 15.55[/C][C]-1.55[/C][/ROW]
[ROW][C]52[/C][C] 15[/C][C] 15.35[/C][C]-0.35[/C][/ROW]
[ROW][C]53[/C][C] 17[/C][C] 15.37[/C][C] 1.63[/C][/ROW]
[ROW][C]54[/C][C] 10[/C][C] 15.21[/C][C]-5.21[/C][/ROW]
[ROW][C]55[/C][C] 17[/C][C] 15.22[/C][C] 1.784[/C][/ROW]
[ROW][C]56[/C][C] 20[/C][C] 15.5[/C][C] 4.503[/C][/ROW]
[ROW][C]57[/C][C] 17[/C][C] 15.56[/C][C] 1.436[/C][/ROW]
[ROW][C]58[/C][C] 18[/C][C] 15.5[/C][C] 2.497[/C][/ROW]
[ROW][C]59[/C][C] 17[/C][C] 15.48[/C][C] 1.516[/C][/ROW]
[ROW][C]60[/C][C] 14[/C][C] 15.5[/C][C]-1.497[/C][/ROW]
[ROW][C]61[/C][C] 17[/C][C] 15.35[/C][C] 1.65[/C][/ROW]
[ROW][C]62[/C][C] 17[/C][C] 15.36[/C][C] 1.643[/C][/ROW]
[ROW][C]63[/C][C] 16[/C][C] 15.56[/C][C] 0.4369[/C][/ROW]
[ROW][C]64[/C][C] 18[/C][C] 15.5[/C][C] 2.503[/C][/ROW]
[ROW][C]65[/C][C] 18[/C][C] 15.68[/C][C] 2.323[/C][/ROW]
[ROW][C]66[/C][C] 16[/C][C] 15.21[/C][C] 0.7902[/C][/ROW]
[ROW][C]67[/C][C] 15[/C][C] 15.7[/C][C]-0.697[/C][/ROW]
[ROW][C]68[/C][C] 13[/C][C] 15.24[/C][C]-2.243[/C][/ROW]
[ROW][C]69[/C][C] 16[/C][C] 15.68[/C][C] 0.316[/C][/ROW]
[ROW][C]70[/C][C] 12[/C][C] 15.58[/C][C]-3.577[/C][/ROW]
[ROW][C]71[/C][C] 16[/C][C] 15.56[/C][C] 0.4363[/C][/ROW]
[ROW][C]72[/C][C] 16[/C][C] 15.52[/C][C] 0.4771[/C][/ROW]
[ROW][C]73[/C][C] 16[/C][C] 15.72[/C][C] 0.2838[/C][/ROW]
[ROW][C]74[/C][C] 14[/C][C] 15.66[/C][C]-1.656[/C][/ROW]
[ROW][C]75[/C][C] 15[/C][C] 15.52[/C][C]-0.5162[/C][/ROW]
[ROW][C]76[/C][C] 14[/C][C] 15.5[/C][C]-1.503[/C][/ROW]
[ROW][C]77[/C][C] 15[/C][C] 15.7[/C][C]-0.7036[/C][/ROW]
[ROW][C]78[/C][C] 15[/C][C] 15.49[/C][C]-0.4903[/C][/ROW]
[ROW][C]79[/C][C] 16[/C][C] 15.57[/C][C] 0.43[/C][/ROW]
[ROW][C]80[/C][C] 11[/C][C] 15.24[/C][C]-4.237[/C][/ROW]
[ROW][C]81[/C][C] 18[/C][C] 15.42[/C][C] 2.583[/C][/ROW]
[ROW][C]82[/C][C] 11[/C][C] 15.21[/C][C]-4.21[/C][/ROW]
[ROW][C]83[/C][C] 18[/C][C] 15.36[/C][C] 2.637[/C][/ROW]
[ROW][C]84[/C][C] 15[/C][C] 15.28[/C][C]-0.2832[/C][/ROW]
[ROW][C]85[/C][C] 19[/C][C] 15.51[/C][C] 3.49[/C][/ROW]
[ROW][C]86[/C][C] 17[/C][C] 15.7[/C][C] 1.296[/C][/ROW]
[ROW][C]87[/C][C] 14[/C][C] 15.2[/C][C]-1.204[/C][/ROW]
[ROW][C]88[/C][C] 13[/C][C] 15.44[/C][C]-2.443[/C][/ROW]
[ROW][C]89[/C][C] 17[/C][C] 15.51[/C][C] 1.49[/C][/ROW]
[ROW][C]90[/C][C] 14[/C][C] 15.01[/C][C]-1.01[/C][/ROW]
[ROW][C]91[/C][C] 19[/C][C] 15.72[/C][C] 3.283[/C][/ROW]
[ROW][C]92[/C][C] 14[/C][C] 15.37[/C][C]-1.37[/C][/ROW]
[ROW][C]93[/C][C] 16[/C][C] 15.5[/C][C] 0.5034[/C][/ROW]
[ROW][C]94[/C][C] 16[/C][C] 15.48[/C][C] 0.5164[/C][/ROW]
[ROW][C]95[/C][C] 15[/C][C] 15.5[/C][C]-0.5033[/C][/ROW]
[ROW][C]96[/C][C] 12[/C][C] 15.51[/C][C]-3.51[/C][/ROW]
[ROW][C]97[/C][C] 17[/C][C] 15.6[/C][C] 1.404[/C][/ROW]
[ROW][C]98[/C][C] 18[/C][C] 15.36[/C][C] 2.637[/C][/ROW]
[ROW][C]99[/C][C] 15[/C][C] 15.56[/C][C]-0.5574[/C][/ROW]
[ROW][C]100[/C][C] 18[/C][C] 15.58[/C][C] 2.417[/C][/ROW]
[ROW][C]101[/C][C] 15[/C][C] 15.64[/C][C]-0.6436[/C][/ROW]
[ROW][C]102[/C][C] 16[/C][C] 15.68[/C][C] 0.316[/C][/ROW]
[ROW][C]103[/C][C] 16[/C][C] 15.36[/C][C] 0.6367[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297825&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297825&T=4
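
Each row of the table is the observed value, the fitted (interpolated) value, and their difference. A short sketch to rebuild the three columns from the assumed fitted model:

data.frame(Actuals = model.frame(mylm)$TVDCSUM,
  Interpolation = fitted(mylm),
  Residuals = resid(mylm))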









Goldfeld-Quandt test for Heteroskedasticity
p-values	Alternative Hypothesis
breakpoint index	greater	2-sided	less
8 0.3195 0.639 0.6805
9 0.1791 0.3582 0.8209
10 0.09012 0.1802 0.9099
11 0.04625 0.0925 0.9537
12 0.02111 0.04221 0.9789
13 0.02973 0.05947 0.9703
14 0.01411 0.02822 0.9859
15 0.00649 0.01298 0.9935
16 0.004936 0.009872 0.9951
17 0.002232 0.004464 0.9978
18 0.004845 0.009691 0.9952
19 0.002378 0.004756 0.9976
20 0.003536 0.007073 0.9965
21 0.004875 0.009751 0.9951
22 0.01002 0.02003 0.99
23 0.006924 0.01385 0.9931
24 0.009855 0.01971 0.9901
25 0.009717 0.01943 0.9903
26 0.009368 0.01874 0.9906
27 0.0178 0.0356 0.9822
28 0.01242 0.02484 0.9876
29 0.008563 0.01713 0.9914
30 0.005238 0.01048 0.9948
31 0.00366 0.007319 0.9963
32 0.007261 0.01452 0.9927
33 0.006003 0.01201 0.994
34 0.04886 0.09772 0.9511
35 0.03897 0.07794 0.961
36 0.04037 0.08074 0.9596
37 0.03535 0.07071 0.9646
38 0.02491 0.04982 0.9751
39 0.02521 0.05043 0.9748
40 0.01806 0.03612 0.9819
41 0.01355 0.02709 0.9865
42 0.009414 0.01883 0.9906
43 0.006327 0.01265 0.9937
44 0.02404 0.04808 0.976
45 0.01934 0.03868 0.9807
46 0.01827 0.03655 0.9817
47 0.01574 0.03149 0.9843
48 0.01182 0.02365 0.9882
49 0.008161 0.01632 0.9918
50 0.005506 0.01101 0.9945
51 0.004866 0.009733 0.9951
52 0.003308 0.006616 0.9967
53 0.00318 0.00636 0.9968
54 0.1214 0.2429 0.8786
55 0.1149 0.2298 0.8851
56 0.3413 0.6826 0.6587
57 0.3265 0.6529 0.6735
58 0.3765 0.7531 0.6235
59 0.3553 0.7106 0.6447
60 0.3355 0.671 0.6645
61 0.3325 0.6649 0.6675
62 0.3181 0.6361 0.6819
63 0.2725 0.5449 0.7275
64 0.3069 0.6138 0.6931
65 0.3379 0.6759 0.6621
66 0.302 0.6039 0.698
67 0.2716 0.5433 0.7284
68 0.2797 0.5594 0.7203
69 0.2336 0.4671 0.7664
70 0.3901 0.7801 0.6099
71 0.3318 0.6636 0.6682
72 0.2821 0.5642 0.7179
73 0.2654 0.5308 0.7346
74 0.2856 0.5713 0.7144
75 0.2432 0.4865 0.7568
76 0.2311 0.4622 0.7689
77 0.2129 0.4258 0.7871
78 0.1691 0.3381 0.8309
79 0.1366 0.2733 0.8634
80 0.2351 0.4702 0.7649
81 0.2665 0.5331 0.7335
82 0.5823 0.8353 0.4177
83 0.6402 0.7195 0.3598
84 0.5606 0.8788 0.4394
85 0.6473 0.7053 0.3527
86 0.5762 0.8476 0.4238
87 0.5509 0.8982 0.4491
88 0.581 0.8379 0.419
89 0.4916 0.9831 0.5084
90 0.5096 0.9807 0.4904
91 0.5547 0.8905 0.4453
92 0.6546 0.6908 0.3454
93 0.5239 0.9521 0.4761
94 0.382 0.764 0.618
95 0.249 0.4979 0.751

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
8 &  0.3195 &  0.639 &  0.6805 \tabularnewline
9 &  0.1791 &  0.3582 &  0.8209 \tabularnewline
10 &  0.09012 &  0.1802 &  0.9099 \tabularnewline
11 &  0.04625 &  0.0925 &  0.9537 \tabularnewline
12 &  0.02111 &  0.04221 &  0.9789 \tabularnewline
13 &  0.02973 &  0.05947 &  0.9703 \tabularnewline
14 &  0.01411 &  0.02822 &  0.9859 \tabularnewline
15 &  0.00649 &  0.01298 &  0.9935 \tabularnewline
16 &  0.004936 &  0.009872 &  0.9951 \tabularnewline
17 &  0.002232 &  0.004464 &  0.9978 \tabularnewline
18 &  0.004845 &  0.009691 &  0.9952 \tabularnewline
19 &  0.002378 &  0.004756 &  0.9976 \tabularnewline
20 &  0.003536 &  0.007073 &  0.9965 \tabularnewline
21 &  0.004875 &  0.009751 &  0.9951 \tabularnewline
22 &  0.01002 &  0.02003 &  0.99 \tabularnewline
23 &  0.006924 &  0.01385 &  0.9931 \tabularnewline
24 &  0.009855 &  0.01971 &  0.9901 \tabularnewline
25 &  0.009717 &  0.01943 &  0.9903 \tabularnewline
26 &  0.009368 &  0.01874 &  0.9906 \tabularnewline
27 &  0.0178 &  0.0356 &  0.9822 \tabularnewline
28 &  0.01242 &  0.02484 &  0.9876 \tabularnewline
29 &  0.008563 &  0.01713 &  0.9914 \tabularnewline
30 &  0.005238 &  0.01048 &  0.9948 \tabularnewline
31 &  0.00366 &  0.007319 &  0.9963 \tabularnewline
32 &  0.007261 &  0.01452 &  0.9927 \tabularnewline
33 &  0.006003 &  0.01201 &  0.994 \tabularnewline
34 &  0.04886 &  0.09772 &  0.9511 \tabularnewline
35 &  0.03897 &  0.07794 &  0.961 \tabularnewline
36 &  0.04037 &  0.08074 &  0.9596 \tabularnewline
37 &  0.03535 &  0.07071 &  0.9646 \tabularnewline
38 &  0.02491 &  0.04982 &  0.9751 \tabularnewline
39 &  0.02521 &  0.05043 &  0.9748 \tabularnewline
40 &  0.01806 &  0.03612 &  0.9819 \tabularnewline
41 &  0.01355 &  0.02709 &  0.9865 \tabularnewline
42 &  0.009414 &  0.01883 &  0.9906 \tabularnewline
43 &  0.006327 &  0.01265 &  0.9937 \tabularnewline
44 &  0.02404 &  0.04808 &  0.976 \tabularnewline
45 &  0.01934 &  0.03868 &  0.9807 \tabularnewline
46 &  0.01827 &  0.03655 &  0.9817 \tabularnewline
47 &  0.01574 &  0.03149 &  0.9843 \tabularnewline
48 &  0.01182 &  0.02365 &  0.9882 \tabularnewline
49 &  0.008161 &  0.01632 &  0.9918 \tabularnewline
50 &  0.005506 &  0.01101 &  0.9945 \tabularnewline
51 &  0.004866 &  0.009733 &  0.9951 \tabularnewline
52 &  0.003308 &  0.006616 &  0.9967 \tabularnewline
53 &  0.00318 &  0.00636 &  0.9968 \tabularnewline
54 &  0.1214 &  0.2429 &  0.8786 \tabularnewline
55 &  0.1149 &  0.2298 &  0.8851 \tabularnewline
56 &  0.3413 &  0.6826 &  0.6587 \tabularnewline
57 &  0.3265 &  0.6529 &  0.6735 \tabularnewline
58 &  0.3765 &  0.7531 &  0.6235 \tabularnewline
59 &  0.3553 &  0.7106 &  0.6447 \tabularnewline
60 &  0.3355 &  0.671 &  0.6645 \tabularnewline
61 &  0.3325 &  0.6649 &  0.6675 \tabularnewline
62 &  0.3181 &  0.6361 &  0.6819 \tabularnewline
63 &  0.2725 &  0.5449 &  0.7275 \tabularnewline
64 &  0.3069 &  0.6138 &  0.6931 \tabularnewline
65 &  0.3379 &  0.6759 &  0.6621 \tabularnewline
66 &  0.302 &  0.6039 &  0.698 \tabularnewline
67 &  0.2716 &  0.5433 &  0.7284 \tabularnewline
68 &  0.2797 &  0.5594 &  0.7203 \tabularnewline
69 &  0.2336 &  0.4671 &  0.7664 \tabularnewline
70 &  0.3901 &  0.7801 &  0.6099 \tabularnewline
71 &  0.3318 &  0.6636 &  0.6682 \tabularnewline
72 &  0.2821 &  0.5642 &  0.7179 \tabularnewline
73 &  0.2654 &  0.5308 &  0.7346 \tabularnewline
74 &  0.2856 &  0.5713 &  0.7144 \tabularnewline
75 &  0.2432 &  0.4865 &  0.7568 \tabularnewline
76 &  0.2311 &  0.4622 &  0.7689 \tabularnewline
77 &  0.2129 &  0.4258 &  0.7871 \tabularnewline
78 &  0.1691 &  0.3381 &  0.8309 \tabularnewline
79 &  0.1366 &  0.2733 &  0.8634 \tabularnewline
80 &  0.2351 &  0.4702 &  0.7649 \tabularnewline
81 &  0.2665 &  0.5331 &  0.7335 \tabularnewline
82 &  0.5823 &  0.8353 &  0.4177 \tabularnewline
83 &  0.6402 &  0.7195 &  0.3598 \tabularnewline
84 &  0.5606 &  0.8788 &  0.4394 \tabularnewline
85 &  0.6473 &  0.7053 &  0.3527 \tabularnewline
86 &  0.5762 &  0.8476 &  0.4238 \tabularnewline
87 &  0.5509 &  0.8982 &  0.4491 \tabularnewline
88 &  0.581 &  0.8379 &  0.419 \tabularnewline
89 &  0.4916 &  0.9831 &  0.5084 \tabularnewline
90 &  0.5096 &  0.9807 &  0.4904 \tabularnewline
91 &  0.5547 &  0.8905 &  0.4453 \tabularnewline
92 &  0.6546 &  0.6908 &  0.3454 \tabularnewline
93 &  0.5239 &  0.9521 &  0.4761 \tabularnewline
94 &  0.382 &  0.764 &  0.618 \tabularnewline
95 &  0.249 &  0.4979 &  0.751 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297825&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]8[/C][C] 0.3195[/C][C] 0.639[/C][C] 0.6805[/C][/ROW]
[ROW][C]9[/C][C] 0.1791[/C][C] 0.3582[/C][C] 0.8209[/C][/ROW]
[ROW][C]10[/C][C] 0.09012[/C][C] 0.1802[/C][C] 0.9099[/C][/ROW]
[ROW][C]11[/C][C] 0.04625[/C][C] 0.0925[/C][C] 0.9537[/C][/ROW]
[ROW][C]12[/C][C] 0.02111[/C][C] 0.04221[/C][C] 0.9789[/C][/ROW]
[ROW][C]13[/C][C] 0.02973[/C][C] 0.05947[/C][C] 0.9703[/C][/ROW]
[ROW][C]14[/C][C] 0.01411[/C][C] 0.02822[/C][C] 0.9859[/C][/ROW]
[ROW][C]15[/C][C] 0.00649[/C][C] 0.01298[/C][C] 0.9935[/C][/ROW]
[ROW][C]16[/C][C] 0.004936[/C][C] 0.009872[/C][C] 0.9951[/C][/ROW]
[ROW][C]17[/C][C] 0.002232[/C][C] 0.004464[/C][C] 0.9978[/C][/ROW]
[ROW][C]18[/C][C] 0.004845[/C][C] 0.009691[/C][C] 0.9952[/C][/ROW]
[ROW][C]19[/C][C] 0.002378[/C][C] 0.004756[/C][C] 0.9976[/C][/ROW]
[ROW][C]20[/C][C] 0.003536[/C][C] 0.007073[/C][C] 0.9965[/C][/ROW]
[ROW][C]21[/C][C] 0.004875[/C][C] 0.009751[/C][C] 0.9951[/C][/ROW]
[ROW][C]22[/C][C] 0.01002[/C][C] 0.02003[/C][C] 0.99[/C][/ROW]
[ROW][C]23[/C][C] 0.006924[/C][C] 0.01385[/C][C] 0.9931[/C][/ROW]
[ROW][C]24[/C][C] 0.009855[/C][C] 0.01971[/C][C] 0.9901[/C][/ROW]
[ROW][C]25[/C][C] 0.009717[/C][C] 0.01943[/C][C] 0.9903[/C][/ROW]
[ROW][C]26[/C][C] 0.009368[/C][C] 0.01874[/C][C] 0.9906[/C][/ROW]
[ROW][C]27[/C][C] 0.0178[/C][C] 0.0356[/C][C] 0.9822[/C][/ROW]
[ROW][C]28[/C][C] 0.01242[/C][C] 0.02484[/C][C] 0.9876[/C][/ROW]
[ROW][C]29[/C][C] 0.008563[/C][C] 0.01713[/C][C] 0.9914[/C][/ROW]
[ROW][C]30[/C][C] 0.005238[/C][C] 0.01048[/C][C] 0.9948[/C][/ROW]
[ROW][C]31[/C][C] 0.00366[/C][C] 0.007319[/C][C] 0.9963[/C][/ROW]
[ROW][C]32[/C][C] 0.007261[/C][C] 0.01452[/C][C] 0.9927[/C][/ROW]
[ROW][C]33[/C][C] 0.006003[/C][C] 0.01201[/C][C] 0.994[/C][/ROW]
[ROW][C]34[/C][C] 0.04886[/C][C] 0.09772[/C][C] 0.9511[/C][/ROW]
[ROW][C]35[/C][C] 0.03897[/C][C] 0.07794[/C][C] 0.961[/C][/ROW]
[ROW][C]36[/C][C] 0.04037[/C][C] 0.08074[/C][C] 0.9596[/C][/ROW]
[ROW][C]37[/C][C] 0.03535[/C][C] 0.07071[/C][C] 0.9646[/C][/ROW]
[ROW][C]38[/C][C] 0.02491[/C][C] 0.04982[/C][C] 0.9751[/C][/ROW]
[ROW][C]39[/C][C] 0.02521[/C][C] 0.05043[/C][C] 0.9748[/C][/ROW]
[ROW][C]40[/C][C] 0.01806[/C][C] 0.03612[/C][C] 0.9819[/C][/ROW]
[ROW][C]41[/C][C] 0.01355[/C][C] 0.02709[/C][C] 0.9865[/C][/ROW]
[ROW][C]42[/C][C] 0.009414[/C][C] 0.01883[/C][C] 0.9906[/C][/ROW]
[ROW][C]43[/C][C] 0.006327[/C][C] 0.01265[/C][C] 0.9937[/C][/ROW]
[ROW][C]44[/C][C] 0.02404[/C][C] 0.04808[/C][C] 0.976[/C][/ROW]
[ROW][C]45[/C][C] 0.01934[/C][C] 0.03868[/C][C] 0.9807[/C][/ROW]
[ROW][C]46[/C][C] 0.01827[/C][C] 0.03655[/C][C] 0.9817[/C][/ROW]
[ROW][C]47[/C][C] 0.01574[/C][C] 0.03149[/C][C] 0.9843[/C][/ROW]
[ROW][C]48[/C][C] 0.01182[/C][C] 0.02365[/C][C] 0.9882[/C][/ROW]
[ROW][C]49[/C][C] 0.008161[/C][C] 0.01632[/C][C] 0.9918[/C][/ROW]
[ROW][C]50[/C][C] 0.005506[/C][C] 0.01101[/C][C] 0.9945[/C][/ROW]
[ROW][C]51[/C][C] 0.004866[/C][C] 0.009733[/C][C] 0.9951[/C][/ROW]
[ROW][C]52[/C][C] 0.003308[/C][C] 0.006616[/C][C] 0.9967[/C][/ROW]
[ROW][C]53[/C][C] 0.00318[/C][C] 0.00636[/C][C] 0.9968[/C][/ROW]
[ROW][C]54[/C][C] 0.1214[/C][C] 0.2429[/C][C] 0.8786[/C][/ROW]
[ROW][C]55[/C][C] 0.1149[/C][C] 0.2298[/C][C] 0.8851[/C][/ROW]
[ROW][C]56[/C][C] 0.3413[/C][C] 0.6826[/C][C] 0.6587[/C][/ROW]
[ROW][C]57[/C][C] 0.3265[/C][C] 0.6529[/C][C] 0.6735[/C][/ROW]
[ROW][C]58[/C][C] 0.3765[/C][C] 0.7531[/C][C] 0.6235[/C][/ROW]
[ROW][C]59[/C][C] 0.3553[/C][C] 0.7106[/C][C] 0.6447[/C][/ROW]
[ROW][C]60[/C][C] 0.3355[/C][C] 0.671[/C][C] 0.6645[/C][/ROW]
[ROW][C]61[/C][C] 0.3325[/C][C] 0.6649[/C][C] 0.6675[/C][/ROW]
[ROW][C]62[/C][C] 0.3181[/C][C] 0.6361[/C][C] 0.6819[/C][/ROW]
[ROW][C]63[/C][C] 0.2725[/C][C] 0.5449[/C][C] 0.7275[/C][/ROW]
[ROW][C]64[/C][C] 0.3069[/C][C] 0.6138[/C][C] 0.6931[/C][/ROW]
[ROW][C]65[/C][C] 0.3379[/C][C] 0.6759[/C][C] 0.6621[/C][/ROW]
[ROW][C]66[/C][C] 0.302[/C][C] 0.6039[/C][C] 0.698[/C][/ROW]
[ROW][C]67[/C][C] 0.2716[/C][C] 0.5433[/C][C] 0.7284[/C][/ROW]
[ROW][C]68[/C][C] 0.2797[/C][C] 0.5594[/C][C] 0.7203[/C][/ROW]
[ROW][C]69[/C][C] 0.2336[/C][C] 0.4671[/C][C] 0.7664[/C][/ROW]
[ROW][C]70[/C][C] 0.3901[/C][C] 0.7801[/C][C] 0.6099[/C][/ROW]
[ROW][C]71[/C][C] 0.3318[/C][C] 0.6636[/C][C] 0.6682[/C][/ROW]
[ROW][C]72[/C][C] 0.2821[/C][C] 0.5642[/C][C] 0.7179[/C][/ROW]
[ROW][C]73[/C][C] 0.2654[/C][C] 0.5308[/C][C] 0.7346[/C][/ROW]
[ROW][C]74[/C][C] 0.2856[/C][C] 0.5713[/C][C] 0.7144[/C][/ROW]
[ROW][C]75[/C][C] 0.2432[/C][C] 0.4865[/C][C] 0.7568[/C][/ROW]
[ROW][C]76[/C][C] 0.2311[/C][C] 0.4622[/C][C] 0.7689[/C][/ROW]
[ROW][C]77[/C][C] 0.2129[/C][C] 0.4258[/C][C] 0.7871[/C][/ROW]
[ROW][C]78[/C][C] 0.1691[/C][C] 0.3381[/C][C] 0.8309[/C][/ROW]
[ROW][C]79[/C][C] 0.1366[/C][C] 0.2733[/C][C] 0.8634[/C][/ROW]
[ROW][C]80[/C][C] 0.2351[/C][C] 0.4702[/C][C] 0.7649[/C][/ROW]
[ROW][C]81[/C][C] 0.2665[/C][C] 0.5331[/C][C] 0.7335[/C][/ROW]
[ROW][C]82[/C][C] 0.5823[/C][C] 0.8353[/C][C] 0.4177[/C][/ROW]
[ROW][C]83[/C][C] 0.6402[/C][C] 0.7195[/C][C] 0.3598[/C][/ROW]
[ROW][C]84[/C][C] 0.5606[/C][C] 0.8788[/C][C] 0.4394[/C][/ROW]
[ROW][C]85[/C][C] 0.6473[/C][C] 0.7053[/C][C] 0.3527[/C][/ROW]
[ROW][C]86[/C][C] 0.5762[/C][C] 0.8476[/C][C] 0.4238[/C][/ROW]
[ROW][C]87[/C][C] 0.5509[/C][C] 0.8982[/C][C] 0.4491[/C][/ROW]
[ROW][C]88[/C][C] 0.581[/C][C] 0.8379[/C][C] 0.419[/C][/ROW]
[ROW][C]89[/C][C] 0.4916[/C][C] 0.9831[/C][C] 0.5084[/C][/ROW]
[ROW][C]90[/C][C] 0.5096[/C][C] 0.9807[/C][C] 0.4904[/C][/ROW]
[ROW][C]91[/C][C] 0.5547[/C][C] 0.8905[/C][C] 0.4453[/C][/ROW]
[ROW][C]92[/C][C] 0.6546[/C][C] 0.6908[/C][C] 0.3454[/C][/ROW]
[ROW][C]93[/C][C] 0.5239[/C][C] 0.9521[/C][C] 0.4761[/C][/ROW]
[ROW][C]94[/C][C] 0.382[/C][C] 0.764[/C][C] 0.618[/C][/ROW]
[ROW][C]95[/C][C] 0.249[/C][C] 0.4979[/C][C] 0.751[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297825&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297825&T=5
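
For every admissible breakpoint, the table reports the p-value of a Goldfeld-Quandt test that splits the sample at that index and compares the residual variances of the two parts; the run of small 2-sided p-values between roughly index 11 and index 53 suggests heteroskedastic residuals. A minimal sketch of a single such test with lmtest, using breakpoint 30 as an example (the module loops this call over all breakpoints):

library(lmtest)
gqtest(mylm, point = 30, alternative = 'two.sided')   # 2-sided p approx. 0.0105 at this breakpoint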









Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests	OK/NOK
1% type I error level	10	0.1136	NOK
5% type I error level	36	0.409091	NOK
10% type I error level	43	0.488636	NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 10 &  0.1136 & NOK \tabularnewline
5% type I error level & 36 & 0.409091 & NOK \tabularnewline
10% type I error level & 43 & 0.488636 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297825&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]10[/C][C] 0.1136[/C][C]NOK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]36[/C][C]0.409091[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]43[/C][C]0.488636[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297825&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297825&T=6
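
The meta analysis counts how many of the 88 breakpoint tests (indices 8 through 95) have a 2-sided p-value below each type I error level and flags NOK when that share exceeds the level itself, e.g. 36/88 = 0.409 > 0.05. A sketch of the counting step, assuming the 2-sided p-values were collected in a vector p2 (hypothetical name):

sapply(c(0.01, 0.05, 0.10), function(a) c(count = sum(p2 < a), share = mean(p2 < a)))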









Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.3669, df1 = 2, df2 = 96, p-value = 0.2598
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.7932, df1 = 8, df2 = 90, p-value = 0.08865
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.0432, df1 = 2, df2 = 96, p-value = 0.3563

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.3669, df1 = 2, df2 = 96, p-value = 0.2598
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.7932, df1 = 8, df2 = 90, p-value = 0.08865
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.0432, df1 = 2, df2 = 96, p-value = 0.3563
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=297825&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.3669, df1 = 2, df2 = 96, p-value = 0.2598
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.7932, df1 = 8, df2 = 90, p-value = 0.08865
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.0432, df1 = 2, df2 = 96, p-value = 0.3563
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=297825&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297825&T=7
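
None of the three RESET variants rejects at the 5% level, so there is no strong evidence that powers of the fitted values, of the regressors, or of their principal components were wrongly omitted from the model. The tests come from lmtest::resettest; a minimal sketch:

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')      # RESET = 1.3669, p = 0.2598
resettest(mylm, power = 2:3, type = 'regressor')   # RESET = 1.7932, p = 0.08865
resettest(mylm, power = 2:3, type = 'princomp')    # RESET = 1.0432, p = 0.3563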









Variance Inflation Factors (Multicollinearity)
> vif
   KVDD1    KVDD2    KVDD3    KVDD4 
1.065855 1.040676 1.065567 1.052178 

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
   KVDD1    KVDD2    KVDD3    KVDD4 
1.065855 1.040676 1.065567 1.052178 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=297825&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
   KVDD1    KVDD2    KVDD3    KVDD4 
1.065855 1.040676 1.065567 1.052178 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=297825&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297825&T=8
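
All four variance inflation factors are close to 1, so multicollinearity among the regressors is negligible (values above roughly 5 to 10 are the usual cause for concern). They are computed with car::vif; a minimal sketch:

library(car)
vif(mylm)   # KVDD1 1.066, KVDD2 1.041, KVDD3 1.066, KVDD4 1.052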





Parameters (Session):
par1 = 5 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 5 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
par5 <- '0'
par4 <- '0'
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
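# transpose the uploaded series, drop rows with missing values, and move the endogenous column (par1) to the front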
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
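# Goldfeld-Quandt test at every admissible breakpoint (only computed when there are more than n25 = 25 observations)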
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
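# diagnostic plots: actuals vs. interpolation, residuals, studentized residual histogram, residual density, QQ plot, residual lag plot, ACF/PACF, lm diagnostics and Goldfeld-Quandt p-values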
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
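# build the output tables shown above with the server-side table helpers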
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')