Statistical Computations at FreeStatistics.org

Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Fri, 02 Dec 2016 17:06:07 +0100

Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/02/t1480695365t43xpzjktz1ujl4.htm/, Retrieved Wed, 08 May 2024 00:04:31 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=297582, Retrieved Wed, 08 May 2024 00:04:31 +0000
IsPrivate? No (this computation is public)
User-defined keywords: Regressieanalyse (regression analysis)
Estimated Impact: 125
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Regressieanalyse] [2016-12-02 16:06:07] [404ac5ee4f7301873f6a96ef36861981] [Current]
Dataseries X (columns, left to right: SK1, SK2, SK3, SK4, SK5, SK6, TVDC):
4	2	4	3	5	4	13
5	3	3	4	5	4	16
4	4	5	4	5	4	17
3	4	3	3	4	4	15
4	4	5	4	5	4	16
3	4	4	4	5	5	16
3	4	4	3	3	4	17
3	4	5	4	4	4	16
4	5	4	4	5	5	17
4	5	5	4	5	5	17
4	4	2	4	5	4	17
4	4	5	3	5	4	15
4	4	4	3	4	5	16
3	3	5	4	4	5	14
4	4	5	4	2	5	16
3	4	5	4	4	5	17
3	4	5	4	4	5	16
5	5	4	3	4	4	15
4	4	4	4	5	4	17
3	4	5	3	4	5	16
4	4	4	4	5	5	15
4	4	5	4	4	5	16
4	4	5	4	4	4	15
4	4	5	4	4	5	17
3	4	4	4	4	4	15
3	4	4	3	5	5	16
4	4	4	4	4	4	15
2	4	5	4	5	5	16
5	4	4	4	4	4	16
4	3	5	4	4	4	13
4	5	5	4	5	5	15
5	4	5	4	4	5	17
2	3	5	4	5	4	13
4	5	2	4	4	4	17
3	4	5	4	4	4	15
4	3	5	3	4	5	14
4	3	3	4	4	4	14
4	4	5	4	4	4	18
5	4	4	4	4	4	15
4	5	5	4	5	5	17
3	3	4	4	4	4	13
5	5	5	3	5	5	16
5	4	5	3	4	4	15
4	4	4	3	4	5	15
4	4	4	4	4	4	16
3	5	5	3	3	4	15
4	4	4	4	5	4	13
4	5	5	4	4	4	17
5	5	2	4	5	4	17
5	5	5	4	4	4	17
4	3	5	4	5	5	11
4	3	4	3	4	5	14
4	4	5	4	4	4	13
3	4	4	3	3	4	15
3	4	4	4	4	3	17
4	4	4	3	5	4	16
4	4	4	4	5	4	15
5	5	3	4	5	5	17
2	4	4	4	5	5	16
4	4	4	4	5	5	16
3	4	4	4	2	4	16
4	4	5	4	5	5	15
4	2	4	4	4	4	12
4	4	4	3	5	3	17
4	4	4	3	5	4	14
5	4	5	3	3	5	14
3	4	4	3	5	5	16
3	4	4	3	4	5	15
4	5	5	5	5	4	15
4	4	4	4	4	4	13
4	4	4	5	5	4	17
3	4	3	4	4	4	15
4	4	4	4	5	4	16
3	4	5	3	5	5	14
3	3	5	4	4	5	15
4	3	5	4	4	4	17
4	4	5	4	4	5	16
3	3	3	4	4	4	10
4	4	4	4	5	4	16
4	4	3	4	5	5	17
4	4	4	4	5	5	17
5	4	4	4	4	4	20
5	4	3	5	4	5	17
4	4	5	4	5	5	18
3	4	5	4	4	5	15
4	2	3	3	4	4	14
4	4	5	4	4	3	15
4	4	5	4	4	5	17
4	4	4	4	5	4	16
4	5	4	4	5	3	17
3	4	4	3	5	5	15
4	4	5	4	4	5	16
5	4	3	4	4	5	18
5	4	5	5	4	5	18
4	5	4	4	5	5	16
5	3	4	4	5	5	17
4	4	5	4	4	5	15
5	4	4	4	4	5	13
5	4	4	5	5	5	17
4	4	3	3	4	3	16
4	4	5	4	4	4	15
4	4	5	4	4	4	16
3	4	5	4	5	3	16
4	4	4	4	4	4	13
4	4	4	3	4	5	15
3	3	4	3	5	5	12
4	4	4	3	4	4	19
3	4	5	4	4	4	16
4	4	5	4	3	4	16
5	4	5	1	5	5	17
5	4	5	4	5	5	16
4	4	4	4	4	3	14
4	4	5	3	4	4	15
3	4	4	3	4	5	14
4	4	4	4	4	4	16
4	4	4	4	5	4	15
4	5	3	4	4	4	17
3	4	4	4	4	4	15
4	4	4	3	4	4	16
4	4	4	4	4	5	16
3	4	3	3	4	4	15
4	4	4	3	4	3	15
3	2	4	2	4	4	11
4	4	4	3	5	4	16
5	4	4	3	5	4	18
2	4	4	3	3	5	13
3	3	4	4	4	4	11
5	5	4	4	5	4	18
4	5	5	4	4	4	15
5	5	5	5	5	4	19
4	5	5	4	5	5	17
4	4	4	3	4	5	13
3	4	5	4	5	4	14
4	4	5	4	4	4	16
4	4	2	4	4	4	13
4	4	3	4	5	5	17
4	4	4	4	5	5	14
5	4	5	3	5	4	19
4	3	5	4	4	4	14
4	4	5	4	4	4	16
3	3	2	3	4	4	12
4	5	5	4	4	3	16
4	4	4	3	4	4	16
4	4	4	4	4	5	15
3	4	5	3	5	5	12
4	4	5	4	4	5	15
5	4	5	4	5	4	17
4	4	5	4	3	4	13
2	3	5	4	4	4	15
4	4	4	4	4	5	18
4	3	4	3	5	5	15
4	4	4	4	4	3	18
4	5	5	5	4	4	15
5	4	3	4	4	4	15
5	4	4	3	4	4	16
3	3	1	4	5	5	13
4	4	4	4	4	5	16
4	4	4	4	5	4	13
2	3	4	5	5	4	16




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: Big Analytics Cloud Computing Center

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code) \tabularnewline
Raw Output & view raw output of R engine \tabularnewline
Computing time & 7 seconds \tabularnewline
R Server & Big Analytics Cloud Computing Center \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297582&T=0

Source: https://freestatistics.org/blog/index.php?pk=297582&T=0








Multiple Linear Regression - Estimated Regression Equation
TVDC[t] = 6.51162 + 0.559398 SK1[t] + 1.12488 SK2[t] + 0.088725 SK3[t] + 0.266163 SK4[t] + 0.219108 SK5[t] + 0.00273889 SK6[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TVDC[t] =  +  6.51162 +  0.559398SK1[t] +  1.12488SK2[t] +  0.088725SK3[t] +  0.266163SK4[t] +  0.219108SK5[t] +  0.00273889SK6[t]  + e[t] \tabularnewline
 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297582&T=1

Source: https://freestatistics.org/blog/index.php?pk=297582&T=1
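The estimated equation above can be reproduced with ordinary least squares in base R. A minimal sketch, assuming the tab-separated Dataseries X shown above has been saved to a file named dataseries.txt and that its seven columns correspond, left to right, to SK1 through SK6 and TVDC (the file name and the column naming are illustrative assumptions, not part of the original computation):

# Read the tab-separated data series (7 unnamed columns); the file name is illustrative.
df <- read.table("dataseries.txt", header = FALSE, sep = "\t",
                 col.names = c("SK1", "SK2", "SK3", "SK4", "SK5", "SK6", "TVDC"))

# Ordinary least squares fit of the same model
fit <- lm(TVDC ~ SK1 + SK2 + SK3 + SK4 + SK5 + SK6, data = df)
coef(fit)  # should reproduce the intercept and slope estimates shown above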








Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.     T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +6.512      1.552    +4.1950e+00                  4.614e-05        2.307e-05
SK1           +0.5594     0.1636   +3.4200e+00                  0.0008037        0.0004019
SK2           +1.125      0.1993   +5.6440e+00                  7.92e-08         3.96e-08
SK3           +0.08872    0.1461   +6.0740e-01                  0.5445           0.2722
SK4           +0.2662     0.1997   +1.3330e+00                  0.1846           0.09232
SK5           +0.2191     0.1879   +1.1660e+00                  0.2454           0.1227
SK6           +0.002739   0.195    +1.4050e-02                  0.9888           0.4944

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & +6.512 &  1.552 & +4.1950e+00 &  4.614e-05 &  2.307e-05 \tabularnewline
SK1 & +0.5594 &  0.1636 & +3.4200e+00 &  0.0008037 &  0.0004019 \tabularnewline
SK2 & +1.125 &  0.1993 & +5.6440e+00 &  7.92e-08 &  3.96e-08 \tabularnewline
SK3 & +0.08872 &  0.1461 & +6.0740e-01 &  0.5445 &  0.2722 \tabularnewline
SK4 & +0.2662 &  0.1997 & +1.3330e+00 &  0.1846 &  0.09232 \tabularnewline
SK5 & +0.2191 &  0.1879 & +1.1660e+00 &  0.2454 &  0.1227 \tabularnewline
SK6 & +0.002739 &  0.195 & +1.4050e-02 &  0.9888 &  0.4944 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297582&T=2

Source: https://freestatistics.org/blog/index.php?pk=297582&T=2
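The columns of this table correspond to the coefficient matrix returned by summary() in R, and the 1-tail p-value is half the 2-tail p-value. A short sketch, assuming the fit object from the sketch above:

ctab <- summary(fit)$coefficients   # Estimate, Std. Error, t value, Pr(>|t|)

# Rebuild the module's layout, adding the 1-tail p-value as half the 2-tail one
ols <- data.frame(Parameter = ctab[, "Estimate"],
                  S.D.      = ctab[, "Std. Error"],
                  T.STAT    = ctab[, "t value"],
                  p.2tail   = ctab[, "Pr(>|t|)"],
                  p.1tail   = ctab[, "Pr(>|t|)"] / 2)
print(ols, digits = 4)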








Multiple Linear Regression - Regression Statistics
Multiple R: 0.5641
R-squared: 0.3182
Adjusted R-squared: 0.2913
F-TEST (value): 11.83
F-TEST (DF numerator): 6
F-TEST (DF denominator): 152
p-value: 7.293e-11

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.436
Sum Squared Residuals: 313.3

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R &  0.5641 \tabularnewline
R-squared &  0.3182 \tabularnewline
Adjusted R-squared &  0.2913 \tabularnewline
F-TEST (value) &  11.83 \tabularnewline
F-TEST (DF numerator) & 6 \tabularnewline
F-TEST (DF denominator) & 152 \tabularnewline
p-value &  7.293e-11 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation &  1.436 \tabularnewline
Sum Squared Residuals &  313.3 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297582&T=3

Source: https://freestatistics.org/blog/index.php?pk=297582&T=3
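These summary and residual statistics all follow from the same fitted model. A sketch, again assuming the fit object defined above:

s <- summary(fit)
sqrt(s$r.squared)       # Multiple R (about 0.5641)
s$r.squared             # R-squared (about 0.3182)
s$adj.r.squared         # Adjusted R-squared (about 0.2913)
s$fstatistic            # F value with numerator and denominator df (11.83 on 6 and 152)
pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3], lower.tail = FALSE)  # overall F-test p-value
s$sigma                 # Residual standard deviation (about 1.436)
sum(residuals(fit)^2)   # Sum of squared residuals (about 313.3)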








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 13 13.26-0.2589
2 16 15.12 0.8794
3 17 15.86 1.136
4 15 14.64 0.3586
5 16 15.86 0.1365
6 16 15.22 0.7819
7 17 14.51 2.489
8 16 15.09 0.915
9 17 16.9 0.0976
10 17 16.99 0.008872
11 17 15.6 1.403
12 15 15.6-0.5973
13 16 15.29 0.7077
14 14 13.96 0.03714
15 16 15.21 0.7911
16 17 15.09 1.912
17 16 15.09 0.9123
18 15 16.97-1.974
19 17 15.77 1.225
20 16 14.82 1.178
21 15 15.78-0.7775
22 16 15.65 0.3529
23 15 15.64-0.6444
24 17 15.65 1.353
25 15 15 0.003723
26 16 14.95 1.048
27 15 15.56-0.5557
28 16 14.75 1.253
29 16 16.12-0.1151
30 13 14.52-1.52
31 15 16.99-1.991
32 17 16.21 0.7935
33 13 13.62-0.6198
34 17 16.5 0.4969
35 15 15.09-0.085
36 14 14.26-0.2561
37 14 14.34-0.3421
38 18 15.64 2.356
39 15 16.12-1.115
40 17 16.99 0.008872
41 13 13.87-0.8714
42 16 17.28-1.284
43 15 15.94-0.9376
44 15 15.29-0.2923
45 16 15.56 0.4443
46 15 15.72-0.7246
47 13 15.77-2.775
48 17 16.77 0.2307
49 17 17.28-0.2816
50 17 17.33-0.3287
51 11 14.74-3.741
52 14 14.17-0.1674
53 13 15.64-2.644
54 15 14.51 0.489
55 17 14.99 2.006
56 16 15.51 0.4914
57 15 15.77-0.7748
58 17 17.37-0.3731
59 16 14.66 1.341
60 16 15.78 0.2225
61 16 14.56 1.442
62 15 15.87-0.8662
63 12 13.31-1.306
64 17 15.51 1.494
65 14 15.51-1.509
66 14 15.72-1.721
67 16 14.95 1.048
68 15 14.73 0.2671
69 15 17.25-2.255
70 13 15.56-2.556
71 17 16.04 0.9591
72 15 14.91 0.09245
73 16 15.77 0.2252
74 14 15.04-1.041
75 15 13.96 1.037
76 17 14.52 2.48
77 16 15.65 0.3529
78 10 13.78-3.783
79 16 15.77 0.2252
80 17 15.69 1.311
81 17 15.78 1.222
82 20 16.12 3.885
83 17 16.3 0.7047
84 18 15.87 2.134
85 15 15.09-0.08774
86 14 12.95 1.049
87 15 15.64-0.6417
88 17 15.65 1.353
89 16 15.77 0.2252
90 17 16.9 0.1031
91 15 14.95 0.04804
92 16 15.65 0.3529
93 18 16.03 1.971
94 18 16.47 1.527
95 16 16.9-0.9024
96 17 15.21 1.788
97 15 15.65-0.6471
98 13 16.12-3.118
99 17 16.6 0.3969
100 16 15.2 0.802
101 15 15.64-0.6444
102 16 15.64 0.3556
103 16 15.3 0.6986
104 13 15.56-2.556
105 15 15.29-0.2923
106 12 13.83-1.827
107 19 15.29 3.71
108 16 15.09 0.915
109 16 15.43 0.5747
110 17 15.63 1.373
111 16 16.43-0.4256
112 14 15.55-1.553
113 15 15.38-0.3782
114 14 14.73-0.7329
115 16 15.56 0.4443
116 15 15.77-0.7748
117 17 16.59 0.4082
118 15 15 0.003723
119 16 15.29 0.7105
120 16 15.56 0.4416
121 15 14.64 0.3586
122 15 15.29-0.2868
123 11 12.21-1.214
124 16 15.51 0.4914
125 18 16.07 1.932
126 13 13.95-0.9543
127 11 13.87-2.871
128 18 17.46 0.5409
129 15 16.77-1.769
130 19 17.81 1.186
131 17 16.99 0.008872
132 13 15.29-2.292
133 14 15.3-1.304
134 16 15.64 0.3556
135 13 15.38-2.378
136 17 15.69 1.311
137 14 15.78-1.778
138 19 16.16 2.843
139 14 14.52-0.5195
140 16 15.64 0.3556
141 12 13.43-1.428
142 16 16.77-0.7665
143 16 15.29 0.7105
144 15 15.56-0.5584
145 12 15.04-3.041
146 15 15.65-0.6471
147 17 16.42 0.5771
148 13 15.43-2.425
149 15 13.4 1.599
150 18 15.56 2.442
151 15 14.39 0.6135
152 18 15.55 2.447
153 15 17.04-2.035
154 15 16.03-1.026
155 16 15.85 0.1511
156 13 13.83-0.8271
157 16 15.56 0.4416
158 13 15.77-2.775
159 16 13.8 2.203

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 &  13 &  13.26 & -0.2589 \tabularnewline
2 &  16 &  15.12 &  0.8794 \tabularnewline
3 &  17 &  15.86 &  1.136 \tabularnewline
4 &  15 &  14.64 &  0.3586 \tabularnewline
5 &  16 &  15.86 &  0.1365 \tabularnewline
6 &  16 &  15.22 &  0.7819 \tabularnewline
7 &  17 &  14.51 &  2.489 \tabularnewline
8 &  16 &  15.09 &  0.915 \tabularnewline
9 &  17 &  16.9 &  0.0976 \tabularnewline
10 &  17 &  16.99 &  0.008872 \tabularnewline
11 &  17 &  15.6 &  1.403 \tabularnewline
12 &  15 &  15.6 & -0.5973 \tabularnewline
13 &  16 &  15.29 &  0.7077 \tabularnewline
14 &  14 &  13.96 &  0.03714 \tabularnewline
15 &  16 &  15.21 &  0.7911 \tabularnewline
16 &  17 &  15.09 &  1.912 \tabularnewline
17 &  16 &  15.09 &  0.9123 \tabularnewline
18 &  15 &  16.97 & -1.974 \tabularnewline
19 &  17 &  15.77 &  1.225 \tabularnewline
20 &  16 &  14.82 &  1.178 \tabularnewline
21 &  15 &  15.78 & -0.7775 \tabularnewline
22 &  16 &  15.65 &  0.3529 \tabularnewline
23 &  15 &  15.64 & -0.6444 \tabularnewline
24 &  17 &  15.65 &  1.353 \tabularnewline
25 &  15 &  15 &  0.003723 \tabularnewline
26 &  16 &  14.95 &  1.048 \tabularnewline
27 &  15 &  15.56 & -0.5557 \tabularnewline
28 &  16 &  14.75 &  1.253 \tabularnewline
29 &  16 &  16.12 & -0.1151 \tabularnewline
30 &  13 &  14.52 & -1.52 \tabularnewline
31 &  15 &  16.99 & -1.991 \tabularnewline
32 &  17 &  16.21 &  0.7935 \tabularnewline
33 &  13 &  13.62 & -0.6198 \tabularnewline
34 &  17 &  16.5 &  0.4969 \tabularnewline
35 &  15 &  15.09 & -0.085 \tabularnewline
36 &  14 &  14.26 & -0.2561 \tabularnewline
37 &  14 &  14.34 & -0.3421 \tabularnewline
38 &  18 &  15.64 &  2.356 \tabularnewline
39 &  15 &  16.12 & -1.115 \tabularnewline
40 &  17 &  16.99 &  0.008872 \tabularnewline
41 &  13 &  13.87 & -0.8714 \tabularnewline
42 &  16 &  17.28 & -1.284 \tabularnewline
43 &  15 &  15.94 & -0.9376 \tabularnewline
44 &  15 &  15.29 & -0.2923 \tabularnewline
45 &  16 &  15.56 &  0.4443 \tabularnewline
46 &  15 &  15.72 & -0.7246 \tabularnewline
47 &  13 &  15.77 & -2.775 \tabularnewline
48 &  17 &  16.77 &  0.2307 \tabularnewline
49 &  17 &  17.28 & -0.2816 \tabularnewline
50 &  17 &  17.33 & -0.3287 \tabularnewline
51 &  11 &  14.74 & -3.741 \tabularnewline
52 &  14 &  14.17 & -0.1674 \tabularnewline
53 &  13 &  15.64 & -2.644 \tabularnewline
54 &  15 &  14.51 &  0.489 \tabularnewline
55 &  17 &  14.99 &  2.006 \tabularnewline
56 &  16 &  15.51 &  0.4914 \tabularnewline
57 &  15 &  15.77 & -0.7748 \tabularnewline
58 &  17 &  17.37 & -0.3731 \tabularnewline
59 &  16 &  14.66 &  1.341 \tabularnewline
60 &  16 &  15.78 &  0.2225 \tabularnewline
61 &  16 &  14.56 &  1.442 \tabularnewline
62 &  15 &  15.87 & -0.8662 \tabularnewline
63 &  12 &  13.31 & -1.306 \tabularnewline
64 &  17 &  15.51 &  1.494 \tabularnewline
65 &  14 &  15.51 & -1.509 \tabularnewline
66 &  14 &  15.72 & -1.721 \tabularnewline
67 &  16 &  14.95 &  1.048 \tabularnewline
68 &  15 &  14.73 &  0.2671 \tabularnewline
69 &  15 &  17.25 & -2.255 \tabularnewline
70 &  13 &  15.56 & -2.556 \tabularnewline
71 &  17 &  16.04 &  0.9591 \tabularnewline
72 &  15 &  14.91 &  0.09245 \tabularnewline
73 &  16 &  15.77 &  0.2252 \tabularnewline
74 &  14 &  15.04 & -1.041 \tabularnewline
75 &  15 &  13.96 &  1.037 \tabularnewline
76 &  17 &  14.52 &  2.48 \tabularnewline
77 &  16 &  15.65 &  0.3529 \tabularnewline
78 &  10 &  13.78 & -3.783 \tabularnewline
79 &  16 &  15.77 &  0.2252 \tabularnewline
80 &  17 &  15.69 &  1.311 \tabularnewline
81 &  17 &  15.78 &  1.222 \tabularnewline
82 &  20 &  16.12 &  3.885 \tabularnewline
83 &  17 &  16.3 &  0.7047 \tabularnewline
84 &  18 &  15.87 &  2.134 \tabularnewline
85 &  15 &  15.09 & -0.08774 \tabularnewline
86 &  14 &  12.95 &  1.049 \tabularnewline
87 &  15 &  15.64 & -0.6417 \tabularnewline
88 &  17 &  15.65 &  1.353 \tabularnewline
89 &  16 &  15.77 &  0.2252 \tabularnewline
90 &  17 &  16.9 &  0.1031 \tabularnewline
91 &  15 &  14.95 &  0.04804 \tabularnewline
92 &  16 &  15.65 &  0.3529 \tabularnewline
93 &  18 &  16.03 &  1.971 \tabularnewline
94 &  18 &  16.47 &  1.527 \tabularnewline
95 &  16 &  16.9 & -0.9024 \tabularnewline
96 &  17 &  15.21 &  1.788 \tabularnewline
97 &  15 &  15.65 & -0.6471 \tabularnewline
98 &  13 &  16.12 & -3.118 \tabularnewline
99 &  17 &  16.6 &  0.3969 \tabularnewline
100 &  16 &  15.2 &  0.802 \tabularnewline
101 &  15 &  15.64 & -0.6444 \tabularnewline
102 &  16 &  15.64 &  0.3556 \tabularnewline
103 &  16 &  15.3 &  0.6986 \tabularnewline
104 &  13 &  15.56 & -2.556 \tabularnewline
105 &  15 &  15.29 & -0.2923 \tabularnewline
106 &  12 &  13.83 & -1.827 \tabularnewline
107 &  19 &  15.29 &  3.71 \tabularnewline
108 &  16 &  15.09 &  0.915 \tabularnewline
109 &  16 &  15.43 &  0.5747 \tabularnewline
110 &  17 &  15.63 &  1.373 \tabularnewline
111 &  16 &  16.43 & -0.4256 \tabularnewline
112 &  14 &  15.55 & -1.553 \tabularnewline
113 &  15 &  15.38 & -0.3782 \tabularnewline
114 &  14 &  14.73 & -0.7329 \tabularnewline
115 &  16 &  15.56 &  0.4443 \tabularnewline
116 &  15 &  15.77 & -0.7748 \tabularnewline
117 &  17 &  16.59 &  0.4082 \tabularnewline
118 &  15 &  15 &  0.003723 \tabularnewline
119 &  16 &  15.29 &  0.7105 \tabularnewline
120 &  16 &  15.56 &  0.4416 \tabularnewline
121 &  15 &  14.64 &  0.3586 \tabularnewline
122 &  15 &  15.29 & -0.2868 \tabularnewline
123 &  11 &  12.21 & -1.214 \tabularnewline
124 &  16 &  15.51 &  0.4914 \tabularnewline
125 &  18 &  16.07 &  1.932 \tabularnewline
126 &  13 &  13.95 & -0.9543 \tabularnewline
127 &  11 &  13.87 & -2.871 \tabularnewline
128 &  18 &  17.46 &  0.5409 \tabularnewline
129 &  15 &  16.77 & -1.769 \tabularnewline
130 &  19 &  17.81 &  1.186 \tabularnewline
131 &  17 &  16.99 &  0.008872 \tabularnewline
132 &  13 &  15.29 & -2.292 \tabularnewline
133 &  14 &  15.3 & -1.304 \tabularnewline
134 &  16 &  15.64 &  0.3556 \tabularnewline
135 &  13 &  15.38 & -2.378 \tabularnewline
136 &  17 &  15.69 &  1.311 \tabularnewline
137 &  14 &  15.78 & -1.778 \tabularnewline
138 &  19 &  16.16 &  2.843 \tabularnewline
139 &  14 &  14.52 & -0.5195 \tabularnewline
140 &  16 &  15.64 &  0.3556 \tabularnewline
141 &  12 &  13.43 & -1.428 \tabularnewline
142 &  16 &  16.77 & -0.7665 \tabularnewline
143 &  16 &  15.29 &  0.7105 \tabularnewline
144 &  15 &  15.56 & -0.5584 \tabularnewline
145 &  12 &  15.04 & -3.041 \tabularnewline
146 &  15 &  15.65 & -0.6471 \tabularnewline
147 &  17 &  16.42 &  0.5771 \tabularnewline
148 &  13 &  15.43 & -2.425 \tabularnewline
149 &  15 &  13.4 &  1.599 \tabularnewline
150 &  18 &  15.56 &  2.442 \tabularnewline
151 &  15 &  14.39 &  0.6135 \tabularnewline
152 &  18 &  15.55 &  2.447 \tabularnewline
153 &  15 &  17.04 & -2.035 \tabularnewline
154 &  15 &  16.03 & -1.026 \tabularnewline
155 &  16 &  15.85 &  0.1511 \tabularnewline
156 &  13 &  13.83 & -0.8271 \tabularnewline
157 &  16 &  15.56 &  0.4416 \tabularnewline
158 &  13 &  15.77 & -2.775 \tabularnewline
159 &  16 &  13.8 &  2.203 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297582&T=4

Source: https://freestatistics.org/blog/index.php?pk=297582&T=4
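In this table the interpolation (forecast) column holds the in-sample fitted values and the residual column is the actual value minus the fitted value. A sketch, assuming the df and fit objects from above:

act_int_res <- data.frame(Index         = seq_len(nrow(df)),
                          Actuals       = df$TVDC,
                          Interpolation = fitted(fit),
                          Residuals     = residuals(fit))
head(round(act_int_res, 4))   # first rows of the table above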








Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index   greater   2-sided   less
10 0.1488 0.2976 0.8512
11 0.08174 0.1635 0.9183
12 0.03336 0.06672 0.9666
13 0.01247 0.02493 0.9875
14 0.01188 0.02375 0.9881
15 0.01252 0.02505 0.9875
16 0.01957 0.03914 0.9804
17 0.009441 0.01888 0.9906
18 0.02644 0.05288 0.9736
19 0.01695 0.03391 0.983
20 0.01393 0.02787 0.9861
21 0.01318 0.02636 0.9868
22 0.007185 0.01437 0.9928
23 0.007547 0.01509 0.9925
24 0.007449 0.0149 0.9926
25 0.009478 0.01896 0.9905
26 0.005974 0.01195 0.994
27 0.005276 0.01055 0.9947
28 0.003283 0.006565 0.9967
29 0.001827 0.003653 0.9982
30 0.003488 0.006977 0.9965
31 0.007586 0.01517 0.9924
32 0.007628 0.01526 0.9924
33 0.00834 0.01668 0.9917
34 0.005619 0.01124 0.9944
35 0.003568 0.007136 0.9964
36 0.002439 0.004878 0.9976
37 0.002065 0.004129 0.9979
38 0.008039 0.01608 0.992
39 0.006806 0.01361 0.9932
40 0.00439 0.008781 0.9956
41 0.005049 0.0101 0.995
42 0.00403 0.008061 0.996
43 0.002739 0.005478 0.9973
44 0.001924 0.003847 0.9981
45 0.00123 0.002459 0.9988
46 0.00103 0.00206 0.999
47 0.0047 0.009401 0.9953
48 0.003278 0.006555 0.9967
49 0.002157 0.004314 0.9978
50 0.001429 0.002857 0.9986
51 0.01884 0.03768 0.9812
52 0.01361 0.02721 0.9864
53 0.02952 0.05903 0.9705
54 0.02284 0.04567 0.9772
55 0.02978 0.05956 0.9702
56 0.02466 0.04931 0.9753
57 0.01944 0.03889 0.9806
58 0.01437 0.02874 0.9856
59 0.01247 0.02493 0.9875
60 0.009039 0.01808 0.991
61 0.008797 0.01759 0.9912
62 0.006866 0.01373 0.9931
63 0.006656 0.01331 0.9933
64 0.008469 0.01694 0.9915
65 0.009096 0.01819 0.9909
66 0.009414 0.01883 0.9906
67 0.007912 0.01582 0.9921
68 0.006139 0.01228 0.9939
69 0.009827 0.01965 0.9902
70 0.02225 0.04451 0.9777
71 0.02076 0.04153 0.9792
72 0.01779 0.03559 0.9822
73 0.01353 0.02706 0.9865
74 0.01213 0.02426 0.9879
75 0.01052 0.02104 0.9895
76 0.02473 0.04947 0.9753
77 0.0193 0.03861 0.9807
78 0.1213 0.2425 0.8787
79 0.1007 0.2013 0.8993
80 0.09832 0.1966 0.9017
81 0.09442 0.1888 0.9056
82 0.3047 0.6095 0.6953
83 0.2727 0.5454 0.7273
84 0.3255 0.6509 0.6745
85 0.2897 0.5794 0.7103
86 0.2661 0.5321 0.7339
87 0.2362 0.4724 0.7638
88 0.2375 0.4749 0.7625
89 0.2033 0.4067 0.7967
90 0.1722 0.3444 0.8278
91 0.1458 0.2915 0.8542
92 0.1234 0.2469 0.8766
93 0.1478 0.2957 0.8522
94 0.157 0.314 0.843
95 0.1391 0.2781 0.8609
96 0.1541 0.3083 0.8459
97 0.1314 0.2629 0.8686
98 0.2331 0.4662 0.7669
99 0.2008 0.4015 0.7992
100 0.1761 0.3523 0.8239
101 0.1505 0.3011 0.8495
102 0.126 0.252 0.874
103 0.1067 0.2135 0.8933
104 0.1572 0.3143 0.8428
105 0.1299 0.2598 0.8701
106 0.1423 0.2845 0.8577
107 0.3666 0.7331 0.6334
108 0.3504 0.7007 0.6496
109 0.3276 0.6551 0.6724
110 0.3186 0.6373 0.6814
111 0.2833 0.5666 0.7167
112 0.2887 0.5774 0.7113
113 0.2467 0.4934 0.7533
114 0.2153 0.4306 0.7847
115 0.1843 0.3685 0.8157
116 0.1646 0.3293 0.8354
117 0.1454 0.2909 0.8546
118 0.1208 0.2417 0.8792
119 0.1067 0.2135 0.8933
120 0.09228 0.1846 0.9077
121 0.08667 0.1733 0.9133
122 0.06675 0.1335 0.9333
123 0.06242 0.1248 0.9376
124 0.04782 0.09563 0.9522
125 0.05111 0.1022 0.9489
126 0.05703 0.1141 0.943
127 0.121 0.242 0.879
128 0.1003 0.2006 0.8997
129 0.08509 0.1702 0.9149
130 0.07057 0.1411 0.9294
131 0.0638 0.1276 0.9362
132 0.06184 0.1237 0.9382
133 0.05419 0.1084 0.9458
134 0.03871 0.07743 0.9613
135 0.04226 0.08451 0.9577
136 0.05071 0.1014 0.9493
137 0.04507 0.09014 0.9549
138 0.09117 0.1823 0.9088
139 0.1191 0.2382 0.8809
140 0.08406 0.1681 0.9159
141 0.08846 0.1769 0.9115
142 0.06379 0.1276 0.9362
143 0.06189 0.1238 0.9381
144 0.03852 0.07704 0.9615
145 0.03366 0.06732 0.9663
146 0.01948 0.03897 0.9805
147 0.01064 0.02128 0.9894
148 0.1182 0.2364 0.8818
149 0.6746 0.6508 0.3254

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
10 &  0.1488 &  0.2976 &  0.8512 \tabularnewline
11 &  0.08174 &  0.1635 &  0.9183 \tabularnewline
12 &  0.03336 &  0.06672 &  0.9666 \tabularnewline
13 &  0.01247 &  0.02493 &  0.9875 \tabularnewline
14 &  0.01188 &  0.02375 &  0.9881 \tabularnewline
15 &  0.01252 &  0.02505 &  0.9875 \tabularnewline
16 &  0.01957 &  0.03914 &  0.9804 \tabularnewline
17 &  0.009441 &  0.01888 &  0.9906 \tabularnewline
18 &  0.02644 &  0.05288 &  0.9736 \tabularnewline
19 &  0.01695 &  0.03391 &  0.983 \tabularnewline
20 &  0.01393 &  0.02787 &  0.9861 \tabularnewline
21 &  0.01318 &  0.02636 &  0.9868 \tabularnewline
22 &  0.007185 &  0.01437 &  0.9928 \tabularnewline
23 &  0.007547 &  0.01509 &  0.9925 \tabularnewline
24 &  0.007449 &  0.0149 &  0.9926 \tabularnewline
25 &  0.009478 &  0.01896 &  0.9905 \tabularnewline
26 &  0.005974 &  0.01195 &  0.994 \tabularnewline
27 &  0.005276 &  0.01055 &  0.9947 \tabularnewline
28 &  0.003283 &  0.006565 &  0.9967 \tabularnewline
29 &  0.001827 &  0.003653 &  0.9982 \tabularnewline
30 &  0.003488 &  0.006977 &  0.9965 \tabularnewline
31 &  0.007586 &  0.01517 &  0.9924 \tabularnewline
32 &  0.007628 &  0.01526 &  0.9924 \tabularnewline
33 &  0.00834 &  0.01668 &  0.9917 \tabularnewline
34 &  0.005619 &  0.01124 &  0.9944 \tabularnewline
35 &  0.003568 &  0.007136 &  0.9964 \tabularnewline
36 &  0.002439 &  0.004878 &  0.9976 \tabularnewline
37 &  0.002065 &  0.004129 &  0.9979 \tabularnewline
38 &  0.008039 &  0.01608 &  0.992 \tabularnewline
39 &  0.006806 &  0.01361 &  0.9932 \tabularnewline
40 &  0.00439 &  0.008781 &  0.9956 \tabularnewline
41 &  0.005049 &  0.0101 &  0.995 \tabularnewline
42 &  0.00403 &  0.008061 &  0.996 \tabularnewline
43 &  0.002739 &  0.005478 &  0.9973 \tabularnewline
44 &  0.001924 &  0.003847 &  0.9981 \tabularnewline
45 &  0.00123 &  0.002459 &  0.9988 \tabularnewline
46 &  0.00103 &  0.00206 &  0.999 \tabularnewline
47 &  0.0047 &  0.009401 &  0.9953 \tabularnewline
48 &  0.003278 &  0.006555 &  0.9967 \tabularnewline
49 &  0.002157 &  0.004314 &  0.9978 \tabularnewline
50 &  0.001429 &  0.002857 &  0.9986 \tabularnewline
51 &  0.01884 &  0.03768 &  0.9812 \tabularnewline
52 &  0.01361 &  0.02721 &  0.9864 \tabularnewline
53 &  0.02952 &  0.05903 &  0.9705 \tabularnewline
54 &  0.02284 &  0.04567 &  0.9772 \tabularnewline
55 &  0.02978 &  0.05956 &  0.9702 \tabularnewline
56 &  0.02466 &  0.04931 &  0.9753 \tabularnewline
57 &  0.01944 &  0.03889 &  0.9806 \tabularnewline
58 &  0.01437 &  0.02874 &  0.9856 \tabularnewline
59 &  0.01247 &  0.02493 &  0.9875 \tabularnewline
60 &  0.009039 &  0.01808 &  0.991 \tabularnewline
61 &  0.008797 &  0.01759 &  0.9912 \tabularnewline
62 &  0.006866 &  0.01373 &  0.9931 \tabularnewline
63 &  0.006656 &  0.01331 &  0.9933 \tabularnewline
64 &  0.008469 &  0.01694 &  0.9915 \tabularnewline
65 &  0.009096 &  0.01819 &  0.9909 \tabularnewline
66 &  0.009414 &  0.01883 &  0.9906 \tabularnewline
67 &  0.007912 &  0.01582 &  0.9921 \tabularnewline
68 &  0.006139 &  0.01228 &  0.9939 \tabularnewline
69 &  0.009827 &  0.01965 &  0.9902 \tabularnewline
70 &  0.02225 &  0.04451 &  0.9777 \tabularnewline
71 &  0.02076 &  0.04153 &  0.9792 \tabularnewline
72 &  0.01779 &  0.03559 &  0.9822 \tabularnewline
73 &  0.01353 &  0.02706 &  0.9865 \tabularnewline
74 &  0.01213 &  0.02426 &  0.9879 \tabularnewline
75 &  0.01052 &  0.02104 &  0.9895 \tabularnewline
76 &  0.02473 &  0.04947 &  0.9753 \tabularnewline
77 &  0.0193 &  0.03861 &  0.9807 \tabularnewline
78 &  0.1213 &  0.2425 &  0.8787 \tabularnewline
79 &  0.1007 &  0.2013 &  0.8993 \tabularnewline
80 &  0.09832 &  0.1966 &  0.9017 \tabularnewline
81 &  0.09442 &  0.1888 &  0.9056 \tabularnewline
82 &  0.3047 &  0.6095 &  0.6953 \tabularnewline
83 &  0.2727 &  0.5454 &  0.7273 \tabularnewline
84 &  0.3255 &  0.6509 &  0.6745 \tabularnewline
85 &  0.2897 &  0.5794 &  0.7103 \tabularnewline
86 &  0.2661 &  0.5321 &  0.7339 \tabularnewline
87 &  0.2362 &  0.4724 &  0.7638 \tabularnewline
88 &  0.2375 &  0.4749 &  0.7625 \tabularnewline
89 &  0.2033 &  0.4067 &  0.7967 \tabularnewline
90 &  0.1722 &  0.3444 &  0.8278 \tabularnewline
91 &  0.1458 &  0.2915 &  0.8542 \tabularnewline
92 &  0.1234 &  0.2469 &  0.8766 \tabularnewline
93 &  0.1478 &  0.2957 &  0.8522 \tabularnewline
94 &  0.157 &  0.314 &  0.843 \tabularnewline
95 &  0.1391 &  0.2781 &  0.8609 \tabularnewline
96 &  0.1541 &  0.3083 &  0.8459 \tabularnewline
97 &  0.1314 &  0.2629 &  0.8686 \tabularnewline
98 &  0.2331 &  0.4662 &  0.7669 \tabularnewline
99 &  0.2008 &  0.4015 &  0.7992 \tabularnewline
100 &  0.1761 &  0.3523 &  0.8239 \tabularnewline
101 &  0.1505 &  0.3011 &  0.8495 \tabularnewline
102 &  0.126 &  0.252 &  0.874 \tabularnewline
103 &  0.1067 &  0.2135 &  0.8933 \tabularnewline
104 &  0.1572 &  0.3143 &  0.8428 \tabularnewline
105 &  0.1299 &  0.2598 &  0.8701 \tabularnewline
106 &  0.1423 &  0.2845 &  0.8577 \tabularnewline
107 &  0.3666 &  0.7331 &  0.6334 \tabularnewline
108 &  0.3504 &  0.7007 &  0.6496 \tabularnewline
109 &  0.3276 &  0.6551 &  0.6724 \tabularnewline
110 &  0.3186 &  0.6373 &  0.6814 \tabularnewline
111 &  0.2833 &  0.5666 &  0.7167 \tabularnewline
112 &  0.2887 &  0.5774 &  0.7113 \tabularnewline
113 &  0.2467 &  0.4934 &  0.7533 \tabularnewline
114 &  0.2153 &  0.4306 &  0.7847 \tabularnewline
115 &  0.1843 &  0.3685 &  0.8157 \tabularnewline
116 &  0.1646 &  0.3293 &  0.8354 \tabularnewline
117 &  0.1454 &  0.2909 &  0.8546 \tabularnewline
118 &  0.1208 &  0.2417 &  0.8792 \tabularnewline
119 &  0.1067 &  0.2135 &  0.8933 \tabularnewline
120 &  0.09228 &  0.1846 &  0.9077 \tabularnewline
121 &  0.08667 &  0.1733 &  0.9133 \tabularnewline
122 &  0.06675 &  0.1335 &  0.9333 \tabularnewline
123 &  0.06242 &  0.1248 &  0.9376 \tabularnewline
124 &  0.04782 &  0.09563 &  0.9522 \tabularnewline
125 &  0.05111 &  0.1022 &  0.9489 \tabularnewline
126 &  0.05703 &  0.1141 &  0.943 \tabularnewline
127 &  0.121 &  0.242 &  0.879 \tabularnewline
128 &  0.1003 &  0.2006 &  0.8997 \tabularnewline
129 &  0.08509 &  0.1702 &  0.9149 \tabularnewline
130 &  0.07057 &  0.1411 &  0.9294 \tabularnewline
131 &  0.0638 &  0.1276 &  0.9362 \tabularnewline
132 &  0.06184 &  0.1237 &  0.9382 \tabularnewline
133 &  0.05419 &  0.1084 &  0.9458 \tabularnewline
134 &  0.03871 &  0.07743 &  0.9613 \tabularnewline
135 &  0.04226 &  0.08451 &  0.9577 \tabularnewline
136 &  0.05071 &  0.1014 &  0.9493 \tabularnewline
137 &  0.04507 &  0.09014 &  0.9549 \tabularnewline
138 &  0.09117 &  0.1823 &  0.9088 \tabularnewline
139 &  0.1191 &  0.2382 &  0.8809 \tabularnewline
140 &  0.08406 &  0.1681 &  0.9159 \tabularnewline
141 &  0.08846 &  0.1769 &  0.9115 \tabularnewline
142 &  0.06379 &  0.1276 &  0.9362 \tabularnewline
143 &  0.06189 &  0.1238 &  0.9381 \tabularnewline
144 &  0.03852 &  0.07704 &  0.9615 \tabularnewline
145 &  0.03366 &  0.06732 &  0.9663 \tabularnewline
146 &  0.01948 &  0.03897 &  0.9805 \tabularnewline
147 &  0.01064 &  0.02128 &  0.9894 \tabularnewline
148 &  0.1182 &  0.2364 &  0.8818 \tabularnewline
149 &  0.6746 &  0.6508 &  0.3254 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297582&T=5

Source: https://freestatistics.org/blog/index.php?pk=297582&T=5
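Each row of this table is a Goldfeld-Quandt test with the sample split at the indicated observation, evaluated against the greater, two-sided and less alternatives. A sketch of such a scan with the lmtest package; the breakpoint range 10 to 149 is taken from the table, while the ordering of the observations and the other test options used by the module are assumptions here:

library(lmtest)

breakpoints <- 10:149
gq <- t(sapply(breakpoints, function(bp)
  c(greater   = gqtest(TVDC ~ SK1 + SK2 + SK3 + SK4 + SK5 + SK6, data = df,
                       point = bp, alternative = "greater")$p.value,
    two.sided = gqtest(TVDC ~ SK1 + SK2 + SK3 + SK4 + SK5 + SK6, data = df,
                       point = bp, alternative = "two.sided")$p.value,
    less      = gqtest(TVDC ~ SK1 + SK2 + SK3 + SK4 + SK5 + SK6, data = df,
                       point = bp, alternative = "less")$p.value)))
rownames(gq) <- breakpoints
head(round(gq, 4))   # compare with the first rows of the table above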








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     16                    0.1143                NOK
5% type I error level     64                    0.457143              NOK
10% type I error level    74                    0.528571              NOK
(The counts are taken over the 140 breakpoint tests above, indices 10 through 149; e.g. 16/140 ≈ 0.1143. NOK indicates that the observed fraction of significant tests exceeds the nominal type I error level.)

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & \# significant tests & \% significant tests & OK/NOK \tabularnewline
1\% type I error level & 16 &  0.1143 & NOK \tabularnewline
5\% type I error level & 64 & 0.457143 & NOK \tabularnewline
10\% type I error level & 74 & 0.528571 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297582&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]16[/C][C] 0.1143[/C][C]NOK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]64[/C][C]0.457143[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]74[/C][C]0.528571[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297582&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297582&T=6








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.8522, df1 = 2, df2 = 150, p-value = 0.1605
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.3634, df1 = 12, df2 = 140, p-value = 0.1905
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.2002, df1 = 2, df2 = 150, p-value = 0.304

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.8522, df1 = 2, df2 = 150, p-value = 0.1605
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.3634, df1 = 12, df2 = 140, p-value = 0.1905
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.2002, df1 = 2, df2 = 150, p-value = 0.304
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=297582&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.8522, df1 = 2, df2 = 150, p-value = 0.1605
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.3634, df1 = 12, df2 = 140, p-value = 0.1905
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.2002, df1 = 2, df2 = 150, p-value = 0.304
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=297582&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297582&T=7








Variance Inflation Factors (Multicollinearity)
> vif
     SK1      SK2      SK3      SK4      SK5      SK6 
1.093289 1.131316 1.044801 1.044030 1.044830 1.032498 

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
     SK1      SK2      SK3      SK4      SK5      SK6 
1.093289 1.131316 1.044801 1.044030 1.044830 1.032498 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=297582&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
     SK1      SK2      SK3      SK4      SK5      SK6 
1.093289 1.131316 1.044801 1.044030 1.044830 1.032498 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=297582&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297582&T=8




Parameters (Session):
par1 = 7 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 7 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = ; par5 = ;
R code (references can be found in the software module):
par5 <- ''
par4 <- ''
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '7'
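# Helper packages: lattice (densityplot), lmtest (gqtest, resettest),
# car (qqPlot, vif), MASS (studres)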
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
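# 'y' is the data object supplied by the module; transpose it so each column is a
# series, drop incomplete rows, and move the endogenous series (column par1) to the front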
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
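# Apply the transformation requested in par3: first differences,
# seasonal differences (s=12), or both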
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
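# If requested (par4 / par5 > 0), append lagged columns of the dependent
# variable and shorten the sample accordingly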
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
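# Add seasonal dummy variables (11 monthly or 3 quarterly dummies) if par2 asks for them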
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
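# Recount the columns and, if requested, append a linear trend t = 1..n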
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
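# Goldfeld-Quandt test at every admissible breakpoint (only if n > 25):
# store p-values for the 'greater', 'two.sided' and 'less' alternatives and
# count how many two-sided p-values are significant at the 1%, 5% and 10% levels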
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
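# Diagnostic plots: actuals vs. interpolation, residual series, histogram and
# density of (studentized) residuals, QQ plot, residual lag plot, ACF/PACF,
# standard lm() diagnostics, and the Goldfeld-Quandt p-value plot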
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
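# Build the output tables with the site's table.* helpers (loaded from the
# 'createtable' workspace below): estimated regression equation, OLS coefficients,
# regression/residual statistics, actuals/interpolation/residuals, and the
# Goldfeld-Quandt tables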
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
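# Ramsey RESET specification tests (lmtest::resettest) with powers 2 and 3 of
# the fitted values, the regressors, and the principal components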
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
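# Variance inflation factors (car::vif) to check for multicollinearity among the regressors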
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')
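
For readers who only need the headline diagnostics without the module's table and plot machinery, the following minimal sketch reproduces them with standard R calls. It assumes the seven data columns have been read into a data frame named mydata, with the regressors SK1 to SK6 in columns 1 to 6 and the endogenous variable in column 7 (par1 = 7); the name mydata is a placeholder and is not part of the module.
library(lmtest)   # gqtest(), resettest()
library(car)      # vif()
# mydata: assumed data frame with SK1..SK6 in columns 1-6 and the response in column 7
fit <- lm(mydata[[7]] ~ ., data = mydata[, 1:6])
summary(fit)                                        # OLS estimates, R-squared, F-test
gqtest(fit, point = 75, alternative = 'two.sided')  # Goldfeld-Quandt at a single breakpoint
resettest(fit, power = 2:3, type = 'fitted')        # Ramsey RESET on powers of fitted values
vif(fit)                                            # variance inflation factors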