Free Statistics

Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Tue, 06 Dec 2016 17:17:46 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/06/t1481041121855zeegqvkecbv3.htm/, Retrieved Sat, 04 May 2024 07:00:15 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=297868, Retrieved Sat, 04 May 2024 07:00:15 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 81
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Multi reg TA V1] [2016-12-06 16:17:46] [8b2c6464bd93a4843579a2d15e9e0aeb] [Current]
Dataseries X:
4	2	4	3	5	4	13
5	3	3	4	5	4	16
4	4	5	4	5	4	17
3	4	3	3	4	4	15
4	4	5	4	5	4	16
3	4	4	4	5	5	16
3	4	4	3	3	4	18
3	4	5	4	4	4	16
4	5	4	4	5	5	17
4	5	5	4	5	5	17
4	4	2	4	5	4	17
4	4	5	3	5	4	15
4	4	4	3	4	5	16
3	3	5	4	4	5	14
4	4	5	4	2	5	16
3	4	5	4	4	5	17
3	4	5	4	4	5	16
5	5	4	3	4	4	15
4	4	4	4	5	4	17
3	4	5	3	4	5	16
4	4	4	4	5	5	15
4	4	5	4	4	5	16
4	4	5	4	4	4	15
4	4	5	4	4	5	17
3	4	4	4	4	4	14
3	4	4	3	5	5	16
4	4	4	4	4	4	15
2	4	5	4	5	5	16
5	4	4	4	4	4	16
4	3	5	4	4	4	13
4	5	5	4	5	5	15
5	4	5	4	4	5	17
4	3	5	4	NA	5	15
2	3	5	4	5	4	13
4	5	2	4	4	4	17
3	4	5	4	4	4	15
4	3	5	3	4	5	14
4	3	3	4	4	4	14
4	4	5	4	4	4	18
5	4	4	4	4	4	15
4	5	5	4	5	5	17
3	3	4	4	4	4	13
5	5	5	3	5	5	16
5	4	5	3	4	4	15
4	4	4	3	4	5	15
4	4	4	4	4	4	16
3	5	5	3	3	4	15
4	4	4	4	5	4	13
4	5	5	4	4	4	17
5	5	2	4	5	4	18
5	5	5	4	4	4	17
4	3	5	4	5	5	11
4	3	4	3	4	5	14
4	4	5	4	4	4	13
3	4	4	3	3	4	15
3	4	4	4	4	3	17
4	4	4	3	5	4	16
4	4	4	4	5	4	15
5	5	3	4	5	5	17
2	4	4	4	5	5	16
4	4	4	4	5	5	16
3	4	4	4	2	4	16
4	4	5	4	5	5	15
4	2	4	4	4	4	12
4	4	4	3	5	3	17
4	4	4	3	5	4	14
5	4	5	3	3	5	14
3	4	4	3	5	5	16
3	4	4	3	4	5	15
4	5	5	5	5	4	15
4	4	3	4	NA	4	13
4	4	4	4	4	4	13
4	4	4	5	5	4	17
3	4	3	4	4	4	15
4	4	4	4	5	4	16
3	4	5	3	5	5	14
3	3	5	4	4	5	15
4	3	5	4	4	4	17
4	4	5	4	4	5	16
3	3	3	4	4	4	10
4	4	4	4	5	4	16
4	4	3	4	5	5	17
4	4	4	4	5	5	17
5	4	4	4	4	4	20
5	4	3	5	4	5	17
4	4	5	4	5	5	18
3	4	5	4	4	5	15
3	NA	4	4	4	4	17
4	2	3	3	4	4	14
4	4	5	4	4	3	15
4	4	5	4	4	5	17
4	4	4	4	5	4	16
4	5	4	4	5	3	17
3	4	4	3	5	5	15
4	4	5	4	4	5	16
5	4	3	4	4	5	18
5	4	5	5	4	5	18
4	5	4	4	5	5	16
5	3	4	4	5	5	17
4	4	5	4	4	5	15
5	4	4	4	4	5	13
3	4	4	3	NA	4	15
5	4	4	5	5	5	17
4	4	5	3	NA	5	16
4	4	3	3	4	3	16
4	4	5	4	4	4	15
4	4	5	4	4	4	16
3	4	5	4	5	3	16
4	4	4	4	4	4	13
4	4	4	3	4	5	15
3	3	4	3	5	5	12
4	4	4	3	4	4	19
3	4	5	4	4	4	16
4	4	5	4	3	4	16
5	4	5	1	5	5	17
5	4	5	4	5	5	16
4	4	4	4	4	3	14
4	4	5	3	4	4	15
3	4	4	3	4	5	14
4	4	4	4	4	4	16
4	4	4	4	5	4	15
4	5	3	4	4	4	17
3	4	4	4	4	4	15
4	4	4	3	4	4	16
4	4	4	4	4	5	16
3	4	3	3	4	4	15
4	4	4	3	4	3	15
3	2	4	2	4	4	11
4	4	4	3	5	4	16
5	4	4	3	5	4	18
2	4	4	3	3	5	13
3	3	4	4	4	4	11
5	5	4	4	5	4	18
4	5	5	4	4	4	15
5	5	5	5	5	4	19
4	5	5	4	5	5	17
4	4	4	3	4	5	13
3	4	5	4	5	4	14
4	4	5	4	4	4	16
4	4	2	4	4	4	13
4	4	3	4	5	5	17
4	4	4	4	5	5	14
5	4	5	3	5	4	19
4	3	5	4	4	4	14
4	4	5	4	4	4	16
3	3	2	3	4	4	12
4	5	5	4	4	3	16
4	4	4	3	4	4	16
4	4	4	4	4	5	15
3	4	5	3	5	5	12
4	4	5	4	4	5	15
5	4	5	4	5	4	17
4	4	5	4	3	4	13
2	3	5	4	4	4	15
4	4	4	4	4	5	18
4	3	4	3	5	5	15
4	4	4	4	4	3	18
4	5	5	5	4	4	15
5	4	3	4	4	4	15
5	4	4	3	4	4	16
3	3	1	4	5	5	13
4	4	4	4	4	5	16
4	4	4	4	5	4	13
2	3	4	5	5	4	16
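The series above is tab-separated, one row per observation, with missing values coded as NA. A minimal sketch of how such a model can be refit locally in R (the file name is hypothetical, and the column order SK1..SK6 followed by the dependent variable TVDC is an assumption inferred from the output below; rows containing NA are dropped before estimation):

# Minimal sketch (not the module's own code): read the series and refit the model.
# The file name and the column order are assumptions, see the note above.
dat <- read.table("dataseries_x.txt", sep = "\t", na.strings = "NA",
                  col.names = c("SK1", "SK2", "SK3", "SK4", "SK5", "SK6", "TVDC"))
dat <- na.omit(dat)                # drop the rows with missing values
fit <- lm(TVDC ~ SK1 + SK2 + SK3 + SK4 + SK5 + SK6, data = dat)
summary(fit)                       # coefficients, t-statistics, p-values, R-squared, F-test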




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: Big Analytics Cloud Computing Center
R Framework error message: The field 'Names of X columns' contains a hard return which cannot be interpreted. Please resubmit your request without hard returns in the 'Names of X columns'.
Source: https://freestatistics.org/blog/index.php?pk=297868&T=0




Multiple Linear Regression - Estimated Regression Equation
TVDC[t] = 6.60416 + 0.569262 SK1[t] + 1.14692 SK2[t] + 0.064918 SK3[t] + 0.249701 SK4[t] + 0.20974 SK5[t] + 0.000795004 SK6[t] + e[t]
Source: https://freestatistics.org/blog/index.php?pk=297868&T=1
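As a quick consistency check, substituting the first observation of the series (SK1..SK6 = 4, 2, 4, 3, 5, 4, under the column-order assumption noted above) reproduces the first interpolated value reported in the Actuals table further below:

TVDC[1] = 6.60416 + 0.569262(4) + 1.14692(2) + 0.064918(4) + 0.249701(3) + 0.20974(5) + 0.000795004(4) ≈ 13.24

so the corresponding residual is 13 - 13.24 ≈ -0.24 (tabulated as -0.2357).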




Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	+6.604	1.57	+4.2060e+00	4.419e-05	2.209e-05
SK1	+0.5693	0.1655	+3.4410e+00	0.0007494	0.0003747
SK2	+1.147	0.2016	+5.6890e+00	6.386e-08	3.193e-08
SK3	+0.06492	0.1478	+4.3930e-01	0.661	0.3305
SK4	+0.2497	0.202	+1.2360e+00	0.2184	0.1092
SK5	+0.2097	0.1901	+1.1040e+00	0.2716	0.1358
SK6	+0.000795	0.1972	+4.0310e-03	0.9968	0.4984
Source: https://freestatistics.org/blog/index.php?pk=297868&T=2
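In R, the quantities in this table come straight from the fitted model object; a minimal sketch (assuming the `fit` object from the earlier sketch; note that the 1-tail p-value reported by the module is simply half the 2-tail value):

# Coefficient table quantities (sketch; assumes `fit` from the earlier lm() call)
cf      <- summary(fit)$coefficients      # Estimate, Std. Error, t value, Pr(>|t|)
t_stat  <- cf[, "t value"]
p_2tail <- 2 * pt(abs(t_stat), df = df.residual(fit), lower.tail = FALSE)
p_1tail <- p_2tail / 2                    # one-tail p-value is half the two-tail value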




Multiple Linear Regression - Regression Statistics
Multiple R: 0.5633
R-squared: 0.3173
Adjusted R-squared: 0.2904
F-TEST (value): 11.78
F-TEST (DF numerator): 6
F-TEST (DF denominator): 152
p-value: 8.029e-11
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 1.452
Sum Squared Residuals: 320.6
Source: https://freestatistics.org/blog/index.php?pk=297868&T=3
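The F statistic can be recovered from the R-squared and the reported degrees of freedom (k = 6 regressors and n - k - 1 = 152 residual degrees of freedom, i.e. n = 159 complete cases after the rows with missing values are dropped):

F = (R-squared / k) / ((1 - R-squared) / (n - k - 1)) = (0.3173 / 6) / (0.6827 / 152) ≈ 11.8

in line with the tabulated 11.78 (the small difference is rounding in R-squared); likewise the residual standard deviation is sqrt(320.6 / 152) ≈ 1.452.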




Multiple Linear Regression - Actuals, Interpolation, and Residuals
(Source: https://freestatistics.org/blog/index.php?pk=297868&T=4)
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1 13 13.24-0.2357
2 16 15.14 0.8633
3 17 15.84 1.156
4 15 14.69 0.3144
5 16 15.84 0.1558
6 16 15.21 0.7892
7 18 14.54 3.459
8 16 15.07 0.9348
9 17 16.93 0.07302
10 17 16.99 0.008101
11 17 15.65 1.351
12 15 15.59-0.5945
13 16 15.32 0.6794
14 14 13.92 0.08095
15 16 15.22 0.7842
16 17 15.07 1.934
17 16 15.07 0.934
18 15 17.04-2.036
19 17 15.78 1.221
20 16 14.82 1.184
21 15 15.78-0.7801
22 16 15.64 0.3648
23 15 15.63-0.6344
24 17 15.64 1.365
25 14 15-1
26 16 14.96 1.039
27 15 15.57-0.5695
28 16 14.71 1.294
29 16 16.14-0.1388
30 13 14.49-1.488
31 15 16.99-1.992
32 17 16.2 0.7955
33 13 13.56-0.5587
34 17 16.59 0.4134
35 15 15.07-0.06518
36 14 14.24-0.2386
37 14 14.36-0.3577
38 18 15.63 2.366
39 15 16.14-1.139
40 17 16.99 0.008101
41 13 13.85-0.8533
42 16 17.31-1.311
43 15 15.95-0.954
44 15 15.32-0.3206
45 16 15.57 0.4305
46 15 15.75-0.7527
47 13 15.78-2.779
48 17 16.78 0.2186
49 18 17.37 0.6344
50 17 17.35-0.3506
51 11 14.7-3.698
52 14 14.17-0.1737
53 13 15.63-2.634
54 15 14.54 0.4592
55 17 15 2.001
56 16 15.53 0.4704
57 15 15.78-0.7793
58 17 17.43-0.4313
59 16 14.64 1.358
60 16 15.78 0.2199
61 16 14.58 1.419
62 15 15.85-0.845
63 12 13.28-1.276
64 17 15.53 1.471
65 14 15.53-1.53
66 14 15.75-1.745
67 16 14.96 1.039
68 15 14.75 0.2486
69 15 17.24-2.241
70 13 15.57-2.57
71 17 16.03 0.971
72 15 14.94 0.06466
73 16 15.78 0.2207
74 14 15.03-1.026
75 15 13.92 1.081
76 17 14.49 2.512
77 16 15.64 0.3648
78 10 13.79-3.788
79 16 15.78 0.2207
80 17 15.72 1.285
81 17 15.78 1.22
82 20 16.14 3.861
83 17 16.32 0.6756
84 18 15.85 2.155
85 15 15.07-0.06597
86 14 12.96 1.039
87 15 15.63-0.6336
88 17 15.64 1.365
89 16 15.78 0.2207
90 17 16.93 0.07461
91 15 14.96 0.03891
92 16 15.64 0.3648
93 18 16.07 1.925
94 18 16.45 1.546
95 16 16.93-0.927
96 17 15.2 1.798
97 15 15.64-0.6352
98 13 16.14-3.14
99 17 16.6 0.401
100 16 15.25 0.7459
101 15 15.63-0.6344
102 16 15.63 0.3656
103 16 15.27 0.7259
104 13 15.57-2.57
105 15 15.32-0.3206
106 12 13.81-1.814
107 19 15.32 3.68
108 16 15.07 0.9348
109 16 15.42 0.5753
110 17 15.67 1.335
111 16 16.41-0.4142
112 14 15.57-1.569
113 15 15.38-0.3847
114 14 14.75-0.7514
115 16 15.57 0.4305
116 15 15.78-0.7793
117 17 16.65 0.3485
118 15 15-0.0002596
119 16 15.32 0.6802
120 16 15.57 0.4297
121 15 14.69 0.3144
122 15 15.32-0.319
123 11 12.21-1.207
124 16 15.53 0.4704
125 18 16.1 1.901
126 13 13.97-0.9724
127 11 13.85-2.853
128 18 17.5 0.5046
129 15 16.78-1.781
130 19 17.81 1.19
131 17 16.99 0.008101
132 13 15.32-2.321
133 14 15.27-1.275
134 16 15.63 0.3656
135 13 15.44-2.44
136 17 15.72 1.285
137 14 15.78-1.78
138 19 16.16 2.836
139 14 14.49-0.4875
140 16 15.63 0.3656
141 12 13.47-1.474
142 16 16.78-0.7806
143 16 15.32 0.6802
144 15 15.57-0.5703
145 12 15.03-3.026
146 15 15.64-0.6352
147 17 16.41 0.5866
148 13 15.42-2.425
149 15 13.35 1.651
150 18 15.57 2.43
151 15 14.38 0.6166
152 18 15.57 2.431
153 15 17.03-2.031
154 15 16.07-1.074
155 16 15.89 0.1109
156 13 13.87-0.8691
157 16 15.57 0.4297
158 13 15.78-2.779
159 16 13.74 2.256
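The interpolation column holds the in-sample fitted values and the residual column the corresponding prediction errors; a minimal sketch in R (again assuming the `fit` object from the sketch above):

# Fitted values (interpolation) and residuals (prediction errors) as tabulated above
interp <- fitted(fit)
res    <- residuals(fit)                  # actuals minus fitted values
head(cbind(Actuals = model.frame(fit)$TVDC,
           Interpolation = round(interp, 2),
           Residuals = round(res, 4)))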




Goldfeld-Quandt test for Heteroskedasticity
(Source: https://freestatistics.org/blog/index.php?pk=297868&T=5)
p-values by Alternative Hypothesis
breakpoint index	greater	2-sided	less
10 0.2643 0.5285 0.7357
11 0.1883 0.3766 0.8117
12 0.0979 0.1958 0.9021
13 0.05129 0.1026 0.9487
14 0.04752 0.09504 0.9525
15 0.06468 0.1294 0.9353
16 0.07076 0.1415 0.9292
17 0.04083 0.08166 0.9592
18 0.08694 0.1739 0.9131
19 0.06088 0.1218 0.9391
20 0.04672 0.09344 0.9533
21 0.03824 0.07647 0.9618
22 0.0233 0.0466 0.9767
23 0.02401 0.04802 0.976
24 0.02341 0.04682 0.9766
25 0.07271 0.1454 0.9273
26 0.05208 0.1042 0.9479
27 0.04421 0.08841 0.9558
28 0.0314 0.0628 0.9686
29 0.02062 0.04124 0.9794
30 0.02754 0.05508 0.9725
31 0.0407 0.0814 0.9593
32 0.0385 0.07701 0.9615
33 0.03687 0.07374 0.9631
34 0.02659 0.05317 0.9734
35 0.01842 0.03684 0.9816
36 0.01392 0.02785 0.9861
37 0.01156 0.02312 0.9884
38 0.03026 0.06052 0.9697
39 0.0259 0.0518 0.9741
40 0.01808 0.03616 0.9819
41 0.01899 0.03798 0.981
42 0.01598 0.03195 0.984
43 0.01171 0.02343 0.9883
44 0.008866 0.01773 0.9911
45 0.006085 0.01217 0.9939
46 0.005459 0.01092 0.9945
47 0.0164 0.0328 0.9836
48 0.01207 0.02413 0.9879
49 0.009542 0.01908 0.9905
50 0.006712 0.01342 0.9933
51 0.04616 0.09232 0.9538
52 0.0351 0.07021 0.9649
53 0.06389 0.1278 0.9361
54 0.05166 0.1033 0.9483
55 0.06345 0.1269 0.9365
56 0.05288 0.1058 0.9471
57 0.04304 0.08607 0.957
58 0.03334 0.06667 0.9667
59 0.02958 0.05916 0.9704
60 0.0223 0.0446 0.9777
61 0.02189 0.04378 0.9781
62 0.01751 0.03501 0.9825
63 0.01679 0.03358 0.9832
64 0.01958 0.03915 0.9804
65 0.02101 0.04202 0.979
66 0.02185 0.0437 0.9781
67 0.01859 0.03719 0.9814
68 0.01492 0.02984 0.9851
69 0.02157 0.04314 0.9784
70 0.04356 0.08712 0.9564
71 0.04079 0.08159 0.9592
72 0.03578 0.07156 0.9642
73 0.02795 0.0559 0.9721
74 0.02501 0.05001 0.975
75 0.02221 0.04442 0.9778
76 0.04673 0.09347 0.9533
77 0.03747 0.07493 0.9625
78 0.1801 0.3602 0.8199
79 0.1527 0.3054 0.8473
80 0.1476 0.2952 0.8524
81 0.1415 0.2829 0.8585
82 0.3775 0.755 0.6225
83 0.3418 0.6835 0.6582
84 0.3983 0.7966 0.6017
85 0.3591 0.7181 0.6409
86 0.3315 0.6629 0.6685
87 0.298 0.5959 0.702
88 0.2991 0.5983 0.7009
89 0.2603 0.5205 0.7397
90 0.2239 0.4477 0.7761
91 0.1926 0.3853 0.8074
92 0.1657 0.3314 0.8343
93 0.1922 0.3844 0.8078
94 0.2032 0.4064 0.7968
95 0.1821 0.3641 0.8179
96 0.1989 0.3979 0.8011
97 0.1716 0.3432 0.8284
98 0.2868 0.5735 0.7132
99 0.2504 0.5008 0.7496
100 0.221 0.442 0.779
101 0.1911 0.3822 0.8089
102 0.1623 0.3245 0.8377
103 0.1394 0.2789 0.8606
104 0.1976 0.3952 0.8024
105 0.1659 0.3319 0.8341
106 0.1793 0.3585 0.8207
107 0.4167 0.8334 0.5833
108 0.3998 0.7997 0.6002
109 0.3757 0.7515 0.6243
110 0.3635 0.7271 0.6365
111 0.326 0.652 0.674
112 0.3314 0.6628 0.6686
113 0.2863 0.5727 0.7137
114 0.2523 0.5047 0.7477
115 0.218 0.436 0.782
116 0.196 0.392 0.804
117 0.1743 0.3485 0.8257
118 0.1463 0.2926 0.8537
119 0.1298 0.2596 0.8702
120 0.113 0.2259 0.887
121 0.1063 0.2127 0.8937
122 0.083 0.166 0.917
123 0.07757 0.1551 0.9224
124 0.06004 0.1201 0.94
125 0.06318 0.1264 0.9368
126 0.06996 0.1399 0.93
127 0.1412 0.2824 0.8588
128 0.1177 0.2355 0.8823
129 0.1003 0.2006 0.8997
130 0.08368 0.1674 0.9163
131 0.07579 0.1516 0.9242
132 0.0734 0.1468 0.9266
133 0.06428 0.1286 0.9357
134 0.04642 0.09284 0.9536
135 0.05062 0.1012 0.9494
136 0.0599 0.1198 0.9401
137 0.05319 0.1064 0.9468
138 0.1037 0.2074 0.8963
139 0.1335 0.267 0.8665
140 0.09517 0.1903 0.9048
141 0.09973 0.1995 0.9003
142 0.07248 0.145 0.9275
143 0.06999 0.14 0.93
144 0.04402 0.08805 0.956
145 0.03815 0.0763 0.9618
146 0.02227 0.04454 0.9777
147 0.01226 0.02452 0.9877
148 0.1264 0.2528 0.8736
149 0.6831 0.6337 0.3169
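Each row reports the p-values of a Goldfeld-Quandt test with the sample split at the indicated observation, for the three possible alternatives. A minimal sketch using the lmtest package (assuming the `dat` data frame from above; the module's own implementation may differ in detail, so the values need not match exactly):

# Goldfeld-Quandt p-values over a range of breakpoints (sketch; uses lmtest::gqtest)
library(lmtest)
for (bp in 10:149) {
  p <- sapply(c("greater", "two.sided", "less"), function(alt)
    gqtest(TVDC ~ SK1 + SK2 + SK3 + SK4 + SK5 + SK6, data = dat,
           point = bp, alternative = alt)$p.value)
  cat(bp, format(p, digits = 4), "\n")
}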

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
10 &  0.2643 &  0.5285 &  0.7357 \tabularnewline
11 &  0.1883 &  0.3766 &  0.8117 \tabularnewline
12 &  0.0979 &  0.1958 &  0.9021 \tabularnewline
13 &  0.05129 &  0.1026 &  0.9487 \tabularnewline
14 &  0.04752 &  0.09504 &  0.9525 \tabularnewline
15 &  0.06468 &  0.1294 &  0.9353 \tabularnewline
16 &  0.07076 &  0.1415 &  0.9292 \tabularnewline
17 &  0.04083 &  0.08166 &  0.9592 \tabularnewline
18 &  0.08694 &  0.1739 &  0.9131 \tabularnewline
19 &  0.06088 &  0.1218 &  0.9391 \tabularnewline
20 &  0.04672 &  0.09344 &  0.9533 \tabularnewline
21 &  0.03824 &  0.07647 &  0.9618 \tabularnewline
22 &  0.0233 &  0.0466 &  0.9767 \tabularnewline
23 &  0.02401 &  0.04802 &  0.976 \tabularnewline
24 &  0.02341 &  0.04682 &  0.9766 \tabularnewline
25 &  0.07271 &  0.1454 &  0.9273 \tabularnewline
26 &  0.05208 &  0.1042 &  0.9479 \tabularnewline
27 &  0.04421 &  0.08841 &  0.9558 \tabularnewline
28 &  0.0314 &  0.0628 &  0.9686 \tabularnewline
29 &  0.02062 &  0.04124 &  0.9794 \tabularnewline
30 &  0.02754 &  0.05508 &  0.9725 \tabularnewline
31 &  0.0407 &  0.0814 &  0.9593 \tabularnewline
32 &  0.0385 &  0.07701 &  0.9615 \tabularnewline
33 &  0.03687 &  0.07374 &  0.9631 \tabularnewline
34 &  0.02659 &  0.05317 &  0.9734 \tabularnewline
35 &  0.01842 &  0.03684 &  0.9816 \tabularnewline
36 &  0.01392 &  0.02785 &  0.9861 \tabularnewline
37 &  0.01156 &  0.02312 &  0.9884 \tabularnewline
38 &  0.03026 &  0.06052 &  0.9697 \tabularnewline
39 &  0.0259 &  0.0518 &  0.9741 \tabularnewline
40 &  0.01808 &  0.03616 &  0.9819 \tabularnewline
41 &  0.01899 &  0.03798 &  0.981 \tabularnewline
42 &  0.01598 &  0.03195 &  0.984 \tabularnewline
43 &  0.01171 &  0.02343 &  0.9883 \tabularnewline
44 &  0.008866 &  0.01773 &  0.9911 \tabularnewline
45 &  0.006085 &  0.01217 &  0.9939 \tabularnewline
46 &  0.005459 &  0.01092 &  0.9945 \tabularnewline
47 &  0.0164 &  0.0328 &  0.9836 \tabularnewline
48 &  0.01207 &  0.02413 &  0.9879 \tabularnewline
49 &  0.009542 &  0.01908 &  0.9905 \tabularnewline
50 &  0.006712 &  0.01342 &  0.9933 \tabularnewline
51 &  0.04616 &  0.09232 &  0.9538 \tabularnewline
52 &  0.0351 &  0.07021 &  0.9649 \tabularnewline
53 &  0.06389 &  0.1278 &  0.9361 \tabularnewline
54 &  0.05166 &  0.1033 &  0.9483 \tabularnewline
55 &  0.06345 &  0.1269 &  0.9365 \tabularnewline
56 &  0.05288 &  0.1058 &  0.9471 \tabularnewline
57 &  0.04304 &  0.08607 &  0.957 \tabularnewline
58 &  0.03334 &  0.06667 &  0.9667 \tabularnewline
59 &  0.02958 &  0.05916 &  0.9704 \tabularnewline
60 &  0.0223 &  0.0446 &  0.9777 \tabularnewline
61 &  0.02189 &  0.04378 &  0.9781 \tabularnewline
62 &  0.01751 &  0.03501 &  0.9825 \tabularnewline
63 &  0.01679 &  0.03358 &  0.9832 \tabularnewline
64 &  0.01958 &  0.03915 &  0.9804 \tabularnewline
65 &  0.02101 &  0.04202 &  0.979 \tabularnewline
66 &  0.02185 &  0.0437 &  0.9781 \tabularnewline
67 &  0.01859 &  0.03719 &  0.9814 \tabularnewline
68 &  0.01492 &  0.02984 &  0.9851 \tabularnewline
69 &  0.02157 &  0.04314 &  0.9784 \tabularnewline
70 &  0.04356 &  0.08712 &  0.9564 \tabularnewline
71 &  0.04079 &  0.08159 &  0.9592 \tabularnewline
72 &  0.03578 &  0.07156 &  0.9642 \tabularnewline
73 &  0.02795 &  0.0559 &  0.9721 \tabularnewline
74 &  0.02501 &  0.05001 &  0.975 \tabularnewline
75 &  0.02221 &  0.04442 &  0.9778 \tabularnewline
76 &  0.04673 &  0.09347 &  0.9533 \tabularnewline
77 &  0.03747 &  0.07493 &  0.9625 \tabularnewline
78 &  0.1801 &  0.3602 &  0.8199 \tabularnewline
79 &  0.1527 &  0.3054 &  0.8473 \tabularnewline
80 &  0.1476 &  0.2952 &  0.8524 \tabularnewline
81 &  0.1415 &  0.2829 &  0.8585 \tabularnewline
82 &  0.3775 &  0.755 &  0.6225 \tabularnewline
83 &  0.3418 &  0.6835 &  0.6582 \tabularnewline
84 &  0.3983 &  0.7966 &  0.6017 \tabularnewline
85 &  0.3591 &  0.7181 &  0.6409 \tabularnewline
86 &  0.3315 &  0.6629 &  0.6685 \tabularnewline
87 &  0.298 &  0.5959 &  0.702 \tabularnewline
88 &  0.2991 &  0.5983 &  0.7009 \tabularnewline
89 &  0.2603 &  0.5205 &  0.7397 \tabularnewline
90 &  0.2239 &  0.4477 &  0.7761 \tabularnewline
91 &  0.1926 &  0.3853 &  0.8074 \tabularnewline
92 &  0.1657 &  0.3314 &  0.8343 \tabularnewline
93 &  0.1922 &  0.3844 &  0.8078 \tabularnewline
94 &  0.2032 &  0.4064 &  0.7968 \tabularnewline
95 &  0.1821 &  0.3641 &  0.8179 \tabularnewline
96 &  0.1989 &  0.3979 &  0.8011 \tabularnewline
97 &  0.1716 &  0.3432 &  0.8284 \tabularnewline
98 &  0.2868 &  0.5735 &  0.7132 \tabularnewline
99 &  0.2504 &  0.5008 &  0.7496 \tabularnewline
100 &  0.221 &  0.442 &  0.779 \tabularnewline
101 &  0.1911 &  0.3822 &  0.8089 \tabularnewline
102 &  0.1623 &  0.3245 &  0.8377 \tabularnewline
103 &  0.1394 &  0.2789 &  0.8606 \tabularnewline
104 &  0.1976 &  0.3952 &  0.8024 \tabularnewline
105 &  0.1659 &  0.3319 &  0.8341 \tabularnewline
106 &  0.1793 &  0.3585 &  0.8207 \tabularnewline
107 &  0.4167 &  0.8334 &  0.5833 \tabularnewline
108 &  0.3998 &  0.7997 &  0.6002 \tabularnewline
109 &  0.3757 &  0.7515 &  0.6243 \tabularnewline
110 &  0.3635 &  0.7271 &  0.6365 \tabularnewline
111 &  0.326 &  0.652 &  0.674 \tabularnewline
112 &  0.3314 &  0.6628 &  0.6686 \tabularnewline
113 &  0.2863 &  0.5727 &  0.7137 \tabularnewline
114 &  0.2523 &  0.5047 &  0.7477 \tabularnewline
115 &  0.218 &  0.436 &  0.782 \tabularnewline
116 &  0.196 &  0.392 &  0.804 \tabularnewline
117 &  0.1743 &  0.3485 &  0.8257 \tabularnewline
118 &  0.1463 &  0.2926 &  0.8537 \tabularnewline
119 &  0.1298 &  0.2596 &  0.8702 \tabularnewline
120 &  0.113 &  0.2259 &  0.887 \tabularnewline
121 &  0.1063 &  0.2127 &  0.8937 \tabularnewline
122 &  0.083 &  0.166 &  0.917 \tabularnewline
123 &  0.07757 &  0.1551 &  0.9224 \tabularnewline
124 &  0.06004 &  0.1201 &  0.94 \tabularnewline
125 &  0.06318 &  0.1264 &  0.9368 \tabularnewline
126 &  0.06996 &  0.1399 &  0.93 \tabularnewline
127 &  0.1412 &  0.2824 &  0.8588 \tabularnewline
128 &  0.1177 &  0.2355 &  0.8823 \tabularnewline
129 &  0.1003 &  0.2006 &  0.8997 \tabularnewline
130 &  0.08368 &  0.1674 &  0.9163 \tabularnewline
131 &  0.07579 &  0.1516 &  0.9242 \tabularnewline
132 &  0.0734 &  0.1468 &  0.9266 \tabularnewline
133 &  0.06428 &  0.1286 &  0.9357 \tabularnewline
134 &  0.04642 &  0.09284 &  0.9536 \tabularnewline
135 &  0.05062 &  0.1012 &  0.9494 \tabularnewline
136 &  0.0599 &  0.1198 &  0.9401 \tabularnewline
137 &  0.05319 &  0.1064 &  0.9468 \tabularnewline
138 &  0.1037 &  0.2074 &  0.8963 \tabularnewline
139 &  0.1335 &  0.267 &  0.8665 \tabularnewline
140 &  0.09517 &  0.1903 &  0.9048 \tabularnewline
141 &  0.09973 &  0.1995 &  0.9003 \tabularnewline
142 &  0.07248 &  0.145 &  0.9275 \tabularnewline
143 &  0.06999 &  0.14 &  0.93 \tabularnewline
144 &  0.04402 &  0.08805 &  0.956 \tabularnewline
145 &  0.03815 &  0.0763 &  0.9618 \tabularnewline
146 &  0.02227 &  0.04454 &  0.9777 \tabularnewline
147 &  0.01226 &  0.02452 &  0.9877 \tabularnewline
148 &  0.1264 &  0.2528 &  0.8736 \tabularnewline
149 &  0.6831 &  0.6337 &  0.3169 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297868&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]10[/C][C] 0.2643[/C][C] 0.5285[/C][C] 0.7357[/C][/ROW]
[ROW][C]11[/C][C] 0.1883[/C][C] 0.3766[/C][C] 0.8117[/C][/ROW]
[ROW][C]12[/C][C] 0.0979[/C][C] 0.1958[/C][C] 0.9021[/C][/ROW]
[ROW][C]13[/C][C] 0.05129[/C][C] 0.1026[/C][C] 0.9487[/C][/ROW]
[ROW][C]14[/C][C] 0.04752[/C][C] 0.09504[/C][C] 0.9525[/C][/ROW]
[ROW][C]15[/C][C] 0.06468[/C][C] 0.1294[/C][C] 0.9353[/C][/ROW]
[ROW][C]16[/C][C] 0.07076[/C][C] 0.1415[/C][C] 0.9292[/C][/ROW]
[ROW][C]17[/C][C] 0.04083[/C][C] 0.08166[/C][C] 0.9592[/C][/ROW]
[ROW][C]18[/C][C] 0.08694[/C][C] 0.1739[/C][C] 0.9131[/C][/ROW]
[ROW][C]19[/C][C] 0.06088[/C][C] 0.1218[/C][C] 0.9391[/C][/ROW]
[ROW][C]20[/C][C] 0.04672[/C][C] 0.09344[/C][C] 0.9533[/C][/ROW]
[ROW][C]21[/C][C] 0.03824[/C][C] 0.07647[/C][C] 0.9618[/C][/ROW]
[ROW][C]22[/C][C] 0.0233[/C][C] 0.0466[/C][C] 0.9767[/C][/ROW]
[ROW][C]23[/C][C] 0.02401[/C][C] 0.04802[/C][C] 0.976[/C][/ROW]
[ROW][C]24[/C][C] 0.02341[/C][C] 0.04682[/C][C] 0.9766[/C][/ROW]
[ROW][C]25[/C][C] 0.07271[/C][C] 0.1454[/C][C] 0.9273[/C][/ROW]
[ROW][C]26[/C][C] 0.05208[/C][C] 0.1042[/C][C] 0.9479[/C][/ROW]
[ROW][C]27[/C][C] 0.04421[/C][C] 0.08841[/C][C] 0.9558[/C][/ROW]
[ROW][C]28[/C][C] 0.0314[/C][C] 0.0628[/C][C] 0.9686[/C][/ROW]
[ROW][C]29[/C][C] 0.02062[/C][C] 0.04124[/C][C] 0.9794[/C][/ROW]
[ROW][C]30[/C][C] 0.02754[/C][C] 0.05508[/C][C] 0.9725[/C][/ROW]
[ROW][C]31[/C][C] 0.0407[/C][C] 0.0814[/C][C] 0.9593[/C][/ROW]
[ROW][C]32[/C][C] 0.0385[/C][C] 0.07701[/C][C] 0.9615[/C][/ROW]
[ROW][C]33[/C][C] 0.03687[/C][C] 0.07374[/C][C] 0.9631[/C][/ROW]
[ROW][C]34[/C][C] 0.02659[/C][C] 0.05317[/C][C] 0.9734[/C][/ROW]
[ROW][C]35[/C][C] 0.01842[/C][C] 0.03684[/C][C] 0.9816[/C][/ROW]
[ROW][C]36[/C][C] 0.01392[/C][C] 0.02785[/C][C] 0.9861[/C][/ROW]
[ROW][C]37[/C][C] 0.01156[/C][C] 0.02312[/C][C] 0.9884[/C][/ROW]
[ROW][C]38[/C][C] 0.03026[/C][C] 0.06052[/C][C] 0.9697[/C][/ROW]
[ROW][C]39[/C][C] 0.0259[/C][C] 0.0518[/C][C] 0.9741[/C][/ROW]
[ROW][C]40[/C][C] 0.01808[/C][C] 0.03616[/C][C] 0.9819[/C][/ROW]
[ROW][C]41[/C][C] 0.01899[/C][C] 0.03798[/C][C] 0.981[/C][/ROW]
[ROW][C]42[/C][C] 0.01598[/C][C] 0.03195[/C][C] 0.984[/C][/ROW]
[ROW][C]43[/C][C] 0.01171[/C][C] 0.02343[/C][C] 0.9883[/C][/ROW]
[ROW][C]44[/C][C] 0.008866[/C][C] 0.01773[/C][C] 0.9911[/C][/ROW]
[ROW][C]45[/C][C] 0.006085[/C][C] 0.01217[/C][C] 0.9939[/C][/ROW]
[ROW][C]46[/C][C] 0.005459[/C][C] 0.01092[/C][C] 0.9945[/C][/ROW]
[ROW][C]47[/C][C] 0.0164[/C][C] 0.0328[/C][C] 0.9836[/C][/ROW]
[ROW][C]48[/C][C] 0.01207[/C][C] 0.02413[/C][C] 0.9879[/C][/ROW]
[ROW][C]49[/C][C] 0.009542[/C][C] 0.01908[/C][C] 0.9905[/C][/ROW]
[ROW][C]50[/C][C] 0.006712[/C][C] 0.01342[/C][C] 0.9933[/C][/ROW]
[ROW][C]51[/C][C] 0.04616[/C][C] 0.09232[/C][C] 0.9538[/C][/ROW]
[ROW][C]52[/C][C] 0.0351[/C][C] 0.07021[/C][C] 0.9649[/C][/ROW]
[ROW][C]53[/C][C] 0.06389[/C][C] 0.1278[/C][C] 0.9361[/C][/ROW]
[ROW][C]54[/C][C] 0.05166[/C][C] 0.1033[/C][C] 0.9483[/C][/ROW]
[ROW][C]55[/C][C] 0.06345[/C][C] 0.1269[/C][C] 0.9365[/C][/ROW]
[ROW][C]56[/C][C] 0.05288[/C][C] 0.1058[/C][C] 0.9471[/C][/ROW]
[ROW][C]57[/C][C] 0.04304[/C][C] 0.08607[/C][C] 0.957[/C][/ROW]
[ROW][C]58[/C][C] 0.03334[/C][C] 0.06667[/C][C] 0.9667[/C][/ROW]
[ROW][C]59[/C][C] 0.02958[/C][C] 0.05916[/C][C] 0.9704[/C][/ROW]
[ROW][C]60[/C][C] 0.0223[/C][C] 0.0446[/C][C] 0.9777[/C][/ROW]
[ROW][C]61[/C][C] 0.02189[/C][C] 0.04378[/C][C] 0.9781[/C][/ROW]
[ROW][C]62[/C][C] 0.01751[/C][C] 0.03501[/C][C] 0.9825[/C][/ROW]
[ROW][C]63[/C][C] 0.01679[/C][C] 0.03358[/C][C] 0.9832[/C][/ROW]
[ROW][C]64[/C][C] 0.01958[/C][C] 0.03915[/C][C] 0.9804[/C][/ROW]
[ROW][C]65[/C][C] 0.02101[/C][C] 0.04202[/C][C] 0.979[/C][/ROW]
[ROW][C]66[/C][C] 0.02185[/C][C] 0.0437[/C][C] 0.9781[/C][/ROW]
[ROW][C]67[/C][C] 0.01859[/C][C] 0.03719[/C][C] 0.9814[/C][/ROW]
[ROW][C]68[/C][C] 0.01492[/C][C] 0.02984[/C][C] 0.9851[/C][/ROW]
[ROW][C]69[/C][C] 0.02157[/C][C] 0.04314[/C][C] 0.9784[/C][/ROW]
[ROW][C]70[/C][C] 0.04356[/C][C] 0.08712[/C][C] 0.9564[/C][/ROW]
[ROW][C]71[/C][C] 0.04079[/C][C] 0.08159[/C][C] 0.9592[/C][/ROW]
[ROW][C]72[/C][C] 0.03578[/C][C] 0.07156[/C][C] 0.9642[/C][/ROW]
[ROW][C]73[/C][C] 0.02795[/C][C] 0.0559[/C][C] 0.9721[/C][/ROW]
[ROW][C]74[/C][C] 0.02501[/C][C] 0.05001[/C][C] 0.975[/C][/ROW]
[ROW][C]75[/C][C] 0.02221[/C][C] 0.04442[/C][C] 0.9778[/C][/ROW]
[ROW][C]76[/C][C] 0.04673[/C][C] 0.09347[/C][C] 0.9533[/C][/ROW]
[ROW][C]77[/C][C] 0.03747[/C][C] 0.07493[/C][C] 0.9625[/C][/ROW]
[ROW][C]78[/C][C] 0.1801[/C][C] 0.3602[/C][C] 0.8199[/C][/ROW]
[ROW][C]79[/C][C] 0.1527[/C][C] 0.3054[/C][C] 0.8473[/C][/ROW]
[ROW][C]80[/C][C] 0.1476[/C][C] 0.2952[/C][C] 0.8524[/C][/ROW]
[ROW][C]81[/C][C] 0.1415[/C][C] 0.2829[/C][C] 0.8585[/C][/ROW]
[ROW][C]82[/C][C] 0.3775[/C][C] 0.755[/C][C] 0.6225[/C][/ROW]
[ROW][C]83[/C][C] 0.3418[/C][C] 0.6835[/C][C] 0.6582[/C][/ROW]
[ROW][C]84[/C][C] 0.3983[/C][C] 0.7966[/C][C] 0.6017[/C][/ROW]
[ROW][C]85[/C][C] 0.3591[/C][C] 0.7181[/C][C] 0.6409[/C][/ROW]
[ROW][C]86[/C][C] 0.3315[/C][C] 0.6629[/C][C] 0.6685[/C][/ROW]
[ROW][C]87[/C][C] 0.298[/C][C] 0.5959[/C][C] 0.702[/C][/ROW]
[ROW][C]88[/C][C] 0.2991[/C][C] 0.5983[/C][C] 0.7009[/C][/ROW]
[ROW][C]89[/C][C] 0.2603[/C][C] 0.5205[/C][C] 0.7397[/C][/ROW]
[ROW][C]90[/C][C] 0.2239[/C][C] 0.4477[/C][C] 0.7761[/C][/ROW]
[ROW][C]91[/C][C] 0.1926[/C][C] 0.3853[/C][C] 0.8074[/C][/ROW]
[ROW][C]92[/C][C] 0.1657[/C][C] 0.3314[/C][C] 0.8343[/C][/ROW]
[ROW][C]93[/C][C] 0.1922[/C][C] 0.3844[/C][C] 0.8078[/C][/ROW]
[ROW][C]94[/C][C] 0.2032[/C][C] 0.4064[/C][C] 0.7968[/C][/ROW]
[ROW][C]95[/C][C] 0.1821[/C][C] 0.3641[/C][C] 0.8179[/C][/ROW]
[ROW][C]96[/C][C] 0.1989[/C][C] 0.3979[/C][C] 0.8011[/C][/ROW]
[ROW][C]97[/C][C] 0.1716[/C][C] 0.3432[/C][C] 0.8284[/C][/ROW]
[ROW][C]98[/C][C] 0.2868[/C][C] 0.5735[/C][C] 0.7132[/C][/ROW]
[ROW][C]99[/C][C] 0.2504[/C][C] 0.5008[/C][C] 0.7496[/C][/ROW]
[ROW][C]100[/C][C] 0.221[/C][C] 0.442[/C][C] 0.779[/C][/ROW]
[ROW][C]101[/C][C] 0.1911[/C][C] 0.3822[/C][C] 0.8089[/C][/ROW]
[ROW][C]102[/C][C] 0.1623[/C][C] 0.3245[/C][C] 0.8377[/C][/ROW]
[ROW][C]103[/C][C] 0.1394[/C][C] 0.2789[/C][C] 0.8606[/C][/ROW]
[ROW][C]104[/C][C] 0.1976[/C][C] 0.3952[/C][C] 0.8024[/C][/ROW]
[ROW][C]105[/C][C] 0.1659[/C][C] 0.3319[/C][C] 0.8341[/C][/ROW]
[ROW][C]106[/C][C] 0.1793[/C][C] 0.3585[/C][C] 0.8207[/C][/ROW]
[ROW][C]107[/C][C] 0.4167[/C][C] 0.8334[/C][C] 0.5833[/C][/ROW]
[ROW][C]108[/C][C] 0.3998[/C][C] 0.7997[/C][C] 0.6002[/C][/ROW]
[ROW][C]109[/C][C] 0.3757[/C][C] 0.7515[/C][C] 0.6243[/C][/ROW]
[ROW][C]110[/C][C] 0.3635[/C][C] 0.7271[/C][C] 0.6365[/C][/ROW]
[ROW][C]111[/C][C] 0.326[/C][C] 0.652[/C][C] 0.674[/C][/ROW]
[ROW][C]112[/C][C] 0.3314[/C][C] 0.6628[/C][C] 0.6686[/C][/ROW]
[ROW][C]113[/C][C] 0.2863[/C][C] 0.5727[/C][C] 0.7137[/C][/ROW]
[ROW][C]114[/C][C] 0.2523[/C][C] 0.5047[/C][C] 0.7477[/C][/ROW]
[ROW][C]115[/C][C] 0.218[/C][C] 0.436[/C][C] 0.782[/C][/ROW]
[ROW][C]116[/C][C] 0.196[/C][C] 0.392[/C][C] 0.804[/C][/ROW]
[ROW][C]117[/C][C] 0.1743[/C][C] 0.3485[/C][C] 0.8257[/C][/ROW]
[ROW][C]118[/C][C] 0.1463[/C][C] 0.2926[/C][C] 0.8537[/C][/ROW]
[ROW][C]119[/C][C] 0.1298[/C][C] 0.2596[/C][C] 0.8702[/C][/ROW]
[ROW][C]120[/C][C] 0.113[/C][C] 0.2259[/C][C] 0.887[/C][/ROW]
[ROW][C]121[/C][C] 0.1063[/C][C] 0.2127[/C][C] 0.8937[/C][/ROW]
[ROW][C]122[/C][C] 0.083[/C][C] 0.166[/C][C] 0.917[/C][/ROW]
[ROW][C]123[/C][C] 0.07757[/C][C] 0.1551[/C][C] 0.9224[/C][/ROW]
[ROW][C]124[/C][C] 0.06004[/C][C] 0.1201[/C][C] 0.94[/C][/ROW]
[ROW][C]125[/C][C] 0.06318[/C][C] 0.1264[/C][C] 0.9368[/C][/ROW]
[ROW][C]126[/C][C] 0.06996[/C][C] 0.1399[/C][C] 0.93[/C][/ROW]
[ROW][C]127[/C][C] 0.1412[/C][C] 0.2824[/C][C] 0.8588[/C][/ROW]
[ROW][C]128[/C][C] 0.1177[/C][C] 0.2355[/C][C] 0.8823[/C][/ROW]
[ROW][C]129[/C][C] 0.1003[/C][C] 0.2006[/C][C] 0.8997[/C][/ROW]
[ROW][C]130[/C][C] 0.08368[/C][C] 0.1674[/C][C] 0.9163[/C][/ROW]
[ROW][C]131[/C][C] 0.07579[/C][C] 0.1516[/C][C] 0.9242[/C][/ROW]
[ROW][C]132[/C][C] 0.0734[/C][C] 0.1468[/C][C] 0.9266[/C][/ROW]
[ROW][C]133[/C][C] 0.06428[/C][C] 0.1286[/C][C] 0.9357[/C][/ROW]
[ROW][C]134[/C][C] 0.04642[/C][C] 0.09284[/C][C] 0.9536[/C][/ROW]
[ROW][C]135[/C][C] 0.05062[/C][C] 0.1012[/C][C] 0.9494[/C][/ROW]
[ROW][C]136[/C][C] 0.0599[/C][C] 0.1198[/C][C] 0.9401[/C][/ROW]
[ROW][C]137[/C][C] 0.05319[/C][C] 0.1064[/C][C] 0.9468[/C][/ROW]
[ROW][C]138[/C][C] 0.1037[/C][C] 0.2074[/C][C] 0.8963[/C][/ROW]
[ROW][C]139[/C][C] 0.1335[/C][C] 0.267[/C][C] 0.8665[/C][/ROW]
[ROW][C]140[/C][C] 0.09517[/C][C] 0.1903[/C][C] 0.9048[/C][/ROW]
[ROW][C]141[/C][C] 0.09973[/C][C] 0.1995[/C][C] 0.9003[/C][/ROW]
[ROW][C]142[/C][C] 0.07248[/C][C] 0.145[/C][C] 0.9275[/C][/ROW]
[ROW][C]143[/C][C] 0.06999[/C][C] 0.14[/C][C] 0.93[/C][/ROW]
[ROW][C]144[/C][C] 0.04402[/C][C] 0.08805[/C][C] 0.956[/C][/ROW]
[ROW][C]145[/C][C] 0.03815[/C][C] 0.0763[/C][C] 0.9618[/C][/ROW]
[ROW][C]146[/C][C] 0.02227[/C][C] 0.04454[/C][C] 0.9777[/C][/ROW]
[ROW][C]147[/C][C] 0.01226[/C][C] 0.02452[/C][C] 0.9877[/C][/ROW]
[ROW][C]148[/C][C] 0.1264[/C][C] 0.2528[/C][C] 0.8736[/C][/ROW]
[ROW][C]149[/C][C] 0.6831[/C][C] 0.6337[/C][C] 0.3169[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297868&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297868&T=5

As an alternative you can also use a QR Code:  

The GUIDs for individual cells are displayed in the table below:

Goldfeld-Quandt test for Heteroskedasticity
p-valuesAlternative Hypothesis
breakpoint indexgreater2-sidedless
10 0.2643 0.5285 0.7357
11 0.1883 0.3766 0.8117
12 0.0979 0.1958 0.9021
13 0.05129 0.1026 0.9487
14 0.04752 0.09504 0.9525
15 0.06468 0.1294 0.9353
16 0.07076 0.1415 0.9292
17 0.04083 0.08166 0.9592
18 0.08694 0.1739 0.9131
19 0.06088 0.1218 0.9391
20 0.04672 0.09344 0.9533
21 0.03824 0.07647 0.9618
22 0.0233 0.0466 0.9767
23 0.02401 0.04802 0.976
24 0.02341 0.04682 0.9766
25 0.07271 0.1454 0.9273
26 0.05208 0.1042 0.9479
27 0.04421 0.08841 0.9558
28 0.0314 0.0628 0.9686
29 0.02062 0.04124 0.9794
30 0.02754 0.05508 0.9725
31 0.0407 0.0814 0.9593
32 0.0385 0.07701 0.9615
33 0.03687 0.07374 0.9631
34 0.02659 0.05317 0.9734
35 0.01842 0.03684 0.9816
36 0.01392 0.02785 0.9861
37 0.01156 0.02312 0.9884
38 0.03026 0.06052 0.9697
39 0.0259 0.0518 0.9741
40 0.01808 0.03616 0.9819
41 0.01899 0.03798 0.981
42 0.01598 0.03195 0.984
43 0.01171 0.02343 0.9883
44 0.008866 0.01773 0.9911
45 0.006085 0.01217 0.9939
46 0.005459 0.01092 0.9945
47 0.0164 0.0328 0.9836
48 0.01207 0.02413 0.9879
49 0.009542 0.01908 0.9905
50 0.006712 0.01342 0.9933
51 0.04616 0.09232 0.9538
52 0.0351 0.07021 0.9649
53 0.06389 0.1278 0.9361
54 0.05166 0.1033 0.9483
55 0.06345 0.1269 0.9365
56 0.05288 0.1058 0.9471
57 0.04304 0.08607 0.957
58 0.03334 0.06667 0.9667
59 0.02958 0.05916 0.9704
60 0.0223 0.0446 0.9777
61 0.02189 0.04378 0.9781
62 0.01751 0.03501 0.9825
63 0.01679 0.03358 0.9832
64 0.01958 0.03915 0.9804
65 0.02101 0.04202 0.979
66 0.02185 0.0437 0.9781
67 0.01859 0.03719 0.9814
68 0.01492 0.02984 0.9851
69 0.02157 0.04314 0.9784
70 0.04356 0.08712 0.9564
71 0.04079 0.08159 0.9592
72 0.03578 0.07156 0.9642
73 0.02795 0.0559 0.9721
74 0.02501 0.05001 0.975
75 0.02221 0.04442 0.9778
76 0.04673 0.09347 0.9533
77 0.03747 0.07493 0.9625
78 0.1801 0.3602 0.8199
79 0.1527 0.3054 0.8473
80 0.1476 0.2952 0.8524
81 0.1415 0.2829 0.8585
82 0.3775 0.755 0.6225
83 0.3418 0.6835 0.6582
84 0.3983 0.7966 0.6017
85 0.3591 0.7181 0.6409
86 0.3315 0.6629 0.6685
87 0.298 0.5959 0.702
88 0.2991 0.5983 0.7009
89 0.2603 0.5205 0.7397
90 0.2239 0.4477 0.7761
91 0.1926 0.3853 0.8074
92 0.1657 0.3314 0.8343
93 0.1922 0.3844 0.8078
94 0.2032 0.4064 0.7968
95 0.1821 0.3641 0.8179
96 0.1989 0.3979 0.8011
97 0.1716 0.3432 0.8284
98 0.2868 0.5735 0.7132
99 0.2504 0.5008 0.7496
100 0.221 0.442 0.779
101 0.1911 0.3822 0.8089
102 0.1623 0.3245 0.8377
103 0.1394 0.2789 0.8606
104 0.1976 0.3952 0.8024
105 0.1659 0.3319 0.8341
106 0.1793 0.3585 0.8207
107 0.4167 0.8334 0.5833
108 0.3998 0.7997 0.6002
109 0.3757 0.7515 0.6243
110 0.3635 0.7271 0.6365
111 0.326 0.652 0.674
112 0.3314 0.6628 0.6686
113 0.2863 0.5727 0.7137
114 0.2523 0.5047 0.7477
115 0.218 0.436 0.782
116 0.196 0.392 0.804
117 0.1743 0.3485 0.8257
118 0.1463 0.2926 0.8537
119 0.1298 0.2596 0.8702
120 0.113 0.2259 0.887
121 0.1063 0.2127 0.8937
122 0.083 0.166 0.917
123 0.07757 0.1551 0.9224
124 0.06004 0.1201 0.94
125 0.06318 0.1264 0.9368
126 0.06996 0.1399 0.93
127 0.1412 0.2824 0.8588
128 0.1177 0.2355 0.8823
129 0.1003 0.2006 0.8997
130 0.08368 0.1674 0.9163
131 0.07579 0.1516 0.9242
132 0.0734 0.1468 0.9266
133 0.06428 0.1286 0.9357
134 0.04642 0.09284 0.9536
135 0.05062 0.1012 0.9494
136 0.0599 0.1198 0.9401
137 0.05319 0.1064 0.9468
138 0.1037 0.2074 0.8963
139 0.1335 0.267 0.8665
140 0.09517 0.1903 0.9048
141 0.09973 0.1995 0.9003
142 0.07248 0.145 0.9275
143 0.06999 0.14 0.93
144 0.04402 0.08805 0.956
145 0.03815 0.0763 0.9618
146 0.02227 0.04454 0.9777
147 0.01226 0.02452 0.9877
148 0.1264 0.2528 0.8736
149 0.6831 0.6337 0.3169
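
For reference, each row of this table can be reproduced with the gqtest function from the lmtest package; a minimal sketch, assuming the fitted model object mylm created in the R code listed at the bottom of this page:

library(lmtest)
# p-values at a single breakpoint (here index 10) under the three alternative hypotheses
sapply(c('greater', 'two.sided', 'less'),
       function(alt) gqtest(mylm, point = 10, alternative = alt)$p.value)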







Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description                # significant tests   % significant tests   OK/NOK
1% type I error level      0                     0                     OK
5% type I error level      31                    0.221429              NOK
10% type I error level     59                    0.421429              NOK

Of the 140 breakpoints tested (indices 10 through 149), none of the two-sided p-values falls below 0.01 (OK), while 31 (31/140 ≈ 0.2214) fall below 0.05 and 59 (59/140 ≈ 0.4214) fall below 0.10; both proportions clearly exceed the corresponding nominal type I error levels, hence the NOK verdicts.

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 &  0 & OK \tabularnewline
5% type I error level & 31 & 0.221429 & NOK \tabularnewline
10% type I error level & 59 & 0.421429 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=297868&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C] 0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]31[/C][C]0.221429[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]59[/C][C]0.421429[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=297868&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297868&T=6








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.5224, df1 = 2, df2 = 150, p-value = 0.2215
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.2819, df1 = 12, df2 = 140, p-value = 0.2356
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.143, df1 = 2, df2 = 150, p-value = 0.3216
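
The three blocks above correspond to the resettest function from the lmtest package with type 'fitted', 'regressor' and 'princomp', as in the R code listed at the bottom of this page; a minimal sketch, assuming the fitted model object mylm:

library(lmtest)
# F-tests for omitted powers (2 and 3) of the fitted values, the regressors, and the principal components
resettest(mylm, power = 2:3, type = 'fitted')
resettest(mylm, power = 2:3, type = 'regressor')
resettest(mylm, power = 2:3, type = 'princomp')

None of the three p-values (0.2215, 0.2356 and 0.3216) is significant at conventional levels, so the RESET tests give no indication of neglected non-linearity or omitted variables.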

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.5224, df1 = 2, df2 = 150, p-value = 0.2215
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.2819, df1 = 12, df2 = 140, p-value = 0.2356
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.143, df1 = 2, df2 = 150, p-value = 0.3216
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=297868&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.5224, df1 = 2, df2 = 150, p-value = 0.2215
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.2819, df1 = 12, df2 = 140, p-value = 0.2356
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.143, df1 = 2, df2 = 150, p-value = 0.3216
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=297868&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297868&T=7








Variance Inflation Factors (Multicollinearity)
> vif
     SK1      SK2      SK3      SK4      SK5      SK6 
1.093289 1.131316 1.044801 1.044030 1.044830 1.032498 
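
The variance inflation factor of a regressor equals 1/(1 - R_j^2), where R_j^2 is the R-squared from regressing that regressor on all of the other regressors; values this close to 1 indicate that multicollinearity among SK1-SK6 is negligible. The figures above come from the vif function in the car package; a minimal sketch, assuming the fitted model object mylm and the data frame df from the R code at the bottom of this page:

library(car)
# VIF of every regressor in the fitted multiple regression
vif(mylm)
# hand check for SK1 (column names taken from the VIF output above): regress it on the other regressors and apply 1/(1 - R^2)
1 / (1 - summary(lm(SK1 ~ SK2 + SK3 + SK4 + SK5 + SK6, data = df))$r.squared)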

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
     SK1      SK2      SK3      SK4      SK5      SK6 
1.093289 1.131316 1.044801 1.044030 1.044830 1.032498 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=297868&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
     SK1      SK2      SK3      SK4      SK5      SK6 
1.093289 1.131316 1.044801 1.044030 1.044830 1.032498 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=297868&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=297868&T=8




Parameters (Session):
par1 = 7 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 7 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
par5 <- '0'
par4 <- '0'
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '7'
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
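# transpose the uploaded data matrix and drop every observation (row) that contains a missing value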
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
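# move the endogenous variable (column par1) to the first column, followed by all remaining regressors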
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
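# optional transformation of all series: first differences, seasonal differences (s=12), or both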
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
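# append par4 ordinary lags and par5 seasonal (s=12) lags of the endogenous variable, if requested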
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
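# optional seasonal dummy variables (monthly or quarterly)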
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
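# Goldfeld-Quandt test at every admissible breakpoint (computed only when more than n25 = 25 observations are available)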
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
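# diagnostic plots: actuals and interpolation, residuals, histogram of studentized residuals, residual density plot, QQ plot, residual lag plot, residual ACF/PACF, and standard lm diagnostics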
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
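# assemble the output tables with the table.* helper functions loaded from the 'createtable' file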
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')