Free Statistics


Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Sat, 17 Dec 2016 16:32:28 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/17/t1481988967rnp12boxlsyxea7.htm/, Retrieved Thu, 02 May 2024 06:49:57 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=300857, Retrieved Thu, 02 May 2024 06:49:57 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 81
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
- [Multiple Regression] [Regressie] [2016-12-17 15:32:28] [695928fec7566687630f1ba48b31beaa] [Current]
Dataseries X:
3	5
4	5
5	5
4	4
4	5
5	5
5	4
5	NA
5	5
5	5
5	3
4	5
4	5
4	5
4	4
5	4
4	5
NA	NA
4	4
5	4
4	5
4	5
4	5
4	4
5	5
5	4
4	5
4	4
5	5
5	5
3	3
5	4
4	5
5	4
4	5
5	5
4	4
4	4
4	4
5	5
4	4
5	4
3	3
5	5
4	4
5	4
5	4
4	NA
3	4
NA	4
5	5
5	4
4	5
3	5
4	5
4	5
4	4
4	4
4	4
4	4
5	5
4	5
4	5
4	5
4	5
4	4
5	2
4	5
4	4
4	4
4	5
4	5
3	5
4	5
4	5
4	5
5	5
4	5
4	5
5	5
5	5
3	4
5	4
4	3
5	5
5	5
5	4
5	5
4	5
4	5
2	4
4	3
5	4
5	4
5	5
4	5
4	5
4	5
5	5
4	4
NA	5
5	5
4	5
4	4
4	4
5	5
4	5
5	5
4	5
4	5
4	4
3	4
4	4
3	3
5	4
4	4
4	5
5	5
4	5
4	5
4	5
4	4
4	5
4	5
5	4
4	5
4	5
5	4
4	5
4	5
3	5
4	5
4	4
4	5
3	3
4	5
5	3
3	3
4	5
5	5
5	4
4	5
4	5
4	5
3	5
5	5
4	5
5	5
4	5
4	5
3	4
4	5
4	5
4	5
3	5
4	5
5	5
3	5
4	4
5	5
4	5
5	4
4	4
4	5
4	5
4	4
4	5
3	4
4	5
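The two columns above are the satisfaction scores (Tevredenheid) and importance scores (Belang) used in the regression below; rows containing NA are excluded, leaving 164 complete cases. The page does not label the columns, so the column order in the following sketch (Tevredenheid first, matching the Actuals column of the residual table further down) is an assumption, as are the file name and the object name mydata. This is a minimal reading sketch, not the original module code.

# Minimal sketch (assumptions: file name, column order, object name mydata).
mydata <- read.table("dataseries_x.txt",
                     col.names = c("Tevredenheid", "Belang"),
                     na.strings = "NA")
mydata <- na.omit(mydata)   # drop the rows containing NA
nrow(mydata)                # 164 complete cases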




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 8 seconds
R Server: Big Analytics Cloud Computing Center


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300857&T=0









Multiple Linear Regression - Estimated Regression Equation
Tevredenheid[t] = 4.01814 + 0.0835813*Belang[t] - 0.400229*M1[t] - 0.400229*M2[t] - 0.0966049*M3[t] - 0.114515*M4[t] - 0.239462*M5[t] - 0.257372*M6[t] - 0.191914*M7[t] - 0.185944*M8[t] - 0.192422*M9[t] - 0.307692*M10[t] - 0.0321466*M11[t] + e[t]
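Belang[t] is the quantitative predictor and M1[t] through M11[t] appear to be monthly (seasonal) dummy variables added by the regression module; the twelfth month is the reference category absorbed by the intercept. Below is a hedged sketch of an equivalent fit in plain R, continuing from the mydata sketch above. The month assignment is a placeholder (the page does not show how observations map to months), and the object name mylm is reused from the R output further down.

# Sketch only, not the original module code.
# Assumes consecutive monthly observations with December as the reference month.
mydata$month <- factor(rep(month.abb, length.out = nrow(mydata)), levels = month.abb)
mydata$month <- relevel(mydata$month, ref = "Dec")
mylm <- lm(Tevredenheid ~ Belang + month, data = mydata)
coef(mylm)   # the Jan..Nov coefficients play the role of M1[t]..M11[t] above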


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300857&T=1









Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter   S.D.      T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +4.018      0.404     +9.9450e+00                   3.084e-18        1.542e-18
Belang        +0.08358    0.08262   +1.0120e+00                   0.3133           0.1567
M1            -0.4002     0.2489    -1.6080e+00                   0.1099           0.05496
M2            -0.4002     0.2489    -1.6080e+00                   0.1099           0.05496
M3            -0.0966     0.2484    -3.8890e-01                   0.6979           0.349
M4            -0.1145     0.2489    -4.6010e-01                   0.6461           0.3231
M5            -0.2395     0.2484    -9.6390e-01                   0.3366           0.1683
M6            -0.2574     0.2489    -1.0340e+00                   0.3028           0.1514
M7            -0.1919     0.2493    -7.6970e-01                   0.4427           0.2213
M8            -0.1859     0.2489    -7.4710e-01                   0.4562           0.2281
M9            -0.1924     0.2558    -7.5210e-01                   0.4532           0.2266
M10           -0.3077     0.253     -1.2160e+00                   0.2258           0.1129
M11           -0.03215    0.255     -1.2610e-01                   0.8998           0.4499
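In this table, S.D. is the standard error of each estimate, T-STAT is Parameter divided by S.D., the 2-tail p-value tests H0: parameter = 0 against a two-sided alternative with 151 residual degrees of freedom, and the 1-tail p-value is half of that. A quick check of the Belang row using only the numbers reported above:

# Recompute the Belang row from the reported estimate and standard error.
t_belang <- 0.08358 / 0.08262                                     # about 1.012
p_2tail  <- 2 * pt(abs(t_belang), df = 151, lower.tail = FALSE)   # about 0.313
p_1tail  <- p_2tail / 2                                           # about 0.157
c(t_belang, p_2tail, p_1tail)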


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300857&T=2









Multiple Linear Regression - Regression Statistics
Multiple R: 0.2115
R-squared: 0.04471
Adjusted R-squared: -0.03121
F-TEST (value): 0.589
F-TEST (DF numerator): 12
F-TEST (DF denominator): 151
p-value: 0.8487

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.645
Sum Squared Residuals: 62.82
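These statistics are mutually consistent with n = 164 observations and p = 12 regressors besides the intercept: adjusted R-squared equals 1 - (1 - R-squared)(n - 1)/(n - p - 1), the F statistic equals (R-squared/p) / ((1 - R-squared)/(n - p - 1)) with 12 and 151 degrees of freedom, and the residual standard deviation is the square root of the sum of squared residuals divided by 151. A short arithmetic check using only the values in the table:

r2 <- 0.04471; n <- 164; p <- 12
1 - (1 - r2) * (n - 1) / (n - p - 1)                  # about -0.0312 (Adjusted R-squared)
f <- (r2 / p) / ((1 - r2) / (n - p - 1)); f           # about 0.589  (F-TEST value)
pf(f, df1 = p, df2 = n - p - 1, lower.tail = FALSE)   # about 0.849  (p-value)
sqrt(62.82 / (n - p - 1))                             # about 0.645  (Residual Standard Deviation)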


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300857&T=3









Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 3 4.036-1.036
2 4 4.036-0.03582
3 5 4.339 0.6606
4 4 4.238-0.238
5 4 4.197-0.1966
6 5 4.179 0.8213
7 5 4.161 0.8394
8 5 4.25 0.7499
9 5 4.244 0.7564
10 5 3.961 1.039
11 4 4.404-0.4039
12 4 4.436-0.436
13 4 4.036-0.03582
14 4 3.952 0.04776
15 5 4.256 0.7441
16 4 4.322-0.3215
17 4 4.113-0.113
18 5 4.095 0.9049
19 4 4.244-0.2441
20 4 4.25-0.2501
21 4 4.244-0.2436
22 4 4.045-0.04478
23 5 4.404 0.5961
24 5 4.352 0.6475
25 4 4.036-0.03582
26 4 3.952 0.04776
27 5 4.339 0.6606
28 5 4.322 0.6785
29 3 4.029-1.029
30 5 4.095 0.9049
31 4 4.244-0.2441
32 5 4.167 0.8335
33 4 4.244-0.2436
34 5 4.128 0.8716
35 4 4.32-0.3203
36 4 4.352-0.3525
37 4 3.952 0.04776
38 5 4.036 0.9642
39 4 4.256-0.2559
40 5 4.238 0.762
41 3 4.029-1.029
42 5 4.179 0.8213
43 4 4.161-0.1606
44 5 4.167 0.8335
45 5 4.16 0.84
46 3 4.045-1.045
47 5 4.404 0.5961
48 5 4.352 0.6475
49 4 4.036-0.03582
50 3 4.036-1.036
51 4 4.339-0.3394
52 4 4.322-0.3215
53 4 4.113-0.113
54 4 4.095-0.0951
55 4 4.161-0.1606
56 4 4.167-0.1665
57 5 4.244 0.7564
58 4 4.128-0.1284
59 4 4.404-0.4039
60 4 4.436-0.436
61 4 4.036-0.03582
62 4 3.952 0.04776
63 5 4.089 0.9113
64 4 4.322-0.3215
65 4 4.113-0.113
66 4 4.095-0.0951
67 4 4.244-0.2441
68 4 4.25-0.2501
69 3 4.244-1.244
70 4 4.128-0.1284
71 4 4.404-0.4039
72 4 4.436-0.436
73 5 4.036 0.9642
74 4 4.036-0.03582
75 4 4.339-0.3394
76 5 4.322 0.6785
77 5 4.197 0.8034
78 3 4.095-1.095
79 5 4.161 0.8394
80 4 4.083-0.08294
81 5 4.244 0.7564
82 5 4.128 0.8716
83 5 4.32 0.6797
84 5 4.436 0.5639
85 4 4.036-0.03582
86 4 4.036-0.03582
87 2 4.256-2.256
88 4 4.154-0.1544
89 5 4.113 0.887
90 5 4.095 0.9049
91 5 4.244 0.7559
92 4 4.25-0.2501
93 4 4.244-0.2436
94 4 4.128-0.1284
95 5 4.404 0.5961
96 4 4.352-0.3525
97 5 4.036 0.9642
98 4 4.036-0.03582
99 4 4.256-0.2559
100 4 4.238-0.238
101 5 4.197 0.8034
102 4 4.179-0.1787
103 5 4.244 0.7559
104 4 4.25-0.2501
105 4 4.244-0.2436
106 4 4.045-0.04478
107 3 4.32-1.32
108 4 4.352-0.3525
109 3 3.869-0.8687
110 5 3.952 1.048
111 4 4.256-0.2559
112 4 4.322-0.3215
113 5 4.197 0.8034
114 4 4.179-0.1787
115 4 4.244-0.2441
116 4 4.25-0.2501
117 4 4.16-0.16
118 4 4.128-0.1284
119 4 4.404-0.4039
120 5 4.352 0.6475
121 4 4.036-0.03582
122 4 4.036-0.03582
123 5 4.256 0.7441
124 4 4.322-0.3215
125 4 4.197-0.1966
126 3 4.179-1.179
127 4 4.244-0.2441
128 4 4.167-0.1665
129 4 4.244-0.2436
130 3 3.961-0.9612
131 4 4.404-0.4039
132 5 4.269 0.7311
133 3 3.869-0.8687
134 4 4.036-0.03582
135 5 4.339 0.6606
136 5 4.238 0.762
137 4 4.197-0.1966
138 4 4.179-0.1787
139 4 4.244-0.2441
140 3 4.25-1.25
141 5 4.244 0.7564
142 4 4.128-0.1284
143 5 4.404 0.5961
144 4 4.436-0.436
145 4 4.036-0.03582
146 3 3.952-0.9522
147 4 4.339-0.3394
148 4 4.322-0.3215
149 4 4.197-0.1966
150 3 4.179-1.179
151 4 4.244-0.2441
152 5 4.25 0.7499
153 3 4.244-1.244
154 4 4.045-0.04478
155 5 4.404 0.5961
156 4 4.436-0.436
157 5 3.952 1.048
158 4 3.952 0.04776
159 4 4.339-0.3394
160 4 4.322-0.3215
161 4 4.113-0.113
162 4 4.179-0.1787
163 3 4.161-1.161
164 4 4.25-0.2501
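The Interpolation (Forecast) column holds the in-sample fitted values and the Residuals (Prediction Error) column the difference between the Actuals and those fitted values. Continuing the hedged sketch above, the same columns could be reproduced from the fitted model object:

# Sketch only, reusing the hypothetical mydata and mylm objects from the earlier sketches.
results <- data.frame(Index    = seq_len(nrow(mydata)),
                      Actuals  = mydata$Tevredenheid,
                      Fitted   = fitted(mylm),
                      Residual = residuals(mylm))
head(results)             # compare with the first rows of the table above
sum(results$Residual^2)   # sum of squared residuals (62.82 in the summary table)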


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300857&T=4









Goldfeld-Quandt test for Heteroskedasticity
p-values under each Alternative Hypothesis
breakpoint index   greater   2-sided   less
16 0.2485 0.497 0.7515
17 0.1223 0.2445 0.8777
18 0.05714 0.1143 0.9429
19 0.1019 0.2039 0.8981
20 0.1554 0.3108 0.8446
21 0.1986 0.3972 0.8014
22 0.1887 0.3773 0.8113
23 0.222 0.4439 0.778
24 0.1986 0.3972 0.8014
25 0.165 0.33 0.835
26 0.117 0.234 0.883
27 0.08419 0.1684 0.9158
28 0.1502 0.3005 0.8498
29 0.2473 0.4946 0.7527
30 0.2059 0.4119 0.7941
31 0.1749 0.3498 0.8251
32 0.1539 0.3077 0.8461
33 0.1311 0.2622 0.8689
34 0.127 0.2541 0.873
35 0.1085 0.2171 0.8915
36 0.09197 0.1839 0.908
37 0.07262 0.1452 0.9274
38 0.1062 0.2124 0.8938
39 0.1285 0.257 0.8715
40 0.1362 0.2724 0.8638
41 0.1439 0.2878 0.8561
42 0.1293 0.2586 0.8707
43 0.1034 0.2068 0.8966
44 0.09742 0.1948 0.9026
45 0.1098 0.2196 0.8902
46 0.276 0.552 0.724
47 0.2713 0.5426 0.7287
48 0.2726 0.5452 0.7274
49 0.2325 0.4649 0.7675
50 0.3419 0.6839 0.6581
51 0.3388 0.6777 0.6612
52 0.3111 0.6221 0.6889
53 0.2871 0.5742 0.7129
54 0.3152 0.6305 0.6848
55 0.2739 0.5477 0.7261
56 0.266 0.532 0.734
57 0.2659 0.5319 0.7341
58 0.23 0.46 0.77
59 0.2107 0.4213 0.7893
60 0.1936 0.3871 0.8064
61 0.1628 0.3256 0.8372
62 0.1332 0.2663 0.8668
63 0.1488 0.2976 0.8512
64 0.1298 0.2596 0.8702
65 0.1134 0.2269 0.8866
66 0.1171 0.2343 0.8829
67 0.0968 0.1936 0.9032
68 0.0871 0.1742 0.9129
69 0.189 0.378 0.811
70 0.1587 0.3174 0.8413
71 0.1416 0.2831 0.8584
72 0.1272 0.2545 0.8728
73 0.1722 0.3443 0.8278
74 0.1428 0.2857 0.8572
75 0.1302 0.2605 0.8698
76 0.1337 0.2674 0.8663
77 0.1841 0.3682 0.8159
78 0.2884 0.5767 0.7116
79 0.3199 0.6398 0.6801
80 0.2946 0.5892 0.7054
81 0.3102 0.6204 0.6898
82 0.3438 0.6876 0.6562
83 0.3529 0.7057 0.6471
84 0.3398 0.6795 0.6602
85 0.2977 0.5955 0.7023
86 0.2576 0.5152 0.7424
87 0.7495 0.501 0.2505
88 0.7138 0.5725 0.2862
89 0.7507 0.4986 0.2493
90 0.8321 0.3357 0.1679
91 0.8495 0.3011 0.1505
92 0.8248 0.3504 0.1752
93 0.7967 0.4066 0.2033
94 0.7631 0.4738 0.2369
95 0.762 0.476 0.238
96 0.7335 0.5331 0.2665
97 0.7736 0.4528 0.2264
98 0.7351 0.5299 0.2649
99 0.6987 0.6025 0.3013
100 0.6576 0.6848 0.3424
101 0.6756 0.6488 0.3244
102 0.6501 0.6998 0.3499
103 0.7008 0.5983 0.2992
104 0.661 0.678 0.339
105 0.6178 0.7643 0.3822
106 0.575 0.85 0.425
107 0.7115 0.5771 0.2885
108 0.6828 0.6344 0.3172
109 0.7105 0.579 0.2895
110 0.8008 0.3985 0.1992
111 0.774 0.452 0.226
112 0.7401 0.5198 0.2599
113 0.7748 0.4504 0.2252
114 0.7555 0.4889 0.2445
115 0.7201 0.5598 0.2799
116 0.6753 0.6494 0.3247
117 0.6243 0.7514 0.3757
118 0.578 0.844 0.422
119 0.5611 0.8779 0.4389
120 0.5563 0.8875 0.4437
121 0.4981 0.9962 0.5019
122 0.4472 0.8944 0.5528
123 0.4515 0.9031 0.5485
124 0.4066 0.8133 0.5934
125 0.3506 0.7013 0.6494
126 0.3793 0.7585 0.6207
127 0.3321 0.6641 0.6679
128 0.2784 0.5567 0.7216
129 0.2287 0.4574 0.7713
130 0.2473 0.4947 0.7527
131 0.2592 0.5184 0.7408
132 0.32 0.6399 0.68
133 0.4159 0.8318 0.5841
134 0.3798 0.7595 0.6202
135 0.4115 0.8231 0.5885
136 0.4404 0.8808 0.5596
137 0.3651 0.7302 0.6349
138 0.3172 0.6344 0.6828
139 0.2694 0.5387 0.7306
140 0.4648 0.9295 0.5352
141 0.811 0.3781 0.189
142 0.7327 0.5345 0.2673
143 0.6422 0.7157 0.3578
144 0.5351 0.9298 0.4649
145 0.6398 0.7203 0.3602
146 0.6961 0.6077 0.3039
147 0.5532 0.8936 0.4468
148 0.3868 0.7736 0.6132
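Each row reports the Goldfeld-Quandt p-value when the sample is split at the given breakpoint index, under the three alternative hypotheses (variance greater after the break, two-sided, less). A hedged sketch of how such a sequence of tests could be generated with lmtest::gqtest, assuming the same model specification as in the earlier sketches (this is not necessarily the exact call made by the module):

# Sketch only; requires the lmtest package and the hypothetical mydata/month from above.
library(lmtest)
breakpoints <- 16:148
gq <- sapply(breakpoints, function(i)
  sapply(c("greater", "two.sided", "less"), function(alt)
    gqtest(Tevredenheid ~ Belang + month, data = mydata,
           point = i, alternative = alt)$p.value))
gq_table <- data.frame(breakpoint = breakpoints, t(gq))
head(gq_table)   # compare with the first rows of the table above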


Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300857&T=5









Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests    % significant tests    OK/NOK
1% type I error level     0                      0                      OK
5% type I error level     0                      0                      OK
10% type I error level    0                      0                      OK

Source: https://freestatistics.org/blog/index.php?pk=300857&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300857&T=6


Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.9062, df1 = 2, df2 = 149, p-value = 0.1523
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.21868, df1 = 24, df2 = 127, p-value = 1
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 3.634, df1 = 2, df2 = 149, p-value = 0.02878

Source: https://freestatistics.org/blog/index.php?pk=300857&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300857&T=7


Variance Inflation Factors (Multicollinearity)
> vif
  Belang       M1       M2       M3       M4       M5       M6       M7 
1.060227 1.906958 1.906958 1.899783 1.906958 1.899783 1.906958 1.913638 
      M8       M9      M10      M11 
1.906958 1.883304 1.841463 1.870519 

Source: https://freestatistics.org/blog/index.php?pk=300857&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=300857&T=8




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Include Monthly Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
par5 <- '0'
par4 <- '0'
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'
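# load required packages: lattice (residual density plot), lmtest (Goldfeld-Quandt and RESET tests), car (VIF and QQ plot), MASS (studentized residuals)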
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
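# 'y' holds the data series passed in by the software module; transpose to observations-by-variables and drop rows with missing values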
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
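# move the endogenous variable (column par1) to the first column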
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
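# optional differencing of all series, depending on par3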
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
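# append par4 lags of the endogenous variable as additional regressors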
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
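# append par5 seasonal lags (s=12) of the endogenous variable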
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
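# add monthly (M1-M11) or quarterly (Q1-Q3) seasonal dummies if requested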
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
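# optionally append a linear trend column 't'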
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
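# estimate the OLS regression; the first column of the data frame is taken as the dependent variable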
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
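# Goldfeld-Quandt test at every admissible breakpoint (only if n > n25 = 25); store p-values for the 'greater', 'two.sided' and 'less' alternatives and count significant two-sided tests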
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
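# diagnostic plots: actuals vs. interpolation, residuals, histogram and density of (studentized) residuals, QQ plot, residual lag plot, ACF/PACF, and standard lm diagnostics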
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
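# table.start(), table.row.start(), table.element(), table.end() and table.save() are presumably helper functions loaded from the site-specific 'createtable' file above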
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
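# build the estimated regression equation as a text string from the fitted coefficients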
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
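# coefficient table: estimate, standard error, t-statistic, 2-tail and 1-tail p-values per variable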
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
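# goodness-of-fit statistics (multiple R, R-squared, F-test) and residual statistics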
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
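# list actuals, interpolated (fitted) values and residuals per observation, but only when n < 200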
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
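# Goldfeld-Quandt p-value table plus a meta analysis counting significant two-sided tests at the 1%, 5% and 10% levels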
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
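# Ramsey RESET tests (powers 2 and 3) of fitted values, regressors and principal components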
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
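# variance inflation factors (car::vif) to assess multicollinearity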
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')