
Author's title: (blank)
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 08 Dec 2016 21:39:20 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/08/t1481229576meafy8t0x741p81.htm/, Retrieved Sat, 27 Apr 2024 21:24:48 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=298393, Retrieved Sat, 27 Apr 2024 21:24:48 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 57
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-	[Multiple Regression] [Multiple regression] [2016-12-08 20:39:20] [130d73899007e5ff8a4f636b9bcfb397] [Current]

Dataseries X (three tab-separated columns; judging from the regression output below, they are veiligheid, informatie, and the dependent variable waarschijnlijkheid):
4	5	3
5	5	5
4	4	5
4	4	5
4	5	4
5	5	5
5	5	5
5	4	5
5	4	5
5	5	5
5	4	5
4	4	4
4	5	5
5	5	5
5	4	4
4	5	3
4	5	5
4	4	4
4	4	5
5	5	5
5	5	4
4	5	2
4	5	5
4	4	5
5	5	5
4	3	4
5	4	4
4	4	4
5	4	5
4	4	5
4	5	4
3	5	5
5	5	5
4	5	1
4	5	5
5	5	4
4	4	4
5	4	4
5	4	5
5	4	4
4	4	4
4	4	5
4	4	3
5	4	5
4	5	5
4	4	2
4	4	3
2	4	4
4	4	4
4	4	4
5	5	5
5	4	5
5	4	4
4	5	5
4	4	5
4	4	4
5	4	5
5	3	4
4	4	4
5	4	3
5	4	4
4	5	4
5	5	5
5	5	4
5	5	4
5	3	4
3	4	4
5	5	4
4	4	2
3	4	4
5	5	4
5	5	5
3	4	3
4	5	3
4	4	5
5	4	4
5	5	5
5	5	4
5	5	4
5	5	5
5	5	5
4	4	5
4	4	4
4	4	5
5	5	5
4	4	2
5	4	5
5	5	5
5	5	5
2	5	5
4	2	4
5	4	4
4	4	5
4	3	4
5	5	5
5	5	5
4	5	5
5	5	4
5	5	5
5	4	5
5	5	3
5	4	5
4	5	5
5	4	5
4	4	4
5	5	4
5	5	4
4	4	4
5	4	5
5	4	5
5	5	4
3	4	3
5	5	3
3	4	4
4	4	4
4	4	5
5	5	5
4	4	4
5	5	5
5	5	5
5	4	4
4	5	4
4	5	5
5	5	5
4	3	5
5	4	5
5	4	4
3	4	5
5	5	3
4	4	5
4	4	5
4	4	4
3	4	5
5	5	4
4	4	4
4	4	4
4	2	4
4	4	2
5	4	4
5	5	5
5	4	5
5	4	5
4	5	4
4	4	4
5	5	5
5	4	5
4	4	5
5	4	5
5	4	5
5	5	5
5	4	4
5	5	5
5	5	3
5	5	3
3	5	4
4	4	4
5	5	3
3	5	3
4	5	5
4	5	5
5	4	5
2	5	5
5	2	5
3	4	5
5	5	5
1	5	5
5	4	4
5	4	4
5	5	2




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 8 seconds
R Server: Big Analytics Cloud Computing Center
Source: https://freestatistics.org/blog/index.php?pk=298393&T=0


Multiple Linear Regression - Estimated Regression Equation
waarschijnlijkheid[t] = +3.66027 + 0.125694 veiligheid[t] + 0.0255036 informatie[t] + e[t]
Source: https://freestatistics.org/blog/index.php?pk=298393&T=1
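
The equation above can be reproduced with a plain lm() call in R. The following is a minimal sketch, not the verbatim code of the rwasp_multipleregression.wasp module; it assumes the three columns of Dataseries X are, in order, veiligheid, informatie and waarschijnlijkheid, and that the series has been saved to a tab-separated file (the file name dataseries_x.txt is only illustrative):

# read the three-column data series (column order assumed as stated above)
df <- read.table("dataseries_x.txt",
                 col.names = c("veiligheid", "informatie", "waarschijnlijkheid"))

# ordinary least squares fit of the reported model
fit <- lm(waarschijnlijkheid ~ veiligheid + informatie, data = df)
coef(fit)   # approximately 3.66027, 0.125694 and 0.0255036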


Multiple Linear Regression - Ordinary Least Squares
Variable       Parameter   S.D.      T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)    +3.66       0.5543    +6.604                       5.176e-10        2.588e-10
veiligheid     +0.1257     0.08646   +1.454                       0.1479           0.07395
informatie     +0.0255     0.1009    +0.2527                      0.8008           0.4004
Source: https://freestatistics.org/blog/index.php?pk=298393&T=2
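
The parameter table is the usual least-squares coefficient summary, and the one-tailed p-values are half the two-tailed ones. A short sketch using the fit object from the previous sketch:

# estimate, standard error, t statistic and two-tailed p-value per coefficient
summary(fit)$coefficients

# corresponding one-tailed p-values: P(T > |t|) with the residual degrees of freedom
tstat <- summary(fit)$coefficients[, "t value"]
pt(abs(tstat), df = fit$df.residual, lower.tail = FALSE)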


Multiple Linear Regression - Regression Statistics
Multiple R: 0.1166
R-squared: 0.01359
Adjusted R-squared: 0.001709
F-TEST (value): 1.144
F-TEST (DF numerator): 2
F-TEST (DF denominator): 166
p-value: 0.3211
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 0.827
Sum Squared Residuals: 113.5
Source: https://freestatistics.org/blog/index.php?pk=298393&T=3
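
These statistics can all be read off the same fitted object; a sketch (again assuming the fit object from the first sketch):

s <- summary(fit)
sqrt(s$r.squared)        # Multiple R, about 0.1166
s$r.squared              # R-squared, about 0.01359
s$adj.r.squared          # Adjusted R-squared, about 0.001709
s$fstatistic             # F-test value, about 1.144 on 2 and 166 degrees of freedom
s$sigma                  # Residual Standard Deviation, about 0.827
sum(residuals(fit)^2)    # Sum Squared Residuals, about 113.5
pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3],
   lower.tail = FALSE)   # p-value of the overall F-test, about 0.3211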


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 3 4.291-1.291
2 5 4.416 0.5837
3 5 4.265 0.7349
4 5 4.265 0.7349
5 4 4.291-0.2906
6 5 4.416 0.5837
7 5 4.416 0.5837
8 5 4.391 0.6092
9 5 4.391 0.6092
10 5 4.416 0.5837
11 5 4.391 0.6092
12 4 4.265-0.2651
13 5 4.291 0.7094
14 5 4.416 0.5837
15 4 4.391-0.3908
16 3 4.291-1.291
17 5 4.291 0.7094
18 4 4.265-0.2651
19 5 4.265 0.7349
20 5 4.416 0.5837
21 4 4.416-0.4163
22 2 4.291-2.291
23 5 4.291 0.7094
24 5 4.265 0.7349
25 5 4.416 0.5837
26 4 4.24-0.2396
27 4 4.391-0.3908
28 4 4.265-0.2651
29 5 4.391 0.6092
30 5 4.265 0.7349
31 4 4.291-0.2906
32 5 4.165 0.8351
33 5 4.416 0.5837
34 1 4.291-3.291
35 5 4.291 0.7094
36 4 4.416-0.4163
37 4 4.265-0.2651
38 4 4.391-0.3908
39 5 4.391 0.6092
40 4 4.391-0.3908
41 4 4.265-0.2651
42 5 4.265 0.7349
43 3 4.265-1.265
44 5 4.391 0.6092
45 5 4.291 0.7094
46 2 4.265-2.265
47 3 4.265-1.265
48 4 4.014-0.01367
49 4 4.265-0.2651
50 4 4.265-0.2651
51 5 4.416 0.5837
52 5 4.391 0.6092
53 4 4.391-0.3908
54 5 4.291 0.7094
55 5 4.265 0.7349
56 4 4.265-0.2651
57 5 4.391 0.6092
58 4 4.365-0.3653
59 4 4.265-0.2651
60 3 4.391-1.391
61 4 4.391-0.3908
62 4 4.291-0.2906
63 5 4.416 0.5837
64 4 4.416-0.4163
65 4 4.416-0.4163
66 4 4.365-0.3653
67 4 4.139-0.1394
68 4 4.416-0.4163
69 2 4.265-2.265
70 4 4.139-0.1394
71 4 4.416-0.4163
72 5 4.416 0.5837
73 3 4.139-1.139
74 3 4.291-1.291
75 5 4.265 0.7349
76 4 4.391-0.3908
77 5 4.416 0.5837
78 4 4.416-0.4163
79 4 4.416-0.4163
80 5 4.416 0.5837
81 5 4.416 0.5837
82 5 4.265 0.7349
83 4 4.265-0.2651
84 5 4.265 0.7349
85 5 4.416 0.5837
86 2 4.265-2.265
87 5 4.391 0.6092
88 5 4.416 0.5837
89 5 4.416 0.5837
90 5 4.039 0.9608
91 4 4.214-0.2141
92 4 4.391-0.3908
93 5 4.265 0.7349
94 4 4.24-0.2396
95 5 4.416 0.5837
96 5 4.416 0.5837
97 5 4.291 0.7094
98 4 4.416-0.4163
99 5 4.416 0.5837
100 5 4.391 0.6092
101 3 4.416-1.416
102 5 4.391 0.6092
103 5 4.291 0.7094
104 5 4.391 0.6092
105 4 4.265-0.2651
106 4 4.416-0.4163
107 4 4.416-0.4163
108 4 4.265-0.2651
109 5 4.391 0.6092
110 5 4.391 0.6092
111 4 4.416-0.4163
112 3 4.139-1.139
113 3 4.416-1.416
114 4 4.139-0.1394
115 4 4.265-0.2651
116 5 4.265 0.7349
117 5 4.416 0.5837
118 4 4.265-0.2651
119 5 4.416 0.5837
120 5 4.416 0.5837
121 4 4.391-0.3908
122 4 4.291-0.2906
123 5 4.291 0.7094
124 5 4.416 0.5837
125 5 4.24 0.7604
126 5 4.391 0.6092
127 4 4.391-0.3908
128 5 4.139 0.8606
129 3 4.416-1.416
130 5 4.265 0.7349
131 5 4.265 0.7349
132 4 4.265-0.2651
133 5 4.139 0.8606
134 4 4.416-0.4163
135 4 4.265-0.2651
136 4 4.265-0.2651
137 4 4.214-0.2141
138 2 4.265-2.265
139 4 4.391-0.3908
140 5 4.416 0.5837
141 5 4.391 0.6092
142 5 4.391 0.6092
143 4 4.291-0.2906
144 4 4.265-0.2651
145 5 4.416 0.5837
146 5 4.391 0.6092
147 5 4.265 0.7349
148 5 4.391 0.6092
149 5 4.391 0.6092
150 5 4.416 0.5837
151 4 4.391-0.3908
152 5 4.416 0.5837
153 3 4.416-1.416
154 3 4.416-1.416
155 4 4.165-0.1649
156 4 4.265-0.2651
157 3 4.416-1.416
158 3 4.165-1.165
159 5 4.291 0.7094
160 5 4.291 0.7094
161 5 4.391 0.6092
162 5 4.039 0.9608
163 5 4.34 0.6603
164 5 4.139 0.8606
165 5 4.416 0.5837
166 5 3.913 1.087
167 4 4.391-0.3908
168 4 4.391-0.3908
169 2 4.416-2.416

Source: https://freestatistics.org/blog/index.php?pk=298393&T=4
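
The Interpolation and Residuals columns are the in-sample fitted values and residuals of the same fit; a sketch of how this table can be rebuilt (assuming the fit object from the first sketch):

# actuals, fitted values (interpolation) and residuals (prediction errors), one row per observation
data.frame(index         = seq_along(fitted(fit)),
           actuals       = model.response(model.frame(fit)),
           interpolation = round(fitted(fit), 4),
           residuals     = round(residuals(fit), 4))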








Goldfeld-Quandt test for Heteroskedasticity
p-values per Alternative Hypothesis
breakpoint index   greater   2-sided   less
6 0.1334 0.2667 0.8666
7 0.05216 0.1043 0.9478
8 0.1682 0.3364 0.8318
9 0.1319 0.2638 0.8681
10 0.07967 0.1593 0.9203
11 0.05285 0.1057 0.9471
12 0.03865 0.07731 0.9613
13 0.07001 0.14 0.93
14 0.04242 0.08484 0.9576
15 0.08211 0.1642 0.9179
16 0.1493 0.2987 0.8507
17 0.1692 0.3384 0.8308
18 0.129 0.2581 0.871
19 0.1198 0.2395 0.8802
20 0.08758 0.1752 0.9124
21 0.08928 0.1786 0.9107
22 0.4055 0.8109 0.5945
23 0.4417 0.8835 0.5583
24 0.4122 0.8243 0.5878
25 0.3607 0.7213 0.6393
26 0.3381 0.6762 0.6619
27 0.3491 0.6982 0.6509
28 0.2982 0.5964 0.7018
29 0.2515 0.503 0.7485
30 0.2417 0.4834 0.7583
31 0.198 0.396 0.802
32 0.2561 0.5122 0.7439
33 0.2209 0.4419 0.7791
34 0.8836 0.2328 0.1164
35 0.8827 0.2347 0.1173
36 0.8661 0.2677 0.1339
37 0.8396 0.3207 0.1604
38 0.8276 0.3447 0.1724
39 0.8001 0.3998 0.1999
40 0.7843 0.4313 0.2157
41 0.7488 0.5024 0.2512
42 0.7364 0.5273 0.2636
43 0.7865 0.4269 0.2135
44 0.7585 0.4829 0.2415
45 0.7555 0.489 0.2445
46 0.9222 0.1555 0.07776
47 0.9374 0.1253 0.06264
48 0.9302 0.1396 0.06981
49 0.9137 0.1726 0.08628
50 0.8946 0.2109 0.1054
51 0.8794 0.2411 0.1206
52 0.8634 0.2732 0.1366
53 0.8462 0.3075 0.1538
54 0.8409 0.3181 0.1591
55 0.8368 0.3264 0.1632
56 0.8085 0.3831 0.1915
57 0.7881 0.4238 0.2119
58 0.7637 0.4726 0.2363
59 0.729 0.542 0.271
60 0.8007 0.3986 0.1993
61 0.7764 0.4473 0.2236
62 0.7444 0.5112 0.2556
63 0.7206 0.5589 0.2794
64 0.6957 0.6086 0.3043
65 0.669 0.662 0.331
66 0.6347 0.7306 0.3653
67 0.5939 0.8122 0.4061
68 0.5639 0.8723 0.4361
69 0.8001 0.3997 0.1999
70 0.7706 0.4587 0.2294
71 0.7463 0.5074 0.2537
72 0.7254 0.5491 0.2746
73 0.7497 0.5005 0.2503
74 0.7961 0.4077 0.2039
75 0.7934 0.4132 0.2066
76 0.7686 0.4628 0.2314
77 0.7493 0.5014 0.2507
78 0.724 0.5521 0.276
79 0.6971 0.6058 0.3029
80 0.6756 0.6488 0.3244
81 0.6537 0.6926 0.3463
82 0.6492 0.7017 0.3508
83 0.6121 0.7758 0.3879
84 0.6063 0.7874 0.3937
85 0.5835 0.8331 0.4165
86 0.827 0.346 0.173
87 0.8129 0.3741 0.1871
88 0.7976 0.4048 0.2024
89 0.7821 0.4358 0.2179
90 0.802 0.3961 0.198
91 0.7802 0.4395 0.2198
92 0.7543 0.4914 0.2457
93 0.7463 0.5074 0.2537
94 0.7187 0.5625 0.2813
95 0.7026 0.5949 0.2974
96 0.687 0.626 0.313
97 0.6787 0.6427 0.3213
98 0.6474 0.7053 0.3526
99 0.6326 0.7347 0.3674
100 0.613 0.774 0.387
101 0.6875 0.6251 0.3125
102 0.6688 0.6625 0.3312
103 0.661 0.678 0.339
104 0.6423 0.7155 0.3577
105 0.6054 0.7892 0.3946
106 0.5698 0.8604 0.4302
107 0.5333 0.9334 0.4667
108 0.4946 0.9891 0.5054
109 0.4737 0.9475 0.5263
110 0.4537 0.9073 0.5463
111 0.4163 0.8327 0.5837
112 0.4849 0.9698 0.5151
113 0.562 0.8761 0.438
114 0.528 0.944 0.472
115 0.4909 0.9818 0.5091
116 0.4733 0.9465 0.5267
117 0.4595 0.919 0.5405
118 0.4221 0.8442 0.5779
119 0.4112 0.8223 0.5888
120 0.4037 0.8073 0.5963
121 0.3656 0.7312 0.6344
122 0.3244 0.6488 0.6756
123 0.3158 0.6316 0.6842
124 0.3133 0.6267 0.6867
125 0.2901 0.5803 0.7099
126 0.2752 0.5505 0.7248
127 0.2404 0.4808 0.7596
128 0.2272 0.4544 0.7728
129 0.2738 0.5476 0.7262
130 0.2568 0.5135 0.7432
131 0.2408 0.4816 0.7592
132 0.2071 0.4143 0.7929
133 0.194 0.3879 0.806
134 0.162 0.3241 0.838
135 0.1349 0.2699 0.8651
136 0.1112 0.2224 0.8888
137 0.1039 0.2079 0.8961
138 0.4299 0.8598 0.5701
139 0.3906 0.7812 0.6094
140 0.3978 0.7956 0.6022
141 0.3654 0.7308 0.6346
142 0.3358 0.6716 0.6642
143 0.2835 0.5671 0.7165
144 0.2545 0.5091 0.7455
145 0.2737 0.5474 0.7263
146 0.2489 0.4978 0.7511
147 0.2146 0.4293 0.7854
148 0.1969 0.3937 0.8031
149 0.1843 0.3685 0.8157
150 0.2331 0.4663 0.7669
151 0.1832 0.3664 0.8168
152 0.2689 0.5378 0.7311
153 0.2471 0.4942 0.7529
154 0.2307 0.4614 0.7693
155 0.1784 0.3568 0.8216
156 0.1412 0.2824 0.8588
157 0.1367 0.2734 0.8633
158 0.26 0.52 0.74
159 0.233 0.4659 0.767
160 0.2253 0.4506 0.7747
161 0.2209 0.4417 0.7791
162 0.145 0.2899 0.855
163 0.0782 0.1564 0.9218

Source: https://freestatistics.org/blog/index.php?pk=298393&T=5
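
The table lists, for every candidate breakpoint from 6 to 163, Goldfeld-Quandt p-values under three alternative hypotheses. A sketch of how such p-values can be computed with the lmtest package (an assumption about the kind of computation involved, not the module's exact procedure; df and the model formula are taken from the first sketch):

library(lmtest)

breakpoints <- 6:163
# two-sided p-values; use alternative = "greater" or "less" for the other two columns
gq_p2 <- sapply(breakpoints, function(bp)
  gqtest(waarschijnlijkheid ~ veiligheid + informatie, data = df,
         point = bp, alternative = "two.sided")$p.value)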








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              | # significant tests | % significant tests | OK/NOK
1% type I error level    | 0                   | 0                   | OK
5% type I error level    | 0                   | 0                   | OK
10% type I error level   | 2                   | 0.0126582           | OK
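With breakpoints running from 6 to 163, the battery comprises 158 Goldfeld-Quandt tests. Two of the two-sided p-values fall below 0.10 (breakpoints 12 and 14), so the reported figure is the fraction 2/158 ≈ 0.0127 (about 1.3%, despite the "%" label). Because this stays below the nominal 1%, 5% and 10% error levels, the battery gives no indication of heteroskedastic residuals.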

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & \# significant tests & \% significant tests & OK/NOK \tabularnewline
1\% type I error level & 0 & 0 & OK \tabularnewline
5\% type I error level & 0 & 0 & OK \tabularnewline
10\% type I error level & 2 & 0.0126582 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298393&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C] 0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]2[/C][C]0.0126582[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298393&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298393&T=6








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.7033, df1 = 2, df2 = 164, p-value = 0.06997
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.5647, df1 = 4, df2 = 162, p-value = 0.1862
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 2.4591, df1 = 2, df2 = 164, p-value = 0.08866
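At the conventional 5% level, none of the three RESET variants rejects the null hypothesis of a correctly specified linear functional form; only the fitted-values variant (p ≈ 0.070) and the principal-components variant (p ≈ 0.089) come close to rejection at the 10% level.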

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.7033, df1 = 2, df2 = 164, p-value = 0.06997
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.5647, df1 = 4, df2 = 162, p-value = 0.1862
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 2.4591, df1 = 2, df2 = 164, p-value = 0.08866
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=298393&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.7033, df1 = 2, df2 = 164, p-value = 0.06997
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 1.5647, df1 = 4, df2 = 162, p-value = 0.1862
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 2.4591, df1 = 2, df2 = 164, p-value = 0.08866
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=298393&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298393&T=7








Variance Inflation Factors (Multicollinearity)
> vif
veiligheid informatie 
  1.012579   1.012579 
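Both variance inflation factors are about 1.013, barely above the theoretical minimum of 1, so veiligheid and informatie are almost uncorrelated and multicollinearity is not a concern for this regression.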

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
veiligheid informatie 
  1.012579   1.012579 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=298393&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
veiligheid informatie 
  1.012579   1.012579 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=298393&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298393&T=8




Parameters (Session):
par1 = 3 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 3 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
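# Packages used by this module: lattice (residual density plot), lmtest
# (Goldfeld-Quandt and Ramsey RESET tests), car (QQ plot, variance inflation
# factors), MASS (studentized residuals).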
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
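# Optional transformations driven by the user parameters: differencing or a
# linear trend (par3), lagged terms (par4, par5) and seasonal dummies (par2).
# With the settings of this run (no dummies, no trend, par4 = par5 = 0) these
# blocks leave x unchanged.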
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
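# Fit the multiple linear regression by ordinary least squares (the first
# column of the data frame is the dependent variable) and keep the summary.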
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
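# Goldfeld-Quandt test: if there are more than n25 observations, run the test
# at every admissible breakpoint, store the p-values for the 'greater',
# 'two.sided' and 'less' alternatives, and count how many two-sided p-values
# fall below the 1%, 5% and 10% levels.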
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
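# Diagnostic plots: actuals with interpolation, residuals, histogram and
# density of the (studentized) residuals, QQ plot, residual lag plot with
# lowess and regression line, residual ACF/PACF, the standard lm diagnostics,
# and the two-sided Goldfeld-Quandt p-value per breakpoint.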
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
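# Build the output tables: estimated regression equation, OLS coefficient
# table, regression and residual statistics, per-observation actuals /
# interpolation / residuals, and the Goldfeld-Quandt results with their
# meta analysis.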
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
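# Ramsey RESET specification tests using powers 2 and 3 of the fitted values,
# of the regressors and of the principal components.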
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
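# Variance inflation factors of the regressors (multicollinearity check).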
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')