Free Statistics


Author's title:
Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 12 Dec 2016 13:57:47 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/12/t1481547610kn1yfzzppmz3a4g.htm/, Retrieved Sat, 04 May 2024 00:21:19 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=298884, Retrieved Sat, 04 May 2024 00:21:19 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 122
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [Multiple Regression] [2016-12-12 12:57:47] [8dbd6448339a84ba150e9d534057ba9c] [Current]
Dataseries X:
2	2	3	4
4	2	1	4
4	2	5	4
4	3	4	4
3	4	3	3
4	3	2	5
1	4	4	4
4	2	5	4
3	NA	5	2
4	4	3	4
2	2	2	4
4	2	2	3
4	5	4	3
5	4	4	4
4	2	4	4
1	3	5	4
2	1	2	5
4	1	NA	NA
4	3	2	4
5	4	4	4
5	5	4	4
4	5	4	4
1	1	5	4
4	4	3	4
2	2	4	4
4	4	3	4
5	4	3	3
3	3	3	3
5	4	5	5
3	2	4	4
5	2	4	4
2	4	3	4
1	2	3	4
NA	4	5	1
4	2	3	3
4	4	3	4
3	3	3	4
5	3	5	5
4	4	3	4
NA	2	3	4
4	3	3	4
2	2	4	3
3	4	3	4
1	2	1	5
3	2	4	4
3	3	4	3
3	3	3	3
4	NA	4	5
4	4	4	4
4	5	5	1
4	4	4	4
4	4	4	4
2	4	3	4
5	2	2	4
3	2	4	3
3	1	3	4
4	3	3	3
4	4	3	4
4	3	4	2
3	3	4	4
4	2	3	4
4	3	4	4
4	2	5	3
4	4	2	4
4	3	3	3
2	2	3	4
4	4	3	3
4	5	4	4
4	4	3	4
4	3	4	4
4	2	3	4
5	3	1	3
3	4	4	3
2	4	3	2
4	4	2	4
5	5	3	5
4	4	3	4
5	4	4	5
5	4	5	2
2	3	3	4
4	2	4	4
4	4	2	4
4	4	2	4
3	4	2	5
4	2	3	4
2	2	4	4
5	1	3	4
3	NA	5	4
4	4	4	1
2	4	4	4
4	4	3	4
3	3	4	3
3	4	3	4
4	4	5	4
4	4	4	3
4	2	4	3
3	4	3	4
4	4	4	5
3	1	1	3
3	4	4	4
1	2	4	3
4	3	4	4
3	3	4	5
3	4	4	3
5	3	3	4
5	4	5	4
4	4	3	NA
5	4	5	5
4	4	4	4
4	5	4	4
4	5	4	5
4	2	4	3
3	1	3	3
4	3	4	3
3	3	3	4
4	1	3	4
2	4	3	4
1	4	3	4
5	2	2	4
4	4	4	4
3	3	3	3
4	4	2	4
4	4	4	5
4	2	4	4
4	2	3	3
2	4	4	4
4	4	5	4
4	2	4	3
4	2	NA	3
4	2	4	4
3	2	4	2
4	5	4	4
5	2	5	3
2	NA	2	4
5	2	4	4
4	4	4	4
3	5	5	4
NA	4	4	3
2	4	4	2
2	3	5	5
2	3	2	3
4	1	4	4
4	4	5	4
5	5	3	4
3	4	4	5
3	4	4	4
4	5	3	4
4	4	5	3
4	5	5	1
4	5	3	4
4	3	2	5
4	5	4	4
4	1	5	4
2	3	3	4
5	2	3	5
4	2	4	4
4	NA	3	4
4	4	2	4
4	2	3	4
4	5	3	4
2	4	4	3
3	5	1	5
3	3	4	3
4	2	3	4
4	4	3	4
4	2	2	5
4	3	3	4
3	3	3	4
3	2	5	2
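
The series above has four tab-separated columns and 169 rows, 11 of which contain an NA; after listwise deletion 158 complete observations remain, which matches the 154 denominator degrees of freedom reported below (158 observations minus 4 estimated parameters). A minimal R sketch of how such a data block could be read and fitted, assuming the columns are named IVHB1 to IVHB4 (the stray carriage return in the fourth name, visible as `IVHB4\r` in the output below, stems from the hard-return issue flagged in the error message) and that the block is saved in a hypothetical file dataseries.txt:

# hedged sketch, not the rwasp module's own code
x <- read.table("dataseries.txt", sep = "\t", na.strings = "NA",
                col.names = c("IVHB1", "IVHB2", "IVHB3", "IVHB4"))
x <- na.omit(x)    # listwise deletion of the 11 incomplete rows -> 158 observations
mylm <- lm(IVHB1 ~ IVHB2 + IVHB3 + IVHB4, data = x)   # dependent variable: IVHB1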




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 8 seconds
R Server: Big Analytics Cloud Computing Center
R Framework error message:
The field 'Names of X columns' contains a hard return which cannot be interpreted.
Please, resubmit your request without hard returns in the 'Names of X columns'.

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input view raw input (R code)  \tabularnewline
Raw Outputview raw output of R engine  \tabularnewline
Computing time8 seconds \tabularnewline
R ServerBig Analytics Cloud Computing Center \tabularnewline
R Framework error message & 
The field 'Names of X columns' contains a hard return which cannot be interpreted.
Please, resubmit your request without hard returns in the 'Names of X columns'.
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=298884&T=0

[TABLE]
[ROW]
Summary of computational transaction[/C][/ROW] [ROW]Raw Input[/C] view raw input (R code) [/C][/ROW] [ROW]Raw Output[/C]view raw output of R engine [/C][/ROW] [ROW]Computing time[/C]8 seconds[/C][/ROW] [ROW]R Server[/C]Big Analytics Cloud Computing Center[/C][/ROW] [ROW]R Framework error message[/C][C]
The field 'Names of X columns' contains a hard return which cannot be interpreted.
Please, resubmit your request without hard returns in the 'Names of X columns'.
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=298884&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298884&T=0








Multiple Linear Regression - Estimated Regression Equation
IVHB1[t] = +2.55812 + 0.128974 IVHB2[t] + 0.0693758 IVHB3[t] + 0.092364 `IVHB4\r`[t] + e[t]
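
As a quick check, the fitted value for the first observation (IVHB2 = 2, IVHB3 = 3, IVHB4 = 4) follows directly from this equation and agrees with the interpolation of 3.394 reported for index 1 further below. A small R sketch, reusing the mylm fit from the sketch above:

coef(mylm)   # intercept and slope estimates as printed in the equation
2.55812 + 0.128974*2 + 0.0693758*3 + 0.092364*4    # about 3.394
predict(mylm, newdata = data.frame(IVHB2 = 2, IVHB3 = 3, IVHB4 = 4))   # same value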

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
IVHB1[t] =  +  2.55812 +  0.128974IVHB2[t] +  0.0693758IVHB3[t] +  0.092364`IVHB4\r`[t]  + e[t] \tabularnewline
 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298884&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]IVHB1[t] =  +  2.55812 +  0.128974IVHB2[t] +  0.0693758IVHB3[t] +  0.092364`IVHB4\r`[t]  + e[t][/C][/ROW]
[ROW][C][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298884&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298884&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	+2.558	0.5735	+4.4600e+00	1.57e-05	7.848e-06
IVHB2	+0.129	0.07219	+1.7870e+00	0.07596	0.03798
IVHB3	+0.06938	0.08545	+8.1190e-01	0.4181	0.209
`IVHB4\r`	+0.09236	0.1051	+8.7870e-01	0.3809	0.1905
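
The 2-tail and 1-tail p-values in this table follow from the t-statistics and the 154 residual degrees of freedom (158 observations minus 4 parameters). A sketch of the corresponding R calls (mylm as in the sketches above):

summary(mylm)$coefficients        # estimates, standard errors, t-values and 2-tail p-values
# for IVHB2, for example: t = 0.129 / 0.07219 = 1.787
2 * pt(-abs(1.787), df = 154)     # 2-tail p-value, about 0.076
pt(-abs(1.787), df = 154)         # 1-tail p-value, about 0.038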

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STATH0: parameter = 0 & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & +2.558 &  0.5735 & +4.4600e+00 &  1.57e-05 &  7.848e-06 \tabularnewline
IVHB2 & +0.129 &  0.07219 & +1.7870e+00 &  0.07596 &  0.03798 \tabularnewline
IVHB3 & +0.06938 &  0.08545 & +8.1190e-01 &  0.4181 &  0.209 \tabularnewline
`IVHB4\r` & +0.09236 &  0.1051 & +8.7870e-01 &  0.3809 &  0.1905 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298884&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STATH0: parameter = 0[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]+2.558[/C][C] 0.5735[/C][C]+4.4600e+00[/C][C] 1.57e-05[/C][C] 7.848e-06[/C][/ROW]
[ROW][C]IVHB2[/C][C]+0.129[/C][C] 0.07219[/C][C]+1.7870e+00[/C][C] 0.07596[/C][C] 0.03798[/C][/ROW]
[ROW][C]IVHB3[/C][C]+0.06938[/C][C] 0.08545[/C][C]+8.1190e-01[/C][C] 0.4181[/C][C] 0.209[/C][/ROW]
[ROW][C]`IVHB4\r`[/C][C]+0.09236[/C][C] 0.1051[/C][C]+8.7870e-01[/C][C] 0.3809[/C][C] 0.1905[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298884&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298884&T=2








Multiple Linear Regression - Regression Statistics
Multiple R	0.1742
R-squared	0.03035
Adjusted R-squared	0.01146
F-TEST (value)	1.607
F-TEST (DF numerator)	3
F-TEST (DF denominator)	154
p-value	0.1901
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	1.006
Sum Squared Residuals	156
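
All of these statistics can be recovered from the same fit; the F-test compares the full model against the intercept-only model on 3 and 154 degrees of freedom. A sketch:

s <- summary(mylm)
s$r.squared                              # R-squared, 0.03035 (Multiple R is its square root, 0.1742)
s$adj.r.squared                          # adjusted R-squared, 0.01146
s$fstatistic                             # F = 1.607 on 3 and 154 degrees of freedom
pf(1.607, 3, 154, lower.tail = FALSE)    # p-value, about 0.19
s$sigma                                  # residual standard deviation, about 1.006
sum(resid(mylm)^2)                       # sum of squared residuals, about 156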

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R &  0.1742 \tabularnewline
R-squared &  0.03035 \tabularnewline
Adjusted R-squared &  0.01146 \tabularnewline
F-TEST (value) &  1.607 \tabularnewline
F-TEST (DF numerator) & 3 \tabularnewline
F-TEST (DF denominator) & 154 \tabularnewline
p-value &  0.1901 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation &  1.006 \tabularnewline
Sum Squared Residuals &  156 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298884&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C] 0.1742[/C][/ROW]
[ROW][C]R-squared[/C][C] 0.03035[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C] 0.01146[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C] 1.607[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]3[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]154[/C][/ROW]
[ROW][C]p-value[/C][C] 0.1901[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C] 1.006[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C] 156[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298884&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298884&T=3








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1 2 3.394-1.394
2 4 3.255 0.7451
3 4 3.532 0.4676
4 4 3.592 0.408
5 3 3.559-0.5592
6 4 3.546 0.4544
7 1 3.721-2.721
8 4 3.532 0.4676
9 4 3.652 0.3484
10 2 3.324-1.324
11 4 3.232 0.7681
12 4 3.758 0.2424
13 5 3.721 1.279
14 4 3.463 0.537
15 1 3.661-2.661
16 2 3.288-1.288
17 4 3.453 0.5467
18 5 3.721 1.279
19 5 3.85 1.15
20 4 3.85 0.15
21 1 3.403-2.403
22 4 3.652 0.3484
23 2 3.463-1.463
24 4 3.652 0.3484
25 5 3.559 1.441
26 3 3.43-0.4303
27 5 3.883 1.117
28 3 3.463-0.463
29 5 3.463 1.537
30 2 3.652-1.652
31 1 3.394-2.394
32 4 3.301 0.6987
33 4 3.652 0.3484
34 3 3.523-0.5226
35 5 3.754 1.246
36 4 3.652 0.3484
37 4 3.523 0.4774
38 2 3.371-1.371
39 3 3.652-0.6516
40 1 3.347-2.347
41 3 3.463-0.463
42 3 3.5-0.4996
43 3 3.43-0.4303
44 4 3.721 0.279
45 4 3.642 0.3578
46 4 3.721 0.279
47 4 3.721 0.279
48 2 3.652-1.652
49 5 3.324 1.676
50 3 3.371-0.3707
51 3 3.265-0.2647
52 4 3.43 0.5697
53 4 3.652 0.3484
54 4 3.407 0.5927
55 3 3.592-0.592
56 4 3.394 0.6063
57 4 3.592 0.408
58 4 3.44 0.56
59 4 3.582 0.4178
60 4 3.43 0.5697
61 2 3.394-1.394
62 4 3.559 0.4408
63 4 3.85 0.15
64 4 3.652 0.3484
65 4 3.592 0.408
66 4 3.394 0.6063
67 5 3.292 1.708
68 3 3.629-0.6286
69 2 3.467-1.467
70 4 3.582 0.4178
71 5 3.873 1.127
72 4 3.652 0.3484
73 5 3.813 1.187
74 5 3.606 1.394
75 2 3.523-1.523
76 4 3.463 0.537
77 4 3.582 0.4178
78 4 3.582 0.4178
79 3 3.675-0.6746
80 4 3.394 0.6063
81 2 3.463-1.463
82 5 3.265 1.735
83 4 3.444 0.5561
84 2 3.721-1.721
85 4 3.652 0.3484
86 3 3.5-0.4996
87 3 3.652-0.6516
88 4 3.79 0.2096
89 4 3.629 0.3714
90 4 3.371 0.6293
91 3 3.652-0.6516
92 4 3.813 0.1867
93 3 3.034-0.03356
94 3 3.721-0.721
95 1 3.371-2.371
96 4 3.592 0.408
97 3 3.684-0.6844
98 3 3.629-0.6286
99 5 3.523 1.477
100 5 3.79 1.21
101 5 3.883 1.117
102 4 3.721 0.279
103 4 3.85 0.15
104 4 3.942 0.05768
105 4 3.371 0.6293
106 3 3.172-0.1723
107 4 3.5 0.5004
108 3 3.523-0.5226
109 4 3.265 0.7353
110 2 3.652-1.652
111 1 3.652-2.652
112 5 3.324 1.676
113 4 3.721 0.279
114 3 3.43-0.4303
115 4 3.582 0.4178
116 4 3.813 0.1867
117 4 3.463 0.537
118 4 3.301 0.6987
119 2 3.721-1.721
120 4 3.79 0.2096
121 4 3.371 0.6293
122 4 3.463 0.537
123 3 3.278-0.2783
124 4 3.85 0.15
125 5 3.44 1.56
126 5 3.463 1.537
127 4 3.721 0.279
128 3 3.919-0.9193
129 2 3.536-1.536
130 2 3.754-1.754
131 2 3.361-1.361
132 4 3.334 0.6659
133 4 3.79 0.2096
134 5 3.781 1.219
135 3 3.813-0.8133
136 3 3.721-0.721
137 4 3.781 0.2194
138 4 3.698 0.302
139 4 3.642 0.3578
140 4 3.781 0.2194
141 4 3.546 0.4544
142 4 3.85 0.15
143 4 3.403 0.5966
144 2 3.523-1.523
145 5 3.486 1.514
146 4 3.463 0.537
147 4 3.582 0.4178
148 4 3.394 0.6063
149 4 3.781 0.2194
150 2 3.629-1.629
151 3 3.734-0.7342
152 3 3.5-0.4996
153 4 3.394 0.6063
154 4 3.652 0.3484
155 4 3.417 0.5834
156 4 3.523 0.4774
157 3 3.523-0.5226
158 3 3.348-0.3477
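
The Interpolation column holds the in-sample fitted values and the Residuals column the actuals minus the fitted values, so the table can be reproduced from the fit itself. A sketch:

tab <- data.frame(Actuals = x$IVHB1,
                  Interpolation = round(fitted(mylm), 3),
                  Residuals = round(resid(mylm), 4))
head(tab)   # first rows: 2 / 3.394 / -1.394, then 4 / 3.255 / 0.7451, ...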

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & InterpolationForecast & ResidualsPrediction Error \tabularnewline
1 &  2 &  3.394 & -1.394 \tabularnewline
2 &  4 &  3.255 &  0.7451 \tabularnewline
3 &  4 &  3.532 &  0.4676 \tabularnewline
4 &  4 &  3.592 &  0.408 \tabularnewline
5 &  3 &  3.559 & -0.5592 \tabularnewline
6 &  4 &  3.546 &  0.4544 \tabularnewline
7 &  1 &  3.721 & -2.721 \tabularnewline
8 &  4 &  3.532 &  0.4676 \tabularnewline
9 &  4 &  3.652 &  0.3484 \tabularnewline
10 &  2 &  3.324 & -1.324 \tabularnewline
11 &  4 &  3.232 &  0.7681 \tabularnewline
12 &  4 &  3.758 &  0.2424 \tabularnewline
13 &  5 &  3.721 &  1.279 \tabularnewline
14 &  4 &  3.463 &  0.537 \tabularnewline
15 &  1 &  3.661 & -2.661 \tabularnewline
16 &  2 &  3.288 & -1.288 \tabularnewline
17 &  4 &  3.453 &  0.5467 \tabularnewline
18 &  5 &  3.721 &  1.279 \tabularnewline
19 &  5 &  3.85 &  1.15 \tabularnewline
20 &  4 &  3.85 &  0.15 \tabularnewline
21 &  1 &  3.403 & -2.403 \tabularnewline
22 &  4 &  3.652 &  0.3484 \tabularnewline
23 &  2 &  3.463 & -1.463 \tabularnewline
24 &  4 &  3.652 &  0.3484 \tabularnewline
25 &  5 &  3.559 &  1.441 \tabularnewline
26 &  3 &  3.43 & -0.4303 \tabularnewline
27 &  5 &  3.883 &  1.117 \tabularnewline
28 &  3 &  3.463 & -0.463 \tabularnewline
29 &  5 &  3.463 &  1.537 \tabularnewline
30 &  2 &  3.652 & -1.652 \tabularnewline
31 &  1 &  3.394 & -2.394 \tabularnewline
32 &  4 &  3.301 &  0.6987 \tabularnewline
33 &  4 &  3.652 &  0.3484 \tabularnewline
34 &  3 &  3.523 & -0.5226 \tabularnewline
35 &  5 &  3.754 &  1.246 \tabularnewline
36 &  4 &  3.652 &  0.3484 \tabularnewline
37 &  4 &  3.523 &  0.4774 \tabularnewline
38 &  2 &  3.371 & -1.371 \tabularnewline
39 &  3 &  3.652 & -0.6516 \tabularnewline
40 &  1 &  3.347 & -2.347 \tabularnewline
41 &  3 &  3.463 & -0.463 \tabularnewline
42 &  3 &  3.5 & -0.4996 \tabularnewline
43 &  3 &  3.43 & -0.4303 \tabularnewline
44 &  4 &  3.721 &  0.279 \tabularnewline
45 &  4 &  3.642 &  0.3578 \tabularnewline
46 &  4 &  3.721 &  0.279 \tabularnewline
47 &  4 &  3.721 &  0.279 \tabularnewline
48 &  2 &  3.652 & -1.652 \tabularnewline
49 &  5 &  3.324 &  1.676 \tabularnewline
50 &  3 &  3.371 & -0.3707 \tabularnewline
51 &  3 &  3.265 & -0.2647 \tabularnewline
52 &  4 &  3.43 &  0.5697 \tabularnewline
53 &  4 &  3.652 &  0.3484 \tabularnewline
54 &  4 &  3.407 &  0.5927 \tabularnewline
55 &  3 &  3.592 & -0.592 \tabularnewline
56 &  4 &  3.394 &  0.6063 \tabularnewline
57 &  4 &  3.592 &  0.408 \tabularnewline
58 &  4 &  3.44 &  0.56 \tabularnewline
59 &  4 &  3.582 &  0.4178 \tabularnewline
60 &  4 &  3.43 &  0.5697 \tabularnewline
61 &  2 &  3.394 & -1.394 \tabularnewline
62 &  4 &  3.559 &  0.4408 \tabularnewline
63 &  4 &  3.85 &  0.15 \tabularnewline
64 &  4 &  3.652 &  0.3484 \tabularnewline
65 &  4 &  3.592 &  0.408 \tabularnewline
66 &  4 &  3.394 &  0.6063 \tabularnewline
67 &  5 &  3.292 &  1.708 \tabularnewline
68 &  3 &  3.629 & -0.6286 \tabularnewline
69 &  2 &  3.467 & -1.467 \tabularnewline
70 &  4 &  3.582 &  0.4178 \tabularnewline
71 &  5 &  3.873 &  1.127 \tabularnewline
72 &  4 &  3.652 &  0.3484 \tabularnewline
73 &  5 &  3.813 &  1.187 \tabularnewline
74 &  5 &  3.606 &  1.394 \tabularnewline
75 &  2 &  3.523 & -1.523 \tabularnewline
76 &  4 &  3.463 &  0.537 \tabularnewline
77 &  4 &  3.582 &  0.4178 \tabularnewline
78 &  4 &  3.582 &  0.4178 \tabularnewline
79 &  3 &  3.675 & -0.6746 \tabularnewline
80 &  4 &  3.394 &  0.6063 \tabularnewline
81 &  2 &  3.463 & -1.463 \tabularnewline
82 &  5 &  3.265 &  1.735 \tabularnewline
83 &  4 &  3.444 &  0.5561 \tabularnewline
84 &  2 &  3.721 & -1.721 \tabularnewline
85 &  4 &  3.652 &  0.3484 \tabularnewline
86 &  3 &  3.5 & -0.4996 \tabularnewline
87 &  3 &  3.652 & -0.6516 \tabularnewline
88 &  4 &  3.79 &  0.2096 \tabularnewline
89 &  4 &  3.629 &  0.3714 \tabularnewline
90 &  4 &  3.371 &  0.6293 \tabularnewline
91 &  3 &  3.652 & -0.6516 \tabularnewline
92 &  4 &  3.813 &  0.1867 \tabularnewline
93 &  3 &  3.034 & -0.03356 \tabularnewline
94 &  3 &  3.721 & -0.721 \tabularnewline
95 &  1 &  3.371 & -2.371 \tabularnewline
96 &  4 &  3.592 &  0.408 \tabularnewline
97 &  3 &  3.684 & -0.6844 \tabularnewline
98 &  3 &  3.629 & -0.6286 \tabularnewline
99 &  5 &  3.523 &  1.477 \tabularnewline
100 &  5 &  3.79 &  1.21 \tabularnewline
101 &  5 &  3.883 &  1.117 \tabularnewline
102 &  4 &  3.721 &  0.279 \tabularnewline
103 &  4 &  3.85 &  0.15 \tabularnewline
104 &  4 &  3.942 &  0.05768 \tabularnewline
105 &  4 &  3.371 &  0.6293 \tabularnewline
106 &  3 &  3.172 & -0.1723 \tabularnewline
107 &  4 &  3.5 &  0.5004 \tabularnewline
108 &  3 &  3.523 & -0.5226 \tabularnewline
109 &  4 &  3.265 &  0.7353 \tabularnewline
110 &  2 &  3.652 & -1.652 \tabularnewline
111 &  1 &  3.652 & -2.652 \tabularnewline
112 &  5 &  3.324 &  1.676 \tabularnewline
113 &  4 &  3.721 &  0.279 \tabularnewline
114 &  3 &  3.43 & -0.4303 \tabularnewline
115 &  4 &  3.582 &  0.4178 \tabularnewline
116 &  4 &  3.813 &  0.1867 \tabularnewline
117 &  4 &  3.463 &  0.537 \tabularnewline
118 &  4 &  3.301 &  0.6987 \tabularnewline
119 &  2 &  3.721 & -1.721 \tabularnewline
120 &  4 &  3.79 &  0.2096 \tabularnewline
121 &  4 &  3.371 &  0.6293 \tabularnewline
122 &  4 &  3.463 &  0.537 \tabularnewline
123 &  3 &  3.278 & -0.2783 \tabularnewline
124 &  4 &  3.85 &  0.15 \tabularnewline
125 &  5 &  3.44 &  1.56 \tabularnewline
126 &  5 &  3.463 &  1.537 \tabularnewline
127 &  4 &  3.721 &  0.279 \tabularnewline
128 &  3 &  3.919 & -0.9193 \tabularnewline
129 &  2 &  3.536 & -1.536 \tabularnewline
130 &  2 &  3.754 & -1.754 \tabularnewline
131 &  2 &  3.361 & -1.361 \tabularnewline
132 &  4 &  3.334 &  0.6659 \tabularnewline
133 &  4 &  3.79 &  0.2096 \tabularnewline
134 &  5 &  3.781 &  1.219 \tabularnewline
135 &  3 &  3.813 & -0.8133 \tabularnewline
136 &  3 &  3.721 & -0.721 \tabularnewline
137 &  4 &  3.781 &  0.2194 \tabularnewline
138 &  4 &  3.698 &  0.302 \tabularnewline
139 &  4 &  3.642 &  0.3578 \tabularnewline
140 &  4 &  3.781 &  0.2194 \tabularnewline
141 &  4 &  3.546 &  0.4544 \tabularnewline
142 &  4 &  3.85 &  0.15 \tabularnewline
143 &  4 &  3.403 &  0.5966 \tabularnewline
144 &  2 &  3.523 & -1.523 \tabularnewline
145 &  5 &  3.486 &  1.514 \tabularnewline
146 &  4 &  3.463 &  0.537 \tabularnewline
147 &  4 &  3.582 &  0.4178 \tabularnewline
148 &  4 &  3.394 &  0.6063 \tabularnewline
149 &  4 &  3.781 &  0.2194 \tabularnewline
150 &  2 &  3.629 & -1.629 \tabularnewline
151 &  3 &  3.734 & -0.7342 \tabularnewline
152 &  3 &  3.5 & -0.4996 \tabularnewline
153 &  4 &  3.394 &  0.6063 \tabularnewline
154 &  4 &  3.652 &  0.3484 \tabularnewline
155 &  4 &  3.417 &  0.5834 \tabularnewline
156 &  4 &  3.523 &  0.4774 \tabularnewline
157 &  3 &  3.523 & -0.5226 \tabularnewline
158 &  3 &  3.348 & -0.3477 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298884&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]InterpolationForecast[/C][C]ResidualsPrediction Error[/C][/ROW]
[ROW][C]1[/C][C] 2[/C][C] 3.394[/C][C]-1.394[/C][/ROW]
[ROW][C]2[/C][C] 4[/C][C] 3.255[/C][C] 0.7451[/C][/ROW]
[ROW][C]3[/C][C] 4[/C][C] 3.532[/C][C] 0.4676[/C][/ROW]
[ROW][C]4[/C][C] 4[/C][C] 3.592[/C][C] 0.408[/C][/ROW]
[ROW][C]5[/C][C] 3[/C][C] 3.559[/C][C]-0.5592[/C][/ROW]
[ROW][C]6[/C][C] 4[/C][C] 3.546[/C][C] 0.4544[/C][/ROW]
[ROW][C]7[/C][C] 1[/C][C] 3.721[/C][C]-2.721[/C][/ROW]
[ROW][C]8[/C][C] 4[/C][C] 3.532[/C][C] 0.4676[/C][/ROW]
[ROW][C]9[/C][C] 4[/C][C] 3.652[/C][C] 0.3484[/C][/ROW]
[ROW][C]10[/C][C] 2[/C][C] 3.324[/C][C]-1.324[/C][/ROW]
[ROW][C]11[/C][C] 4[/C][C] 3.232[/C][C] 0.7681[/C][/ROW]
[ROW][C]12[/C][C] 4[/C][C] 3.758[/C][C] 0.2424[/C][/ROW]
[ROW][C]13[/C][C] 5[/C][C] 3.721[/C][C] 1.279[/C][/ROW]
[ROW][C]14[/C][C] 4[/C][C] 3.463[/C][C] 0.537[/C][/ROW]
[ROW][C]15[/C][C] 1[/C][C] 3.661[/C][C]-2.661[/C][/ROW]
[ROW][C]16[/C][C] 2[/C][C] 3.288[/C][C]-1.288[/C][/ROW]
[ROW][C]17[/C][C] 4[/C][C] 3.453[/C][C] 0.5467[/C][/ROW]
[ROW][C]18[/C][C] 5[/C][C] 3.721[/C][C] 1.279[/C][/ROW]
[ROW][C]19[/C][C] 5[/C][C] 3.85[/C][C] 1.15[/C][/ROW]
[ROW][C]20[/C][C] 4[/C][C] 3.85[/C][C] 0.15[/C][/ROW]
[ROW][C]21[/C][C] 1[/C][C] 3.403[/C][C]-2.403[/C][/ROW]
[ROW][C]22[/C][C] 4[/C][C] 3.652[/C][C] 0.3484[/C][/ROW]
[ROW][C]23[/C][C] 2[/C][C] 3.463[/C][C]-1.463[/C][/ROW]
[ROW][C]24[/C][C] 4[/C][C] 3.652[/C][C] 0.3484[/C][/ROW]
[ROW][C]25[/C][C] 5[/C][C] 3.559[/C][C] 1.441[/C][/ROW]
[ROW][C]26[/C][C] 3[/C][C] 3.43[/C][C]-0.4303[/C][/ROW]
[ROW][C]27[/C][C] 5[/C][C] 3.883[/C][C] 1.117[/C][/ROW]
[ROW][C]28[/C][C] 3[/C][C] 3.463[/C][C]-0.463[/C][/ROW]
[ROW][C]29[/C][C] 5[/C][C] 3.463[/C][C] 1.537[/C][/ROW]
[ROW][C]30[/C][C] 2[/C][C] 3.652[/C][C]-1.652[/C][/ROW]
[ROW][C]31[/C][C] 1[/C][C] 3.394[/C][C]-2.394[/C][/ROW]
[ROW][C]32[/C][C] 4[/C][C] 3.301[/C][C] 0.6987[/C][/ROW]
[ROW][C]33[/C][C] 4[/C][C] 3.652[/C][C] 0.3484[/C][/ROW]
[ROW][C]34[/C][C] 3[/C][C] 3.523[/C][C]-0.5226[/C][/ROW]
[ROW][C]35[/C][C] 5[/C][C] 3.754[/C][C] 1.246[/C][/ROW]
[ROW][C]36[/C][C] 4[/C][C] 3.652[/C][C] 0.3484[/C][/ROW]
[ROW][C]37[/C][C] 4[/C][C] 3.523[/C][C] 0.4774[/C][/ROW]
[ROW][C]38[/C][C] 2[/C][C] 3.371[/C][C]-1.371[/C][/ROW]
[ROW][C]39[/C][C] 3[/C][C] 3.652[/C][C]-0.6516[/C][/ROW]
[ROW][C]40[/C][C] 1[/C][C] 3.347[/C][C]-2.347[/C][/ROW]
[ROW][C]41[/C][C] 3[/C][C] 3.463[/C][C]-0.463[/C][/ROW]
[ROW][C]42[/C][C] 3[/C][C] 3.5[/C][C]-0.4996[/C][/ROW]
[ROW][C]43[/C][C] 3[/C][C] 3.43[/C][C]-0.4303[/C][/ROW]
[ROW][C]44[/C][C] 4[/C][C] 3.721[/C][C] 0.279[/C][/ROW]
[ROW][C]45[/C][C] 4[/C][C] 3.642[/C][C] 0.3578[/C][/ROW]
[ROW][C]46[/C][C] 4[/C][C] 3.721[/C][C] 0.279[/C][/ROW]
[ROW][C]47[/C][C] 4[/C][C] 3.721[/C][C] 0.279[/C][/ROW]
[ROW][C]48[/C][C] 2[/C][C] 3.652[/C][C]-1.652[/C][/ROW]
[ROW][C]49[/C][C] 5[/C][C] 3.324[/C][C] 1.676[/C][/ROW]
[ROW][C]50[/C][C] 3[/C][C] 3.371[/C][C]-0.3707[/C][/ROW]
[ROW][C]51[/C][C] 3[/C][C] 3.265[/C][C]-0.2647[/C][/ROW]
[ROW][C]52[/C][C] 4[/C][C] 3.43[/C][C] 0.5697[/C][/ROW]
[ROW][C]53[/C][C] 4[/C][C] 3.652[/C][C] 0.3484[/C][/ROW]
[ROW][C]54[/C][C] 4[/C][C] 3.407[/C][C] 0.5927[/C][/ROW]
[ROW][C]55[/C][C] 3[/C][C] 3.592[/C][C]-0.592[/C][/ROW]
[ROW][C]56[/C][C] 4[/C][C] 3.394[/C][C] 0.6063[/C][/ROW]
[ROW][C]57[/C][C] 4[/C][C] 3.592[/C][C] 0.408[/C][/ROW]
[ROW][C]58[/C][C] 4[/C][C] 3.44[/C][C] 0.56[/C][/ROW]
[ROW][C]59[/C][C] 4[/C][C] 3.582[/C][C] 0.4178[/C][/ROW]
[ROW][C]60[/C][C] 4[/C][C] 3.43[/C][C] 0.5697[/C][/ROW]
[ROW][C]61[/C][C] 2[/C][C] 3.394[/C][C]-1.394[/C][/ROW]
[ROW][C]62[/C][C] 4[/C][C] 3.559[/C][C] 0.4408[/C][/ROW]
[ROW][C]63[/C][C] 4[/C][C] 3.85[/C][C] 0.15[/C][/ROW]
[ROW][C]64[/C][C] 4[/C][C] 3.652[/C][C] 0.3484[/C][/ROW]
[ROW][C]65[/C][C] 4[/C][C] 3.592[/C][C] 0.408[/C][/ROW]
[ROW][C]66[/C][C] 4[/C][C] 3.394[/C][C] 0.6063[/C][/ROW]
[ROW][C]67[/C][C] 5[/C][C] 3.292[/C][C] 1.708[/C][/ROW]
[ROW][C]68[/C][C] 3[/C][C] 3.629[/C][C]-0.6286[/C][/ROW]
[ROW][C]69[/C][C] 2[/C][C] 3.467[/C][C]-1.467[/C][/ROW]
[ROW][C]70[/C][C] 4[/C][C] 3.582[/C][C] 0.4178[/C][/ROW]
[ROW][C]71[/C][C] 5[/C][C] 3.873[/C][C] 1.127[/C][/ROW]
[ROW][C]72[/C][C] 4[/C][C] 3.652[/C][C] 0.3484[/C][/ROW]
[ROW][C]73[/C][C] 5[/C][C] 3.813[/C][C] 1.187[/C][/ROW]
[ROW][C]74[/C][C] 5[/C][C] 3.606[/C][C] 1.394[/C][/ROW]
[ROW][C]75[/C][C] 2[/C][C] 3.523[/C][C]-1.523[/C][/ROW]
[ROW][C]76[/C][C] 4[/C][C] 3.463[/C][C] 0.537[/C][/ROW]
[ROW][C]77[/C][C] 4[/C][C] 3.582[/C][C] 0.4178[/C][/ROW]
[ROW][C]78[/C][C] 4[/C][C] 3.582[/C][C] 0.4178[/C][/ROW]
[ROW][C]79[/C][C] 3[/C][C] 3.675[/C][C]-0.6746[/C][/ROW]
[ROW][C]80[/C][C] 4[/C][C] 3.394[/C][C] 0.6063[/C][/ROW]
[ROW][C]81[/C][C] 2[/C][C] 3.463[/C][C]-1.463[/C][/ROW]
[ROW][C]82[/C][C] 5[/C][C] 3.265[/C][C] 1.735[/C][/ROW]
[ROW][C]83[/C][C] 4[/C][C] 3.444[/C][C] 0.5561[/C][/ROW]
[ROW][C]84[/C][C] 2[/C][C] 3.721[/C][C]-1.721[/C][/ROW]
[ROW][C]85[/C][C] 4[/C][C] 3.652[/C][C] 0.3484[/C][/ROW]
[ROW][C]86[/C][C] 3[/C][C] 3.5[/C][C]-0.4996[/C][/ROW]
[ROW][C]87[/C][C] 3[/C][C] 3.652[/C][C]-0.6516[/C][/ROW]
[ROW][C]88[/C][C] 4[/C][C] 3.79[/C][C] 0.2096[/C][/ROW]
[ROW][C]89[/C][C] 4[/C][C] 3.629[/C][C] 0.3714[/C][/ROW]
[ROW][C]90[/C][C] 4[/C][C] 3.371[/C][C] 0.6293[/C][/ROW]
[ROW][C]91[/C][C] 3[/C][C] 3.652[/C][C]-0.6516[/C][/ROW]
[ROW][C]92[/C][C] 4[/C][C] 3.813[/C][C] 0.1867[/C][/ROW]
[ROW][C]93[/C][C] 3[/C][C] 3.034[/C][C]-0.03356[/C][/ROW]
[ROW][C]94[/C][C] 3[/C][C] 3.721[/C][C]-0.721[/C][/ROW]
[ROW][C]95[/C][C] 1[/C][C] 3.371[/C][C]-2.371[/C][/ROW]
[ROW][C]96[/C][C] 4[/C][C] 3.592[/C][C] 0.408[/C][/ROW]
[ROW][C]97[/C][C] 3[/C][C] 3.684[/C][C]-0.6844[/C][/ROW]
[ROW][C]98[/C][C] 3[/C][C] 3.629[/C][C]-0.6286[/C][/ROW]
[ROW][C]99[/C][C] 5[/C][C] 3.523[/C][C] 1.477[/C][/ROW]
[ROW][C]100[/C][C] 5[/C][C] 3.79[/C][C] 1.21[/C][/ROW]
[ROW][C]101[/C][C] 5[/C][C] 3.883[/C][C] 1.117[/C][/ROW]
[ROW][C]102[/C][C] 4[/C][C] 3.721[/C][C] 0.279[/C][/ROW]
[ROW][C]103[/C][C] 4[/C][C] 3.85[/C][C] 0.15[/C][/ROW]
[ROW][C]104[/C][C] 4[/C][C] 3.942[/C][C] 0.05768[/C][/ROW]
[ROW][C]105[/C][C] 4[/C][C] 3.371[/C][C] 0.6293[/C][/ROW]
[ROW][C]106[/C][C] 3[/C][C] 3.172[/C][C]-0.1723[/C][/ROW]
[ROW][C]107[/C][C] 4[/C][C] 3.5[/C][C] 0.5004[/C][/ROW]
[ROW][C]108[/C][C] 3[/C][C] 3.523[/C][C]-0.5226[/C][/ROW]
[ROW][C]109[/C][C] 4[/C][C] 3.265[/C][C] 0.7353[/C][/ROW]
[ROW][C]110[/C][C] 2[/C][C] 3.652[/C][C]-1.652[/C][/ROW]
[ROW][C]111[/C][C] 1[/C][C] 3.652[/C][C]-2.652[/C][/ROW]
[ROW][C]112[/C][C] 5[/C][C] 3.324[/C][C] 1.676[/C][/ROW]
[ROW][C]113[/C][C] 4[/C][C] 3.721[/C][C] 0.279[/C][/ROW]
[ROW][C]114[/C][C] 3[/C][C] 3.43[/C][C]-0.4303[/C][/ROW]
[ROW][C]115[/C][C] 4[/C][C] 3.582[/C][C] 0.4178[/C][/ROW]
[ROW][C]116[/C][C] 4[/C][C] 3.813[/C][C] 0.1867[/C][/ROW]
[ROW][C]117[/C][C] 4[/C][C] 3.463[/C][C] 0.537[/C][/ROW]
[ROW][C]118[/C][C] 4[/C][C] 3.301[/C][C] 0.6987[/C][/ROW]
[ROW][C]119[/C][C] 2[/C][C] 3.721[/C][C]-1.721[/C][/ROW]
[ROW][C]120[/C][C] 4[/C][C] 3.79[/C][C] 0.2096[/C][/ROW]
[ROW][C]121[/C][C] 4[/C][C] 3.371[/C][C] 0.6293[/C][/ROW]
[ROW][C]122[/C][C] 4[/C][C] 3.463[/C][C] 0.537[/C][/ROW]
[ROW][C]123[/C][C] 3[/C][C] 3.278[/C][C]-0.2783[/C][/ROW]
[ROW][C]124[/C][C] 4[/C][C] 3.85[/C][C] 0.15[/C][/ROW]
[ROW][C]125[/C][C] 5[/C][C] 3.44[/C][C] 1.56[/C][/ROW]
[ROW][C]126[/C][C] 5[/C][C] 3.463[/C][C] 1.537[/C][/ROW]
[ROW][C]127[/C][C] 4[/C][C] 3.721[/C][C] 0.279[/C][/ROW]
[ROW][C]128[/C][C] 3[/C][C] 3.919[/C][C]-0.9193[/C][/ROW]
[ROW][C]129[/C][C] 2[/C][C] 3.536[/C][C]-1.536[/C][/ROW]
[ROW][C]130[/C][C] 2[/C][C] 3.754[/C][C]-1.754[/C][/ROW]
[ROW][C]131[/C][C] 2[/C][C] 3.361[/C][C]-1.361[/C][/ROW]
[ROW][C]132[/C][C] 4[/C][C] 3.334[/C][C] 0.6659[/C][/ROW]
[ROW][C]133[/C][C] 4[/C][C] 3.79[/C][C] 0.2096[/C][/ROW]
[ROW][C]134[/C][C] 5[/C][C] 3.781[/C][C] 1.219[/C][/ROW]
[ROW][C]135[/C][C] 3[/C][C] 3.813[/C][C]-0.8133[/C][/ROW]
[ROW][C]136[/C][C] 3[/C][C] 3.721[/C][C]-0.721[/C][/ROW]
[ROW][C]137[/C][C] 4[/C][C] 3.781[/C][C] 0.2194[/C][/ROW]
[ROW][C]138[/C][C] 4[/C][C] 3.698[/C][C] 0.302[/C][/ROW]
[ROW][C]139[/C][C] 4[/C][C] 3.642[/C][C] 0.3578[/C][/ROW]
[ROW][C]140[/C][C] 4[/C][C] 3.781[/C][C] 0.2194[/C][/ROW]
[ROW][C]141[/C][C] 4[/C][C] 3.546[/C][C] 0.4544[/C][/ROW]
[ROW][C]142[/C][C] 4[/C][C] 3.85[/C][C] 0.15[/C][/ROW]
[ROW][C]143[/C][C] 4[/C][C] 3.403[/C][C] 0.5966[/C][/ROW]
[ROW][C]144[/C][C] 2[/C][C] 3.523[/C][C]-1.523[/C][/ROW]
[ROW][C]145[/C][C] 5[/C][C] 3.486[/C][C] 1.514[/C][/ROW]
[ROW][C]146[/C][C] 4[/C][C] 3.463[/C][C] 0.537[/C][/ROW]
[ROW][C]147[/C][C] 4[/C][C] 3.582[/C][C] 0.4178[/C][/ROW]
[ROW][C]148[/C][C] 4[/C][C] 3.394[/C][C] 0.6063[/C][/ROW]
[ROW][C]149[/C][C] 4[/C][C] 3.781[/C][C] 0.2194[/C][/ROW]
[ROW][C]150[/C][C] 2[/C][C] 3.629[/C][C]-1.629[/C][/ROW]
[ROW][C]151[/C][C] 3[/C][C] 3.734[/C][C]-0.7342[/C][/ROW]
[ROW][C]152[/C][C] 3[/C][C] 3.5[/C][C]-0.4996[/C][/ROW]
[ROW][C]153[/C][C] 4[/C][C] 3.394[/C][C] 0.6063[/C][/ROW]
[ROW][C]154[/C][C] 4[/C][C] 3.652[/C][C] 0.3484[/C][/ROW]
[ROW][C]155[/C][C] 4[/C][C] 3.417[/C][C] 0.5834[/C][/ROW]
[ROW][C]156[/C][C] 4[/C][C] 3.523[/C][C] 0.4774[/C][/ROW]
[ROW][C]157[/C][C] 3[/C][C] 3.523[/C][C]-0.5226[/C][/ROW]
[ROW][C]158[/C][C] 3[/C][C] 3.348[/C][C]-0.3477[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298884&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298884&T=4








Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index	greater	2-sided	less
7 0.9239 0.1522 0.07611
8 0.869 0.262 0.131
9 0.8718 0.2565 0.1282
10 0.9011 0.1978 0.09889
11 0.877 0.246 0.123
12 0.8597 0.2805 0.1403
13 0.8997 0.2005 0.1003
14 0.8629 0.2742 0.1371
15 0.9675 0.06507 0.03253
16 0.9656 0.06875 0.03438
17 0.9536 0.09288 0.04644
18 0.9649 0.07026 0.03513
19 0.9625 0.07503 0.03752
20 0.9455 0.1089 0.05446
21 0.9656 0.06879 0.0344
22 0.9509 0.09812 0.04906
23 0.948 0.1041 0.05203
24 0.9287 0.1427 0.07133
25 0.9301 0.1398 0.0699
26 0.9129 0.1741 0.08707
27 0.921 0.158 0.07898
28 0.8991 0.2019 0.1009
29 0.9506 0.09879 0.0494
30 0.975 0.04998 0.02499
31 0.9914 0.01724 0.008622
32 0.9907 0.01863 0.009315
33 0.9868 0.02643 0.01322
34 0.9825 0.03499 0.01749
35 0.9867 0.02662 0.01331
36 0.9816 0.03681 0.0184
37 0.9764 0.04728 0.02364
38 0.9769 0.04629 0.02314
39 0.974 0.05201 0.026
40 0.9911 0.01779 0.008895
41 0.9882 0.02368 0.01184
42 0.9846 0.03086 0.01543
43 0.9797 0.0406 0.0203
44 0.9729 0.05425 0.02712
45 0.9661 0.06782 0.03391
46 0.9559 0.08816 0.04408
47 0.9435 0.113 0.0565
48 0.9657 0.06858 0.03429
49 0.9837 0.03268 0.01634
50 0.9787 0.0425 0.02125
51 0.9741 0.05182 0.02591
52 0.9691 0.06184 0.03092
53 0.9604 0.07912 0.03956
54 0.9535 0.09298 0.04649
55 0.9449 0.1102 0.05508
56 0.9395 0.121 0.06052
57 0.9271 0.1457 0.07287
58 0.9175 0.165 0.08248
59 0.9005 0.1989 0.09946
60 0.8852 0.2296 0.1148
61 0.9032 0.1936 0.09681
62 0.8852 0.2296 0.1148
63 0.863 0.274 0.137
64 0.8383 0.3235 0.1617
65 0.8138 0.3724 0.1862
66 0.799 0.4019 0.201
67 0.8524 0.2953 0.1476
68 0.8393 0.3215 0.1607
69 0.8757 0.2486 0.1243
70 0.8546 0.2908 0.1454
71 0.8585 0.2829 0.1415
72 0.8344 0.3311 0.1656
73 0.8422 0.3156 0.1578
74 0.8716 0.2568 0.1284
75 0.9032 0.1935 0.09676
76 0.8904 0.2192 0.1096
77 0.8721 0.2558 0.1279
78 0.8524 0.2952 0.1476
79 0.8393 0.3214 0.1607
80 0.8228 0.3545 0.1772
81 0.8639 0.2721 0.1361
82 0.9096 0.1808 0.09041
83 0.9057 0.1886 0.09429
84 0.9404 0.1192 0.05958
85 0.9281 0.1438 0.07189
86 0.9148 0.1704 0.08518
87 0.9035 0.193 0.09651
88 0.8827 0.2347 0.1173
89 0.8652 0.2696 0.1348
90 0.8492 0.3015 0.1508
91 0.8319 0.3363 0.1681
92 0.801 0.398 0.199
93 0.7675 0.465 0.2325
94 0.7493 0.5015 0.2507
95 0.9034 0.1933 0.09665
96 0.8842 0.2316 0.1158
97 0.8818 0.2363 0.1182
98 0.8646 0.2708 0.1354
99 0.8927 0.2147 0.1073
100 0.9041 0.1919 0.09593
101 0.9078 0.1843 0.09217
102 0.8889 0.2223 0.1111
103 0.87 0.26 0.13
104 0.8453 0.3094 0.1547
105 0.8241 0.3518 0.1759
106 0.8034 0.3931 0.1966
107 0.777 0.4459 0.223
108 0.7525 0.4951 0.2475
109 0.725 0.5499 0.275
110 0.7899 0.4202 0.2101
111 0.9505 0.09893 0.04947
112 0.964 0.07199 0.03599
113 0.9538 0.09248 0.04624
114 0.943 0.114 0.05702
115 0.9297 0.1407 0.07033
116 0.9101 0.1799 0.08994
117 0.889 0.2221 0.111
118 0.8696 0.2609 0.1304
119 0.9214 0.1572 0.07862
120 0.8997 0.2005 0.1003
121 0.8797 0.2407 0.1203
122 0.8521 0.2958 0.1479
123 0.82 0.36 0.18
124 0.7856 0.4289 0.2144
125 0.8384 0.3232 0.1616
126 0.8793 0.2414 0.1207
127 0.8522 0.2957 0.1478
128 0.8317 0.3365 0.1683
129 0.8638 0.2723 0.1362
130 0.9545 0.09105 0.04552
131 0.9674 0.06517 0.03258
132 0.9549 0.09023 0.04512
133 0.9364 0.1272 0.06361
134 0.9604 0.07921 0.0396
135 0.9629 0.07414 0.03707
136 0.964 0.07201 0.03601
137 0.9469 0.1062 0.05309
138 0.923 0.154 0.07701
139 0.9706 0.05889 0.02944
140 0.9602 0.07957 0.03979
141 0.937 0.126 0.063
142 0.9105 0.179 0.08948
143 0.8737 0.2526 0.1263
144 0.9703 0.05936 0.02968
145 0.9561 0.08774 0.04387
146 0.9215 0.1571 0.07853
147 0.9418 0.1163 0.05817
148 0.8971 0.2058 0.1029
149 0.885 0.2299 0.115
150 0.9347 0.1307 0.06534
151 0.8427 0.3145 0.1573
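
Each row above is one Goldfeld-Quandt test with the sample split at the given breakpoint index; the three columns are the p-values under the 'greater', 'two.sided' and 'less' alternatives. A hedged sketch of how such a table could be produced with gqtest() from the lmtest package (the module's exact ordering and options are not shown here, so small numerical differences are possible):

library(lmtest)
gq <- t(sapply(7:151, function(bp)
  sapply(c("greater", "two.sided", "less"), function(alt)
    gqtest(IVHB1 ~ IVHB2 + IVHB3 + IVHB4, data = x,
           point = bp, alternative = alt)$p.value)))
head(gq)   # one row per breakpoint 7, 8, ..., 151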

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
7 &  0.9239 &  0.1522 &  0.07611 \tabularnewline
8 &  0.869 &  0.262 &  0.131 \tabularnewline
9 &  0.8718 &  0.2565 &  0.1282 \tabularnewline
10 &  0.9011 &  0.1978 &  0.09889 \tabularnewline
11 &  0.877 &  0.246 &  0.123 \tabularnewline
12 &  0.8597 &  0.2805 &  0.1403 \tabularnewline
13 &  0.8997 &  0.2005 &  0.1003 \tabularnewline
14 &  0.8629 &  0.2742 &  0.1371 \tabularnewline
15 &  0.9675 &  0.06507 &  0.03253 \tabularnewline
16 &  0.9656 &  0.06875 &  0.03438 \tabularnewline
17 &  0.9536 &  0.09288 &  0.04644 \tabularnewline
18 &  0.9649 &  0.07026 &  0.03513 \tabularnewline
19 &  0.9625 &  0.07503 &  0.03752 \tabularnewline
20 &  0.9455 &  0.1089 &  0.05446 \tabularnewline
21 &  0.9656 &  0.06879 &  0.0344 \tabularnewline
22 &  0.9509 &  0.09812 &  0.04906 \tabularnewline
23 &  0.948 &  0.1041 &  0.05203 \tabularnewline
24 &  0.9287 &  0.1427 &  0.07133 \tabularnewline
25 &  0.9301 &  0.1398 &  0.0699 \tabularnewline
26 &  0.9129 &  0.1741 &  0.08707 \tabularnewline
27 &  0.921 &  0.158 &  0.07898 \tabularnewline
28 &  0.8991 &  0.2019 &  0.1009 \tabularnewline
29 &  0.9506 &  0.09879 &  0.0494 \tabularnewline
30 &  0.975 &  0.04998 &  0.02499 \tabularnewline
31 &  0.9914 &  0.01724 &  0.008622 \tabularnewline
32 &  0.9907 &  0.01863 &  0.009315 \tabularnewline
33 &  0.9868 &  0.02643 &  0.01322 \tabularnewline
34 &  0.9825 &  0.03499 &  0.01749 \tabularnewline
35 &  0.9867 &  0.02662 &  0.01331 \tabularnewline
36 &  0.9816 &  0.03681 &  0.0184 \tabularnewline
37 &  0.9764 &  0.04728 &  0.02364 \tabularnewline
38 &  0.9769 &  0.04629 &  0.02314 \tabularnewline
39 &  0.974 &  0.05201 &  0.026 \tabularnewline
40 &  0.9911 &  0.01779 &  0.008895 \tabularnewline
41 &  0.9882 &  0.02368 &  0.01184 \tabularnewline
42 &  0.9846 &  0.03086 &  0.01543 \tabularnewline
43 &  0.9797 &  0.0406 &  0.0203 \tabularnewline
44 &  0.9729 &  0.05425 &  0.02712 \tabularnewline
45 &  0.9661 &  0.06782 &  0.03391 \tabularnewline
46 &  0.9559 &  0.08816 &  0.04408 \tabularnewline
47 &  0.9435 &  0.113 &  0.0565 \tabularnewline
48 &  0.9657 &  0.06858 &  0.03429 \tabularnewline
49 &  0.9837 &  0.03268 &  0.01634 \tabularnewline
50 &  0.9787 &  0.0425 &  0.02125 \tabularnewline
51 &  0.9741 &  0.05182 &  0.02591 \tabularnewline
52 &  0.9691 &  0.06184 &  0.03092 \tabularnewline
53 &  0.9604 &  0.07912 &  0.03956 \tabularnewline
54 &  0.9535 &  0.09298 &  0.04649 \tabularnewline
55 &  0.9449 &  0.1102 &  0.05508 \tabularnewline
56 &  0.9395 &  0.121 &  0.06052 \tabularnewline
57 &  0.9271 &  0.1457 &  0.07287 \tabularnewline
58 &  0.9175 &  0.165 &  0.08248 \tabularnewline
59 &  0.9005 &  0.1989 &  0.09946 \tabularnewline
60 &  0.8852 &  0.2296 &  0.1148 \tabularnewline
61 &  0.9032 &  0.1936 &  0.09681 \tabularnewline
62 &  0.8852 &  0.2296 &  0.1148 \tabularnewline
63 &  0.863 &  0.274 &  0.137 \tabularnewline
64 &  0.8383 &  0.3235 &  0.1617 \tabularnewline
65 &  0.8138 &  0.3724 &  0.1862 \tabularnewline
66 &  0.799 &  0.4019 &  0.201 \tabularnewline
67 &  0.8524 &  0.2953 &  0.1476 \tabularnewline
68 &  0.8393 &  0.3215 &  0.1607 \tabularnewline
69 &  0.8757 &  0.2486 &  0.1243 \tabularnewline
70 &  0.8546 &  0.2908 &  0.1454 \tabularnewline
71 &  0.8585 &  0.2829 &  0.1415 \tabularnewline
72 &  0.8344 &  0.3311 &  0.1656 \tabularnewline
73 &  0.8422 &  0.3156 &  0.1578 \tabularnewline
74 &  0.8716 &  0.2568 &  0.1284 \tabularnewline
75 &  0.9032 &  0.1935 &  0.09676 \tabularnewline
76 &  0.8904 &  0.2192 &  0.1096 \tabularnewline
77 &  0.8721 &  0.2558 &  0.1279 \tabularnewline
78 &  0.8524 &  0.2952 &  0.1476 \tabularnewline
79 &  0.8393 &  0.3214 &  0.1607 \tabularnewline
80 &  0.8228 &  0.3545 &  0.1772 \tabularnewline
81 &  0.8639 &  0.2721 &  0.1361 \tabularnewline
82 &  0.9096 &  0.1808 &  0.09041 \tabularnewline
83 &  0.9057 &  0.1886 &  0.09429 \tabularnewline
84 &  0.9404 &  0.1192 &  0.05958 \tabularnewline
85 &  0.9281 &  0.1438 &  0.07189 \tabularnewline
86 &  0.9148 &  0.1704 &  0.08518 \tabularnewline
87 &  0.9035 &  0.193 &  0.09651 \tabularnewline
88 &  0.8827 &  0.2347 &  0.1173 \tabularnewline
89 &  0.8652 &  0.2696 &  0.1348 \tabularnewline
90 &  0.8492 &  0.3015 &  0.1508 \tabularnewline
91 &  0.8319 &  0.3363 &  0.1681 \tabularnewline
92 &  0.801 &  0.398 &  0.199 \tabularnewline
93 &  0.7675 &  0.465 &  0.2325 \tabularnewline
94 &  0.7493 &  0.5015 &  0.2507 \tabularnewline
95 &  0.9034 &  0.1933 &  0.09665 \tabularnewline
96 &  0.8842 &  0.2316 &  0.1158 \tabularnewline
97 &  0.8818 &  0.2363 &  0.1182 \tabularnewline
98 &  0.8646 &  0.2708 &  0.1354 \tabularnewline
99 &  0.8927 &  0.2147 &  0.1073 \tabularnewline
100 &  0.9041 &  0.1919 &  0.09593 \tabularnewline
101 &  0.9078 &  0.1843 &  0.09217 \tabularnewline
102 &  0.8889 &  0.2223 &  0.1111 \tabularnewline
103 &  0.87 &  0.26 &  0.13 \tabularnewline
104 &  0.8453 &  0.3094 &  0.1547 \tabularnewline
105 &  0.8241 &  0.3518 &  0.1759 \tabularnewline
106 &  0.8034 &  0.3931 &  0.1966 \tabularnewline
107 &  0.777 &  0.4459 &  0.223 \tabularnewline
108 &  0.7525 &  0.4951 &  0.2475 \tabularnewline
109 &  0.725 &  0.5499 &  0.275 \tabularnewline
110 &  0.7899 &  0.4202 &  0.2101 \tabularnewline
111 &  0.9505 &  0.09893 &  0.04947 \tabularnewline
112 &  0.964 &  0.07199 &  0.03599 \tabularnewline
113 &  0.9538 &  0.09248 &  0.04624 \tabularnewline
114 &  0.943 &  0.114 &  0.05702 \tabularnewline
115 &  0.9297 &  0.1407 &  0.07033 \tabularnewline
116 &  0.9101 &  0.1799 &  0.08994 \tabularnewline
117 &  0.889 &  0.2221 &  0.111 \tabularnewline
118 &  0.8696 &  0.2609 &  0.1304 \tabularnewline
119 &  0.9214 &  0.1572 &  0.07862 \tabularnewline
120 &  0.8997 &  0.2005 &  0.1003 \tabularnewline
121 &  0.8797 &  0.2407 &  0.1203 \tabularnewline
122 &  0.8521 &  0.2958 &  0.1479 \tabularnewline
123 &  0.82 &  0.36 &  0.18 \tabularnewline
124 &  0.7856 &  0.4289 &  0.2144 \tabularnewline
125 &  0.8384 &  0.3232 &  0.1616 \tabularnewline
126 &  0.8793 &  0.2414 &  0.1207 \tabularnewline
127 &  0.8522 &  0.2957 &  0.1478 \tabularnewline
128 &  0.8317 &  0.3365 &  0.1683 \tabularnewline
129 &  0.8638 &  0.2723 &  0.1362 \tabularnewline
130 &  0.9545 &  0.09105 &  0.04552 \tabularnewline
131 &  0.9674 &  0.06517 &  0.03258 \tabularnewline
132 &  0.9549 &  0.09023 &  0.04512 \tabularnewline
133 &  0.9364 &  0.1272 &  0.06361 \tabularnewline
134 &  0.9604 &  0.07921 &  0.0396 \tabularnewline
135 &  0.9629 &  0.07414 &  0.03707 \tabularnewline
136 &  0.964 &  0.07201 &  0.03601 \tabularnewline
137 &  0.9469 &  0.1062 &  0.05309 \tabularnewline
138 &  0.923 &  0.154 &  0.07701 \tabularnewline
139 &  0.9706 &  0.05889 &  0.02944 \tabularnewline
140 &  0.9602 &  0.07957 &  0.03979 \tabularnewline
141 &  0.937 &  0.126 &  0.063 \tabularnewline
142 &  0.9105 &  0.179 &  0.08948 \tabularnewline
143 &  0.8737 &  0.2526 &  0.1263 \tabularnewline
144 &  0.9703 &  0.05936 &  0.02968 \tabularnewline
145 &  0.9561 &  0.08774 &  0.04387 \tabularnewline
146 &  0.9215 &  0.1571 &  0.07853 \tabularnewline
147 &  0.9418 &  0.1163 &  0.05817 \tabularnewline
148 &  0.8971 &  0.2058 &  0.1029 \tabularnewline
149 &  0.885 &  0.2299 &  0.115 \tabularnewline
150 &  0.9347 &  0.1307 &  0.06534 \tabularnewline
151 &  0.8427 &  0.3145 &  0.1573 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=298884&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]7[/C][C] 0.9239[/C][C] 0.1522[/C][C] 0.07611[/C][/ROW]
[ROW][C]8[/C][C] 0.869[/C][C] 0.262[/C][C] 0.131[/C][/ROW]
[ROW][C]9[/C][C] 0.8718[/C][C] 0.2565[/C][C] 0.1282[/C][/ROW]
[ROW][C]10[/C][C] 0.9011[/C][C] 0.1978[/C][C] 0.09889[/C][/ROW]
[ROW][C]11[/C][C] 0.877[/C][C] 0.246[/C][C] 0.123[/C][/ROW]
[ROW][C]12[/C][C] 0.8597[/C][C] 0.2805[/C][C] 0.1403[/C][/ROW]
[ROW][C]13[/C][C] 0.8997[/C][C] 0.2005[/C][C] 0.1003[/C][/ROW]
[ROW][C]14[/C][C] 0.8629[/C][C] 0.2742[/C][C] 0.1371[/C][/ROW]
[ROW][C]15[/C][C] 0.9675[/C][C] 0.06507[/C][C] 0.03253[/C][/ROW]
[ROW][C]16[/C][C] 0.9656[/C][C] 0.06875[/C][C] 0.03438[/C][/ROW]
[ROW][C]17[/C][C] 0.9536[/C][C] 0.09288[/C][C] 0.04644[/C][/ROW]
[ROW][C]18[/C][C] 0.9649[/C][C] 0.07026[/C][C] 0.03513[/C][/ROW]
[ROW][C]19[/C][C] 0.9625[/C][C] 0.07503[/C][C] 0.03752[/C][/ROW]
[ROW][C]20[/C][C] 0.9455[/C][C] 0.1089[/C][C] 0.05446[/C][/ROW]
[ROW][C]21[/C][C] 0.9656[/C][C] 0.06879[/C][C] 0.0344[/C][/ROW]
[ROW][C]22[/C][C] 0.9509[/C][C] 0.09812[/C][C] 0.04906[/C][/ROW]
[ROW][C]23[/C][C] 0.948[/C][C] 0.1041[/C][C] 0.05203[/C][/ROW]
[ROW][C]24[/C][C] 0.9287[/C][C] 0.1427[/C][C] 0.07133[/C][/ROW]
[ROW][C]25[/C][C] 0.9301[/C][C] 0.1398[/C][C] 0.0699[/C][/ROW]
[ROW][C]26[/C][C] 0.9129[/C][C] 0.1741[/C][C] 0.08707[/C][/ROW]
[ROW][C]27[/C][C] 0.921[/C][C] 0.158[/C][C] 0.07898[/C][/ROW]
[ROW][C]28[/C][C] 0.8991[/C][C] 0.2019[/C][C] 0.1009[/C][/ROW]
[ROW][C]29[/C][C] 0.9506[/C][C] 0.09879[/C][C] 0.0494[/C][/ROW]
[ROW][C]30[/C][C] 0.975[/C][C] 0.04998[/C][C] 0.02499[/C][/ROW]
[ROW][C]31[/C][C] 0.9914[/C][C] 0.01724[/C][C] 0.008622[/C][/ROW]
[ROW][C]32[/C][C] 0.9907[/C][C] 0.01863[/C][C] 0.009315[/C][/ROW]
[ROW][C]33[/C][C] 0.9868[/C][C] 0.02643[/C][C] 0.01322[/C][/ROW]
[ROW][C]34[/C][C] 0.9825[/C][C] 0.03499[/C][C] 0.01749[/C][/ROW]
[ROW][C]35[/C][C] 0.9867[/C][C] 0.02662[/C][C] 0.01331[/C][/ROW]
[ROW][C]36[/C][C] 0.9816[/C][C] 0.03681[/C][C] 0.0184[/C][/ROW]
[ROW][C]37[/C][C] 0.9764[/C][C] 0.04728[/C][C] 0.02364[/C][/ROW]
[ROW][C]38[/C][C] 0.9769[/C][C] 0.04629[/C][C] 0.02314[/C][/ROW]
[ROW][C]39[/C][C] 0.974[/C][C] 0.05201[/C][C] 0.026[/C][/ROW]
[ROW][C]40[/C][C] 0.9911[/C][C] 0.01779[/C][C] 0.008895[/C][/ROW]
[ROW][C]41[/C][C] 0.9882[/C][C] 0.02368[/C][C] 0.01184[/C][/ROW]
[ROW][C]42[/C][C] 0.9846[/C][C] 0.03086[/C][C] 0.01543[/C][/ROW]
[ROW][C]43[/C][C] 0.9797[/C][C] 0.0406[/C][C] 0.0203[/C][/ROW]
[ROW][C]44[/C][C] 0.9729[/C][C] 0.05425[/C][C] 0.02712[/C][/ROW]
[ROW][C]45[/C][C] 0.9661[/C][C] 0.06782[/C][C] 0.03391[/C][/ROW]
[ROW][C]46[/C][C] 0.9559[/C][C] 0.08816[/C][C] 0.04408[/C][/ROW]
[ROW][C]47[/C][C] 0.9435[/C][C] 0.113[/C][C] 0.0565[/C][/ROW]
[ROW][C]48[/C][C] 0.9657[/C][C] 0.06858[/C][C] 0.03429[/C][/ROW]
[ROW][C]49[/C][C] 0.9837[/C][C] 0.03268[/C][C] 0.01634[/C][/ROW]
[ROW][C]50[/C][C] 0.9787[/C][C] 0.0425[/C][C] 0.02125[/C][/ROW]
[ROW][C]51[/C][C] 0.9741[/C][C] 0.05182[/C][C] 0.02591[/C][/ROW]
[ROW][C]52[/C][C] 0.9691[/C][C] 0.06184[/C][C] 0.03092[/C][/ROW]
[ROW][C]53[/C][C] 0.9604[/C][C] 0.07912[/C][C] 0.03956[/C][/ROW]
[ROW][C]54[/C][C] 0.9535[/C][C] 0.09298[/C][C] 0.04649[/C][/ROW]
[ROW][C]55[/C][C] 0.9449[/C][C] 0.1102[/C][C] 0.05508[/C][/ROW]
[ROW][C]56[/C][C] 0.9395[/C][C] 0.121[/C][C] 0.06052[/C][/ROW]
[ROW][C]57[/C][C] 0.9271[/C][C] 0.1457[/C][C] 0.07287[/C][/ROW]
[ROW][C]58[/C][C] 0.9175[/C][C] 0.165[/C][C] 0.08248[/C][/ROW]
[ROW][C]59[/C][C] 0.9005[/C][C] 0.1989[/C][C] 0.09946[/C][/ROW]
[ROW][C]60[/C][C] 0.8852[/C][C] 0.2296[/C][C] 0.1148[/C][/ROW]
[ROW][C]61[/C][C] 0.9032[/C][C] 0.1936[/C][C] 0.09681[/C][/ROW]
[ROW][C]62[/C][C] 0.8852[/C][C] 0.2296[/C][C] 0.1148[/C][/ROW]
[ROW][C]63[/C][C] 0.863[/C][C] 0.274[/C][C] 0.137[/C][/ROW]
[ROW][C]64[/C][C] 0.8383[/C][C] 0.3235[/C][C] 0.1617[/C][/ROW]
[ROW][C]65[/C][C] 0.8138[/C][C] 0.3724[/C][C] 0.1862[/C][/ROW]
[ROW][C]66[/C][C] 0.799[/C][C] 0.4019[/C][C] 0.201[/C][/ROW]
[ROW][C]67[/C][C] 0.8524[/C][C] 0.2953[/C][C] 0.1476[/C][/ROW]
[ROW][C]68[/C][C] 0.8393[/C][C] 0.3215[/C][C] 0.1607[/C][/ROW]
[ROW][C]69[/C][C] 0.8757[/C][C] 0.2486[/C][C] 0.1243[/C][/ROW]
[ROW][C]70[/C][C] 0.8546[/C][C] 0.2908[/C][C] 0.1454[/C][/ROW]
[ROW][C]71[/C][C] 0.8585[/C][C] 0.2829[/C][C] 0.1415[/C][/ROW]
[ROW][C]72[/C][C] 0.8344[/C][C] 0.3311[/C][C] 0.1656[/C][/ROW]
[ROW][C]73[/C][C] 0.8422[/C][C] 0.3156[/C][C] 0.1578[/C][/ROW]
[ROW][C]74[/C][C] 0.8716[/C][C] 0.2568[/C][C] 0.1284[/C][/ROW]
[ROW][C]75[/C][C] 0.9032[/C][C] 0.1935[/C][C] 0.09676[/C][/ROW]
[ROW][C]76[/C][C] 0.8904[/C][C] 0.2192[/C][C] 0.1096[/C][/ROW]
[ROW][C]77[/C][C] 0.8721[/C][C] 0.2558[/C][C] 0.1279[/C][/ROW]
[ROW][C]78[/C][C] 0.8524[/C][C] 0.2952[/C][C] 0.1476[/C][/ROW]
[ROW][C]79[/C][C] 0.8393[/C][C] 0.3214[/C][C] 0.1607[/C][/ROW]
[ROW][C]80[/C][C] 0.8228[/C][C] 0.3545[/C][C] 0.1772[/C][/ROW]
[ROW][C]81[/C][C] 0.8639[/C][C] 0.2721[/C][C] 0.1361[/C][/ROW]
[ROW][C]82[/C][C] 0.9096[/C][C] 0.1808[/C][C] 0.09041[/C][/ROW]
[ROW][C]83[/C][C] 0.9057[/C][C] 0.1886[/C][C] 0.09429[/C][/ROW]
[ROW][C]84[/C][C] 0.9404[/C][C] 0.1192[/C][C] 0.05958[/C][/ROW]
[ROW][C]85[/C][C] 0.9281[/C][C] 0.1438[/C][C] 0.07189[/C][/ROW]
[ROW][C]86[/C][C] 0.9148[/C][C] 0.1704[/C][C] 0.08518[/C][/ROW]
[ROW][C]87[/C][C] 0.9035[/C][C] 0.193[/C][C] 0.09651[/C][/ROW]
[ROW][C]88[/C][C] 0.8827[/C][C] 0.2347[/C][C] 0.1173[/C][/ROW]
[ROW][C]89[/C][C] 0.8652[/C][C] 0.2696[/C][C] 0.1348[/C][/ROW]
[ROW][C]90[/C][C] 0.8492[/C][C] 0.3015[/C][C] 0.1508[/C][/ROW]
[ROW][C]91[/C][C] 0.8319[/C][C] 0.3363[/C][C] 0.1681[/C][/ROW]
[ROW][C]92[/C][C] 0.801[/C][C] 0.398[/C][C] 0.199[/C][/ROW]
[ROW][C]93[/C][C] 0.7675[/C][C] 0.465[/C][C] 0.2325[/C][/ROW]
[ROW][C]94[/C][C] 0.7493[/C][C] 0.5015[/C][C] 0.2507[/C][/ROW]
[ROW][C]95[/C][C] 0.9034[/C][C] 0.1933[/C][C] 0.09665[/C][/ROW]
[ROW][C]96[/C][C] 0.8842[/C][C] 0.2316[/C][C] 0.1158[/C][/ROW]
[ROW][C]97[/C][C] 0.8818[/C][C] 0.2363[/C][C] 0.1182[/C][/ROW]
[ROW][C]98[/C][C] 0.8646[/C][C] 0.2708[/C][C] 0.1354[/C][/ROW]
[ROW][C]99[/C][C] 0.8927[/C][C] 0.2147[/C][C] 0.1073[/C][/ROW]
[ROW][C]100[/C][C] 0.9041[/C][C] 0.1919[/C][C] 0.09593[/C][/ROW]
[ROW][C]101[/C][C] 0.9078[/C][C] 0.1843[/C][C] 0.09217[/C][/ROW]
[ROW][C]102[/C][C] 0.8889[/C][C] 0.2223[/C][C] 0.1111[/C][/ROW]
[ROW][C]103[/C][C] 0.87[/C][C] 0.26[/C][C] 0.13[/C][/ROW]
[ROW][C]104[/C][C] 0.8453[/C][C] 0.3094[/C][C] 0.1547[/C][/ROW]
[ROW][C]105[/C][C] 0.8241[/C][C] 0.3518[/C][C] 0.1759[/C][/ROW]
[ROW][C]106[/C][C] 0.8034[/C][C] 0.3931[/C][C] 0.1966[/C][/ROW]
[ROW][C]107[/C][C] 0.777[/C][C] 0.4459[/C][C] 0.223[/C][/ROW]
[ROW][C]108[/C][C] 0.7525[/C][C] 0.4951[/C][C] 0.2475[/C][/ROW]
[ROW][C]109[/C][C] 0.725[/C][C] 0.5499[/C][C] 0.275[/C][/ROW]
[ROW][C]110[/C][C] 0.7899[/C][C] 0.4202[/C][C] 0.2101[/C][/ROW]
[ROW][C]111[/C][C] 0.9505[/C][C] 0.09893[/C][C] 0.04947[/C][/ROW]
[ROW][C]112[/C][C] 0.964[/C][C] 0.07199[/C][C] 0.03599[/C][/ROW]
[ROW][C]113[/C][C] 0.9538[/C][C] 0.09248[/C][C] 0.04624[/C][/ROW]
[ROW][C]114[/C][C] 0.943[/C][C] 0.114[/C][C] 0.05702[/C][/ROW]
[ROW][C]115[/C][C] 0.9297[/C][C] 0.1407[/C][C] 0.07033[/C][/ROW]
[ROW][C]116[/C][C] 0.9101[/C][C] 0.1799[/C][C] 0.08994[/C][/ROW]
[ROW][C]117[/C][C] 0.889[/C][C] 0.2221[/C][C] 0.111[/C][/ROW]
[ROW][C]118[/C][C] 0.8696[/C][C] 0.2609[/C][C] 0.1304[/C][/ROW]
[ROW][C]119[/C][C] 0.9214[/C][C] 0.1572[/C][C] 0.07862[/C][/ROW]
[ROW][C]120[/C][C] 0.8997[/C][C] 0.2005[/C][C] 0.1003[/C][/ROW]
[ROW][C]121[/C][C] 0.8797[/C][C] 0.2407[/C][C] 0.1203[/C][/ROW]
[ROW][C]122[/C][C] 0.8521[/C][C] 0.2958[/C][C] 0.1479[/C][/ROW]
[ROW][C]123[/C][C] 0.82[/C][C] 0.36[/C][C] 0.18[/C][/ROW]
[ROW][C]124[/C][C] 0.7856[/C][C] 0.4289[/C][C] 0.2144[/C][/ROW]
[ROW][C]125[/C][C] 0.8384[/C][C] 0.3232[/C][C] 0.1616[/C][/ROW]
[ROW][C]126[/C][C] 0.8793[/C][C] 0.2414[/C][C] 0.1207[/C][/ROW]
[ROW][C]127[/C][C] 0.8522[/C][C] 0.2957[/C][C] 0.1478[/C][/ROW]
[ROW][C]128[/C][C] 0.8317[/C][C] 0.3365[/C][C] 0.1683[/C][/ROW]
[ROW][C]129[/C][C] 0.8638[/C][C] 0.2723[/C][C] 0.1362[/C][/ROW]
[ROW][C]130[/C][C] 0.9545[/C][C] 0.09105[/C][C] 0.04552[/C][/ROW]
[ROW][C]131[/C][C] 0.9674[/C][C] 0.06517[/C][C] 0.03258[/C][/ROW]
[ROW][C]132[/C][C] 0.9549[/C][C] 0.09023[/C][C] 0.04512[/C][/ROW]
[ROW][C]133[/C][C] 0.9364[/C][C] 0.1272[/C][C] 0.06361[/C][/ROW]
[ROW][C]134[/C][C] 0.9604[/C][C] 0.07921[/C][C] 0.0396[/C][/ROW]
[ROW][C]135[/C][C] 0.9629[/C][C] 0.07414[/C][C] 0.03707[/C][/ROW]
[ROW][C]136[/C][C] 0.964[/C][C] 0.07201[/C][C] 0.03601[/C][/ROW]
[ROW][C]137[/C][C] 0.9469[/C][C] 0.1062[/C][C] 0.05309[/C][/ROW]
[ROW][C]138[/C][C] 0.923[/C][C] 0.154[/C][C] 0.07701[/C][/ROW]
[ROW][C]139[/C][C] 0.9706[/C][C] 0.05889[/C][C] 0.02944[/C][/ROW]
[ROW][C]140[/C][C] 0.9602[/C][C] 0.07957[/C][C] 0.03979[/C][/ROW]
[ROW][C]141[/C][C] 0.937[/C][C] 0.126[/C][C] 0.063[/C][/ROW]
[ROW][C]142[/C][C] 0.9105[/C][C] 0.179[/C][C] 0.08948[/C][/ROW]
[ROW][C]143[/C][C] 0.8737[/C][C] 0.2526[/C][C] 0.1263[/C][/ROW]
[ROW][C]144[/C][C] 0.9703[/C][C] 0.05936[/C][C] 0.02968[/C][/ROW]
[ROW][C]145[/C][C] 0.9561[/C][C] 0.08774[/C][C] 0.04387[/C][/ROW]
[ROW][C]146[/C][C] 0.9215[/C][C] 0.1571[/C][C] 0.07853[/C][/ROW]
[ROW][C]147[/C][C] 0.9418[/C][C] 0.1163[/C][C] 0.05817[/C][/ROW]
[ROW][C]148[/C][C] 0.8971[/C][C] 0.2058[/C][C] 0.1029[/C][/ROW]
[ROW][C]149[/C][C] 0.885[/C][C] 0.2299[/C][C] 0.115[/C][/ROW]
[ROW][C]150[/C][C] 0.9347[/C][C] 0.1307[/C][C] 0.06534[/C][/ROW]
[ROW][C]151[/C][C] 0.8427[/C][C] 0.3145[/C][C] 0.1573[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=298884&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=298884&T=5








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   fraction significant   OK/NOK
1% type I error level      0                    0.000000               OK
5% type I error level     15                    0.103448               NOK
10% type I error level    45                    0.310345               NOK
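
The counts in this meta table are simple arithmetic on the 145 two-sided p-values listed above (breakpoints 7 through 151). A minimal sketch of that computation in R, assuming gqarr is the p-value array built by the module code reproduced at the end of this page (its second column holds the two-sided p-values):

p2 <- gqarr[, 2]              # two-sided Goldfeld-Quandt p-values, one per breakpoint
length(p2)                    # 145 tests in total
sum(p2 < 0.01)                #  0 significant at the 1% level
sum(p2 < 0.05)                # 15 significant at the 5% level
sum(p2 < 0.05) / length(p2)   # 0.103448, the fraction reported above
sum(p2 < 0.10) / length(p2)   # 0.310345 at the 10% level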








Ramsey RESET F-Test for powers (2 and 3), computed on the fitted model mylm

Fitted values:          RESET = 1.637,   df1 = 2, df2 = 152, p-value = 0.198
Regressors:             RESET = 0.66032, df1 = 6, df2 = 148, p-value = 0.6818
Principal components:   RESET = 1.451,   df1 = 2, df2 = 152, p-value = 0.2376
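
These three statistics come from lmtest::resettest applied to the fitted model. A minimal sketch, assuming mylm is the fitted lm object created by the module code reproduced at the end of this page:

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')     # RESET = 1.637,   p-value = 0.198
resettest(mylm, power = 2:3, type = 'regressor')  # RESET = 0.66032, p-value = 0.6818
resettest(mylm, power = 2:3, type = 'princomp')   # RESET = 1.451,   p-value = 0.2376

None of the p-values fall below conventional significance levels, so the RESET tests give no indication that powers of the fitted values, regressors, or principal components are missing from the specification.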








Variance Inflation Factors (Multicollinearity)

     IVHB2      IVHB3    IVHB4\r
  1.011863   1.048367   1.038905

(The trailing "\r" on the third name is an artifact of the uploaded column header and does not affect the estimates.)
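
The factors are produced by car::vif on the fitted model. A minimal sketch, assuming mylm is the fitted lm object created by the module code reproduced at the end of this page:

library(car)
vif(mylm)   # IVHB2 1.011863, IVHB3 1.048367, IVHB4 1.038905

All three values are close to 1, so there is little evidence of multicollinearity among the regressors; values above roughly 5 to 10 are usually taken as a warning sign.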




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
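# Data preparation: transpose the uploaded series, drop rows with missing values,
# and move the endogenous column (par1) to the first position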
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
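# Optional transformations selected by par3: first differences, seasonal differences (s=12), or both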
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
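# Optional lags of the endogenous variable: par4 non-seasonal and par5 seasonal (s=12) lags are appended as regressors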
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
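# Optional seasonal dummies (par2); a linear trend column is added further below when par3 asks for it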
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
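# Goldfeld-Quandt tests at every admissible breakpoint (only when n > n25):
# store p-values for the 'greater', 'two.sided' and 'less' alternatives and
# count two-sided rejections at the 1%, 5% and 10% levels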
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
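# Diagnostic plots: actuals vs. interpolation, residuals, studentized-residual histogram,
# residual density, QQ plot, residual lag plot, ACF/PACF, standard lm diagnostics,
# and (when computed) the Goldfeld-Quandt p-values by breakpoint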
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
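# Build the output tables: estimated regression equation, OLS coefficients,
# regression and residual statistics, actuals/interpolation/residuals,
# and the Goldfeld-Quandt breakpoint and meta-analysis tables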
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
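# Ramsey RESET tests for powers 2 and 3 of the fitted values, the regressors and the principal components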
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
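# Variance inflation factors of the regressors (multicollinearity check)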
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')