Free Statistics

Author's title:
Author: (the author of this computation has been verified)
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 21 Dec 2016 15:43:15 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/21/t1482331432fon6anptepm1fbk.htm/, Retrieved Mon, 06 May 2024 13:30:32 +0000
Alternative citation: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=302343, Retrieved Mon, 06 May 2024 13:30:32 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 71
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-	[Multiple Regression] [Multiple Regressi...] [2016-12-21 14:43:15] [bde5266f17215258f6d7c4cd7e531432] [Current]
Dataseries X (one observation per row; the columns correspond to ITHSUM and IVHB1-IVHB4 in the regression below):
14	2	2	3	4
19	4	2	1	4
17	4	2	5	4
17	4	3	4	4
15	3	4	3	3
20	4	3	2	5
15	1	4	4	4
19	4	2	5	4
15	3	3	5	2
15	4	4	3	4
19	2	2	2	4
16	4	2	2	3
20	4	5	4	3
18	5	4	4	4
15	4	2	4	4
14	1	3	5	4
20	2	1	2	5
16	4	1	3	3
16	4	3	2	4
16	5	4	4	4
10	5	5	4	4
19	4	5	4	4
19	1	1	5	4
16	4	4	3	4
15	2	2	4	4
18	4	4	3	4
17	5	4	3	3
19	3	3	3	3
17	5	4	5	5
13	3	2	4	4
19	5	2	4	4
20	2	4	3	4
5	1	2	3	4
19	3	4	5	1
16	4	2	3	3
15	4	4	3	4
16	3	3	3	4
18	5	3	5	5
16	4	4	3	4
15	3	2	3	4
17	4	3	3	4
13	2	2	4	3
20	3	4	3	4
19	1	2	1	5
7	3	2	4	4
13	3	3	4	3
16	3	3	3	3
16	4	3	4	5
18	4	4	4	4
18	4	5	5	1
16	4	4	4	4
17	4	4	4	4
19	2	4	3	4
16	5	2	2	4
19	3	2	4	3
13	3	1	3	4
16	4	3	3	3
13	4	4	3	4
12	4	3	4	2
17	3	3	4	4
17	4	2	3	4
17	4	3	4	4
16	4	2	5	3
16	4	4	2	4
14	4	3	3	3
16	2	2	3	4
13	4	4	3	3
16	4	5	4	4
14	4	4	3	4
20	4	3	4	4
12	4	2	3	4
13	5	3	1	3
18	3	4	4	3
14	2	4	3	2
19	4	4	2	4
18	5	5	3	5
14	4	4	3	4
18	5	4	4	5
19	5	4	5	2
15	2	3	3	4
14	4	2	4	4
17	4	4	2	4
19	4	4	2	4
13	3	4	2	5
19	4	2	3	4
18	2	2	4	4
20	5	1	3	4
15	3	4	5	4
15	4	4	4	1
15	2	4	4	4
20	4	4	3	4
15	3	3	4	3
19	3	4	3	4
18	4	4	5	4
18	4	4	4	3
15	4	2	4	3
20	3	4	3	4
17	4	4	4	5
12	3	1	1	3
18	3	4	4	4
19	1	2	4	3
20	4	3	4	4
13	3	3	4	5
17	3	4	4	3
15	5	3	3	4
16	5	4	5	4
18	4	4	3	4
18	5	4	5	5
14	4	4	4	4
15	4	5	4	4
12	4	5	4	5
17	4	2	4	3
14	3	1	3	3
18	4	3	4	3
17	3	3	3	4
17	4	1	3	4
20	2	4	3	4
16	1	4	3	4
14	5	2	2	4
15	4	4	4	4
18	3	3	3	3
20	4	4	2	4
17	4	4	4	5
17	4	2	4	4
17	4	2	3	3
17	2	4	4	4
15	4	4	5	4
17	4	2	4	3
18	4	2	3	3
17	4	2	4	4
20	3	2	4	2
15	4	5	4	4
16	5	2	5	3
15	2	2	2	4
18	5	2	4	4
11	4	4	4	4
15	3	5	5	4
18	4	4	4	3
20	2	4	4	2
19	2	3	5	5
14	2	3	2	3
16	4	1	4	4
15	4	4	5	4
17	5	5	3	4
18	3	4	4	5
20	3	4	4	4
17	4	5	3	4
18	4	4	5	3
15	4	5	5	1
16	4	5	3	4
11	4	3	2	5
15	4	5	4	4
18	4	1	5	4
17	2	3	3	4
16	5	2	3	5
12	4	2	4	4
19	4	4	3	4
18	4	4	2	4
15	4	2	3	4
17	4	5	3	4
19	2	4	4	3
18	3	5	1	5
19	3	3	4	3
16	4	2	3	4
16	4	4	3	4
16	4	2	2	5
14	4	3	3	4
16	3	3	3	4
14	3	2	5	2
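
The results below can, in principle, be reproduced in base R. A minimal sketch of loading the data series, assuming the five tab-separated columns are (in order) the dependent variable ITHSUM and the regressors IVHB1-IVHB4 used in the estimated equation further down; `raw_x` is a hypothetical character string holding the block above:

# Read the tab-separated data series into a data frame.
# `raw_x` (hypothetical) contains the text of "Dataseries X" shown above.
df <- read.table(text = raw_x, header = FALSE,
                 col.names = c("ITHSUM", "IVHB1", "IVHB2", "IVHB3", "IVHB4"))
str(df)  # expected: 169 observations of 5 numeric columns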




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 9 seconds
R Server: Big Analytics Cloud Computing Center

Source: https://freestatistics.org/blog/index.php?pk=302343&T=0







Multiple Linear Regression - Estimated Regression Equation
ITHSUM[t] = 15.287 + 0.0286055 IVHB1[t] + 0.212921 IVHB2[t] + 0.0479789 IVHB3[t] + 0.034385 IVHB4[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=302343&T=1
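
The coefficients in the estimated equation correspond to an ordinary least-squares fit; a minimal sketch, assuming the data frame `df` constructed above (the module's actual R input is only available through the "Raw Input" link):

# Fit the multiple regression of ITHSUM on the four IVHB variables.
fit <- lm(ITHSUM ~ IVHB1 + IVHB2 + IVHB3 + IVHB4, data = df)
coef(fit)  # should reproduce the intercept 15.287 and the four slopes above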







Multiple Linear Regression - Ordinary Least Squares

Variable      Parameter   S.D.     T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)   +15.29      1.463    +1.0450e+01                  6.662e-20        3.331e-20
IVHB1         +0.02861    0.1989   +1.4380e-01                  0.8858           0.4429
IVHB2         +0.2129     0.1777   +1.1980e+00                  0.2325           0.1162
IVHB3         +0.04798    0.2088   +2.2980e-01                  0.8185           0.4093
IVHB4         +0.03438    0.2478   +1.3880e-01                  0.8898           0.4449

Source: https://freestatistics.org/blog/index.php?pk=302343&T=2
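
The parameter standard deviations, t statistics and two-tail p-values in this table are what summary() reports for a linear model; a sketch, again assuming the fitted object `fit` from the block above:

# Coefficient table: Estimate, Std. Error (the "S.D." column), t value,
# and Pr(>|t|) (the 2-tail p-value).
s <- summary(fit)
s$coefficients
# The 1-tail p-values reported above are half of the 2-tail values.
s$coefficients[, 4] / 2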







Multiple Linear Regression - Regression Statistics
Multiple R: 0.1018
R-squared: 0.01036
Adjusted R-squared: -0.01378
F-TEST (value): 0.429
F-TEST (DF numerator): 4
F-TEST (DF denominator): 164
p-value: 0.7875

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.521
Sum Squared Residuals: 1042

Source: https://freestatistics.org/blog/index.php?pk=302343&T=3
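
These goodness-of-fit and residual statistics can be recovered from the same summary object; a sketch (assuming `fit` and `s <- summary(fit)` from the previous blocks):

sqrt(s$r.squared)        # Multiple R (reported: 0.1018)
s$r.squared              # R-squared (reported: 0.01036)
s$adj.r.squared          # Adjusted R-squared (reported: -0.01378)
s$fstatistic             # F value and numerator/denominator df (0.429 on 4 and 164)
pf(s$fstatistic[1], s$fstatistic[2], s$fstatistic[3],
   lower.tail = FALSE)   # F-test p-value (reported: 0.7875)
s$sigma                  # Residual Standard Deviation (reported: 2.521)
sum(residuals(fit)^2)    # Sum Squared Residuals (reported: 1042)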







Multiple Linear Regression - Actuals, Interpolation, and Residuals

Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 14 16.05-2.052
2 19 16.01 2.987
3 17 16.2 0.7953
4 17 16.37 0.6303
5 15 16.47-1.472
6 20 16.31 3.692
7 15 16.5-1.497
8 19 16.2 2.795
9 15 16.32-1.32
10 15 16.53-1.535
11 19 16 2.996
12 16 16.03-0.02642
13 20 16.76 3.239
14 18 16.61 1.389
15 15 16.16-1.157
16 14 16.33-2.332
17 20 15.83 4.175
18 16 15.86 0.1385
19 16 16.27-0.2737
20 16 16.61-0.6112
21 10 16.82-6.824
22 19 16.8 2.204
23 19 15.91 3.094
24 16 16.53-0.5346
25 15 16.1-1.1
26 18 16.53 1.465
27 17 16.53 0.4712
28 19 16.26 2.741
29 17 16.69 0.3064
30 13 16.13-3.128
31 19 16.19 2.815
32 20 16.48 3.523
33 5 16.02-11.02
34 19 16.5 2.501
35 16 16.07-0.0744
36 15 16.53-1.535
37 16 16.29-0.2931
38 18 16.48 1.519
39 16 16.53-0.5346
40 15 16.08-1.08
41 17 16.32 0.6783
42 13 16.07-3.065
43 20 16.51 3.494
44 19 15.96 3.039
45 7 16.13-9.128
46 13 16.31-3.307
47 16 16.26-0.2587
48 16 16.4-0.4041
49 18 16.58 1.417
50 18 16.74 1.26
51 16 16.58-0.5826
52 17 16.58 0.4174
53 19 16.48 2.523
54 16 16.09-0.08941
55 19 16.09 2.906
56 13 15.87-2.867
57 16 16.29-0.2873
58 13 16.53-3.535
59 12 16.3-4.301
60 17 16.34 0.6589
61 17 16.11 0.8912
62 17 16.37 0.6303
63 16 16.17-0.1704
64 16 16.49-0.4866
65 14 16.29-2.287
66 16 16.05-0.05157
67 13 16.5-3.5
68 16 16.8-0.7955
69 14 16.53-2.535
70 20 16.37 3.63
71 12 16.11-4.109
72 13 16.22-3.22
73 18 16.52 1.48
74 14 16.41-2.409
75 19 16.49 2.513
76 18 16.81 1.189
77 14 16.53-2.535
78 18 16.65 1.354
79 19 16.59 2.41
80 15 16.26-1.264
81 14 16.16-2.157
82 17 16.49 0.5134
83 19 16.49 2.513
84 13 16.49-3.492
85 19 16.11 2.891
86 18 16.1 1.9
87 20 15.92 4.076
88 15 16.6-1.602
89 15 16.48-1.479
90 15 16.53-1.525
91 20 16.53 3.465
92 15 16.31-1.307
93 19 16.51 2.494
94 18 16.63 1.369
95 18 16.55 1.452
96 15 16.12-1.122
97 20 16.51 3.494
98 17 16.62 0.383
99 12 15.74-3.737
100 18 16.55 1.446
101 19 16.04 2.963
102 20 16.37 3.63
103 13 16.38-3.375
104 17 16.52 0.4804
105 15 16.35-1.35
106 16 16.66-0.6592
107 18 16.53 1.465
108 18 16.69 1.306
109 14 16.58-2.583
110 15 16.8-1.796
111 12 16.83-4.83
112 17 16.12 0.8776
113 14 15.83-1.833
114 18 16.34 1.665
115 17 16.29 0.7069
116 17 15.9 1.104
117 20 16.48 3.523
118 16 16.45-0.4488
119 14 16.09-2.089
120 15 16.58-1.583
121 18 16.26 1.741
122 20 16.49 3.513
123 17 16.62 0.383
124 17 16.16 0.8432
125 17 16.07 0.9256
126 17 16.53 0.4746
127 15 16.63-1.631
128 17 16.12 0.8776
129 18 16.07 1.926
130 17 16.16 0.8432
131 20 16.06 3.941
132 15 16.8-1.796
133 16 16.2-0.199
134 15 16-1.004
135 18 16.19 1.815
136 11 16.58-5.583
137 15 16.81-1.815
138 18 16.55 1.452
139 20 16.46 3.543
140 19 16.39 2.605
141 14 16.18-2.182
142 16 15.94 0.05616
143 15 16.63-1.631
144 17 16.78 0.2238
145 18 16.59 1.412
146 20 16.55 3.446
147 17 16.75 0.2525
148 18 16.6 1.404
149 15 16.74-1.74
150 16 16.75-0.7475
151 11 16.31-5.308
152 15 16.8-1.796
153 18 15.99 2.008
154 17 16.26 0.7355
155 16 16.17-0.1718
156 12 16.16-4.157
157 19 16.53 2.465
158 18 16.49 1.513
159 15 16.11-1.109
160 17 16.75 0.2525
161 19 16.49 2.509
162 18 16.66 1.343
163 19 16.31 2.693
164 16 16.11-0.1088
165 16 16.53-0.5346
166 16 16.1-0.09519
167 14 16.32-2.322
168 16 16.29-0.2931
169 14 16.11-2.107

Source: https://freestatistics.org/blog/index.php?pk=302343&T=4
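
The interpolation (in-sample forecast) and residual columns are the fitted values and residuals of the regression; a sketch assuming `fit` and `df` from the earlier blocks:

# Rebuild the actuals/interpolation/residuals table.
tab <- data.frame(Index         = seq_along(df$ITHSUM),
                  Actuals       = df$ITHSUM,
                  Interpolation = round(fitted(fit), 2),
                  Residuals     = round(residuals(fit), 4))
head(tab)  # e.g. index 1 should show an interpolation near 16.05 and a residual near -2.052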







Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis

breakpoint index   greater   2-sided   less
8 0.1866 0.3732 0.8134
9 0.1214 0.2428 0.8786
10 0.187 0.374 0.813
11 0.1905 0.3809 0.8095
12 0.1408 0.2817 0.8592
13 0.3274 0.6548 0.6726
14 0.2445 0.4889 0.7555
15 0.2454 0.4908 0.7546
16 0.1825 0.3651 0.8175
17 0.2126 0.4253 0.7874
18 0.1569 0.3138 0.8431
19 0.1463 0.2927 0.8537
20 0.1255 0.2511 0.8745
21 0.5363 0.9274 0.4637
22 0.5812 0.8377 0.4188
23 0.5766 0.8467 0.4234
24 0.5092 0.9816 0.4908
25 0.4745 0.949 0.5255
26 0.4297 0.8595 0.5703
27 0.3735 0.747 0.6265
28 0.3872 0.7743 0.6128
29 0.3276 0.6552 0.6724
30 0.4027 0.8055 0.5973
31 0.3841 0.7681 0.6159
32 0.4401 0.8802 0.5599
33 0.9897 0.02053 0.01027
34 0.9913 0.01731 0.008656
35 0.9881 0.02377 0.01188
36 0.9852 0.02958 0.01479
37 0.9795 0.04108 0.02054
38 0.9736 0.05274 0.02637
39 0.965 0.07007 0.03503
40 0.9565 0.08696 0.04348
41 0.9436 0.1129 0.05643
42 0.9463 0.1073 0.05366
43 0.9579 0.08422 0.04211
44 0.96 0.08004 0.04002
45 0.999 0.002028 0.001014
46 0.9991 0.001725 0.0008625
47 0.9987 0.002589 0.001294
48 0.9981 0.003789 0.001894
49 0.9975 0.004947 0.002473
50 0.9969 0.006176 0.003088
51 0.9957 0.008683 0.004341
52 0.9939 0.01216 0.00608
53 0.9938 0.01247 0.006237
54 0.9918 0.0165 0.00825
55 0.9927 0.01462 0.007312
56 0.9932 0.0135 0.006752
57 0.9908 0.01838 0.009191
58 0.9934 0.01317 0.006585
59 0.9962 0.007619 0.00381
60 0.9948 0.01037 0.005187
61 0.9931 0.01389 0.006946
62 0.9907 0.01867 0.009336
63 0.9876 0.02485 0.01242
64 0.9839 0.03213 0.01606
65 0.983 0.03409 0.01705
66 0.9776 0.04473 0.02237
67 0.9823 0.03537 0.01769
68 0.9773 0.04541 0.0227
69 0.9773 0.0454 0.0227
70 0.9829 0.03427 0.01714
71 0.9894 0.0212 0.0106
72 0.9911 0.01784 0.008922
73 0.9892 0.02157 0.01079
74 0.9893 0.02137 0.01069
75 0.9891 0.0217 0.01085
76 0.9867 0.0267 0.01335
77 0.9868 0.02647 0.01323
78 0.9841 0.03175 0.01588
79 0.9841 0.03177 0.01589
80 0.981 0.03793 0.01896
81 0.9798 0.04039 0.0202
82 0.9739 0.05214 0.02607
83 0.9743 0.0514 0.0257
84 0.9805 0.03892 0.01946
85 0.9821 0.03577 0.01789
86 0.9793 0.04133 0.02066
87 0.9877 0.02461 0.0123
88 0.9859 0.02825 0.01412
89 0.9838 0.03232 0.01616
90 0.9822 0.03564 0.01782
91 0.9871 0.02582 0.01291
92 0.9851 0.02984 0.01492
93 0.9849 0.03013 0.01507
94 0.9816 0.03685 0.01843
95 0.9778 0.04439 0.0222
96 0.9729 0.05416 0.02708
97 0.9792 0.04163 0.02082
98 0.9733 0.05331 0.02665
99 0.9846 0.03088 0.01544
100 0.9811 0.03774 0.01887
101 0.9809 0.03822 0.01911
102 0.9874 0.02516 0.01258
103 0.9906 0.01871 0.009356
104 0.9873 0.02541 0.01271
105 0.9838 0.03243 0.01622
106 0.9786 0.04275 0.02138
107 0.9752 0.04962 0.02481
108 0.9752 0.04957 0.02478
109 0.9747 0.05054 0.02527
110 0.9699 0.06022 0.03011
111 0.9847 0.03059 0.0153
112 0.9798 0.04044 0.02022
113 0.9809 0.03821 0.01911
114 0.9771 0.04572 0.02286
115 0.9697 0.06062 0.03031
116 0.9616 0.07672 0.03836
117 0.9674 0.06526 0.03263
118 0.9617 0.07659 0.0383
119 0.9567 0.08651 0.04326
120 0.9483 0.1035 0.05173
121 0.9371 0.1258 0.06288
122 0.9571 0.08571 0.04285
123 0.9457 0.1087 0.05434
124 0.9315 0.137 0.06851
125 0.914 0.172 0.08601
126 0.8913 0.2173 0.1087
127 0.8744 0.2512 0.1256
128 0.8465 0.307 0.1535
129 0.8349 0.3302 0.1651
130 0.8037 0.3926 0.1963
131 0.8463 0.3075 0.1537
132 0.8286 0.3428 0.1714
133 0.7914 0.4173 0.2086
134 0.7628 0.4744 0.2372
135 0.7837 0.4326 0.2163
136 0.9224 0.1553 0.07764
137 0.9442 0.1116 0.0558
138 0.9349 0.1303 0.06515
139 0.9446 0.1107 0.05535
140 0.927 0.146 0.07302
141 0.9278 0.1445 0.07225
142 0.9059 0.1883 0.09413
143 0.902 0.1959 0.09797
144 0.8762 0.2476 0.1238
145 0.8413 0.3174 0.1587
146 0.8371 0.3259 0.1629
147 0.7867 0.4265 0.2133
148 0.7571 0.4858 0.2429
149 0.7069 0.5862 0.2931
150 0.6389 0.7221 0.3611
151 0.8652 0.2695 0.1348
152 0.8759 0.2483 0.1241
153 0.9326 0.1349 0.06744
154 0.8925 0.2149 0.1075
155 0.9064 0.1873 0.09365
156 0.9117 0.1765 0.08826
157 0.9307 0.1385 0.06927
158 0.9249 0.1502 0.07508
159 0.8568 0.2865 0.1432
160 0.749 0.5019 0.251
161 0.5889 0.8222 0.4111

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
8 &  0.1866 &  0.3732 &  0.8134 \tabularnewline
9 &  0.1214 &  0.2428 &  0.8786 \tabularnewline
10 &  0.187 &  0.374 &  0.813 \tabularnewline
11 &  0.1905 &  0.3809 &  0.8095 \tabularnewline
12 &  0.1408 &  0.2817 &  0.8592 \tabularnewline
13 &  0.3274 &  0.6548 &  0.6726 \tabularnewline
14 &  0.2445 &  0.4889 &  0.7555 \tabularnewline
15 &  0.2454 &  0.4908 &  0.7546 \tabularnewline
16 &  0.1825 &  0.3651 &  0.8175 \tabularnewline
17 &  0.2126 &  0.4253 &  0.7874 \tabularnewline
18 &  0.1569 &  0.3138 &  0.8431 \tabularnewline
19 &  0.1463 &  0.2927 &  0.8537 \tabularnewline
20 &  0.1255 &  0.2511 &  0.8745 \tabularnewline
21 &  0.5363 &  0.9274 &  0.4637 \tabularnewline
22 &  0.5812 &  0.8377 &  0.4188 \tabularnewline
23 &  0.5766 &  0.8467 &  0.4234 \tabularnewline
24 &  0.5092 &  0.9816 &  0.4908 \tabularnewline
25 &  0.4745 &  0.949 &  0.5255 \tabularnewline
26 &  0.4297 &  0.8595 &  0.5703 \tabularnewline
27 &  0.3735 &  0.747 &  0.6265 \tabularnewline
28 &  0.3872 &  0.7743 &  0.6128 \tabularnewline
29 &  0.3276 &  0.6552 &  0.6724 \tabularnewline
30 &  0.4027 &  0.8055 &  0.5973 \tabularnewline
31 &  0.3841 &  0.7681 &  0.6159 \tabularnewline
32 &  0.4401 &  0.8802 &  0.5599 \tabularnewline
33 &  0.9897 &  0.02053 &  0.01027 \tabularnewline
34 &  0.9913 &  0.01731 &  0.008656 \tabularnewline
35 &  0.9881 &  0.02377 &  0.01188 \tabularnewline
36 &  0.9852 &  0.02958 &  0.01479 \tabularnewline
37 &  0.9795 &  0.04108 &  0.02054 \tabularnewline
38 &  0.9736 &  0.05274 &  0.02637 \tabularnewline
39 &  0.965 &  0.07007 &  0.03503 \tabularnewline
40 &  0.9565 &  0.08696 &  0.04348 \tabularnewline
41 &  0.9436 &  0.1129 &  0.05643 \tabularnewline
42 &  0.9463 &  0.1073 &  0.05366 \tabularnewline
43 &  0.9579 &  0.08422 &  0.04211 \tabularnewline
44 &  0.96 &  0.08004 &  0.04002 \tabularnewline
45 &  0.999 &  0.002028 &  0.001014 \tabularnewline
46 &  0.9991 &  0.001725 &  0.0008625 \tabularnewline
47 &  0.9987 &  0.002589 &  0.001294 \tabularnewline
48 &  0.9981 &  0.003789 &  0.001894 \tabularnewline
49 &  0.9975 &  0.004947 &  0.002473 \tabularnewline
50 &  0.9969 &  0.006176 &  0.003088 \tabularnewline
51 &  0.9957 &  0.008683 &  0.004341 \tabularnewline
52 &  0.9939 &  0.01216 &  0.00608 \tabularnewline
53 &  0.9938 &  0.01247 &  0.006237 \tabularnewline
54 &  0.9918 &  0.0165 &  0.00825 \tabularnewline
55 &  0.9927 &  0.01462 &  0.007312 \tabularnewline
56 &  0.9932 &  0.0135 &  0.006752 \tabularnewline
57 &  0.9908 &  0.01838 &  0.009191 \tabularnewline
58 &  0.9934 &  0.01317 &  0.006585 \tabularnewline
59 &  0.9962 &  0.007619 &  0.00381 \tabularnewline
60 &  0.9948 &  0.01037 &  0.005187 \tabularnewline
61 &  0.9931 &  0.01389 &  0.006946 \tabularnewline
62 &  0.9907 &  0.01867 &  0.009336 \tabularnewline
63 &  0.9876 &  0.02485 &  0.01242 \tabularnewline
64 &  0.9839 &  0.03213 &  0.01606 \tabularnewline
65 &  0.983 &  0.03409 &  0.01705 \tabularnewline
66 &  0.9776 &  0.04473 &  0.02237 \tabularnewline
67 &  0.9823 &  0.03537 &  0.01769 \tabularnewline
68 &  0.9773 &  0.04541 &  0.0227 \tabularnewline
69 &  0.9773 &  0.0454 &  0.0227 \tabularnewline
70 &  0.9829 &  0.03427 &  0.01714 \tabularnewline
71 &  0.9894 &  0.0212 &  0.0106 \tabularnewline
72 &  0.9911 &  0.01784 &  0.008922 \tabularnewline
73 &  0.9892 &  0.02157 &  0.01079 \tabularnewline
74 &  0.9893 &  0.02137 &  0.01069 \tabularnewline
75 &  0.9891 &  0.0217 &  0.01085 \tabularnewline
76 &  0.9867 &  0.0267 &  0.01335 \tabularnewline
77 &  0.9868 &  0.02647 &  0.01323 \tabularnewline
78 &  0.9841 &  0.03175 &  0.01588 \tabularnewline
79 &  0.9841 &  0.03177 &  0.01589 \tabularnewline
80 &  0.981 &  0.03793 &  0.01896 \tabularnewline
81 &  0.9798 &  0.04039 &  0.0202 \tabularnewline
82 &  0.9739 &  0.05214 &  0.02607 \tabularnewline
83 &  0.9743 &  0.0514 &  0.0257 \tabularnewline
84 &  0.9805 &  0.03892 &  0.01946 \tabularnewline
85 &  0.9821 &  0.03577 &  0.01789 \tabularnewline
86 &  0.9793 &  0.04133 &  0.02066 \tabularnewline
87 &  0.9877 &  0.02461 &  0.0123 \tabularnewline
88 &  0.9859 &  0.02825 &  0.01412 \tabularnewline
89 &  0.9838 &  0.03232 &  0.01616 \tabularnewline
90 &  0.9822 &  0.03564 &  0.01782 \tabularnewline
91 &  0.9871 &  0.02582 &  0.01291 \tabularnewline
92 &  0.9851 &  0.02984 &  0.01492 \tabularnewline
93 &  0.9849 &  0.03013 &  0.01507 \tabularnewline
94 &  0.9816 &  0.03685 &  0.01843 \tabularnewline
95 &  0.9778 &  0.04439 &  0.0222 \tabularnewline
96 &  0.9729 &  0.05416 &  0.02708 \tabularnewline
97 &  0.9792 &  0.04163 &  0.02082 \tabularnewline
98 &  0.9733 &  0.05331 &  0.02665 \tabularnewline
99 &  0.9846 &  0.03088 &  0.01544 \tabularnewline
100 &  0.9811 &  0.03774 &  0.01887 \tabularnewline
101 &  0.9809 &  0.03822 &  0.01911 \tabularnewline
102 &  0.9874 &  0.02516 &  0.01258 \tabularnewline
103 &  0.9906 &  0.01871 &  0.009356 \tabularnewline
104 &  0.9873 &  0.02541 &  0.01271 \tabularnewline
105 &  0.9838 &  0.03243 &  0.01622 \tabularnewline
106 &  0.9786 &  0.04275 &  0.02138 \tabularnewline
107 &  0.9752 &  0.04962 &  0.02481 \tabularnewline
108 &  0.9752 &  0.04957 &  0.02478 \tabularnewline
109 &  0.9747 &  0.05054 &  0.02527 \tabularnewline
110 &  0.9699 &  0.06022 &  0.03011 \tabularnewline
111 &  0.9847 &  0.03059 &  0.0153 \tabularnewline
112 &  0.9798 &  0.04044 &  0.02022 \tabularnewline
113 &  0.9809 &  0.03821 &  0.01911 \tabularnewline
114 &  0.9771 &  0.04572 &  0.02286 \tabularnewline
115 &  0.9697 &  0.06062 &  0.03031 \tabularnewline
116 &  0.9616 &  0.07672 &  0.03836 \tabularnewline
117 &  0.9674 &  0.06526 &  0.03263 \tabularnewline
118 &  0.9617 &  0.07659 &  0.0383 \tabularnewline
119 &  0.9567 &  0.08651 &  0.04326 \tabularnewline
120 &  0.9483 &  0.1035 &  0.05173 \tabularnewline
121 &  0.9371 &  0.1258 &  0.06288 \tabularnewline
122 &  0.9571 &  0.08571 &  0.04285 \tabularnewline
123 &  0.9457 &  0.1087 &  0.05434 \tabularnewline
124 &  0.9315 &  0.137 &  0.06851 \tabularnewline
125 &  0.914 &  0.172 &  0.08601 \tabularnewline
126 &  0.8913 &  0.2173 &  0.1087 \tabularnewline
127 &  0.8744 &  0.2512 &  0.1256 \tabularnewline
128 &  0.8465 &  0.307 &  0.1535 \tabularnewline
129 &  0.8349 &  0.3302 &  0.1651 \tabularnewline
130 &  0.8037 &  0.3926 &  0.1963 \tabularnewline
131 &  0.8463 &  0.3075 &  0.1537 \tabularnewline
132 &  0.8286 &  0.3428 &  0.1714 \tabularnewline
133 &  0.7914 &  0.4173 &  0.2086 \tabularnewline
134 &  0.7628 &  0.4744 &  0.2372 \tabularnewline
135 &  0.7837 &  0.4326 &  0.2163 \tabularnewline
136 &  0.9224 &  0.1553 &  0.07764 \tabularnewline
137 &  0.9442 &  0.1116 &  0.0558 \tabularnewline
138 &  0.9349 &  0.1303 &  0.06515 \tabularnewline
139 &  0.9446 &  0.1107 &  0.05535 \tabularnewline
140 &  0.927 &  0.146 &  0.07302 \tabularnewline
141 &  0.9278 &  0.1445 &  0.07225 \tabularnewline
142 &  0.9059 &  0.1883 &  0.09413 \tabularnewline
143 &  0.902 &  0.1959 &  0.09797 \tabularnewline
144 &  0.8762 &  0.2476 &  0.1238 \tabularnewline
145 &  0.8413 &  0.3174 &  0.1587 \tabularnewline
146 &  0.8371 &  0.3259 &  0.1629 \tabularnewline
147 &  0.7867 &  0.4265 &  0.2133 \tabularnewline
148 &  0.7571 &  0.4858 &  0.2429 \tabularnewline
149 &  0.7069 &  0.5862 &  0.2931 \tabularnewline
150 &  0.6389 &  0.7221 &  0.3611 \tabularnewline
151 &  0.8652 &  0.2695 &  0.1348 \tabularnewline
152 &  0.8759 &  0.2483 &  0.1241 \tabularnewline
153 &  0.9326 &  0.1349 &  0.06744 \tabularnewline
154 &  0.8925 &  0.2149 &  0.1075 \tabularnewline
155 &  0.9064 &  0.1873 &  0.09365 \tabularnewline
156 &  0.9117 &  0.1765 &  0.08826 \tabularnewline
157 &  0.9307 &  0.1385 &  0.06927 \tabularnewline
158 &  0.9249 &  0.1502 &  0.07508 \tabularnewline
159 &  0.8568 &  0.2865 &  0.1432 \tabularnewline
160 &  0.749 &  0.5019 &  0.251 \tabularnewline
161 &  0.5889 &  0.8222 &  0.4111 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=302343&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]8[/C][C] 0.1866[/C][C] 0.3732[/C][C] 0.8134[/C][/ROW]
[ROW][C]9[/C][C] 0.1214[/C][C] 0.2428[/C][C] 0.8786[/C][/ROW]
[ROW][C]10[/C][C] 0.187[/C][C] 0.374[/C][C] 0.813[/C][/ROW]
[ROW][C]11[/C][C] 0.1905[/C][C] 0.3809[/C][C] 0.8095[/C][/ROW]
[ROW][C]12[/C][C] 0.1408[/C][C] 0.2817[/C][C] 0.8592[/C][/ROW]
[ROW][C]13[/C][C] 0.3274[/C][C] 0.6548[/C][C] 0.6726[/C][/ROW]
[ROW][C]14[/C][C] 0.2445[/C][C] 0.4889[/C][C] 0.7555[/C][/ROW]
[ROW][C]15[/C][C] 0.2454[/C][C] 0.4908[/C][C] 0.7546[/C][/ROW]
[ROW][C]16[/C][C] 0.1825[/C][C] 0.3651[/C][C] 0.8175[/C][/ROW]
[ROW][C]17[/C][C] 0.2126[/C][C] 0.4253[/C][C] 0.7874[/C][/ROW]
[ROW][C]18[/C][C] 0.1569[/C][C] 0.3138[/C][C] 0.8431[/C][/ROW]
[ROW][C]19[/C][C] 0.1463[/C][C] 0.2927[/C][C] 0.8537[/C][/ROW]
[ROW][C]20[/C][C] 0.1255[/C][C] 0.2511[/C][C] 0.8745[/C][/ROW]
[ROW][C]21[/C][C] 0.5363[/C][C] 0.9274[/C][C] 0.4637[/C][/ROW]
[ROW][C]22[/C][C] 0.5812[/C][C] 0.8377[/C][C] 0.4188[/C][/ROW]
[ROW][C]23[/C][C] 0.5766[/C][C] 0.8467[/C][C] 0.4234[/C][/ROW]
[ROW][C]24[/C][C] 0.5092[/C][C] 0.9816[/C][C] 0.4908[/C][/ROW]
[ROW][C]25[/C][C] 0.4745[/C][C] 0.949[/C][C] 0.5255[/C][/ROW]
[ROW][C]26[/C][C] 0.4297[/C][C] 0.8595[/C][C] 0.5703[/C][/ROW]
[ROW][C]27[/C][C] 0.3735[/C][C] 0.747[/C][C] 0.6265[/C][/ROW]
[ROW][C]28[/C][C] 0.3872[/C][C] 0.7743[/C][C] 0.6128[/C][/ROW]
[ROW][C]29[/C][C] 0.3276[/C][C] 0.6552[/C][C] 0.6724[/C][/ROW]
[ROW][C]30[/C][C] 0.4027[/C][C] 0.8055[/C][C] 0.5973[/C][/ROW]
[ROW][C]31[/C][C] 0.3841[/C][C] 0.7681[/C][C] 0.6159[/C][/ROW]
[ROW][C]32[/C][C] 0.4401[/C][C] 0.8802[/C][C] 0.5599[/C][/ROW]
[ROW][C]33[/C][C] 0.9897[/C][C] 0.02053[/C][C] 0.01027[/C][/ROW]
[ROW][C]34[/C][C] 0.9913[/C][C] 0.01731[/C][C] 0.008656[/C][/ROW]
[ROW][C]35[/C][C] 0.9881[/C][C] 0.02377[/C][C] 0.01188[/C][/ROW]
[ROW][C]36[/C][C] 0.9852[/C][C] 0.02958[/C][C] 0.01479[/C][/ROW]
[ROW][C]37[/C][C] 0.9795[/C][C] 0.04108[/C][C] 0.02054[/C][/ROW]
[ROW][C]38[/C][C] 0.9736[/C][C] 0.05274[/C][C] 0.02637[/C][/ROW]
[ROW][C]39[/C][C] 0.965[/C][C] 0.07007[/C][C] 0.03503[/C][/ROW]
[ROW][C]40[/C][C] 0.9565[/C][C] 0.08696[/C][C] 0.04348[/C][/ROW]
[ROW][C]41[/C][C] 0.9436[/C][C] 0.1129[/C][C] 0.05643[/C][/ROW]
[ROW][C]42[/C][C] 0.9463[/C][C] 0.1073[/C][C] 0.05366[/C][/ROW]
[ROW][C]43[/C][C] 0.9579[/C][C] 0.08422[/C][C] 0.04211[/C][/ROW]
[ROW][C]44[/C][C] 0.96[/C][C] 0.08004[/C][C] 0.04002[/C][/ROW]
[ROW][C]45[/C][C] 0.999[/C][C] 0.002028[/C][C] 0.001014[/C][/ROW]
[ROW][C]46[/C][C] 0.9991[/C][C] 0.001725[/C][C] 0.0008625[/C][/ROW]
[ROW][C]47[/C][C] 0.9987[/C][C] 0.002589[/C][C] 0.001294[/C][/ROW]
[ROW][C]48[/C][C] 0.9981[/C][C] 0.003789[/C][C] 0.001894[/C][/ROW]
[ROW][C]49[/C][C] 0.9975[/C][C] 0.004947[/C][C] 0.002473[/C][/ROW]
[ROW][C]50[/C][C] 0.9969[/C][C] 0.006176[/C][C] 0.003088[/C][/ROW]
[ROW][C]51[/C][C] 0.9957[/C][C] 0.008683[/C][C] 0.004341[/C][/ROW]
[ROW][C]52[/C][C] 0.9939[/C][C] 0.01216[/C][C] 0.00608[/C][/ROW]
[ROW][C]53[/C][C] 0.9938[/C][C] 0.01247[/C][C] 0.006237[/C][/ROW]
[ROW][C]54[/C][C] 0.9918[/C][C] 0.0165[/C][C] 0.00825[/C][/ROW]
[ROW][C]55[/C][C] 0.9927[/C][C] 0.01462[/C][C] 0.007312[/C][/ROW]
[ROW][C]56[/C][C] 0.9932[/C][C] 0.0135[/C][C] 0.006752[/C][/ROW]
[ROW][C]57[/C][C] 0.9908[/C][C] 0.01838[/C][C] 0.009191[/C][/ROW]
[ROW][C]58[/C][C] 0.9934[/C][C] 0.01317[/C][C] 0.006585[/C][/ROW]
[ROW][C]59[/C][C] 0.9962[/C][C] 0.007619[/C][C] 0.00381[/C][/ROW]
[ROW][C]60[/C][C] 0.9948[/C][C] 0.01037[/C][C] 0.005187[/C][/ROW]
[ROW][C]61[/C][C] 0.9931[/C][C] 0.01389[/C][C] 0.006946[/C][/ROW]
[ROW][C]62[/C][C] 0.9907[/C][C] 0.01867[/C][C] 0.009336[/C][/ROW]
[ROW][C]63[/C][C] 0.9876[/C][C] 0.02485[/C][C] 0.01242[/C][/ROW]
[ROW][C]64[/C][C] 0.9839[/C][C] 0.03213[/C][C] 0.01606[/C][/ROW]
[ROW][C]65[/C][C] 0.983[/C][C] 0.03409[/C][C] 0.01705[/C][/ROW]
[ROW][C]66[/C][C] 0.9776[/C][C] 0.04473[/C][C] 0.02237[/C][/ROW]
[ROW][C]67[/C][C] 0.9823[/C][C] 0.03537[/C][C] 0.01769[/C][/ROW]
[ROW][C]68[/C][C] 0.9773[/C][C] 0.04541[/C][C] 0.0227[/C][/ROW]
[ROW][C]69[/C][C] 0.9773[/C][C] 0.0454[/C][C] 0.0227[/C][/ROW]
[ROW][C]70[/C][C] 0.9829[/C][C] 0.03427[/C][C] 0.01714[/C][/ROW]
[ROW][C]71[/C][C] 0.9894[/C][C] 0.0212[/C][C] 0.0106[/C][/ROW]
[ROW][C]72[/C][C] 0.9911[/C][C] 0.01784[/C][C] 0.008922[/C][/ROW]
[ROW][C]73[/C][C] 0.9892[/C][C] 0.02157[/C][C] 0.01079[/C][/ROW]
[ROW][C]74[/C][C] 0.9893[/C][C] 0.02137[/C][C] 0.01069[/C][/ROW]
[ROW][C]75[/C][C] 0.9891[/C][C] 0.0217[/C][C] 0.01085[/C][/ROW]
[ROW][C]76[/C][C] 0.9867[/C][C] 0.0267[/C][C] 0.01335[/C][/ROW]
[ROW][C]77[/C][C] 0.9868[/C][C] 0.02647[/C][C] 0.01323[/C][/ROW]
[ROW][C]78[/C][C] 0.9841[/C][C] 0.03175[/C][C] 0.01588[/C][/ROW]
[ROW][C]79[/C][C] 0.9841[/C][C] 0.03177[/C][C] 0.01589[/C][/ROW]
[ROW][C]80[/C][C] 0.981[/C][C] 0.03793[/C][C] 0.01896[/C][/ROW]
[ROW][C]81[/C][C] 0.9798[/C][C] 0.04039[/C][C] 0.0202[/C][/ROW]
[ROW][C]82[/C][C] 0.9739[/C][C] 0.05214[/C][C] 0.02607[/C][/ROW]
[ROW][C]83[/C][C] 0.9743[/C][C] 0.0514[/C][C] 0.0257[/C][/ROW]
[ROW][C]84[/C][C] 0.9805[/C][C] 0.03892[/C][C] 0.01946[/C][/ROW]
[ROW][C]85[/C][C] 0.9821[/C][C] 0.03577[/C][C] 0.01789[/C][/ROW]
[ROW][C]86[/C][C] 0.9793[/C][C] 0.04133[/C][C] 0.02066[/C][/ROW]
[ROW][C]87[/C][C] 0.9877[/C][C] 0.02461[/C][C] 0.0123[/C][/ROW]
[ROW][C]88[/C][C] 0.9859[/C][C] 0.02825[/C][C] 0.01412[/C][/ROW]
[ROW][C]89[/C][C] 0.9838[/C][C] 0.03232[/C][C] 0.01616[/C][/ROW]
[ROW][C]90[/C][C] 0.9822[/C][C] 0.03564[/C][C] 0.01782[/C][/ROW]
[ROW][C]91[/C][C] 0.9871[/C][C] 0.02582[/C][C] 0.01291[/C][/ROW]
[ROW][C]92[/C][C] 0.9851[/C][C] 0.02984[/C][C] 0.01492[/C][/ROW]
[ROW][C]93[/C][C] 0.9849[/C][C] 0.03013[/C][C] 0.01507[/C][/ROW]
[ROW][C]94[/C][C] 0.9816[/C][C] 0.03685[/C][C] 0.01843[/C][/ROW]
[ROW][C]95[/C][C] 0.9778[/C][C] 0.04439[/C][C] 0.0222[/C][/ROW]
[ROW][C]96[/C][C] 0.9729[/C][C] 0.05416[/C][C] 0.02708[/C][/ROW]
[ROW][C]97[/C][C] 0.9792[/C][C] 0.04163[/C][C] 0.02082[/C][/ROW]
[ROW][C]98[/C][C] 0.9733[/C][C] 0.05331[/C][C] 0.02665[/C][/ROW]
[ROW][C]99[/C][C] 0.9846[/C][C] 0.03088[/C][C] 0.01544[/C][/ROW]
[ROW][C]100[/C][C] 0.9811[/C][C] 0.03774[/C][C] 0.01887[/C][/ROW]
[ROW][C]101[/C][C] 0.9809[/C][C] 0.03822[/C][C] 0.01911[/C][/ROW]
[ROW][C]102[/C][C] 0.9874[/C][C] 0.02516[/C][C] 0.01258[/C][/ROW]
[ROW][C]103[/C][C] 0.9906[/C][C] 0.01871[/C][C] 0.009356[/C][/ROW]
[ROW][C]104[/C][C] 0.9873[/C][C] 0.02541[/C][C] 0.01271[/C][/ROW]
[ROW][C]105[/C][C] 0.9838[/C][C] 0.03243[/C][C] 0.01622[/C][/ROW]
[ROW][C]106[/C][C] 0.9786[/C][C] 0.04275[/C][C] 0.02138[/C][/ROW]
[ROW][C]107[/C][C] 0.9752[/C][C] 0.04962[/C][C] 0.02481[/C][/ROW]
[ROW][C]108[/C][C] 0.9752[/C][C] 0.04957[/C][C] 0.02478[/C][/ROW]
[ROW][C]109[/C][C] 0.9747[/C][C] 0.05054[/C][C] 0.02527[/C][/ROW]
[ROW][C]110[/C][C] 0.9699[/C][C] 0.06022[/C][C] 0.03011[/C][/ROW]
[ROW][C]111[/C][C] 0.9847[/C][C] 0.03059[/C][C] 0.0153[/C][/ROW]
[ROW][C]112[/C][C] 0.9798[/C][C] 0.04044[/C][C] 0.02022[/C][/ROW]
[ROW][C]113[/C][C] 0.9809[/C][C] 0.03821[/C][C] 0.01911[/C][/ROW]
[ROW][C]114[/C][C] 0.9771[/C][C] 0.04572[/C][C] 0.02286[/C][/ROW]
[ROW][C]115[/C][C] 0.9697[/C][C] 0.06062[/C][C] 0.03031[/C][/ROW]
[ROW][C]116[/C][C] 0.9616[/C][C] 0.07672[/C][C] 0.03836[/C][/ROW]
[ROW][C]117[/C][C] 0.9674[/C][C] 0.06526[/C][C] 0.03263[/C][/ROW]
[ROW][C]118[/C][C] 0.9617[/C][C] 0.07659[/C][C] 0.0383[/C][/ROW]
[ROW][C]119[/C][C] 0.9567[/C][C] 0.08651[/C][C] 0.04326[/C][/ROW]
[ROW][C]120[/C][C] 0.9483[/C][C] 0.1035[/C][C] 0.05173[/C][/ROW]
[ROW][C]121[/C][C] 0.9371[/C][C] 0.1258[/C][C] 0.06288[/C][/ROW]
[ROW][C]122[/C][C] 0.9571[/C][C] 0.08571[/C][C] 0.04285[/C][/ROW]
[ROW][C]123[/C][C] 0.9457[/C][C] 0.1087[/C][C] 0.05434[/C][/ROW]
[ROW][C]124[/C][C] 0.9315[/C][C] 0.137[/C][C] 0.06851[/C][/ROW]
[ROW][C]125[/C][C] 0.914[/C][C] 0.172[/C][C] 0.08601[/C][/ROW]
[ROW][C]126[/C][C] 0.8913[/C][C] 0.2173[/C][C] 0.1087[/C][/ROW]
[ROW][C]127[/C][C] 0.8744[/C][C] 0.2512[/C][C] 0.1256[/C][/ROW]
[ROW][C]128[/C][C] 0.8465[/C][C] 0.307[/C][C] 0.1535[/C][/ROW]
[ROW][C]129[/C][C] 0.8349[/C][C] 0.3302[/C][C] 0.1651[/C][/ROW]
[ROW][C]130[/C][C] 0.8037[/C][C] 0.3926[/C][C] 0.1963[/C][/ROW]
[ROW][C]131[/C][C] 0.8463[/C][C] 0.3075[/C][C] 0.1537[/C][/ROW]
[ROW][C]132[/C][C] 0.8286[/C][C] 0.3428[/C][C] 0.1714[/C][/ROW]
[ROW][C]133[/C][C] 0.7914[/C][C] 0.4173[/C][C] 0.2086[/C][/ROW]
[ROW][C]134[/C][C] 0.7628[/C][C] 0.4744[/C][C] 0.2372[/C][/ROW]
[ROW][C]135[/C][C] 0.7837[/C][C] 0.4326[/C][C] 0.2163[/C][/ROW]
[ROW][C]136[/C][C] 0.9224[/C][C] 0.1553[/C][C] 0.07764[/C][/ROW]
[ROW][C]137[/C][C] 0.9442[/C][C] 0.1116[/C][C] 0.0558[/C][/ROW]
[ROW][C]138[/C][C] 0.9349[/C][C] 0.1303[/C][C] 0.06515[/C][/ROW]
[ROW][C]139[/C][C] 0.9446[/C][C] 0.1107[/C][C] 0.05535[/C][/ROW]
[ROW][C]140[/C][C] 0.927[/C][C] 0.146[/C][C] 0.07302[/C][/ROW]
[ROW][C]141[/C][C] 0.9278[/C][C] 0.1445[/C][C] 0.07225[/C][/ROW]
[ROW][C]142[/C][C] 0.9059[/C][C] 0.1883[/C][C] 0.09413[/C][/ROW]
[ROW][C]143[/C][C] 0.902[/C][C] 0.1959[/C][C] 0.09797[/C][/ROW]
[ROW][C]144[/C][C] 0.8762[/C][C] 0.2476[/C][C] 0.1238[/C][/ROW]
[ROW][C]145[/C][C] 0.8413[/C][C] 0.3174[/C][C] 0.1587[/C][/ROW]
[ROW][C]146[/C][C] 0.8371[/C][C] 0.3259[/C][C] 0.1629[/C][/ROW]
[ROW][C]147[/C][C] 0.7867[/C][C] 0.4265[/C][C] 0.2133[/C][/ROW]
[ROW][C]148[/C][C] 0.7571[/C][C] 0.4858[/C][C] 0.2429[/C][/ROW]
[ROW][C]149[/C][C] 0.7069[/C][C] 0.5862[/C][C] 0.2931[/C][/ROW]
[ROW][C]150[/C][C] 0.6389[/C][C] 0.7221[/C][C] 0.3611[/C][/ROW]
[ROW][C]151[/C][C] 0.8652[/C][C] 0.2695[/C][C] 0.1348[/C][/ROW]
[ROW][C]152[/C][C] 0.8759[/C][C] 0.2483[/C][C] 0.1241[/C][/ROW]
[ROW][C]153[/C][C] 0.9326[/C][C] 0.1349[/C][C] 0.06744[/C][/ROW]
[ROW][C]154[/C][C] 0.8925[/C][C] 0.2149[/C][C] 0.1075[/C][/ROW]
[ROW][C]155[/C][C] 0.9064[/C][C] 0.1873[/C][C] 0.09365[/C][/ROW]
[ROW][C]156[/C][C] 0.9117[/C][C] 0.1765[/C][C] 0.08826[/C][/ROW]
[ROW][C]157[/C][C] 0.9307[/C][C] 0.1385[/C][C] 0.06927[/C][/ROW]
[ROW][C]158[/C][C] 0.9249[/C][C] 0.1502[/C][C] 0.07508[/C][/ROW]
[ROW][C]159[/C][C] 0.8568[/C][C] 0.2865[/C][C] 0.1432[/C][/ROW]
[ROW][C]160[/C][C] 0.749[/C][C] 0.5019[/C][C] 0.251[/C][/ROW]
[ROW][C]161[/C][C] 0.5889[/C][C] 0.8222[/C][C] 0.4111[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=302343&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302343&T=5








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description               # significant tests   % significant tests   OK/NOK
1% type I error level     8                     0.05195               NOK
5% type I error level     69                    0.448052              NOK
10% type I error level    86                    0.558442              NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 8 &  0.05195 & NOK \tabularnewline
5% type I error level & 69 & 0.448052 & NOK \tabularnewline
10% type I error level & 86 & 0.558442 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=302343&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]8[/C][C] 0.05195[/C][C]NOK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]69[/C][C]0.448052[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]86[/C][C]0.558442[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=302343&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302343&T=6
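
The meta analysis simply counts how many of the two-sided p-values above fall below each nominal type I error level and flags NOK when the observed share is not below that level, as in the module's R code further down. A minimal sketch, assuming gq is the p-value matrix from the previous sketch:

# 'gq' is the breakpoint-by-alternative p-value matrix from the sketch above
p2 <- gq[, 'two.sided']
for (alpha in c(0.01, 0.05, 0.10)) {
  nsig  <- sum(p2 < alpha)
  share <- nsig / length(p2)
  cat(sprintf('%4.0f%% level: %d of %d tests significant (share %.6f) -> %s\n',
              100 * alpha, nsig, length(p2), share,
              if (share < alpha) 'OK' else 'NOK'))
}
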








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.4367, df1 = 2, df2 = 162, p-value = 0.09064
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.76008, df1 = 8, df2 = 156, p-value = 0.6384
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 2.6898, df1 = 2, df2 = 162, p-value = 0.07092

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.4367, df1 = 2, df2 = 162, p-value = 0.09064
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.76008, df1 = 8, df2 = 156, p-value = 0.6384
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 2.6898, df1 = 2, df2 = 162, p-value = 0.07092
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=302343&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 2.4367, df1 = 2, df2 = 162, p-value = 0.09064
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.76008, df1 = 8, df2 = 156, p-value = 0.6384
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 2.6898, df1 = 2, df2 = 162, p-value = 0.07092
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=302343&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302343&T=7
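
All three RESET variants above are computed with resettest from the lmtest package, using powers 2 and 3 of the fitted values, of the regressors, and of the principal components of the regressors, respectively (see the R code at the bottom of this page). A minimal sketch, assuming mylm is the fitted lm object:

library(lmtest)

# Assumption: 'mylm' is the fitted multiple regression from above
resettest(mylm, power = 2:3, type = 'fitted')     # powers of the fitted values
resettest(mylm, power = 2:3, type = 'regressor')  # powers of the regressors
resettest(mylm, power = 2:3, type = 'princomp')   # powers of the principal components
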








Variance Inflation Factors (Multicollinearity)
> vif
   IVHB1    IVHB2    IVHB3    IVHB4 
1.031563 1.038787 1.071631 1.058612 

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
   IVHB1    IVHB2    IVHB3    IVHB4 
1.031563 1.038787 1.071631 1.058612 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=302343&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
   IVHB1    IVHB2    IVHB3    IVHB4 
1.031563 1.038787 1.071631 1.058612 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=302343&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302343&T=8
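
The variance inflation factors reported above all lie close to 1, which indicates negligible multicollinearity among the four regressors. They are obtained with vif from the car package; for purely numeric regressors this equals 1 / (1 - R_j^2), where R_j^2 is the R-squared from regressing the j-th regressor on the remaining ones. A minimal sketch, assuming mylm is the fitted model:

library(car)

# Assumption: 'mylm' is the fitted lm object from above
vif(mylm)   # one variance inflation factor per regressor
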




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
par5 <- '0'
par4 <- '0'
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
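# Data preparation: the uploaded data matrix y (supplied by the module framework) is
# transposed so that columns are variables, rows with missing values are dropped, and
# the endogenous column selected by par1 is moved to the front.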
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
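# Optional transformations selected by par3: first differences, seasonal differences
# (s=12), or both; lagged endogenous terms (par4, par5), seasonal dummies (par2) and
# a linear trend (par3) are appended further below.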
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
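# Rolling Goldfeld-Quandt test (only when n > 25): for every admissible breakpoint the
# p-values under the 'greater', 'two.sided' and 'less' alternatives are stored, and the
# number of two-sided p-values below 1%, 5% and 10% is counted for the meta analysis.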
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
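# Diagnostic plots: actuals with interpolation, residuals, histogram of studentized
# residuals with a standard normal density, residual density plot, QQ plot, residual
# lag plot with lowess and regression line, residual ACF/PACF, the standard lm
# diagnostics, and (when computed) the Goldfeld-Quandt p-values per breakpoint.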
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
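# Output tables: the helper functions (table.start, table.element, table.row.*,
# table.save) loaded from 'createtable' render the estimated equation, the OLS
# coefficient table and the regression/residual statistics.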
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
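# Per-observation table of actuals, interpolation (fit) and residuals (only when
# n < 200), followed by the Goldfeld-Quandt tables and their meta analysis when n > 25.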
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
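# Ramsey RESET tests (powers 2 and 3) of the fitted values, the regressors and the
# principal components, via resettest from the lmtest package.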
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
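# Variance inflation factors for the regressors, via vif from the car package.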
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')