Free Statistics


Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 17 Dec 2014 15:08:03 +0000
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/17/t1418829647s6dxfmukymge2th.htm/, Retrieved Thu, 16 May 2024 11:49:37 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=270395, Retrieved Thu, 16 May 2024 11:49:37 +0000
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 75
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-       [Multiple Regression] [m] [2014-12-17 15:08:03] [6ac057e9f6255a74ae39891d7e02481c] [Current]
Dataseries X (column 1 = B, column 2 = TOT):
96	12.9
75	7.4
70	12.2
88	12.8
114	7.4
69	6.7
176	12.6
114	14.8
121	13.3
110	11.1
158	8.2
116	11.4
181	6.4
77	10.6
141	12.0
35	6.3
80	11.3
152	11.9
97	9.3
99	9.6
84	10.0
68	6.4
101	13.8
107	10.8
88	13.8
112	11.7
171	10.9
137	16.1
77	13.4
66	9.9
93	11.5
105	8.3
131	11.7
89	6.1
102	9.0
161	9.7
120	10.8
127	10.3
77	10.4
108	12.7
85	9.3
168	11.8
48	5.9
152	11.4
75	13.0
107	10.8
62	12.3
121	11.3
124	11.8
72	7.9
40	12.7
58	12.3
97	11.6
88	6.7
126	10.9
104	12.1
148	13.3
146	10.1
80	5.7
97	14.3
25	8.0
99	13.3
118	9.3
58	12.5
63	7.6
139	15.9
50	9.2
60	9.1
152	11.1
142	13.0
94	14.5
66	12.2
127	12.3
67	11.4
90	8.8
75	14.6
96	7.3
128	12.6
41	NA
146	13.0
69	12.6
186	13.2
81	9.9
85	7.7
54	10.5
46	13.4
106	10.9
34	4.3
60	10.3
95	11.8
57	11.2
62	11.4
36	8.6
56	13.2
54	12.6
64	5.6
76	9.9
98	8.8
88	7.7
35	9.0
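For reference, the R code at the bottom of this page starts from a data matrix y that the FreeStatistics engine builds from this data series. A minimal sketch of constructing y by hand, assuming the two columns are named B and TOT as in the estimated equation further down ('dataseries_x.txt' is only a hypothetical placeholder for wherever the columns above are stored):

# Minimal sketch; 'dataseries_x.txt' is a hypothetical file holding the two columns above
dat <- read.table('dataseries_x.txt', header = FALSE,
                  col.names = c('B', 'TOT'), na.strings = 'NA')
y <- t(as.matrix(dat))   # the module code below begins with x <- t(y)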





\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 6 seconds \tabularnewline
R Server & 'Sir Maurice George Kendall' @ kendall.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270395&T=0

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
B[t] = 52.4936 + 4.08983 TOT[t] + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270395&T=1
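This is the ordinary least squares fit of B on TOT. A minimal sketch of reproducing the two coefficients with plain lm(), assuming the data frame dat defined in the sketch above:

fit <- lm(B ~ TOT, data = dat)   # the row with a missing TOT value is dropped automatically
coef(fit)                        # (Intercept) ~ 52.4936, TOT ~ 4.08983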

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 52.4936 & 15.8431 & 3.313 & 0.00129666 & 0.000648331 \tabularnewline
TOT & 4.08983 & 1.44293 & 2.834 & 0.0055869 & 0.00279345 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270395&T=2
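In this table T-STAT is the parameter estimate divided by its standard deviation, the 2-tail p-value is taken from the t distribution with 97 residual degrees of freedom, and the 1-tail p-value is half the 2-tail value (the module simply reports mysum$coefficients[i,4]/2). A worked check for the TOT row, using the rounded figures above:

est <- 4.08983; se <- 1.44293; df.resid <- 97   # values from the table above
tstat <- est / se                    # ~ 2.834
p2 <- 2 * pt(-abs(tstat), df.resid)  # ~ 0.00559 (2-tail p-value)
p1 <- p2 / 2                         # ~ 0.00279 (1-tail p-value)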

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.276563 \tabularnewline
R-squared & 0.0764873 \tabularnewline
Adjusted R-squared & 0.0669665 \tabularnewline
F-TEST (value) & 8.03375 \tabularnewline
F-TEST (DF numerator) & 1 \tabularnewline
F-TEST (DF denominator) & 97 \tabularnewline
p-value & 0.0055869 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 35.5468 \tabularnewline
Sum Squared Residuals & 122567 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270395&T=3
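All of these statistics follow from the same fit: Multiple R is the square root of R-squared, Adjusted R-squared corrects for the 99 complete observations and one regressor, the F-test p-value comes from the F(1, 97) distribution (with a single regressor it equals the 2-tail p-value of TOT, since F = t squared), and the Sum Squared Residuals equals the squared Residual Standard Deviation times the 97 residual degrees of freedom. A quick numerical check with the rounded values above:

R2 <- 0.0764873
sqrt(R2)                           # Multiple R ~ 0.276563
1 - (1 - R2) * (99 - 1) / (99 - 2) # Adjusted R-squared ~ 0.0669665
1 - pf(8.03375, 1, 97)             # F-test p-value ~ 0.0055869
35.5468^2 * 97                     # Sum Squared Residuals ~ 122567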

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 96 & 105.252 & -9.25244 \tabularnewline
2 & 75 & 82.7584 & -7.75838 \tabularnewline
3 & 70 & 102.39 & -32.3896 \tabularnewline
4 & 88 & 104.843 & -16.8435 \tabularnewline
5 & 114 & 82.7584 & 31.2416 \tabularnewline
6 & 69 & 79.8955 & -10.8955 \tabularnewline
7 & 176 & 104.025 & 71.9745 \tabularnewline
8 & 114 & 113.023 & 0.976883 \tabularnewline
9 & 121 & 106.888 & 14.1116 \tabularnewline
10 & 110 & 97.8907 & 12.1093 \tabularnewline
11 & 158 & 86.0302 & 71.9698 \tabularnewline
12 & 116 & 99.1177 & 16.8823 \tabularnewline
13 & 181 & 78.6686 & 102.331 \tabularnewline
14 & 77 & 95.8458 & -18.8458 \tabularnewline
15 & 141 & 101.572 & 39.4284 \tabularnewline
16 & 35 & 78.2596 & -43.2596 \tabularnewline
17 & 80 & 98.7087 & -18.7087 \tabularnewline
18 & 152 & 101.163 & 50.8374 \tabularnewline
19 & 97 & 90.5291 & 6.47094 \tabularnewline
20 & 99 & 91.756 & 7.24399 \tabularnewline
21 & 84 & 93.3919 & -9.39194 \tabularnewline
22 & 68 & 78.6686 & -10.6686 \tabularnewline
23 & 101 & 108.933 & -7.93329 \tabularnewline
24 & 107 & 96.6638 & 10.3362 \tabularnewline
25 & 88 & 108.933 & -20.9333 \tabularnewline
26 & 112 & 100.345 & 11.6554 \tabularnewline
27 & 171 & 97.0728 & 73.9272 \tabularnewline
28 & 137 & 118.34 & 18.6601 \tabularnewline
29 & 77 & 107.297 & -30.2974 \tabularnewline
30 & 66 & 92.983 & -26.983 \tabularnewline
31 & 93 & 99.5267 & -6.52668 \tabularnewline
32 & 105 & 86.4392 & 18.5608 \tabularnewline
33 & 131 & 100.345 & 30.6554 \tabularnewline
34 & 89 & 77.4416 & 11.5584 \tabularnewline
35 & 102 & 89.3021 & 12.6979 \tabularnewline
36 & 161 & 92.165 & 68.835 \tabularnewline
37 & 120 & 96.6638 & 23.3362 \tabularnewline
38 & 127 & 94.6189 & 32.3811 \tabularnewline
39 & 77 & 95.0279 & -18.0279 \tabularnewline
40 & 108 & 104.434 & 3.56552 \tabularnewline
41 & 85 & 90.5291 & -5.52906 \tabularnewline
42 & 168 & 100.754 & 67.2464 \tabularnewline
43 & 48 & 76.6236 & -28.6236 \tabularnewline
44 & 152 & 99.1177 & 52.8823 \tabularnewline
45 & 75 & 105.661 & -30.6614 \tabularnewline
46 & 107 & 96.6638 & 10.3362 \tabularnewline
47 & 62 & 102.799 & -40.7985 \tabularnewline
48 & 121 & 98.7087 & 22.2913 \tabularnewline
49 & 124 & 100.754 & 23.2464 \tabularnewline
50 & 72 & 84.8033 & -12.8033 \tabularnewline
51 & 40 & 104.434 & -64.4345 \tabularnewline
52 & 58 & 102.799 & -44.7985 \tabularnewline
53 & 97 & 99.9357 & -2.93566 \tabularnewline
54 & 88 & 79.8955 & 8.1045 \tabularnewline
55 & 126 & 97.0728 & 28.9272 \tabularnewline
56 & 104 & 101.981 & 2.01942 \tabularnewline
57 & 148 & 106.888 & 41.1116 \tabularnewline
58 & 146 & 93.8009 & 52.1991 \tabularnewline
59 & 80 & 75.8057 & 4.19433 \tabularnewline
60 & 97 & 110.978 & -13.9782 \tabularnewline
61 & 25 & 85.2123 & -60.2123 \tabularnewline
62 & 99 & 106.888 & -7.88837 \tabularnewline
63 & 118 & 90.5291 & 27.4709 \tabularnewline
64 & 58 & 103.617 & -45.6165 \tabularnewline
65 & 63 & 83.5763 & -20.5763 \tabularnewline
66 & 139 & 117.522 & 21.4781 \tabularnewline
67 & 50 & 90.1201 & -40.1201 \tabularnewline
68 & 60 & 89.7111 & -29.7111 \tabularnewline
69 & 152 & 97.8907 & 54.1093 \tabularnewline
70 & 142 & 105.661 & 36.3386 \tabularnewline
71 & 94 & 111.796 & -17.7962 \tabularnewline
72 & 66 & 102.39 & -36.3896 \tabularnewline
73 & 127 & 102.799 & 24.2015 \tabularnewline
74 & 67 & 99.1177 & -32.1177 \tabularnewline
75 & 90 & 88.4841 & 1.51586 \tabularnewline
76 & 75 & 112.205 & -37.2052 \tabularnewline
77 & 96 & 82.3494 & 13.6506 \tabularnewline
78 & 128 & 104.025 & 23.9745 \tabularnewline
79 & 41 & 0.661425 & 40.3386 \tabularnewline
80 & 146 & 181.025 & -35.0255 \tabularnewline
81 & 69 & -10.5206 & 79.5206 \tabularnewline
82 & 186 & 197.983 & -11.983 \tabularnewline
83 & 81 & 79.9853 & 1.01467 \tabularnewline
84 & 85 & 126.437 & -41.4369 \tabularnewline
85 & 54 & 115.297 & -61.2974 \tabularnewline
86 & 46 & 37.0728 & 8.92722 \tabularnewline
87 & 106 & 142.08 & -36.0799 \tabularnewline
88 & 34 & 68.6189 & -34.6189 \tabularnewline
89 & 60 & 65.7536 & -5.75363 \tabularnewline
90 & 95 & 136.3 & -41.2997 \tabularnewline
91 & 57 & 94.1177 & -37.1177 \tabularnewline
92 & 62 & 113.666 & -51.6662 \tabularnewline
93 & 36 & 86.4794 & -50.4794 \tabularnewline
94 & 56 & 106.025 & -50.0255 \tabularnewline
95 & 54 & 65.3967 & -11.3967 \tabularnewline
96 & 64 & 80.983 & -16.983 \tabularnewline
97 & 76 & 66.4841 & 9.51586 \tabularnewline
98 & 98 & 93.9853 & 4.01467 \tabularnewline
99 & 88 & 142.302 & -54.3021 \tabularnewline
100 & 35 & NA & NA \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270395&T=4
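Each Interpolation (Forecast) entry is the fitted value from the estimated equation and each Residual (Prediction Error) is the actual minus that fitted value. For the first observation (B = 96, TOT = 12.9):

52.4936 + 4.08983 * 12.9   # fitted value ~ 105.252
96 - 105.252               # residual ~ -9.252 (table: -9.25244)

Note that observation 79 has a missing TOT value; lm() drops that row, so mysum$resid holds only 99 elements and the loop x[i] - mysum$resid[i] in the module code pairs each actual with a residual that is shifted by one observation from row 79 onward. This appears to explain the implausible Interpolation values near the end of the table (e.g. 181.025 and the negative entries) and the NA pair in row 100.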

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
5 & 0.170245 & 0.34049 & 0.829755 \tabularnewline
6 & 0.118665 & 0.237331 & 0.881335 \tabularnewline
7 & 0.684424 & 0.631152 & 0.315576 \tabularnewline
8 & 0.563052 & 0.873896 & 0.436948 \tabularnewline
9 & 0.451708 & 0.903416 & 0.548292 \tabularnewline
10 & 0.345973 & 0.691946 & 0.654027 \tabularnewline
11 & 0.56373 & 0.872541 & 0.43627 \tabularnewline
12 & 0.469216 & 0.938433 & 0.530784 \tabularnewline
13 & 0.767434 & 0.465131 & 0.232566 \tabularnewline
14 & 0.756251 & 0.487497 & 0.243749 \tabularnewline
15 & 0.738686 & 0.522627 & 0.261314 \tabularnewline
16 & 0.867252 & 0.265496 & 0.132748 \tabularnewline
17 & 0.846062 & 0.307876 & 0.153938 \tabularnewline
18 & 0.861098 & 0.277805 & 0.138902 \tabularnewline
19 & 0.818619 & 0.362762 & 0.181381 \tabularnewline
20 & 0.768907 & 0.462186 & 0.231093 \tabularnewline
21 & 0.728582 & 0.542836 & 0.271418 \tabularnewline
22 & 0.693297 & 0.613405 & 0.306703 \tabularnewline
23 & 0.638187 & 0.723626 & 0.361813 \tabularnewline
24 & 0.573343 & 0.853314 & 0.426657 \tabularnewline
25 & 0.537181 & 0.925639 & 0.462819 \tabularnewline
26 & 0.472627 & 0.945255 & 0.527373 \tabularnewline
27 & 0.644254 & 0.711492 & 0.355746 \tabularnewline
28 & 0.594213 & 0.811574 & 0.405787 \tabularnewline
29 & 0.592488 & 0.815024 & 0.407512 \tabularnewline
30 & 0.587373 & 0.825253 & 0.412627 \tabularnewline
31 & 0.53176 & 0.936479 & 0.46824 \tabularnewline
32 & 0.478841 & 0.957682 & 0.521159 \tabularnewline
33 & 0.452626 & 0.905253 & 0.547374 \tabularnewline
34 & 0.398431 & 0.796862 & 0.601569 \tabularnewline
35 & 0.345535 & 0.691069 & 0.654465 \tabularnewline
36 & 0.486054 & 0.972108 & 0.513946 \tabularnewline
37 & 0.446233 & 0.892466 & 0.553767 \tabularnewline
38 & 0.428378 & 0.856757 & 0.571622 \tabularnewline
39 & 0.399947 & 0.799893 & 0.600053 \tabularnewline
40 & 0.345081 & 0.690161 & 0.654919 \tabularnewline
41 & 0.301536 & 0.603072 & 0.698464 \tabularnewline
42 & 0.4426 & 0.885201 & 0.5574 \tabularnewline
43 & 0.445319 & 0.890639 & 0.554681 \tabularnewline
44 & 0.515959 & 0.968083 & 0.484041 \tabularnewline
45 & 0.515096 & 0.969808 & 0.484904 \tabularnewline
46 & 0.465446 & 0.930893 & 0.534554 \tabularnewline
47 & 0.497095 & 0.99419 & 0.502905 \tabularnewline
48 & 0.46541 & 0.930821 & 0.53459 \tabularnewline
49 & 0.43688 & 0.873759 & 0.56312 \tabularnewline
50 & 0.395686 & 0.791372 & 0.604314 \tabularnewline
51 & 0.539059 & 0.921882 & 0.460941 \tabularnewline
52 & 0.575022 & 0.849956 & 0.424978 \tabularnewline
53 & 0.519312 & 0.961376 & 0.480688 \tabularnewline
54 & 0.474902 & 0.949805 & 0.525098 \tabularnewline
55 & 0.464865 & 0.929731 & 0.535135 \tabularnewline
56 & 0.409727 & 0.819454 & 0.590273 \tabularnewline
57 & 0.442393 & 0.884786 & 0.557607 \tabularnewline
58 & 0.544274 & 0.911451 & 0.455726 \tabularnewline
59 & 0.505969 & 0.988061 & 0.494031 \tabularnewline
60 & 0.454886 & 0.909772 & 0.545114 \tabularnewline
61 & 0.549732 & 0.900537 & 0.450268 \tabularnewline
62 & 0.492723 & 0.985446 & 0.507277 \tabularnewline
63 & 0.49649 & 0.99298 & 0.50351 \tabularnewline
64 & 0.524255 & 0.95149 & 0.475745 \tabularnewline
65 & 0.47858 & 0.95716 & 0.52142 \tabularnewline
66 & 0.446287 & 0.892575 & 0.553713 \tabularnewline
67 & 0.443566 & 0.887132 & 0.556434 \tabularnewline
68 & 0.410971 & 0.821942 & 0.589029 \tabularnewline
69 & 0.544175 & 0.911651 & 0.455825 \tabularnewline
70 & 0.591927 & 0.816147 & 0.408073 \tabularnewline
71 & 0.535388 & 0.929223 & 0.464612 \tabularnewline
72 & 0.513019 & 0.973961 & 0.486981 \tabularnewline
73 & 0.515944 & 0.968112 & 0.484056 \tabularnewline
74 & 0.478007 & 0.956015 & 0.521993 \tabularnewline
75 & 0.423394 & 0.846787 & 0.576606 \tabularnewline
76 & 0.398858 & 0.797716 & 0.601142 \tabularnewline
77 & 0.378426 & 0.756852 & 0.621574 \tabularnewline
78 & 0.379977 & 0.759953 & 0.620023 \tabularnewline
79 & 0.495909 & 0.991819 & 0.504091 \tabularnewline
80 & 0.446544 & 0.893088 & 0.553456 \tabularnewline
81 & 0.970796 & 0.0584088 & 0.0292044 \tabularnewline
82 & 0.95904 & 0.0819205 & 0.0409603 \tabularnewline
83 & 0.949791 & 0.100417 & 0.0502087 \tabularnewline
84 & 0.931172 & 0.137656 & 0.0688282 \tabularnewline
85 & 0.931582 & 0.136836 & 0.0684182 \tabularnewline
86 & 0.954429 & 0.0911418 & 0.0455709 \tabularnewline
87 & 0.954618 & 0.0907649 & 0.0453825 \tabularnewline
88 & 0.925207 & 0.149587 & 0.0747934 \tabularnewline
89 & 0.940405 & 0.119191 & 0.0595954 \tabularnewline
90 & 0.897829 & 0.204341 & 0.102171 \tabularnewline
91 & 0.829888 & 0.340223 & 0.170112 \tabularnewline
92 & 0.847845 & 0.30431 & 0.152155 \tabularnewline
93 & 0.741517 & 0.516967 & 0.258483 \tabularnewline
94 & 0.608427 & 0.783146 & 0.391573 \tabularnewline
95 & 0.44117 & 0.882339 & 0.55883 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270395&T=5
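Each row of this table is a separate Goldfeld-Quandt test that splits the sample at the given breakpoint index; the three columns are the p-values under the alternatives 'greater', 'two.sided' and 'less', exactly as produced by gqtest() in the module code below. A minimal sketch for a single breakpoint, assuming mylm is the fitted model from that code:

library(lmtest)
# p-values for breakpoint 81, in the same column order as the table
sapply(c('greater', 'two.sided', 'less'),
       function(myalt) gqtest(mylm, point = 81, alternative = myalt)$p.value)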

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & \# significant tests & \% significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 4 & 0.043956 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270395&T=6
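The meta analysis counts, across the 91 breakpoints tested above (indices 5 through 95), how many 2-sided p-values fall below each type I error level: none are below 0.01 or 0.05, and 4 of them (breakpoints 81, 82, 86 and 87) are below 0.10, giving 4/91 ≈ 0.043956 (reported as a fraction despite the '%' label). A quick check, assuming gqarr is the p-value matrix built in the module code below:

sum(gqarr[, 2] < 0.10)               # 4 significant tests at the 10% level
sum(gqarr[, 2] < 0.10) / nrow(gqarr) # 4 / 91 ~ 0.043956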




Parameters (Session):
par1 = 5 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
par3 <- 'No Linear Trend'
par2 <- 'Do not include Seasonal Dummies'
par1 <- '1'
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)   # y holds the data series supplied by the engine; after transposing, rows are observations and columns are variables
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])   # move the dependent variable (column par1) to the front
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
if (n > n25) {   # Goldfeld-Quandt test at every admissible breakpoint (only if there are enough observations)
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))   # one row per breakpoint; columns: 'greater', 'two.sided', 'less' p-values
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')   # loads the table.start / table.row.* / table.element helpers used to build the output tables
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}