Author's title:
Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Thu, 18 Dec 2014 13:08:04 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/18/t14189081180whmhcmpg1fgis3.htm/, Retrieved Fri, 17 May 2024 18:16:05 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=270891, Retrieved Fri, 17 May 2024 18:16:05 +0000

Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 99
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-     [Multiple Regression] [Verklaring examen...] [2014-12-18 10:27:57] [94e0b03eaaae24ea322c1a0c8a3c30a1]
- R  D    [Multiple Regression] [totaal score en r...] [2014-12-18 13:08:04] [0adf43ccf8dfa476608a94fd7836e72e] [Current]
Dataseries X (columns, in order: TOT, PO, PS, PV):
13	20	39	12
12	20	32	9
13	31	25	12
7	16	36	8
7	24	28	9
13	24	34	9
15	27	36	12
13	21	26	9
11	24	32	10
8	16	36	13
11	19	26	8
6	24	29	11
11	24	34	7
12	11	35	11
6	27	42	13
11	40	33	12
12	21	32	9
9	27	34	9
10	24	33	9
10	19	36	12
14	27	32	12
11	29	34	9
14	0	0	0
12	19	32	14
11	29	35	13
16	31	42	12
13	21	28	9
10	22	33	10
12	19	34	8
8	18	30	4
12	32	30	7
9	22	30	8
10	39	34	13
11	20	30	11
10	17	30	5
10	23	38	9
13	22	31	9
9	22	35	10
12	14	36	12
6	15	30	5
11	20	32	8
13	31	43	15
11	16	31	4
12	23	32	9
11	25	37	9
12	13	33	10
8	19	32	11
13	37	44	14
12	22	33	12
12	13	33	12
7	22	29	7
11	14	22	7
12	25	39	10
13	15	28	15
10	17	37	10
6	27	39	10
14	16	34	10
8	17	32	11
13	22	33	9
9	15	32	11
13	18	35	10
8	28	39	10
16	14	29	9
9	26	33	9
9	16	27	8
11	20	40	8
13	14	41	12
15	25	24	9
12	10	32	3
12	22	38	13
11	26	36	10
9	0	0	0
15	21	33	12
13	18	24	7
13	18	32	9
13	22	36	12
13	22	35	11
10	12	32	9
8	11	36	10
11	13	33	11
13	13	32	6
11	19	42	10
4	26	24	10
10	26	29	9
12	22	33	9
11	19	32	10
11	34	42	12
9	22	30	7
13	18	30	10
13	20	31	8
6	29	24	12
10	22	35	11
9	22	26	7
8	16	34	9
9	26	31	6
7	18	32	9
11	16	26	10
14	18	33	8
8	28	42	11
11	29	35	10
10	21	28	6
10	23	30	9
14	14	33	12
9	28	34	10
14	21	31	10
6	24	32	10
10	10	36	6
12	18	36	12
11	12	39	11




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Herman Ole Andreas Wold' @ wold.wessa.net

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code)  \tabularnewline
Raw Output & view raw output of R engine  \tabularnewline
Computing time & 5 seconds \tabularnewline
R Server & 'Herman Ole Andreas Wold' @ wold.wessa.net \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270891&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code) [/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine [/C][/ROW]
[ROW][C]Computing time[/C][C]5 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]'Herman Ole Andreas Wold' @ wold.wessa.net[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=270891&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270891&T=0


Multiple Linear Regression - Estimated Regression Equation
TOT[t] = 10.532 - 0.054965 PO[t] - 0.0156928 PS[t] + 0.205519 PV[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TOT[t] =  +  10.532 -0.054965PO[t] -0.0156928PS[t] +  0.205519PV[t]  + e[t] \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270891&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]TOT[t] =  +  10.532 -0.054965PO[t] -0.0156928PS[t] +  0.205519PV[t]  + e[t][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=270891&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270891&T=1








Multiple Linear Regression - Ordinary Least Squares
Variable      Parameter    S.D.       T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)   10.532       1.24331    8.471                       1.60519e-13     8.02596e-14
PO            -0.054965    0.039834   -1.38                       0.170564        0.085282
PS            -0.0156928   0.046927   -0.3344                     0.738738        0.369369
PV            0.205519     0.114923   1.788                       0.0766098       0.0383049

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & 10.532 & 1.24331 & 8.471 & 1.60519e-13 & 8.02596e-14 \tabularnewline
PO & -0.054965 & 0.039834 & -1.38 & 0.170564 & 0.085282 \tabularnewline
PS & -0.0156928 & 0.046927 & -0.3344 & 0.738738 & 0.369369 \tabularnewline
PV & 0.205519 & 0.114923 & 1.788 & 0.0766098 & 0.0383049 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270891&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]10.532[/C][C]1.24331[/C][C]8.471[/C][C]1.60519e-13[/C][C]8.02596e-14[/C][/ROW]
[ROW][C]PO[/C][C]-0.054965[/C][C]0.039834[/C][C]-1.38[/C][C]0.170564[/C][C]0.085282[/C][/ROW]
[ROW][C]PS[/C][C]-0.0156928[/C][C]0.046927[/C][C]-0.3344[/C][C]0.738738[/C][C]0.369369[/C][/ROW]
[ROW][C]PV[/C][C]0.205519[/C][C]0.114923[/C][C]1.788[/C][C]0.0766098[/C][C]0.0383049[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=270891&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270891&T=2
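
For readers who want to re-estimate these coefficients outside the archived module, a minimal sketch in base R is given below. It assumes the data series listed above has columns named TOT, PO, PS and PV in that order (TOT being the dependent variable, as in the estimated equation); only the first few rows are pasted here as a placeholder.

# Minimal reproduction sketch (assumed column names and order: TOT, PO, PS, PV).
# Paste the complete 109-row data series into text= to reproduce the archived estimates.
df <- read.table(text = "
13 20 39 12
12 20 32 9
13 31 25 12
7 16 36 8
7 24 28 9
13 24 34 9
", col.names = c("TOT", "PO", "PS", "PV"))
fit <- lm(TOT ~ PO + PS + PV, data = df)
summary(fit)                          # Parameter, S.D., T-STAT, 2-tail p-value
summary(fit)$coefficients[, 4] / 2    # 1-tail p-values, as reported in the table above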








Multiple Linear Regression - Regression Statistics
Multiple R: 0.194766
R-squared: 0.0379338
Adjusted R-squared: 0.0104462
F-TEST (value): 1.38003
F-TEST (DF numerator): 3
F-TEST (DF denominator): 105
p-value: 0.25298

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 2.40835
Sum Squared Residuals: 609.014

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R & 0.194766 \tabularnewline
R-squared & 0.0379338 \tabularnewline
Adjusted R-squared & 0.0104462 \tabularnewline
F-TEST (value) & 1.38003 \tabularnewline
F-TEST (DF numerator) & 3 \tabularnewline
F-TEST (DF denominator) & 105 \tabularnewline
p-value & 0.25298 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation & 2.40835 \tabularnewline
Sum Squared Residuals & 609.014 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270891&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C]0.194766[/C][/ROW]
[ROW][C]R-squared[/C][C]0.0379338[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]0.0104462[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C]1.38003[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]3[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]105[/C][/ROW]
[ROW][C]p-value[/C][C]0.25298[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C]2.40835[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C]609.014[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=270891&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270891&T=3
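
As a consistency check, the overall-fit numbers above can be recomputed from one another. A small sketch with the values copied from the tables (n = 109 observations, k = 3 regressors):

r2   <- 0.0379338   # R-squared (from the table above)
Fval <- 1.38003     # F-TEST (value)
df1  <- 3           # F-TEST (DF numerator) = k
df2  <- 105         # F-TEST (DF denominator) = n - k - 1
sqrt(r2)                           # Multiple R, approx. 0.194766
1 - (1 - r2) * (df1 + df2) / df2   # Adjusted R-squared, approx. 0.0104462 (df1 + df2 = n - 1)
1 - pf(Fval, df1, df2)             # F-test p-value, approx. 0.25298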









\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 & 13 & 11.2869 & 1.71313 \tabularnewline
2 & 12 & 10.7802 & 1.21984 \tabularnewline
3 & 13 & 10.902 & 2.09805 \tabularnewline
4 & 7 & 10.7317 & -3.73173 \tabularnewline
5 & 7 & 10.6231 & -3.62307 \tabularnewline
6 & 13 & 10.5289 & 2.47108 \tabularnewline
7 & 15 & 10.9492 & 4.05081 \tabularnewline
8 & 13 & 10.8194 & 2.18064 \tabularnewline
9 & 11 & 10.7658 & 0.234178 \tabularnewline
10 & 8 & 11.7593 & -3.75933 \tabularnewline
11 & 11 & 10.7238 & 0.276233 \tabularnewline
12 & 6 & 11.0184 & -5.01842 \tabularnewline
13 & 11 & 10.1179 & 0.88212 \tabularnewline
14 & 12 & 11.6388 & 0.361193 \tabularnewline
15 & 6 & 11.0606 & -5.06055 \tabularnewline
16 & 11 & 10.2817 & 0.718273 \tabularnewline
17 & 12 & 10.7252 & 1.2748 \tabularnewline
18 & 9 & 10.364 & -1.36402 \tabularnewline
19 & 10 & 10.5446 & -0.544611 \tabularnewline
20 & 10 & 11.3889 & -1.38891 \tabularnewline
21 & 14 & 11.012 & 2.98804 \tabularnewline
22 & 11 & 10.2541 & 0.745907 \tabularnewline
23 & 14 & 10.532 & 3.46803 \tabularnewline
24 & 12 & 11.8627 & 0.137278 \tabularnewline
25 & 11 & 11.0605 & -0.0604748 \tabularnewline
26 & 16 & 10.6352 & 5.36482 \tabularnewline
27 & 13 & 10.788 & 2.21203 \tabularnewline
28 & 10 & 10.8601 & -0.860059 \tabularnewline
29 & 12 & 10.5982 & 1.40178 \tabularnewline
30 & 8 & 9.89389 & -1.89389 \tabularnewline
31 & 12 & 9.74093 & 2.25907 \tabularnewline
32 & 9 & 10.4961 & -1.4961 \tabularnewline
33 & 10 & 10.5265 & -0.526518 \tabularnewline
34 & 11 & 11.2226 & -0.222586 \tabularnewline
35 & 10 & 10.1544 & -0.154369 \tabularnewline
36 & 10 & 10.5211 & -0.521111 \tabularnewline
37 & 13 & 10.6859 & 2.31407 \tabularnewline
38 & 9 & 10.8287 & -1.82867 \tabularnewline
39 & 12 & 11.6637 & 0.336262 \tabularnewline
40 & 6 & 10.2643 & -4.2643 \tabularnewline
41 & 11 & 10.5746 & 0.425355 \tabularnewline
42 & 13 & 11.236 & 1.76396 \tabularnewline
43 & 11 & 9.98812 & 1.01188 \tabularnewline
44 & 12 & 10.6153 & 1.38473 \tabularnewline
45 & 11 & 10.4269 & 0.573126 \tabularnewline
46 & 12 & 11.3547 & 0.645256 \tabularnewline
47 & 8 & 11.2462 & -3.24617 \tabularnewline
48 & 13 & 10.685 & 2.31496 \tabularnewline
49 & 12 & 11.2711 & 0.728903 \tabularnewline
50 & 12 & 11.7658 & 0.234219 \tabularnewline
51 & 7 & 10.3063 & -3.30627 \tabularnewline
52 & 11 & 10.8558 & 0.144156 \tabularnewline
53 & 12 & 10.601 & 1.39899 \tabularnewline
54 & 13 & 12.3509 & 0.649129 \tabularnewline
55 & 10 & 11.0721 & -1.07211 \tabularnewline
56 & 6 & 10.4911 & -4.49108 \tabularnewline
57 & 14 & 11.1742 & 2.82584 \tabularnewline
58 & 8 & 11.3561 & -3.3561 \tabularnewline
59 & 13 & 10.6545 & 2.34546 \tabularnewline
60 & 9 & 11.466 & -2.46603 \tabularnewline
61 & 13 & 11.0485 & 1.95147 \tabularnewline
62 & 8 & 10.4361 & -2.43611 \tabularnewline
63 & 16 & 11.157 & 4.84297 \tabularnewline
64 & 9 & 10.4347 & -1.43468 \tabularnewline
65 & 9 & 10.873 & -1.87297 \tabularnewline
66 & 11 & 10.4491 & 0.550898 \tabularnewline
67 & 13 & 11.5853 & 1.41473 \tabularnewline
68 & 15 & 10.6309 & 4.36912 \tabularnewline
69 & 12 & 10.0967 & 1.9033 \tabularnewline
70 & 12 & 11.3982 & 0.601849 \tabularnewline
71 & 11 & 10.5931 & 0.406879 \tabularnewline
72 & 9 & 10.532 & -1.53197 \tabularnewline
73 & 15 & 11.3261 & 3.67394 \tabularnewline
74 & 13 & 10.6046 & 2.3954 \tabularnewline
75 & 13 & 10.8901 & 2.10991 \tabularnewline
76 & 13 & 11.224 & 1.77598 \tabularnewline
77 & 13 & 11.0342 & 1.96581 \tabularnewline
78 & 10 & 11.2199 & -1.21988 \tabularnewline
79 & 8 & 11.4176 & -3.4176 \tabularnewline
80 & 11 & 11.5603 & -0.560262 \tabularnewline
81 & 13 & 10.5484 & 2.45164 \tabularnewline
82 & 11 & 10.8837 & 0.116281 \tabularnewline
83 & 4 & 10.7814 & -6.78143 \tabularnewline
84 & 10 & 10.4975 & -0.497452 \tabularnewline
85 & 12 & 10.6545 & 1.34546 \tabularnewline
86 & 11 & 11.0406 & -0.0406469 \tabularnewline
87 & 11 & 10.4703 & 0.529719 \tabularnewline
88 & 9 & 10.2906 & -1.29058 \tabularnewline
89 & 13 & 11.127 & 1.873 \tabularnewline
90 & 13 & 10.5903 & 2.40966 \tabularnewline
91 & 6 & 11.0276 & -5.02758 \tabularnewline
92 & 10 & 11.0342 & -1.03419 \tabularnewline
93 & 9 & 10.3534 & -1.35335 \tabularnewline
94 & 8 & 10.9686 & -2.96864 \tabularnewline
95 & 9 & 9.84951 & -0.84951 \tabularnewline
96 & 7 & 10.8901 & -3.89009 \tabularnewline
97 & 11 & 11.2997 & -0.299699 \tabularnewline
98 & 14 & 10.6689 & 3.33112 \tabularnewline
99 & 8 & 10.5946 & -2.59455 \tabularnewline
100 & 11 & 10.4439 & 0.556081 \tabularnewline
101 & 10 & 10.1714 & -0.171414 \tabularnewline
102 & 10 & 10.6467 & -0.646654 \tabularnewline
103 & 14 & 11.7108 & 2.28918 \tabularnewline
104 & 9 & 10.5146 & -1.51458 \tabularnewline
105 & 14 & 10.9464 & 3.05359 \tabularnewline
106 & 6 & 10.7658 & -4.76582 \tabularnewline
107 & 10 & 10.6505 & -0.650486 \tabularnewline
108 & 12 & 11.4439 & 0.556122 \tabularnewline
109 & 11 & 11.5211 & -0.52107 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270891&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C]13[/C][C]11.2869[/C][C]1.71313[/C][/ROW]
[ROW][C]2[/C][C]12[/C][C]10.7802[/C][C]1.21984[/C][/ROW]
[ROW][C]3[/C][C]13[/C][C]10.902[/C][C]2.09805[/C][/ROW]
[ROW][C]4[/C][C]7[/C][C]10.7317[/C][C]-3.73173[/C][/ROW]
[ROW][C]5[/C][C]7[/C][C]10.6231[/C][C]-3.62307[/C][/ROW]
[ROW][C]6[/C][C]13[/C][C]10.5289[/C][C]2.47108[/C][/ROW]
[ROW][C]7[/C][C]15[/C][C]10.9492[/C][C]4.05081[/C][/ROW]
[ROW][C]8[/C][C]13[/C][C]10.8194[/C][C]2.18064[/C][/ROW]
[ROW][C]9[/C][C]11[/C][C]10.7658[/C][C]0.234178[/C][/ROW]
[ROW][C]10[/C][C]8[/C][C]11.7593[/C][C]-3.75933[/C][/ROW]
[ROW][C]11[/C][C]11[/C][C]10.7238[/C][C]0.276233[/C][/ROW]
[ROW][C]12[/C][C]6[/C][C]11.0184[/C][C]-5.01842[/C][/ROW]
[ROW][C]13[/C][C]11[/C][C]10.1179[/C][C]0.88212[/C][/ROW]
[ROW][C]14[/C][C]12[/C][C]11.6388[/C][C]0.361193[/C][/ROW]
[ROW][C]15[/C][C]6[/C][C]11.0606[/C][C]-5.06055[/C][/ROW]
[ROW][C]16[/C][C]11[/C][C]10.2817[/C][C]0.718273[/C][/ROW]
[ROW][C]17[/C][C]12[/C][C]10.7252[/C][C]1.2748[/C][/ROW]
[ROW][C]18[/C][C]9[/C][C]10.364[/C][C]-1.36402[/C][/ROW]
[ROW][C]19[/C][C]10[/C][C]10.5446[/C][C]-0.544611[/C][/ROW]
[ROW][C]20[/C][C]10[/C][C]11.3889[/C][C]-1.38891[/C][/ROW]
[ROW][C]21[/C][C]14[/C][C]11.012[/C][C]2.98804[/C][/ROW]
[ROW][C]22[/C][C]11[/C][C]10.2541[/C][C]0.745907[/C][/ROW]
[ROW][C]23[/C][C]14[/C][C]10.532[/C][C]3.46803[/C][/ROW]
[ROW][C]24[/C][C]12[/C][C]11.8627[/C][C]0.137278[/C][/ROW]
[ROW][C]25[/C][C]11[/C][C]11.0605[/C][C]-0.0604748[/C][/ROW]
[ROW][C]26[/C][C]16[/C][C]10.6352[/C][C]5.36482[/C][/ROW]
[ROW][C]27[/C][C]13[/C][C]10.788[/C][C]2.21203[/C][/ROW]
[ROW][C]28[/C][C]10[/C][C]10.8601[/C][C]-0.860059[/C][/ROW]
[ROW][C]29[/C][C]12[/C][C]10.5982[/C][C]1.40178[/C][/ROW]
[ROW][C]30[/C][C]8[/C][C]9.89389[/C][C]-1.89389[/C][/ROW]
[ROW][C]31[/C][C]12[/C][C]9.74093[/C][C]2.25907[/C][/ROW]
[ROW][C]32[/C][C]9[/C][C]10.4961[/C][C]-1.4961[/C][/ROW]
[ROW][C]33[/C][C]10[/C][C]10.5265[/C][C]-0.526518[/C][/ROW]
[ROW][C]34[/C][C]11[/C][C]11.2226[/C][C]-0.222586[/C][/ROW]
[ROW][C]35[/C][C]10[/C][C]10.1544[/C][C]-0.154369[/C][/ROW]
[ROW][C]36[/C][C]10[/C][C]10.5211[/C][C]-0.521111[/C][/ROW]
[ROW][C]37[/C][C]13[/C][C]10.6859[/C][C]2.31407[/C][/ROW]
[ROW][C]38[/C][C]9[/C][C]10.8287[/C][C]-1.82867[/C][/ROW]
[ROW][C]39[/C][C]12[/C][C]11.6637[/C][C]0.336262[/C][/ROW]
[ROW][C]40[/C][C]6[/C][C]10.2643[/C][C]-4.2643[/C][/ROW]
[ROW][C]41[/C][C]11[/C][C]10.5746[/C][C]0.425355[/C][/ROW]
[ROW][C]42[/C][C]13[/C][C]11.236[/C][C]1.76396[/C][/ROW]
[ROW][C]43[/C][C]11[/C][C]9.98812[/C][C]1.01188[/C][/ROW]
[ROW][C]44[/C][C]12[/C][C]10.6153[/C][C]1.38473[/C][/ROW]
[ROW][C]45[/C][C]11[/C][C]10.4269[/C][C]0.573126[/C][/ROW]
[ROW][C]46[/C][C]12[/C][C]11.3547[/C][C]0.645256[/C][/ROW]
[ROW][C]47[/C][C]8[/C][C]11.2462[/C][C]-3.24617[/C][/ROW]
[ROW][C]48[/C][C]13[/C][C]10.685[/C][C]2.31496[/C][/ROW]
[ROW][C]49[/C][C]12[/C][C]11.2711[/C][C]0.728903[/C][/ROW]
[ROW][C]50[/C][C]12[/C][C]11.7658[/C][C]0.234219[/C][/ROW]
[ROW][C]51[/C][C]7[/C][C]10.3063[/C][C]-3.30627[/C][/ROW]
[ROW][C]52[/C][C]11[/C][C]10.8558[/C][C]0.144156[/C][/ROW]
[ROW][C]53[/C][C]12[/C][C]10.601[/C][C]1.39899[/C][/ROW]
[ROW][C]54[/C][C]13[/C][C]12.3509[/C][C]0.649129[/C][/ROW]
[ROW][C]55[/C][C]10[/C][C]11.0721[/C][C]-1.07211[/C][/ROW]
[ROW][C]56[/C][C]6[/C][C]10.4911[/C][C]-4.49108[/C][/ROW]
[ROW][C]57[/C][C]14[/C][C]11.1742[/C][C]2.82584[/C][/ROW]
[ROW][C]58[/C][C]8[/C][C]11.3561[/C][C]-3.3561[/C][/ROW]
[ROW][C]59[/C][C]13[/C][C]10.6545[/C][C]2.34546[/C][/ROW]
[ROW][C]60[/C][C]9[/C][C]11.466[/C][C]-2.46603[/C][/ROW]
[ROW][C]61[/C][C]13[/C][C]11.0485[/C][C]1.95147[/C][/ROW]
[ROW][C]62[/C][C]8[/C][C]10.4361[/C][C]-2.43611[/C][/ROW]
[ROW][C]63[/C][C]16[/C][C]11.157[/C][C]4.84297[/C][/ROW]
[ROW][C]64[/C][C]9[/C][C]10.4347[/C][C]-1.43468[/C][/ROW]
[ROW][C]65[/C][C]9[/C][C]10.873[/C][C]-1.87297[/C][/ROW]
[ROW][C]66[/C][C]11[/C][C]10.4491[/C][C]0.550898[/C][/ROW]
[ROW][C]67[/C][C]13[/C][C]11.5853[/C][C]1.41473[/C][/ROW]
[ROW][C]68[/C][C]15[/C][C]10.6309[/C][C]4.36912[/C][/ROW]
[ROW][C]69[/C][C]12[/C][C]10.0967[/C][C]1.9033[/C][/ROW]
[ROW][C]70[/C][C]12[/C][C]11.3982[/C][C]0.601849[/C][/ROW]
[ROW][C]71[/C][C]11[/C][C]10.5931[/C][C]0.406879[/C][/ROW]
[ROW][C]72[/C][C]9[/C][C]10.532[/C][C]-1.53197[/C][/ROW]
[ROW][C]73[/C][C]15[/C][C]11.3261[/C][C]3.67394[/C][/ROW]
[ROW][C]74[/C][C]13[/C][C]10.6046[/C][C]2.3954[/C][/ROW]
[ROW][C]75[/C][C]13[/C][C]10.8901[/C][C]2.10991[/C][/ROW]
[ROW][C]76[/C][C]13[/C][C]11.224[/C][C]1.77598[/C][/ROW]
[ROW][C]77[/C][C]13[/C][C]11.0342[/C][C]1.96581[/C][/ROW]
[ROW][C]78[/C][C]10[/C][C]11.2199[/C][C]-1.21988[/C][/ROW]
[ROW][C]79[/C][C]8[/C][C]11.4176[/C][C]-3.4176[/C][/ROW]
[ROW][C]80[/C][C]11[/C][C]11.5603[/C][C]-0.560262[/C][/ROW]
[ROW][C]81[/C][C]13[/C][C]10.5484[/C][C]2.45164[/C][/ROW]
[ROW][C]82[/C][C]11[/C][C]10.8837[/C][C]0.116281[/C][/ROW]
[ROW][C]83[/C][C]4[/C][C]10.7814[/C][C]-6.78143[/C][/ROW]
[ROW][C]84[/C][C]10[/C][C]10.4975[/C][C]-0.497452[/C][/ROW]
[ROW][C]85[/C][C]12[/C][C]10.6545[/C][C]1.34546[/C][/ROW]
[ROW][C]86[/C][C]11[/C][C]11.0406[/C][C]-0.0406469[/C][/ROW]
[ROW][C]87[/C][C]11[/C][C]10.4703[/C][C]0.529719[/C][/ROW]
[ROW][C]88[/C][C]9[/C][C]10.2906[/C][C]-1.29058[/C][/ROW]
[ROW][C]89[/C][C]13[/C][C]11.127[/C][C]1.873[/C][/ROW]
[ROW][C]90[/C][C]13[/C][C]10.5903[/C][C]2.40966[/C][/ROW]
[ROW][C]91[/C][C]6[/C][C]11.0276[/C][C]-5.02758[/C][/ROW]
[ROW][C]92[/C][C]10[/C][C]11.0342[/C][C]-1.03419[/C][/ROW]
[ROW][C]93[/C][C]9[/C][C]10.3534[/C][C]-1.35335[/C][/ROW]
[ROW][C]94[/C][C]8[/C][C]10.9686[/C][C]-2.96864[/C][/ROW]
[ROW][C]95[/C][C]9[/C][C]9.84951[/C][C]-0.84951[/C][/ROW]
[ROW][C]96[/C][C]7[/C][C]10.8901[/C][C]-3.89009[/C][/ROW]
[ROW][C]97[/C][C]11[/C][C]11.2997[/C][C]-0.299699[/C][/ROW]
[ROW][C]98[/C][C]14[/C][C]10.6689[/C][C]3.33112[/C][/ROW]
[ROW][C]99[/C][C]8[/C][C]10.5946[/C][C]-2.59455[/C][/ROW]
[ROW][C]100[/C][C]11[/C][C]10.4439[/C][C]0.556081[/C][/ROW]
[ROW][C]101[/C][C]10[/C][C]10.1714[/C][C]-0.171414[/C][/ROW]
[ROW][C]102[/C][C]10[/C][C]10.6467[/C][C]-0.646654[/C][/ROW]
[ROW][C]103[/C][C]14[/C][C]11.7108[/C][C]2.28918[/C][/ROW]
[ROW][C]104[/C][C]9[/C][C]10.5146[/C][C]-1.51458[/C][/ROW]
[ROW][C]105[/C][C]14[/C][C]10.9464[/C][C]3.05359[/C][/ROW]
[ROW][C]106[/C][C]6[/C][C]10.7658[/C][C]-4.76582[/C][/ROW]
[ROW][C]107[/C][C]10[/C][C]10.6505[/C][C]-0.650486[/C][/ROW]
[ROW][C]108[/C][C]12[/C][C]11.4439[/C][C]0.556122[/C][/ROW]
[ROW][C]109[/C][C]11[/C][C]11.5211[/C][C]-0.52107[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=270891&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270891&T=4
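
In this table the Interpolation (Forecast) column holds the fitted values and the Residuals (Prediction Error) column holds the corresponding residuals, so every row satisfies Actuals = Interpolation + Residual. A minimal sketch of how these columns can be extracted in R, assuming the fit object from the earlier sketch:

interp <- fitted(fit)      # Interpolation (Forecast) column
res    <- residuals(fit)   # Residuals (Prediction Error) column
head(cbind(actual = interp + res, interpolation = interp, residual = res))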









\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
7 & 0.622946 & 0.754107 & 0.377054 \tabularnewline
8 & 0.807997 & 0.384006 & 0.192003 \tabularnewline
9 & 0.70796 & 0.58408 & 0.29204 \tabularnewline
10 & 0.742207 & 0.515585 & 0.257793 \tabularnewline
11 & 0.70516 & 0.58968 & 0.29484 \tabularnewline
12 & 0.900043 & 0.199915 & 0.0999574 \tabularnewline
13 & 0.859368 & 0.281263 & 0.140632 \tabularnewline
14 & 0.879743 & 0.240514 & 0.120257 \tabularnewline
15 & 0.965197 & 0.0696059 & 0.0348029 \tabularnewline
16 & 0.946493 & 0.107014 & 0.0535068 \tabularnewline
17 & 0.927804 & 0.144392 & 0.0721958 \tabularnewline
18 & 0.907772 & 0.184457 & 0.0922283 \tabularnewline
19 & 0.8728 & 0.254399 & 0.1272 \tabularnewline
20 & 0.832698 & 0.334605 & 0.167302 \tabularnewline
21 & 0.851394 & 0.297211 & 0.148606 \tabularnewline
22 & 0.807616 & 0.384768 & 0.192384 \tabularnewline
23 & 0.794502 & 0.410997 & 0.205498 \tabularnewline
24 & 0.744748 & 0.510503 & 0.255252 \tabularnewline
25 & 0.685731 & 0.628537 & 0.314269 \tabularnewline
26 & 0.882247 & 0.235506 & 0.117753 \tabularnewline
27 & 0.86856 & 0.26288 & 0.13144 \tabularnewline
28 & 0.835782 & 0.328436 & 0.164218 \tabularnewline
29 & 0.805994 & 0.388011 & 0.194006 \tabularnewline
30 & 0.79914 & 0.401719 & 0.20086 \tabularnewline
31 & 0.778261 & 0.443479 & 0.221739 \tabularnewline
32 & 0.754123 & 0.491754 & 0.245877 \tabularnewline
33 & 0.721316 & 0.557369 & 0.278684 \tabularnewline
34 & 0.667538 & 0.664923 & 0.332462 \tabularnewline
35 & 0.610705 & 0.778589 & 0.389295 \tabularnewline
36 & 0.55224 & 0.89552 & 0.44776 \tabularnewline
37 & 0.545361 & 0.909279 & 0.454639 \tabularnewline
38 & 0.514256 & 0.971489 & 0.485744 \tabularnewline
39 & 0.467106 & 0.934212 & 0.532894 \tabularnewline
40 & 0.579576 & 0.840848 & 0.420424 \tabularnewline
41 & 0.524559 & 0.950882 & 0.475441 \tabularnewline
42 & 0.504591 & 0.990818 & 0.495409 \tabularnewline
43 & 0.46342 & 0.92684 & 0.53658 \tabularnewline
44 & 0.425686 & 0.851373 & 0.574314 \tabularnewline
45 & 0.374538 & 0.749076 & 0.625462 \tabularnewline
46 & 0.330042 & 0.660084 & 0.669958 \tabularnewline
47 & 0.370755 & 0.74151 & 0.629245 \tabularnewline
48 & 0.374648 & 0.749297 & 0.625352 \tabularnewline
49 & 0.328207 & 0.656413 & 0.671793 \tabularnewline
50 & 0.282511 & 0.565021 & 0.717489 \tabularnewline
51 & 0.329216 & 0.658432 & 0.670784 \tabularnewline
52 & 0.279794 & 0.559588 & 0.720206 \tabularnewline
53 & 0.253778 & 0.507557 & 0.746222 \tabularnewline
54 & 0.214 & 0.428001 & 0.786 \tabularnewline
55 & 0.181543 & 0.363085 & 0.818457 \tabularnewline
56 & 0.277732 & 0.555465 & 0.722268 \tabularnewline
57 & 0.302972 & 0.605944 & 0.697028 \tabularnewline
58 & 0.345793 & 0.691586 & 0.654207 \tabularnewline
59 & 0.346668 & 0.693336 & 0.653332 \tabularnewline
60 & 0.348528 & 0.697056 & 0.651472 \tabularnewline
61 & 0.333708 & 0.667417 & 0.666292 \tabularnewline
62 & 0.327605 & 0.655209 & 0.672395 \tabularnewline
63 & 0.494862 & 0.989724 & 0.505138 \tabularnewline
64 & 0.45541 & 0.91082 & 0.54459 \tabularnewline
65 & 0.430988 & 0.861975 & 0.569012 \tabularnewline
66 & 0.38192 & 0.763841 & 0.61808 \tabularnewline
67 & 0.347877 & 0.695754 & 0.652123 \tabularnewline
68 & 0.522801 & 0.954399 & 0.477199 \tabularnewline
69 & 0.49163 & 0.983259 & 0.50837 \tabularnewline
70 & 0.43768 & 0.875359 & 0.56232 \tabularnewline
71 & 0.38359 & 0.76718 & 0.61641 \tabularnewline
72 & 0.348435 & 0.696869 & 0.651565 \tabularnewline
73 & 0.45631 & 0.91262 & 0.54369 \tabularnewline
74 & 0.486642 & 0.973283 & 0.513358 \tabularnewline
75 & 0.484418 & 0.968835 & 0.515582 \tabularnewline
76 & 0.476424 & 0.952847 & 0.523576 \tabularnewline
77 & 0.483552 & 0.967104 & 0.516448 \tabularnewline
78 & 0.43485 & 0.8697 & 0.56515 \tabularnewline
79 & 0.537906 & 0.924189 & 0.462094 \tabularnewline
80 & 0.477113 & 0.954226 & 0.522887 \tabularnewline
81 & 0.456289 & 0.912578 & 0.543711 \tabularnewline
82 & 0.397578 & 0.795156 & 0.602422 \tabularnewline
83 & 0.677966 & 0.644069 & 0.322034 \tabularnewline
84 & 0.617809 & 0.764383 & 0.382191 \tabularnewline
85 & 0.586345 & 0.827309 & 0.413655 \tabularnewline
86 & 0.515721 & 0.968559 & 0.484279 \tabularnewline
87 & 0.49786 & 0.99572 & 0.50214 \tabularnewline
88 & 0.432185 & 0.86437 & 0.567815 \tabularnewline
89 & 0.418189 & 0.836378 & 0.581811 \tabularnewline
90 & 0.451767 & 0.903533 & 0.548233 \tabularnewline
91 & 0.576204 & 0.847593 & 0.423796 \tabularnewline
92 & 0.495716 & 0.991432 & 0.504284 \tabularnewline
93 & 0.423648 & 0.847295 & 0.576352 \tabularnewline
94 & 0.445579 & 0.891158 & 0.554421 \tabularnewline
95 & 0.35893 & 0.71786 & 0.64107 \tabularnewline
96 & 0.503211 & 0.993578 & 0.496789 \tabularnewline
97 & 0.466203 & 0.932407 & 0.533797 \tabularnewline
98 & 0.583159 & 0.833682 & 0.416841 \tabularnewline
99 & 0.470538 & 0.941077 & 0.529462 \tabularnewline
100 & 0.527913 & 0.944174 & 0.472087 \tabularnewline
101 & 0.383919 & 0.767838 & 0.616081 \tabularnewline
102 & 0.245221 & 0.490443 & 0.754779 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270891&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]7[/C][C]0.622946[/C][C]0.754107[/C][C]0.377054[/C][/ROW]
[ROW][C]8[/C][C]0.807997[/C][C]0.384006[/C][C]0.192003[/C][/ROW]
[ROW][C]9[/C][C]0.70796[/C][C]0.58408[/C][C]0.29204[/C][/ROW]
[ROW][C]10[/C][C]0.742207[/C][C]0.515585[/C][C]0.257793[/C][/ROW]
[ROW][C]11[/C][C]0.70516[/C][C]0.58968[/C][C]0.29484[/C][/ROW]
[ROW][C]12[/C][C]0.900043[/C][C]0.199915[/C][C]0.0999574[/C][/ROW]
[ROW][C]13[/C][C]0.859368[/C][C]0.281263[/C][C]0.140632[/C][/ROW]
[ROW][C]14[/C][C]0.879743[/C][C]0.240514[/C][C]0.120257[/C][/ROW]
[ROW][C]15[/C][C]0.965197[/C][C]0.0696059[/C][C]0.0348029[/C][/ROW]
[ROW][C]16[/C][C]0.946493[/C][C]0.107014[/C][C]0.0535068[/C][/ROW]
[ROW][C]17[/C][C]0.927804[/C][C]0.144392[/C][C]0.0721958[/C][/ROW]
[ROW][C]18[/C][C]0.907772[/C][C]0.184457[/C][C]0.0922283[/C][/ROW]
[ROW][C]19[/C][C]0.8728[/C][C]0.254399[/C][C]0.1272[/C][/ROW]
[ROW][C]20[/C][C]0.832698[/C][C]0.334605[/C][C]0.167302[/C][/ROW]
[ROW][C]21[/C][C]0.851394[/C][C]0.297211[/C][C]0.148606[/C][/ROW]
[ROW][C]22[/C][C]0.807616[/C][C]0.384768[/C][C]0.192384[/C][/ROW]
[ROW][C]23[/C][C]0.794502[/C][C]0.410997[/C][C]0.205498[/C][/ROW]
[ROW][C]24[/C][C]0.744748[/C][C]0.510503[/C][C]0.255252[/C][/ROW]
[ROW][C]25[/C][C]0.685731[/C][C]0.628537[/C][C]0.314269[/C][/ROW]
[ROW][C]26[/C][C]0.882247[/C][C]0.235506[/C][C]0.117753[/C][/ROW]
[ROW][C]27[/C][C]0.86856[/C][C]0.26288[/C][C]0.13144[/C][/ROW]
[ROW][C]28[/C][C]0.835782[/C][C]0.328436[/C][C]0.164218[/C][/ROW]
[ROW][C]29[/C][C]0.805994[/C][C]0.388011[/C][C]0.194006[/C][/ROW]
[ROW][C]30[/C][C]0.79914[/C][C]0.401719[/C][C]0.20086[/C][/ROW]
[ROW][C]31[/C][C]0.778261[/C][C]0.443479[/C][C]0.221739[/C][/ROW]
[ROW][C]32[/C][C]0.754123[/C][C]0.491754[/C][C]0.245877[/C][/ROW]
[ROW][C]33[/C][C]0.721316[/C][C]0.557369[/C][C]0.278684[/C][/ROW]
[ROW][C]34[/C][C]0.667538[/C][C]0.664923[/C][C]0.332462[/C][/ROW]
[ROW][C]35[/C][C]0.610705[/C][C]0.778589[/C][C]0.389295[/C][/ROW]
[ROW][C]36[/C][C]0.55224[/C][C]0.89552[/C][C]0.44776[/C][/ROW]
[ROW][C]37[/C][C]0.545361[/C][C]0.909279[/C][C]0.454639[/C][/ROW]
[ROW][C]38[/C][C]0.514256[/C][C]0.971489[/C][C]0.485744[/C][/ROW]
[ROW][C]39[/C][C]0.467106[/C][C]0.934212[/C][C]0.532894[/C][/ROW]
[ROW][C]40[/C][C]0.579576[/C][C]0.840848[/C][C]0.420424[/C][/ROW]
[ROW][C]41[/C][C]0.524559[/C][C]0.950882[/C][C]0.475441[/C][/ROW]
[ROW][C]42[/C][C]0.504591[/C][C]0.990818[/C][C]0.495409[/C][/ROW]
[ROW][C]43[/C][C]0.46342[/C][C]0.92684[/C][C]0.53658[/C][/ROW]
[ROW][C]44[/C][C]0.425686[/C][C]0.851373[/C][C]0.574314[/C][/ROW]
[ROW][C]45[/C][C]0.374538[/C][C]0.749076[/C][C]0.625462[/C][/ROW]
[ROW][C]46[/C][C]0.330042[/C][C]0.660084[/C][C]0.669958[/C][/ROW]
[ROW][C]47[/C][C]0.370755[/C][C]0.74151[/C][C]0.629245[/C][/ROW]
[ROW][C]48[/C][C]0.374648[/C][C]0.749297[/C][C]0.625352[/C][/ROW]
[ROW][C]49[/C][C]0.328207[/C][C]0.656413[/C][C]0.671793[/C][/ROW]
[ROW][C]50[/C][C]0.282511[/C][C]0.565021[/C][C]0.717489[/C][/ROW]
[ROW][C]51[/C][C]0.329216[/C][C]0.658432[/C][C]0.670784[/C][/ROW]
[ROW][C]52[/C][C]0.279794[/C][C]0.559588[/C][C]0.720206[/C][/ROW]
[ROW][C]53[/C][C]0.253778[/C][C]0.507557[/C][C]0.746222[/C][/ROW]
[ROW][C]54[/C][C]0.214[/C][C]0.428001[/C][C]0.786[/C][/ROW]
[ROW][C]55[/C][C]0.181543[/C][C]0.363085[/C][C]0.818457[/C][/ROW]
[ROW][C]56[/C][C]0.277732[/C][C]0.555465[/C][C]0.722268[/C][/ROW]
[ROW][C]57[/C][C]0.302972[/C][C]0.605944[/C][C]0.697028[/C][/ROW]
[ROW][C]58[/C][C]0.345793[/C][C]0.691586[/C][C]0.654207[/C][/ROW]
[ROW][C]59[/C][C]0.346668[/C][C]0.693336[/C][C]0.653332[/C][/ROW]
[ROW][C]60[/C][C]0.348528[/C][C]0.697056[/C][C]0.651472[/C][/ROW]
[ROW][C]61[/C][C]0.333708[/C][C]0.667417[/C][C]0.666292[/C][/ROW]
[ROW][C]62[/C][C]0.327605[/C][C]0.655209[/C][C]0.672395[/C][/ROW]
[ROW][C]63[/C][C]0.494862[/C][C]0.989724[/C][C]0.505138[/C][/ROW]
[ROW][C]64[/C][C]0.45541[/C][C]0.91082[/C][C]0.54459[/C][/ROW]
[ROW][C]65[/C][C]0.430988[/C][C]0.861975[/C][C]0.569012[/C][/ROW]
[ROW][C]66[/C][C]0.38192[/C][C]0.763841[/C][C]0.61808[/C][/ROW]
[ROW][C]67[/C][C]0.347877[/C][C]0.695754[/C][C]0.652123[/C][/ROW]
[ROW][C]68[/C][C]0.522801[/C][C]0.954399[/C][C]0.477199[/C][/ROW]
[ROW][C]69[/C][C]0.49163[/C][C]0.983259[/C][C]0.50837[/C][/ROW]
[ROW][C]70[/C][C]0.43768[/C][C]0.875359[/C][C]0.56232[/C][/ROW]
[ROW][C]71[/C][C]0.38359[/C][C]0.76718[/C][C]0.61641[/C][/ROW]
[ROW][C]72[/C][C]0.348435[/C][C]0.696869[/C][C]0.651565[/C][/ROW]
[ROW][C]73[/C][C]0.45631[/C][C]0.91262[/C][C]0.54369[/C][/ROW]
[ROW][C]74[/C][C]0.486642[/C][C]0.973283[/C][C]0.513358[/C][/ROW]
[ROW][C]75[/C][C]0.484418[/C][C]0.968835[/C][C]0.515582[/C][/ROW]
[ROW][C]76[/C][C]0.476424[/C][C]0.952847[/C][C]0.523576[/C][/ROW]
[ROW][C]77[/C][C]0.483552[/C][C]0.967104[/C][C]0.516448[/C][/ROW]
[ROW][C]78[/C][C]0.43485[/C][C]0.8697[/C][C]0.56515[/C][/ROW]
[ROW][C]79[/C][C]0.537906[/C][C]0.924189[/C][C]0.462094[/C][/ROW]
[ROW][C]80[/C][C]0.477113[/C][C]0.954226[/C][C]0.522887[/C][/ROW]
[ROW][C]81[/C][C]0.456289[/C][C]0.912578[/C][C]0.543711[/C][/ROW]
[ROW][C]82[/C][C]0.397578[/C][C]0.795156[/C][C]0.602422[/C][/ROW]
[ROW][C]83[/C][C]0.677966[/C][C]0.644069[/C][C]0.322034[/C][/ROW]
[ROW][C]84[/C][C]0.617809[/C][C]0.764383[/C][C]0.382191[/C][/ROW]
[ROW][C]85[/C][C]0.586345[/C][C]0.827309[/C][C]0.413655[/C][/ROW]
[ROW][C]86[/C][C]0.515721[/C][C]0.968559[/C][C]0.484279[/C][/ROW]
[ROW][C]87[/C][C]0.49786[/C][C]0.99572[/C][C]0.50214[/C][/ROW]
[ROW][C]88[/C][C]0.432185[/C][C]0.86437[/C][C]0.567815[/C][/ROW]
[ROW][C]89[/C][C]0.418189[/C][C]0.836378[/C][C]0.581811[/C][/ROW]
[ROW][C]90[/C][C]0.451767[/C][C]0.903533[/C][C]0.548233[/C][/ROW]
[ROW][C]91[/C][C]0.576204[/C][C]0.847593[/C][C]0.423796[/C][/ROW]
[ROW][C]92[/C][C]0.495716[/C][C]0.991432[/C][C]0.504284[/C][/ROW]
[ROW][C]93[/C][C]0.423648[/C][C]0.847295[/C][C]0.576352[/C][/ROW]
[ROW][C]94[/C][C]0.445579[/C][C]0.891158[/C][C]0.554421[/C][/ROW]
[ROW][C]95[/C][C]0.35893[/C][C]0.71786[/C][C]0.64107[/C][/ROW]
[ROW][C]96[/C][C]0.503211[/C][C]0.993578[/C][C]0.496789[/C][/ROW]
[ROW][C]97[/C][C]0.466203[/C][C]0.932407[/C][C]0.533797[/C][/ROW]
[ROW][C]98[/C][C]0.583159[/C][C]0.833682[/C][C]0.416841[/C][/ROW]
[ROW][C]99[/C][C]0.470538[/C][C]0.941077[/C][C]0.529462[/C][/ROW]
[ROW][C]100[/C][C]0.527913[/C][C]0.944174[/C][C]0.472087[/C][/ROW]
[ROW][C]101[/C][C]0.383919[/C][C]0.767838[/C][C]0.616081[/C][/ROW]
[ROW][C]102[/C][C]0.245221[/C][C]0.490443[/C][C]0.754779[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=270891&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270891&T=5
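
The p-values above are produced by gqtest() from the lmtest package, evaluated at every admissible breakpoint. A standalone sketch of a single such call is shown below; it uses simulated data of the same shape (109 observations, 3 regressors) purely for illustration, so substitute the real series to reproduce the table.

library(lmtest)
set.seed(1)
# Simulated stand-in with the same shape as the archived series (illustration only).
sim <- data.frame(TOT = rnorm(109), PO = rnorm(109), PS = rnorm(109), PV = rnorm(109))
gqfit <- lm(TOT ~ PO + PS + PV, data = sim)
# Goldfeld-Quandt p-values at one breakpoint, one per alternative hypothesis.
sapply(c("greater", "two.sided", "less"),
       function(alt) gqtest(gqfit, point = 15, alternative = alt)$p.value)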








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests  % significant tests  OK/NOK
1% type I error level    0                    0                    OK
5% type I error level    0                    0                    OK
10% type I error level   1                    0.0104167            OK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 & 0 & OK \tabularnewline
5% type I error level & 0 & 0 & OK \tabularnewline
10% type I error level & 1 & 0.0104167 & OK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=270891&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]0[/C][C]0[/C][C]OK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]1[/C][C]0.0104167[/C][C]OK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=270891&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=270891&T=6
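
The meta analysis simply counts how many of the 96 two-sided Goldfeld-Quandt p-values fall below each type I error level; note that the "% significant tests" column actually reports the fraction of tests (1/96 is approximately 0.0104167). A sketch of the counting step, assuming pvals2 holds the 96 two-sided p-values from the table above:

# pvals2: the 96 two-sided p-values from the Goldfeld-Quandt table (only the first three shown here).
pvals2 <- c(0.754107, 0.384006, 0.58408)   # append the remaining values to reproduce the counts
for (alpha in c(0.01, 0.05, 0.10)) {
  cat("alpha =", alpha, ":", sum(pvals2 < alpha), "significant, fraction =",
      signif(mean(pvals2 < alpha), 6), "\n")
}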




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
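# Put the dependent variable (column par1) in the first column; the remaining columns become the regressors.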
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) { # parentheses needed: 1:n-1 would evaluate to 0:(n-1)
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
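# Goldfeld-Quandt heteroskedasticity test at every admissible breakpoint (only computed when n > 25).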
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
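# Residual diagnostics: actuals vs. interpolation, residual series, histogram, density, normal Q-Q, lag plot, ACF/PACF, and the standard lm diagnostic panel.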
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
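# Assemble a human-readable string for the estimated regression equation.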
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
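# The one-tailed p-value is half of the two-tailed p-value.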
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
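# Goldfeld-Quandt p-value table and its meta analysis (only produced when n > 25).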
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}