Statistical Computations at FreeStatistics.org

Author's title: (none)
Author: The author of this computation has been verified.
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Mon, 15 Dec 2014 10:53:24 +0000
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2014/Dec/15/t14186408122y30l32envziix8.htm/, Retrieved Fri, 17 May 2024 00:03:35 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=268087, Retrieved Fri, 17 May 2024 00:03:35 +0000
Original text written by user: (none)
IsPrivate? No (this computation is public)
User-defined keywords: (none)
Estimated Impact: 60
History of this computation (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data):
-    [Multiple Regression] [AME.I en factoren] [2014-12-09 14:07:57] [1e921ed6280e31020168fe5cd3fc7265]
- R  [Multiple Regression] [] [2014-12-15 10:53:24] [f2d9a31865e6602837b48e5a0fc457f1] [Current]
Dataseries X (columns: AMS.E, STRESSTOT, CESDTOT, CONFSOFTTOT):
50 13 13 12
62 13 16 8
54 11 11 11
71 14 10 13
54 15 9 11
65 14 8 10
73 11 26 7
52 13 10 10
84 16 10 15
42 14 8 12
66 14 13 12
65 15 11 10
78 15 8 10
73 13 12 14
75 14 24 6
72 11 21 12
66 12 5 14
70 14 14 11
61 13 11 8
81 12 9 12
69 15 17 13
71 14 18 11
68 12 23 7
70 12 9 11
68 12 14 7
61 15 13 12
67 14 10 12
76 16 8 13
70 12 10 9
60 12 19 11
72 14 11 12
69 16 16 15
71 15 12 12
62 12 11 6
70 14 11 5
64 13 10 13
58 14 13 11
76 16 14 6
52 12 8 12
59 14 11 10
68 15 11 6
76 13 13 12
65 16 15 11
67 16 15 6
59 12 16 12
69 12 12 12
76 16 12 8
63 12 17 10
75 15 14 11
63 12 15 7
60 13 12 12
73 12 13 13
63 14 7 14
70 14 8 12
75 11 16 6
66 10 20 14
63 12 14 10
63 11 10 12
64 16 16 11
70 14 11 10
75 14 26 7
61 15 9 12
60 15 15 7
62 14 12 12
73 13 21 12
61 11 20 10
66 16 20 10
64 12 10 12
59 15 15 12
64 14 10 12
60 15 16 8
56 14 9 10
78 13 17 5
53 6 10 10
67 12 19 10
59 12 13 12
66 14 8 11
68 14 11 9
71 15 9 12
66 11 12 11
73 13 10 10
72 14 9 12
71 16 14 10
59 13 14 9
64 14 10 11
66 16 8 12
78 11 13 7
68 13 9 11
73 13 14 12
62 15 8 6
65 12 16 9
68 13 14 15
65 12 14 10
60 14 8 11
71 14 11 12
65 16 11 12
68 15 13 12
64 14 12 11
74 13 13 9
69 14 9 11
76 15 10 12
68 14 12 12
72 12 11 14
67 7 13 8
63 12 17 10
59 15 15 9
73 12 15 10
66 13 14 9
62 11 10 10
69 14 15 12
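
For readers who want to rerun the analysis outside the FreeStatistics.org engine, a minimal sketch of loading this data series into R (the file name and the use of read.table are assumptions; the column names are taken from the estimated regression equation below):

# Minimal sketch: load the data series printed above (file name is hypothetical).
df <- read.table('dataseries_x.txt', header = FALSE,
                 col.names = c('AMS.E', 'STRESSTOT', 'CESDTOT', 'CONFSOFTTOT'))
str(df)   # 110 observations of 4 variables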




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 5 seconds
R Server: 'Sir Ronald Aylmer Fisher' @ fisher.wessa.net

Source: https://freestatistics.org/blog/index.php?pk=268087&T=0


Multiple Linear Regression - Estimated Regression Equation
AMS.E[t] = 53.1487 + 0.686456 STRESSTOT[t] + 0.318836 CESDTOT[t] + 0.00437824 CONFSOFTTOT[t] + e[t]

Source: https://freestatistics.org/blog/index.php?pk=268087&T=1
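
To reproduce the estimated equation directly in R, a minimal sketch using the same lm()/summary() workflow as the module's code (continuing from the df sketch above):

mylm <- lm(AMS.E ~ STRESSTOT + CESDTOT + CONFSOFTTOT, data = df)
coef(mylm)   # should reproduce 53.1487, 0.686456, 0.318836, 0.00437824 up to display rounding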


Multiple Linear Regression - Ordinary Least Squares
Variable       Parameter    S.D.       T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)    53.1487      6.77956    7.84                         3.74406e-12      1.87203e-12
STRESSTOT      0.686456     0.373576   1.838                        0.0689332        0.0344666
CESDTOT        0.318836     0.171643   1.858                        0.0660081        0.033004
CONFSOFTTOT    0.00437824   0.300954   0.01455                      0.98842          0.49421

Source: https://freestatistics.org/blog/index.php?pk=268087&T=2
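
In this table, Parameter is the OLS estimate, S.D. is its standard error, T-STAT is their ratio, and the 1-tail p-value is half the 2-tail p-value (that is how the module's R code derives it). A short sketch of pulling the same numbers from the summary object (continuing from mylm above):

mysum <- summary(mylm)
mysum$coefficients                        # Estimate, Std. Error, t value, Pr(>|t|) per coefficient
mysum$coefficients[, 'Pr(>|t|)'] / 2      # 1-tail p-values as reported above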


Multiple Linear Regression - Regression Statistics
Multiple R: 0.236375
R-squared: 0.0558731
Adjusted R-squared: 0.0291525
F-TEST (value): 2.09101
F-TEST (DF numerator): 3
F-TEST (DF denominator): 106
p-value: 0.10578
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation: 6.83135
Sum Squared Residuals: 4946.74

Source: https://freestatistics.org/blog/index.php?pk=268087&T=3
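
All of the regression statistics above come from the same summary object; in particular, the overall p-value is computed from the F statistic and its degrees of freedom via pf(), exactly as in the module's code. A brief sketch (continuing from mysum above):

sqrt(mysum$r.squared)        # Multiple R
mysum$adj.r.squared          # Adjusted R-squared
f <- mysum$fstatistic        # F value, numerator df, denominator df
1 - pf(f[1], f[2], f[3])     # overall p-value (0.10578 here)
mysum$sigma                  # Residual Standard Deviation
sum(resid(mylm)^2)           # Sum Squared Residuals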


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1   50   66.2701   -16.2701
2   62   67.2091   -5.20907
3   54   64.2551   -10.2551
4   71   66.0044   4.9956
5   54   66.3633   -12.3633
6   65   65.3536   -0.353597
7   73   69.0201   3.97986
8   52   65.3048   -13.3048
9   84   67.3861   16.6139
10   42   65.3624   -23.3624
11   66   66.9565   -0.956533
12   65   66.9966   -1.99656
13   78   66.0401   11.9599
14   73   65.96   7.04
15   75   70.4375   4.56254
16   72   67.4479   4.55215
17   66   63.0417   2.95831
18   70   67.271   2.72901
19   61   65.6149   -4.61489
20   81   64.3083   16.6917
21   69   68.9227   0.0772885
22   71   68.5463   2.45366
23   68   68.7501   -0.750091
24   70   64.3039   5.6961
25   68   65.8806   2.11943
26   61   67.643   -6.64299
27   67   66   0.999975
28   76   66.7396   9.26036
29   70   64.614   5.38602
30   60   67.4923   -7.49226
31   72   66.3189   5.68114
32   69   69.2991   -0.299088
33   71   67.3242   3.67585
34   62   64.9197   -2.91968
35   70   66.2882   3.71179
36   64   65.3179   -1.31795
37   58   66.9522   -8.95216
38   76   68.622   7.37799
39   52   63.9894   -11.9894
40   59   66.3101   -7.3101
41   68   66.979   1.02095
42   76   66.2701   9.72992
43   65   68.9627   -3.96274
44   67   68.9408   -1.94085
45   59   66.5401   -7.54013
46   69   65.2648   3.73521
47   76   67.9931   8.0069
48   63   66.8502   -3.85021
49   75   67.9574   7.04255
50   63   66.1994   -3.1994
51   60   65.9512   -5.95124
52   73   65.588   7.412
53   63   65.0523   -2.05227
54   70   65.3624   4.63765
55   75   65.8274   9.1726
56   66   66.4513   -0.451319
57   63   65.8937   -2.8937
58   63   63.9407   -0.940658
59   64   69.2816   -5.28157
60   70   66.3101   3.6899
61   75   71.0795   3.92049
62   61   66.3676   -5.36765
63   60   68.2588   -8.25877
64   62   66.6377   -4.6377
65   73   68.8208   4.17923
66   61   67.1203   -6.12026
67   66   70.5525   -4.55254
68   64   64.6271   -0.627114
69   59   68.2807   -9.28066
70   64   66   -2.00003
71   60   68.582   -8.58198
72   56   65.6724   -9.67243
73   78   67.5148   10.4852
74   53   60.4996   -7.49962
75   67   67.4879   -0.487881
76   59   65.5836   -6.58362
77   66   65.358   0.642025
78   68   66.3057   1.69427
79   71   66.3676   4.63235
80   66   64.574   1.42605
81   73   65.3048   7.69519
82   72   65.6812   6.31881
83   71   68.6395   2.36048
84   59   66.5758   -7.57578
85   64   65.9956   -1.99565
86   66   66.7353   -0.735265
87   78   64.8753   13.1247
88   68   64.9904   3.00964
89   73   66.5889   6.41109
90   62   66.0225   -4.02254
91   65   66.527   -1.527
92   68   66.602   1.39795
93   65   65.8937   -0.893701
94   60   65.358   -5.35798
95   71   66.3189   4.68114
96   65   67.6918   -2.69177
97   68   67.643   0.357011
98   64   66.6333   -2.63332
99   74   66.2569   7.74306
100   69   65.6768   3.32319
101   76   66.6865   9.31352
102   68   66.6377   1.3623
103   72   64.9547   7.04529
104   67   62.1338   4.86617
105   63   66.8502   -3.85021
106   59   68.2675   -9.26753
107   73   66.2125   6.78746
108   66   66.5758   -0.575779
109   62   63.9319   -1.9319
110   69   67.5942   1.40579

Source: https://freestatistics.org/blog/index.php?pk=268087&T=4
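
The Interpolation (Forecast) column holds the in-sample fitted values and the Residuals (Prediction Error) column holds actual minus fitted, so the table can be regenerated with the standard accessors (a sketch, continuing from mylm; the module itself uses x[i] - mysum$resid[i] for the fitted values, which is equivalent):

data.frame(Index         = seq_along(fitted(mylm)),
           Actuals       = df$AMS.E,
           Interpolation = signif(fitted(mylm), 6),
           Residuals     = signif(resid(mylm), 6))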


Goldfeld-Quandt test for Heteroskedasticity (p-values per alternative hypothesis)
breakpoint index   greater   2-sided   less
7   0.949192   0.101617   0.0508083
8   0.932866   0.134268   0.0671339
9   0.963712   0.0725763   0.0362882
10   0.997595   0.00480935   0.00240467
11   0.99504   0.0099194   0.0049597
12   0.990667   0.0186651   0.00933255
13   0.999062   0.00187689   0.000938445
14   0.999268   0.00146418   0.000732091
15   0.998812   0.00237553   0.00118777
16   0.998067   0.00386566   0.00193283
17   0.999113   0.00177383   0.000886913
18   0.998416   0.00316744   0.00158372
19   0.997892   0.00421661   0.00210831
20   0.999948   0.000104017   5.20083e-05
21   0.999929   0.000142034   7.10169e-05
22   0.99987   0.000260859   0.000130429
23   0.999755   0.000490621   0.000245311
24   0.999761   0.00047832   0.00023916
25   0.999676   0.000647632   0.000323816
26   0.999666   0.00066899   0.000334495
27   0.999419   0.00116114   0.00058057
28   0.999556   0.000887142   0.000443571
29   0.99953   0.000940333   0.000470166
30   0.999584   0.000832871   0.000416435
31   0.999475   0.00104974   0.000524871
32   0.999205   0.00158985   0.000794924
33   0.998838   0.00232357   0.00116178
34   0.998278   0.00344436   0.00172218
35   0.997727   0.00454665   0.00227332
36   0.996485   0.00702955   0.00351477
37   0.997306   0.00538866   0.00269433
38   0.997236   0.00552863   0.00276432
39   0.998725   0.00254975   0.00127488
40   0.998802   0.00239619   0.0011981
41   0.998095   0.00381088   0.00190544
42   0.998849   0.00230245   0.00115122
43   0.998507   0.00298561   0.00149281
44   0.997775   0.00444963   0.00222481
45   0.997867   0.00426522   0.00213261
46   0.997171   0.00565871   0.00282935
47   0.99752   0.00496046   0.00248023
48   0.99663   0.00673925   0.00336963
49   0.99678   0.00644037   0.00322019
50   0.995568   0.00886455   0.00443228
51   0.995082   0.00983689   0.00491844
52   0.995705   0.00859008   0.00429504
53   0.993744   0.0125128   0.00625642
54   0.992384   0.0152318   0.00761591
55   0.994719   0.0105623   0.00528114
56   0.992162   0.0156753   0.00783767
57   0.989294   0.0214123   0.0107061
58   0.984869   0.0302626   0.0151313
59   0.982566   0.0348683   0.0174341
60   0.977902   0.0441956   0.0220978
61   0.974249   0.0515023   0.0257512
62   0.971357   0.057286   0.028643
63   0.97451   0.0509797   0.0254898
64   0.969837   0.0603261   0.0301631
65   0.965862   0.0682763   0.0341381
66   0.962048   0.0759035   0.0379518
67   0.953457   0.0930864   0.0465432
68   0.93777   0.124461   0.0622303
69   0.953944   0.0921116   0.0460558
70   0.939956   0.120088   0.0600439
71   0.954987   0.0900253   0.0450126
72   0.973465   0.0530699   0.026535
73   0.985765   0.0284707   0.0142353
74   0.994656   0.0106883   0.00534413
75   0.991581   0.0168387   0.00841936
76   0.99525   0.00949913   0.00474957
77   0.992732   0.0145356   0.00726778
78   0.98889   0.0222195   0.0111098
79   0.98496   0.0300804   0.0150402
80   0.979108   0.0417834   0.0208917
81   0.979192   0.0416163   0.0208082
82   0.975078   0.0498433   0.0249217
83   0.972043   0.0559147   0.0279573
84   0.977881   0.0442389   0.0221195
85   0.9695   0.0609997   0.0304999
86   0.954106   0.0917879   0.0458939
87   0.991801   0.0163974   0.0081987
88   0.98617   0.0276604   0.0138302
89   0.983849   0.0323017   0.0161509
90   0.973555   0.0528895   0.0264447
91   0.957364   0.0852719   0.0426359
92   0.945566   0.108869   0.0544344
93   0.919047   0.161906   0.0809529
94   0.939677   0.120646   0.060323
95   0.908728   0.182544   0.0912721
96   0.879823   0.240355   0.120177
97   0.818389   0.363223   0.181611
98   0.782135   0.43573   0.217865
99   0.856005   0.287991   0.143995
100   0.769579   0.460841   0.230421
101   0.872421   0.255159   0.127579
102   0.772046   0.455908   0.227954
103   0.617127   0.765745   0.382873

Source: https://freestatistics.org/blog/index.php?pk=268087&T=5
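
Each row above is a separate Goldfeld-Quandt test with the sample split at the indicated breakpoint; the module loops over all admissible breakpoints and stores the p-value for each alternative hypothesis. A minimal sketch for a single breakpoint, using gqtest() from the lmtest package as in the module's code (continuing from mylm):

library(lmtest)
gqtest(mylm, point = 7, alternative = 'two.sided')$p.value   # 0.101617, the first row above
gqtest(mylm, point = 7, alternative = 'greater')$p.value
gqtest(mylm, point = 7, alternative = 'less')$p.value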


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   % significant tests (fraction of the 97 tests)   OK/NOK
1% type I error level    43                    0.443299                                         NOK
5% type I error level    65                    0.670103                                         NOK
10% type I error level   81                    0.835052                                         NOK

Source: https://freestatistics.org/blog/index.php?pk=268087&T=6
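
The meta analysis counts how many of the 97 two-sided Goldfeld-Quandt p-values fall below each type I error level; a level is flagged OK only when that proportion stays below the nominal level, otherwise NOK (this is the rule in the module's R code below). A sketch of the counting step (gq2 is assumed to hold the 97 two-sided p-values, i.e. column 2 of gqarr from the code below):

gq2 <- gqarr[, 2]                                       # two-sided p-values, one per breakpoint
sapply(c(0.01, 0.05, 0.10), function(a) sum(gq2 < a))   # counts of significant tests: 43, 65, 81
sapply(c(0.01, 0.05, 0.10), function(a) mean(gq2 < a))  # fractions: 0.443299, 0.670103, 0.835052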


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ;
R code (references can be found in the software module):
library(lattice)
library(lmtest)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
par1 <- as.numeric(par1)
x <- t(y)
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
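# par2/par3 select the optional transformations below; with 'Do not include Seasonal Dummies' and 'No Linear Trend' (this run) none of them are applied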
if (par3 == 'First Differences'){
x2 <- array(0, dim=c(n-1,k), dimnames=list(1:(n-1), paste('(1-B)',colnames(x),sep='')))
for (i in 1:(n-1)) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
k <- length(x[1,])
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
x
k <- length(x[1,])
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
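# Goldfeld-Quandt test at every admissible breakpoint (only run when n > n25 = 25 observations)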
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
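# Diagnostic plots: actuals vs. interpolation, residuals, histogram, density, normal Q-Q, lag plot, ACF/PACF, and standard lm diagnostics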
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
hist(mysum$resid, main='Residual Histogram', xlab='values of Residuals')
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqnorm(mysum$resid, main='Residual Normal Q-Q Plot')
qqline(mysum$resid)
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
z
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
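# Assemble the HTML output tables shown above (estimated equation, OLS estimates, regression statistics, actuals/interpolation/residuals, Goldfeld-Quandt tests and their meta analysis)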
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,hyperlink('ols1.htm','Multiple Linear Regression - Ordinary Least Squares',''), 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,signif(mysum$coefficients[i,1],6))
a<-table.element(a, signif(mysum$coefficients[i,2],6))
a<-table.element(a, signif(mysum$coefficients[i,3],4))
a<-table.element(a, signif(mysum$coefficients[i,4],6))
a<-table.element(a, signif(mysum$coefficients[i,4]/2,6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a, signif(sqrt(mysum$r.squared),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a, signif(mysum$r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a, signif(mysum$adj.r.squared,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[1],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a, signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a, signif(mysum$sigma,6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a, signif(sum(myerror*myerror),6))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,signif(x[i],6))
a<-table.element(a,signif(x[i]-mysum$resid[i],6))
a<-table.element(a,signif(mysum$resid[i],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,signif(gqarr[mypoint-kp3+1,1],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,2],6))
a<-table.element(a,signif(gqarr[mypoint-kp3+1,3],6))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,signif(numsignificant1/numgqtests,6))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}