Author: *The author of this computation has been verified*
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 07 Dec 2016 15:07:40 +0100
Cite this page as follows:
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/07/t1481119919dfcqc10wc9d5k2y.htm/, Retrieved Tue, 07 May 2024 17:03:05 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=298131, Retrieved Tue, 07 May 2024 17:03:05 +0000
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 83
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-  [Multiple Regression] [Multiple regressi...] [2016-12-07 14:07:40] [7e221ad450ea382d698d0f010899c791] [Current]
Dataseries X (columns, left to right: EP1, EP3, TVDSUM):
5	4	13
3	2	16
5	3	17
5	2	NA
5	2	NA
5	3	16
5	3	NA
5	2	NA
5	2	NA
5	4	17
4	2	17
2	2	15
5	3	16
4	2	14
5	3	16
4	2	17
5	2	NA
5	NA	NA
5	3	NA
4	2	NA
4	2	16
3	3	NA
5	1	16
4	2	NA
5	3	NA
4	2	NA
5	2	16
5	3	15
5	5	16
5	2	16
5	5	13
5	2	15
5	2	17
5	4	NA
5	1	13
4	2	17
4	2	NA
5	3	14
5	2	14
5	3	18
5	2	NA
5	3	17
5	4	13
5	4	16
5	3	15
5	2	15
5	2	NA
NA	1	15
4	4	13
5	4	NA
5	3	17
4	2	NA
5	2	NA
3	2	11
4	2	14
3	3	13
5	2	NA
5	2	17
5	3	16
5	3	NA
5	2	17
5	2	16
5	4	16
5	4	16
4	3	15
5	4	12
4	4	17
5	4	14
2	4	14
4	5	16
5	3	NA
5	4	NA
4	4	NA
5	2	NA
2	2	NA
5	3	15
3	4	16
4	2	14
4	5	15
5	1	17
5	3	NA
4	3	10
4	2	NA
5	2	17
4	1	NA
4	2	20
5	1	17
5	2	18
5	2	NA
4	2	17
4	2	14
4	3	NA
3	2	17
4	1	NA
5	1	17
5	3	NA
4	2	16
5	3	18
2	1	18
5	2	16
5	2	NA
4	3	NA
3	2	15
5	2	13
4	3	NA
5	1	NA
5	4	NA
5	3	NA
5	2	NA
5	3	16
4	3	NA
5	3	NA
5	4	NA
5	3	12
4	2	NA
5	3	16
5	2	16
2	1	NA
5	1	16
5	2	14
5	4	15
5	3	14
5	2	NA
5	2	15
5	3	NA
5	3	15
4	3	16
3	2	NA
5	2	NA
5	2	NA
5	3	11
5	4	NA
4	2	18
4	2	NA
4	1	11
5	3	NA
4	3	18
NA	4	NA
4	3	15
5	1	19
2	1	17
5	2	NA
4	1	14
5	5	NA
5	3	13
4	2	17
5	2	14
4	4	19
5	2	14
5	4	NA
5	4	NA
4	3	16
5	4	16
5	3	15
5	4	12
5	3	NA
5	4	17
2	2	NA
5	4	NA
3	1	18
5	4	15
5	3	18
5	2	15
4	2	NA
5	2	NA
5	4	NA
5	3	16
5	3	NA
5	2	16
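
The series above contains many missing values (NA). A minimal sketch, assuming the three tab-separated columns are named EP1, EP3 and TVDSUM and stored in a file called dataseries.txt (the file name is an assumption), of how listwise deletion reduces the sample to the 102 complete cases analysed below:

# Read the three columns and drop every row that contains an NA,
# mirroring the na.omit() step in the module's R code further down.
y <- read.table('dataseries.txt', sep='\t', na.strings='NA',
                col.names=c('EP1','EP3','TVDSUM'))
y <- na.omit(y)   # listwise deletion of incomplete rows
nrow(y)           # 102 complete observations remain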




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 7 seconds
R Server: Big Analytics Cloud Computing Center
R Framework error message:
The field 'Names of X columns' contains a hard return which cannot be interpreted.
Please, resubmit your request without hard returns in the 'Names of X columns'.
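
The error above is caused by a stray line break in the user-supplied 'Names of X columns' field; the same artefact shows up below as the regressor name `TVDSUM\r`. A minimal clean-up sketch (not part of the original module), assuming the data sit in the data frame y from the previous sketch:

# Strip carriage returns and line feeds from the column names before resubmitting,
# so that 'TVDSUM\r' becomes plain 'TVDSUM'.
colnames(y) <- gsub('[\r\n]', '', colnames(y))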


Multiple Linear Regression - Estimated Regression Equation
EP1[t] = 4.18795 + 0.12653 EP3[t] - 0.0021116 `TVDSUM\r`[t] + e[t]
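
A minimal sketch, reusing the data frame y from the first sketch (with the stray carriage return removed from the variable name), of how an equivalent fit can be obtained directly with lm(); the object name mylm matches the one used in the module's R code at the bottom of this page:

# Ordinary least squares of EP1 on EP3 and TVDSUM.
mylm <- lm(EP1 ~ EP3 + TVDSUM, data = y)
coef(mylm)   # reproduces the coefficients in the equation above:
             # (Intercept) 4.18795, EP3 0.12653, TVDSUM -0.0021116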


Multiple Linear Regression - Ordinary Least Squares
Variable       Parameter    S.D.      T-STAT (H0: parameter = 0)   2-tail p-value   1-tail p-value
(Intercept)    +4.188       0.7248    +5.7780e+00                   8.768e-08        4.384e-08
EP3            +0.1265      0.0774    +1.6350e+00                   0.1053           0.05264
`TVDSUM\r`     -0.002112    0.04241   -4.9800e-02                   0.9604           0.4802
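
The 1-tail p-values above are simply half of the corresponding 2-tail p-values. A short sketch, reusing mylm from the previous sketch, of how the columns of this table can be reproduced:

mysum <- summary(mylm)
tab <- mysum$coefficients                # Estimate, Std. Error, t value, Pr(>|t|)
cbind(tab, '1-tail p' = tab[, 4] / 2)    # append the one-sided p-value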


Multiple Linear Regression - Regression Statistics
Multiple R                     0.1654
R-squared                      0.02736
Adjusted R-squared             0.007715
F-TEST (value)                 1.393
F-TEST (DF numerator)          2
F-TEST (DF denominator)        99
p-value                        0.2532

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation    0.7898
Sum Squared Residuals          61.75
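
The summary statistics above are related in simple ways: Multiple R is the square root of R-squared, and the F-test p-value follows from the F value and its two degrees of freedom. A brief sketch, reusing mylm and mysum from the sketches above:

sqrt(mysum$r.squared)               # Multiple R, about 0.1654
1 - pf(mysum$fstatistic[1],         # F value, about 1.393
       mysum$fstatistic[2],         # numerator df = 2
       mysum$fstatistic[3])         # denominator df = 99; gives p-value about 0.2532
sum(resid(mylm)^2)                  # Sum Squared Residuals, about 61.75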


Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index   Actuals   Interpolation (Forecast)   Residuals (Prediction Error)
1 5 4.667 0.3334
2 3 4.407-1.407
3 5 4.532 0.4684
4 5 4.534 0.4662
5 5 4.658 0.3418
6 4 4.405-0.4051
7 2 4.409-2.409
8 5 4.534 0.4662
9 4 4.411-0.4114
10 5 4.534 0.4662
11 4 4.405-0.4051
12 4 4.407-0.4072
13 5 4.281 0.7193
14 5 4.407 0.5928
15 5 4.536 0.4641
16 5 4.787 0.2132
17 5 4.407 0.5928
18 5 4.793 0.2069
19 5 4.409 0.5907
20 5 4.405 0.5949
21 5 4.287 0.713
22 4 4.405-0.4051
23 5 4.538 0.462
24 5 4.411 0.5886
25 5 4.53 0.4705
26 5 4.532 0.4684
27 5 4.667 0.3334
28 5 4.66 0.3397
29 5 4.536 0.4641
30 5 4.409 0.5907
31 4 4.667-0.6666
32 5 4.532 0.4684
33 3 4.418-1.418
34 4 4.411-0.4114
35 3 4.54-1.54
36 5 4.405 0.5949
37 5 4.534 0.4662
38 5 4.405 0.5949
39 5 4.407 0.5928
40 5 4.66 0.3397
41 5 4.66 0.3397
42 4 4.536-0.5359
43 5 4.669 0.3313
44 4 4.658-0.6582
45 5 4.665 0.3355
46 2 4.665-2.665
47 4 4.787-0.7868
48 5 4.536 0.4641
49 3 4.66-1.66
50 4 4.411-0.4114
51 4 4.789-0.7889
52 5 4.279 0.7214
53 4 4.546-0.5464
54 5 4.405 0.5949
55 4 4.399-0.3988
56 5 4.279 0.7214
57 5 4.403 0.597
58 4 4.405-0.4051
59 4 4.411-0.4114
60 3 4.405-1.405
61 5 4.279 0.7214
62 4 4.407-0.4072
63 5 4.53 0.4705
64 2 4.276-2.276
65 5 4.407 0.5928
66 3 4.409-1.409
67 5 4.414 0.5864
68 5 4.534 0.4662
69 5 4.542 0.4578
70 5 4.534 0.4662
71 5 4.407 0.5928
72 5 4.281 0.7193
73 5 4.411 0.5886
74 5 4.662 0.3376
75 5 4.538 0.462
76 5 4.409 0.5907
77 5 4.536 0.4641
78 4 4.534-0.5338
79 5 4.544 0.4557
80 4 4.403-0.403
81 4 4.291-0.2913
82 4 4.53-0.5295
83 4 4.536-0.5359
84 5 4.274 0.7256
85 2 4.279-2.279
86 4 4.285-0.2849
87 5 4.54 0.4599
88 4 4.405-0.4051
89 5 4.411 0.5886
90 4 4.654-0.654
91 5 4.411 0.5886
92 4 4.534-0.5338
93 5 4.66 0.3397
94 5 4.536 0.4641
95 5 4.669 0.3313
96 5 4.658 0.3418
97 3 4.276-1.276
98 5 4.662 0.3376
99 5 4.53 0.4705
100 5 4.409 0.5907
101 5 4.534 0.4662
102 5 4.407 0.5928
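
Every row above satisfies Actuals = Interpolation (fitted value) + Residual. A short sketch, reusing mylm, of how the three columns can be reassembled from the fitted model:

interp <- cbind(Actuals       = model.frame(mylm)[, 1],   # observed EP1
                Interpolation = fitted(mylm),             # in-sample forecast
                Residuals     = resid(mylm))              # prediction error
head(round(interp, 4))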


Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index   greater   2-sided   less
6 0.3663 0.7325 0.6337
7 0.6183 0.7635 0.3817
8 0.5573 0.8853 0.4427
9 0.637 0.726 0.363
10 0.5633 0.8734 0.4367
11 0.4668 0.9336 0.5332
12 0.3892 0.7783 0.6108
13 0.7608 0.4784 0.2392
14 0.7669 0.4662 0.2331
15 0.7198 0.5605 0.2802
16 0.6746 0.6507 0.3253
17 0.6633 0.6735 0.3367
18 0.5881 0.8238 0.4119
19 0.5837 0.8327 0.4163
20 0.5452 0.9095 0.4548
21 0.5773 0.8455 0.4227
22 0.523 0.9541 0.477
23 0.465 0.9301 0.535
24 0.4249 0.8498 0.5751
25 0.3761 0.7522 0.6239
26 0.326 0.6521 0.674
27 0.27 0.5399 0.73
28 0.2204 0.4407 0.7796
29 0.1826 0.3652 0.8174
30 0.1602 0.3205 0.8398
31 0.1716 0.3432 0.8284
32 0.1411 0.2821 0.8589
33 0.2136 0.4271 0.7864
34 0.1781 0.3562 0.8219
35 0.296 0.592 0.704
36 0.264 0.528 0.736
37 0.2269 0.4538 0.7731
38 0.1995 0.399 0.8005
39 0.1783 0.3567 0.8217
40 0.1463 0.2926 0.8537
41 0.119 0.2379 0.881
42 0.1061 0.2122 0.8939
43 0.09212 0.1842 0.9079
44 0.103 0.206 0.897
45 0.0835 0.167 0.9165
46 0.5587 0.8827 0.4413
47 0.565 0.87 0.435
48 0.5272 0.9456 0.4728
49 0.7263 0.5473 0.2737
50 0.6902 0.6197 0.3098
51 0.7063 0.5874 0.2937
52 0.7037 0.5926 0.2963
53 0.7112 0.5776 0.2888
54 0.6916 0.6168 0.3084
55 0.6751 0.6499 0.3249
56 0.691 0.618 0.309
57 0.6884 0.6232 0.3116
58 0.6516 0.6969 0.3484
59 0.616 0.7681 0.384
60 0.7338 0.5324 0.2662
61 0.7647 0.4707 0.2353
62 0.7279 0.5441 0.2721
63 0.7022 0.5956 0.2978
64 0.9348 0.1305 0.06523
65 0.929 0.1419 0.07095
66 0.9728 0.05443 0.02722
67 0.9663 0.06739 0.03369
68 0.9568 0.08645 0.04322
69 0.9438 0.1125 0.05624
70 0.9293 0.1414 0.07069
71 0.9231 0.1539 0.07693
72 0.9364 0.1273 0.06364
73 0.9263 0.1473 0.07366
74 0.9012 0.1976 0.09882
75 0.8736 0.2528 0.1264
76 0.8646 0.2708 0.1354
77 0.8335 0.3329 0.1665
78 0.813 0.374 0.187
79 0.7665 0.467 0.2335
80 0.7107 0.5786 0.2893
81 0.6664 0.6672 0.3336
82 0.6134 0.7732 0.3866
83 0.6013 0.7974 0.3987
84 0.8174 0.3652 0.1826
85 0.9864 0.02713 0.01356
86 0.9822 0.03565 0.01782
87 0.9703 0.05939 0.0297
88 0.9537 0.0926 0.0463
89 0.9262 0.1477 0.07383
90 0.9108 0.1783 0.08915
91 0.8772 0.2456 0.1228
92 0.8919 0.2163 0.1081
93 0.8219 0.3563 0.1781
94 0.7203 0.5595 0.2797
95 0.7184 0.5632 0.2816
96 0.556 0.888 0.444
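
The table above runs the Goldfeld-Quandt test once for every admissible breakpoint (6 through 96) and for each alternative hypothesis. A condensed sketch, reusing mylm and lmtest::gqtest() as in the module's R code:

library(lmtest)
k  <- length(coef(mylm))                 # 3 parameters
n  <- nobs(mylm)                         # 102 observations
bp <- (k + 3):(n - k - 3)                # breakpoints 6 ... 96
gq <- t(sapply(bp, function(p)
  sapply(c('greater', 'two.sided', 'less'),
         function(alt) gqtest(mylm, point = p, alternative = alt)$p.value)))
rownames(gq) <- bp
head(round(gq, 4))                       # first rows of the table above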


Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description              # significant tests   % significant tests   OK/NOK
1% type I error level    0                     0                     OK
5% type I error level    2                     0.021978              OK
10% type I error level   7                     0.0769231             OK
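
The meta analysis simply counts how many of the 91 two-sided Goldfeld-Quandt p-values fall below each type I error level; note that the '% significant tests' column actually reports a proportion rather than a percentage. A minimal sketch, reusing the gq matrix from the previous sketch:

p2 <- gq[, 'two.sided']                 # the 91 two-sided p-values
sapply(c(0.01, 0.05, 0.10), function(a)
  c(level = a, count = sum(p2 < a), proportion = mean(p2 < a)))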


Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.0035, df1 = 2, df2 = 97, p-value = 0.3704
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.90583, df1 = 4, df2 = 95, p-value = 0.4639
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 0.81257, df1 = 2, df2 = 97, p-value = 0.4467
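
The three RESET variants above differ only in which constructed terms (powers of the fitted values, of the regressors, or of the first principal components) are added to the regression. A brief sketch, reusing mylm and lmtest::resettest() as in the module:

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')     # RESET = 1.0035,  p = 0.3704
resettest(mylm, power = 2:3, type = 'regressor')  # RESET = 0.90583, p = 0.4639
resettest(mylm, power = 2:3, type = 'princomp')   # RESET = 0.81257, p = 0.4467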


Variance Inflation Factors (Multicollinearity)
> vif
        EP3 `TVDSUM\\r` 
   1.030526    1.030526 
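
With only two regressors both variance inflation factors are identical and equal to 1/(1 - r^2), where r is the correlation between EP3 and TVDSUM; values this close to 1 indicate essentially no multicollinearity. A short sketch, reusing mylm and car::vif() as in the module:

library(car)
vif(mylm)                                   # both about 1.0305
r <- cor(model.matrix(mylm)[, 2],
         model.matrix(mylm)[, 3])           # correlation between the two regressors
1 / (1 - r^2)                               # the same VIF computed by hand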


Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
library(lattice)   # densityplot
library(lmtest)    # gqtest, resettest
library(car)       # vif, qqPlot
library(MASS)      # studres
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
x <- na.omit(t(y))        # transpose the input and drop rows with missing values
k <- length(x[1,])        # number of variables (columns)
n <- length(x[,1])        # number of complete observations
x1 <- cbind(x[,par1], x[,1:k!=par1])    # put the endogenous series in the first column
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))          # OLS: regress the first column on all remaining columns
(mysum <- summary(mylm))  # coefficient table, R-squared, F-statistic
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')