Free Statistics

Author's title:
Author: The author of this computation has been verified
R Software Module: rwasp_multipleregression.wasp
Title produced by software: Multiple Regression
Date of computation: Wed, 21 Dec 2016 13:45:04 +0100
Cite this page as follows: Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?v=date/2016/Dec/21/t148232483804jqnk9so849bui.htm/, Retrieved Tue, 07 May 2024 04:09:01 +0000
Statistical Computations at FreeStatistics.org, Office for Research Development and Education, URL https://freestatistics.org/blog/index.php?pk=302244, Retrieved Tue, 07 May 2024 04:09:01 +0000
Original text written by user:
IsPrivate? No (this computation is public)
User-defined keywords:
Estimated Impact: 66
Family? (F = Feedback message, R = changed R code, M = changed R Module, P = changed Parameters, D = changed Data)
-  [Multiple Regression] [MR] [2016-12-21 12:45:04] [71d167f7de04005af677e6526bf8917e] [Current]
Dataseries X (columns: TV, Privacy, geslacht, Product):
13	10	1	10
16	6	2	12
17	10	2	20
16	10	2	20
17	10	1	10
17	9	1	9
15	6	1	6
16	9	1	9
14	9	1	9
16	10	1	10
17	9	2	18
16	9	2	18
16	10	2	20
16	10	1	10
15	9	1	9
16	10	1	10
16	10	1	10
13	10	1	10
15	10	2	20
17	10	1	10
13	9	2	18
17	8	1	8
14	10	2	20
14	10	1	10
18	10	1	10
17	10	2	20
13	10	2	20
16	10	2	20
15	10	2	20
15	10	2	20
13	9	2	18
17	10	2	20
11	7	2	14
14	7	1	7
13	6	2	12
17	10	1	10
16	10	2	20
17	10	2	20
16	10	2	20
16	10	2	20
16	10	2	20
15	8	1	8
12	10	2	20
17	8	1	8
14	4	1	4
16	7	2	14
15	9	2	18
16	6	1	6
14	9	2	18
15	8	1	8
17	10	2	20
10	8	1	8
17	10	1	10
20	8	2	16
17	10	2	20
18	10	2	20
17	8	1	8
14	8	2	16
17	6	1	6
17	10	1	10
16	8	2	16
18	10	2	20
18	4	1	4
16	10	2	20
15	8	1	8
13	10	2	20
16	10	1	10
12	8	1	8
16	10	2	20
16	10	1	10
16	10	2	20
14	10	1	10
15	9	1	9
14	9	1	9
15	10	2	20
15	10	2	20
16	9	2	18
11	10	2	20
18	8	1	8
11	8	1	8
18	8	1	8
15	8	2	16
19	10	1	10
17	4	1	4
14	8	1	8
13	10	1	10
17	8	2	16
14	9	2	18
19	6	2	12
14	10	2	20
16	8	1	8
16	10	1	10
15	10	2	20
12	9	2	18
17	10	2	20
18	6	1	6
15	10	2	20
15	10	1	10
16	10	2	20
16	7	1	7




Summary of computational transaction
Raw Input: view raw input (R code)
Raw Output: view raw output of R engine
Computing time: 8 seconds
R Server: Big Analytics Cloud Computing Center

\begin{tabular}{lllllllll}
\hline
Summary of computational transaction \tabularnewline
Raw Input & view raw input (R code) \tabularnewline
Raw Output & view raw output of R engine \tabularnewline
Computing time & 8 seconds \tabularnewline
R Server & Big Analytics Cloud Computing Center \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=302244&T=0

[TABLE]
[ROW][C]Summary of computational transaction[/C][/ROW]
[ROW][C]Raw Input[/C][C]view raw input (R code)[/C][/ROW]
[ROW][C]Raw Output[/C][C]view raw output of R engine[/C][/ROW]
[ROW][C]Computing time[/C][C]8 seconds[/C][/ROW]
[ROW][C]R Server[/C][C]Big Analytics Cloud Computing Center[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=302244&T=0

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302244&T=0








Multiple Linear Regression - Estimated Regression Equation
TV[t] = +17.0743 - 0.153912 Privacy[t] - 0.995681 geslacht[t] + 0.0931982 Product[t] + e[t]

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Estimated Regression Equation \tabularnewline
TV[t] =  +  17.0743 -0.153912Privacy[t] -0.995681geslacht[t] +  0.0931982Product[t]  + e[t] \tabularnewline
 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=302244&T=1

[TABLE]
[ROW][C]Multiple Linear Regression - Estimated Regression Equation[/C][/ROW]
[ROW][C]TV[t] =  +  17.0743 -0.153912Privacy[t] -0.995681geslacht[t] +  0.0931982Product[t]  + e[t][/C][/ROW]
[ROW][C][/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=302244&T=1

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302244&T=1
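
The estimated equation can be reproduced with a plain lm() call. The sketch below is illustrative only: it assumes the four columns of Dataseries X are saved in a tab-separated file named dataseries_x.tsv with the header names TV, Privacy, geslacht and Product (the file name and header row are not part of the original computation).

# Hypothetical input file: one row per observation, tab-separated,
# with header TV, Privacy, geslacht, Product.
df <- read.table('dataseries_x.tsv', header = TRUE, sep = '\t')
# Ordinary least squares fit of TV on the three regressors.
mylm <- lm(TV ~ Privacy + geslacht + Product, data = df)
coef(mylm)   # intercept and slopes quoted in the equation above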








Multiple Linear Regression - Ordinary Least Squares
Variable	Parameter	S.D.	T-STAT (H0: parameter = 0)	2-tail p-value	1-tail p-value
(Intercept)	+17.07	3.52	+4.8500e+00	4.754e-06	2.377e-06
Privacy	-0.1539	0.3952	-3.8950e-01	0.6978	0.3489
geslacht	-0.9957	2.562	-3.8870e-01	0.6984	0.3492
Product	+0.0932	0.2803	+3.3250e-01	0.7402	0.3701

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Ordinary Least Squares \tabularnewline
Variable & Parameter & S.D. & T-STAT (H0: parameter = 0) & 2-tail p-value & 1-tail p-value \tabularnewline
(Intercept) & +17.07 &  3.52 & +4.8500e+00 &  4.754e-06 &  2.377e-06 \tabularnewline
Privacy & -0.1539 &  0.3952 & -3.8950e-01 &  0.6978 &  0.3489 \tabularnewline
geslacht & -0.9957 &  2.562 & -3.8870e-01 &  0.6984 &  0.3492 \tabularnewline
Product & +0.0932 &  0.2803 & +3.3250e-01 &  0.7402 &  0.3701 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=302244&T=2

[TABLE]
[ROW][C]Multiple Linear Regression - Ordinary Least Squares[/C][/ROW]
[ROW][C]Variable[/C][C]Parameter[/C][C]S.D.[/C][C]T-STAT (H0: parameter = 0)[/C][C]2-tail p-value[/C][C]1-tail p-value[/C][/ROW]
[ROW][C](Intercept)[/C][C]+17.07[/C][C] 3.52[/C][C]+4.8500e+00[/C][C] 4.754e-06[/C][C] 2.377e-06[/C][/ROW]
[ROW][C]Privacy[/C][C]-0.1539[/C][C] 0.3952[/C][C]-3.8950e-01[/C][C] 0.6978[/C][C] 0.3489[/C][/ROW]
[ROW][C]geslacht[/C][C]-0.9957[/C][C] 2.562[/C][C]-3.8870e-01[/C][C] 0.6984[/C][C] 0.3492[/C][/ROW]
[ROW][C]Product[/C][C]+0.0932[/C][C] 0.2803[/C][C]+3.3250e-01[/C][C] 0.7402[/C][C] 0.3701[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=302244&T=2

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302244&T=2
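
As in the module code reproduced at the bottom of this page, the 1-tail p-value column is simply the 2-tail p-value halved. A minimal sketch, reusing the mylm object from the sketch above:

mysum <- summary(mylm)
cbind(estimate   = mysum$coefficients[, 1],
      std.error  = mysum$coefficients[, 2],
      t.stat     = mysum$coefficients[, 3],
      p.two.tail = mysum$coefficients[, 4],
      p.one.tail = mysum$coefficients[, 4] / 2)   # halved, as in the module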








Multiple Linear Regression - Regression Statistics
Multiple R	0.06285
R-squared	0.00395
Adjusted R-squared	-0.02718
F-TEST (value)	0.1269
F-TEST (DF numerator)	3
F-TEST (DF denominator)	96
p-value	0.9439
Multiple Linear Regression - Residual Statistics
Residual Standard Deviation	1.903
Sum Squared Residuals	347.5

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Regression Statistics \tabularnewline
Multiple R &  0.06285 \tabularnewline
R-squared &  0.00395 \tabularnewline
Adjusted R-squared & -0.02718 \tabularnewline
F-TEST (value) &  0.1269 \tabularnewline
F-TEST (DF numerator) & 3 \tabularnewline
F-TEST (DF denominator) & 96 \tabularnewline
p-value &  0.9439 \tabularnewline
Multiple Linear Regression - Residual Statistics \tabularnewline
Residual Standard Deviation &  1.903 \tabularnewline
Sum Squared Residuals &  347.5 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=302244&T=3

[TABLE]
[ROW][C]Multiple Linear Regression - Regression Statistics[/C][/ROW]
[ROW][C]Multiple R[/C][C] 0.06285[/C][/ROW]
[ROW][C]R-squared[/C][C] 0.00395[/C][/ROW]
[ROW][C]Adjusted R-squared[/C][C]-0.02718[/C][/ROW]
[ROW][C]F-TEST (value)[/C][C] 0.1269[/C][/ROW]
[ROW][C]F-TEST (DF numerator)[/C][C]3[/C][/ROW]
[ROW][C]F-TEST (DF denominator)[/C][C]96[/C][/ROW]
[ROW][C]p-value[/C][C] 0.9439[/C][/ROW]
[ROW][C]Multiple Linear Regression - Residual Statistics[/C][/ROW]
[ROW][C]Residual Standard Deviation[/C][C] 1.903[/C][/ROW]
[ROW][C]Sum Squared Residuals[/C][C] 347.5[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=302244&T=3

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302244&T=3
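
Every entry of this table is a field of summary(mylm), as in the module code at the bottom of this page. A minimal sketch, reusing mylm from the first sketch:

mysum <- summary(mylm)
sqrt(mysum$r.squared)       # Multiple R
mysum$r.squared             # R-squared
mysum$adj.r.squared         # Adjusted R-squared
mysum$fstatistic            # F value, numerator DF, denominator DF
1 - pf(mysum$fstatistic[1], mysum$fstatistic[2], mysum$fstatistic[3])   # p-value of the F-test
mysum$sigma                 # Residual Standard Deviation
sum(residuals(mylm)^2)      # Sum Squared Residuals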








Multiple Linear Regression - Actuals, Interpolation, and Residuals
Time or Index	Actuals	Interpolation (Forecast)	Residuals (Prediction Error)
1	13	15.47	-2.471
2	16	15.28	0.7222
3	17	15.41	1.592
4	16	15.41	0.5923
5	17	15.47	1.529
6	17	15.53	1.468
7	15	15.71	-0.7143
8	16	15.53	0.4679
9	14	15.53	-1.532
10	16	15.47	0.5286
11	17	15.38	1.625
12	16	15.38	0.6248
13	16	15.41	0.5923
14	16	15.47	0.5286
15	15	15.53	-0.5321
16	16	15.47	0.5286
17	16	15.47	0.5286
18	13	15.47	-2.471
19	15	15.41	-0.4077
20	17	15.47	1.529
21	13	15.38	-2.375
22	17	15.59	1.407
23	14	15.41	-1.408
24	14	15.47	-1.471
25	18	15.47	2.529
26	17	15.41	1.592
27	13	15.41	-2.408
28	16	15.41	0.5923
29	15	15.41	-0.4077
30	15	15.41	-0.4077
31	13	15.38	-2.375
32	17	15.41	1.592
33	11	15.31	-4.31
34	14	15.65	-1.654
35	13	15.28	-2.278
36	17	15.47	1.529
37	16	15.41	0.5923
38	17	15.41	1.592
39	16	15.41	0.5923
40	16	15.41	0.5923
41	16	15.41	0.5923
42	15	15.59	-0.5929
43	12	15.41	-3.408
44	17	15.59	1.407
45	14	15.84	-1.836
46	16	15.31	0.6897
47	15	15.38	-0.3752
48	16	15.71	0.2857
49	14	15.38	-1.375
50	15	15.59	-0.5929
51	17	15.41	1.592
52	10	15.59	-5.593
53	17	15.47	1.529
54	20	15.34	4.657
55	17	15.41	1.592
56	18	15.41	2.592
57	17	15.59	1.407
58	14	15.34	-1.343
59	17	15.71	1.286
60	17	15.47	1.529
61	16	15.34	0.6572
62	18	15.41	2.592
63	18	15.84	2.164
64	16	15.41	0.5923
65	15	15.59	-0.5929
66	13	15.41	-2.408
67	16	15.47	0.5286
68	12	15.59	-3.593
69	16	15.41	0.5923
70	16	15.47	0.5286
71	16	15.41	0.5923
72	14	15.47	-1.471
73	15	15.53	-0.5321
74	14	15.53	-1.532
75	15	15.41	-0.4077
76	15	15.41	-0.4077
77	16	15.38	0.6248
78	11	15.41	-4.408
79	18	15.59	2.407
80	11	15.59	-4.593
81	18	15.59	2.407
82	15	15.34	-0.3428
83	19	15.47	3.529
84	17	15.84	1.164
85	14	15.59	-1.593
86	13	15.47	-2.471
87	17	15.34	1.657
88	14	15.38	-1.375
89	19	15.28	3.722
90	14	15.41	-1.408
91	16	15.59	0.4071
92	16	15.47	0.5286
93	15	15.41	-0.4077
94	12	15.38	-3.375
95	17	15.41	1.592
96	18	15.71	2.286
97	15	15.41	-0.4077
98	15	15.47	-0.4714
99	16	15.41	0.5923
100	16	15.65	0.3464

\begin{tabular}{lllllllll}
\hline
Multiple Linear Regression - Actuals, Interpolation, and Residuals \tabularnewline
Time or Index & Actuals & Interpolation (Forecast) & Residuals (Prediction Error) \tabularnewline
1 &  13 &  15.47 & -2.471 \tabularnewline
2 &  16 &  15.28 &  0.7222 \tabularnewline
3 &  17 &  15.41 &  1.592 \tabularnewline
4 &  16 &  15.41 &  0.5923 \tabularnewline
5 &  17 &  15.47 &  1.529 \tabularnewline
6 &  17 &  15.53 &  1.468 \tabularnewline
7 &  15 &  15.71 & -0.7143 \tabularnewline
8 &  16 &  15.53 &  0.4679 \tabularnewline
9 &  14 &  15.53 & -1.532 \tabularnewline
10 &  16 &  15.47 &  0.5286 \tabularnewline
11 &  17 &  15.38 &  1.625 \tabularnewline
12 &  16 &  15.38 &  0.6248 \tabularnewline
13 &  16 &  15.41 &  0.5923 \tabularnewline
14 &  16 &  15.47 &  0.5286 \tabularnewline
15 &  15 &  15.53 & -0.5321 \tabularnewline
16 &  16 &  15.47 &  0.5286 \tabularnewline
17 &  16 &  15.47 &  0.5286 \tabularnewline
18 &  13 &  15.47 & -2.471 \tabularnewline
19 &  15 &  15.41 & -0.4077 \tabularnewline
20 &  17 &  15.47 &  1.529 \tabularnewline
21 &  13 &  15.38 & -2.375 \tabularnewline
22 &  17 &  15.59 &  1.407 \tabularnewline
23 &  14 &  15.41 & -1.408 \tabularnewline
24 &  14 &  15.47 & -1.471 \tabularnewline
25 &  18 &  15.47 &  2.529 \tabularnewline
26 &  17 &  15.41 &  1.592 \tabularnewline
27 &  13 &  15.41 & -2.408 \tabularnewline
28 &  16 &  15.41 &  0.5923 \tabularnewline
29 &  15 &  15.41 & -0.4077 \tabularnewline
30 &  15 &  15.41 & -0.4077 \tabularnewline
31 &  13 &  15.38 & -2.375 \tabularnewline
32 &  17 &  15.41 &  1.592 \tabularnewline
33 &  11 &  15.31 & -4.31 \tabularnewline
34 &  14 &  15.65 & -1.654 \tabularnewline
35 &  13 &  15.28 & -2.278 \tabularnewline
36 &  17 &  15.47 &  1.529 \tabularnewline
37 &  16 &  15.41 &  0.5923 \tabularnewline
38 &  17 &  15.41 &  1.592 \tabularnewline
39 &  16 &  15.41 &  0.5923 \tabularnewline
40 &  16 &  15.41 &  0.5923 \tabularnewline
41 &  16 &  15.41 &  0.5923 \tabularnewline
42 &  15 &  15.59 & -0.5929 \tabularnewline
43 &  12 &  15.41 & -3.408 \tabularnewline
44 &  17 &  15.59 &  1.407 \tabularnewline
45 &  14 &  15.84 & -1.836 \tabularnewline
46 &  16 &  15.31 &  0.6897 \tabularnewline
47 &  15 &  15.38 & -0.3752 \tabularnewline
48 &  16 &  15.71 &  0.2857 \tabularnewline
49 &  14 &  15.38 & -1.375 \tabularnewline
50 &  15 &  15.59 & -0.5929 \tabularnewline
51 &  17 &  15.41 &  1.592 \tabularnewline
52 &  10 &  15.59 & -5.593 \tabularnewline
53 &  17 &  15.47 &  1.529 \tabularnewline
54 &  20 &  15.34 &  4.657 \tabularnewline
55 &  17 &  15.41 &  1.592 \tabularnewline
56 &  18 &  15.41 &  2.592 \tabularnewline
57 &  17 &  15.59 &  1.407 \tabularnewline
58 &  14 &  15.34 & -1.343 \tabularnewline
59 &  17 &  15.71 &  1.286 \tabularnewline
60 &  17 &  15.47 &  1.529 \tabularnewline
61 &  16 &  15.34 &  0.6572 \tabularnewline
62 &  18 &  15.41 &  2.592 \tabularnewline
63 &  18 &  15.84 &  2.164 \tabularnewline
64 &  16 &  15.41 &  0.5923 \tabularnewline
65 &  15 &  15.59 & -0.5929 \tabularnewline
66 &  13 &  15.41 & -2.408 \tabularnewline
67 &  16 &  15.47 &  0.5286 \tabularnewline
68 &  12 &  15.59 & -3.593 \tabularnewline
69 &  16 &  15.41 &  0.5923 \tabularnewline
70 &  16 &  15.47 &  0.5286 \tabularnewline
71 &  16 &  15.41 &  0.5923 \tabularnewline
72 &  14 &  15.47 & -1.471 \tabularnewline
73 &  15 &  15.53 & -0.5321 \tabularnewline
74 &  14 &  15.53 & -1.532 \tabularnewline
75 &  15 &  15.41 & -0.4077 \tabularnewline
76 &  15 &  15.41 & -0.4077 \tabularnewline
77 &  16 &  15.38 &  0.6248 \tabularnewline
78 &  11 &  15.41 & -4.408 \tabularnewline
79 &  18 &  15.59 &  2.407 \tabularnewline
80 &  11 &  15.59 & -4.593 \tabularnewline
81 &  18 &  15.59 &  2.407 \tabularnewline
82 &  15 &  15.34 & -0.3428 \tabularnewline
83 &  19 &  15.47 &  3.529 \tabularnewline
84 &  17 &  15.84 &  1.164 \tabularnewline
85 &  14 &  15.59 & -1.593 \tabularnewline
86 &  13 &  15.47 & -2.471 \tabularnewline
87 &  17 &  15.34 &  1.657 \tabularnewline
88 &  14 &  15.38 & -1.375 \tabularnewline
89 &  19 &  15.28 &  3.722 \tabularnewline
90 &  14 &  15.41 & -1.408 \tabularnewline
91 &  16 &  15.59 &  0.4071 \tabularnewline
92 &  16 &  15.47 &  0.5286 \tabularnewline
93 &  15 &  15.41 & -0.4077 \tabularnewline
94 &  12 &  15.38 & -3.375 \tabularnewline
95 &  17 &  15.41 &  1.592 \tabularnewline
96 &  18 &  15.71 &  2.286 \tabularnewline
97 &  15 &  15.41 & -0.4077 \tabularnewline
98 &  15 &  15.47 & -0.4714 \tabularnewline
99 &  16 &  15.41 &  0.5923 \tabularnewline
100 &  16 &  15.65 &  0.3464 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=302244&T=4

[TABLE]
[ROW][C]Multiple Linear Regression - Actuals, Interpolation, and Residuals[/C][/ROW]
[ROW][C]Time or Index[/C][C]Actuals[/C][C]Interpolation (Forecast)[/C][C]Residuals (Prediction Error)[/C][/ROW]
[ROW][C]1[/C][C] 13[/C][C] 15.47[/C][C]-2.471[/C][/ROW]
[ROW][C]2[/C][C] 16[/C][C] 15.28[/C][C] 0.7222[/C][/ROW]
[ROW][C]3[/C][C] 17[/C][C] 15.41[/C][C] 1.592[/C][/ROW]
[ROW][C]4[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]5[/C][C] 17[/C][C] 15.47[/C][C] 1.529[/C][/ROW]
[ROW][C]6[/C][C] 17[/C][C] 15.53[/C][C] 1.468[/C][/ROW]
[ROW][C]7[/C][C] 15[/C][C] 15.71[/C][C]-0.7143[/C][/ROW]
[ROW][C]8[/C][C] 16[/C][C] 15.53[/C][C] 0.4679[/C][/ROW]
[ROW][C]9[/C][C] 14[/C][C] 15.53[/C][C]-1.532[/C][/ROW]
[ROW][C]10[/C][C] 16[/C][C] 15.47[/C][C] 0.5286[/C][/ROW]
[ROW][C]11[/C][C] 17[/C][C] 15.38[/C][C] 1.625[/C][/ROW]
[ROW][C]12[/C][C] 16[/C][C] 15.38[/C][C] 0.6248[/C][/ROW]
[ROW][C]13[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]14[/C][C] 16[/C][C] 15.47[/C][C] 0.5286[/C][/ROW]
[ROW][C]15[/C][C] 15[/C][C] 15.53[/C][C]-0.5321[/C][/ROW]
[ROW][C]16[/C][C] 16[/C][C] 15.47[/C][C] 0.5286[/C][/ROW]
[ROW][C]17[/C][C] 16[/C][C] 15.47[/C][C] 0.5286[/C][/ROW]
[ROW][C]18[/C][C] 13[/C][C] 15.47[/C][C]-2.471[/C][/ROW]
[ROW][C]19[/C][C] 15[/C][C] 15.41[/C][C]-0.4077[/C][/ROW]
[ROW][C]20[/C][C] 17[/C][C] 15.47[/C][C] 1.529[/C][/ROW]
[ROW][C]21[/C][C] 13[/C][C] 15.38[/C][C]-2.375[/C][/ROW]
[ROW][C]22[/C][C] 17[/C][C] 15.59[/C][C] 1.407[/C][/ROW]
[ROW][C]23[/C][C] 14[/C][C] 15.41[/C][C]-1.408[/C][/ROW]
[ROW][C]24[/C][C] 14[/C][C] 15.47[/C][C]-1.471[/C][/ROW]
[ROW][C]25[/C][C] 18[/C][C] 15.47[/C][C] 2.529[/C][/ROW]
[ROW][C]26[/C][C] 17[/C][C] 15.41[/C][C] 1.592[/C][/ROW]
[ROW][C]27[/C][C] 13[/C][C] 15.41[/C][C]-2.408[/C][/ROW]
[ROW][C]28[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]29[/C][C] 15[/C][C] 15.41[/C][C]-0.4077[/C][/ROW]
[ROW][C]30[/C][C] 15[/C][C] 15.41[/C][C]-0.4077[/C][/ROW]
[ROW][C]31[/C][C] 13[/C][C] 15.38[/C][C]-2.375[/C][/ROW]
[ROW][C]32[/C][C] 17[/C][C] 15.41[/C][C] 1.592[/C][/ROW]
[ROW][C]33[/C][C] 11[/C][C] 15.31[/C][C]-4.31[/C][/ROW]
[ROW][C]34[/C][C] 14[/C][C] 15.65[/C][C]-1.654[/C][/ROW]
[ROW][C]35[/C][C] 13[/C][C] 15.28[/C][C]-2.278[/C][/ROW]
[ROW][C]36[/C][C] 17[/C][C] 15.47[/C][C] 1.529[/C][/ROW]
[ROW][C]37[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]38[/C][C] 17[/C][C] 15.41[/C][C] 1.592[/C][/ROW]
[ROW][C]39[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]40[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]41[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]42[/C][C] 15[/C][C] 15.59[/C][C]-0.5929[/C][/ROW]
[ROW][C]43[/C][C] 12[/C][C] 15.41[/C][C]-3.408[/C][/ROW]
[ROW][C]44[/C][C] 17[/C][C] 15.59[/C][C] 1.407[/C][/ROW]
[ROW][C]45[/C][C] 14[/C][C] 15.84[/C][C]-1.836[/C][/ROW]
[ROW][C]46[/C][C] 16[/C][C] 15.31[/C][C] 0.6897[/C][/ROW]
[ROW][C]47[/C][C] 15[/C][C] 15.38[/C][C]-0.3752[/C][/ROW]
[ROW][C]48[/C][C] 16[/C][C] 15.71[/C][C] 0.2857[/C][/ROW]
[ROW][C]49[/C][C] 14[/C][C] 15.38[/C][C]-1.375[/C][/ROW]
[ROW][C]50[/C][C] 15[/C][C] 15.59[/C][C]-0.5929[/C][/ROW]
[ROW][C]51[/C][C] 17[/C][C] 15.41[/C][C] 1.592[/C][/ROW]
[ROW][C]52[/C][C] 10[/C][C] 15.59[/C][C]-5.593[/C][/ROW]
[ROW][C]53[/C][C] 17[/C][C] 15.47[/C][C] 1.529[/C][/ROW]
[ROW][C]54[/C][C] 20[/C][C] 15.34[/C][C] 4.657[/C][/ROW]
[ROW][C]55[/C][C] 17[/C][C] 15.41[/C][C] 1.592[/C][/ROW]
[ROW][C]56[/C][C] 18[/C][C] 15.41[/C][C] 2.592[/C][/ROW]
[ROW][C]57[/C][C] 17[/C][C] 15.59[/C][C] 1.407[/C][/ROW]
[ROW][C]58[/C][C] 14[/C][C] 15.34[/C][C]-1.343[/C][/ROW]
[ROW][C]59[/C][C] 17[/C][C] 15.71[/C][C] 1.286[/C][/ROW]
[ROW][C]60[/C][C] 17[/C][C] 15.47[/C][C] 1.529[/C][/ROW]
[ROW][C]61[/C][C] 16[/C][C] 15.34[/C][C] 0.6572[/C][/ROW]
[ROW][C]62[/C][C] 18[/C][C] 15.41[/C][C] 2.592[/C][/ROW]
[ROW][C]63[/C][C] 18[/C][C] 15.84[/C][C] 2.164[/C][/ROW]
[ROW][C]64[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]65[/C][C] 15[/C][C] 15.59[/C][C]-0.5929[/C][/ROW]
[ROW][C]66[/C][C] 13[/C][C] 15.41[/C][C]-2.408[/C][/ROW]
[ROW][C]67[/C][C] 16[/C][C] 15.47[/C][C] 0.5286[/C][/ROW]
[ROW][C]68[/C][C] 12[/C][C] 15.59[/C][C]-3.593[/C][/ROW]
[ROW][C]69[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]70[/C][C] 16[/C][C] 15.47[/C][C] 0.5286[/C][/ROW]
[ROW][C]71[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]72[/C][C] 14[/C][C] 15.47[/C][C]-1.471[/C][/ROW]
[ROW][C]73[/C][C] 15[/C][C] 15.53[/C][C]-0.5321[/C][/ROW]
[ROW][C]74[/C][C] 14[/C][C] 15.53[/C][C]-1.532[/C][/ROW]
[ROW][C]75[/C][C] 15[/C][C] 15.41[/C][C]-0.4077[/C][/ROW]
[ROW][C]76[/C][C] 15[/C][C] 15.41[/C][C]-0.4077[/C][/ROW]
[ROW][C]77[/C][C] 16[/C][C] 15.38[/C][C] 0.6248[/C][/ROW]
[ROW][C]78[/C][C] 11[/C][C] 15.41[/C][C]-4.408[/C][/ROW]
[ROW][C]79[/C][C] 18[/C][C] 15.59[/C][C] 2.407[/C][/ROW]
[ROW][C]80[/C][C] 11[/C][C] 15.59[/C][C]-4.593[/C][/ROW]
[ROW][C]81[/C][C] 18[/C][C] 15.59[/C][C] 2.407[/C][/ROW]
[ROW][C]82[/C][C] 15[/C][C] 15.34[/C][C]-0.3428[/C][/ROW]
[ROW][C]83[/C][C] 19[/C][C] 15.47[/C][C] 3.529[/C][/ROW]
[ROW][C]84[/C][C] 17[/C][C] 15.84[/C][C] 1.164[/C][/ROW]
[ROW][C]85[/C][C] 14[/C][C] 15.59[/C][C]-1.593[/C][/ROW]
[ROW][C]86[/C][C] 13[/C][C] 15.47[/C][C]-2.471[/C][/ROW]
[ROW][C]87[/C][C] 17[/C][C] 15.34[/C][C] 1.657[/C][/ROW]
[ROW][C]88[/C][C] 14[/C][C] 15.38[/C][C]-1.375[/C][/ROW]
[ROW][C]89[/C][C] 19[/C][C] 15.28[/C][C] 3.722[/C][/ROW]
[ROW][C]90[/C][C] 14[/C][C] 15.41[/C][C]-1.408[/C][/ROW]
[ROW][C]91[/C][C] 16[/C][C] 15.59[/C][C] 0.4071[/C][/ROW]
[ROW][C]92[/C][C] 16[/C][C] 15.47[/C][C] 0.5286[/C][/ROW]
[ROW][C]93[/C][C] 15[/C][C] 15.41[/C][C]-0.4077[/C][/ROW]
[ROW][C]94[/C][C] 12[/C][C] 15.38[/C][C]-3.375[/C][/ROW]
[ROW][C]95[/C][C] 17[/C][C] 15.41[/C][C] 1.592[/C][/ROW]
[ROW][C]96[/C][C] 18[/C][C] 15.71[/C][C] 2.286[/C][/ROW]
[ROW][C]97[/C][C] 15[/C][C] 15.41[/C][C]-0.4077[/C][/ROW]
[ROW][C]98[/C][C] 15[/C][C] 15.47[/C][C]-0.4714[/C][/ROW]
[ROW][C]99[/C][C] 16[/C][C] 15.41[/C][C] 0.5923[/C][/ROW]
[ROW][C]100[/C][C] 16[/C][C] 15.65[/C][C] 0.3464[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=302244&T=4

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302244&T=4
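
The interpolation column is the fitted value of each observation and the residual is the actual minus the fitted value. A minimal sketch, reusing mylm and df from the first sketch:

interpolation    <- fitted(mylm)      # 'Interpolation (Forecast)' column
prediction_error <- residuals(mylm)   # 'Residuals (Prediction Error)' column
all.equal(unname(interpolation + prediction_error), df$TV)   # actuals = fitted + residuals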








Goldfeld-Quandt test for Heteroskedasticity
p-values by Alternative Hypothesis
breakpoint index	greater	2-sided	less
7 0.6136 0.7728 0.3864
8 0.4535 0.907 0.5465
9 0.396 0.7919 0.604
10 0.2814 0.5627 0.7186
11 0.1947 0.3893 0.8053
12 0.1267 0.2535 0.8733
13 0.07953 0.1591 0.9205
14 0.04775 0.09549 0.9523
15 0.02782 0.05564 0.9722
16 0.01556 0.03113 0.9844
17 0.008339 0.01668 0.9917
18 0.02382 0.04765 0.9762
19 0.0191 0.0382 0.9809
20 0.01852 0.03703 0.9815
21 0.0497 0.09941 0.9503
22 0.04564 0.09129 0.9544
23 0.04311 0.08623 0.9569
24 0.03865 0.0773 0.9614
25 0.05831 0.1166 0.9417
26 0.05135 0.1027 0.9486
27 0.07333 0.1467 0.9267
28 0.0535 0.107 0.9465
29 0.03754 0.07508 0.9625
30 0.02573 0.05146 0.9743
31 0.03551 0.07102 0.9645
32 0.03379 0.06757 0.9662
33 0.1067 0.2135 0.8933
34 0.09541 0.1908 0.9046
35 0.0942 0.1884 0.9058
36 0.08391 0.1678 0.9161
37 0.06402 0.128 0.936
38 0.0592 0.1184 0.9408
39 0.04414 0.08828 0.9559
40 0.03241 0.06483 0.9676
41 0.02345 0.04691 0.9765
42 0.01638 0.03276 0.9836
43 0.04279 0.08558 0.9572
44 0.03936 0.07872 0.9606
45 0.03443 0.06886 0.9656
46 0.03262 0.06523 0.9674
47 0.02351 0.04701 0.9765
48 0.01762 0.03525 0.9824
49 0.01486 0.02972 0.9851
50 0.01038 0.02075 0.9896
51 0.009567 0.01913 0.9904
52 0.1221 0.2442 0.8779
53 0.1125 0.225 0.8875
54 0.318 0.636 0.682
55 0.3062 0.6125 0.6938
56 0.3655 0.7309 0.6345
57 0.3446 0.6892 0.6554
58 0.333 0.6661 0.667
59 0.3114 0.6228 0.6886
60 0.3 0.5999 0.7
61 0.2556 0.5111 0.7444
62 0.3233 0.6465 0.6767
63 0.3323 0.6645 0.6677
64 0.2963 0.5926 0.7037
65 0.2495 0.499 0.7505
66 0.2606 0.5213 0.7394
67 0.2203 0.4407 0.7797
68 0.3577 0.7153 0.6423
69 0.3204 0.6408 0.6796
70 0.274 0.548 0.726
71 0.2445 0.4891 0.7555
72 0.2157 0.4314 0.7843
73 0.1727 0.3454 0.8273
74 0.1575 0.3149 0.8425
75 0.1239 0.2478 0.8761
76 0.09587 0.1917 0.9041
77 0.07327 0.1465 0.9267
78 0.1604 0.3208 0.8396
79 0.1725 0.345 0.8275
80 0.525 0.9499 0.475
81 0.5345 0.9309 0.4655
82 0.4824 0.9649 0.5176
83 0.7579 0.4843 0.2421
84 0.7044 0.5912 0.2956
85 0.7074 0.5852 0.2926
86 0.7272 0.5455 0.2728
87 0.6566 0.6868 0.3434
88 0.6262 0.7476 0.3738
89 0.9668 0.06646 0.03323
90 0.976 0.04801 0.02401
91 0.9454 0.1092 0.05458
92 0.909 0.182 0.091
93 0.8478 0.3045 0.1522

\begin{tabular}{lllllllll}
\hline
Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
p-values & Alternative Hypothesis \tabularnewline
breakpoint index & greater & 2-sided & less \tabularnewline
7 &  0.6136 &  0.7728 &  0.3864 \tabularnewline
8 &  0.4535 &  0.907 &  0.5465 \tabularnewline
9 &  0.396 &  0.7919 &  0.604 \tabularnewline
10 &  0.2814 &  0.5627 &  0.7186 \tabularnewline
11 &  0.1947 &  0.3893 &  0.8053 \tabularnewline
12 &  0.1267 &  0.2535 &  0.8733 \tabularnewline
13 &  0.07953 &  0.1591 &  0.9205 \tabularnewline
14 &  0.04775 &  0.09549 &  0.9523 \tabularnewline
15 &  0.02782 &  0.05564 &  0.9722 \tabularnewline
16 &  0.01556 &  0.03113 &  0.9844 \tabularnewline
17 &  0.008339 &  0.01668 &  0.9917 \tabularnewline
18 &  0.02382 &  0.04765 &  0.9762 \tabularnewline
19 &  0.0191 &  0.0382 &  0.9809 \tabularnewline
20 &  0.01852 &  0.03703 &  0.9815 \tabularnewline
21 &  0.0497 &  0.09941 &  0.9503 \tabularnewline
22 &  0.04564 &  0.09129 &  0.9544 \tabularnewline
23 &  0.04311 &  0.08623 &  0.9569 \tabularnewline
24 &  0.03865 &  0.0773 &  0.9614 \tabularnewline
25 &  0.05831 &  0.1166 &  0.9417 \tabularnewline
26 &  0.05135 &  0.1027 &  0.9486 \tabularnewline
27 &  0.07333 &  0.1467 &  0.9267 \tabularnewline
28 &  0.0535 &  0.107 &  0.9465 \tabularnewline
29 &  0.03754 &  0.07508 &  0.9625 \tabularnewline
30 &  0.02573 &  0.05146 &  0.9743 \tabularnewline
31 &  0.03551 &  0.07102 &  0.9645 \tabularnewline
32 &  0.03379 &  0.06757 &  0.9662 \tabularnewline
33 &  0.1067 &  0.2135 &  0.8933 \tabularnewline
34 &  0.09541 &  0.1908 &  0.9046 \tabularnewline
35 &  0.0942 &  0.1884 &  0.9058 \tabularnewline
36 &  0.08391 &  0.1678 &  0.9161 \tabularnewline
37 &  0.06402 &  0.128 &  0.936 \tabularnewline
38 &  0.0592 &  0.1184 &  0.9408 \tabularnewline
39 &  0.04414 &  0.08828 &  0.9559 \tabularnewline
40 &  0.03241 &  0.06483 &  0.9676 \tabularnewline
41 &  0.02345 &  0.04691 &  0.9765 \tabularnewline
42 &  0.01638 &  0.03276 &  0.9836 \tabularnewline
43 &  0.04279 &  0.08558 &  0.9572 \tabularnewline
44 &  0.03936 &  0.07872 &  0.9606 \tabularnewline
45 &  0.03443 &  0.06886 &  0.9656 \tabularnewline
46 &  0.03262 &  0.06523 &  0.9674 \tabularnewline
47 &  0.02351 &  0.04701 &  0.9765 \tabularnewline
48 &  0.01762 &  0.03525 &  0.9824 \tabularnewline
49 &  0.01486 &  0.02972 &  0.9851 \tabularnewline
50 &  0.01038 &  0.02075 &  0.9896 \tabularnewline
51 &  0.009567 &  0.01913 &  0.9904 \tabularnewline
52 &  0.1221 &  0.2442 &  0.8779 \tabularnewline
53 &  0.1125 &  0.225 &  0.8875 \tabularnewline
54 &  0.318 &  0.636 &  0.682 \tabularnewline
55 &  0.3062 &  0.6125 &  0.6938 \tabularnewline
56 &  0.3655 &  0.7309 &  0.6345 \tabularnewline
57 &  0.3446 &  0.6892 &  0.6554 \tabularnewline
58 &  0.333 &  0.6661 &  0.667 \tabularnewline
59 &  0.3114 &  0.6228 &  0.6886 \tabularnewline
60 &  0.3 &  0.5999 &  0.7 \tabularnewline
61 &  0.2556 &  0.5111 &  0.7444 \tabularnewline
62 &  0.3233 &  0.6465 &  0.6767 \tabularnewline
63 &  0.3323 &  0.6645 &  0.6677 \tabularnewline
64 &  0.2963 &  0.5926 &  0.7037 \tabularnewline
65 &  0.2495 &  0.499 &  0.7505 \tabularnewline
66 &  0.2606 &  0.5213 &  0.7394 \tabularnewline
67 &  0.2203 &  0.4407 &  0.7797 \tabularnewline
68 &  0.3577 &  0.7153 &  0.6423 \tabularnewline
69 &  0.3204 &  0.6408 &  0.6796 \tabularnewline
70 &  0.274 &  0.548 &  0.726 \tabularnewline
71 &  0.2445 &  0.4891 &  0.7555 \tabularnewline
72 &  0.2157 &  0.4314 &  0.7843 \tabularnewline
73 &  0.1727 &  0.3454 &  0.8273 \tabularnewline
74 &  0.1575 &  0.3149 &  0.8425 \tabularnewline
75 &  0.1239 &  0.2478 &  0.8761 \tabularnewline
76 &  0.09587 &  0.1917 &  0.9041 \tabularnewline
77 &  0.07327 &  0.1465 &  0.9267 \tabularnewline
78 &  0.1604 &  0.3208 &  0.8396 \tabularnewline
79 &  0.1725 &  0.345 &  0.8275 \tabularnewline
80 &  0.525 &  0.9499 &  0.475 \tabularnewline
81 &  0.5345 &  0.9309 &  0.4655 \tabularnewline
82 &  0.4824 &  0.9649 &  0.5176 \tabularnewline
83 &  0.7579 &  0.4843 &  0.2421 \tabularnewline
84 &  0.7044 &  0.5912 &  0.2956 \tabularnewline
85 &  0.7074 &  0.5852 &  0.2926 \tabularnewline
86 &  0.7272 &  0.5455 &  0.2728 \tabularnewline
87 &  0.6566 &  0.6868 &  0.3434 \tabularnewline
88 &  0.6262 &  0.7476 &  0.3738 \tabularnewline
89 &  0.9668 &  0.06646 &  0.03323 \tabularnewline
90 &  0.976 &  0.04801 &  0.02401 \tabularnewline
91 &  0.9454 &  0.1092 &  0.05458 \tabularnewline
92 &  0.909 &  0.182 &  0.091 \tabularnewline
93 &  0.8478 &  0.3045 &  0.1522 \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=302244&T=5

[TABLE]
[ROW][C]Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]p-values[/C][C]Alternative Hypothesis[/C][/ROW]
[ROW][C]breakpoint index[/C][C]greater[/C][C]2-sided[/C][C]less[/C][/ROW]
[ROW][C]7[/C][C] 0.6136[/C][C] 0.7728[/C][C] 0.3864[/C][/ROW]
[ROW][C]8[/C][C] 0.4535[/C][C] 0.907[/C][C] 0.5465[/C][/ROW]
[ROW][C]9[/C][C] 0.396[/C][C] 0.7919[/C][C] 0.604[/C][/ROW]
[ROW][C]10[/C][C] 0.2814[/C][C] 0.5627[/C][C] 0.7186[/C][/ROW]
[ROW][C]11[/C][C] 0.1947[/C][C] 0.3893[/C][C] 0.8053[/C][/ROW]
[ROW][C]12[/C][C] 0.1267[/C][C] 0.2535[/C][C] 0.8733[/C][/ROW]
[ROW][C]13[/C][C] 0.07953[/C][C] 0.1591[/C][C] 0.9205[/C][/ROW]
[ROW][C]14[/C][C] 0.04775[/C][C] 0.09549[/C][C] 0.9523[/C][/ROW]
[ROW][C]15[/C][C] 0.02782[/C][C] 0.05564[/C][C] 0.9722[/C][/ROW]
[ROW][C]16[/C][C] 0.01556[/C][C] 0.03113[/C][C] 0.9844[/C][/ROW]
[ROW][C]17[/C][C] 0.008339[/C][C] 0.01668[/C][C] 0.9917[/C][/ROW]
[ROW][C]18[/C][C] 0.02382[/C][C] 0.04765[/C][C] 0.9762[/C][/ROW]
[ROW][C]19[/C][C] 0.0191[/C][C] 0.0382[/C][C] 0.9809[/C][/ROW]
[ROW][C]20[/C][C] 0.01852[/C][C] 0.03703[/C][C] 0.9815[/C][/ROW]
[ROW][C]21[/C][C] 0.0497[/C][C] 0.09941[/C][C] 0.9503[/C][/ROW]
[ROW][C]22[/C][C] 0.04564[/C][C] 0.09129[/C][C] 0.9544[/C][/ROW]
[ROW][C]23[/C][C] 0.04311[/C][C] 0.08623[/C][C] 0.9569[/C][/ROW]
[ROW][C]24[/C][C] 0.03865[/C][C] 0.0773[/C][C] 0.9614[/C][/ROW]
[ROW][C]25[/C][C] 0.05831[/C][C] 0.1166[/C][C] 0.9417[/C][/ROW]
[ROW][C]26[/C][C] 0.05135[/C][C] 0.1027[/C][C] 0.9486[/C][/ROW]
[ROW][C]27[/C][C] 0.07333[/C][C] 0.1467[/C][C] 0.9267[/C][/ROW]
[ROW][C]28[/C][C] 0.0535[/C][C] 0.107[/C][C] 0.9465[/C][/ROW]
[ROW][C]29[/C][C] 0.03754[/C][C] 0.07508[/C][C] 0.9625[/C][/ROW]
[ROW][C]30[/C][C] 0.02573[/C][C] 0.05146[/C][C] 0.9743[/C][/ROW]
[ROW][C]31[/C][C] 0.03551[/C][C] 0.07102[/C][C] 0.9645[/C][/ROW]
[ROW][C]32[/C][C] 0.03379[/C][C] 0.06757[/C][C] 0.9662[/C][/ROW]
[ROW][C]33[/C][C] 0.1067[/C][C] 0.2135[/C][C] 0.8933[/C][/ROW]
[ROW][C]34[/C][C] 0.09541[/C][C] 0.1908[/C][C] 0.9046[/C][/ROW]
[ROW][C]35[/C][C] 0.0942[/C][C] 0.1884[/C][C] 0.9058[/C][/ROW]
[ROW][C]36[/C][C] 0.08391[/C][C] 0.1678[/C][C] 0.9161[/C][/ROW]
[ROW][C]37[/C][C] 0.06402[/C][C] 0.128[/C][C] 0.936[/C][/ROW]
[ROW][C]38[/C][C] 0.0592[/C][C] 0.1184[/C][C] 0.9408[/C][/ROW]
[ROW][C]39[/C][C] 0.04414[/C][C] 0.08828[/C][C] 0.9559[/C][/ROW]
[ROW][C]40[/C][C] 0.03241[/C][C] 0.06483[/C][C] 0.9676[/C][/ROW]
[ROW][C]41[/C][C] 0.02345[/C][C] 0.04691[/C][C] 0.9765[/C][/ROW]
[ROW][C]42[/C][C] 0.01638[/C][C] 0.03276[/C][C] 0.9836[/C][/ROW]
[ROW][C]43[/C][C] 0.04279[/C][C] 0.08558[/C][C] 0.9572[/C][/ROW]
[ROW][C]44[/C][C] 0.03936[/C][C] 0.07872[/C][C] 0.9606[/C][/ROW]
[ROW][C]45[/C][C] 0.03443[/C][C] 0.06886[/C][C] 0.9656[/C][/ROW]
[ROW][C]46[/C][C] 0.03262[/C][C] 0.06523[/C][C] 0.9674[/C][/ROW]
[ROW][C]47[/C][C] 0.02351[/C][C] 0.04701[/C][C] 0.9765[/C][/ROW]
[ROW][C]48[/C][C] 0.01762[/C][C] 0.03525[/C][C] 0.9824[/C][/ROW]
[ROW][C]49[/C][C] 0.01486[/C][C] 0.02972[/C][C] 0.9851[/C][/ROW]
[ROW][C]50[/C][C] 0.01038[/C][C] 0.02075[/C][C] 0.9896[/C][/ROW]
[ROW][C]51[/C][C] 0.009567[/C][C] 0.01913[/C][C] 0.9904[/C][/ROW]
[ROW][C]52[/C][C] 0.1221[/C][C] 0.2442[/C][C] 0.8779[/C][/ROW]
[ROW][C]53[/C][C] 0.1125[/C][C] 0.225[/C][C] 0.8875[/C][/ROW]
[ROW][C]54[/C][C] 0.318[/C][C] 0.636[/C][C] 0.682[/C][/ROW]
[ROW][C]55[/C][C] 0.3062[/C][C] 0.6125[/C][C] 0.6938[/C][/ROW]
[ROW][C]56[/C][C] 0.3655[/C][C] 0.7309[/C][C] 0.6345[/C][/ROW]
[ROW][C]57[/C][C] 0.3446[/C][C] 0.6892[/C][C] 0.6554[/C][/ROW]
[ROW][C]58[/C][C] 0.333[/C][C] 0.6661[/C][C] 0.667[/C][/ROW]
[ROW][C]59[/C][C] 0.3114[/C][C] 0.6228[/C][C] 0.6886[/C][/ROW]
[ROW][C]60[/C][C] 0.3[/C][C] 0.5999[/C][C] 0.7[/C][/ROW]
[ROW][C]61[/C][C] 0.2556[/C][C] 0.5111[/C][C] 0.7444[/C][/ROW]
[ROW][C]62[/C][C] 0.3233[/C][C] 0.6465[/C][C] 0.6767[/C][/ROW]
[ROW][C]63[/C][C] 0.3323[/C][C] 0.6645[/C][C] 0.6677[/C][/ROW]
[ROW][C]64[/C][C] 0.2963[/C][C] 0.5926[/C][C] 0.7037[/C][/ROW]
[ROW][C]65[/C][C] 0.2495[/C][C] 0.499[/C][C] 0.7505[/C][/ROW]
[ROW][C]66[/C][C] 0.2606[/C][C] 0.5213[/C][C] 0.7394[/C][/ROW]
[ROW][C]67[/C][C] 0.2203[/C][C] 0.4407[/C][C] 0.7797[/C][/ROW]
[ROW][C]68[/C][C] 0.3577[/C][C] 0.7153[/C][C] 0.6423[/C][/ROW]
[ROW][C]69[/C][C] 0.3204[/C][C] 0.6408[/C][C] 0.6796[/C][/ROW]
[ROW][C]70[/C][C] 0.274[/C][C] 0.548[/C][C] 0.726[/C][/ROW]
[ROW][C]71[/C][C] 0.2445[/C][C] 0.4891[/C][C] 0.7555[/C][/ROW]
[ROW][C]72[/C][C] 0.2157[/C][C] 0.4314[/C][C] 0.7843[/C][/ROW]
[ROW][C]73[/C][C] 0.1727[/C][C] 0.3454[/C][C] 0.8273[/C][/ROW]
[ROW][C]74[/C][C] 0.1575[/C][C] 0.3149[/C][C] 0.8425[/C][/ROW]
[ROW][C]75[/C][C] 0.1239[/C][C] 0.2478[/C][C] 0.8761[/C][/ROW]
[ROW][C]76[/C][C] 0.09587[/C][C] 0.1917[/C][C] 0.9041[/C][/ROW]
[ROW][C]77[/C][C] 0.07327[/C][C] 0.1465[/C][C] 0.9267[/C][/ROW]
[ROW][C]78[/C][C] 0.1604[/C][C] 0.3208[/C][C] 0.8396[/C][/ROW]
[ROW][C]79[/C][C] 0.1725[/C][C] 0.345[/C][C] 0.8275[/C][/ROW]
[ROW][C]80[/C][C] 0.525[/C][C] 0.9499[/C][C] 0.475[/C][/ROW]
[ROW][C]81[/C][C] 0.5345[/C][C] 0.9309[/C][C] 0.4655[/C][/ROW]
[ROW][C]82[/C][C] 0.4824[/C][C] 0.9649[/C][C] 0.5176[/C][/ROW]
[ROW][C]83[/C][C] 0.7579[/C][C] 0.4843[/C][C] 0.2421[/C][/ROW]
[ROW][C]84[/C][C] 0.7044[/C][C] 0.5912[/C][C] 0.2956[/C][/ROW]
[ROW][C]85[/C][C] 0.7074[/C][C] 0.5852[/C][C] 0.2926[/C][/ROW]
[ROW][C]86[/C][C] 0.7272[/C][C] 0.5455[/C][C] 0.2728[/C][/ROW]
[ROW][C]87[/C][C] 0.6566[/C][C] 0.6868[/C][C] 0.3434[/C][/ROW]
[ROW][C]88[/C][C] 0.6262[/C][C] 0.7476[/C][C] 0.3738[/C][/ROW]
[ROW][C]89[/C][C] 0.9668[/C][C] 0.06646[/C][C] 0.03323[/C][/ROW]
[ROW][C]90[/C][C] 0.976[/C][C] 0.04801[/C][C] 0.02401[/C][/ROW]
[ROW][C]91[/C][C] 0.9454[/C][C] 0.1092[/C][C] 0.05458[/C][/ROW]
[ROW][C]92[/C][C] 0.909[/C][C] 0.182[/C][C] 0.091[/C][/ROW]
[ROW][C]93[/C][C] 0.8478[/C][C] 0.3045[/C][C] 0.1522[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=302244&T=5

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302244&T=5
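
Each row of this table is one call to lmtest::gqtest() with a different breakpoint, exactly as in the module code at the bottom of this page. A minimal sketch that should reproduce the row for breakpoint index 17, reusing mylm from the first sketch:

library(lmtest)
# Goldfeld-Quandt p-values with the sample split after observation 17,
# under each of the three alternative hypotheses reported above.
sapply(c('greater', 'two.sided', 'less'),
       function(alt) gqtest(mylm, point = 17, alternative = alt)$p.value)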








Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity
Description	# significant tests	% significant tests	OK/NOK
1% type I error level	0	0	OK
5% type I error level	13	0.149425	NOK
10% type I error level	30	0.344828	NOK

\begin{tabular}{lllllllll}
\hline
Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity \tabularnewline
Description & # significant tests & % significant tests & OK/NOK \tabularnewline
1% type I error level & 0 &  0 & OK \tabularnewline
5% type I error level & 13 & 0.149425 & NOK \tabularnewline
10% type I error level & 30 & 0.344828 & NOK \tabularnewline
\hline
\end{tabular}
%Source: https://freestatistics.org/blog/index.php?pk=302244&T=6

[TABLE]
[ROW][C]Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity[/C][/ROW]
[ROW][C]Description[/C][C]# significant tests[/C][C]% significant tests[/C][C]OK/NOK[/C][/ROW]
[ROW][C]1% type I error level[/C][C]0[/C][C] 0[/C][C]OK[/C][/ROW]
[ROW][C]5% type I error level[/C][C]13[/C][C]0.149425[/C][C]NOK[/C][/ROW]
[ROW][C]10% type I error level[/C][C]30[/C][C]0.344828[/C][C]NOK[/C][/ROW]
[/TABLE]
Source: https://freestatistics.org/blog/index.php?pk=302244&T=6

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302244&T=6
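
The meta analysis counts how many of the 2-sided p-values above fall below each type I error level (the '% significant tests' column is reported as a fraction of the 87 tests). A minimal sketch, reusing mylm and library(lmtest) from the sketches above; it should match the counts 0, 13 and 30:

# 2-sided Goldfeld-Quandt p-value for every breakpoint reported above (7 to 93).
gq_p <- sapply(7:93, function(bp) gqtest(mylm, point = bp, alternative = 'two.sided')$p.value)
# Number and share of significant tests at the 1%, 5% and 10% levels.
sapply(c(0.01, 0.05, 0.10),
       function(alpha) c(level = alpha, n.significant = sum(gq_p < alpha), share = mean(gq_p < alpha)))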








Ramsey RESET F-Test for powers (2 and 3) of fitted values
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.6929, df1 = 2, df2 = 94, p-value = 0.1895
Ramsey RESET F-Test for powers (2 and 3) of regressors
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.60765, df1 = 6, df2 = 90, p-value = 0.7236
Ramsey RESET F-Test for powers (2 and 3) of principal components
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.1494, df1 = 2, df2 = 94, p-value = 0.3213

\begin{tabular}{lllllllll}
\hline
Ramsey RESET F-Test for powers (2 and 3) of fitted values \tabularnewline
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.6929, df1 = 2, df2 = 94, p-value = 0.1895
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of regressors \tabularnewline
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.60765, df1 = 6, df2 = 90, p-value = 0.7236
\tabularnewline Ramsey RESET F-Test for powers (2 and 3) of principal components \tabularnewline
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.1494, df1 = 2, df2 = 94, p-value = 0.3213
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=302244&T=7

[TABLE]
[ROW][C]Ramsey RESET F-Test for powers (2 and 3) of fitted values[/C][/ROW]
[ROW][C]
> reset_test_fitted
	RESET test
data:  mylm
RESET = 1.6929, df1 = 2, df2 = 94, p-value = 0.1895
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of regressors[/C][/ROW] [ROW][C]
> reset_test_regressors
	RESET test
data:  mylm
RESET = 0.60765, df1 = 6, df2 = 90, p-value = 0.7236
[/C][/ROW] [ROW][C]Ramsey RESET F-Test for powers (2 and 3) of principal components[/C][/ROW] [ROW][C]
> reset_test_principal_components
	RESET test
data:  mylm
RESET = 1.1494, df1 = 2, df2 = 94, p-value = 0.3213
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=302244&T=7

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302244&T=7
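
The three RESET statistics come from lmtest::resettest() with the three type options, as in the module code at the bottom of this page. A minimal sketch, reusing mylm from the first sketch:

library(lmtest)
resettest(mylm, power = 2:3, type = 'fitted')      # powers of the fitted values
resettest(mylm, power = 2:3, type = 'regressor')   # powers of the regressors
resettest(mylm, power = 2:3, type = 'princomp')    # powers of the first principal component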








Variance Inflation Factors (Multicollinearity)
> vif
  Privacy  geslacht   Product 
 9.670027 45.238947 64.012088 

\begin{tabular}{lllllllll}
\hline
Variance Inflation Factors (Multicollinearity) \tabularnewline
> vif
  Privacy  geslacht   Product 
 9.670027 45.238947 64.012088 
\tabularnewline \hline \end{tabular} %Source: https://freestatistics.org/blog/index.php?pk=302244&T=8

[TABLE]
[ROW][C]Variance Inflation Factors (Multicollinearity)[/C][/ROW]
[ROW][C]
> vif
  Privacy  geslacht   Product 
 9.670027 45.238947 64.012088 
[/C][/ROW] [/TABLE] Source: https://freestatistics.org/blog/index.php?pk=302244&T=8

Globally Unique Identifier (entire table): ba.freestatistics.org/blog/index.php?pk=302244&T=8
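
The variance inflation factors are computed with car::vif() on the same fitted model, as in the module code below. A minimal sketch, reusing mylm from the first sketch (values above 10 are commonly read as a sign of strong multicollinearity):

library(car)
vif(mylm)   # Privacy, geslacht, Product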




Parameters (Session):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
Parameters (R input):
par1 = 1 ; par2 = Do not include Seasonal Dummies ; par3 = No Linear Trend ; par4 = 0 ; par5 = 0 ;
R code (references can be found in the software module):
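# Note: this script runs inside the FreeStatistics.org R module environment, which
# supplies the uploaded data matrix 'y', the parameters par1..par5 listed above, and
# the table.start/table.element/table.save helpers loaded below from 'createtable'.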
library(lattice)
library(lmtest)
library(car)
library(MASS)
n25 <- 25 #minimum number of obs. for Goldfeld-Quandt test
mywarning <- ''
par1 <- as.numeric(par1)
if(is.na(par1)) {
par1 <- 1
mywarning = 'Warning: you did not specify the column number of the endogenous series! The first column was selected by default.'
}
if (par4=='') par4 <- 0
par4 <- as.numeric(par4)
if (par5=='') par5 <- 0
par5 <- as.numeric(par5)
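# Build the design matrix: transpose the uploaded series, move the endogenous column
# (par1) to the front, then apply the optional differencing (par3), lags (par4, par5)
# and seasonal dummies (par2) selected by the user.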
x <- na.omit(t(y))
k <- length(x[1,])
n <- length(x[,1])
x1 <- cbind(x[,par1], x[,1:k!=par1])
mycolnames <- c(colnames(x)[par1], colnames(x)[1:k!=par1])
colnames(x1) <- mycolnames #colnames(x)[par1]
x <- x1
if (par3 == 'First Differences'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'Seasonal Differences (s=12)'){
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if (par3 == 'First and Seasonal Differences (s=12)'){
(n <- n -1)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+1,j] - x[i,j]
}
}
x <- x2
(n <- n - 12)
x2 <- array(0, dim=c(n,k), dimnames=list(1:n, paste('(1-B12)',colnames(x),sep='')))
for (i in 1:n) {
for (j in 1:k) {
x2[i,j] <- x[i+12,j] - x[i,j]
}
}
x <- x2
}
if(par4 > 0) {
x2 <- array(0, dim=c(n-par4,par4), dimnames=list(1:(n-par4), paste(colnames(x)[par1],'(t-',1:par4,')',sep='')))
for (i in 1:(n-par4)) {
for (j in 1:par4) {
x2[i,j] <- x[i+par4-j,par1]
}
}
x <- cbind(x[(par4+1):n,], x2)
n <- n - par4
}
if(par5 > 0) {
x2 <- array(0, dim=c(n-par5*12,par5), dimnames=list(1:(n-par5*12), paste(colnames(x)[par1],'(t-',1:par5,'s)',sep='')))
for (i in 1:(n-par5*12)) {
for (j in 1:par5) {
x2[i,j] <- x[i+par5*12-j*12,par1]
}
}
x <- cbind(x[(par5*12+1):n,], x2)
n <- n - par5*12
}
if (par2 == 'Include Monthly Dummies'){
x2 <- array(0, dim=c(n,11), dimnames=list(1:n, paste('M', seq(1:11), sep ='')))
for (i in 1:11){
x2[seq(i,n,12),i] <- 1
}
x <- cbind(x, x2)
}
if (par2 == 'Include Quarterly Dummies'){
x2 <- array(0, dim=c(n,3), dimnames=list(1:n, paste('Q', seq(1:3), sep ='')))
for (i in 1:3){
x2[seq(i,n,4),i] <- 1
}
x <- cbind(x, x2)
}
(k <- length(x[n,]))
if (par3 == 'Linear Trend'){
x <- cbind(x, c(1:n))
colnames(x)[k+1] <- 't'
}
print(x)
(k <- length(x[n,]))
head(x)
df <- as.data.frame(x)
(mylm <- lm(df))
(mysum <- summary(mylm))
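# Rolling Goldfeld-Quandt test: for every admissible breakpoint store the p-value under
# each alternative hypothesis, and count how many 2-sided tests are significant at the
# 1%, 5% and 10% levels (used in the meta analysis table).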
if (n > n25) {
kp3 <- k + 3
nmkm3 <- n - k - 3
gqarr <- array(NA, dim=c(nmkm3-kp3+1,3))
numgqtests <- 0
numsignificant1 <- 0
numsignificant5 <- 0
numsignificant10 <- 0
for (mypoint in kp3:nmkm3) {
j <- 0
numgqtests <- numgqtests + 1
for (myalt in c('greater', 'two.sided', 'less')) {
j <- j + 1
gqarr[mypoint-kp3+1,j] <- gqtest(mylm, point=mypoint, alternative=myalt)$p.value
}
if (gqarr[mypoint-kp3+1,2] < 0.01) numsignificant1 <- numsignificant1 + 1
if (gqarr[mypoint-kp3+1,2] < 0.05) numsignificant5 <- numsignificant5 + 1
if (gqarr[mypoint-kp3+1,2] < 0.10) numsignificant10 <- numsignificant10 + 1
}
gqarr
}
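# Diagnostic plots: actuals and interpolation, residuals, studentized residual histogram,
# residual density, QQ plot, residual lag plot, residual ACF/PACF, the standard lm()
# diagnostics and, if n is large enough, the 2-sided Goldfeld-Quandt p-values per breakpoint.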
bitmap(file='test0.png')
plot(x[,1], type='l', main='Actuals and Interpolation', ylab='value of Actuals and Interpolation (dots)', xlab='time or index')
points(x[,1]-mysum$resid)
grid()
dev.off()
bitmap(file='test1.png')
plot(mysum$resid, type='b', pch=19, main='Residuals', ylab='value of Residuals', xlab='time or index')
grid()
dev.off()
bitmap(file='test2.png')
sresid <- studres(mylm)
hist(sresid, freq=FALSE, main='Distribution of Studentized Residuals')
xfit<-seq(min(sresid),max(sresid),length=40)
yfit<-dnorm(xfit)
lines(xfit, yfit)
grid()
dev.off()
bitmap(file='test3.png')
densityplot(~mysum$resid,col='black',main='Residual Density Plot', xlab='values of Residuals')
dev.off()
bitmap(file='test4.png')
qqPlot(mylm, main='QQ Plot')
grid()
dev.off()
(myerror <- as.ts(mysum$resid))
bitmap(file='test5.png')
dum <- cbind(lag(myerror,k=1),myerror)
dum
dum1 <- dum[2:length(myerror),]
dum1
z <- as.data.frame(dum1)
print(z)
plot(z,main=paste('Residual Lag plot, lowess, and regression line'), ylab='values of Residuals', xlab='lagged values of Residuals')
lines(lowess(z))
abline(lm(z))
grid()
dev.off()
bitmap(file='test6.png')
acf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Autocorrelation Function')
grid()
dev.off()
bitmap(file='test7.png')
pacf(mysum$resid, lag.max=length(mysum$resid)/2, main='Residual Partial Autocorrelation Function')
grid()
dev.off()
bitmap(file='test8.png')
opar <- par(mfrow = c(2,2), oma = c(0, 0, 1.1, 0))
plot(mylm, las = 1, sub='Residual Diagnostics')
par(opar)
dev.off()
if (n > n25) {
bitmap(file='test9.png')
plot(kp3:nmkm3,gqarr[,2], main='Goldfeld-Quandt test',ylab='2-sided p-value',xlab='breakpoint')
grid()
dev.off()
}
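# Assemble the tables shown above with the site's createtable helpers (table.start,
# table.element, table.row.*, table.save) and write them out for the report.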
load(file='createtable')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Estimated Regression Equation', 1, TRUE)
a<-table.row.end(a)
myeq <- colnames(x)[1]
myeq <- paste(myeq, '[t] = ', sep='')
for (i in 1:k){
if (mysum$coefficients[i,1] > 0) myeq <- paste(myeq, '+', '')
myeq <- paste(myeq, signif(mysum$coefficients[i,1],6), sep=' ')
if (rownames(mysum$coefficients)[i] != '(Intercept)') {
myeq <- paste(myeq, rownames(mysum$coefficients)[i], sep='')
if (rownames(mysum$coefficients)[i] != 't') myeq <- paste(myeq, '[t]', sep='')
}
}
myeq <- paste(myeq, ' + e[t]')
a<-table.row.start(a)
a<-table.element(a, myeq)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, mywarning)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable1.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Multiple Linear Regression - Ordinary Least Squares', 6, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Variable',header=TRUE)
a<-table.element(a,'Parameter',header=TRUE)
a<-table.element(a,'S.D.',header=TRUE)
a<-table.element(a,'T-STAT
H0: parameter = 0',header=TRUE)
a<-table.element(a,'2-tail p-value',header=TRUE)
a<-table.element(a,'1-tail p-value',header=TRUE)
a<-table.row.end(a)
for (i in 1:k){
a<-table.row.start(a)
a<-table.element(a,rownames(mysum$coefficients)[i],header=TRUE)
a<-table.element(a,formatC(signif(mysum$coefficients[i,1],5),format='g',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,2],5),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,3],4),format='e',flag='+'))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4],4),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$coefficients[i,4]/2,4),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable2.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Regression Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple R',1,TRUE)
a<-table.element(a,formatC(signif(sqrt(mysum$r.squared),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Adjusted R-squared',1,TRUE)
a<-table.element(a,formatC(signif(mysum$adj.r.squared,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (value)',1,TRUE)
a<-table.element(a,formatC(signif(mysum$fstatistic[1],6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF numerator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[2],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'F-TEST (DF denominator)',1,TRUE)
a<-table.element(a, signif(mysum$fstatistic[3],6))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'p-value',1,TRUE)
a<-table.element(a,formatC(signif(1-pf(mysum$fstatistic[1],mysum$fstatistic[2],mysum$fstatistic[3]),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Residual Statistics', 2, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Residual Standard Deviation',1,TRUE)
a<-table.element(a,formatC(signif(mysum$sigma,6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Sum Squared Residuals',1,TRUE)
a<-table.element(a,formatC(signif(sum(myerror*myerror),6),format='g',flag=' '))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable3.tab')
myr <- as.numeric(mysum$resid)
myr
if(n < 200) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a, 'Multiple Linear Regression - Actuals, Interpolation, and Residuals', 4, TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a, 'Time or Index', 1, TRUE)
a<-table.element(a, 'Actuals', 1, TRUE)
a<-table.element(a, 'Interpolation
Forecast', 1, TRUE)
a<-table.element(a, 'Residuals
Prediction Error', 1, TRUE)
a<-table.row.end(a)
for (i in 1:n) {
a<-table.row.start(a)
a<-table.element(a,i, 1, TRUE)
a<-table.element(a,formatC(signif(x[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(x[i]-mysum$resid[i],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(mysum$resid[i],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable4.tab')
if (n > n25) {
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'p-values',header=TRUE)
a<-table.element(a,'Alternative Hypothesis',3,header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'breakpoint index',header=TRUE)
a<-table.element(a,'greater',header=TRUE)
a<-table.element(a,'2-sided',header=TRUE)
a<-table.element(a,'less',header=TRUE)
a<-table.row.end(a)
for (mypoint in kp3:nmkm3) {
a<-table.row.start(a)
a<-table.element(a,mypoint,header=TRUE)
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,1],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,2],6),format='g',flag=' '))
a<-table.element(a,formatC(signif(gqarr[mypoint-kp3+1,3],6),format='g',flag=' '))
a<-table.row.end(a)
}
a<-table.end(a)
table.save(a,file='mytable5.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Meta Analysis of Goldfeld-Quandt test for Heteroskedasticity',4,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Description',header=TRUE)
a<-table.element(a,'# significant tests',header=TRUE)
a<-table.element(a,'% significant tests',header=TRUE)
a<-table.element(a,'OK/NOK',header=TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'1% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant1,6))
a<-table.element(a,formatC(signif(numsignificant1/numgqtests,6),format='g',flag=' '))
if (numsignificant1/numgqtests < 0.01) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'5% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant5,6))
a<-table.element(a,signif(numsignificant5/numgqtests,6))
if (numsignificant5/numgqtests < 0.05) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'10% type I error level',header=TRUE)
a<-table.element(a,signif(numsignificant10,6))
a<-table.element(a,signif(numsignificant10/numgqtests,6))
if (numsignificant10/numgqtests < 0.1) dum <- 'OK' else dum <- 'NOK'
a<-table.element(a,dum)
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable6.tab')
}
}
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of fitted values',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_fitted <- resettest(mylm,power=2:3,type='fitted')
a<-table.element(a,paste('
',RC.texteval('reset_test_fitted'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of regressors',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_regressors <- resettest(mylm,power=2:3,type='regressor')
a<-table.element(a,paste('
',RC.texteval('reset_test_regressors'),'
',sep=''))
a<-table.row.end(a)
a<-table.row.start(a)
a<-table.element(a,'Ramsey RESET F-Test for powers (2 and 3) of principal components',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
reset_test_principal_components <- resettest(mylm,power=2:3,type='princomp')
a<-table.element(a,paste('
',RC.texteval('reset_test_principal_components'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable8.tab')
a<-table.start()
a<-table.row.start(a)
a<-table.element(a,'Variance Inflation Factors (Multicollinearity)',1,TRUE)
a<-table.row.end(a)
a<-table.row.start(a)
vif <- vif(mylm)
a<-table.element(a,paste('
',RC.texteval('vif'),'
',sep=''))
a<-table.row.end(a)
a<-table.end(a)
table.save(a,file='mytable9.tab')